WorldWideScience

Sample records for study comparing computed

  1. Comparative study between computed radiography and conventional radiography

    International Nuclear Information System (INIS)

    Noorhazleena Azaman; Khairul Anuar Mohd Salleh; Sapizah Rahim; Shaharudin Sayuti; Arshad Yassin; Abdul Razak Hamzah

    2010-01-01

    In industrial radiography, many criteria based on established standards must be considered when accepting or rejecting a radiographic film. In conventional radiography, the optical density is assessed with a densitometer while the film is examined on a viewer. In computed radiography (CR), however, the quality of the digital image must be evaluated through its grey values. Many factors affect digital image quality; one factor in image processing is the grey value, which is related to contrast resolution. In this work, we performed grey value measurements on a digital radiography system and compared them with exposed films from conventional radiography. The test sample was a steel step wedge. We found that the contrast resolution is higher in computed radiography than in conventional radiography. (author)
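    The grey-value analysis described above can be illustrated with a small sketch. The ROI layout and the synthetic step-wedge values below are assumptions made for this example only; they are not the authors' acquisition settings.

```python
import numpy as np

def step_contrast(image, rois):
    """Mean grey value per ROI and contrast between adjacent wedge steps.

    image : 2-D array of grey values (e.g. a digitized CR image)
    rois  : list of (row_slice, col_slice) tuples, one per wedge step
    """
    means = np.array([image[r, c].mean() for r, c in rois])
    # Simple step-to-step contrast: difference of adjacent mean grey values
    contrast = np.abs(np.diff(means))
    return means, contrast

# Toy example: synthetic 3-step wedge with additive noise
rng = np.random.default_rng(0)
img = np.hstack([np.full((50, 50), v) for v in (1000, 1500, 2200)]) \
      + rng.normal(0, 20, (50, 150))
rois = [(slice(10, 40), slice(i * 50 + 10, i * 50 + 40)) for i in range(3)]
means, contrast = step_contrast(img, rois)
print(means.round(1), contrast.round(1))
```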

  2. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
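    The throughput and cost figures quoted in this abstract lend themselves to a simple back-of-the-envelope check. The calculation below uses only the numbers reported above; the derived per-node-hour quantities are illustrative, not values from the paper.

```python
# Back-of-the-envelope check of the reported EC2 run (figures from the abstract;
# the derived per-process numbers are illustrative only).
processes = 300_000          # RSD-cloud processes
nodes = 100                  # high-capacity compute nodes
wall_hours = 70              # total wall-clock time (just under 70 h)
total_cost_usd = 6302

node_hours = nodes * wall_hours
print(f"node-hours:            {node_hours}")
print(f"cost per node-hour:    ${total_cost_usd / node_hours:.2f}")
print(f"processes per node-hr: {processes / node_hours:.1f}")
print(f"cost per 1k processes: ${1000 * total_cost_usd / processes:.2f}")
```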

  3. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  4. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
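    The abstract describes ordering jobs by predicted runtime so that a fixed-size cluster stays busy. The sketch below shows one generic way to do that, a longest-processing-time greedy heuristic over hypothetical runtime estimates; it is not the authors' actual cost model.

```python
import heapq

def schedule_longest_first(est_runtimes, n_nodes):
    """Greedy longest-processing-time scheduling of genome-comparison jobs.

    est_runtimes : dict mapping job id -> estimated runtime (hours)
    n_nodes      : number of cloud nodes in the cluster
    Returns the makespan and the per-node job assignment.
    """
    # Submit the longest jobs first; always give the next job to the
    # least-loaded node (a classic LPT heuristic, not the paper's exact model).
    jobs = sorted(est_runtimes.items(), key=lambda kv: kv[1], reverse=True)
    heap = [(0.0, node) for node in range(n_nodes)]  # (accumulated load, node id)
    heapq.heapify(heap)
    assignment = {node: [] for node in range(n_nodes)}
    for job, hours in jobs:
        load, node = heapq.heappop(heap)
        assignment[node].append(job)
        heapq.heappush(heap, (load + hours, node))
    makespan = max(load for load, _ in heap)
    return makespan, assignment

# Hypothetical runtime estimates standing in for a size/complexity model
est = {f"pair{i}": 0.1 + (i % 17) * 0.05 for i in range(200)}
makespan, _ = schedule_longest_first(est, n_nodes=20)
print(f"estimated makespan on 20 nodes: {makespan:.2f} h")
```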

  5. COMPARATIVE STUDY OF TERTIARY WASTEWATER TREATMENT BY COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Stefania Iordache

    2010-01-01

    Full Text Available The aim of this work is to assess the conditions for implementation of a Biological Nutrient Removal (BNR) process in the Wastewater Treatment Plant (WWTP) of Moreni city (Romania). In order to meet increasingly strict environmental regulations, the wastewater treatment plant studied must update and modernize its current treatment process. A comparative study was undertaken of the quality of effluents that could be obtained by implementation of biological nutrient removal processes such as A2/O (Anaerobic/Anoxic/Oxic) and VIP (Virginia Plant Initiative) as wastewater tertiary treatments. In order to assess the efficiency of the proposed treatment schemes based on the data monitored at the studied WWTP, computer models of biological nutrient removal configurations based on the A2/O and VIP processes were developed. Computer simulation was carried out using a well-known simulator, BioWin by EnviroSim Associates Ltd. The simulation provided data that can be used in the design of a tertiary treatment stage at the Moreni WWTP, in order to increase its operational efficiency.

  6. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is one of Information and Communication Technology (ICT), and much research is under way on Cloud Computing and Mobile Cloud Computing, addressing issues such as security, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing model are resource sharing and pooling among end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  7. A comparative study: use of a Brain-computer Interface (BCI) device by people with cerebral palsy in interaction with computers.

    Science.gov (United States)

    Heidrich, Regina O; Jensen, Emely; Rebelo, Francisco; Oliveira, Tiago

    2015-01-01

    This article presents a comparative study of people with cerebral palsy and healthy controls, of various ages, using a Brain-computer Interface (BCI) device. The research is qualitative in its approach, based on observational case studies. People with cerebral palsy and healthy controls were evaluated in Portugal and in Brazil. The study set out to evaluate the product in order to determine whether people with cerebral palsy could interact with the computer, and whether their performance was similar to that of healthy controls when using the Brain-computer Interface. Ultimately, it was found that there are no significant differences between people with cerebral palsy in the two countries, nor between the populations without cerebral palsy (healthy controls).

  8. Comparative study of scintigraphy, ultrasonography and computed tomography in the evaluation of liver tumours

    International Nuclear Information System (INIS)

    Tohyama, Junko; Ishigaki, Takeo; Ishikawa, Tsutomu

    1982-01-01

    A comparative study of scintigraphy, ultrasonography and computed tomography in 67 proven patients with clinically suspected liver tumours is reported. Scintigraphy was superior in sensitivity to ultrasonography and computed tomography; in specificity, however, it was inferior to the other two. The diagnostic efficacy of ultrasonography and computed tomography in detecting focal masses of the liver did not differ greatly, and simultaneous interpretation of the ultrasonogram and computed tomogram was more helpful than independent interpretation, so the two were considered complementary. In conclusion, scintigraphy was considered the initial procedure in the diagnostic approach to focal liver masses, with ultrasonography as the second procedure because it carries no radiation hazard, and computed tomography following thereafter. (author)

  9. Comparative study on the performance of Pod type waterjet by experiment and computation

    Directory of Open Access Journals (Sweden)

    Moon-Chan Kim

    2010-03-01

    Full Text Available A comparative study between computation and experiment has been conducted to predict the performance of a Pod type waterjet for an amphibious wheeled vehicle. The Pod type waterjet was chosen on the basis of the required specific speed of more than 2500. As the Pod type waterjet is an extreme case of the axial flow type waterjet, theoretical as well as experimental work on Pod type waterjets is very rare. The main purpose of the present study is to validate the developed in-house CFD code, based on the RANS equations, by comparing its predictions with the experimental results for the Pod type waterjet. The code was first validated against experimental results for a well-known turbine problem. The validation was also extended to a flush type waterjet, for which the pressures along the duct surface and the velocities at the nozzle were compared with experimental results. The Pod type waterjet was then designed, and the performance of the designed waterjet system, including duct, impeller and stator, was analyzed with the in-house CFD code. The pressure distributions and limiting streamlines on the blade surfaces were computed to confirm the performance of the designed waterjets. In addition, the torque and momentum were computed to obtain the overall efficiency, and these were compared with the model test results. Measurements were taken of the flow rate at the nozzle exit, the static pressure at various sections along the duct and the nozzle, the revolution rate of the impeller, torque, thrust and towing forces at various advance speeds, for the prediction of performance as well as for comparison with the computations. Based on these measurements, the performance was analyzed according to the ITTC96 standard analysis method. The full-scale effective and delivered power of the wheeled vehicle were estimated for the prediction of the service speed. This paper emphasizes the confirmation of the ITTC96 analysis method and

  10. Comparative study of computational model for pipe whip analysis

    International Nuclear Information System (INIS)

    Koh, Sugoong; Lee, Young-Shin

    1993-01-01

    Many types of pipe whip restraints are installed to protect the structural components from the anticipated pipe whip phenomena of high energy lines in nuclear power plants. It is necessary to investigate these phenomena accurately in order to evaluate the acceptability of the pipe whip restraint design. Various research programs have been conducted in many countries to develop analytical methods and to verify the validity of the methods. In this study, various calculational models in ANSYS code and in ADLPIPE code, the general purpose finite element computer programs, were used to simulate the postulated pipe whips to obtain impact loads and the calculated results were compared with the specific experimental results from the sample pipe whip test for the U-shaped pipe whip restraints. Some calculational models, having the spring element between the pipe whip restraint and the pipe line, give reasonably good transient responses of the restraint forces compared with the experimental results, and could be useful in evaluating the acceptability of the pipe whip restraint design. (author)

  11. Land Cover Classification from Multispectral Data Using Computational Intelligence Tools: A Comparative Study

    Directory of Open Access Journals (Sweden)

    André Mora

    2017-11-01

    Full Text Available This article discusses how computational intelligence techniques are applied to fuse spectral images into a higher level image of land cover distribution for remote sensing, specifically for satellite image classification. We compare a fuzzy-inference method with two other computational intelligence methods, decision trees and neural networks, using a case study of land cover classification from satellite images. Further, an unsupervised approach based on k-means clustering has also been taken into consideration for comparison. The fuzzy-inference method includes training the classifier with a fuzzy-fusion technique and then performing land cover classification using reinforcement aggregation operators. To assess the robustness of the four methods, a comparative study including three years of land cover maps for the district of Mandimba, Niassa province, Mozambique, was undertaken. Our results show that the fuzzy-fusion method performs similarly to decision trees, achieving reliable classifications; neural networks suffer from overfitting; while k-means clustering constitutes a promising technique to identify land cover types from unknown areas.
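    As a minimal illustration of the unsupervised k-means baseline mentioned above, the sketch below clusters synthetic multispectral pixel vectors with scikit-learn. The band count and the number of classes are assumptions for the example, not the study's settings.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Synthetic stand-in for a multispectral image: 100 x 100 pixels, 4 bands
image = rng.random((100, 100, 4))
pixels = image.reshape(-1, image.shape[-1])        # one row per pixel

# Cluster pixel spectra into k assumed land-cover classes (k is a guess here;
# the paper's actual class set is not reproduced).
k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
cover_map = labels.reshape(image.shape[:2])        # per-pixel class map
print(np.bincount(labels))                         # pixel count per class
```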

  12. Handheld computers for self-administered sensitive data collection: A comparative study in Peru

    Directory of Open Access Journals (Sweden)

    Hughes James P

    2008-03-01

    Full Text Available Abstract Background Low-cost handheld computers (PDAs) potentially represent an efficient tool for collecting sensitive data in surveys. The goal of this study is to evaluate the quality of sexual behavior data collected with handheld computers in comparison with paper-based questionnaires. Methods A PDA-based program for data collection was developed using Open-Source tools. In two cross-sectional studies, we compared data concerning sexual behavior collected with paper forms to data collected with PDA-based forms in Ancon (Lima). Results The first study enrolled 200 participants (18–29 years). General agreement between data collected with paper format and handheld computers was 86%. Categorical variables agreement was between 70.5% and 98.5% (Kappa: 0.43–0.86), while numeric variables agreement was between 57.1% and 79.8% (Spearman: 0.76–0.95). Agreement and correlation were higher in those who had completed at least high school than those with less education. The second study enrolled 198 participants. Rates of responses to sensitive questions were similar between both kinds of questionnaires. However, the number of inconsistencies (p = 0.0001) and missing values (p = 0.001) were significantly higher in paper questionnaires. Conclusion This study showed the value of the use of handheld computers for collecting sensitive data, since a high level of agreement between paper and PDA responses was reached. In addition, a lower number of inconsistencies and missing values were found with the PDA-based system. This study has demonstrated that it is feasible to develop a low-cost application for handheld computers, and that PDAs are feasible alternatives for collecting field data in a developing country.
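    The agreement statistics reported above (percent agreement and Cohen's kappa for categorical items, Spearman correlation for numeric items) can be reproduced on any paired data set along the following lines. The paired responses below are invented purely to make the sketch runnable.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired responses: the same question answered on paper and on a PDA
paper_cat = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
pda_cat   = ["yes", "no", "yes", "no",  "no", "no", "yes", "yes"]

paper_num = np.array([1, 2, 3, 2, 5, 4, 1, 3])   # e.g. a count reported on paper
pda_num   = np.array([1, 2, 4, 2, 5, 4, 2, 3])   # the same count reported on the PDA

# Categorical agreement: raw percent agreement and Cohen's kappa
agree = np.mean(np.array(paper_cat) == np.array(pda_cat))
kappa = cohen_kappa_score(paper_cat, pda_cat)

# Numeric agreement: Spearman rank correlation
rho, p = spearmanr(paper_num, pda_num)
print(f"agreement={agree:.2f}  kappa={kappa:.2f}  spearman rho={rho:.2f} (p={p:.3f})")
```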

  13. Comparative study of cranial anthropometric measurement by traditional calipers to computed tomography and three-dimensional photogrammetry.

    Science.gov (United States)

    Mendonca, Derick A; Naidoo, Sybill D; Skolnick, Gary; Skladman, Rachel; Woo, Albert S

    2013-07-01

    Craniofacial anthropometry by direct caliper measurements is a common method of quantifying the morphology of the cranial vault. New digital imaging modalities, including computed tomography and three-dimensional photogrammetry, are similarly being used to obtain craniofacial surface measurements. This study sought to compare the accuracy of anthropometric measurements obtained by calipers versus 2 methods of digital imaging. Standard anterior-posterior, biparietal, and cranial index measurements were directly obtained on 19 participants with an age range of 1 to 20 months. Computed tomographic scans and three-dimensional photographs were both obtained on each child within 2 weeks of the clinical examination. Two analysts measured the anterior-posterior and biparietal distances on the digital images. Measures of reliability and bias between the modalities were calculated and compared. Caliper measurements were found to underestimate the anterior-posterior and biparietal distances as compared with those of computed tomography and three-dimensional photogrammetry (P = 0.002). The coefficients of variation for repeated measures based on the computed tomography and the three-dimensional photogrammetry were 0.008 and 0.007, respectively. In conclusion, measurements based on digital modalities are generally reliable and interchangeable. Caliper measurements lead to underestimation of anterior-posterior and biparietal values compared with digital imaging.

  14. Comparative study of mesothelioma and asbestosis using computed tomography and conventional chest radiography

    International Nuclear Information System (INIS)

    Rabinowitz, T.G.; Efremidis, S.C.; Cohen, B.; Dan, S.; Efremidis, A.; Chahinian, A.P.; Teirstein, A.S.

    1982-01-01

    A comparative study using computed tomography and conventional posteroanterior radiography was performed on 27 patients with mesothelioma and 13 patients with advanced asbestosis. The major pathologic features of both asbestosis and mesothelioma were well demonstrated by both modalities; computed tomography demonstrated the findings more frequently and in greater detail. No distinguishing features could be established based on configuration and size of the lesion. Many pleural plaques associated with advanced asbestosis were large and irregular and resembled those associated with mesothelioma. However, nodular involvement of the pleural fissures, pleural effusion, and ipsilateral volume loss with a fixed mediastinum were features predominating in mesothelioma. Growth determination of the plaques associated with asbestosis may be of minimal value since such plaques also undergo growth due to active inflammatory changes

  15. Mid-term survival analysis of closed wedge high tibial osteotomy: A comparative study of computer-assisted and conventional techniques.

    Science.gov (United States)

    Bae, Dae Kyung; Song, Sang Jun; Kim, Kang Il; Hur, Dong; Jeong, Ho Yeon

    2016-03-01

    The purpose of the present study was to compare the clinical and radiographic results and survival rates between computer-assisted and conventional closing wedge high tibial osteotomies (HTOs). Data from a consecutive cohort comprising 75 computer-assisted HTOs and 75 conventional HTOs were retrospectively reviewed. The Knee Society knee and function scores, the Hospital for Special Surgery (HSS) score and the femorotibial angle (FTA) were compared between the two groups. Survival rates, with procedure failure as the endpoint, were also compared. The knee and function scores at one year postoperatively were slightly better in the computer-assisted group than in the conventional group (90.1 vs. 86.1 and 82.0 vs. 76.0, respectively). The HSS scores at one year postoperatively were slightly better for the computer-assisted HTOs than for the conventional HTOs (89.5 vs. 81.8). The proportion of postoperative FTAs within the acceptable range was higher in the computer-assisted group than in the conventional HTO group (88.0% vs. 58.7%), and the mean postoperative FTA was greater in the computer-assisted group than in the conventional HTO group (valgus 9.0° vs. valgus 7.6°). Clinical and radiographic results were better in the computer-assisted group than in the conventional HTO group. Mid-term survival rates did not differ between computer-assisted and conventional HTOs. A comparative analysis of longer-term survival rates is required to demonstrate the long-term benefit of computer-assisted HTO. Level of evidence: III. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  17. System matrix computation vs storage on GPU: A comparative study in cone beam CT.

    Science.gov (United States)

    Matenine, Dmitri; Côté, Geoffroi; Mascolo-Fortin, Julia; Goussard, Yves; Després, Philippe

    2018-02-01

    Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersection distances between the trajectories of photons and the object, also called ray tracing or system matrix computation. This work, focused on the thin-ray model, is aimed at comparing different system matrix handling strategies using graphics processing units (GPUs). In this work, the system matrix is modeled by thin rays intersecting a regular grid of box-shaped voxels, known to be an accurate representation of the forward projection operator in CT. However, an uncompressed system matrix exceeds the random access memory (RAM) capacities of typical computers by one order of magnitude or more. Considering the RAM limitations of GPU hardware, several system matrix handling methods were compared: full storage of a compressed system matrix, on-the-fly computation of its coefficients, and partial storage of the system matrix with partial on-the-fly computation. These methods were tested on geometries mimicking a cone beam CT (CBCT) acquisition of a human head. Execution times of three routines of interest were compared: forward projection, backprojection, and ordered-subsets convex (OSC) iteration. A fully stored system matrix yielded the shortest backprojection and OSC iteration times, with a 1.52× acceleration for OSC when compared to the on-the-fly approach. Nevertheless, the maximum problem size was bound by the available GPU RAM and geometrical symmetries. On-the-fly coefficient computation did not require symmetries and was shown to be the fastest for forward projection. It also offered reasonable execution times of about 176.4 ms per view per OSC iteration for a detector of 512 × 448 pixels and a volume of 384³ voxels, using commodity GPU hardware. Partial system matrix storage has shown a performance similar to the on-the-fly approach, while still relying on symmetries. Partial system matrix storage was shown to yield the lowest relative
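    The motivation for on-the-fly computation can be seen from a rough storage estimate for a fully stored sparse system matrix. The detector and volume sizes below come from the abstract; the number of views and the sparsity model are generic assumptions, not the authors' figures.

```python
# Rough memory estimate for a fully stored (sparse) thin-ray system matrix.
# Detector and volume sizes come from the abstract; the view count and the
# sparsity model are generic assumptions for illustration.
views = 360                     # assumed number of projection angles
det_pixels = 512 * 448          # detector size quoted in the abstract
n_voxels_per_side = 384         # 384^3 voxel volume

rays = views * det_pixels
# A thin ray through an N^3 grid intersects on the order of ~3*N voxels
nonzeros_per_ray = 3 * n_voxels_per_side
bytes_per_entry = 4 + 4         # float32 value + int32 column index

total_bytes = rays * nonzeros_per_ray * bytes_per_entry
print(f"~{total_bytes / 2**30:.0f} GiB of uncompressed sparse storage "
      f"for {rays:,} rays")
```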

  18. The Uses of Literacy in Studying Computer Games: Comparing Students' Oral and Visual Representations of Games

    Science.gov (United States)

    Pelletier, Caroline

    2005-01-01

    This paper compares the oral and visual representations which 12 to 13-year-old students produced in studying computer games as part of an English and Media course. It presents the arguments for studying multimodal texts as part of a literacy curriculum and then provides an overview of the games course devised by teachers and researchers. The…

  19. Comparative study of ultrasound imaging, computed tomography and magnetic resonance imaging in gynecology

    International Nuclear Information System (INIS)

    Ishii, Kenji; Kobayashi, Hisaaki; Hoshihara, Takayuki; Kobayashi, Mitsunao; Suda, Yoshio; Takenaka, Eiichi; Sasa, Hidenori.

    1989-01-01

    We studied 18 patients who underwent surgery at the National Defense Medical College Hospital and whose diagnoses were confirmed pathologically. We compared ultrasound imaging, computed tomography (CT) and magnetic resonance imaging (MRI) in these patients. MRI was useful for diagnosing enlargement of the uterine cavity and a small amount of ascites, and for understanding the orientation of the pelvic organs. Ultrasound imaging is the most useful examination for diagnosing gynecological diseases, but when diagnosis is difficult with ultrasound imaging alone, either CT or MRI, or preferably both, should be employed. (author)

  20. Comparative study of dose descriptor in pediatric computed tomography exams

    International Nuclear Information System (INIS)

    Finatto, Jerusa Dalbosco; Silva, Ana Maria Marques da; Froner, Ana Paula Pastre; Pimentel, Juliana

    2014-01-01

    This work investigates the dose descriptor volumetric Computed Tomography Dose Index (CTDI) in a sample of pediatric patients undergoing skull CT, comparing the results with diagnostic reference levels from the literature. Volumetric CTDI values were collected retrospectively for all skull CT examinations performed on children 0-10 years of age over a 12-month period in a large hospital. The 103 patients were divided into four groups by age, following as closely as possible the age ranges used in international dose-descriptor references. In all acquisitions the pediatric protocol and the Automatic Exposure Control (AEC) available on the equipment were used. CTDI values with and without the use of AEC for pediatric studies were compared; there was a reduction of approximately 100% in the absorbed dose value due to the use of the AEC. From the data collected and analyzed in this work, it is concluded that the use of dose reduction systems, such as Care Dose, is relevant to keeping volumetric CTDI values within the reference levels. It is also important to consider the children's age range when choosing the parameters used in the examination protocol. The values obtained are in accordance with the diagnostic reference levels from the literature.

  1. Comparing computing formulas for estimating concentration ratios

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.

    1984-03-01

    This paper provides guidance on the choice of computing formulas (estimators) for estimating concentration ratios and other ratio-type measures of radionuclides and other environmental contaminant transfers between ecosystem components. Mathematical expressions for the expected value of three commonly used estimators (arithmetic mean of ratios, geometric mean of ratios, and the ratio of means) are obtained when the multivariate lognormal distribution is assumed. These expressions are used to explain why these estimators will not in general give the same estimate of the average concentration ratio. They illustrate that the magnitude of the discrepancies depends on the magnitude of measurement biases, and on the variances and correlations associated with spatial heterogeneity and measurement errors. This paper also reports on a computer simulation study that compares the accuracy of eight computing formulas for estimating a ratio relationship that is constant over time and/or space. Statistical models appropriate for both controlled spiking experiments and observational field studies for either normal or lognormal distributions are considered. 24 references, 15 figures, 7 tables
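    A minimal sketch of the three estimators discussed above, applied to simulated bivariate lognormal data, is given below; the distribution parameters are arbitrary and serve only to show that the estimators generally disagree.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated paired concentrations (e.g. plant vs. soil) from a bivariate
# lognormal model; an illustration of the setting, not the paper's data.
n = 500
mu = np.log([5.0, 2.0])
sigma = np.array([0.6, 0.5])
rho = 0.7
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])
plant, soil = np.exp(rng.multivariate_normal(mu, cov, size=n)).T

# The three estimators discussed in the abstract
mean_of_ratios = np.mean(plant / soil)                       # arithmetic mean of ratios
geo_mean_of_ratios = np.exp(np.mean(np.log(plant / soil)))   # geometric mean of ratios
ratio_of_means = plant.mean() / soil.mean()                  # ratio of means

print(f"arithmetic mean of ratios: {mean_of_ratios:.3f}")
print(f"geometric  mean of ratios: {geo_mean_of_ratios:.3f}")
print(f"ratio of means:            {ratio_of_means:.3f}")
```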

  2. Ultrasonography, computed tomography, and cholescintigraphy in suspected obstructive jaundice--a prospective comparative study

    DEFF Research Database (Denmark)

    Matzen, P; Malchow-Møller, A; Brun, B

    1983-01-01

    In order to compare their capacity to visualize the bile ducts, ultrasonography, computed tomography, and cholescintigraphy were performed in 56 consecutive jaundiced patients in whom extrahepatic cholestasis was clinically suspected. The predictions as to the patency of the large bile ducts were...

  3. Accuracy of three-dimensional cone beam computed tomography digital model measurements compared with plaster study casts

    Directory of Open Access Journals (Sweden)

    Shuaib Al Ali

    2017-01-01

    Full Text Available Purpose: The purpose of this study was to assess the accuracy of three-dimensional (3D) cone beam computed tomography (CBCT) study casts by comparing them with direct measurements taken from plaster study casts. Materials and Methods: The dental arches of 30 patient subjects were imaged with a Kodak 9300 3D CBCT device; Anatomodels were created and In Vivo 5 imaging software was used to measure 10 dental arch variables, which were compared to measurements of plaster study casts. Results: Three of the 10 variables, i.e., overbite, maxillary intermolar width, and arch length, were found to be significantly smaller (P < 0.05) using the Anatomodels following nonparametric Wilcoxon signed-rank testing. None of the differences found in the study averaged <0.5 mm. Conclusions: 3D CBCT imaging provided clinically acceptable accuracy for dental arch analysis. 3D CBCT imaging tended to underestimate the actual measurement compared to plaster study casts.

  4. Comparing the performance of SIMD computers by running large air pollution models

    DEFF Research Database (Denmark)

    Brown, J.; Hansen, Per Christian; Wasniewski, J.

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on these computers. Using a realistic large-scale model, we gained detailed insight about the performance of the computers involved when used to solve large-scale scientific problems that involve several types of numerical computations. The computers used in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216.

  5. A Comparative Study of University of Wisconsin-Stout Freshmen and Senior Education Majors Computing and Internet Technology Skills / Knowledge and Associated Learning Experiences

    OpenAIRE

    Sveum, Evan Charles

    2010-01-01

    A study comparing University of Wisconsin-Stout freshmen and senior education majors’ computing and Internet technology skills/knowledge and associated learning experiences was conducted. Instruments used in this study included the IC³® Exam by Certiport, Inc. and the investigator’s Computing and Internet Skills Learning Experiences survey. UW-Stout freshmen education majors participating in the study demonstrated poor computing and Internet technology skills/knowledge. UW-Stout senior educat...

  6. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    Science.gov (United States)

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between the ages of 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task using each of the 3 computers. Measurements during the typing task were taken at set intervals. The cervical and thoracic spine adopted a more flexed posture when using the smaller-sized computers. There were significantly greater neck movements when using the desktop computer compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as the computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and the potential risk of developing musculoskeletal discomfort in choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  7. A Comparative Study of University of Wisconsin-Stout Freshmen and Senior Education Major's Computing and Internet Technology Skills/Knowledge and Associated Learning Experiences

    Science.gov (United States)

    Sveum, Evan Charles

    2010-01-01

    A study comparing University of Wisconsin-Stout freshmen and senior education majors' computing and Internet technology skills/knowledge and associated learning experiences was conducted. Instruments used in this study included the IC³® Exam by Certiport, Inc. and the investigator's Computing and Internet Skills Learning…

  8. Probability of detection - Comparative study of computed and film radiography for high-energy applications

    International Nuclear Information System (INIS)

    Venkatachalam, R.; Venugopal, M.; Prasad, T.

    2007-01-01

    Full text of publication follows: The suitability of computed radiography (CR) with Ir-192, Co-60 and x-rays of up to 9 MeV for weld inspections is of importance to many heavy engineering and aerospace industries. CR is preferred because of its shorter exposure and processing times compared with film-based radiography, and digital images offer further advantages such as image enhancement, quantitative measurements and easier archival. This paper describes systematic experimental approaches and image quality metrics used to compare the imaging performance of CR with film-based radiography. Experiments were designed using six-sigma methodology to validate the performance of CR for steel thicknesses up to 160 mm with Ir-192, Co-60 and x-ray energies varying from 100 kV up to 9 MeV. Weld specimens with defects such as lack of fusion, penetration, cracks, concavity, and porosities were studied to evaluate the radiographic sensitivity and imaging performance of the system. Attempts were also made to quantify the probability of detection using specimens with artificial and natural defects under various experimental conditions, and the results were compared with film-based systems. (authors)
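    Probability of detection is commonly summarized by fitting a hit/miss model against defect size; the sketch below shows a generic logistic-regression version with invented data, not the procedure or data of this study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical hit/miss data: defect size (mm) vs. whether it was detected.
# This is a generic POD sketch, not the study's actual procedure or data.
size = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0])
hit  = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1,   1,   1  ])

# Fit detection probability as a logistic function of log(defect size)
model = LogisticRegression(C=10.0).fit(np.log(size).reshape(-1, 1), hit)

# POD curve and the size detected with 90 % probability (a90), if reached
grid = np.linspace(0.5, 5.0, 200)
pod = model.predict_proba(np.log(grid).reshape(-1, 1))[:, 1]
if np.any(pod >= 0.9):
    a90 = grid[pod >= 0.9][0]
    print(f"a90 (size detected with 90 % probability) ~ {a90:.2f} mm")
else:
    print("POD does not reach 90 % over the sampled size range")
```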

  9. Comparative study of the macroscopic finding, conventional tomographic imaging, and computed tomographic imaging in locating the mandibular canal

    International Nuclear Information System (INIS)

    Choi, Hang Moon; You, Dong Soo

    1995-01-01

    The purpose of this study was to compare conventional tomography with reformatted computed tomography for locating the mandibular canal in dental implant planning. Five dogs were used; after conventional tomographs and reformatted computed tomographs were taken, four dentists traced all films. The mandibles were then sectioned at a 2 mm slice thickness and the sections were radiographed (contact radiography). Each radiographic image was traced and linear measurements were made from the mandibular canal to the alveolar crest, buccal cortex, lingual cortex, and inferior border. The following results were obtained: 1. Reformatted computed tomographs were more accurate than conventional tomographs, with a difference of -0.6 mm between the real values and the radiographs for the alveolar crest to canal length. 2. The average measurements of buccal cortex to mandibular canal width and lingual cortex to mandibular canal width were more accurate on conventional tomographs than on reformatted computed tomographs, but their standard deviations were higher. 3. The standard deviations of reformatted computed tomographs were lower than those of conventional tomographs at all comparison sites. 4. With reformatted computed tomography, 62.5% of the measurements performed were within ±1 mm of the true value, compared with 24.1% for conventional tomography. 5. The mandibular canal was not visible in 0.8% of reformatted computed tomographs and in 9.2% of conventional tomographs. Reformatted computed tomography has been shown to be a more useful radiographic technique than conventional tomography for assessment of the mandibular canal.

  10. A comparative study of three-dimensional reconstructive images of temporomandibular joint using computed tomogram

    International Nuclear Information System (INIS)

    Lim, Suk Young; Koh, Kwang Joon

    1993-01-01

    The purpose of this study was to clarify the spatial relationships of the temporomandibular joint (TMJ) and to aid in the diagnosis of temporomandibular disorders. For this study, three-dimensional images of normal temporomandibular joints were reconstructed with a computer image analysis system and with the three-dimensional reconstruction program integrated in the computed tomography unit. The results were as follows: 1. Two-dimensional computed tomograms had better resolution than three-dimensional computed tomograms in the evaluation of the bone structures and the disk of the TMJ. 2. Direct sagittal computed tomograms and coronal computed tomograms had better resolution in the evaluation of the disk of the TMJ. 3. The positional relationship of the disk could be visualized, but the configuration of the disk could not be clearly visualized, on three-dimensional reconstructive CT images. 4. Three-dimensional reconstructive CT images had smoother margins than the three-dimensional images reconstructed by the computer image analysis system, but the images of the latter had better perspective. 5. Three-dimensional reconstructive images showed the spatial relationships of the TMJ articulation better, and the joint space was more clearly visualized on dissection images.

  11. A comparative study between xerographic, computer-assisted overlay generation and animated-superimposition methods in bite mark analyses.

    Science.gov (United States)

    Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran

    2016-09-01

    This study compared the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts and transferring the incisal edge outlines onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life size. Bite mark analyses using the xerographically generated overlays were performed by manually comparing an overlay to the corresponding printed bite mark image. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®, and the analyses using computer-assisted overlay generation were performed by digitally matching an overlay to the corresponding bite mark image in Adobe Photoshop®. A further comparison method superimposed the cast images on the corresponding bite mark images using Adobe Photoshop® CS6 and GIF-Animator©. During analysis, each precision-determining criterion was given a score in the range 0-3, with the score increasing with better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital methods is discernible despite human skin being a poor recording medium for bite marks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Comparative study of a low-Z cone-beam computed tomography system

    International Nuclear Information System (INIS)

    Roberts, D A; Hansen, V N; Poludniowski, G; Evans, P M; Thompson, M G; Niven, A; Seco, J

    2011-01-01

    Computed tomography images have been acquired using an experimental (low atomic number (Z) insert) megavoltage cone-beam imaging system. These images have been compared with standard megavoltage and kilovoltage imaging systems. The experimental system requires a simple modification to the 4 MeV electron beam from an Elekta Precise linac. Low-energy photons are produced in the standard medium-Z electron window and a low-Z carbon electron absorber located after the window. The carbon electron absorber produces photons as well as ensuring that all remaining electrons from the source are removed. A detector sensitive to diagnostic x-ray energies is also employed. Quantitative assessment of cone-beam computed tomography (CBCT) contrast shows that the low-Z imaging system is an order of magnitude or more superior to a standard 6 MV imaging system. CBCT data with the same contrast-to-noise ratio as a kilovoltage imaging system (0.15 cGy) can be obtained in doses of 11 and 244 cGy for the experimental and standard 6 MV systems, respectively. Whilst these doses are high for everyday imaging, qualitative images indicate that kilovoltage like images suitable for patient positioning can be acquired in radiation doses of 1-8 cGy with the experimental low-Z system.
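    Contrast-to-noise ratio, the figure of merit used above to compare the imaging systems, can be computed from two regions of interest in a reconstructed slice. The sketch below uses a synthetic slice and assumed ROI positions.

```python
import numpy as np

def cnr(image, signal_roi, background_roi):
    """Contrast-to-noise ratio between two regions of a reconstructed slice.

    signal_roi / background_roi are (row_slice, col_slice) index pairs.
    """
    sig = image[signal_roi]
    bkg = image[background_roi]
    return abs(sig.mean() - bkg.mean()) / bkg.std()

# Toy slice: a circular "insert" 5 % brighter than a noisy background
rng = np.random.default_rng(3)
slice_ = rng.normal(100.0, 5.0, (128, 128))
yy, xx = np.mgrid[:128, :128]
slice_[(yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2] += 5.0

sig_roi = (slice(55, 74), slice(55, 74))   # inside the bright insert
bkg_roi = (slice(5, 30), slice(5, 30))     # background corner
print(f"CNR = {cnr(slice_, sig_roi, bkg_roi):.2f}")
```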

  13. An exploration of computer game-based instruction in the "world history" class in secondary education: a comparative study in China.

    Science.gov (United States)

    Yu, Zhonggen; Yu, Wei Hua; Fan, Xiaohui; Wang, Xiao

    2014-01-01

    So far, many studies on educational games have been carried out in America and Europe. Very few related empirical studies, however, have been conducted in China. This study, combining both quantitative with qualitative research methods, possibly compensated for this regret. The study compared data collected from two randomly selected classes (out of 13 classes) under computer game-based instruction (CGBI) and non-computer game-based instruction (NCGBI), respectively, in a senior high school located in Nanjing, Capital of Jiangsu Province, in China. The participants were 103 students, composed of 52 boys and 51 girls (aged 17-18 years old). The following conclusion was reached: (1) participants under CGBI obtained significantly greater learning achievement than those under NCGBI; (2) participants were significantly more motivated by CGBI compared with NCGBI; (3) there were no significant differences in learning achievement between boys and girls; although (4) boys were significantly more motivated by CGBI than girls. Both disadvantages and advantages were discussed, together with directions for future research.

  14. River suspended sediment estimation by climatic variables implication: Comparative study among soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal

    2012-06-01

    Estimating sediment volume carried by a river is an important issue in water resources engineering. This paper compares the accuracy of three different soft computing methods, Artificial Neural Networks (ANNs), Adaptive Neuro-Fuzzy Inference System (ANFIS), and Gene Expression Programming (GEP), in estimating daily suspended sediment concentration on rivers by using hydro-meteorological data. The daily rainfall, streamflow and suspended sediment concentration data from Eel River near Dos Rios, at California, USA are used as a case study. The comparison results indicate that the GEP model performs better than the other models in daily suspended sediment concentration estimation for the particular data sets used in this study. Levenberg-Marquardt, conjugate gradient and gradient descent training algorithms were used for the ANN models. Out of three algorithms, the Conjugate gradient algorithm was found to be better than the others.
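    As a rough analogue of the ANN branch of the comparison above, the sketch below fits a small multilayer-perceptron regressor to synthetic rainfall/streamflow/sediment data with scikit-learn. The data-generating formula and network size are assumptions for illustration only; this is not the Eel River data set or the paper's model configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for daily hydro-meteorological inputs: rainfall (mm) and
# streamflow (m^3/s) vs. suspended sediment concentration. The functional form
# is invented purely to make the sketch runnable.
rng = np.random.default_rng(7)
rain = rng.gamma(2.0, 5.0, 1000)
flow = np.clip(10 + 3 * rain + rng.normal(0, 5, 1000), 0.1, None)
sediment = 0.05 * flow ** 1.5 + rng.normal(0, 5, 1000)

X = np.column_stack([rain, flow])
X_tr, X_te, y_tr, y_te = train_test_split(X, sediment, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"test RMSE: {rmse:.2f}")
```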

  15. Analyzing Dental Implant Sites From Cone Beam Computed Tomography Scans on a Tablet Computer: A Comparative Study Between iPad and 3 Display Systems.

    Science.gov (United States)

    Carrasco, Alejandro; Jalali, Elnaz; Dhingra, Ajay; Lurie, Alan; Yadav, Sumit; Tadinada, Aditya

    2017-06-01

    The aim of this study was to compare a medical-grade PACS (picture archiving and communication system) monitor, a consumer-grade monitor, a laptop computer, and a tablet computer for linear measurements of height and width for specific implant sites in the posterior maxilla and mandible, along with visualization of the associated anatomical structures. Cone beam computed tomography (CBCT) scans were evaluated. The images were reviewed using PACS-LCD monitor, consumer-grade LCD monitor using CB-Works software, a 13″ MacBook Pro, and an iPad 4 using OsiriX DICOM reader software. The operators had to identify anatomical structures in each display using a 2-point scale. User experience between PACS and iPad was also evaluated by means of a questionnaire. The measurements were very similar for each device. P-values were all greater than 0.05, indicating no significant difference between the monitors for each measurement. The intraoperator reliability was very high. The user experience was similar in each category with the most significant difference regarding the portability where the PACS display received the lowest score and the iPad received the highest score. The iPad with retina display was comparable with the medical-grade monitor, producing similar measurements and image visualization, and thus providing an inexpensive, portable, and reliable screen to analyze CBCT images in the operating room during the implant surgery.

  16. A comparative study of 2 computer-assisted methods of quantifying brightfield microscopy images.

    Science.gov (United States)

    Tse, George H; Marson, Lorna P

    2013-10-01

    Immunohistochemistry continues to be a powerful tool for the detection of antigens. There are several commercially available software packages that allow image analysis; however, these can be complex, require a relatively high level of computer skills, and can be expensive. We compared 2 commonly available software packages, Adobe Photoshop CS6 and ImageJ, in their ability to quantify percentage positive area after picrosirius red (PSR) staining and 3,3'-diaminobenzidine (DAB) staining. On analysis of DAB-stained B cells in the mouse spleen, with a biotinylated primary rat anti-mouse-B220 antibody, there was no significant difference between converting brightfield images to binary images to measure black and white pixels in ImageJ and measuring a range of brown pixels in Photoshop (Student t test, P=0.243; correlation r=0.985). When analyzing mouse kidney allografts stained with PSR, Photoshop achieved a greater interquartile range while maintaining a lower 10th percentile value compared with analysis with ImageJ. A lower 10th percentile reflects that Photoshop analysis is better at analyzing tissues with low levels of positive pixels, which is particularly relevant for control tissues or negative controls, whereas after ImageJ analysis the same images would result in spuriously high levels of positivity. Furthermore, comparing the 2 methods by Bland-Altman plot revealed that they did not agree when measuring images with a higher percentage of positive staining, and correlation was poor (r=0.804). We conclude that for computer-assisted analysis of images of DAB-stained tissue there is no difference between using Photoshop or ImageJ. However, for analysis of color images where differentiation into a binary pattern is not easy, such as with PSR, Photoshop is superior at identifying higher levels of positivity while maintaining differentiation of low levels of positive staining.
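    The quantity being compared above, percentage positive area, reduces to counting pixels that fall inside a stain-specific colour range. The sketch below shows a simple NumPy version with placeholder colour bounds; real analyses (and both packages discussed above) use more careful colour selection.

```python
import numpy as np

def percent_positive(rgb, lo=(60, 20, 0), hi=(200, 140, 90)):
    """Percentage of pixels whose RGB values fall inside a 'brown' DAB range.

    The colour bounds are arbitrary placeholders; real analyses would tune
    them (or use colour deconvolution) for the stain in question.
    """
    mask = np.all((rgb >= lo) & (rgb <= hi), axis=-1)
    return 100.0 * mask.mean()

# Toy image: roughly 20 % of pixels painted a brown-ish colour, the rest near-white
rng = np.random.default_rng(5)
img = np.full((100, 100, 3), 240, dtype=np.uint8)
pos = rng.random((100, 100)) < 0.2
img[pos] = (130, 80, 40)

print(f"positive area: {percent_positive(img):.1f} %")
```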

  17. Quantified measurement of brain blood volume: comparative evaluations between the single photon emission computer tomography and the positron computer tomography

    International Nuclear Information System (INIS)

    Bouvard, G.; Fernandez, Y.; Petit-Taboue, M.C.; Derlon, J.M.; Travere, J.M.; Le Poec, C.

    1991-01-01

    Quantified measurement of cerebral blood volume is of interest for studies of the cerebral blood circulation. This measurement is often performed with positron computed tomography. It is more difficult in single photon emission computed tomography, where there are physical problems related to the limited resolution of the detector, the Compton effect and photon attenuation. The objective of this study is to compare the results of these two techniques. Quantified measurement of brain blood volume is possible with single photon emission computed tomography; however, there is a loss of contrast.

  18. Comparative study of auxetic geometries by means of computer-aided design and engineering

    International Nuclear Information System (INIS)

    Álvarez Elipe, Juan Carlos; Díaz Lantada, Andrés

    2012-01-01

    Auxetic materials (or metamaterials) are those with a negative Poisson ratio (NPR) and display the unexpected property of lateral expansion when stretched, as well as an equal and opposing densification when compressed. Such geometries are being progressively employed in the development of novel products, especially in the fields of intelligent expandable actuators, shape morphing structures and minimally invasive implantable devices. Although several auxetic and potentially auxetic geometries have been summarized in previous reviews and research, precise information regarding relevant properties for design tasks is not always provided. In this study we present a comparative study of two-dimensional and three-dimensional auxetic geometries carried out by means of computer-aided design and engineering tools (from now on CAD–CAE). The first part of the study is focused on the development of a CAD library of auxetics. Once the library is developed we simulate the behavior of the different auxetic geometries and elaborate a systematic comparison, considering relevant properties of these geometries, such as Poisson ratio(s), maximum volume or area reductions attainable and equivalent Young’s modulus, hoping it may provide useful information for future designs of devices based on these interesting structures. (paper)

  19. Analyze image quality and comparative study between conventional and computed radiography applied to the inspection of alloys

    International Nuclear Information System (INIS)

    Machado, Alessandra S.; Oliveira, Davi F.; Silva, Aline S.S.; Nascimento, Joseilson R.; Lopes, Ricardo T.

    2011-01-01

    Piping system design takes into account relevant factors such as internal coating, dimensioning, vibration, adequate supports and, principally, piping material. Cost is a decisive factor in the phase of material selection. The non-destructive testing method most commonly employed in industry to analyze the structure of an object is radiographic testing. Computed radiography (CR) is a quicker and much more efficient alternative to conventional radiography but, although CR presents numerous advantages, testing procedures are still largely based on trial and error, due to the lack of an established methodology for choosing parameters such as exists for conventional radiography. Notwithstanding, this paper presents a study that uses the technique of computed radiography to analyze metal alloys. These metal alloys are used as internal pipe coatings to protect against corrosion and cracks. This study seeks to evaluate parameters such as basic spatial resolution, Normalized Signal-to-Noise Ratio (SNRN), contrast, and intensity, and also to compare conventional radiography with CR. (author)

  20. Estimation Methods of the Point Spread Function Axial Position: A Comparative Computational Study

    Directory of Open Access Journals (Sweden)

    Javier Eduardo Diaz Zamboni

    2017-01-01

    Full Text Available The precise knowledge of the point spread function is central for any imaging system characterization. In fluorescence microscopy, point spread function (PSF) determination has become a common and obligatory task for each new experimental device, mainly due to its strong dependence on acquisition conditions. During the last decade, algorithms have been developed for the precise calculation of the PSF, which fit model parameters that describe image formation on the microscope to experimental data. In order to contribute to this subject, a comparative study of three parameter estimation methods is reported, namely: I-divergence minimization (MIDIV), maximum likelihood (ML) and non-linear least squares (LSQR). They were applied to the estimation of the point source position on the optical axis, using a physical model. Methods’ performance was evaluated under different conditions and noise levels using synthetic images and considering success percentage, iteration number, computation time, accuracy and precision. The main results showed that the axial position estimation requires a high SNR to achieve an acceptable success level and higher still to be close to the estimation error lower bound. ML achieved a higher success percentage at lower SNR compared to MIDIV and LSQR with an intrinsic noise source. Only the ML and MIDIV methods achieved the error lower bound, but only with data belonging to the optical axis and high SNR. Extrinsic noise sources worsened the success percentage, but no difference was found between noise sources for the same method for all methods studied.
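    A minimal example of the least-squares flavour of axial-position estimation is sketched below. The Gaussian axial profile is a toy stand-in, not the physical PSF model used in the paper, and the parameters are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit

def axial_profile(z, z0, sigma, amplitude, offset):
    """Toy axial intensity model: a Gaussian centred at the source position z0."""
    return offset + amplitude * np.exp(-0.5 * ((z - z0) / sigma) ** 2)

# Synthetic noisy profile with the true source at z0 = 1.3 (arbitrary units)
rng = np.random.default_rng(11)
z = np.linspace(-5, 5, 81)
truth = (1.3, 1.1, 100.0, 10.0)
data = axial_profile(z, *truth) + rng.normal(0, 3.0, z.size)

# Non-linear least-squares fit of the axial position and the nuisance parameters
popt, pcov = curve_fit(axial_profile, z, data, p0=(0.0, 1.0, 80.0, 5.0))
z0_hat, z0_err = popt[0], np.sqrt(pcov[0, 0])
print(f"estimated axial position: {z0_hat:.3f} +/- {z0_err:.3f} (true 1.300)")
```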

  1. The comparative study on diagnostic validity of cerebral aneurysm by computed tomography angiography versus digital subtraction angiography after subarachnoid hemorrhage

    Directory of Open Access Journals (Sweden)

    Masih Saboori

    2011-01-01

    Full Text Available Background: In order to determine the preoperative diagnostic value of two radiological modalities for brain aneurysms, computed tomographic angiography and digital subtraction angiography were compared. Methods: In this descriptive analytic study, the diagnostic value of computed tomographic angiography (CTA) was compared with that of digital subtraction angiography (DSA). Sensitivity, specificity, and positive and negative predictive values were calculated and compared between the two modalities. All data were analyzed with SPSS software, version 16. Results: The mean age of the patients was 49.5 ± 9.13 years, and 57.9% of subjects were female. CTA showed 89% sensitivity and 100% specificity, whereas DSA demonstrated 74% sensitivity and 100% specificity. The positive predictive value of both methods was 100%, but the negative predictive values of CTA and DSA were 85% and 69%, respectively. Conclusions: Based on our data, CTA is a valuable diagnostic modality for the detection of brain aneurysm and subarachnoid hemorrhage.
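    The four indices reported above follow directly from a 2x2 contingency table. The sketch below computes them from hypothetical counts; the abstract reports only the resulting percentages, not the underlying tables.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2-table metrics of the kind used to compare CTA and DSA."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only; the underlying 2x2 tables are not given in the abstract.
print({k: round(v, 2) for k, v in diagnostic_metrics(tp=34, fp=0, fn=4, tn=20).items()})
```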

  2. Comparing the social skills of students addicted to computer games with normal students.

    Science.gov (United States)

    Zamani, Eshrat; Kheradmand, Ali; Cheshmi, Maliheh; Abedi, Ahmad; Hedayati, Nasim

    2010-01-01

    This study aimed to investigate and compare the social skills of students addicted to computer games with normal students. The dependent variable in the present study is the social skills. The study population included all the students in the second grade of public secondary school in the city of Isfahan in the educational year 2009-2010. The sample size included 564 students selected using the cluster random sampling method. Data collection was conducted using the Questionnaire of Addiction to Computer Games and the Social Skills Questionnaire (The Teenage Inventory of Social Skills, or TISS). The results of the study showed that, generally, there was a significant difference between the social skills of students addicted to computer games and normal students. In addition, the results indicated that normal students had a higher level of social skills in comparison with students addicted to computer games. As the study results showed, addiction to computer games may affect the quality and quantity of social skills. In other words, the higher the addiction to computer games, the lower the social skills. Individuals addicted to computer games have fewer social skills.

  3. An Exploration of Computer Game-Based Instruction in the “World History” Class in Secondary Education: A Comparative Study in China

    Science.gov (United States)

    Yu, Zhonggen; Yu, Wei Hua; Fan, Xiaohui; Wang, Xiao

    2014-01-01

    So far, many studies on educational games have been carried out in America and Europe, but very few related empirical studies have been conducted in China. This study, combining quantitative and qualitative research methods, helps to fill this gap. The study compared data collected from two randomly selected classes (out of 13 classes) under computer game-based instruction (CGBI) and non-computer game-based instruction (NCGBI), respectively, in a senior high school located in Nanjing, capital of Jiangsu Province, China. The participants were 103 students, 52 boys and 51 girls (aged 17-18 years). The following conclusions were reached: (1) participants under CGBI obtained significantly greater learning achievement than those under NCGBI; (2) participants were significantly more motivated by CGBI than by NCGBI; (3) there were no significant differences in learning achievement between boys and girls; although (4) boys were significantly more motivated by CGBI than girls. Both disadvantages and advantages were discussed, together with directions for future research. PMID:24816635

  4. A comparative study of attenuation correction algorithms in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Murase, Kenya; Itoh, Hisao; Mogami, Hiroshi; Ishine, Masashiro; Kawamura, Masashi; Iio, Atsushi; Hamamoto, Ken

    1987-01-01

    A computer-based simulation method was developed to assess the relative effectiveness and availability of various attenuation compensation algorithms in single photon emission computed tomography (SPECT). The effects of non-uniformity of the attenuation coefficient distribution in the body, of errors in determining the body contour, and of statistical noise on reconstruction accuracy and computation time were studied for each algorithm. The algorithms were classified into three groups: pre-correction, post-correction and iterative correction methods. Furthermore, a hybrid method was devised by combining several methods. This study will be useful for understanding the characteristics, limitations and strengths of the algorithms and for searching for a practical correction method for photon attenuation in SPECT. (orig.)
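
    As one hedged illustration of the post-correction group mentioned above, the sketch below implements a first-order, Chang-style multiplicative correction, assuming a uniform attenuation coefficient and a circular body contour. This is only a toy example: the study also covers pre-correction and iterative schemes as well as non-uniform attenuation, which this sketch does not address.

```python
# Hedged sketch of a first-order post-reconstruction attenuation correction
# (in the spirit of Chang's method) for SPECT, assuming uniform attenuation
# and a circular contour centered in the image.
import numpy as np

def chang_correction_factors(shape, radius_pix, mu_per_pix, n_angles=64):
    """Return a multiplicative correction map for a circular, uniform object."""
    ny, nx = shape
    y, x = np.mgrid[0:ny, 0:nx]
    x = x - (nx - 1) / 2.0
    y = y - (ny - 1) / 2.0
    inside = x**2 + y**2 <= radius_pix**2

    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    atten_sum = np.zeros(shape)
    for theta in angles:
        dx, dy = np.cos(theta), np.sin(theta)
        # Path length from each pixel to the circular contour along (dx, dy).
        b = x * dx + y * dy
        disc = np.maximum(b**2 + radius_pix**2 - (x**2 + y**2), 0.0)
        path = -b + np.sqrt(disc)
        atten_sum += np.exp(-mu_per_pix * path)

    # Correction = 1 / (average attenuation factor over all angles).
    correction = np.ones(shape)
    correction[inside] = n_angles / atten_sum[inside]
    return correction

# Usage: multiply an uncorrected reconstruction by the correction map.
recon = np.random.rand(128, 128)                 # placeholder reconstruction
corr = chang_correction_factors(recon.shape, radius_pix=50, mu_per_pix=0.03)
corrected = recon * corr
```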

  5. Comparative study between computed tomography and bronchoscopy in the diagnosis of lung cancer

    International Nuclear Information System (INIS)

    Oliveira, Christopher; Saraiva, Antonio

    2010-01-01

    Objective: to analyze the role of computed tomography and bronchoscopy in the diagnosis of lung cancer, evaluating the effectiveness of these techniques in the presence of this disease. Parameters such as age, gender, smoking habits, histological types, staging and treatment were also analyzed. Materials and methods: the sample of the present study included 70 patients assisted at the Department of Pneumology of Hospital Distrital da Figueira da Foz, Coimbra, Portugal, who were submitted to both diagnostic methods, namely computed tomography and bronchoscopy, to confirm the presence or absence of lung cancer. Results: thirty-seven patients (23 men and 14 women) were diagnosed with lung cancer. Histologically, 40.54% were adenocarcinomas, followed by squamous carcinoma (32.43% of cases) and small-cell lung cancer (18.92%). Staging showed 6.70% stage IB disease, 23.30% stage IIIA, 36.70% stage IIIB, and 33.30% stage IV. Chemotherapy alone was the first treatment of choice for 75.7% of patients. Bronchoscopy sensitivity was 83.8%, specificity 81.8%, and accuracy 82.8%. Computed tomography sensitivity was 81.1%, specificity 63.6%, and accuracy 72.8%. Conclusion: the bronchoscopy results corroborated the relevance of the method in the diagnosis of lung cancer, considering its dependence on the anatomopathological study of tissue or cells obtained through different biopsy techniques. Computed tomography presented good sensitivity (81.1%); however, its specificity of only 63.6% is related to the rate of false-positive results (36.4%). (author)

  6. Comparative study between computed tomography and bronchoscopy in the diagnosis of lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Christopher; Saraiva, Antonio, E-mail: asaraiva@estescoimbra.p [Escola Superior de Tecnologia da Saude de Coimbra (ESTeSC), Coimbra (Portugal)

    2010-07-15

    Objective: to analyze the role of computed tomography and bronchoscopy in the diagnosis of lung cancer, evaluating the effectiveness of these techniques in the presence of this disease. Parameters such as age, gender, smoking habits, histological types, staging and treatment were also analyzed. Materials and methods: the sample of the present study included 70 patients assisted at the Department of Pneumology of Hospital Distrital da Figueira da Foz, Coimbra, Portugal, who were submitted to both diagnostic methods, namely computed tomography and bronchoscopy, to confirm the presence or absence of lung cancer. Results: thirty-seven patients (23 men and 14 women) were diagnosed with lung cancer. Histologically, 40.54% were adenocarcinomas, followed by squamous carcinoma (32.43% of cases) and small-cell lung cancer (18.92%). Staging showed 6.70% stage IB disease, 23.30% stage IIIA, 36.70% stage IIIB, and 33.30% stage IV. Chemotherapy alone was the first treatment of choice for 75.7% of patients. Bronchoscopy sensitivity was 83.8%, specificity 81.8%, and accuracy 82.8%. Computed tomography sensitivity was 81.1%, specificity 63.6%, and accuracy 72.8%. Conclusion: the bronchoscopy results corroborated the relevance of the method in the diagnosis of lung cancer, considering its dependence on the anatomopathological study of tissue or cells obtained through different biopsy techniques. Computed tomography presented good sensitivity (81.1%); however, its specificity of only 63.6% is related to the rate of false-positive results (36.4%). (author)

  7. Wind turbine power coefficient estimation by soft computing methodologies: Comparative study

    International Nuclear Information System (INIS)

    Shamshirband, Shahaboddin; Petković, Dalibor; Saboohi, Hadi; Anuar, Nor Badrul; Inayat, Irum; Akib, Shatirah; Ćojbašić, Žarko; Nikolić, Vlastimir; Mat Kiah, Miss Laiha; Gani, Abdullah

    2014-01-01

    Highlights: • Variable speed operation of wind turbines to increase power generation. • The changeability and fluctuation of wind has to be accounted for. • To build an effective prediction model of the wind turbine power coefficient. • The impact of the variation in the blade pitch angle and tip speed ratio. • Support vector regression applied as the predictive methodology. - Abstract: Wind energy has become a strong contender to traditional fossil fuel energy, particularly with the successful operation of multi-megawatt sized wind turbines. However, reasonably sustained wind speeds are not available everywhere to build an economical wind farm. In wind energy conversion systems, one of the operational problems is the changeability and fluctuation of wind. In most cases, wind speed can vacillate rapidly. Hence, the quality of produced energy becomes an important problem in wind energy conversion plants. Several control techniques have been applied to improve the quality of power generated from wind turbines. In this study, the polynomial and radial basis function (RBF) kernels are applied in support vector regression (SVR) to estimate the optimal power coefficient value of the wind turbines. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the SVR approach compared to other soft computing methodologies.
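
    A minimal sketch of the kernel comparison described above, using scikit-learn's SVR with polynomial and RBF kernels to map tip speed ratio and blade pitch angle to the power coefficient. The training data are generated from a commonly used empirical Cp approximation, not from the study's turbine measurements, so the numbers are purely illustrative.

```python
# Hedged sketch: estimating the wind-turbine power coefficient Cp from tip
# speed ratio and blade pitch angle with SVR, using polynomial and RBF kernels.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def cp_empirical(tsr, pitch):
    """Textbook-style empirical power-coefficient curve (illustrative only)."""
    inv_li = 1.0 / (tsr + 0.08 * pitch) - 0.035 / (pitch**3 + 1.0)
    return 0.5176 * (116.0 * inv_li - 0.4 * pitch - 5.0) * np.exp(-21.0 * inv_li) \
        + 0.0068 * tsr

rng = np.random.default_rng(1)
tsr = rng.uniform(2.0, 12.0, 500)
pitch = rng.uniform(0.0, 10.0, 500)
X = np.column_stack([tsr, pitch])
y = cp_empirical(tsr, pitch)

models = {
    "SVR_poly": make_pipeline(StandardScaler(), SVR(kernel="poly", degree=3, C=10.0)),
    "SVR_rbf": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, gamma="scale")),
}
for name, model in models.items():
    model.fit(X[:400], y[:400])
    score = model.score(X[400:], y[400:])          # R^2 on held-out samples
    print(f"{name}: R^2 = {score:.3f}")
```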

  8. COMPARATIVE STUDY OF TERTIARY WASTEWATER TREATMENT BY COMPUTER SIMULATION

    OpenAIRE

    Stefania Iordache; Nicolae Petrescu; Cornel Ianache

    2010-01-01

    The aim of this work is to assess the conditions for implementation of a Biological Nutrient Removal (BNR) process in the Wastewater Treatment Plant (WWTP) of Moreni city (Romania). In order to meet increasingly strict environmental regulations, the wastewater treatment plant under study must update its current treatment process and modernize it. A comparative study was undertaken of the quality of effluents that could be obtained by implementation of biological nutrient removal process li...

  9. VISTA - computational tools for comparative genomics

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.

  10. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

    Full Text Available Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g., mode of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. In addition, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between high- and low-computer-familiar groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.
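
    The two comparisons reported above rest on standard t-tests. The sketch below shows both variants with scipy on simulated score vectors; the score distributions and sample sizes are assumptions for the example, not the study's data.

```python
# Sketch of the tests named in the abstract: an independent-samples t-test for
# two separate groups and a paired-samples t-test for the same learners tested
# twice. Scores below are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
paper_scores = rng.normal(14.0, 3.0, 40)       # paper-based writing scores
computer_scores = rng.normal(14.3, 3.0, 40)    # computer-based writing scores

# Independent groups (e.g. different learners per delivery mode).
t_ind, p_ind = stats.ttest_ind(paper_scores, computer_scores)

# Paired design (the same learners took both versions).
t_rel, p_rel = stats.ttest_rel(paper_scores, computer_scores)

print(f"independent: t={t_ind:.2f}, p={p_ind:.3f}")
print(f"paired:      t={t_rel:.2f}, p={p_rel:.3f}")
```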

  11. An observer study comparing spot imaging regions selected by radiologists and a computer for an automated stereo spot mammography technique

    International Nuclear Information System (INIS)

    Goodsitt, Mitchell M.; Chan, Heang-Ping; Lydick, Justin T.; Gandra, Chaitanya R.; Chen, Nelson G.; Helvie, Mark A.; Bailey, Janet E.; Roubidoux, Marilyn A.; Paramagul, Chintana; Blane, Caroline E.; Sahiner, Berkman; Petrick, Nicholas A.

    2004-01-01

    We are developing an automated stereo spot mammography technique for improved imaging of suspicious dense regions within digital mammograms. The technique entails the acquisition of a full-field digital mammogram, automated detection of a suspicious dense region within that mammogram by a computer-aided detection (CAD) program, and acquisition of a stereo pair of images with automated collimation to the suspicious region. The latter stereo spot image is obtained within seconds of the original full-field mammogram, without releasing the compression paddle. The spot image is viewed on a stereo video display. A critical element of this technique is the automated detection of suspicious regions for spot imaging. We performed an observer study to compare the suspicious regions selected by radiologists with those selected by a CAD program developed at the University of Michigan. True regions of interest (TROIs) were separately determined by one of the radiologists who reviewed the original mammograms, biopsy images, and histology results. We compared the radiologist- and computer-selected regions of interest (ROIs) to the TROIs. Both the radiologists and the computer were allowed to select up to 3 regions in each of 200 images (mixture of 100 CC and 100 MLO views). We computed overlap indices (the overlap index is defined as the ratio of the area of intersection to the area of interest) to quantify the agreement between the selected regions in each image. The averages of the largest overlap indices per image for the 5 radiologist-to-computer comparisons were directly related to the average number of regions per image traced by the radiologists (about 50% for 1 region/image, 84% for 2 regions/image and 96% for 3 regions/image). The average of the overlap indices with all of the TROIs was 73% for CAD and 76.8% ± 10.0% for the radiologists. This study indicates that the CAD-determined ROIs could potentially be useful for a screening technique that includes stereo spot…
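
    The overlap index defined above is straightforward to compute once the ROIs are represented as binary masks. The sketch below uses hypothetical rectangular masks; which region counts as the "area of interest" in the denominator follows the definition chosen by the authors, so the function simply takes a reference ROI for that role.

```python
# Sketch of the overlap index: area of intersection divided by the area of the
# region of interest being evaluated. ROIs are boolean masks; the rectangles
# are illustrative placeholders.
import numpy as np

def overlap_index(candidate_roi, reference_roi):
    """Fraction of the reference ROI covered by the candidate ROI."""
    intersection = np.logical_and(candidate_roi, reference_roi).sum()
    return intersection / reference_roi.sum()

image_shape = (256, 256)
computer_roi = np.zeros(image_shape, dtype=bool)
true_roi = np.zeros(image_shape, dtype=bool)
computer_roi[100:180, 90:170] = True     # CAD-selected region (hypothetical)
true_roi[110:190, 100:180] = True        # radiologist-confirmed TROI (hypothetical)

print(f"overlap index: {overlap_index(computer_roi, true_roi):.2f}")
```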

  12. A Comparative Computational Fluid Dynamics Study on an Innovative Exhaust Air Energy Recovery Wind Turbine Generator

    Directory of Open Access Journals (Sweden)

    Seyedsaeed Tabatabaeikia

    2016-05-01

    Full Text Available Recovering energy from the exhaust air systems of building cooling towers is an innovative idea. A specific wind turbine generator was designed in order to achieve this goal. This device consists of two Giromill vertical axis wind turbines (VAWTs) combined with four guide vanes and two diffuser plates. It was clear from the previous literature that no comprehensive flow behavior study had been carried out on this innovative device. Therefore, the working principle of this design was simulated using the ANSYS Fluent computational fluid dynamics (CFD) package and the results were compared to experimental ones. It was perceived from the results that by introducing the diffusers and then the guide vanes, the overall power output of the wind turbine was improved by approximately 5% and 34%, respectively, compared to using the VAWT alone. In the case of the diffusers, the optimum angle was found to be 7°, while for guide vanes A and B it was 70° and 60°, respectively. These results were in good agreement with the results obtained in the previous experimental study. Overall, it can be concluded that exhaust air recovery turbines are a promising form of green technology.

  13. Comparing Computer Game and Traditional Lecture Using Experience Ratings from High and Low Achieving Students

    Science.gov (United States)

    Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David

    2012-01-01

    Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction was computer game based compared to lecture based, and whether experiences differed between high and low achieving students. Participants…

  14. Measurements by activation foils and comparative computations by MCNP code

    International Nuclear Information System (INIS)

    Kyncl, J.

    2008-01-01

    Systematic study of the problem of radioactive waste minimisation is the subject of the SPHINX project. Its idea is that the burning or transmutation of the problematic part of the waste inventory will be realized in a nuclear reactor whose fuel is in the form of liquid fluorides. Within the framework of the project, several experiments have been performed with a so-called inserted experimental channel. The channel was filled with the fluoride mixture, surrounded by six fuel assemblies with moderator, and placed into the LR-0 reactor vessel. This formation was brought to a critical state and measurements with activation foil detectors were carried out at selected positions of the inserted channel. The main aim of the measurements was to determine the reaction rates for the detectors mentioned. For the evaluation of the experiment, comparative computations were performed with the MCNP4a code. The results obtained show that, quite often, the computed values of the reaction rates differ substantially from the values obtained from the experiment. This contribution analyses the reasons for these differences from the point of view of Monte Carlo computation. The analysis of concrete cases shows that the inaccuracy of the computed reaction rate is caused mostly by three circumstances: (i) the space region occupied by the detector is relatively very small; (ii) the microscopic effective cross-section R(E) of the reaction changes strongly with energy precisely in the energy interval that gives the greatest contribution to the reaction; and (iii) in the energy interval that gives the greatest contribution to the reaction rate, the error of the computed neutron flux is large. These circumstances mean that computing the reaction rate with the usual accuracy places extreme demands on computing time. (Author)
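
    The quantity both the foil measurement and the MCNP calculation target is the reaction rate per target nucleus, R = ∫ σ(E) φ(E) dE, evaluated at the detector position. The sketch below integrates a placeholder 1/v-style cross-section against an arbitrary flux spectrum; real activation cross-sections vary far more sharply with energy, which is precisely circumstance (ii) above.

```python
# Hedged numerical sketch of the reaction rate per target nucleus,
# R = integral of sigma(E) * phi(E) dE. The cross-section and flux shapes are
# smooth placeholders, not data from the SPHINX experiment or MCNP.
import numpy as np

energy = np.logspace(-9, 1, 2000)                 # energy grid in MeV

# Placeholder 1/v-like capture cross-section (barns) and an arbitrary spectrum.
sigma_barn = 0.5 / np.sqrt(energy / 2.53e-8)      # ~0.5 b at thermal energy
flux = energy * np.exp(-energy / 1.0e-7) + 1e-3 * np.exp(-energy)

# Trapezoidal integration; convert barns to cm^2 (1 b = 1e-24 cm^2).
integrand = sigma_barn * 1e-24 * flux
reaction_rate = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(energy))
print(f"reaction rate per target nucleus: {reaction_rate:.3e} (placeholder units)")
```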

  15. Evolutionary computation techniques: a comparative perspective

    CERN Document Server

    Cuevas, Erik; Oliva, Diego

    2017-01-01

    This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and then a particular EC technique is presented as the best choice, according to its search characteristics. Lastly, a set of experiments is conducted in order to compare its performance to other popular EC methods.

  16. A Comparative Study of Load Balancing Algorithms in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Cloud Computing is a new trend emerging in the IT environment with huge requirements of infrastructure and resources. Load balancing is an important aspect of the cloud computing environment. An efficient load balancing scheme ensures efficient resource utilization by provisioning resources to cloud users on an on-demand basis in a pay-as-you-go manner. Load balancing may even support prioritizing users by applying appropriate scheduling criteria. This paper presents various load balancing schemes in differ...
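
    As a minimal illustration of the kind of schemes such surveys compare, the sketch below contrasts round-robin dispatch with least-loaded assignment on a toy set of task costs; the task sizes and VM count are invented for the example and are not taken from the paper.

```python
# Minimal sketch contrasting two classic load-balancing policies:
# round robin versus least-loaded (greedy) assignment.
import itertools

tasks = [9, 3, 2, 8, 1, 5, 7, 4]      # hypothetical task costs
num_vms = 3

# Round robin: tasks are dealt out in a fixed cycle, ignoring current load.
rr_load = [0] * num_vms
for task, vm in zip(tasks, itertools.cycle(range(num_vms))):
    rr_load[vm] += task

# Least-loaded: each task goes to the VM with the smallest accumulated load.
ll_load = [0] * num_vms
for task in tasks:
    ll_load[ll_load.index(min(ll_load))] += task

print("round robin load per VM: ", rr_load)
print("least-loaded load per VM:", ll_load)
```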

  17. Comparative micro computed tomography study of a vertebral body

    Science.gov (United States)

    Drews, Susanne; Beckmann, Felix; Herzen, Julia; Brunke, Oliver; Salmon, Phil; Friess, Sebastian; Laib, Andres; Koller, Bruno; Hemberger, Thomas; Müller-Gerbl, Magdalena; Müller, Bert

    2008-08-01

    Investigations of bony tissues are often performed using micro computed tomography (μCT) based on X-rays, since the calcium distribution leads to superior contrast. Osteoporotic bone, for example, can be well compared with healthy bone with respect to density and morphology. Degenerative and rheumatoid diseases, however, usually start at the bone-cartilage interface, which is hardly accessible. The direct influence on the bone itself only becomes visible at a later stage. For the development of suitable therapies against degenerative cartilage damage, the exact three-dimensional description of the bone-cartilage interface is vital, as demonstrated for transplanted cartilage cells or bone-cartilage constructs in animal models. So far, morphological characterization has been restricted to magnetic resonance imaging (MRI), with poor spatial resolution, or to time-consuming histological sectioning, with appropriate spatial resolution only in two rather arbitrarily chosen directions. Therefore, μCT should be developed to extract the features of low-absorbing cartilage. The morphology and the volume of the intervertebral cartilage disc of lumbar motion segments have been determined for one PMMA-embedded specimen. Tomograms were recorded using a nanotom® (Phoenix|x-ray, Wunstorf, Germany), a μCT 35™ (Scanco Medical, Brütisellen, Switzerland), a 1172™ and a 1174™ (both Skyscan, Kontich, Belgium), as well as using the SRμCT at HASYLAB/DESY. Both conventional μCT and SRμCT can provide the morphology and the volume of cartilage between bones. With increasing acquisition time the signal-to-noise ratio improves, but the prominent artifacts in conventional μCT, resulting from inhomogeneously distributed bony tissue, prevent the exact segmentation of cartilage. SRμCT allows the cartilage to be segmented, but requires long periods of expensive beam time to obtain reasonable contrast.

  18. In vitro comparative study of manual and mechanical rotary instrumentation of root canals using computed tomography.

    Science.gov (United States)

    Limongi, Orlando; de Albuquerque, Diana Santana; Baratto Filho, Flares; Vanni, José Roberto; de Oliveira, Elias P Motcy; Barletta, Fernando Branco

    2007-01-01

    This in vitro study compared, using computed tomography (CT), the amount of dentin removed from root canal walls by manual and mechanical rotary instrumentation techniques. Forty mandibular incisors with dental crowns and a single canal were selected. The teeth were randomly assigned to two groups, according to the technique used for root canal preparation: Group I - manual instrumentation with stainless steel files; Group II - mechanical instrumentation with RaCe rotary nickel-titanium instruments. In each tooth, root dentin thickness of the buccal, lingual, mesial and distal surfaces in the apical, middle and cervical thirds of the canal was measured (in mm) using a multislice CT scanner (Siemens Emotion, Duo). Data were stored in SPSS v. 11.5 and SigmaPlot 2001 v. 7.101 software. After crown opening, working length was determined, root canals were instrumented and new CT scans were taken for assessment of root dentin thickness. Pre- and post-instrumentation data were compared and analyzed statistically by ANOVA and Tukey's post-hoc test for significant differences (p=0.05). Based on the findings of this study, it may be concluded that, regarding dentin removal from root canal walls during instrumentation, neither of the techniques can be considered more effective than the other.

  19. Color doppler findings of gastric varices compared with findings on computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Takahiro; Yamazaki, Katsu; Toyota, Jouji; Karino, Yoshiyasu; Ohmura, Takumi; Suga, Toshihiro [Sapporo Kosei General Hospital (Japan)

    2002-08-01

    The aim of this study was to evaluate the hemodynamics of gastric varices. We evaluated the detection rates of gastric varices, inflowing vessels to gastric varices, and outflowing vessels from gastric varices in 24 patients with gastric varices, using color Doppler sonography, and compared these findings with computed tomography findings. Eighteen patients had F2-type varices and 6 had F3-type, classified according to the Japanese Research Society for Portal Hypertension. Fourteen patients had fundal varices, and 10 had cardiac and fundal varices. The detection rates of collateral veins using color Doppler sonography were as follows: gastric varices were detected in all 24 patients (100%); inflowing vessels, in 21 of the 24 patients (87.5%); and outflowing vessels, in 18 of the 24 patients (75.0%). The detection rates of collateral veins, using computed tomography, were: gastric varices were detected in all 24 patients (100%); inflowing vessels, in all 24 patients (100%); and outflowing vessels, in 21 of the 24 patients (87.5%). The color Doppler findings agreed perfectly with the computed tomography findings in 13 of the 24 patients (54.2%). Although color Doppler sonography is a useful, noninvasive modality for evaluating the hemodynamics of gastric varices, it falls short in visualizing the detailed hemodynamics of the inflowing and outflowing vessels of gastric varices in half of the patients when compared with computed tomography. (author)

  20. Computer-Aided Modelling and Analysis of PV Systems: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Charalambos Koukouvaos

    2014-01-01

    Full Text Available Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, giving them a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. In order to cope with this kind of problem, appropriate software tools have been developed, either as standalone products or as parts of general purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of such software tools may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely Simulink and LabVIEW, with regard to their application in photovoltaic systems.

  1. Comparative evaluation of computed tomography for dental implants on the mandibular edentulous area

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Kyung Hoon; Jeong, Ho Gul; Park, Hyok; Park, Chang Seo; Kim, Kee Deog [Department of Oral and Maxillofacial Radiology, Oral Science Research Center, College of Dentistry, Yonsei University, Seoul (Korea, Republic of)

    2009-03-15

    The purpose of this study was to evaluate the clinical usefulness of the recently developed multi-detector computed tomography and cone beam computed tomography in pre-operative implant evaluation, by comparing them with single detector computed tomography, already confirmed for accuracy in this area. Five partially edentulous dry human mandibles, with 1 × 1 mm gutta percha cones placed at 5 mm intervals posterior to the mental foramen on each side of the buccal part of the mandible, were used in this study. They were scanned as follows: 1) single detector computed tomography: slice thickness 1 mm, 200 mA, 120 kV; 2) multi-detector computed tomography: slice thickness 0.75 mm, 250 mA, 120 kV; 3) cone beam computed tomography: 15 mAs, 120 kV. Axial images acquired from the three computed tomography scanners were transferred to a personal computer, and then reformatted cross-sectional images were generated using V-Implant 2.0 (CyberMed Inc., Seoul, Korea) software. Among the cross-sectional images of the gutta percha cone placed in the buccal body of the mandible, the most precise cross section was selected as the measuring point, and the distance from the most superior border of the mandibular canal to the alveolar crest was measured and analyzed 10 times by a dentist. There were no significant intraobserver differences in the distance from the most superior border of the mandibular canal to the alveolar crest (p>0.05). There were no significant differences among single detector computed tomography, multi-detector computed tomography and cone beam computed tomography in the distance from the most superior border of the mandibular canal to the alveolar crest (p>0.05). Multi-detector computed tomography and cone beam computed tomography are clinically useful in the evaluation of the pre-operative site for mandibular dental implants, with consideration for radiation exposure dose and scanning time.

  2. Comparative evaluation of computed tomography for dental implants on the mandibular edentulous area

    International Nuclear Information System (INIS)

    Sun, Kyung Hoon; Jeong, Ho Gul; Park, Hyok; Park, Chang Seo; Kim, Kee Deog

    2009-01-01

    The purpose of this study was to evaluate the clinical usefulness of the recently developed multi-detector computed tomography and cone beam computed tomography in pre-operative implant evaluation, by comparing them with single detector computed tomography, already confirmed for accuracy in this area. Five partially edentulous dry human mandibles, with 1 × 1 mm gutta percha cones placed at 5 mm intervals posterior to the mental foramen on each side of the buccal part of the mandible, were used in this study. They were scanned as follows: 1) single detector computed tomography: slice thickness 1 mm, 200 mA, 120 kV; 2) multi-detector computed tomography: slice thickness 0.75 mm, 250 mA, 120 kV; 3) cone beam computed tomography: 15 mAs, 120 kV. Axial images acquired from the three computed tomography scanners were transferred to a personal computer, and then reformatted cross-sectional images were generated using V-Implant 2.0 (CyberMed Inc., Seoul, Korea) software. Among the cross-sectional images of the gutta percha cone placed in the buccal body of the mandible, the most precise cross section was selected as the measuring point, and the distance from the most superior border of the mandibular canal to the alveolar crest was measured and analyzed 10 times by a dentist. There were no significant intraobserver differences in the distance from the most superior border of the mandibular canal to the alveolar crest (p>0.05). There were no significant differences among single detector computed tomography, multi-detector computed tomography and cone beam computed tomography in the distance from the most superior border of the mandibular canal to the alveolar crest (p>0.05). Multi-detector computed tomography and cone beam computed tomography are clinically useful in the evaluation of the pre-operative site for mandibular dental implants, with consideration for radiation exposure dose and scanning time.

  3. Efficient Vocational Skills Training for People with Cognitive Disabilities: An Exploratory Study Comparing Computer-Assisted Instruction to One-on-One Tutoring.

    Science.gov (United States)

    Larson, James R; Juszczak, Andrew; Engel, Kathryn

    2016-03-01

    This study compared the effectiveness of computer-assisted instruction to that of one-on-one tutoring for teaching people with mild and moderate cognitive disabilities when both training methods are designed to take account of the specific mental deficits most commonly found in cognitive disability populations. Fifteen participants (age 22-71) received either computer-assisted instruction or one-on-one tutoring in three content domains that were of functional and daily relevance to them: behavioural limits, rights and responsibilities (two modules) and alphabetical sorting. Learning was assessed by means of a series of pretests and four learning cycle post-tests. Both instructional conditions maintained time-on-task and teaching material equivalence, and both incorporated a set of best-practices and empirically supported teaching techniques designed to address attentional deficits, stimulus processing inefficiencies and cognitive load limitations. Strong evidence of learning was found in both instructional method conditions. Moreover, in all content domains the two methods yielded approximately equivalent rates of learning and learning attainment. These findings offer tentative evidence that a repetitive, computer-assisted training program can produce learning outcomes in people with mild and moderate cognitive disabilities that are comparable to those achieved by high-quality one-on-one tutoring. © 2015 John Wiley & Sons Ltd.

  4. Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.

    Science.gov (United States)

    Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo

    2015-11-01

    The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Pain variable means were higher for the conventional technique as compared with the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. The average runtimes of the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, showing statistically significant differences (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not show statistically significant differences when contrasted with the conventional technique.

  5. Determining the frequency of dry eye in computer users and comparing with control group

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Davari

    2017-08-01

    Full Text Available AIM: To determine the frequency of dry eye in computer users and to compare them with a control group. METHODS: This case-control study was conducted in 2015 in the city of Birjand. The sample size was estimated at 304 subjects (152 subjects in each group, a computer user group and a control group). A non-randomized sampling method was used in both groups. The Schirmer test was used to evaluate dry eye. Subjects then completed a questionnaire developed on the basis of the study objectives and a review of the literature. After collection, the data were entered into SPSS software and analyzed using the Chi-square test or Fisher's test at an alpha level of 0.05. RESULTS: In total, 304 subjects (152 in each group) were included in the study. The frequency of dry eye was 3.3% (5 subjects) in the control group and 61.8% (94 subjects) in the computer user group, a statistically significant difference between the two groups. A further ocular finding was present in 12 control subjects and in 34.2% (52 subjects) of the computer user group, again with a significant difference between the two groups, whereas another comparison showed no significant difference (P=0.8). The mean daily working time with a computer was 6.65±3.52 hours in subjects with dry eye and 1.62±2.54 hours in the healthy group (T=13.25). CONCLUSION: This study showed a significant relationship between computer use and dry eye and ocular symptoms. It is therefore necessary that officials pay particular attention to the number of hours employees work with computers and develop appropriate plans to divide working hours with computers among computer users. However, due to various confounding factors, it is recommended that these factors be controlled in future studies.

  6. Primates, computation, and the path to language. Reply to comments on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain"

    Science.gov (United States)

    Arbib, Michael A.

    2016-03-01

    The target article [6], henceforth TA, had as its main title Towards a Computational Comparative Neuroprimatology. This unpacks into three claims: Comparative Primatology: If one wishes to understand the behavior of any one primate species (whether monkey, ape or human - TA did not discuss, e.g., lemurs but that study could well be of interest), one will gain new insight by comparing behaviors across species, sharpening one's analysis of one class of behaviors by analyzing similarities and differences between two or more species.

  7. Indications for computed tomography (CT-) diagnostics in proximal humeral fractures: a comparative study of plain radiography and computed tomography

    OpenAIRE

    Weise Kuno; Pereira Philippe L; Dietz Klaus; Eingartner Christoph; Schmal Hagen; Südkamp Norbert P; Rolauffs Bernd; Bahrs Christian; Lingenfelter Erich; Helwig Peter

    2009-01-01

    Abstract. Background: Precise indications for computed tomography (CT) in proximal humeral fractures are not established. The purpose of this study was a comparison of conventional radiographic views with different CT reconstructions with 2D and 3D imaging, to establish indications for additional CT diagnostics depending on the fractured parts. Methods: In a prospective diagnostic study in two level 1 trauma centers, 44 patients with proximal humeral fractures were diagnosed with conventional X...

  8. What can be learned from computer modeling? Comparing expository and modeling approaches to teaching dynamic systems behavior

    NARCIS (Netherlands)

    van Borkulo, S.P.; van Joolingen, W.R.; Savelsbergh, E.R.; de Jong, T.

    2012-01-01

    Computer modeling has been widely promoted as a means to attain higher order learning outcomes. Substantiating these benefits, however, has been problematic due to a lack of proper assessment tools. In this study, we compared computer modeling with expository instruction, using a tailored assessment

  9. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.
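
    A hedged sketch of the analysis logic described above: fit a per-student growth slope to the monthly scores, then compare a regression on the final screening score alone with one that adds the slope, reading off the incremental R². All scores below are simulated placeholders, not AIMSweb or STAR-Math data.

```python
# Sketch: per-student growth slopes plus a two-block ("blockwise") regression
# on a simulated outcome. Numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_students, n_months = 200, 7
months = np.arange(n_months)

# Simulated monthly screening scores: individual intercepts and growth rates.
intercepts = rng.normal(500, 40, n_students)
growth = rng.normal(5, 2, n_students)
monthly = intercepts[:, None] + growth[:, None] * months \
    + rng.normal(0, 10, (n_students, n_months))

# Per-student slope (rate of change) via ordinary least squares on months.
slopes = np.polyfit(months, monthly.T, deg=1)[0]

# Simulated state assessment outcome related to final level and growth.
state_score = 0.8 * monthly[:, -1] + 2.0 * growth + rng.normal(0, 15, n_students)

# Block 1: single screening point just before the state test.
block1 = LinearRegression().fit(monthly[:, [-1]], state_score)
r2_block1 = block1.score(monthly[:, [-1]], state_score)

# Block 2: screening point plus slope.
X2 = np.column_stack([monthly[:, -1], slopes])
block2 = LinearRegression().fit(X2, state_score)
r2_block2 = block2.score(X2, state_score)

print(f"R^2 single point: {r2_block1:.3f}; added by slope: {r2_block2 - r2_block1:.3f}")
```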

  10. A Comparative Study of Collagen Matrix Density Effect on Endothelial Sprout Formation Using Experimental and Computational Approaches.

    Science.gov (United States)

    Shamloo, Amir; Mohammadaliha, Negar; Heilshorn, Sarah C; Bauer, Amy L

    2016-04-01

    A thorough understanding of determining factors in angiogenesis is a necessary step to control the development of new blood vessels. Extracellular matrix density is known to have a significant influence on cellular behaviors and consequently can regulate vessel formation. The utilization of experimental platforms in combination with numerical models can be a powerful method to explore the mechanisms of new capillary sprout formation. In this study, using an integrative method, the interplay between the matrix density and angiogenesis was investigated. Owing the fact that the extracellular matrix density is a global parameter that can affect other parameters such as pore size, stiffness, cell-matrix adhesion and cross-linking, deeper understanding of the most important biomechanical or biochemical properties of the ECM causing changes in sprout morphogenesis is crucial. Here, we implemented both computational and experimental methods to analyze the mechanisms responsible for the influence of ECM density on the sprout formation that is difficult to be investigated comprehensively using each of these single methods. For this purpose, we first utilized an innovative approach to quantify the correspondence of the simulated collagen fibril density to the collagen density in the experimental part. Comparing the results of the experimental study and computational model led to some considerable achievements. First, we verified the results of the computational model using the experimental results. Then, we reported parameters such as the ratio of proliferating cells to migrating cells that was difficult to obtain from experimental study. Finally, this integrative system led to gain an understanding of the possible mechanisms responsible for the effect of ECM density on angiogenesis. The results showed that stable and long sprouts were observed at an intermediate collagen matrix density of 1.2 and 1.9 mg/ml due to a balance between the number of migrating and proliferating

  11. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    Science.gov (United States)

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of the molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and for industrial applications. In the present contribution, we review the last decades of structural computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  12. Comparing the Social Skills of Students Addicted to Computer Games with Normal Students

    OpenAIRE

    Zamani, Eshrat; Kheradmand, Ali; Cheshmi, Maliheh; Abedi, Ahmad; Hedayati, Nasim

    2010-01-01

    Background This study aimed to investigate and compare the social skills of students addicted to computer games with normal students. The dependent variable in the present study is the social skills. Methods The study population included all the students in the second grade of public secondary school in the city of Isfahan at the educational year of 2009-2010. The sample size included 564 students selected using the cluster random sampling method. Data collection was conducted using the Questionnaire o...

  13. Comparative study of computational intelligence approaches for NOx reduction of coal-fired boiler

    International Nuclear Information System (INIS)

    Wei, Zhongbao; Li, Xiaolu; Xu, Lijun; Cheng, Yanting

    2013-01-01

    This paper focuses on NOx emission prediction and operating parameter optimization for coal-fired boilers. A Support Vector Regression (SVR) model based on CGA (Conventional Genetic Algorithm) was proposed to model the relationship between the operating parameters and the concentration of NOx emission. Then CGA and two modified algorithms, the Quantum Genetic Algorithm (QGA) and SAGA (Simulated Annealing Genetic Algorithm), were employed to optimize the operating parameters of the coal-fired boiler to reduce NOx emission. The results showed that the proposed SVR model was more accurate than the widely used Artificial Neural Network (ANN) model when employed to predict the concentration of NOx emission. The mean relative error and correlation coefficient calculated by the proposed SVR model were 2.08% and 0.95, respectively. Among the three optimization algorithms implemented in this paper, the SAGA showed superiority to the other two algorithms considering the quality of solution within a given computing time. The SVR plus SAGA method was preferable for predicting the concentration of NOx emission and further for optimizing the operating parameters to achieve low NOx emission for coal-fired boilers. - Highlights: • The CGA-based SVR model is proposed to predict the concentration of NOx emission. • The CGA-based SVR model performs better than the widely used ANN model. • CGA and two modified algorithms are compared to optimize the parameters. • The SAGA is preferable for its high quality of solution and low computing time. • The SVR plus SAGA is successfully employed to optimize the operating parameters
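
    The "SVR plus SAGA" pipeline described above can be sketched as a surrogate-plus-optimizer loop: fit an SVR model of NOx versus operating parameters, then let a global optimizer search that surrogate for a low-emission operating point. In the sketch below, scipy's dual_annealing stands in for the SAGA of the paper, and the two operating parameters and the synthetic NOx surface are assumptions made for the example.

```python
# Hedged sketch: SVR surrogate of NOx emission plus a global optimizer that
# searches the surrogate for a low-emission operating point. Data are synthetic.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from scipy.optimize import dual_annealing

rng = np.random.default_rng(4)

# Two illustrative operating parameters, e.g. excess-air ratio and burner tilt.
X = rng.uniform([1.0, -20.0], [1.4, 20.0], size=(300, 2))
nox = 300 + 400 * (X[:, 0] - 1.15) ** 2 + 0.1 * (X[:, 1] + 5) ** 2 \
    + rng.normal(0, 5, 300)                      # synthetic NOx values (mg/m^3)

surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
surrogate.fit(X, nox)

def predicted_nox(params):
    return float(surrogate.predict(np.atleast_2d(params))[0])

# Simulated annealing stand-in for the SAGA described in the paper.
result = dual_annealing(predicted_nox, bounds=[(1.0, 1.4), (-20.0, 20.0)],
                        maxiter=200, seed=4)
print("operating point with lowest predicted NOx:", result.x, "->", result.fun)
```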

  14. A Qualitative Case Study Comparing a Computer-Mediated Delivery System to a Face-to-Face Mediated Delivery System for Teaching Creative Writing Fiction Workshops

    Science.gov (United States)

    Daniels, Mindy A.

    2012-01-01

    The purpose of this case study was to compare the pedagogical and affective efficiency and efficacy of creative prose fiction writing workshops taught via asynchronous computer-mediated online distance education with creative prose fiction writing workshops taught face-to-face in order to better understand their operational pedagogy and…

  15. An appraisal of wind speed distribution prediction by soft computing methodologies: A comparative study

    International Nuclear Information System (INIS)

    Petković, Dalibor; Shamshirband, Shahaboddin; Anuar, Nor Badrul; Saboohi, Hadi; Abdul Wahab, Ainuddin Wahid; Protić, Milan; Zalnezhad, Erfan; Mirhashemi, Seyed Mohammad Amin

    2014-01-01

    Highlights: • Probabilistic distribution functions of wind speed. • Two-parameter Weibull probability distribution. • To build an effective prediction model of the distribution of wind speed. • Support vector regression applied as the probability function for wind speed. - Abstract: The probabilistic distribution of wind speed is among the more significant wind characteristics in examining wind energy potential and the performance of wind energy conversion systems. When the wind speed probability distribution is known, the wind energy distribution can be easily obtained. Therefore, the probability distribution of wind speed is a very important piece of information required in assessing wind energy potential. For this reason, a large number of studies have been carried out concerning the use of a variety of probability density functions to describe wind speed frequency distributions. Although the two-parameter Weibull distribution is a widely used and accepted method, solving the function is very challenging. In this study, the polynomial and radial basis function (RBF) kernels are applied in support vector regression (SVR) to estimate the two parameters of the Weibull distribution function according to previously established analytical methods. Rather than minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound, so as to achieve generalized performance. According to the experimental results, enhanced predictive accuracy and capability of generalization can be achieved using the SVR approach compared to other soft computing methodologies.
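
    The two-parameter Weibull model referred to above is f(v) = (k/c)(v/c)^(k-1) exp(-(v/c)^k), with shape k and scale c. The sketch below fits both parameters by conventional maximum likelihood on a synthetic wind-speed sample; the study instead estimates them with SVR, and the true parameter values here are invented for the example.

```python
# Sketch: fitting the two-parameter Weibull wind-speed model to synthetic data.
import numpy as np
from math import gamma
from scipy import stats

rng = np.random.default_rng(5)
true_k, true_c = 2.0, 7.5                            # shape, scale (m/s)
wind_speeds = true_c * rng.weibull(true_k, 5000)     # synthetic wind speeds

# Maximum-likelihood estimate with the location fixed at zero.
k_hat, _, c_hat = stats.weibull_min.fit(wind_speeds, floc=0)
print(f"estimated shape k = {k_hat:.2f}, scale c = {c_hat:.2f} m/s")

# Mean of v^3 under the fitted Weibull, E[v^3] = c^3 * Gamma(1 + 3/k),
# the quantity wind power density is proportional to.
mean_cube_speed = c_hat ** 3 * gamma(1 + 3 / k_hat)
print(f"E[v^3] = {mean_cube_speed:.1f} (m/s)^3")
```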

  16. A Comparative Study of Multi-material Data Structures for Computational Physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Garimella, Rao Veerabhadra [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Robey, Robert W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-31

    The data structures used to represent the multi-material state of a computational physics application can have a drastic impact on the performance of the application. We look at efficient data structures for sparse applications where there may be many materials, but only one or a few in most computational cells. We develop simple performance models for use in selecting possible data structures and programming patterns. We verify the analytic models of performance through a small test program covering the representative cases.
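
    One commonly discussed compact layout for this sparse multi-material situation is a cell-centric, CSR-style structure that stores only the materials actually present in each cell. The sketch below is a hedged illustration of that general idea; it is not taken from the LANL report, and the class name, material IDs and densities are invented.

```python
# Hedged sketch of a cell-centric compressed-sparse multi-material layout:
# per-cell offsets into flat material-ID and volume-fraction arrays.
import numpy as np

class SparseMultiMat:
    def __init__(self, cells):
        """cells: list of {material_id: volume_fraction} dictionaries."""
        counts = np.array([len(c) for c in cells])
        self.offsets = np.concatenate([[0], np.cumsum(counts)])   # per-cell start
        self.mat_ids = np.array([m for c in cells for m in c], dtype=np.int32)
        self.vol_frac = np.array([f for c in cells for f in c.values()])

    def cell_materials(self, i):
        lo, hi = self.offsets[i], self.offsets[i + 1]
        return self.mat_ids[lo:hi], self.vol_frac[lo:hi]

    def average_density(self, i, density_by_mat):
        mats, fracs = self.cell_materials(i)
        return float(np.dot(fracs, density_by_mat[mats]))

# Most cells hold a single material; a few mixed cells carry two or three.
mesh = SparseMultiMat([{0: 1.0}, {0: 1.0}, {0: 0.7, 2: 0.3}, {1: 1.0}])
densities = np.array([8.9, 2.7, 1.0])          # hypothetical material densities
print(mesh.average_density(2, densities))      # volume-weighted cell density
```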

  17. A Comparative Study of Paper-based and Computer-based Contextualization in Vocabulary Learning of EFL Students

    Directory of Open Access Journals (Sweden)

    Mousa Ahmadian

    2015-04-01

    Full Text Available Vocabulary acquisition is one of the largest and most important tasks in language classes. New technologies, such as computers, have helped a lot in this regard. The importance of the issue led the researchers to conduct the present study, which concerns the comparison of contextualized vocabulary learning on paper and through Computer Assisted Language Learning (CALL). To this end, 52 pre-university EFL learners were randomly assigned to two groups: a paper-based (PB) group and a computer-based (CB) group, each with 26 learners. The PB group received PB contextualization of the vocabulary items, while the CB group received CB contextualization of the vocabulary items through PowerPoint (PP) software. A pretest and posttest, along with an immediate and a delayed posttest, were given to the learners. A paired samples t-test of the pretest and posttest and an independent samples t-test of the delayed and immediate posttests were executed using SPSS software. The results revealed that computer-based contextualization had a greater effect on the vocabulary learning of Iranian EFL learners than paper-based contextualization of the words. Keywords: Computer-based contextualization, Paper-based contextualization, Vocabulary learning, CALL

  18. A comparative study of computed radiographic cephalometry and conventional cephalometry in reliability of head film measurements

    International Nuclear Information System (INIS)

    Kim, Hyung Done; Kim, Kee Deog; Park, Chang Seo

    1997-01-01

    The purpose of this study was to compare the variability of head film measurements (landmark identification) between Fuji computed radiographic (FCR) cephalometry and conventional cephalometry. 28 Korean adults were selected. A lateral cephalometric FCR film and a conventional cephalometric film were taken of each subject. Four investigators identified 24 cephalometric landmarks on the lateral cephalometric FCR film and the conventional cephalometric film, and the identifications were statistically analysed. The results were as follows: 1. For the FCR film and the conventional film, the coefficient of variation (C.V.) of the 24 landmarks was computed horizontally and vertically. 2. In the comparison of the significance of differences in landmark variability between the FCR film and the conventional film, the horizontal coefficient of variation showed significant differences for four of the twenty-four landmarks, whereas the vertical coefficient of variation showed significant differences for sixteen of the twenty-four landmarks. The FCR film showed significantly less variability than the conventional film in 17 of the 20 (4+16) cases that showed a significant difference.
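
    The variability measure used above is the coefficient of variation: the standard deviation of the repeated coordinate values divided by their mean, computed separately for the horizontal and vertical directions. The sketch below applies it to hypothetical repeated identifications of a single landmark; the coordinates are invented for the example.

```python
# Sketch of the coefficient of variation (C.V.) of repeated landmark
# coordinates, computed separately for horizontal and vertical directions.
import numpy as np

def coefficient_of_variation(values):
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

# Four observers' x and y coordinates (mm) for the same landmark on one film.
x_coords = [102.1, 101.8, 102.4, 102.0]
y_coords = [58.3, 57.9, 59.1, 58.6]

print(f"CV horizontal: {coefficient_of_variation(x_coords):.4f}")
print(f"CV vertical:   {coefficient_of_variation(y_coords):.4f}")
```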

  19. Computed tomography compared to magnetic resonance imaging in occult or suspect hip fractures. A retrospective study in 44 patients

    Energy Technology Data Exchange (ETDEWEB)

    Collin, David; Goethlin, Jan H. [Sahlgrenska University Hospital, Department of Radiology, Moelndal (Sweden); Geijer, Mats [Lund University, Department of Medical Imaging and Physiology, Skaane University Hospital, Lund (Sweden)

    2016-11-15

    Computed tomography (CT) for evaluation of occult and suspect hip fractures has been proposed as a good second-line investigation. The diagnostic precision compared to magnetic resonance imaging (MRI) is unclear. To compare the diagnostic performance of CT and MRI in a retrospective study on patients with suspect and occult hip fractures. Forty-four elderly consecutive patients with low-energy trauma to the hip were identified where negative or suspect CT was followed by MRI. Primary reporting and review by two observers as well as the diagnostic performance of the two modalities were compared. Surgical treatment and clinical course were used as outcomes. Compared to the primary reports, the CT reviewers found fewer normal and no suspect cases. MRI changed the primary diagnoses in 27 cases, and in 14 and 15 cases, respectively, at review. There was no disagreement on MRI diagnoses. In our patient population, MRI was deemed a more reliable modality for hip fracture diagnosis in comparison to CT. For clinical decision making, MRI seems to have a higher accuracy than CT. A negative CT finding cannot completely rule out a hip fracture in patients where clinical findings of hip fracture persevere. (orig.)

  20. Digital fluorography and computed tomography in a department of neuroradiology - a comparative study

    International Nuclear Information System (INIS)

    Fawcitt, R.A.; Freer, C.; Jarvis, H.; Occleshaw, J.V.; Isherwood, I.

    1984-01-01

    Digital Subtraction Angiography (DSA) has the ability to display the intracranial circulation following an intravenous or intra-arterial injection of contrast medium. A study was performed in 57 patients with neurological disorders undergoing DSA, either by Digital Intravenous Injection Angiography (DIVA) or Digital Intra-arterial Injection Angiography (DART) to assess the ability of DIVA to replace DART, the latter being carried out by digital fluorography or by conventional film screen methods, and also to establish the role of DSA in relation to Computed Tomography. (U.K.)

  1. A comparative study for image quality and radiation dose of a cone beam computed tomography scanner and a multislice computed tomography scanner for paranasal sinus imaging.

    Science.gov (United States)

    De Cock, Jens; Zanca, Federica; Canning, John; Pauwels, Ruben; Hermans, Robert

    2015-07-01

    To evaluate the image quality and radiation dose of a state-of-the-art cone beam computed tomography (CBCT) system and a multislice computed tomography (MSCT) system in patients with sinonasal polyposis. In this retrospective study, two radiologists evaluated 57 patients with sinonasal polyposis who underwent a CBCT or MSCT sinus examination, along with a control group of 90 patients with normal radiological findings. Tissue doses were measured using a phantom model with thermoluminescent dosimeters (TLD). Overall image quality in CBCT was scored significantly higher than in MSCT in patients with normal radiological findings (p-value: 0.00001). In patients with sinonasal polyposis, MSCT scored significantly higher than CBCT (p-value: 0.00001). The average effective dose for MSCT was 42% higher compared to CBCT (108 μSv vs 63 μSv). CBCT and MSCT are both suited for the evaluation of sinonasal polyposis. In patients with sinonasal polyposis, clinically important structures of the paranasal sinuses are better delineated with MSCT, whereas in patients without sinonasal polyposis, CBCT defines the important structures of the sinonasal region better. However, given the lower radiation dose, CBCT can be considered for the evaluation of the sinonasal structures in patients with sinonasal polyposis. • CBCT and MSCT are both suited for evaluation of sinonasal polyposis. • The effective dose for MSCT was 42% higher compared to CBCT. • In patients with sinonasal polyposis, clinically important anatomical structures are better delineated with MSCT. • In patients with normal radiological findings, clinically important anatomical structures are better delineated with CBCT.

  2. Performance of cone-beam computed tomography and multidetector computed tomography in diagnostic imaging of the midface: A comparative study on Phantom and cadaver head scans

    Energy Technology Data Exchange (ETDEWEB)

    Veldhoen, Simon [University Medical Center Hamburg, Department of Diagnostic and Interventional Radiology, Hamburg (Germany); University Hospital Wuerzburg, Department of Diagnostic and Interventional Radiology, Wuerzburg (Germany); Schoellchen, Maximilian; Hanken, H.; Precht, C.; Heiland, M. [University Medical Center Hamburg, Department of Oral- and Maxillofacial Surgery, Hamburg (Germany); Henes, F.O.; Adam, G.; Regier, M. [University Medical Center Hamburg, Department of Diagnostic and Interventional Radiology, Hamburg (Germany); Schoen, G. [University Medical Center Hamburg, Department of Medical Biometry and Epidemiology, Hamburg (Germany); Nagel, H.D. [Science and Technology for Radiology, Buchholz (Germany); Schumacher, U. [University Medical Center Hamburg, Institute of Anatomy, Hamburg (Germany)

    2017-02-15

    To compare multidetector computed tomography (MDCT) and cone-beam computed tomography (CBCT) regarding radiation, resolution, image noise, and image quality. CBCT and 256-MDCT were compared based on three scan protocols: standard-dose (∼24 mGy), reduced-dose (∼9 mGy), and low-dose (∼4 mGy). MDCT images were acquired in standard- and high-resolution mode (HR-MDCT) and reconstructed using filtered back projection (FBP) and iterative reconstruction (IR). Spatial resolution in line pairs (lp) and objective image noise (OIN) were assessed using dedicated phantoms. Image quality was assessed in scans of 25 cadaver heads using a Likert scale. OIN was markedly higher in FBP-MDCT than in CBCT; IR lowered the OIN to comparable values in standard-mode MDCT only. CBCT provided a resolution of 13 lp/cm at standard dose and 11 lp/cm at reduced dose vs. 11 lp/cm and 10 lp/cm in HR-MDCT. A resolution of 10 lp/cm was observed for both devices using low-dose settings. Quality scores of MDCT and CBCT did not differ at standard dose (CBCT, 3.4; MDCT, 3.3-3.5; p > 0.05). Using reduced- and low-dose protocols, CBCT was superior (reduced dose, 3.2 vs. 2.8; low dose, 3.0 vs. 2.3; p < 0.001). Using the low-dose protocol, the assessed CBCT provided better objective and subjective image quality and equal resolution. Similar image quality, but better resolution with CBCT, was observed at higher exposure settings. (orig.)
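
    To put the line-pair figures above into more intuitive units, the short sketch below (an illustration added here, not part of the study) converts a bar-pattern resolution in line pairs per centimetre into the width of the smallest resolvable bar, using the standard relation detail = 1 / (2 × f).

        # Convert resolution in line pairs per cm into the width of the smallest
        # resolvable bar (half of one line pair); 10 mm per cm.
        for lp_per_cm in (13, 11, 10):
            detail_mm = 10.0 / (2.0 * lp_per_cm)
            print(f"{lp_per_cm} lp/cm corresponds to about {detail_mm:.2f} mm detail")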

  3. The Use of Computer Simulation to Compare Student performance in Traditional versus Distance Learning Environments

    Directory of Open Access Journals (Sweden)

    Retta Guy

    2015-06-01

    Simulations have been shown to be an effective tool in traditional learning environments; however, as distance learning grows in popularity, the need to examine simulation effectiveness in this environment has become paramount. A causal-comparative design was chosen for this study to determine whether students using a computer-based instructional simulation in hybrid and fully online environments learned better than traditional classroom learners. The study spans a period of six years, from fall 2008 through spring 2014. The population studied was 281 undergraduate business students self-enrolled in a 200-level microcomputer application course. The overall results support previous studies in that computer simulations are most effective when used as a supplement to face-to-face lectures and in hybrid environments.

  4. A comparative study of computed tomography with surgical specimen in 32 cases of hyperparathyroidism

    International Nuclear Information System (INIS)

    Iwamoto, Noriyuki; Yamazaki, Satoru; Hukuda, Toyofumi

    1984-01-01

    We have been localizing pathological parathyroid glands by computed tomography (CT) since December 1980. We reviewed 32 cases of surgically treated hyperparathyroidism, in which 99 parathyroid glands were resected, each weighing from 20 to 3300 mg. Comparing the resected parathyroid glands with the preoperative CT, we concluded as follows: 1) Pathological parathyroid glands were identified in 25 of the 32 cases (78%). 2) Among parathyroid glands weighing over 300 mg, 60 of 64 glands (94%) were identified by CT. 3) In secondary hyperparathyroidism with radiologically proven subperiosteal resorption, pathologically enlarged parathyroid glands were identified by CT in 22 of 23 cases (95%); CT was therefore considered a useful diagnostic method in secondary hyperparathyroidism. 4) Having encountered two false-positive cases and one false-negative case, all involving ectopic glands, we concluded that bolus enhancement is necessary when localizing ectopic parathyroid glands. (author)

  5. Studi Perbandingan Layanan Cloud Computing

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platforms and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing cost, and creating opportunities for IT industries to gain further advantages. The purpose of this article is to present a better understanding of cloud delivery services and their correlations and inter-dependencies. The article compares and contrasts the different levels of delivery services and deployment models, and identifies issues and future directions in cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  6. A comparative analysis on computational methods for fitting an ERGM to biological network data

    Directory of Open Access Journals (Sweden)

    Sudipta Saha

    2015-03-01

    Exponential random graph models (ERGMs), based on graph theory, are useful for studying global biological network structure through its local properties. However, computational methods for fitting such models are sensitive to the type, structure and number of the local features of the network under study. In this paper, we compared computational methods for fitting an ERGM with local features of different types and structures. Two commonly used methods, Markov chain Monte Carlo maximum likelihood estimation (MCMC-MLE) and maximum pseudo-likelihood estimation (MPLE), are considered for estimating the coefficients of network attributes. We compared the estimates for an observed network with those for a randomly simulated network using both methods under the ERGM. The motivation was to ascertain the extent to which an observed network deviates from a randomly simulated network when the numbers of attributes are approximately the same. Cut-off points for some common attributes of interest, for different orders of nodes, were determined through simulations. We applied our method to a known regulatory network database of Escherichia coli (E. coli).
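
    As a rough, generic illustration of the pseudo-likelihood idea mentioned above (not the authors' code), the sketch below fits a toy edge-plus-triangle ERGM by MPLE: every dyad contributes one logistic-regression observation whose covariates are the change statistics obtained by toggling that dyad. The packages and the choice of statistics are assumptions made for this example only.

        import itertools
        import networkx as nx
        import numpy as np
        import statsmodels.api as sm

        def change_stats(G, i, j):
            """Change in (edge count, triangle count) if edge (i, j) is added to G."""
            shared = set(G[i]) & set(G[j])      # common neighbours -> new triangles
            return [1.0, float(len(shared))]

        def ergm_mple(G):
            """Maximum pseudo-likelihood estimates for a toy edge+triangle ERGM."""
            X, y = [], []
            for i, j in itertools.combinations(G.nodes, 2):
                present = G.has_edge(i, j)
                if present:                     # change statistics refer to G without (i, j)
                    G.remove_edge(i, j)
                X.append(change_stats(G, i, j))
                y.append(1.0 if present else 0.0)
                if present:
                    G.add_edge(i, j)
            return sm.Logit(np.array(y), np.array(X)).fit(disp=0).params

        G = nx.erdos_renyi_graph(30, 0.15, seed=1)
        print(ergm_mple(G))                     # [theta_edges, theta_triangles]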

  7. Standalone computer-aided detection compared to radiologists' performance for the detection of mammographic masses

    International Nuclear Information System (INIS)

    Hupse, Rianne; Samulski, Maurice; Imhof-Tas, Mechli W.; Karssemeijer, Nico; Lobbes, Marc; Boetes, Carla; Heeten, Ard den; Beijerinck, David; Pijnappel, Ruud

    2013-01-01

    We developed a computer-aided detection (CAD) system aimed at decision support for detection of malignant masses and architectural distortions in mammograms. The effect of this system on radiologists' performance depends strongly on its standalone performance. The purpose of this study was to compare the standalone performance of this CAD system to that of radiologists. In a retrospective study, nine certified screening radiologists and three residents read 200 digital screening mammograms without the use of CAD. Performances of the individual readers and of CAD were computed as the true-positive fraction (TPF) at a false-positive fraction of 0.05 and 0.2. Differences were analysed using an independent one-sample t-test. At a false-positive fraction of 0.05, the performance of CAD (TPF = 0.487) was similar to that of the certified screening radiologists (TPF = 0.518, P = 0.17). At a false-positive fraction of 0.2, CAD performance (TPF = 0.620) was significantly lower than the radiologist performance (TPF = 0.736, P <0.001). Compared to the residents, CAD performance was similar for all false-positive fractions. The sensitivity of CAD at a high specificity was comparable to that of human readers. These results show potential for CAD to be used as an independent reader in breast cancer screening. (orig.)
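
    The operating points quoted above (TPF at a fixed false-positive fraction) can be computed directly from case-level scores; the fragment below is a generic illustration with made-up scores, not the study's data or CAD output.

        import numpy as np

        def tpf_at_fpf(scores_abnormal, scores_normal, fpf):
            """True-positive fraction at a fixed false-positive fraction: the threshold
            is set so that exactly a fraction `fpf` of normal cases score above it."""
            threshold = np.quantile(scores_normal, 1.0 - fpf)
            return float(np.mean(np.asarray(scores_abnormal) > threshold))

        rng = np.random.default_rng(0)          # made-up scores for illustration only
        normal = rng.normal(0.0, 1.0, 1000)
        abnormal = rng.normal(1.5, 1.0, 200)
        print(tpf_at_fpf(abnormal, normal, 0.05), tpf_at_fpf(abnormal, normal, 0.2))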

  8. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    Science.gov (United States)

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed, a requirement which cannot be fulfilled in many cases. Furthermore, the time complexity of many of the algorithms involved leads to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphically oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be performed effectively in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of the neuraminidase of H5N1 isolates (micro level) and of influenza viruses (macro level). The results of this paper are hence twofold. Firstly, the paper demonstrates the usefulness of a graphical user interface system for designing and executing complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase at different levels of complexity provides valuable insights into this virus's tendency for geographically based clustering in the phylogenetic tree and shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems in providing a biologist-friendly approach to complex biological dataset analysis using high performance computing. In particular, the utility of the platform Quascade

  9. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting the genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest for the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described, followed by different soft computing techniques and their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Revisiting dibenzothiophene thermochemical data: Experimental and computational studies

    International Nuclear Information System (INIS)

    Freitas, Vera L.S.; Gomes, Jose R.B.; Ribeiro da Silva, Maria D.M.C.

    2009-01-01

    Thermochemical data for dibenzothiophene were studied in the present work by experimental techniques and computational calculations. The standard (p° = 0.1 MPa) molar enthalpy of formation in the gaseous phase, at T = 298.15 K, was determined from the enthalpies of combustion and sublimation, obtained by rotating-bomb combustion calorimetry in oxygen and by Calvet microcalorimetry, respectively. This value was compared with estimates from G3(MP2)//B3LYP computations and with other results available in the literature.
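
    The route described above corresponds to the usual thermochemical cycle; written out in LaTeX notation (generic symbols, not the paper's reported values):

        \Delta_f H^\circ_m(\mathrm{g},\,298.15\ \mathrm{K}) = \Delta_f H^\circ_m(\mathrm{cr}) + \Delta_{\mathrm{cr}}^{\mathrm{g}} H^\circ_m

    where the crystalline-phase enthalpy of formation follows from the combustion enthalpy measured in the rotating bomb and the last term is the sublimation enthalpy obtained by Calvet microcalorimetry.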

  11. Postmortem computed tomography (PMCT) and autopsy in deadly gunshot wounds--a comparative study.

    Science.gov (United States)

    Kirchhoff, S M; Scaparra, E F; Grimm, J; Scherr, M; Graw, M; Reiser, M F; Peschel, O

    2016-05-01

    Postmortem computed tomography (PMCT) data from gunshot-related deaths were evaluated by two reader groups and compared to the gold standard, autopsy, for the determination of forensic pathology criteria. Reader group I consisted of two board-certified radiologists, whereas one board-certified radiologist and one board-certified forensic pathologist formed group II. PMCT data of 51 gunshot-related deaths were evaluated for the forensic pathology criteria of number of gunshots, localization of gunshot injury, caliber, and direction of the gunshot (differentiating between entry and exit wounds), as well as associated injury to surrounding tissue. The results of the two reader groups were compared to each other and to the autopsy findings, which were considered the gold standard. In general, the reader groups and the autopsy evaluation showed a good correlation between all results. The overall discrepancy rate was 12/51 (23.4%) cases for group I and 8/51 (15.6%) for group II. Ultimately, the following conclusions can be drawn from the presented data. First, physical autopsy is better than PMCT regarding the localization of most gunshot injuries. Second, PMCT gives better results than physical autopsy in locating fragmented bullets/fragment clouds. Finally, the PMCT results of two radiologists were equivalent to the results of one evaluating radiologist and one pathologist, with the exception of caliber assessment. However, referring to the raw numbers, the slight but not significant difference in the overall discrepancy rates of the two reader groups might indicate an advantage of combining expertise when evaluating imaging in cases of gunshot-related death.

  12. Computer Assisted Instruction in Special Education Three Case Studies

    Directory of Open Access Journals (Sweden)

    İbrahim DOĞAN

    2015-09-01

    The purpose of this study is to investigate the computer use of three students attending a special education center. The students have mental retardation, a hearing problem and a physical handicap, respectively. Maximum variation sampling was used to select the types of handicap, while convenience sampling was used to select the participants; three widely encountered handicap types in special education were chosen. A multiple holistic case study design was used. The results indicate that teachers in special education prefer to use educational games and drill-and-practice computer programs. It was also found that overuse of animation, text and symbols causes cognitive overload for the student with mental retardation. Additionally, the student with the hearing problem learned words better when computers were used in education, compared to the traditional method. Furthermore, the student with the physical handicap improved his fine muscle control, in addition to meeting the planned course objectives, when computers were used in special education.

  13. Understanding the sorption mechanisms of aflatoxin B1 to kaolinite, illite, and smectite clays via a comparative computational study.

    Science.gov (United States)

    Kang, Fuxing; Ge, Yangyang; Hu, Xiaojie; Goikavi, Caspar; Waigi, Michael Gatheru; Gao, Yanzheng; Ling, Wanting

    2016-12-15

    In current studies of the adsorption of biotoxins to phyllosilicate clays, the multiple weak bonding types involved in these adsorptions are not well known; the major attractive forces, especially for kaolinite and illite, are difficult to identify compared to smectite, which has exchangeable cations. Here, we discriminated the bonding types of the aflatoxin B1 (AFB1) contaminant to these clays by combining batch experiments with model computations, and expounded bonding mechanisms that had not previously been described quantitatively. The observed adsorbent-to-solution distribution coefficients (Kd) of AFB1 increased in the order 18.5-37.1, 141.6-158.3, and 354.6-484.7 L/kg for kaolinite, illite, and smectite, respectively. Normalization by adsorbent specific surface area showed that the adsorption affinity of AFB1 depends mainly on the outer surfaces of the clay aggregates. The model computations and a test of ionic effects further suggested that weak electrostatic attractions ((Si/Al-OH)₂⋯(OC)₂) are responsible for AFB1-kaolinite adsorption (Kd, 18.5-37.1 L/kg); a moderate electron-donor-acceptor attraction ((CO)₂⋯K⁺⋯(O-Al)₃) is related to AFB1-illite adsorption (Kd, 141.6-158.3 L/kg); and a strong calcium-bridging linkage ((CO)₂⋯Ca²⁺⋯(O-Si)₄) is involved in AFB1-smectite adsorption (Kd, 354.6-484.7 L/kg). Changes in Gibbs free energy (ΔG°) suggested that the computed results are reliable, providing a good reproduction of the AFB1-clay interactions. Copyright © 2016 Elsevier B.V. All rights reserved.
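
    For context only (this relation is not quoted from the paper), a distribution coefficient and a Gibbs energy change of adsorption are commonly linked, once the coefficient has been made dimensionless against a reference state, by

        \Delta G^\circ = -RT \ln K

    so that a more negative ΔG° corresponds to the stronger binding reported here for smectite relative to illite and kaolinite.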

  14. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A. [and others]

    1995-12-31

    In conformity with the protocol of the Workshop under the contract "Assessment of RBMK reactor safety using modern Western codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether the VNIIEF codes are suitable for RBMK-type reactor safety assessment computations. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with the results of computations with the NESTLE code (USA), performed in the geometry and with the neutron constants provided by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and simulates gas bubble travel through the core. The second problem is a model of the RBMK as a whole, with simulation of control and protection system (CPS) control movement in the core.

  15. Learning, epigenetics, and computation: An extension on Fitch's proposal. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Okanoya, Kazuo

    2014-09-01

    The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints that are useful in narrowing down the problem space for both cognition and neural mechanisms. I provide two examples from our own studies that reinforce and extend Fitch's proposal.

  16. Computed tomography in dementia of Alzheimer type; Comparative study in each stage and comparison to single photon emission computed tomography with N-isopropyl-p-(¹²³I) iodoamphetamine

    Energy Technology Data Exchange (ETDEWEB)

    Tsunoda, Masahiko; Fujii, Tsutomu; Tanii, Yasuyuki [Toyama Medical and Pharmaceutical Univ., Toyama (Japan); and others

    1990-05-01

    Computed tomography (CT) examinations of 7 patients with dementia of Alzheimer type were reviewed and correlated with clinical stage. The CT findings were also compared with those of single photon emission computed tomography (SPECT). There was no positive correlation between the degree of cerebral atrophy on CT and clinical stage; cerebral atrophy seemed to be influenced by aging, illness duration, and the degree of dementia. The cerebral/cerebellar uptake ratio of the radioisotope on SPECT decreased significantly with the progression of the clinical stage. SPECT therefore seemed to reflect the degree of dementia, irrespective of age and illness duration. (N.K.).

  17. Maxillary sinusitis - a comparative study of different imaging diagnosis methods

    International Nuclear Information System (INIS)

    Hueb, Marcelo Miguel; Borges, Fabiano de Almeida; Pulcinelli, Emilte; Souza, Wandir Ferreira; Borges, Luiz Marcondes

    1999-01-01

    We conducted a prospective study comparing different methods (plain X-rays, computed tomography and A-mode ultrasonography) for the initial diagnosis of maxillary sinusitis. Twenty patients (40 maxillary sinuses) with a clinical history suggestive of sinusitis were included in this study. The results were classified as abnormal or normal, using computed tomography as the gold standard. The sensitivity of ultrasonography and plain X-rays was 84.6% and 69.2%, respectively; the specificity of both methods was 92.6%. This study suggests that ultrasonography can be used as a good follow-up method for patients with maxillary sinusitis. (author)
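
    As a reminder of how such figures are obtained, the sketch below computes sensitivity and specificity from a 2x2 table against the CT reference. The counts are illustrative values chosen to be consistent with the reported percentages (assuming a split of 13 abnormal and 27 normal sinuses); they are not the study's published raw data.

        # Illustrative 2x2 counts versus the CT reference standard (assumed split:
        # 13 abnormal and 27 normal sinuses); not the study's raw data.
        tp, fn = 11, 2      # ultrasonography: true positives, false negatives
        tn, fp = 25, 2      # ultrasonography: true negatives, false positives

        sensitivity = tp / (tp + fn)    # 11/13 ~ 84.6 %
        specificity = tn / (tn + fp)    # 25/27 ~ 92.6 %
        print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")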

  18. Two Studies Examining Argumentation in Asynchronous Computer Mediated Communication

    Science.gov (United States)

    Joiner, Richard; Jones, Sarah; Doherty, John

    2008-01-01

    Asynchronous computer mediated communication (CMC) would seem to be an ideal medium for supporting development in student argumentation. This paper investigates this assumption through two studies. The first study compared asynchronous CMC with face-to-face discussions. The transactional and strategic level of the argumentation (i.e. measures of…

  19. Comparative Study of Daylighting Calculation Methods

    Directory of Open Access Journals (Sweden)

    Mandala Ariani

    2018-01-01

    The aim of this study is to assess five daylighting calculation methods commonly used in architectural studies. The methods include hand calculations (the SNI/DPMB method and the BRE Daylighting Protractors), scale models studied in an artificial sky simulator, and computer programs using the Dialux and Velux lighting software. The test room assumes uniform sky conditions and a simple room geometry, with variations in room reflectance (black, grey, and white). The analyses compared the results (including the daylight factor, illuminance, and coefficient of uniformity values) and examined the similarities and differences between the methods. The reflectance variations were used to analyse the contribution of the internally reflected component to the results.

  20. Asymmetric energy flow in liquid alkylbenzenes: A computational study

    International Nuclear Information System (INIS)

    Leitner, David M.; Pandey, Hari Datt

    2015-01-01

    Ultrafast IR-Raman experiments on substituted benzenes [B. C. Pein et al., J. Phys. Chem. B 117, 10898–10904 (2013)] reveal that energy can flow more efficiently in one direction along a molecule than in others. We carry out a computational study of energy flow in the three alkylbenzenes, toluene, isopropylbenzene, and t-butylbenzene, studied in these experiments, and find an asymmetry in the flow of vibrational energy between the two chemical groups of each molecule due to quantum mechanical vibrational relaxation bottlenecks, which give rise to a preferred direction of energy flow. We compare the energy flow computed for all modes of the three alkylbenzenes over the relaxation time into the liquid with the energy flow through the subset of modes monitored in the time-resolved Raman experiments, and find qualitatively similar results for the subset and for all modes.

  1. Comparative analysis of 11 different radioisotopes for palliative treatment of bone metastases by computational methods

    International Nuclear Information System (INIS)

    Guerra Liberal, Francisco D. C.; Tavares, Adriana Alexandre S.; Tavares, João Manuel R. S.

    2014-01-01

    Purpose: Throughout the years, the palliative treatment of bone metastases using bone-seeking radiotracers has been part of the therapeutic resources used in oncology, but the choice of which bone-seeking agent to use is not consensual across sites, and limited data are available comparing the characteristics of each radioisotope. Computational simulation is a simple and practical method to study and compare a variety of radioisotopes for different medical applications, including the palliative treatment of bone metastases. This study aims to evaluate and compare 11 different radioisotopes currently in use or under research for the palliative treatment of bone metastases using computational methods. Methods: Computational models were used to estimate the percentage of deoxyribonucleic acid (DNA) damage (fast Monte Carlo damage algorithm), the probability of correct DNA repair (Monte Carlo excision repair algorithm), and the radiation-induced cellular effects (virtual cell radiobiology algorithm) post-irradiation with selected particles emitted by phosphorus-32 (³²P), strontium-89 (⁸⁹Sr), yttrium-90 (⁹⁰Y), tin-117 (¹¹⁷ᵐSn), samarium-153 (¹⁵³Sm), holmium-166 (¹⁶⁶Ho), thulium-170 (¹⁷⁰Tm), lutetium-177 (¹⁷⁷Lu), rhenium-186 (¹⁸⁶Re), rhenium-188 (¹⁸⁸Re), and radium-223 (²²³Ra). Results: ²²³Ra alpha particles, ¹⁷⁷Lu beta-minus particles, and ¹⁷⁰Tm beta-minus particles induced the highest cell death of all investigated particles and radioisotopes. The cell survival fraction measured post-irradiation with beta-minus particles emitted by ⁸⁹Sr and ¹⁵³Sm, two of the most frequently used radionuclides in the palliative treatment of bone metastases in routine clinical practice, was higher than that measured with ¹⁷⁷Lu beta-minus particles and ²²³Ra alpha particles. Conclusions: ²²³Ra and ¹⁷⁷Lu hold the highest potential for palliative treatment of bone metastases of all the radioisotopes compared in this study. Data reported here may prompt future in vitro and in vivo
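
    The cell-survival comparisons above are often summarised with a linear-quadratic survival model; the toy sketch below shows that functional form only, with made-up radiosensitivity parameters standing in for the emission-specific values the study derives. It is not the authors' virtual cell radiobiology algorithm.

        import numpy as np

        def survival_fraction(dose_gy, alpha, beta):
            """Linear-quadratic model: SF = exp(-(alpha*D + beta*D**2))."""
            return np.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

        doses = np.linspace(0.0, 6.0, 7)
        # Hypothetical parameters only: a high-LET (alpha-dominated) and a low-LET response.
        for label, alpha, beta in [("high-LET (illustrative)", 1.0, 0.0),
                                   ("low-LET (illustrative)", 0.2, 0.03)]:
            print(label, np.round(survival_fraction(doses, alpha, beta), 3))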

  2. A comparative study of the deviation of the menton on posteroanterior cephalograms and three dimensional computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Jin; Lee, Sun Gene; Lee, Eun Joo; Kang, Byung Cheol; Lee, Jae Seo; Lim, Hoi Jeong; Yoon, Suk Ja [School of Dentistry, Dental Science Research Institute, Chonnam National University, Gwangju (Korea, Republic of); Song, In Ja [Dept. of Nursing, Kwangju Women' s University, Gwangju (Korea, Republic of)

    2016-03-15

    Facial asymmetry has been measured by the severity of deviation of the menton (Me) on posteroanterior (PA) cephalograms and three-dimensional (3D) computed tomography (CT). This study aimed to compare PA cephalograms and 3D CT regarding the severity of Me deviation and the direction of the Me. PA cephalograms and 3D CT images of 35 patients who underwent orthognathic surgery (19 males and 16 females, with an average age of 22.1±3.3 years) were retrospectively reviewed in this study. By measuring the distance and direction of the Me from the midfacial reference line and the midsagittal plane in the cephalograms and 3D CT, respectively, the x-coordinates (x1 and x2) of the Me were obtained in each image. The difference between the x-coordinates was calculated and statistical analysis was performed to compare the severity of Me deviation and the direction of the Me in the two imaging modalities. A statistically significant difference in the severity of Me deviation was found between the two imaging modalities (Δx=2.45±2.03 mm, p<0.05) using the one-sample t-test. Statistically significant agreement was observed in the presence of deviation (k=0.64, p<0.05) and in the severity of Me deviation (k=0.27, p<0.05). A difference in the direction of the Me was detected in three patients (8.6%). The severity of the Me deviation was found to vary according to the imaging modality in 16 patients (45.7%). The measurement of Me deviation may be different between PA cephalograms and 3D CT in some patients.

  3. A comparative study of the deviation of the menton on posteroanterior cephalograms and three dimensional computed tomography

    International Nuclear Information System (INIS)

    Lee, Hee Jin; Lee, Sun Gene; Lee, Eun Joo; Kang, Byung Cheol; Lee, Jae Seo; Lim, Hoi Jeong; Yoon, Suk Ja; Song, In Ja

    2016-01-01

    Facial asymmetry has been measured by the severity of deviation of the menton (Me) on posteroanterior (PA) cephalograms and three-dimensional (3D) computed tomography (CT). This study aimed to compare PA cephalograms and 3D CT regarding the severity of Me deviation and the direction of the Me. PA cephalograms and 3D CT images of 35 patients who underwent orthognathic surgery (19 males and 16 females, with an average age of 22.1±3.3 years) were retrospectively reviewed in this study. By measuring the distance and direction of the Me from the midfacial reference line and the midsagittal plane in the cephalograms and 3D CT, respectively, the x-coordinates (x1 and x2) of the Me were obtained in each image. The difference between the x-coordinates was calculated and statistical analysis was performed to compare the severity of Me deviation and the direction of the Me in the two imaging modalities. A statistically significant difference in the severity of Me deviation was found between the two imaging modalities (Δx=2.45±2.03 mm, p<0.05) using the one-sample t-test. Statistically significant agreement was observed in the presence of deviation (k=0.64, p<0.05) and in the severity of Me deviation (k=0.27, p<0.05). A difference in the direction of the Me was detected in three patients (8.6%). The severity of the Me deviation was found to vary according to the imaging modality in 16 patients (45.7%). The measurement of Me deviation may be different between PA cephalograms and 3D CT in some patients
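
    The agreement figures quoted in the record above are Cohen's kappa values; for reference, the sketch below computes the statistic for an illustrative pair of binary calls (made-up labels, not the study's data).

        import numpy as np

        def cohens_kappa(r1, r2):
            """Cohen's kappa for two raters with categorical labels."""
            r1, r2 = np.asarray(r1), np.asarray(r2)
            labels = np.union1d(r1, r2)
            po = np.mean(r1 == r2)                                            # observed agreement
            pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in labels)     # chance agreement
            return (po - pe) / (1.0 - pe)

        # Illustrative calls (deviation present = 1 / absent = 0) from the two modalities.
        ceph = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0]
        ct3d = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0]
        print(round(cohens_kappa(ceph, ct3d), 2))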

  4. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    …communicate their subjective opinions. This paper provides analysis, via usability study methods, of the differences in interaction when compared with traditional human computer interfaces. Keywords: Usability Analysis; CAVE (Cave Automatic Virtual Environments); Human Computer Interface (HCI).

  5. VALIDITY IN COMPUTER-BASED TESTING: A LITERATURE REVIEW OF COMPARABILITY ISSUES AND EXAMINEE PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    Ika Kana Trisnawati

    2015-05-01

    Recent years have seen the growing popularity of Computer-Based Tests (CBTs) in various disciplines and for various purposes, although Paper-and-Pencil Based Tests (P&Ps) are still in use. However, many question whether CBTs outperform P&Ps in effectiveness, and whether CBTs can be as valid a measuring tool as P&Ps. This paper compares CBTs and P&Ps, together with the respective examinee perspectives, in order to determine whether doubts should arise about the emergence of CBTs over the classic P&Ps. Findings showed that CBTs are advantageous in that they are both efficient (reducing testing time) and effective (maintaining test reliability) relative to the P&P versions. Nevertheless, CBTs still need to have their variables well designed (e.g., study design, computer algorithm) in order for the scores to be comparable to those on the P&P tests, since score equivalence is one of the validity evidences needed for a CBT.

  6. Computation of dominant eigenvalues and eigenvectors: A comparative study of algorithms

    International Nuclear Information System (INIS)

    Nightingale, M.P.; Viswanath, V.S.; Mueller, G.

    1993-01-01

    We investigate two widely used recursive algorithms for the computation of eigenvectors with extreme eigenvalues of large symmetric matrices: the modified Lanczos method and the conjugate-gradient method. The goal is to establish a connection between their underlying principles and to evaluate their performance in applications to Hamiltonian and transfer matrices of selected model systems of interest in condensed matter physics and statistical mechanics. The conjugate-gradient method is found to converge more rapidly, for understandable reasons, while the storage requirements are the same for both methods.
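
    As a present-day illustration of the two algorithm families compared above (not the authors' original implementations), the fragment below computes the dominant eigenvalue of a sparse symmetric matrix with a Lanczos-type solver and with a solver from the preconditioned conjugate-gradient family, using SciPy.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh, lobpcg

        # Symmetric tridiagonal test matrix (-1, 2, -1), size 2000.
        n = 2000
        A = diags([np.full(n - 1, -1.0), np.full(n, 2.0), np.full(n - 1, -1.0)],
                  [-1, 0, 1]).tocsc()

        # Lanczos-type (implicitly restarted) solver for the largest algebraic eigenvalue.
        val_lanczos, _ = eigsh(A, k=1, which='LA')

        # LOBPCG, a preconditioned conjugate-gradient-style eigensolver, from a random start.
        x0 = np.random.default_rng(0).standard_normal((n, 1))
        val_cg, _ = lobpcg(A, x0, largest=True, tol=1e-8, maxiter=500)

        print(val_lanczos[0], val_cg[0])   # both approach 4.0 for large n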

  7. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

    In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve the projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality; clinical experience is therefore somewhat limited, and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over the use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and that, depending on the assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by type of diagnostic procedure and projections by year. Appendixes present the technical cost components of the diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of the published literature, selected medical diagnoses, and references.

  8. Decoding Computer Games: Studying “Special Operation 85”

    Directory of Open Access Journals (Sweden)

    Bahareh Jalalzadeh

    2009-11-01

    Like other media, computer games convey messages that have two aspects: explicit and implicit. By studying computer games semiologically and comparing them with narrative structures, the present study attempts to uncover the messages they convey. To that end, we have studied and decoded "Special Operation 85" as a semiological text. The results show that the game's features, such as its naming, the interests and motivations of the people involved, and the events narrated, all serve the producers' goal of introducing and publicizing Iranian-Islamic cultural values. Although this makes "Special Operation 85" a unique game, it fails in its attempt to produce a mythical personage within the Iranian-Islamic cultural context.

  9. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method, called "adaptive statistical IR V" (ASIR-V), by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. The scanned images were subsequently reconstructed with 7 different settings: FBP and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom, the CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. In the quantitative analyses, the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR, and the images reconstructed using ASIR-V also showed significantly improved spatial resolution compared with those of FBP and ASIR. ASIR-V thus provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
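
    For reference, the image-noise and CNR figures of merit used above are simple region-of-interest statistics; the sketch below shows one common way of computing them from a reconstructed slice. The array and the ROI positions are placeholders for illustration, not the study's protocol.

        import numpy as np

        def roi_stats(image, y, x, half=10):
            """Mean and standard deviation inside a square ROI centred at (y, x)."""
            roi = image[y - half:y + half, x - half:x + half]
            return float(roi.mean()), float(roi.std())

        def contrast_to_noise(image, target_yx, background_yx):
            """CNR = |mean_target - mean_background| / sd_background."""
            m_t, _ = roi_stats(image, *target_yx)
            m_b, sd_b = roi_stats(image, *background_yx)
            return abs(m_t - m_b) / sd_b

        # Placeholder slice: noisy uniform background with a higher-attenuation insert.
        rng = np.random.default_rng(1)
        slice_hu = rng.normal(0.0, 5.0, (256, 256))
        slice_hu[100:140, 100:140] += 50.0
        print(contrast_to_noise(slice_hu, target_yx=(120, 120), background_yx=(60, 60)))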

  10. Comparative randomised active drug controlled clinical trial of a herbal eye drop in computer vision syndrome.

    Science.gov (United States)

    Chatterjee, Pranab Kr; Bairagi, Debasis; Roy, Sudipta; Majumder, Nilay Kr; Paul, Ratish Ch; Bagchi, Sunil Ch

    2005-07-01

    A comparative double-blind, placebo-controlled clinical trial of a herbal eye drop (itone) was conducted to determine its efficacy and safety in 120 patients with computer vision syndrome. Patients using computers continuously for more than 3 hours per day, with symptoms of watering, redness, asthenia, irritation and foreign body sensation, and signs of conjunctival hyperaemia, corneal filaments and mucus, were studied. The 120 patients were randomly given either placebo, a tear substitute (tears plus) or itone in identical vials with specific code numbers, and were instructed to instil one drop four times daily for 6 weeks. Subjective and objective assessments were done at bi-weekly intervals. In computer vision syndrome, both subjective and objective improvements were noticed with itone drops, which were found to be significantly better than placebo.

  11. Audio computer-assisted self interview compared to traditional interview in an HIV-related behavioral survey in Vietnam.

    Science.gov (United States)

    Le, Linh Cu; Vu, Lan T H

    2012-10-01

    Globally, population surveys on HIV/AIDS and other sensitive topics have been using audio computer-assisted self-interview for many years. This interview technique, however, is still new to Vietnam, and little is known about its application and impact in general population surveys. One plausible hypothesis is that residents of Vietnam interviewed using this technique may provide a higher response rate and be more willing to reveal their true behaviors than if interviewed with traditional methods. This study aims to compare audio computer-assisted self-interview with traditional face-to-face personal interview and self-administered interview with regard to rates of refusal and affirmative responses to questions on sensitive topics related to HIV/AIDS. In June 2010, a randomized study was conducted in three cities (Ha Noi, Da Nang and Can Tho), using a sample of 4049 residents aged 15 to 49 years. Respondents were randomly assigned to one of three interviewing methods: audio computer-assisted self-interview, personal face-to-face interview, and self-administered paper interview. Instead of providing answers directly to interviewer questions as with traditional methods, audio computer-assisted self-interview respondents read the questions displayed on a laptop screen, while listening to the questions through audio headphones, and then entered their responses using the laptop keyboard. A MySQL database was used for data management, and the SPSS statistical package version 18 was used for data analysis with bivariate and multivariate statistical techniques. Rates of high-risk behaviors and mean values of continuous variables were compared across the three data collection methods. Audio computer-assisted self-interview showed advantages over the comparison techniques, achieving lower refusal rates and yielding higher reported prevalence of some sensitive and risk behaviors (perhaps an indication of more truthful answers). Premarital sex was reported by 20.4% in the audio computer-assisted self-interview survey

  12. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    International Nuclear Information System (INIS)

    Kim, Jae Duck; Kim, Seung Kug

    1990-01-01

    The purpose of this study was to evaluate the feasibility of diagnosing common periapical lesions by computer. The authors used a domestic personal computer, adapted the RF (Rapid File) application program to the purpose of this study, and then entered as basic data the information obtained through the collection, analysis and classification of the clinical and radiological features of common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. The clinical and radiographic features of the 256 cases were then entered into the RF program for diagnosis, and the computer diagnosis was compared with the withheld final diagnosis established by clinical and histopathological examination. The results were as follows: 1. For cysts, diagnosis through the computer program showed somewhat lower accuracy (80.22%) compared with the accuracy of the radiologists (90.1%). 2. For granulomas, diagnosis through the computer program showed somewhat higher accuracy (75.7%) compared with the accuracy of the radiologists (70.3%). 3. For periapical abscesses, the diagnostic accuracy was 88% for both. 4. The average diagnostic accuracy over the 256 cases was somewhat lower for the computer program (81.2%) than for the radiologists (82.8%). 5. The basic data applied to the computer-aided radiographic diagnosis of common periapical lesions were judged to be usable.

  13. A Study on the Radiographic Diagnosis of Common Periapical Lesions by Using Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Duck; Kim, Seung Kug [Dept. of Oral Radiology, College of Dentistry, Chosun University, Kwangju (Korea, Republic of)

    1990-08-15

    The purpose of this study was to evaluate the feasibility of diagnosing common periapical lesions by computer. The authors used a domestic personal computer, adapted the RF (Rapid File) application program to the purpose of this study, and then entered as basic data the information obtained through the collection, analysis and classification of the clinical and radiological features of common periapical lesions. The 256 cases (cyst 91, periapical granuloma 74, periapical abscess 91) were obtained from the chart recordings and radiographs of patients diagnosed or treated for common periapical lesions during the past 8 years (1983-1990) at the infirmary of the Dental School, Chosun University. The clinical and radiographic features of the 256 cases were then entered into the RF program for diagnosis, and the computer diagnosis was compared with the withheld final diagnosis established by clinical and histopathological examination. The results were as follows: 1. For cysts, diagnosis through the computer program showed somewhat lower accuracy (80.22%) compared with the accuracy of the radiologists (90.1%). 2. For granulomas, diagnosis through the computer program showed somewhat higher accuracy (75.7%) compared with the accuracy of the radiologists (70.3%). 3. For periapical abscesses, the diagnostic accuracy was 88% for both. 4. The average diagnostic accuracy over the 256 cases was somewhat lower for the computer program (81.2%) than for the radiologists (82.8%). 5. The basic data applied to the computer-aided radiographic diagnosis of common periapical lesions were judged to be usable.

  14. A comparative study of cranial, blunt trauma fractures as seen at medicolegal autopsy and by Computed Tomography

    International Nuclear Information System (INIS)

    Jacobsen, Christina; Bech, Birthe H; Lynnerup, Niels

    2009-01-01

    Computed tomography (CT) has become a widely used supplement to medicolegal autopsies at several forensic institutes. Among other things, it has proven very valuable for visualising fractures of the cranium. CT scan data are also being used to create head models for biomechanical trauma analysis by finite element analysis. If CT scan data are to be used for creating individual head models for retrograde trauma analysis in the future, we need to ascertain how well cranial fractures are captured by CT. The purpose of this study was to compare the diagnostic agreement between CT and autopsy regarding cranial fractures, and especially the precision with which cranial fractures are recorded. The autopsy fracture diagnosis was compared to the diagnosis of two CT readings (reconstructed with multiplanar and maximum intensity projection reconstructions) by registering the fractures on schematic drawings. The extent of the fractures was quantified by merging 3-dimensional datasets from the autopsy (input by 3D digitizer tracing) and from the CT scan. The results showed good diagnostic agreement regarding fractures localised in the posterior fossa, while fracture diagnosis in the medial and anterior fossae was difficult at the first CT scan reading. The fracture diagnosis improved during the second CT scan reading; thus, using two different CT reconstructions improved diagnosis in the medial fossa and at the impact points in the cranial vault. However, fracture diagnosis in the anterior and medial fossae and of hairline fractures in general remained difficult. The study showed that the forensically important fracture systems were to a large extent diagnosed on CT images using multiplanar and maximum intensity projection reconstructions. Difficulties remained in the minute diagnosis of hairline fractures. These inconsistencies need to be resolved in order to use CT scan data of victims for individual head modelling and trauma analysis.

  15. Comparative analysis of 11 different radioisotopes for palliative treatment of bone metastases by computational methods

    Energy Technology Data Exchange (ETDEWEB)

    Guerra Liberal, Francisco D. C., E-mail: meb12020@fe.up.pt, E-mail: adriana-tavares@msn.com; Tavares, Adriana Alexandre S., E-mail: meb12020@fe.up.pt, E-mail: adriana-tavares@msn.com; Tavares, João Manuel R. S., E-mail: tavares@fe.up.pt [Instituto de Engenharia Mecânica e Gestão Industrial, Faculdade de Engenharia, Universidade do Porto, Rua Dr. Roberto Frias s/n, Porto 4200-465 (Portugal)

    2014-11-01

    Purpose: Throughout the years, the palliative treatment of bone metastases using bone-seeking radiotracers has been part of the therapeutic resources used in oncology, but the choice of which bone-seeking agent to use is not consensual across sites, and limited data are available comparing the characteristics of each radioisotope. Computational simulation is a simple and practical method to study and compare a variety of radioisotopes for different medical applications, including the palliative treatment of bone metastases. This study aims to evaluate and compare 11 different radioisotopes currently in use or under research for the palliative treatment of bone metastases using computational methods. Methods: Computational models were used to estimate the percentage of deoxyribonucleic acid (DNA) damage (fast Monte Carlo damage algorithm), the probability of correct DNA repair (Monte Carlo excision repair algorithm), and the radiation-induced cellular effects (virtual cell radiobiology algorithm) post-irradiation with selected particles emitted by phosphorus-32 (³²P), strontium-89 (⁸⁹Sr), yttrium-90 (⁹⁰Y), tin-117 (¹¹⁷ᵐSn), samarium-153 (¹⁵³Sm), holmium-166 (¹⁶⁶Ho), thulium-170 (¹⁷⁰Tm), lutetium-177 (¹⁷⁷Lu), rhenium-186 (¹⁸⁶Re), rhenium-188 (¹⁸⁸Re), and radium-223 (²²³Ra). Results: ²²³Ra alpha particles, ¹⁷⁷Lu beta-minus particles, and ¹⁷⁰Tm beta-minus particles induced the highest cell death of all investigated particles and radioisotopes. The cell survival fraction measured post-irradiation with beta-minus particles emitted by ⁸⁹Sr and ¹⁵³Sm, two of the most frequently used radionuclides in the palliative treatment of bone metastases in routine clinical practice, was higher than that measured with ¹⁷⁷Lu beta-minus particles and ²²³Ra alpha particles. Conclusions: ²²³Ra and ¹⁷⁷Lu hold the highest potential for palliative treatment of bone metastases of all

  16. Differences in prevalence of self-reported musculoskeletal symptoms among computer and non-computer users in a Nigerian population: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Ayanniyi O

    2010-08-01

    Abstract. Background: Literature abounds on the prevalent nature of self-reported musculoskeletal symptoms (SRMS) among computer users, but studies that actually compare this with non-computer users are meagre, thereby reducing the strength of the evidence. This study compared the prevalence of SRMS between computer and non-computer users and assessed the risk factors associated with SRMS. Methods: A total of 472 participants, comprising equal numbers of age- and sex-matched computer and non-computer users, were assessed for the presence of SRMS. Information concerning musculoskeletal symptoms and discomfort in the neck, shoulders, upper back, elbows, wrists/hands, low back, hips/thighs, knees and ankles/feet was obtained using the standardized Nordic questionnaire. Results: The prevalence of SRMS was significantly higher in the computer users than in the non-computer users both over the past 7 days (χ² = 39.11, p = 0.001) and over the past 12 months (χ² = 53.56, p = 0.001). The odds of reporting musculoskeletal symptoms were lowest for participants above the age of 40 years (OR = 0.42, 95% CI = 0.31-0.64 over the past 7 days; OR = 0.61, 95% CI = 0.47-0.77 over the past 12 months) and were also reduced in female participants. Increasing daily hours and accumulated years of computer use, and tasks of data processing and design/graphics, were significantly associated with SRMS. Conclusion: The prevalence of SRMS was significantly higher in the computer users than in the non-computer users, and younger age, being male, working longer hours daily, increasing years of computer use, data entry tasks and computer design/graphics work were the significant risk factors for reporting musculoskeletal symptoms among the computer users. Computer use may explain the increase in the prevalence of SRMS among the computer users.

  17. Computations for the 1:5 model of the THTR pressure vessel compared with experimental results

    International Nuclear Information System (INIS)

    Stangenberg, F.

    1972-01-01

    In this report, experimental results measured in 1971 on the 1:5 model of the prestressed concrete pressure vessel of the THTR nuclear power station at Schmehausen are compared with the results of axisymmetric computations. Linear-elastic computations were performed, as well as approximate computations for overload pressures taking into consideration the influence of the load history (prestressing, temperature, creep) and the effects of the steel components. (orig.)

  18. Comparing Experiment and Computation of Hypersonic Laminar Boundary Layers with Isolated Roughness

    Science.gov (United States)

    Bathel, Brett F.; Iyer, Prahladh S.; Mahesh, Krishnan; Danehy, Paul M.; Inman, Jennifer A.; Jones, Stephen B.; Johansen, Craig T.

    2014-01-01

    Streamwise velocity profile behavior in a hypersonic laminar boundary layer in the presence of an isolated roughness element is presented for an edge Mach number of 8.2. Two different roughness element types are considered: a 2-mm tall, 4-mm diameter cylinder, and a 2-mm radius hemisphere. Measurements of the streamwise velocity behavior using nitric oxide (NO) planar laser-induced fluorescence (PLIF) molecular tagging velocimetry (MTV) have been performed on a 20-degree wedge model. The top surface of this model acts as a flat-plate and is oriented at 5 degrees with respect to the freestream flow. Computations using direct numerical simulation (DNS) of these flows have been performed and are compared to the measured velocity profiles. Particular attention is given to the characteristics of velocity profiles immediately upstream and downstream of the roughness elements. In these regions, the streamwise flow can experience strong deceleration or acceleration. An analysis in which experimentally measured MTV profile displacements are compared with DNS particle displacements is performed to determine if the assumption of constant velocity over the duration of the MTV measurement is valid. This assumption is typically made when reporting MTV-measured velocity profiles, and may result in significant errors when comparing MTV measurements to computations in regions with strong deceleration or acceleration. The DNS computations with the cylindrical roughness element presented in this paper were performed with and without air injection from a rectangular slot upstream of the cylinder. This was done to determine the extent to which gas seeding in the MTV measurements perturbs the boundary layer flowfield.

  19. Electromagnetic computation methods for lightning surge protection studies

    CERN Document Server

    Baba, Yoshihiro

    2016-01-01

    This book is the first to consolidate current research and to examine the theories of electromagnetic computation methods in relation to lightning surge protection. The authors introduce and compare existing electromagnetic computation methods such as the method of moments (MOM), the partial element equivalent circuit (PEEC), the finite element method (FEM), the transmission-line modeling (TLM) method, and the finite-difference time-domain (FDTD) method. The application of the FDTD method to lightning protection studies is a topic that has matured through many practical applications in the past decade, and the authors explain the derivation of Maxwell's equations as required by the FDTD method, and the modeling of the various electrical components needed to compute lightning electromagnetic fields and surges with it. The book describes the application of the FDTD method to current and emerging problems of lightning surge protection of continuously more complex installations, particularly in critical infrastructures of e...
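
    As a concrete illustration of the update scheme that the FDTD method derives from Maxwell's curl equations, the sketch below implements a minimal one-dimensional free-space leapfrog update. It is a generic textbook scheme with arbitrary grid and source parameters, not code taken from the book.

```python
import numpy as np

# Minimal 1-D free-space FDTD (leapfrog) sketch; a generic textbook scheme,
# not code from the book reviewed above. Ez and Hy are staggered in space/time.
nz, nt = 400, 600                   # grid points, time steps (arbitrary)
c0, dz = 3.0e8, 1.0e-3              # speed of light [m/s], cell size [m]
dt = dz / (2.0 * c0)                # Courant number 0.5 keeps the scheme stable
eps0, mu0 = 8.854e-12, 4.0e-7 * np.pi

Ez = np.zeros(nz)
Hy = np.zeros(nz - 1)

for n in range(nt):
    # Update H from the spatial difference of E, then E from that of H.
    Hy += dt / (mu0 * dz) * (Ez[1:] - Ez[:-1])
    Ez[1:-1] += dt / (eps0 * dz) * (Hy[1:] - Hy[:-1])
    # Soft Gaussian source, loosely standing in for an injected surge current.
    Ez[nz // 2] += np.exp(-((n - 60.0) / 20.0) ** 2)

print("peak |Ez| on the grid after", nt, "steps:", np.abs(Ez).max())
```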

  20. Computed tomography study of otitis media

    International Nuclear Information System (INIS)

    Bahia, Paulo Roberto Valle; Marchiori, Edson

    1997-01-01

    The findings of computed tomography (CT) in 89 patients clinically suspected of having otitis media were studied in this work. The results were compared with the clinical diagnosis, otoscopy, surgical findings and previous data. In our analysis we studied seven patients with acute otitis media and 83 patients with chronic otitis media. The patients with acute otitis media underwent CT examinations to evaluate possible spread to the central nervous system. The diagnosis of cholesteatoma, its extension and its complications were the main indication for CT in the study of chronic otitis media. The main findings of cholesteatomatous otitis were occupation of the epitympanum, bony wall destruction and ossicular chain erosion. CT demonstrated great sensitivity in the diagnosis of cholesteatoma. (author)

  1. Comparative study for small computer supported clearance determination with 131iodine hippuran using CdTe detectors

    International Nuclear Information System (INIS)

    Duerr, G.

    1986-01-01

    With the goal of working out a simple, non-invasive method for total clearance determination that is also suitable for immobile patients, we carried out this clearance study with CdTe semiconductor detectors. The 131-iodine hippuran clearance determination was carried out on 69 patients in the nuclear medicine department of the Radiological Policlinic, within the framework of routine diagnosis of ambulant and stationary patients, using a gamma camera and a connected evaluation system. At the same time we recorded the shoulder curves using two CdTe semiconductor detectors and stored the data in a portable semiconductor memory. Next, the hypotheses for routine use with commercially common small computers were worked out. The plasma disappearance curves recorded over the shoulder region were evaluated with a small computer according to the method of the modified Oberhausen tables and the Oberhausen formula. (orig./DG) [de

  2. A parametric study of a solar calcinator using computational fluid dynamics

    International Nuclear Information System (INIS)

    Fidaros, D.K.; Baxevanou, C.A.; Vlachos, N.S.

    2007-01-01

    In this work a horizontal rotating solar calcinator is studied numerically using computational fluid dynamics. The specific solar reactor is a 10 kW model designed and used for efficiency studies. The numerical model is based on the solution of the Navier-Stokes equations for the gas flow, and on Lagrangean dynamics for the discrete particles. All necessary mathematical models were developed and incorporated into a computational fluid dynamics model with the influence of turbulence simulated by a two-equation (RNG k-ε) model. The efficiency of the reactor was calculated for different thermal inputs, feed rates, rotational speeds and particle diameters. The numerically computed degrees of calcination compared well with equivalent experimental results

  3. MetaCompare: A computational pipeline for prioritizing environmental resistome risk.

    Science.gov (United States)

    Oh, Min; Pruden, Amy; Chen, Chaoqi; Heath, Lenwood S; Xia, Kang; Zhang, Liqing

    2018-04-26

    The spread of antibiotic resistance is a growing public health concern. While numerous studies have highlighted the importance of environmental sources and pathways of the spread of antibiotic resistance, a systematic means of comparing and prioritizing risks represented by various environmental compartments is lacking. Here we introduce MetaCompare, a publicly-available tool for ranking 'resistome risk,' which we define as the potential for antibiotic resistance genes (ARGs) to be associated with mobile genetic elements (MGEs) and mobilize to pathogens based on metagenomic data. A computational pipeline was developed in which each ARG is evaluated based on relative abundance, mobility, and presence within a pathogen. This is determined through assembly of shotgun sequencing data and analysis of contigs containing ARGs to determine if they contain sequence similarity to MGEs or human pathogens. Based on the assembled metagenomes, samples are projected into a 3-D hazard space and assigned resistome risk scores. To validate, we tested previously published metagenomic data derived from distinct aquatic environments. Based on unsupervised machine learning, the test samples clustered in the hazard space in a manner consistent with their origin. The derived scores produced a well-resolved ascending resistome risk ranking of: wastewater treatment plant effluent, dairy lagoon, hospital sewage.
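
    The scoring idea described above, counting how often resistance genes co-occur with mobility markers and pathogen-like sequence on the same contigs and using those three proportions as coordinates, can be illustrated with a toy sketch. The code below is NOT the published MetaCompare algorithm; the class, field names and sample data are all hypothetical.

```python
# Toy illustration of projecting a sample into a 3-D "hazard space"; this is NOT
# the published MetaCompare scoring algorithm, and all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Contig:
    has_arg: bool          # contig carries an antibiotic resistance gene
    has_mge: bool          # contig also carries a mobile genetic element marker
    pathogen_like: bool    # contig has similarity to a human pathogen genome

def hazard_coordinates(contigs):
    n = len(contigs)
    arg = sum(c.has_arg for c in contigs) / n                      # ARG abundance axis
    arg_mge = sum(c.has_arg and c.has_mge for c in contigs) / n    # mobility axis
    arg_mge_path = sum(c.has_arg and c.has_mge and c.pathogen_like
                       for c in contigs) / n                       # pathogen axis
    return arg, arg_mge, arg_mge_path

sample = [Contig(True, True, False), Contig(True, False, False),
          Contig(False, False, False), Contig(True, True, True)]
print(hazard_coordinates(sample))   # (0.75, 0.5, 0.25)
```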

  4. Parallel computing of physical maps--a comparative study in SIMD and MIMD parallelism.

    Science.gov (United States)

    Bhandarkar, S M; Chirravuri, S; Arnold, J

    1996-01-01

    Ordering clones from a genomic library into physical maps of whole chromosomes presents a central computational problem in genetics. Chromosome reconstruction via clone ordering is usually isomorphic to the NP-complete Optimal Linear Arrangement problem. Parallel SIMD and MIMD algorithms for simulated annealing based on Markov chain distribution are proposed and applied to the problem of chromosome reconstruction via clone ordering. Perturbation methods and problem-specific annealing heuristics are proposed and described. The SIMD algorithms are implemented on a 2048-processor MasPar MP-2 system, an SIMD 2-D toroidal mesh architecture, whereas the MIMD algorithms are implemented on an 8-processor Intel iPSC/860, an MIMD hypercube architecture. A comparative analysis of the various SIMD and MIMD algorithms is presented in which the convergence, speedup, and scalability characteristics of the various algorithms are analyzed and discussed. On a fine-grained, massively parallel SIMD architecture with a low synchronization overhead, such as the MasPar MP-2, a parallel simulated annealing algorithm based on multiple periodically interacting searches performs the best. For a coarse-grained MIMD architecture with high synchronization overhead, such as the Intel iPSC/860, a parallel simulated annealing algorithm based on multiple independent searches yields the best results. In either case, distribution of clonal data across multiple processors is shown to exacerbate the tendency of the parallel simulated annealing algorithm to get trapped in a local optimum.
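
    The core loop that both the SIMD and MIMD variants parallelize is an ordinary simulated-annealing search over clone orderings. The serial sketch below, using a random and purely illustrative overlap matrix, shows that perturb/evaluate/accept loop for a toy weighted linear-arrangement objective; it is not the MasPar or iPSC/860 implementation.

```python
import math, random

# Minimal serial simulated-annealing sketch for a toy linear-arrangement cost;
# the parallel algorithms in the record above distribute loops like this one.
# The overlap matrix is random and purely illustrative.
random.seed(0)
n = 20
overlap = [[random.random() for _ in range(n)] for _ in range(n)]

def cost(order):
    # Weighted linear-arrangement objective: strongly overlapping clones
    # should end up close together in the ordering.
    return sum(overlap[order[i]][order[j]] * abs(i - j)
               for i in range(n) for j in range(i + 1, n))

order = list(range(n))
T, alpha = 10.0, 0.995
cur = best = cost(order)
for _ in range(20000):
    i, j = random.sample(range(n), 2)            # perturbation: swap two clones
    order[i], order[j] = order[j], order[i]
    new = cost(order)
    if new < cur or random.random() < math.exp((cur - new) / T):
        cur = new
        best = min(best, cur)
    else:
        order[i], order[j] = order[j], order[i]  # reject: undo the swap
    T *= alpha                                   # cool the temperature
print("best arrangement cost found:", round(best, 2))
```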

  5. Comparative validity and reproducibility study of various landmark-oriented reference planes in 3-dimensional computed tomographic analysis for patients receiving orthognathic surgery.

    Science.gov (United States)

    Lin, Hsiu-Hsia; Chuang, Ya-Fang; Weng, Jing-Ling; Lo, Lun-Jou

    2015-01-01

    Three-dimensional computed tomographic imaging has become popular in clinical evaluation, treatment planning, surgical simulation, and outcome assessment for maxillofacial intervention. The purposes of this study were to investigate whether there is any correlation among landmark-based horizontal reference planes and to validate the reproducibility and reliability of landmark identification. Preoperative and postoperative cone-beam computed tomographic images of patients who had undergone orthognathic surgery were collected. Landmark-oriented reference planes, including the Frankfort horizontal plane (FHP) and the lateral semicircular canal plane (LSP), were established. Four FHPs were defined by selecting 3 points from the orbitale, porion, or midpoint of paired points. The LSP passed through both lateral semicircular canal points and the nasion. The distances between the maxillary or mandibular teeth and the reference planes were measured, and the differences between the 2 sides were calculated and compared. The precision in locating the landmarks was evaluated by performing repeated tests, and the intraobserver reproducibility and interobserver reliability were assessed. A total of 30 patients with facial deformity and malocclusion were recruited (10 patients with facial symmetry, 10 patients with facial asymmetry, and 10 patients with cleft lip and palate). Comparing the differences among the 5 reference planes showed no statistically significant difference in any of the patient groups. Regarding intraobserver reproducibility, the mean differences in the 3 coordinates varied from 0 to 0.35 mm, with correlation coefficients between 0.96 and 1.0, showing high correlation between repeated tests. Regarding interobserver reliability, the mean differences in the 3 coordinates varied from 0 to 0.47 mm, with correlation coefficients between 0.88 and 1.0, exhibiting high correlation between the different examiners. The 5 horizontal reference planes were reliable and

  6. Comparing the influence of spectro-temporal integration in computational speech segregation

    DEFF Research Database (Denmark)

    Bentsen, Thomas; May, Tobias; Kressner, Abigail Anne

    2016-01-01

    The goal of computational speech segregation systems is to automatically segregate a target speaker from interfering maskers. Typically, these systems include a feature extraction stage in the front-end and a classification stage in the back-end. A spectro-temporal integration strategy can be applied in either the front-end, using the so-called delta features, or in the back-end, using a second classifier that exploits the posterior probability of speech from the first classifier across a spectro-temporal window. This study systematically analyzes the influence of such stages on segregation … metric that comprehensively predicts computational segregation performance and correlates well with intelligibility. The outcome of this study could help to identify the most effective spectro-temporal integration strategy for computational segregation systems.

  7. Costs of cloud computing for a biometry department. A case study.

    Science.gov (United States)

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  8. A computational study of high entropy alloys

    Science.gov (United States)

    Wang, Yang; Gao, Michael; Widom, Michael; Hawk, Jeff

    2013-03-01

    As a new class of advanced materials, high-entropy alloys (HEAs) exhibit a wide variety of excellent materials properties, including high strength, reasonable ductility with appreciable work-hardening, corrosion and oxidation resistance, wear resistance, and outstanding diffusion-barrier performance, especially at elevated and high temperatures. In this talk, we will explain our computational approach to the study of HEAs that employs the Korringa-Kohn-Rostoker coherent potential approximation (KKR-CPA) method. The KKR-CPA method uses Green's function technique within the framework of multiple scattering theory and is uniquely designed for the theoretical investigation of random alloys from the first principles. The application of the KKR-CPA method will be discussed as it pertains to the study of structural and mechanical properties of HEAs. In particular, computational results will be presented for AlxCoCrCuFeNi (x = 0, 0.3, 0.5, 0.8, 1.0, 1.3, 2.0, 2.8, and 3.0), and these results will be compared with experimental information from the literature.

  9. Computational and the real energy performance of a single-family residential building in Poland – an attempt to compare: a case study

    Directory of Open Access Journals (Sweden)

    Kowalski Piotr

    2017-01-01

    Full Text Available The paper presents energy use for heating and ventilation (one of the energy performance components) determined in three ways. A single-family building located near Wroclaw in Poland is analyzed as a case study. The first and second variants are computational, and the third presents the actual measured energy consumption. The computational variants are based on the Polish methodology for the EPC (the Energy Performance Certificate), which in turn is based on the Energy Performance of Buildings Directive 2010/31/EU. Energy use for heating and ventilation is calculated using the monthly method presented in EN ISO 13790. In the first computational option, standard input data (parameters such as indoor and outdoor air temperature taken from standards and regulations) are used. In the second variant, these input data are partially taken from measurements. The energy use results from both computational variants are compared to the actual measured energy consumption. On the basis of this comparison, the influence of three factors on the energy use calculations is analyzed: solar radiation heat gains, building air tightness and the SCOP of the heat pump. The conclusions aim to point out the differences between the computational results and the actual energy consumption.
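
    For orientation, the monthly method of EN ISO 13790 referred to above balances heat losses against utilized gains; in outline (with notation simplified here rather than copied from the standard) the heating need for each month is

```latex
% Monthly heat balance, EN ISO 13790 (simplified notation):
\begin{align}
  Q_{H,nd} &= Q_{H,ht} - \eta_{H,gn}\, Q_{H,gn}, \\
  Q_{H,ht} &= Q_{tr} + Q_{ve}, \qquad Q_{H,gn} = Q_{int} + Q_{sol},
\end{align}
```

    where Q_{H,ht} collects the transmission and ventilation losses, Q_{H,gn} the internal and solar gains, and η_{H,gn} is the gain utilization factor.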

  10. Verification of SACI-2 computer code comparing with experimental results of BIBLIS-A and LOOP-7 computer code

    International Nuclear Information System (INIS)

    Soares, P.A.; Sirimarco, L.F.

    1984-01-01

    SACI-2 is a computer code created to study the dynamic behaviour of a PWR nuclear power plant. To evaluate the quality of its results, SACI-2 was used to recalculate commissioning tests performed at the BIBLIS-A nuclear power plant and to calculate postulated transients for the Angra-2 reactor. The SACI-2 results for BIBLIS-A showed agreement as good as that of the results calculated with the KWU Loop 7 computer code for Angra-2. (E.G.) [pt

  11. The feasibility of using UML to compare the impact of different brands of computer system on the clinical consultation.

    Science.gov (United States)

    Kumarapeli, Pushpa; de Lusignan, Simon; Koczan, Phil; Jones, Beryl; Sheeler, Ian

    2007-01-01

    UK general practice is universally computerised, with computers used in the consulting room at the point of care. Practices use a range of different brands of computer system, which have developed organically to meet the needs of general practitioners and health service managers. Unified Modelling Language (UML) is a standard modelling and specification notation widely used in software engineering. To examine the feasibility of UML notation to compare the impact of different brands of general practice computer system on the clinical consultation. Multi-channel video recordings of simulated consultation sessions were recorded on three different clinical computer systems in common use (EMIS, iSOFT Synergy and IPS Vision). User action recorder software recorded time logs of keyboard and mouse use, and pattern recognition software captured non-verbal communication. The outputs of these were used to create UML class and sequence diagrams for each consultation. We compared 'definition of the presenting problem' and 'prescribing', as these tasks were present in all the consultations analysed. Class diagrams identified the entities involved in the clinical consultation. Sequence diagrams identified common elements of the consultation (such as prescribing) and enabled comparisons to be made between the different brands of computer system. The clinician and computer system interaction varied greatly between the different brands. UML sequence diagrams are useful in identifying common tasks in the clinical consultation, and for contrasting the impact of the different brands of computer system on the clinical consultation. Further research is needed to see if patterns demonstrated in this pilot study are consistently displayed.

  12. Comparative study on findings of the brain computed tomography (X-ray-CT) and dynamic topography of VEP (VDT)

    International Nuclear Information System (INIS)

    Matsuura, Masashi

    1985-01-01

    A comparative study between morphological X-ray-CT and functional VDT was conducted in 20 cases of cerebral disease with visual dysfunction. The subjects were patients with cerebral infarction, intracranial hemorrhage, hemispherectomy, traumatic brain atrophy, brain tumor, Creutzfeldt-Jakob disease, anoxic encephalopathy, porencephaly, microcephaly and optic tract lesions. VEP topography was performed with flash stimulation, and brain electrical activity mappings were displayed by an EEG topography computer. In 9 of the 20 cases, abolished function in VDT correlated with the defective findings of X-ray-CT. Cases with homonymous hemianopsia showed 2 types of BEAM. In cases with a lesion in the inner surface of the occipital lobe, asymmetric electric activity was distributed along the sagittal axis of the scalp, while in cases with an outer-surface lesion of the occipital lobe, asymmetric electric activity appeared along the coronal axis. In cases with multifocal brain lesions on X-ray-CT, there was no regular tendency in the abnormality of VDT. Various aberrations of VEP and VDT, such as component defect, stagnation, reduction, condensation and abnormal flow, were demonstrated. In a case of optic tract lesion, X-ray-CT showed no pathological findings but VDT showed a remarkable asymmetry of brain activity. (author)

  13. Computed tomography scanner applied to soil compaction studies

    International Nuclear Information System (INIS)

    Vaz, C.M.P.

    1989-11-01

    The soil compaction problem was studied using a first-generation computed tomography (CT) scanner. This apparatus acquires images of cross-sections of soil samples with a resolution of a few millimeters. We performed the following laboratory and field experiments: basic experiments on equipment calibration and resolution; measurements of thin compacted soil layers; measurements of soil compaction caused by agricultural tools; stress-strain modelling in confined soil samples at several moisture contents; and characterization of the soil bulk density profile with samples collected in a hole (trench), compared with a cone penetrometer technique. (author)

  14. Comparing genomes: databases and computational tools for comparative analysis of prokaryotic genomes - DOI: 10.3395/reciis.v1i2.Sup.105en

    Directory of Open Access Journals (Sweden)

    Marcos Catanho

    2007-12-01

    Full Text Available Since the 1990's, the complete genetic code of more than 600 living organisms has been deciphered, such as bacteria, yeasts, protozoan parasites, invertebrates and vertebrates, including Homo sapiens, and plants. More than 2,000 other genome projects representing medical, commercial, environmental and industrial interests, or comprising model organisms, important for the development of the scientific research, are currently in progress. The achievement of complete genome sequences of numerous species combined with the tremendous progress in computation that occurred in the last few decades allowed the use of new holistic approaches in the study of genome structure, organization and evolution, as well as in the field of gene prediction and functional classification. Numerous public or proprietary databases and computational tools have been created attempting to optimize the access to this information through the web. In this review, we present the main resources available through the web for comparative analysis of prokaryotic genomes. We concentrated on the group of mycobacteria that contains important human and animal pathogens. The birth of Bioinformatics and Computational Biology and the contributions of these disciplines to the scientific development of this field are also discussed.

  15. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important application of a nuclear research reactor, and its use and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special software (not yet commercially available) for analyzing the spectra of multiple elements in a single analysis. The analysis had been carried out using a single-spectrum software analyzer and comparing each result manually, a method which significantly degrades the quality of the analysis. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter which improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The developed software was benchmarked against the IAEA spectrum and operated well, with less than 10% deviation.
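
    The non-linear fitting step described above is, in essence, a peak fit over a filtered gamma-ray spectrum. The sketch below shows that kind of fit for a single Gaussian photopeak on a linear background using generic Python tooling; it is illustrative only and is not the PASAN-K code, which was written in the G language. The simulated spectrum parameters are arbitrary.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.optimize import curve_fit

# Illustrative non-linear peak fit (not the PASAN-K implementation):
# a single Gaussian photopeak sitting on a linear background.
def peak_model(ch, area, centroid, sigma, b0, b1):
    gauss = area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-(ch - centroid) ** 2 / (2.0 * sigma ** 2))
    return gauss + b0 + b1 * ch

rng = np.random.default_rng(1)
channels = np.arange(460.0, 540.0)
counts = rng.poisson(peak_model(channels, 5000.0, 500.0, 2.5, 40.0, 0.05))

smoothed = uniform_filter1d(counts.astype(float), size=3)   # crude noise filter before fitting
p0 = [counts.sum(), channels[np.argmax(smoothed)], 2.0, counts.min(), 0.0]
popt, _ = curve_fit(peak_model, channels, smoothed, p0=p0)
print("fitted area %.0f, centroid %.1f, FWHM %.2f" % (popt[0], popt[1], 2.355 * popt[2]))
```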

  16. Assessing the relationship between computational speed and precision: a case study comparing an interpreted versus compiled programming language using a stochastic simulation model in diabetes care.

    Science.gov (United States)

    McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P

    2010-01-01

    Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and efficiency gains achieved using variance reduction and an executable programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA - for which the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
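
    Antithetic variates, the variance-reduction device mentioned above, pair each uniform draw u with its mirror 1 − u so that the two halves of a replicate pair are negatively correlated and part of the sampling noise cancels. The sketch below demonstrates the idea on a deliberately simple integrand; it is generic Monte Carlo code, not the UKPDS 68 diabetes model.

```python
import numpy as np

# Generic antithetic-variates demonstration (not the UKPDS 68 model):
# estimate E[f(U)] for U ~ Uniform(0, 1) with and without antithetic pairing.
rng = np.random.default_rng(42)
f = lambda u: np.exp(u)               # stand-in for one simulated outcome

n = 100_000
u = rng.random(n)

plain = f(u)                                            # standard Monte Carlo
pairs = 0.5 * (f(u[: n // 2]) + f(1.0 - u[: n // 2]))   # antithetic pairs, same budget

print("plain:      mean %.4f, std. error %.5f" % (plain.mean(), plain.std(ddof=1) / np.sqrt(n)))
print("antithetic: mean %.4f, std. error %.5f" % (pairs.mean(), pairs.std(ddof=1) / np.sqrt(n // 2)))
# True value is e - 1 ~ 1.7183; for the same number of function evaluations the
# antithetic estimate has a markedly smaller standard error.
```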

  17. Early characterization of atherosclerotic coronary plaques with multidetector computed tomography in patients with acute coronary syndrome. A comparative study with intravascular ultrasound

    Energy Technology Data Exchange (ETDEWEB)

    Iriart, Xavier; Dos-Santos, Pierre [Universite Bordeaux 2, Inserm U. 441 Atherosclerose, Bordeaux (France); Brunot, Sebastien [CHU de Bordeaux, Hopital du Haut-Leveque, Unite d' Imagerie Thoracique et Cardiovasculaire, Pessac (France); Unite de Soins Intensifs Cardiologiques, Pessac (France); Unite d' Imagerie Thoracique et Cardiovasculaire, Hopital Cardiologique, Pessac (France); Coste, Pierre; Leroux, Lionel [Universite Bordeaux 2, Inserm U. 441 Atherosclerose, Bordeaux (France); Unite de Soins Intensifs Cardiologiques, Pessac (France); Montaudon, Michel [Universite Bordeaux 2, Inserm U. 885 F 33076, Bordeaux (France); CHU de Bordeaux, Hopital du Haut-Leveque, Unite d' Imagerie Thoracique et Cardiovasculaire, Pessac (France); Labeque, Jean-Noel; Jais, Catherine [Unite de Soins Intensifs Cardiologiques, Pessac (France); Laurent, Francois [Universite Bordeaux 2, Inserm U. 885 F 33076, Bordeaux (France); CHU de Bordeaux, Hopital du Haut-Leveque, Unite d' Imagerie Thoracique et Cardiovasculaire, Pessac (France); Unite d' Imagerie Thoracique et Cardiovasculaire, Hopital Cardiologique, Pessac (France)

    2007-10-15

    We compared 16-slice computed tomography (CT) with intravascular ultrasound (IVUS) in their ability to identify the culprit lesion, and to assess plaque characterization and vascular remodelling in acute coronary syndrome (ACS). Twenty patients were prospectively studied. Coronary plaque identification and characterization were compared using 16-slice CT and 40-MHz catheter-based IVUS. Minimum lumen area (MLA), cross-sectional vessel area (CVA) and vessel remodelling were determined for each comparable lesion. One hundred and sixty-nine segments were compared and 84 plaques analysed. Sixteen-slice CT detected 95% of culprit lesions (19/20). No feature suggestive of plaque rupture was detected by 16-slice CT. Attenuation measurements within all lesions revealed different values for hypoechoic (38 {+-} 33 HU), hyperechoic (94 {+-} 44 HU), and calcified plaques (561 {+-} 216 HU), (P < 0.001). Agreement between 16-slice CT and IVUS on measuring MLA and CVA was evaluated using Bland-Altman analysis. Pearson and intra-class coefficient (ICC) were 0.81 and 0.70 for MLA, and 0.81 and 0.36 for CVA, for 16-slice CT and IVUS, respectively. Agreement between both techniques for vessel positive remodelling was moderate (kappa = 0.54, P < 0.001). Sixteen-slice CT has shown moderate accuracy in quantifying and characterizing coronary plaques compared with IVUS. Spatial resolution of 16-slice CT remains a major limitation, however, to accurately assess the complex lesions involved in ACS. (orig.)

  18. Splenic abnormalities: a comparative review of ultrasound, microbubble-enhanced ultrasound and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Peddu, P.; Shah, M.; Sidhu, P.S. E-mail: paul.sidhu@kingsch.nhs.uk

    2004-09-01

    The ultrasound appearances of abnormalities of the spleen are reviewed and images compared with computed tomography. Focal lesions, both benign and malignant, trauma, infarction and congenital abnormalities are presented. The use of microbubble ultrasound contrast media as an aid to identifying and characterizing abnormalities is discussed.

  19. Variance stabilization for computing and comparing grand mean waveforms in MEG and EEG.

    Science.gov (United States)

    Matysiak, Artur; Kordecki, Wojciech; Sielużycki, Cezary; Zacharias, Norman; Heil, Peter; König, Reinhard

    2013-07-01

    Grand means of time-varying signals (waveforms) across subjects in magnetoencephalography (MEG) and electroencephalography (EEG) are commonly computed as arithmetic averages and compared between conditions, for example, by subtraction. However, the prerequisite for these operations, homogeneity of the variance of the waveforms in time, and for most common parametric statistical tests also between conditions, is rarely met. We suggest that the heteroscedasticity observed instead results because waveforms may differ by factors and additive terms and follow a mixed model. We propose to apply the asinh-transformation to stabilize the variance in such cases. We demonstrate the homogeneous variance and the normal distributions of data achieved by this transformation using simulated waveforms, and we apply it to real MEG data and show its benefits. The asinh-transformation is thus an essential and useful processing step prior to computing and comparing grand mean waveforms in MEG and EEG. Copyright © 2013 Society for Psychophysiological Research.
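
    The transformation itself is simply y = asinh(x) = ln(x + sqrt(x^2 + 1)), which is roughly linear near zero and logarithmic for large |x|, so it compresses multiplicative between-subject differences while remaining defined for the signed values of MEG/EEG waveforms. The sketch below shows where such a step would sit in a grand-averaging workflow; the scale parameter and the random placeholder data are assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of asinh variance stabilization before grand averaging.
# The scale parameter and the placeholder data are illustrative only.
def asinh_transform(waveforms, scale=1.0):
    """waveforms: (n_subjects, n_timepoints); scale sets where the transform
    switches from its ~linear to its ~logarithmic regime."""
    return np.arcsinh(waveforms / scale)

rng = np.random.default_rng(0)
cond_a = rng.normal(size=(20, 500))          # placeholder per-subject waveforms
cond_b = rng.normal(size=(20, 500))

grand_a = asinh_transform(cond_a).mean(axis=0)
grand_b = asinh_transform(cond_b).mean(axis=0)
difference_wave = grand_a - grand_b          # condition comparison on the stabilized scale
print(difference_wave.shape)
```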

  20. A comparative study to validate the use of ultrasonography and computed tomography in patients with post-operative intra-abdominal sepsis

    International Nuclear Information System (INIS)

    Go, H.L.S.; Baarslag, H.J.; Vermeulen, H.; Lameris, J.S.; Legemate, D.A.

    2005-01-01

    Purpose: To validate abdominal ultrasonography and helical computed tomography for detecting causes of sepsis in patients after abdominal surgery and to determine improved criteria for their use. Materials and methods: Eighty-five consecutive surgical patients primarily operated on for non-infectious disease were included in this prospective study. Forty-one patients were admitted to the intensive care unit. All patients were suspected of an intra-abdominal sepsis after abdominal surgery. Both ultrasonography (US) and helical abdominal computed tomography (CT) were performed to investigate the origin of the intra-abdominal sepsis. The images from both US and CT were interpreted on a four-point scale by different radiologists or residents in radiology; the investigators were blinded to each other's test. Interpretations of US and CT were compared with a reference standard defined by the result of diagnostic aspiration of suspected fluid collections, (re)laparotomy, clinical course or the opinion of an independent panel. Likelihood ratios and post-test probabilities were calculated, and interobserver agreement was determined using κ statistics. Results: The overall prevalence of an abdominal infection was 0.49. The likelihood ratio (LR) of a positive test result was 1.33 (95% CI: 0.8-2.5) for US and 2.53 (95% CI: 1.4-5.0) for CT, with corresponding post-test probabilities of 0.57 (95% CI: 0.42-0.70) for US and 0.71 (95% CI: 0.57-0.83) for CT. The LR of a negative test result was 0.60 (95% CI: 0.3-1.3) and 0.18 (95% CI: 0.06-0.5), respectively, with corresponding post-test probabilities of 0.37 (95% CI: 0.20-0.57) for US and 0.15 (95% CI: 0.06-0.32) for CT. Conclusion: Computed tomography can be used as the imaging modality of choice in patients suspected of intra-abdominal sepsis after abdominal surgery. Because of its low discriminatory power, ultrasonography should not be performed as the initial diagnostic test.
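
    The post-test probabilities quoted above follow from Bayes' rule in odds form: pre-test odds are multiplied by the likelihood ratio and converted back to a probability. The short check below reproduces the CT figures from the abstract (pre-test probability 0.49, LR+ = 2.53, LR− = 0.18).

```python
# Bayes' rule in odds form, reproducing the CT post-test probabilities quoted above.
def post_test_probability(pretest_prob, likelihood_ratio):
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

print(round(post_test_probability(0.49, 2.53), 2))   # positive CT -> 0.71
print(round(post_test_probability(0.49, 0.18), 2))   # negative CT -> 0.15
```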

  1. Effects of mobile phone-based app learning compared to computer-based web learning on nursing students: pilot randomized controlled trial.

    Science.gov (United States)

    Lee, Myung Kyung

    2015-04-01

    This study aimed to determine the effect of mobile-based discussion versus computer-based discussion on self-directed learning readiness, academic motivation, learner-interface interaction, and flow state. This randomized controlled trial was conducted at one university. Eighty-six nursing students who were able to use a computer, had home Internet access, and used a mobile phone were recruited. Participants were randomly assigned to either the mobile phone app-based discussion group (n = 45) or a computer web-based discussion group (n = 41). The effect was measured before and after an online discussion via self-reported surveys that addressed academic motivation, self-directed learning readiness, time distortion, learner-learner interaction, learner-interface interaction, and flow state. The changes in extrinsic motivation (identified regulation) within academic motivation (p = 0.011), as well as in independence and ability to use basic study (p = 0.047) and positive orientation to the future within self-directed learning readiness (p = 0.021), from pre-intervention to post-intervention were significantly more positive in the mobile phone app-based group than in the computer web-based discussion group. Interaction between learner and interface (p = 0.002), having clear goals (p = 0.012), and giving and receiving unambiguous feedback (p = 0.049) in flow state were significantly higher in the mobile phone app-based discussion group than in the computer web-based discussion group at post-test. The mobile phone might offer more valuable learning opportunities for discussion teaching and learning methods in terms of self-directed learning readiness, academic motivation, learner-interface interaction, and the flow state of the learning process compared to the computer.

  2. A comparative approach to closed-loop computation.

    Science.gov (United States)

    Roth, E; Sponberg, S; Cowan, N J

    2014-04-01

    Neural computation is inescapably closed-loop: the nervous system processes sensory signals to shape motor output, and motor output consequently shapes sensory input. Technological advances have enabled neuroscientists to close, open, and alter feedback loops in a wide range of experimental preparations. The experimental capability of manipulating the topology-that is, how information can flow between subsystems-provides new opportunities to understand the mechanisms and computations underlying behavior. These experiments encompass a spectrum of approaches from fully open-loop, restrained preparations to the fully closed-loop character of free behavior. Control theory and system identification provide a clear computational framework for relating these experimental approaches. We describe recent progress and new directions for translating experiments at one level in this spectrum to predictions at another level. Operating across this spectrum can reveal new understanding of how low-level neural mechanisms relate to high-level function during closed-loop behavior. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Prospective pilot study of a tablet computer in an Emergency Department.

    Science.gov (United States)

    Horng, Steven; Goss, Foster R; Chen, Richard S; Nathanson, Larry A

    2012-05-01

    The recent availability of low-cost tablet computers can facilitate bedside information retrieval by clinicians. To evaluate the effect of physician tablet use in the Emergency Department. Prospective cohort study comparing physician workstation usage with and without a tablet. 55,000 visits/year Level 1 Emergency Department at a tertiary academic teaching hospital. 13 emergency physicians (7 Attendings, 4 EM3s, and 2 EM1s) worked a total of 168 scheduled shifts (130 without and 38 with tablets) during the study period. Physician use of a tablet computer while delivering direct patient care in the Emergency Department. The primary outcome measure was the time spent using the Emergency Department Information System (EDIS) at a computer workstation per shift. The secondary outcome measure was the number of EDIS logins at a computer workstation per shift. Clinician use of a tablet was associated with a 38 min (17-59) decrease in time spent per shift using the EDIS at a computer workstation (p …). Use of a tablet computer was associated with a reduction in the number of times physicians logged into a computer workstation and a reduction in the amount of time they spent there using the EDIS. The presumed benefit is that decreasing time at a computer workstation increases physician availability at the bedside. However, this association will require further investigation. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  4. Duplex ultrasound and computed tomography angiography in the follow-up of endovascular abdominal aortic aneurysm repair: a comparative study

    International Nuclear Information System (INIS)

    Cantador, Alex Aparecido; Siqueira, Daniel Emilio Dalledone; Jacobsen, Octavio Barcellos; Baracat, Jamal; Pereira, Ines Minniti Rodrigues; Menezes, Fabio Hüsemann; Guillaumon, Ana Terezinha

    2016-01-01

    Objective: To compare duplex ultrasound and computed tomography (CT) angiography in terms of their performance in detecting endoleaks, as well as in determining the diameter of the aneurysm sac, in the postoperative follow-up of endovascular abdominal aortic aneurysm repair. Materials and Methods: This was a prospective study involving 30 patients who had undergone endovascular repair of infrarenal aortoiliac aneurysms. Duplex ultrasound and CT angiography were performed simultaneously by independent radiologists. Measurements of the aneurysm sac diameter were assessed, and the presence or absence of endoleaks was determined. Results: The average diameter of the aneurysm sac, as determined by duplex ultrasound and CT angiography, was 6.09 ± 1.95 and 6.27 ± 2.16 cm, respectively, with Pearson's correlation coefficient showing a statistically significant correlation (R = 0.88; p < 0.01). Comparing the duplex ultrasound and CT angiography results regarding the detection of endoleaks, we found that the former had a negative predictive value of 92.59% and a specificity of 96.15%. Conclusion: Our results show that there is little variation between the two methods evaluated, and that the choice between the two would have no significant effect on clinical management. Duplex ultrasound could replace CT angiography in the postoperative follow-up of endovascular aneurysm repair of the infrarenal aorta, because it is a low-cost procedure without the potential clinical complications related to the use of iodinated contrast and exposure to radiation. (author)

  5. Duplex ultrasound and computed tomography angiography in the follow-up of endovascular abdominal aortic aneurysm repair: a comparative study

    Energy Technology Data Exchange (ETDEWEB)

    Cantador, Alex Aparecido; Siqueira, Daniel Emilio Dalledone; Jacobsen, Octavio Barcellos; Baracat, Jamal; Pereira, Ines Minniti Rodrigues; Menezes, Fabio Hüsemann; Guillaumon, Ana Terezinha, E-mail: alex_cantador@yahoo.com.br [Universidade Estadual de Campinas (FCM/UNICAMP), Campinas, SP (Brazil). Faculdade de Ciencias Medicas

    2016-07-15

    Objective: To compare duplex ultrasound and computed tomography (CT) angiography in terms of their performance in detecting endoleaks, as well as in determining the diameter of the aneurysm sac, in the postoperative follow-up of endovascular abdominal aortic aneurysm repair. Materials and Methods: This was a prospective study involving 30 patients who had undergone endovascular repair of infrarenal aortoiliac aneurysms. Duplex ultrasound and CT angiography were performed simultaneously by independent radiologists. Measurements of the aneurysm sac diameter were assessed, and the presence or absence of endoleaks was determined. Results: The average diameter of the aneurysm sac, as determined by duplex ultrasound and CT angiography, was 6.09 ± 1.95 and 6.27 ± 2.16 cm, respectively, with Pearson's correlation coefficient showing a statistically significant correlation (R = 0.88; p < 0.01). Comparing the duplex ultrasound and CT angiography results regarding the detection of endoleaks, we found that the former had a negative predictive value of 92.59% and a specificity of 96.15%. Conclusion: Our results show that there is little variation between the two methods evaluated, and that the choice between the two would have no significant effect on clinical management. Duplex ultrasound could replace CT angiography in the postoperative follow-up of endovascular aneurysm repair of the infrarenal aorta, because it is a low-cost procedure without the potential clinical complications related to the use of iodinated contrast and exposure to radiation. (author)

  6. A computational study of the supersonic coherent jet

    International Nuclear Information System (INIS)

    Jeong, Mi Seon; Kim, Heuy Dong

    2003-01-01

    In the steel-making process of the iron and steel industry, the purity and quality of steel can depend on the amount of CO contained in the molten metal. Recently, a supersonic oxygen jet has been applied to the molten metal in the electric furnace to reduce the CO amount through chemical reactions between the oxygen jet and the molten metal, leading to a better quality of steel. In this application, the supersonic oxygen jet is limited in the distance over which the supersonic velocity is maintained. In order to obtain longer supersonic jet propagation into the molten metal, a supersonic coherent jet is suggested as one of the alternatives applicable to the electric furnace system. It has a flame around the conventional supersonic jet, so the entrainment of the surrounding gas into the supersonic jet is reduced, leading to longer propagation of the supersonic jet. However, the gas-dynamic mechanism by which the combustion surrounding the supersonic jet lengthens the jet core is not yet clarified. The present study investigates the major characteristics of the supersonic coherent jet compared with the conventional supersonic jet. A computational study is carried out to solve the compressible, axisymmetric Navier-Stokes equations. The computational results for the supersonic coherent jet are compared with those for conventional supersonic jets.

  7. Computed tomographic findings of progressive supranuclear palsy compared with Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Yuki, Nobuhiro; Sato, Shuzo; Yuasa, Tatsuhiko; Ito, Jusuke; Miyatake, Tadashi [Niigata Univ. (Japan). School of Dentistry

    1990-10-01

    We investigated computed tomographic (CT) films of 4 pathologically documented cases of progressive supranuclear palsy (PSP) in which the clinical presentations were atypical and compared the findings with those of 15 patients with Parkinson's disease (PD). Dilatation of the third ventricle, atrophy of the midbrain tegmentum, and enlargement of the interpeduncular cistern toward the aqueduct were found to be the characteristic findings in PSP. Thus, radiological findings can be useful when the differential diagnosis between PSP and PD is clinically difficult. (author).

  8. Comparing the similarity of responses received from studies in Amazon's Mechanical Turk to studies conducted online and with direct recruitment.

    Science.gov (United States)

    Bartneck, Christoph; Duenser, Andreas; Moltchanova, Elena; Zawieska, Karolina

    2015-01-01

    Computer- and internet-based questionnaires have become a standard tool in Human-Computer Interaction research and other related fields, such as psychology and sociology. Amazon's Mechanical Turk (AMT) service is a new method of recruiting participants and conducting certain types of experiments. This study compares whether participants recruited through AMT give different responses than participants recruited through an online forum or recruited directly on a university campus. Moreover, we compare whether a study conducted within AMT results in different responses compared to a study for which participants are recruited through AMT but which is conducted using an external online questionnaire service. The results of this study show that there is a statistical difference between results obtained from participants recruited through AMT compared to the results from the participants recruited on campus or through online forums. We do, however, argue that this difference is so small that it has no practical consequence. There was no significant difference between running the study within AMT compared to running it with an online questionnaire service. There was no significant difference between results obtained directly from within AMT compared to results obtained in the campus and online forum condition. This may suggest that AMT is a viable and economical option for recruiting participants and for conducting studies, as setting up and running a study with AMT generally requires less effort and time compared to other frequently used methods. We discuss our findings as well as limitations of using AMT for empirical studies.

  9. Comparative study of single and multislice computed tomography for assessment of the mandibular canal

    Directory of Open Access Journals (Sweden)

    Adriana da Silva Ferreira Paes

    2007-06-01

    Full Text Available OBJECTIVE: The purpose of this study was to evaluate the accuracy of relative measurements from the roof of the mandibular canal to the alveolar crest in multislice (multidetector) computed tomography (MDCT) and single-slice computed tomography (SSCT). MATERIAL AND METHODS: The sample consisted of 26 printed CT films (7 SSCT and 19 MDCT) from the files of the LABI-3D (3D Imaging Laboratory) of the School of Dentistry of the University of São Paulo (FOUSP), which had been acquired using different protocols. Two observers analyzed, in a randomized and independent order, a series of 22 oblique CT reconstructions of each patient. Each observer analyzed the CT scans twice. The length of the mandibular canal and the distance between the mandibular canal roof and the crest of the alveolar ridge were obtained. The Dahlberg test was used for statistical analysis. RESULTS: The mean error found for the mandibular canal length measurements obtained from SSCT was 0.53 mm in the interobserver analysis, and 0.38 mm for both observers. On MDCT images, the mean error was 0.0 mm in the interobserver analysis, and 0.0 and 0.23 mm in the intraobserver analysis. Regarding the distance between the mandibular canal roof and the alveolar bone crest, the SSCT images showed a mean error of 1.16 mm in the interobserver analysis and 0.66 and 0.59 mm in the intraobserver analysis. In the MDCT images, the mean error was 0.72 mm in the interobserver analysis and 0.50 and 0.54 mm in the intraobserver analysis. CONCLUSION: Multislice CT was demonstrated to be the more accurate method and showed high reproducibility in the analysis of important anatomical landmarks for the planning of mandibular dental implants, namely the mandibular canal pathway and the alveolar crest height.

  10. Defining Effectiveness Using Finite Sets A Study on Computability

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Haeusler, Edward H.; Garcia, Alex

    2016-01-01

    … finite sets and uses category theory as its mathematical foundation. The model relies on the fact that every function between finite sets is computable, and that the finite composition of such functions is also computable. Our approach is an alternative to the traditional model-theoretic works … which rely on (ZFC) set theory as a mathematical foundation, and it is also novel compared to the existing works that use category theory to approach computability results. Moreover, we show how to encode Turing machine computations in the model, thus concluding that the model expresses …
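
    The observation that drives the model, namely that functions between finite sets are trivially computable and closed under composition, can be made concrete in a few lines of code. The sketch below is only an informal illustration; it is not the paper's categorical construction, and the example sets are hypothetical.

```python
# Tiny illustration (not the paper's categorical construction): a function
# between finite sets can be represented exhaustively as a lookup table, so it
# is trivially computable, and composing two such tables yields another table.
def compose(g, f):
    """Return g . f for finite functions given as dicts (domain -> codomain)."""
    return {x: g[f[x]] for x in f}

f = {0: 'a', 1: 'b', 2: 'a'}           # f : {0, 1, 2} -> {'a', 'b'}
g = {'a': True, 'b': False}            # g : {'a', 'b'} -> {True, False}
print(compose(g, f))                   # {0: True, 1: False, 2: True}
```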

  11. Computer-aided detection of lung nodules on multidetector CT in concurrent-reader and second-reader modes: A comparative study

    International Nuclear Information System (INIS)

    Matsumoto, Sumiaki; Ohno, Yoshiharu; Aoki, Takatoshi; Yamagata, Hitoshi; Nogami, Munenobu; Matsumoto, Keiko; Yamashita, Yoshiko; Sugimura, Kazuro

    2013-01-01

    Purpose: To compare the reading times and detection performances of radiologists in concurrent-reader and second-reader modes of computer-aided detection (CAD) for lung nodules on multidetector computed tomography (CT). Materials and Methods: Fifty clinical multidetector CT datasets containing nodules up to 20 mm in diameter were retrospectively collected. For the detection and rating of non-calcified nodules larger than 4 mm in diameter, 6 radiologists (3 experienced radiologists and 3 resident radiologists) independently interpreted these datasets twice, once with concurrent-reader CAD and once with second-reader CAD. The reference standard of nodules in the datasets was determined by the consensus of two experienced chest radiologists. The reading times and detection performances in the two modes of CAD were statistically compared, where jackknife free-response receiver operating characteristic (JAFROC) analysis was used for the comparison of detection performances. Results: Two hundred and seven nodules constituted the reference standard. Reading time was significantly shorter in the concurrent-reader mode than in the second-reader mode, with the mean reading time for the 6 radiologists being 132 s with concurrent-reader CAD and 210 s with second-reader CAD (p < 0.01). JAFROC analysis revealed no significant difference between the detection performances in the two modes, with the average figure-of-merit value for the 6 radiologists being 0.70 with concurrent-reader CAD and 0.72 with second-reader CAD (p = 0.35). Conclusion: In CAD for lung nodules on multidetector CT, the concurrent-reader mode is more time-efficient than the second-reader mode, and there can be no significant difference between the two modes in terms of the detection performance of radiologists.

  12. A study of Computing doctorates in South Africa from 1978 to 2014

    Directory of Open Access Journals (Sweden)

    Ian D Sanders

    2015-12-01

    Full Text Available This paper studies the output of South African universities in terms of computing-related doctorates in order to determine trends in numbers of doctorates awarded and to identify strong doctoral study research areas. Data collected from a variety of sources relating to Computing doctorates conferred since the late 1970s was used to compare the situation in Computing with that of all doctorates. The number of Computing doctorates awarded has increased considerably over the period of study. Nearly three times as many doctorates were awarded in the period 2010–2014 as in 2000–2004. The universities producing the most Computing doctorates were either previously “traditional” universities or comprehensive universities formed by amalgamating a traditional research university with a technikon. Universities of technology have not yet produced many doctorates as they do not have a strong research tradition. The analysis of topic keywords using ACM Computing classifications is preliminary but shows that professional issues are dominant in Information Systems, models are often built in Computer Science and several topics, including computing in education, are evident in both IS and CS. The relevant data is in the public domain but access is difficult as record keeping was generally inconsistent and incomplete. In addition, electronic databases at universities are not easily searchable and access to HEMIS data is limited. The database built for this paper is more inclusive in terms of discipline-related data than others.

  13. Comparative randomised controlled clinical trial of a herbal eye drop with artificial tear and placebo in computer vision syndrome.

    Science.gov (United States)

    Biswas, N R; Nainiwal, S K; Das, G K; Langan, U; Dadeya, S C; Mongre, P K; Ravi, A K; Baidya, P

    2003-03-01

    A comparative randomised double-masked multicentric clinical trial was conducted to find out the efficacy and safety of a herbal eye drop preparation, itone eye drops, compared with an artificial tear and placebo in 120 patients with computer vision syndrome. Patients using a computer for at least 2 hours continuously per day, having symptoms of irritation, foreign body sensation, watering, redness, headache, eyeache and signs of conjunctival congestion, mucous/debris, corneal filaments, corneal staining or lacrimal lake were included in this study. Every patient was instructed to put two drops of either the herbal drug, placebo or artificial tear in the eyes regularly four times for 6 weeks. Objective and subjective findings were recorded at bi-weekly intervals up to six weeks. Side-effects, if any, were also noted. The herbal eye drop preparation was found significantly better than the artificial tear (p …) in computer vision syndrome.

  14. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

    Science.gov (United States)

    Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

    2017-01-01

    In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

  15. A study of computer-related upper limb discomfort and computer vision syndrome.

    Science.gov (United States)

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems that are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores of the wrist and neck suggesting increased risk of developing OOS, which needed further intervention. Many (64%) were using refractive corrections and still had high scores of CVS commonly including eye fatigue, headache and burning sensation. The increase of CVS scores (suggesting more subjective symptoms) correlated with increase in computer usage spells. It was concluded that further onsite studies are needed, to follow up this survey to decrease the risks of developing CVS and OOS amongst young computer users.

  16. A comparative study of accuracy of linear measurements using cone beam and multi-slice computed tomographies for evaluation of mandibular canal location in dry mandibles.

    Science.gov (United States)

    Naser, Asieh Zamani; Mehr, Bahar Behdad

    2013-01-01

    Cross-sectional tomograms have been used for optimal pre-operative planning of dental implant placement. The aim of the present study was to assess the accuracy of Cone Beam Computed Tomography (CBCT) measurements of specific distances around the mandibular canal by comparing them to those obtained from Multi-Slice Computed Tomography (MSCT) images. Ten hemi-mandible specimens were examined using CBCT and MSCT. Before imaging, wires were placed at 7 locations between the anterior margin of the third molar and the anterior margin of the second premolar as reference points. The following distances were measured by two observers on each cross-sectional CBCT and MSCT image: Mandibular Width (W), Length (L), Upper Distance (UD), Lower Distance (LD), Buccal Distance (BD), and Lingual Distance (LID). The obtained data were evaluated using SPSS software, applying the paired t-test and the intra-class correlation coefficient (ICC). There was a significant difference between the values obtained by MSCT and CBCT measurements for all measured distances (W, L, UD, LD, BD, and LID) (P < 0.001), although the difference was less than 1 mm. The ICC for all distances with both techniques was 99% when measured by a single observer with a one-week interval and 98% between the 2 observers. Comparing the data obtained with both techniques indicates that the difference between the two techniques is 2.17% relative to MSCT. The results of this study showed that there is a significant difference between measurements obtained by CBCT and MSCT. However, the difference is not clinically significant.
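
    A minimal sketch of the two statistics named above (a paired t-test for systematic CBCT-MSCT differences and an intraclass correlation coefficient for agreement) is given below. The measurement values are invented and the ICC(3,1) consistency form is an assumption chosen for illustration; the study itself ran its analysis in SPSS.

```python
# Minimal sketch: paired t-test and a two-way mixed, single-measure ICC(3,1).
# Data are invented for illustration; the original study used SPSS.
import numpy as np
from scipy import stats

# Hypothetical distances (mm) measured at the same 10 sites with both modalities.
cbct = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.0, 9.5, 10.7, 11.8, 10.4])
msct = np.array([10.9, 12.2, 10.3, 12.8, 11.5, 11.6, 10.1, 11.3, 12.5, 11.0])

# Paired t-test: is the mean CBCT-MSCT difference significantly non-zero?
t_stat, p_value = stats.ttest_rel(cbct, msct)
mean_diff = np.mean(cbct - msct)
print(f"mean difference = {mean_diff:.2f} mm, t = {t_stat:.2f}, p = {p_value:.4f}")

def icc_consistency(ratings):
    """ICC(3,1): two-way mixed effects, consistency, single measure.
    `ratings` is an (n_subjects, k_raters) array."""
    n, k = ratings.shape
    subject_means = ratings.mean(axis=1)
    rater_means = ratings.mean(axis=0)
    grand_mean = ratings.mean()
    ss_rows = k * np.sum((subject_means - grand_mean) ** 2)
    ss_cols = n * np.sum((rater_means - grand_mean) ** 2)
    ss_total = np.sum((ratings - grand_mean) ** 2)
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Agreement between the two modalities (or, analogously, between two observers).
print(f"ICC(3,1) = {icc_consistency(np.column_stack([cbct, msct])):.3f}")
```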

  17. Self-study manual for introduction to computational fluid dynamics

    OpenAIRE

    Nabatov, Andrey

    2017-01-01

    Computational Fluid Dynamics (CFD) is the branch of Fluid Mechanics and Computational Physics that plays a significant role in the modern Mechanical Engineering design process, owing to such advantages as the relatively low cost of simulation compared with conducting a real experiment, the opportunity to easily correct the design of a prototype prior to manufacturing the final product, and a wide range of applications: mixing, acoustics, cooling and aerodynamics. This makes CFD particularly …

  18. Comparative Analysis on the Utilization of Computers | Nkata ...

    African Journals Online (AJOL)

    The findings reveal among others that extent of usability of computers in the two universities had a significant difference. It was concluded that the level of computer utilization in UNIPORT is more than in the RUST. It was recommended that periodical, pre and post qualification seminars be organized for the 2 university ...

  19. Computational study on effects of rib height and thickness on heat ...

    Indian Academy of Sciences (India)

    A computational study was carried out for the heat transfer augmentation in a three-dimensional square channel fitted with different types of ribs. The standard k–ε model and its two variants (RNG and realizable) were used for turbulence modeling. The predictions were compared with available experimental ...

  20. Study of Material Flow of End-of-Life Computer Equipment (e-wastes ...

    African Journals Online (AJOL)

    In this study, a material flow model for the analysis of e-waste generation from computer equipment in Kaduna and Abuja in Nigeria has been developed and compared with that of Lagos which has been studied earlier. Data used to develop the models are the sales data from major distributors of electronics in the study ...

  1. COSA II Further benchmark exercises to compare geomechanical computer codes for salt

    International Nuclear Information System (INIS)

    Lowe, M.J.S.; Knowles, N.C.

    1989-01-01

    Project COSA (COmputer COdes COmparison for SAlt) was a benchmarking exercise involving the numerical modelling of the geomechanical behaviour of heated rock salt. Its main objective was to assess the current European capability to predict the geomechanical behaviour of salt, in the context of the disposal of heat-producing radioactive waste in salt formations. Twelve organisations participated in the exercise in which their solutions to a number of benchmark problems were compared. The project was organised in two distinct phases: The first, from 1984-1986, concentrated on the verification of the computer codes. The second, from 1986-1988 progressed to validation, using three in-situ experiments at the Asse research facility in West Germany as a basis for comparison. This document reports the activities of the second phase of the project and presents the results, assessments and conclusions

  2. Comparing the similarity of responses received from studies in Amazon's Mechanical Turk to studies conducted online and with direct recruitment.

    Directory of Open Access Journals (Sweden)

    Christoph Bartneck

    Full Text Available Computer and internet based questionnaires have become a standard tool in Human-Computer Interaction research and other related fields, such as psychology and sociology. Amazon's Mechanical Turk (AMT service is a new method of recruiting participants and conducting certain types of experiments. This study compares whether participants recruited through AMT give different responses than participants recruited through an online forum or recruited directly on a university campus. Moreover, we compare whether a study conducted within AMT results in different responses compared to a study for which participants are recruited through AMT but which is conducted using an external online questionnaire service. The results of this study show that there is a statistical difference between results obtained from participants recruited through AMT compared to the results from the participant recruited on campus or through online forums. We do, however, argue that this difference is so small that it has no practical consequence. There was no significant difference between running the study within AMT compared to running it with an online questionnaire service. There was no significant difference between results obtained directly from within AMT compared to results obtained in the campus and online forum condition. This may suggest that AMT is a viable and economical option for recruiting participants and for conducting studies as setting up and running a study with AMT generally requires less effort and time compared to other frequently used methods. We discuss our findings as well as limitations of using AMT for empirical studies.

  3. Examination of the Effects of Dimensionality on Cognitive Processing in Science: A Computational Modeling Experiment Comparing Online Laboratory Simulations and Serious Educational Games

    Science.gov (United States)

    Lamb, Richard L.

    2016-01-01

    Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware to include (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the…

  4. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than that of both the RR and MR groups at the cervical and apical levels (P < …). File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
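
    For readers unfamiliar with the transportation metric reported above, the widely used Gambill-and-Glickman-style calculation is sketched below with invented distances; the helper names and sample values are hypothetical, and this is not the authors' measurement pipeline.

```python
# Minimal sketch of the commonly used canal-transportation formula
# (after Gambill & Glickman): transportation = (a1 - a2) - (b1 - b2), where
# a = shortest distance from one canal wall to the corresponding root surface
# and b = the same on the opposite side, before (1) and after (2) shaping.
# Values below are invented for illustration.

def transportation(a1: float, a2: float, b1: float, b2: float) -> float:
    """Positive values indicate transportation toward the 'a' side,
    negative toward the 'b' side; 0 means no transportation."""
    return (a1 - a2) - (b1 - b2)

def centering_ratio(a1, a2, b1, b2):
    """Centering ability: ratio of the smaller to the larger wall removal."""
    da, db = a1 - a2, b1 - b2
    if max(da, db) == 0:
        return 1.0
    return min(da, db) / max(da, db)

# Hypothetical apical-level measurements (mm) for one canal.
print(transportation(a1=1.20, a2=1.05, b1=0.90, b2=0.82))  # 0.07 mm
print(round(centering_ratio(a1=1.20, a2=1.05, b1=0.90, b2=0.82), 2))  # 0.53
```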

  5. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  6. FISS: a computer program for reactor systems studies

    International Nuclear Information System (INIS)

    Tamm, H.; Sherman, G.R.; Wright, J.H.; Nieman, R.E.

    1979-08-01

    FISS is a computer code for use in investigating alternative fuel cycle strategies for Canadian and world nuclear programs. The code performs a system simulation accounting for dynamic effects of growing nuclear systems. Facilities in the model include storage for irradiated fuel, mines, plants for enrichment, fuel fabrication, fuel reprocessing and heavy water, and reactors. FISS is particularly useful for comparing various reactor strategies and studying sensitivities of resource consumption, capital investment and energy costs with changes in fuel cycle parameters, reactor parameters and financial variables. (author)

  7. A computer simulation study comparing lesion detection accuracy with digital mammography, breast tomosynthesis, and cone-beam CT breast imaging

    International Nuclear Information System (INIS)

    Gong Xing; Glick, Stephen J.; Liu, Bob; Vedula, Aruna A.; Thacker, Samta

    2006-01-01

    Although conventional mammography is currently the best modality to detect early breast cancer, it is limited in that the recorded image represents the superposition of a three-dimensional (3D) object onto a 2D plane. Recently, two promising approaches for 3D volumetric breast imaging have been proposed, breast tomosynthesis (BT) and CT breast imaging (CTBI). To investigate possible improvements in lesion detection accuracy with either breast tomosynthesis or CT breast imaging as compared to digital mammography (DM), a computer simulation study was conducted using simulated lesions embedded into a structured 3D breast model. The computer simulation realistically modeled x-ray transport through a breast model, as well as the signal and noise propagation through a CsI based flat-panel imager. Polyenergetic x-ray spectra of Mo/Mo 28 kVp for digital mammography, Mo/Rh 28 kVp for BT, and W/Ce 50 kVp for CTBI were modeled. For the CTBI simulation, the intensity of the x-ray spectra for each projection view was determined so as to provide a total average glandular dose of 4 mGy, which is approximately equivalent to that given in conventional two-view screening mammography. The same total dose was modeled for both the DM and BT simulations. Irregular lesions were simulated by using a stochastic growth algorithm providing lesions with an effective diameter of 5 mm. Breast tissue was simulated by generating an ensemble of backgrounds with a power law spectrum, with the composition of 50% fibroglandular and 50% adipose tissue. To evaluate lesion detection accuracy, a receiver operating characteristic (ROC) study was performed with five observers reading an ensemble of images for each case. The average area under the ROC curves (Az) was 0.76 for DM, 0.93 for BT, and 0.94 for CTBI. Results indicated that for the same dose, a 5 mm lesion embedded in a structured breast phantom was detected by the two volumetric breast imaging systems, BT and CTBI, with statistically …
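
    As a purely illustrative aside, an area under the ROC curve of the kind reported as Az above can be computed from observer ratings with scikit-learn; the ratings below are invented, and this sketch does not reproduce the multi-reader ROC methodology of the study.

```python
# Illustrative only: computing an area under the ROC curve (Az) from
# observer confidence ratings on lesion-present vs lesion-absent images.
# Ratings below are invented; the study itself pooled five human readers.
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = lesion actually present, 0 = absent.
truth = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])

# Observer confidence ratings (e.g., a 1-5 scale) for two hypothetical modalities.
ratings_dm = np.array([4, 3, 5, 2, 4, 2, 1, 3, 1, 2])   # e.g. digital mammography
ratings_bt = np.array([5, 4, 5, 4, 5, 2, 1, 2, 1, 3])   # e.g. breast tomosynthesis

print(f"Az (DM): {roc_auc_score(truth, ratings_dm):.2f}")
print(f"Az (BT): {roc_auc_score(truth, ratings_bt):.2f}")
```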

  8. Comparative Evaluation of a Four-Implant-Supported Polyetherketoneketone Framework Prosthesis: A Three-Dimensional Finite Element Analysis Based on Cone Beam Computed Tomography and Computer-Aided Design.

    Science.gov (United States)

    Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol

    The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (ie, low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.

  9. Some gender issues in educational computer use: results of an international comparative survey

    OpenAIRE

    Janssen Reinen, I.A.M.; Plomp, T.

    1993-01-01

    In the framework of the Computers in Education international study of the International Association for the Evaluation of Educational Achievement (IEA), data have been collected concerning the use of computers in 21 countries. This article examines some results regarding the involvement of women in the implementation and use of computers in the educational practice of elementary, lower secondary and upper secondary education in participating countries. The results show that in many countries ...

  10. CBT for depression: a pilot RCT comparing mobile phone vs. computer.

    Science.gov (United States)

    Watts, Sarah; Mackenzie, Anna; Thomas, Cherian; Griskaitis, Al; Mewton, Louise; Williams, Alishia; Andrews, Gavin

    2013-02-07

    This paper reports the results of a pilot randomized controlled trial comparing the delivery modality (mobile phone/tablet or fixed computer) of a cognitive behavioural therapy intervention for the treatment of depression. The aim was to establish whether a previously validated computerized program (The Sadness Program) remained efficacious when delivered via a mobile application. 35 participants were recruited with Major Depression (80% female) and randomly allocated to access the program using a mobile app (on either a mobile phone or iPad) or a computer. Participants completed 6 lessons, weekly homework assignments, and received weekly email contact from a clinical psychologist or psychiatrist until completion of lesson 2. After lesson 2, email contact was only provided in response to participant request, or in response to a deterioration in psychological distress scores. The primary outcome measure was the Patient Health Questionnaire 9 (PHQ-9). Of the 35 participants recruited, 68.6% completed 6 lessons and 65.7% completed the 3-month follow-up. Attrition was handled using mixed-model repeated-measures ANOVA. Both the Mobile and Computer Groups were associated with statistically significant benefits in the PHQ-9 at post-test. At 3 months follow-up, the reduction seen for both groups remained significant. These results provide evidence to indicate that delivering a CBT program using a mobile application can result in clinically significant improvements in outcomes for patients with depression. Australian New Zealand Clinical Trials Registry ACTRN 12611001257954.
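
    A mixed-model repeated-measures analysis of the kind used to handle attrition above can be sketched with statsmodels; the data frame layout, group sizes, and effect sizes below are invented, and the exact model specification is an assumption rather than the trial's own analysis.

```python
# Sketch of a mixed-model repeated-measures analysis of PHQ-9 scores.
# Data and effect structure are assumptions for illustration only;
# the trial's own analysis may have been specified differently.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_per_group = 18
rows = []
for group in ("mobile", "computer"):
    for subject in range(n_per_group):
        baseline = rng.normal(17, 3)          # hypothetical baseline PHQ-9
        for t, time in enumerate(("pre", "post", "followup")):
            drop = 4.5 * t if group == "mobile" else 4.0 * t  # invented effect
            rows.append({
                "subject": f"{group}_{subject}",
                "group": group,
                "time": time,
                "phq9": baseline - drop + rng.normal(0, 2),
            })
data = pd.DataFrame(rows)

# Random intercept per subject; fixed effects of time, group and their interaction.
model = smf.mixedlm("phq9 ~ C(time) * C(group)", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```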

  11. A comparative study of cone beam computed tomography and conventional radiography in diagnosing the extent of root resorptions

    Directory of Open Access Journals (Sweden)

    Elham Alamadi

    2017-11-01

    Full Text Available Abstract Background Root resorptions are assessed and diagnosed using different radiographical techniques. A comparison of the ability to assess resorptions on two-dimensional (2D) and three-dimensional (3D) radiographs is, hitherto, lacking. The aims of this study were to evaluate the accuracy of 2D (periapical radiographs, PA, and panoramic radiograph, PAN) and 3D (cone beam computed tomography, CBCT) radiographic techniques in measuring slanted root resorptions compared to the true resorptions, a histological gold standard, in addition to a comparison of all the radiographic techniques to each other. Methods Radiographs (CBCT, PA, and PAN), in addition to histological sections, of extracted deciduous canines from thirty-four patients were analyzed. Linear measurements of the most and least resorbed side of the root, i.e., “slanted” resorptions, were measured using an analyzing software (Facad®). For classification of slanted root resorptions, a modified Malmgren index was used. Results PAN underestimated the root length on both the least and most resorbed side. Small resorptions, i.e., low modified Malmgren scores, were more difficult to record and were only assessed accurately using CBCT. The root resorption scores were underestimated using PA and PAN. In assessment of linear measures, PAN differed significantly from both CBCT and PA. Conclusions CBCT is the most accurate technique when measuring and scoring slanted root resorptions.

  12. Gd-EOB-DTPA-enhanced magnetic resonance imaging features of hepatic hemangioma compared with enhanced computed tomography

    OpenAIRE

    Tateyama, Akihiro; Fukukura, Yoshihiko; Takumi, Koji; Shindo, Toshikazu; Kumagae, Yuichi; Kamimura, Kiyohisa; Nakajo, Masayuki

    2012-01-01

    AIM: To clarify features of hepatic hemangiomas on gadolinium-ethoxybenzyl-diethylenetriaminpentaacetic acid (Gd-EOB-DTPA)-enhanced magnetic resonance imaging (MRI) compared with enhanced computed tomography (CT).

  13. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study is to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups, namely, Group A - Rotary: A1 - Neolix and A2 - OneShape, and Group B - Reciprocating: B1 - WaveOne and B2 - Reciproc. Preoperative scans of each were taken, followed by conventional access cavity preparation and working length determination with a 10 K-file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, which was measured on four surfaces, i.e., facial, lingual, mesial, and distal, for all the groups in the two obtained scans. There was no significant difference found between rotary single-file systems and reciprocating single-file systems in their effect on PCD, but in Group B2 there was the most significant loss of tooth structure on the mesial, lingual, and distal surfaces (P < …). The Reciproc single-file system removed more PCD compared with the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  14. Muoniated radical states in the group 16 elements: Computational studies

    International Nuclear Information System (INIS)

    Macrae, Roderick M.

    2009-01-01

    Recent experimental studies on positive muon implantation in sulfur, selenium, and tellurium have been interpreted on the basis that the primary paramagnetic species observed is XMu (X=S, Se, or Te), the muonium-substituted analog of the appropriate diatomic chalcogen monohydride radical. However, temperature-dependent signal visibility, broadening, and hyperfine shift effects remain puzzling. The interplay of degeneracy, spin-orbit coupling, and vibrational averaging in these species makes them computationally challenging despite their small size. In this work computational studies are carried out on all hydrogen isotopomers of the series OH, SH, SeH, and TeH. Several different methodological approaches are compared, and the effects of wavefunction symmetry, spin-orbit coupling, and zero-point vibrational corrections on the isotropic and anisotropic components of the hyperfine interaction are examined. Additionally, some models of the Mu site in rhombic sulfur are briefly considered.

  15. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method to solve the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed, and their efficiency for a typical fuel assembly calculation has been investigated. For both methods, a vector computation is 15 times faster than a scalar computation. From the viewpoint of a comparison between the OES and ISS methods, the following are found: 1) there is a small difference in computation speed, 2) the ISS method shows faster convergence, and 3) the ISS method saves about 80% of computer memory size compared with the OES method. It is, therefore, concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method to reduce the computation time of the exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective as acceleration methods for the characteristics method; a combination of them saves 70-80% of outer iterations compared with a free iteration. (author)
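
    The table-look-up idea mentioned for the exponential function can be illustrated with a short NumPy sketch: precompute exp(-x) on a fixed grid and interpolate at run time. The grid size, range, and function names below are arbitrary choices for illustration, not the code evaluated in the study.

```python
# Sketch of a table-look-up approximation of exp(-x), the kind of trick
# mentioned above for speeding up characteristics sweeps. Grid size and
# range are arbitrary choices for illustration.
import numpy as np

X_MAX = 20.0          # exp(-x) is ~2e-9 beyond this; treat as 0
N_TABLE = 4096
_grid = np.linspace(0.0, X_MAX, N_TABLE)
_table = np.exp(-_grid)
_step = _grid[1] - _grid[0]

def exp_neg_lookup(x):
    """Linear interpolation in a precomputed exp(-x) table (x >= 0)."""
    x = np.asarray(x, dtype=float)
    idx = np.clip((x / _step).astype(int), 0, N_TABLE - 2)
    frac = x / _step - idx
    out = _table[idx] * (1.0 - frac) + _table[idx + 1] * frac
    return np.where(x >= X_MAX, 0.0, out)

x = np.random.rand(1_000_000) * 10.0
approx = exp_neg_lookup(x)
exact = np.exp(-x)
print("max abs error:", np.max(np.abs(approx - exact)))
```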

  16. Examination of the Effects of Dimensionality on Cognitive Processing in Science: A Computational Modeling Experiment Comparing Online Laboratory Simulations and Serious Educational Games

    Science.gov (United States)

    Lamb, Richard L.

    2016-02-01

    Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware to include (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the form of three-dimensional serious educational games, two-dimensional online laboratories, and traditional lecture-based instruction in the context of student content learning in science. In particular, this study examines the impact of dimensionality, or the ability to move along the X-, Y-, and Z-axis in the games. Study subjects ( N = 551) were randomly selected using a stratified sampling technique. Independent strata subsamples were developed based upon the conditions of serious educational games, online laboratories, and lecture. The study also computationally models a potential mechanism of action and compares two- and three-dimensional learning environments. F test results suggest a significant difference for the main effect of condition across the factor of content gain score with large effect. Overall, comparisons using computational models suggest that three-dimensional serious educational games increase the level of success in learning as measured with content examinations through greater recruitment and attributional retraining of cognitive systems. The study supports assertions in the literature that the use of games in higher dimensions (i.e., three-dimensional versus two-dimensional) helps to increase student understanding of science concepts.

  17. Educational NASA Computational and Scientific Studies (enCOMPASS)

    Science.gov (United States)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Education, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches used and often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and

  18. Using Computational and Mechanical Models to Study Animal Locomotion

    Science.gov (United States)

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  19. Test-retest reliability and comparability of paper and computer questionnaires for the Finnish version of the Tampa Scale of Kinesiophobia.

    Science.gov (United States)

    Koho, P; Aho, S; Kautiainen, H; Pohjolainen, T; Hurri, H

    2014-12-01

    To estimate the internal consistency, test-retest reliability and comparability of paper and computer versions of the Finnish version of the Tampa Scale of Kinesiophobia (TSK-FIN) among patients with chronic pain. In addition, patients' personal experiences of completing both versions of the TSK-FIN and preferences between these two methods of data collection were studied. Test-retest reliability study. Paper and computer versions of the TSK-FIN were completed twice on two consecutive days. The sample comprised 94 consecutive patients with chronic musculoskeletal pain participating in a pain management or individual rehabilitation programme. The group rehabilitation design consisted of physical and functional exercises, evaluation of the social situation, psychological assessment of pain-related stress factors, and personal pain management training in order to regain overall function and mitigate the inconvenience of pain and fear-avoidance behaviour. The mean TSK-FIN score was 37.1 [standard deviation (SD) 8.1] for the computer version and 35.3 (SD 7.9) for the paper version. The mean difference between the two versions was 1.9 (95% confidence interval 0.8 to 2.9). Test-retest reliability was 0.89 for the paper version and 0.88 for the computer version. Internal consistency was considered to be good for both versions. The intraclass correlation coefficient for comparability was 0.77 (95% confidence interval 0.66 to 0.85), indicating substantial reliability between the two methods. Both versions of the TSK-FIN demonstrated substantial intertest reliability, good test-retest reliability, good internal consistency and acceptable limits of agreement, suggesting their suitability for clinical use. However, subjects tended to score higher when using the computer version. As such, in an ideal situation, data should be collected in a similar manner throughout the course of rehabilitation or clinical research. Copyright © 2014 Chartered Society of Physiotherapy. Published
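
    The comparability statistics reported above (a mean paper-computer difference with its 95% confidence interval, limits of agreement, and ICC) follow standard agreement analysis; a hedged sketch of the mean-difference and limits-of-agreement part, with invented questionnaire scores, is shown below.

```python
# Sketch of mean-difference and Bland-Altman limits-of-agreement statistics
# for paper vs computer questionnaire scores. Scores are invented.
import numpy as np
from scipy import stats

paper = np.array([31, 42, 38, 27, 45, 36, 33, 40, 29, 37, 44, 35])
computer = np.array([33, 44, 39, 30, 46, 38, 34, 43, 31, 38, 45, 37])

diff = computer - paper
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
n = len(diff)

# 95% confidence interval of the mean difference (t distribution).
ci = stats.t.interval(0.95, n - 1, loc=mean_diff, scale=sd_diff / np.sqrt(n))

# Bland-Altman 95% limits of agreement.
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

print(f"mean difference: {mean_diff:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
print(f"limits of agreement: {loa[0]:.2f} to {loa[1]:.2f}")
```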

  20. Epileptic seizure predictors based on computational intelligence techniques: a comparative study with 278 patients.

    Science.gov (United States)

    Alexandre Teixeira, César; Direito, Bruno; Bandarabadi, Mojtaba; Le Van Quyen, Michel; Valderrama, Mario; Schelter, Bjoern; Schulze-Bonhage, Andreas; Navarro, Vincent; Sales, Francisco; Dourado, António

    2014-05-01

    The ability of computational intelligence methods to predict epileptic seizures is evaluated in long-term EEG recordings of 278 patients suffering from pharmaco-resistant partial epilepsy, also known as refractory epilepsy. This extensive study in seizure prediction considers the 278 patients from the European Epilepsy Database, collected in three epilepsy centres: Hôpital Pitié-là-Salpêtrière, Paris, France; Universitätsklinikum Freiburg, Germany; Centro Hospitalar e Universitário de Coimbra, Portugal. For a considerable number of patients it was possible to find a patient specific predictor with an acceptable performance, as for example predictors that anticipate at least half of the seizures with a rate of false alarms of no more than 1 in 6 h (0.15 h⁻¹). We observed that the epileptic focus localization, data sampling frequency, testing duration, number of seizures in testing, type of machine learning, and preictal time influence significantly the prediction performance. The results allow to face optimistically the feasibility of a patient specific prospective alarming system, based on machine learning techniques by considering the combination of several univariate (single-channel) electroencephalogram features. We envisage that this work will serve as benchmark data that will be of valuable importance for future studies based on the European Epilepsy Database. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
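
    The performance criterion quoted above (anticipating at least half of the seizures with no more than 0.15 false alarms per hour) can be checked with a small helper like the one below; the alarm and seizure times, the prediction horizon, and the recording length are all invented, and this is not the authors' evaluation code.

```python
# Sketch: sensitivity and false-alarm rate per hour for a seizure predictor.
# Times (in hours) and the prediction horizon are invented for illustration.

def evaluate(alarms, seizures, horizon_h=0.5, total_hours=48.0):
    """An alarm is a true prediction if a seizure starts within `horizon_h`
    hours after it; otherwise it counts as a false alarm."""
    predicted = set()
    false_alarms = 0
    for alarm in alarms:
        hits = [s for s in seizures if 0.0 <= s - alarm <= horizon_h]
        if hits:
            predicted.add(min(hits))
        else:
            false_alarms += 1
    sensitivity = len(predicted) / len(seizures) if seizures else float("nan")
    far = false_alarms / total_hours  # false alarms per hour
    return sensitivity, far

alarms = [2.1, 7.8, 13.0, 20.4, 29.9, 41.2]
seizures = [2.4, 13.3, 30.1, 44.0]
sens, far = evaluate(alarms, seizures)
print(f"sensitivity = {sens:.2f}, false alarm rate = {far:.3f} per hour")
# Criterion quoted in the study: sensitivity >= 0.5 and false alarm rate <= 0.15/h.
print("meets criterion:", sens >= 0.5 and far <= 0.15)
```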

  1. Comparing Virtual and Physical Robotics Environments for Supporting Complex Systems and Computational Thinking

    Science.gov (United States)

    Berland, Matthew; Wilensky, Uri

    2015-01-01

    Both complex systems methods (such as agent-based modeling) and computational methods (such as programming) provide powerful ways for students to understand new phenomena. To understand how to effectively teach complex systems and computational content to younger students, we conducted a study in four urban middle school classrooms comparing…

  2. Study of tip loss corrections using CFD rotor computations

    DEFF Research Database (Denmark)

    Shen, Wen Zhong; Zhu, Wei Jun; Sørensen, Jens Nørkær

    2014-01-01

    Tip loss correction is known to play an important role for engineering prediction of wind turbine performance. There are two different types of tip loss corrections: tip corrections on momentum theory and tip corrections on airfoil data. In this paper, we study the latter using detailed CFD computations for wind turbines with sharp tip. Using the technique of determination of angle of attack and the CFD results for a NordTank 500 kW rotor, airfoil data are extracted and a new tip loss function on airfoil data is derived. To validate, BEM computations with the new tip loss function are carried out and compared with CFD results for the NordTank 500 kW turbine and the NREL 5 MW turbine. Comparisons show that BEM with the new tip loss function can predict correctly the loading near the blade tip.
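
    For orientation, the classical Prandtl tip-loss factor applied to the momentum equations in standard BEM codes is sketched below. This is not the new airfoil-data tip-loss function derived in the paper, and the rotor radius, blade count, and inflow angles are invented values used only to show where such a correction enters a BEM calculation.

```python
# Classical Prandtl tip-loss factor used in blade element momentum (BEM) codes.
# This is NOT the new tip-loss function derived in the paper above; it is
# shown only to illustrate where such a correction enters BEM.
import numpy as np

def prandtl_tip_loss(r, R, B, phi):
    """r: local radius [m], R: tip radius [m], B: number of blades,
    phi: local inflow angle [rad]. Returns F in (0, 1]."""
    f = B * (R - r) / (2.0 * r * np.sin(phi))
    return (2.0 / np.pi) * np.arccos(np.exp(-f))

# Hypothetical values near the tip of a 3-bladed rotor.
R = 20.0                                 # rotor radius, m (invented)
r = np.array([14.0, 17.0, 19.0, 19.8])   # local radii, m (invented)
phi = np.radians([8.0, 6.0, 5.0, 4.5])   # inflow angles (invented)
print(prandtl_tip_loss(r, R, 3, phi))    # F drops toward 0 at the tip
```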

  3. Functional imaging using computer methods to compare the effect of salbutamol and ipratropium bromide in patient-specific airway models of COPD

    Directory of Open Access Journals (Sweden)

    De Backer LA

    2011-11-01

    Full Text Available LA De Backer(1), WG Vos(2), R Salgado(3), JW De Backer(2), A Devolder(1), SL Verhulst(1), R Claes(1), PR Germonpré(1), WA De Backer(1); (1) Department of Respiratory Medicine, (2) FluidDA, (3) Department of Radiology, Antwerp University Hospital, Antwerp, Belgium. Background: Salbutamol and ipratropium bromide improve lung function in patients with chronic obstructive pulmonary disease (COPD). However, their bronchodilating effect has not yet been compared in the central and distal airways. Functional imaging using computational fluid dynamics offers the possibility of making such a comparison. The objective of this study was to assess the effects of salbutamol and ipratropium bromide on the geometry and computational fluid dynamics-based resistance of the central and distal airways. Methods: Five patients with Global Initiative for Chronic Obstructive Lung Disease Stage III COPD were randomized to a single dose of salbutamol or ipratropium bromide in a crossover manner with a 1-week interval between treatments. Patients underwent lung function testing and a multislice computed tomography scan of the thorax that was used for functional imaging. Two hours after dosing, the patients again underwent lung function tests and repeat computed tomography. Results: Lung function parameters, including forced expiratory volume in 1 second, vital capacity, overall airway resistance, and specific airway resistance, changed significantly after administration of each product. On functional imaging, the bronchodilating effect was greater in the distal airways, with a corresponding drop in airway resistance, compared with the central airways. Salbutamol and ipratropium bromide were equally effective at first glance when looking at lung function tests, but when viewed in more detail with functional imaging, hyporesponsiveness could be shown for salbutamol in one patient. Salbutamol was more effective in the other patients. Conclusion: This pilot study gives an innovative insight into the modes of …

  4. Conventional versus computer-navigated TKA: a prospective randomized study.

    Science.gov (United States)

    Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques

    2017-06-01

    The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in a group of patients (NAV) and to assess the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer navigation surgery would improve implant alignment, functional scores and survival of the implant compared to the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery Knee Score and the Western Ontario and McMaster Universities Index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI and side of implantation. Three patients of the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed a statistically significantly better KSS score, wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope in the post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staph. aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100 %. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significant …

  5. CBT for depression: a pilot RCT comparing mobile phone vs. computer

    Directory of Open Access Journals (Sweden)

    Watts Sarah

    2013-02-01

    Full Text Available Abstract Background This paper reports the results of a pilot randomized controlled trial comparing the delivery modality (mobile phone/tablet or fixed computer) of a cognitive behavioural therapy intervention for the treatment of depression. The aim was to establish whether a previously validated computerized program (The Sadness Program) remained efficacious when delivered via a mobile application. Method 35 participants were recruited with Major Depression (80% female) and randomly allocated to access the program using a mobile app (on either a mobile phone or iPad) or a computer. Participants completed 6 lessons, weekly homework assignments, and received weekly email contact from a clinical psychologist or psychiatrist until completion of lesson 2. After lesson 2, email contact was only provided in response to participant request, or in response to a deterioration in psychological distress scores. The primary outcome measure was the Patient Health Questionnaire 9 (PHQ-9). Of the 35 participants recruited, 68.6% completed 6 lessons and 65.7% completed the 3-month follow-up. Attrition was handled using mixed-model repeated-measures ANOVA. Results Both the Mobile and Computer Groups were associated with statistically significant benefits in the PHQ-9 at post-test. At 3 months follow-up, the reduction seen for both groups remained significant. Conclusions These results provide evidence to indicate that delivering a CBT program using a mobile application can result in clinically significant improvements in outcomes for patients with depression. Trial registration Australian New Zealand Clinical Trials Registry ACTRN 12611001257954

  6. Computational study of duct and pipe flows using the method of pseudocompressibility

    Science.gov (United States)

    Williams, Robert W.

    1991-01-01

    A viscous, three-dimensional, incompressible, Navier-Stokes Computational Fluid Dynamics code employing pseudocompressibility is used for the prediction of laminar primary and secondary flows in two 90-degree bends of constant cross section. Under study are a square cross section duct bend with 2.3 radius ratio and a round cross section pipe bend with 2.8 radius ratio. Sensitivity of predicted primary and secondary flow to inlet boundary conditions, grid resolution, and code convergence is investigated. Contour and velocity versus spanwise coordinate plots comparing prediction to experimental data flow components are shown at several streamwise stations before, within, and after the duct and pipe bends. Discussion includes secondary flow physics, computational method, computational requirements, grid dependence, and convergence rates.

  7. Comparative Observations of Learning Engagement by Students with Developmental Disabilities Using an iPad and Computer: A Pilot Study

    Science.gov (United States)

    Arthanat, Sajay; Curtin, Christine; Knotak, David

    2013-01-01

    This study examined the use of the Apple iPad for learning by children with developmental disabilities (DD), including those on the autism spectrum. A single case design was used to record the participation of four students with DD when taught with their standard computer at baseline, followed by the introduction of the iPad. A six-component…

  8. Human law and computer law comparative perspectives

    CERN Document Server

    Hildebrandt, Mireille

    2014-01-01

    This book probes the epistemological and hermeneutic implications of data science and artificial intelligence for democracy and the Rule of Law, and the challenges posed by computing technologies to traditional legal thinking and the regulation of human affairs.

  9. Quantitative Study on Computer Self-Efficacy and Computer Anxiety Differences in Academic Major and Residential Status

    Science.gov (United States)

    Binkley, Zachary Wayne McClellan

    2017-01-01

    This study investigates computer self-efficacy and computer anxiety within 61 students across two academic majors, Aviation and Sports and Exercise Science, while investigating the impact residential status, age, and gender has on those two psychological constructs. The purpose of the study is to find if computer self-efficacy and computer anxiety…

  10. A comparative study of the tail ion distribution with reduced Fokker-Planck models

    Science.gov (United States)

    McDevitt, C. J.; Tang, Xian-Zhu; Guo, Zehua; Berk, H. L.

    2014-03-01

    A series of reduced models are used to study the fast ion tail in the vicinity of a transition layer between plasmas at disparate temperatures and densities, which is typical of the gas and pusher interface in inertial confinement fusion targets. Emphasis is placed on utilizing progressively more comprehensive models in order to identify the essential physics for computing the fast ion tail at energies comparable to the Gamow peak. The resulting fast ion tail distribution is subsequently used to compute the fusion reactivity as a function of collisionality and temperature. While a significant reduction of the fusion reactivity in the hot spot compared to the nominal Maxwellian case is present, this reduction is found to be partially recovered by an increase of the fusion reactivity in the neighboring cold region.

  11. Study guide to accompany computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Study Guide to Accompany Computer and Data Processing provides information pertinent to the fundamental aspects of computers and computer technology. This book presents the key benefits of using computers.Organized into five parts encompassing 19 chapters, this book begins with an overview of the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. This text then introduces computer hardware and describes the processor. Other chapters describe how microprocessors are made and describe the physical operation of computers. This book discusses as w

  12. Racking the brain: Detection of cerebral edema on postmortem computed tomography compared with forensic autopsy

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Nicole [Institute of Forensic Medicine, Virtopsy, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, Raemistrasse 100, 8091 Zurich (Switzerland); Ampanozi, Garyfalia; Schweitzer, Wolf; Ross, Steffen G.; Gascho, Dominic [Institute of Forensic Medicine, Virtopsy, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Ruder, Thomas D. [Institute of Forensic Medicine, Virtopsy, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Institute of Diagnostic, Interventional and Pediatric Radiology, University Hospital of Bern, Freiburgstrasse, 3010 Bern (Switzerland); Thali, Michael J. [Institute of Forensic Medicine, Virtopsy, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Flach, Patricia M., E-mail: patricia.flach@irm.uzh.ch [Institute of Forensic Medicine, Virtopsy, University of Zurich, Winterthurerstrasse 190/52, 8057 Zurich (Switzerland); Institute of Diagnostic and Interventional Radiology, University Hospital of Zurich, Raemistrasse 100, 8091 Zurich (Switzerland)

    2015-04-15

    Graphical abstract: -- Highlights: •Postmortem swelling of the brain is a typical finding on PMCT and occurs concomitant with potential antemortem or agonal brain edema. •Cerebral edema despite normal postmortem swelling is indicated by narrowed temporal horns and symmetrical herniation of the cerebral tonsils on PMCT. •Cases with intoxication or asphyxia demonstrated higher deviations of the attenuation between white and gray matter (>20 Hounsfield Units) and a ratio >1.58 between the gray and white matter. •The Hounsfield measurements of the white and gray matter help to determine the cause of death in cases of intoxication or asphyxia. -- Abstract: Purpose: The purpose of this study was to compare postmortem computed tomography with forensic autopsy regarding their diagnostic reliability of differentiating between pre-existing cerebral edema and physiological postmortem brain swelling. Materials and methods: The study collective included a total of 109 cases (n = 109/200, 83 male, 26 female, mean age: 53.2 years) and were retrospectively evaluated for the following parameters (as related to the distinct age groups and causes of death): tonsillar herniation, the width of the outer and inner cerebrospinal fluid spaces and the radiodensity measurements (in Hounsfield Units) of the gray and white matter. The results were compared with the findings of subsequent autopsies as the gold standard for diagnosing cerebral edema. p-Values <0.05 were considered statistically significant. Results: Cerebellar edema (despite normal postmortem swelling) can be reliably assessed using postmortem computed tomography and is indicated by narrowed temporal horns and symmetrical herniation of the cerebellar tonsils (p < 0.001). There was a significant difference (p < 0.001) between intoxication (or asphyxia) and all other causes of death; the former causes demonstrated higher deviations of the attenuation between white and gray matter (>20 Hounsfield Units), and the gray to

  13. Racking the brain: Detection of cerebral edema on postmortem computed tomography compared with forensic autopsy

    International Nuclear Information System (INIS)

    Berger, Nicole; Ampanozi, Garyfalia; Schweitzer, Wolf; Ross, Steffen G.; Gascho, Dominic; Ruder, Thomas D.; Thali, Michael J.; Flach, Patricia M.

    2015-01-01

    Graphical abstract: -- Highlights: •Postmortem swelling of the brain is a typical finding on PMCT and occurs concomitant with potential antemortem or agonal brain edema. •Cerebral edema despite normal postmortem swelling is indicated by narrowed temporal horns and symmetrical herniation of the cerebral tonsils on PMCT. •Cases with intoxication or asphyxia demonstrated higher deviations of the attenuation between white and gray matter (>20 Hounsfield Units) and a ratio >1.58 between the gray and white matter. •The Hounsfield measurements of the white and gray matter help to determine the cause of death in cases of intoxication or asphyxia. -- Abstract: Purpose: The purpose of this study was to compare postmortem computed tomography with forensic autopsy regarding their diagnostic reliability of differentiating between pre-existing cerebral edema and physiological postmortem brain swelling. Materials and methods: The study collective included a total of 109 cases (n = 109/200, 83 male, 26 female, mean age: 53.2 years) and were retrospectively evaluated for the following parameters (as related to the distinct age groups and causes of death): tonsillar herniation, the width of the outer and inner cerebrospinal fluid spaces and the radiodensity measurements (in Hounsfield Units) of the gray and white matter. The results were compared with the findings of subsequent autopsies as the gold standard for diagnosing cerebral edema. p-Values <0.05 were considered statistically significant. Results: Cerebellar edema (despite normal postmortem swelling) can be reliably assessed using postmortem computed tomography and is indicated by narrowed temporal horns and symmetrical herniation of the cerebellar tonsils (p < 0.001). There was a significant difference (p < 0.001) between intoxication (or asphyxia) and all other causes of death; the former causes demonstrated higher deviations of the attenuation between white and gray matter (>20 Hounsfield Units), and the gray to
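
    The two attenuation criteria highlighted in this record (a gray-white difference above 20 Hounsfield Units and a gray-to-white ratio above 1.58) reduce to simple arithmetic on ROI means; the sketch below uses invented Hounsfield values and hypothetical case labels purely for illustration, not data from the study.

```python
# Sketch of the gray/white-matter attenuation comparison described above.
# ROI means are invented; the thresholds (>20 HU difference, ratio >1.58)
# are the ones quoted in the record's highlights.

def gm_wm_indices(gray_hu: float, white_hu: float):
    """Return (gray-white difference in HU, gray-to-white ratio)."""
    return gray_hu - white_hu, gray_hu / white_hu

cases = {
    "intoxication (hypothetical)": (58.0, 33.0),
    "trauma (hypothetical)": (38.0, 30.0),
}
for label, (gm, wm) in cases.items():
    diff, ratio = gm_wm_indices(gm, wm)
    flagged = diff > 20.0 and ratio > 1.58
    print(f"{label}: diff = {diff:.1f} HU, ratio = {ratio:.2f}, flagged = {flagged}")
```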

  14. Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide

    International Nuclear Information System (INIS)

    Altintas, S.

    1978-05-01

    The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as a square distribution of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles and is found to be in good agreement with the computer simulation data. Computer simulation of dislocation glide through randomly distributed obstacles containing up to 10^6 obstacles shows that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity cluster-hardening systems and are found to be in good agreement.

  15. Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide

    Energy Technology Data Exchange (ETDEWEB)

    Altintas, S.

    1978-05-01

    The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as a square distribution of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles and is found to be in good agreement with the computer simulation data. Computer simulation of dislocation glide through randomly distributed obstacles containing up to 10^6 obstacles shows that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity cluster-hardening systems and are found to be in good agreement.
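
    The superposition rule mentioned in this record, adding the critical stresses of the individual obstacle populations in quadrature, can be written in a few lines. The exact weighting used in the thesis is not reproduced here, and the numbers below are invented; the sketch only illustrates the general idea of quadratic superposition.

```python
# Sketch of a quadratic ("Pythagorean") superposition rule for the critical
# resolved shear stress of a mixture of obstacle populations. Whether this
# exact weighting matches the thesis is an assumption; it illustrates the
# general idea of adding individual contributions in quadrature.
import math

def crss_quadratic(tau_components):
    """tau_components: CRSS values (same units) that each obstacle
    population alone, at its actual density, would produce."""
    return math.sqrt(sum(t * t for t in tau_components))

# Invented example: weak solutes plus strong precipitates (MPa).
tau_solutes = 18.0
tau_precipitates = 55.0
print(f"mixture CRSS ~ {crss_quadratic([tau_solutes, tau_precipitates]):.1f} MPa")
# The result (about 57.9 MPa) lies below the linear sum (73 MPa), reflecting
# that weak and strong obstacles do not simply add linearly.
```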

  16. Comparative study of Graves' ophthalmopathy by ultrasonography, computed tomography, and fish bioassay

    International Nuclear Information System (INIS)

    Mann, K.; Schoener, W.; Juengst, D.; Karl, H.J.; Maier-Hauff, K.; Rothe, R.

    1979-01-01

    In 35 patients with Graves' ophthalmopathy (GO) thyroid function was tested by T3-RIA, T4-RIA, TBI, TRH-test, thyroid scanning, and determination of thyroid autoantibodies. In addition, ultrasonography (A-scan), computed tomography (CT) of the orbit, and determination of an exophthalmogenic serum activity in a fish bioassay were performed. Typical alterations for GO were observed in 26 cases with ultrasonography. CT showed an enlargement of the medial and/or lateral rectus muscles in 24 of 33 patients, and in 17 cases a region of high density in the apex of the muscle cone. The density of retrobulbar fat after i.v. injection of contrast medium did not differ significantly from that observed in normal men. Characteristic signs of GO were not detected in only 2 cases using both methods together. Exophthalmogenic serum activity was found in the IgG fraction of serum protein. The incidence rate was high (69%), but for diagnostic purposes the fish bioassay cannot be recommended. (orig.)

  17. Cloud computing for comparative genomics with windows azure platform.

    Science.gov (United States)

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  18. Comparative Evaluation of the Pharyngeal Airway Space in Unilateral and Bilateral Cleft Lip and Palate Individuals With Noncleft Individuals: A Cone Beam Computed Tomography Study.

    Science.gov (United States)

    Gandedkar, Narayan H; Chng, Chai Kiat; Basheer, Mohammad Abdul; Chen, Por Yong; Yeow, Vincent Kok Leng

    2017-09-01

      To evaluate the pharyngeal airway space changes in complete unilateral cleft lip and palate (UCLP) and bilateral cleft lip and palate (BCLP) individuals, and compare them with age- and sex-matched noncleft (NC) control subjects.   Retrospective study.   Cleft and Craniofacial Centre, KK Women's and Children's Hospital, Singapore.   Twenty UCLP (mean age: 13.4 ± 0.5 years), 18 BCLP (mean age: 13.5 ± 0.5 years) and 20 skeletal Class I subjects (mean age: 13.4 ± 0.6 years) were included in the study. Cone beam computed tomography scans were assessed for pharyngeal airway space (PAS) (oropharyngeal, nasopharyngeal, and total airway space volume), and compared with the PAS of age- and sex-matched skeletal Class I NC individuals.   Pharyngeal airway space showed statistically significant differences among the UCLP, BCLP, and NC control subjects, including oropharyngeal (9338 ± 1108 mm³) and total airway space (12 250 ± 1185 mm³) volumes.   The pharyngeal airway space was significantly smaller in the BCLP group than in the UCLP and control groups. This reduced PAS should be taken into account when planning treatment for these individuals.

  19. Risk indices in comparative risk assessment studies

    International Nuclear Information System (INIS)

    Hubert, P.

    1984-01-01

    More than a decade ago the development of comparative risk assessment studies aroused overwhelming interest. There was no doubt that data on the health and safety aspects of energy systems would greatly benefit, or even end, the debate on nuclear energy. Although such attempts are still strongly supported, the rose-coloured expectations of the early days have faded. The high uncertainties, and the contradictory aspect, of the first results might explain this evolution. The loose connection between the range of computed risk indices and the questions on which the debate was focused is another reason for this decline in interest. Important research work is being carried out aiming at reducing the different kinds of uncertainties. Rather than the uncertainties, the paper considers the meaning of available risk indices and proposes more significant indices with respect to the goals of risk assessment. First, the indices which are of frequent use in comparative studies are listed. The stress is put on a French comparative study from which most examples are drawn. Secondly, the increase in magnitude of the indices and the decrease in the attributability of the risk to a given system is shown to be a consequence of the trend towards more comprehensive analyses. Thirdly, the ambiguity of such indices as the collective occupational risk is underlined, and a possible solution is suggested. Whenever risk assessments are related to pragmatic decision making problems it is possible to find satisfactory risk indices. The development of cost-effectiveness analyses and the proposals for quantitative safety goals clearly demonstrate this point. In the field of comparison of social impacts some proposals are made, but there remain some gaps still to be filled. (author)

  20. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people’s lives. Firstly, this paper expounds the concept of cloud computing and introduces two several traditional parallel programming model. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and Map Reduce respectively. Finally, it takes MPI, OpenMP models compared to Map Reduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.

  1. Planar Millimeter-Wave Antennas: A Comparative Study

    Directory of Open Access Journals (Sweden)

    K. Pitra

    2011-04-01

    The paper describes the design and the experimental verification of three types of wideband antennas. Attention is turned to the bow-tie antenna, the Vivaldi antenna and the spiral antenna designed for operation at millimeter waves. Bandwidth, input impedance, gain, and directivity pattern are the investigated parameters. The antennas are compared on the basis of computer simulations in CST Microwave Studio and measured data.

  2. Comparative evaluation of soft and hard tissue dimensions in the anterior maxilla using radiovisiography and cone beam computed tomography: A pilot study

    Directory of Open Access Journals (Sweden)

    Savita Mallikarjun

    2016-01-01

    Aims: To assess and compare the thickness of gingiva in the anterior maxilla using radiovisiography (RVG) and cone beam computed tomography (CBCT) and its correlation with the thickness of underlying alveolar bone. Settings and Design: This cross-sectional study included 10 male subjects in the age group of 20–45 years. Materials and Methods: After analyzing the width of keratinized gingiva of the maxillary right central incisor, the radiographic assessment was done using a modified technique for RVG and CBCT, to measure the thickness of both the labial gingiva and the labial plate of alveolar bone at 4 predetermined locations along the length of the root in each case. Statistical Analysis Used: Statistical analysis was performed using Student's t-test and Pearson's correlation test, with the help of statistical software (SPSS V13). Results: No statistically significant differences were obtained in the measurements made using RVG and CBCT. The results of the present study also failed to reveal any significant correlation between the width of the gingiva and the alveolar bone in the maxillary anterior region. Conclusions: Within the limitations of this study, it can be concluded that both CBCT and RVG can be used as valuable tools in the assessment of the soft and hard tissue dimensions.

  3. Online Studies on Variation in Orthopedic Surgery: Computed Tomography in MPEG4 Versus DICOM Format

    NARCIS (Netherlands)

    Mellema, Jos J.; Mallee, Wouter H.; Guitton, Thierry G.; van Dijk, C. Niek; Ring, David; Doornberg, Job N.; Babis, G. C.; Jeray, K. J.; Prayson, M. J.; Pesantez, R.; Acacio, R.; Verbeek, D. O.; Melvanki, P.; Kreis, B. E.; Mehta, S.; Meylaerts, S.; Wojtek, S.; Yeap, E. J.; Haapasalo, H.; Kristan, A.; Coles, C.; Marsh, J. L.; Mormino, M.; Menon, M.; Tyllianakis, M.; Schandelmaier, P.; Jenkinson, R. J.; Neuhaus, V.; Shahriar, C. M. H.; Belangero, W. D.; Kannan, S. G.; Leonidovich, G. M.; Davenport, J. H.; Kabir, K.; Althausen, P. L.; Weil, Y.; Toom, A.; Sa da Costa, D.; Lijoi, F.; Koukoulias, N. E.; Manidakis, N.; van den Bogaert, M.; Patczai, B.; Grauls, A.; Kurup, H.; van den Bekerom, M. P.; Lansdaal, J. R.; Vale, M.; Ousema, P.; Barquet, A.; Cross, B. J.; Broekhuyse, H.; Haverkamp, D.; Merchant, M.; Harvey, E.; Pemovska, E. Stojkovska; Frihagen, F.; Seibert, F. J.; Garnavos, C.; van der Heide, H.; Villamizar, H. A.; Harris, I.; Borris, L. C.; Brink, O.; Brink, P. R. G.; Choudhari, P.; Swiontkowski, M.; Mittlmeier, T.; Tosounidis, T.; van Rensen, I.; Martinelli, N.; Park, D. H.; Lasanianos, N.; Vide, J.; Engvall, A.; Zura, R. D.; Jubel, A.; Kawaguchi, A.; Goost, H.; Bishop, J.; Mica, L.; Pirpiris, M.; van Helden, S. H.; Bouaicha, S.; Schepers, T.; Havliček, T.; Giordano, V.

    2017-01-01

    The purpose of this study was to compare the observer participation and satisfaction as well as interobserver reliability between two online platforms, Science of Variation Group (SOVG) and Traumaplatform Study Collaborative, for the evaluation of complex tibial plateau fractures using computed

  4. Comparative study on computed orthopantomography and film radiographic techniques in the radiography of temporomandibular joint

    International Nuclear Information System (INIS)

    Chen Tao; Ning Lixia; Liu Yuai; Li Ningyi; Chen Feng

    2007-01-01

    Objective: To compare computed orthopantomography (COPT) with Schüller radiography (SR), film orthopantomography (FOPT) and other traditional radiographic techniques in the radiography of the temporomandibular joint (TMJ). Methods: Ninety-eight cases were randomly divided into 3 groups, and the open and closed positions of the TMJs of both sides were examined with SR, FOPT, and COPT, respectively. The satisfactory rates of the X-ray pictures were statistically analyzed with the Pearson chi-square test in SPSS 10.0, and differences between groups were analyzed with the q test. Results: In the mandibular ramus region of the TMJ, 144 of 144 open- and closed-position pictures in the COPT group, 128 of 128 in the FOPT group, and 6 of 120 in the SR group were satisfactory, giving satisfactory rates of 100%, 100%, and 5%, respectively (P < 0.01); the difference between the FOPT and COPT groups was not statistically significant. The exposure was as follows: COPT, 99-113 mAs; FOPT, 210-225 mAs; and SR, 48-75 mAs. Therefore, COPT and FOPT were superior to SR in the pictures of the mandibular ramus, coronoid process, and incisure, but inferior in the joint space pictures. The satisfactory rates for the condylar process and articular tubercle were the same in the 3 groups. The exposure of the FOPT group was greater than that of the COPT and SR groups. Conclusion: COPT is superior to SR and FOPT in TMJ radiography, and should be applied widely in the clinic. (authors)

  5. Students' Computing Use and Study: When More is Less

    Directory of Open Access Journals (Sweden)

    Christine A McLachlan

    2016-02-01

    Since the turn of the century there has been a steady decline in enrolments of students in senior secondary computing classes in Australia. A flow-on effect has seen reduced enrolments in tertiary computing courses and subsequent predictions of shortages in skilled computing professionals. This paper investigates the relationship between students’ computing literacy levels, their use of and access to computing tools, and students’ interest in and attitudes to formal computing study. Through the use of secondary data obtained from Australian and international reports, a reverse effect was discovered, indicating that the more students used computing tools, the less interested they became in computing studies.

  6. ScalaLab and GroovyLab: Comparing Scala and Groovy for Scientific Computing

    Directory of Open Access Journals (Sweden)

    Stergios Papadimitriou

    2015-01-01

    ScalaLab and GroovyLab are both MATLAB-like environments for the Java Virtual Machine. ScalaLab is based on the Scala programming language and GroovyLab is based on the Groovy programming language. They present similar user interfaces and functionality to the user. They also share the same set of Java scientific libraries and of native code libraries. From the programmer's point of view though, they have significant differences. This paper compares some aspects of the two environments and highlights some of the strengths and weaknesses of Scala versus Groovy for scientific computing. The discussion also examines some aspects of the dilemma of using dynamic typing versus static typing for scientific programming. The performance of the Java platform is continuously improved at a fast pace. Today Java can effectively support demanding high-performance computing and scales well on multicore platforms. Thus, both systems can challenge the performance of the traditional C/C++/Fortran scientific code with an easier to use and more productive programming environment.

  7. Mandibular dimensions of subjects with asymmetric skeletal class III malocclusion and normal occlusion compared with cone-beam computed tomography.

    Science.gov (United States)

    Lee, HyoYeon; Bayome, Mohamed; Kim, Seong-Hun; Kim, Ki Beom; Behrents, Rolf G; Kook, Yoon-Ah

    2012-08-01

    The purpose of this study was to use cone-beam computed tomography to compare mandibular dimensions in subjects with asymmetric skeletal Class III malocclusion and those with normal occlusion. Cone-beam computed tomography scans of 38 subjects with normal occlusion and 28 patients with facial asymmetry were evaluated and digitized with Invivo software (Anatomage, San Jose, Calif). Three midsagittal and 13 right and left measurements were taken. The paired t test was used to compare the right and left sides in each group. The Mann-Whitney U test was used to compare the midsagittal variables and the differences between the 2 sides of the group with normal occlusion with those of asymmetry patients. The posterior part of the mandibular body showed significant differences between the deviated and nondeviated sides in asymmetric Class III patients. The difference of the asymmetry group was significantly greater than that of the normal occlusion group for the mediolateral ramal and the anteroposterior condylar inclinations (P = 0.007 and P = 0.019, respectively). The asymmetric skeletal Class III group showed significant differences in condylar height, ramus height, and posterior part of the mandibular body compared with the subjects with normal occlusion. These results might be useful for diagnosis and treatment planning of asymmetric Class III patients. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  8. Indications for computed tomography (CT- diagnostics in proximal humeral fractures: a comparative study of plain radiography and computed tomography

    Directory of Open Access Journals (Sweden)

    Weise Kuno

    2009-04-01

    Background: Precise indications for computed tomography (CT) in proximal humeral fractures are not established. The purpose of this study was a comparison of conventional radiographic views with different CT reconstructions with 2D and 3D imaging to establish indications for additional CT diagnostics depending on the fractured parts. Methods: In a prospective diagnostic study in two level 1 trauma centers, 44 patients with proximal humeral fractures were diagnosed with conventional X-rays (22 AP + axillary views, 22 AP + scapular Y-views) and CT (multi-planar reconstruction (MPR) and maximum intensity projection (MIP) with 2D and 3D imaging). Three observers assessed the technical image quality, the assessment of the relevant anatomical structures (2-sample t-test) and the percentage of the osseous overlap of the proximal humerus (Welch test) using a scoring system. The quality of the different diagnostic methods was assessed according to the number of fractured parts (Bonferroni-Holm adjustment). Results: There was significantly more overlap of the fractured region on the scapular Y-views (mean 71.5%, range 45–90%) than on axillary views (mean 56.2%, range 10.5–100%). CT diagnostics allowed a significantly better assessment of the relevant structures than conventional diagnostics. Conclusion: Conventional X-rays with an AP view and a high-quality axillary view are useful for primary diagnostics of the fracture and often, but not always, show a clear presentation of the relevant bony structures such as both tuberosities, the glenoid and the humeral head. CT with thin-slice technology and additional 3D imaging always provides a clear presentation of the fractured region. Clinically, a CT should be performed – independently of the number of fractured parts – when the proximal humerus and the shoulder joint are not presented with sufficient X-ray quality to establish a treatment plan.

  9. Computed tomography versus intravenous urography in diagnosis of acute flank pain from urolithiasis: a randomized study comparing imaging costs and radiation dose

    International Nuclear Information System (INIS)

    Thomson, J.M.Z.; Maling, T.M.J.; Glocer, J.; Mark, S.; Abbott, C.

    2001-01-01

    The equivalent sensitivity of non-contrast computed tomography (NCCT) and intravenous urography (IVU) in the diagnosis of suspected ureteric colic has been established. Approximately 50% of patients with suspected ureteric colic do not have a nephro-urological cause for pain. Because many such patients require further imaging studies, NCCT may obviate the need for these studies and, in so doing, be more cost effective and involve less overall radiation exposure. The present study compares the total imaging cost and radiation dose of NCCT versus IVU in the diagnosis of acute flank pain. Two hundred and twenty-four patients (157 men; mean age 45 years; age range 19-79 years) with suspected renal colic were randomized either to NCCT or IVU. The number of additional diagnostic imaging studies, cost (IVU A$ 136; CTU A$ 173), radiation exposure and imaging times were compared. Of 119(53%) patients with renal obstruction, 105 had no nephro-urological causes of pain. For 21 (20%) of these patients an alternative diagnosis was made at the initial imaging, 10 of which were significant. Of 118 IVU patients, 28 (24%) required 32 additional imaging tests to reach a diagnosis, whereas seven of 106 (6%) NCCT patients required seven additional imaging studies. The average total diagnostic imaging cost for the NCCT group was A$181.94 and A$175.46 for the IVU group (P< 0.43). Mean radiation dose to diagnosis was 5.00 mSv (NCCT) versus 3.50 mSv (IVU) (P < 0.001). Mean imaging time was 30 min (NCCT) versus 75 min (IVU) (P < 0.001). Diagnostic imaging costs were remarkably similar. Although NCCT involves a higher radiation dose than IVU, its advantages of faster diagnosis, the avoidance of additional diagnostic imaging tests and its ability to diagnose other causes makes it the study of choice for acute flank pain at Christchurch Hospital. Copyright (2001) Blackwell Science Pty Ltd

  10. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  11. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program. The output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone due to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements, determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones such as bones with implants and with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading by utilizing a CT data file of the specific bone as an input to the processor with the FE program.

  12. Comparative study of heuristic evaluation and usability testing methods.

    Science.gov (United States)

    Thyvalikakath, Thankam Paul; Monaco, Valerie; Thambuganipalle, Himabindu; Schleyer, Titus

    2009-01-01

    Usability methods, such as heuristic evaluation, cognitive walk-throughs and user testing, are increasingly used to evaluate and improve the design of clinical software applications. There is still some uncertainty, however, as to how those methods can be used to support the development process and evaluation in the most meaningful manner. In this study, we compared the results of a heuristic evaluation with those of formal user tests in order to determine which usability problems were detected by both methods. We conducted heuristic evaluation and usability testing on four major commercial dental computer-based patient records (CPRs), which together cover 80% of the market for chairside computer systems among general dentists. Both methods yielded strong evidence that the dental CPRs have significant usability problems. An average of 50% of empirically-determined usability problems were identified by the preceding heuristic evaluation. Some statements of heuristic violations were specific enough to precisely identify the actual usability problem that study participants encountered. Other violations were less specific, but still manifested themselves in usability problems and poor task outcomes. In this study, heuristic evaluation identified a significant portion of problems found during usability testing. While we make no assumptions about the generalizability of the results to other domains and software systems, heuristic evaluation may, under certain circumstances, be a useful tool to determine design problems early in the development cycle.

  13. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. University center for memory disorders. Fifty-two patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, predictive values, positive and negative, were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group, with unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
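
    The predictive values quoted above follow from sensitivity, specificity, and the assumed prevalence through Bayes' rule. A minimal sketch of that arithmetic (not the authors' code; small differences from the reported figures can arise from rounding or from the use of raw counts):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence              # true-positive fraction of all screened
    fp = (1 - specificity) * (1 - prevalence)  # false-positive fraction
    tn = specificity * (1 - prevalence)        # true-negative fraction
    fn = (1 - sensitivity) * prevalence        # false-negative fraction
    return tp / (tp + fp), tn / (tn + fn)

# Figures reported in the abstract: sensitivity 0.83, specificity 0.96, prevalence 10%.
ppv, npv = predictive_values(0.83, 0.96, 0.10)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # roughly 0.70 and 0.98 with these inputs
```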

  14. Comparison of Swedish and Norwegian Use of Cone-Beam Computed Tomography: a Questionnaire Study

    Directory of Open Access Journals (Sweden)

    Jerker Edén Strindberg

    2015-12-01

    Objectives: In some countries, cone-beam computed tomography in dentistry can be used by dentists other than specialists in radiology. The frequency of purchasing cone-beam computed tomography to examine patients is growing rapidly, so knowledge of how to use it is very important. The aim was to compare the outcome of an investigation on the use of cone-beam computed tomography in Sweden with a previous Norwegian study, specifically regarding technical aspects. Material and Methods: The questionnaire contained 45 questions, including 35 questions comparable to those sent to Norwegian clinics one year previously. Results were based on inter-comparison of the outcomes of the two questionnaire studies. Results: The response rate in Sweden was 71%. There, most cone-beam computed tomography (CBCT) examinations were performed by dental nurses, while in Norway they were performed by specialists. More than two-thirds of the CBCT units had a scout image function, regularly used in both Sweden (79%) and Norway (75%). In Sweden 4% and in Norway 41% of the respondents did not wait for the report from the radiographic specialist before initiating treatment. Conclusions: The bilateral comparison showed an overall similarity between the two countries. The survey gave explicit and important knowledge of the need for education and training of the whole team, since the radiation dose to the patient could vary considerably for the same kind of radiographic examination. It is essential to establish quality assurance protocols with defined responsibilities in the team in order to maintain high diagnostic accuracy for all examinations when using cone-beam computed tomography for patient examinations.

  15. The comparative effect of individually-generated vs. collaboratively-generated computer-based concept mapping on science concept learning

    Science.gov (United States)

    Kwon, So Young

    Using a quasi-experimental design, the researcher investigated the comparative effects of individually-generated and collaboratively-generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight, seventh grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of the three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning as demonstrated by comprehension test scores, and quality of concept maps created by students in experimental groups as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually-generated concept mapping group, and the collaboratively-generated concept mapping group had equivalent learning experiences for 50 minutes during five days, excepting that students in a control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh grade middle school science concept learning but neither strategy was more effective than the other. However

  16. A comparative study between use of arthroscopic lavage and arthrocentesis of temporomandibular joint based on computational fluid dynamics analysis.

    Directory of Open Access Journals (Sweden)

    Yue Xu

    Full Text Available Arthroscopic lavage and arthrocentesis, performed with different inner-diameter lavage needles, are the current minimally invasive techniques used in temporomandibular joint disc displacement (TMJ-DD for pain reduction and functional improvement. In the current study, we aimed to explore the biomechanical influence and explain the diverse clinical outcomes of these two approaches with computational fluid dynamics. Data was retrospectively analyzed from 78 cases that had undergone arthroscopic lavage or arthrocentesis for TMJ-DD from 2002 to 2010. Four types of finite volume models, featuring irrigation needles of different diameters, were constructed based on computed tomography images. We investigated the flow pattern and pressure distribution of lavage fluid secondary to caliber-varying needles. Our results demonstrated that the size of outflow portal was the critical factor in determining irrigated flow rate, with a larger inflow portal and a smaller outflow portal leading to higher intra-articular pressure. This was consistent with clinical data suggesting that increasing the mouth opening and maximal contra-lateral movement led to better outcomes following arthroscopic lavage. The findings of this study could be useful for choosing the lavage apparatus according to the main complaint of pain, or limited mouth opening, and examination of joint movements.

  17. Implant Supported Distal Extension over Denture Retained by Two Types of Attachments. A Comparative Radiographic Study by Cone Beam Computed Tomography

    Science.gov (United States)

    Mahrous, Ahmed I; Aldawash, Hussien A; Soliman, Tarek A; Banasr, Fahad H; Abdelwahed, Ahmed

    2015-01-01

    Background: This study was conducted to compare and evaluate the effect of two different attachments (a locator attachment and a ball and socket [B&S] attachment) on the supporting structures of implants and natural abutments, in cases of limited inter-arch space in mandibular Kennedy Class I implant-supported removable partial overdentures, by measuring bone height changes with cone beam radiographic technology. Materials and Methods: Two implants were positioned in the first or second molar area following the two-stage surgical protocol. The sides were divided into two equal groups of ten each. Group I: sides on which the implants were restored with the locator attachment. Group II: the other sides, on which the implants were restored with the B&S attachment. The supporting structures of the implants and main abutments of each group were evaluated at the time of removable partial overdenture insertion and at 6, 12 and 18 months by measuring bone height changes using cone beam computed tomography. Results: Implants with the locator attachment showed better marginal bone height outcomes for the supporting structures of both implants and main abutments. Conclusion: Implants restored with the locator attachment show better effects on the bone of both the main natural abutments and the implant than those restored with the ball and socket attachment. PMID:26028894

  18. Clinical cone beam computed tomography compared to high-resolution peripheral computed tomography in the assessment of distal radius bone.

    Science.gov (United States)

    de Charry, C; Boutroy, S; Ellouz, R; Duboeuf, F; Chapurlat, R; Follet, H; Pialat, J B

    2016-10-01

    Clinical cone beam computed tomography (CBCT) was compared to high-resolution peripheral quantitative computed tomography (HR-pQCT) for the assessment of ex vivo radii. Strong correlations were found for geometry, volumetric density, and trabecular structure. Using CBCT, bone architecture assessment was feasible but compared to HR-pQCT, trabecular parameters were overestimated whereas cortical ones were underestimated. HR-pQCT is the most widely used technique to assess bone microarchitecture in vivo. Yet, this technology has been applicable only at peripheral sites, in only a few research centers. Clinical CBCT is more widely available but quantitative assessment of the bone structure is usually not performed. We aimed to compare the assessment of bone structure with CBCT (NewTom 5G, QR, Verona, Italy) and HR-pQCT (XtremeCT, Scanco Medical AG, Brüttisellen, Switzerland). Twenty-four distal radius specimens were scanned with these two devices with a reconstructed voxel size of 75 μm for the NewTom 5G and 82 μm for the XtremeCT, respectively. A rescaling-registration scheme was used to define the common volume of interest. Cortical and trabecular compartments were separated using a semiautomated double contouring method. Density and microstructure were assessed with the HR-pQCT software on both modality images. Strong correlations were found for geometry parameters (r = 0.98-0.99), volumetric density (r = 0.91-0.99), and trabecular structure (r = 0.94-0.99), all p < 0.001. Correlations were lower for cortical microstructure (r = 0.80-0.89), p < 0.001. However, absolute differences were observed between modalities for all parameters, with an overestimation of the trabecular structure (trabecular number, 1.62 ± 0.37 vs. 1.47 ± 0.36 mm⁻¹) and an underestimation of the cortical microstructure (cortical porosity, 3.3 ± 1.3 vs. 4.4 ± 1.4 %) assessed on CBCT images compared to HR-pQCT images. Clinical CBCT devices are able to

  19. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To utilize digital techniques fully for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. The current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as an integral part of the problem-solving system to process large amounts of data, to implement a control law and even to support decision-making. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and a big computer are used; why economic process computers will be allocated to nuclear plants in the future; and why the supercomputer should be demonstrated at once. (Mori, K.)

  20. New accountant job market reform by computer algorithm: an experimental study

    Directory of Open Access Journals (Sweden)

    Hirose Yoshitaka

    2017-01-01

    The purpose of this study is to examine the matching of new accountants with accounting firms in Japan. A notable feature of the present study is that it brings a computer algorithm to the job-hiring task. Job recruitment activities for new accountants in Japan are one-time, short-term struggles. Accordingly, many have searched for new rules to replace the current ones of the process. Job recruitment activities for new accountants in Japan change every year. This study proposes modifying these job recruitment activities by combining computer and human efforts. Furthermore, the study formulates the job recruitment activities by using a model and conducting experiments. As a result, the Deferred Acceptance (DA) algorithm yields a high truth-telling percentage, a high stable-matching percentage, and greater efficiency compared with the previous approach. This suggests the potential of the Deferred Acceptance algorithm as a replacement for current approaches. In terms of accuracy and stability, the DA algorithm is superior to the current methods and should be adopted.
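
    For readers unfamiliar with the mechanism being tested, an illustrative applicant-proposing Deferred Acceptance (Gale-Shapley) implementation is sketched below with a hypothetical toy instance; it is not the authors' experimental software, and the preference lists and capacities are invented for illustration:

```python
def deferred_acceptance(applicant_prefs, firm_prefs, capacities):
    """Applicant-proposing deferred acceptance (Gale-Shapley) matching."""
    # rank[f][a] = position of applicant a in firm f's preference list (lower is better).
    rank = {f: {a: i for i, a in enumerate(prefs)} for f, prefs in firm_prefs.items()}
    free = list(applicant_prefs)               # applicants who still need to propose
    next_choice = {a: 0 for a in applicant_prefs}
    matched = {f: [] for f in firm_prefs}      # tentative acceptances per firm
    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue                           # applicant has exhausted their list
        f = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        matched[f].append(a)
        matched[f].sort(key=lambda x: rank[f][x])
        if len(matched[f]) > capacities[f]:    # firm rejects its least-preferred holder
            free.append(matched[f].pop())
    return matched

# Hypothetical toy instance: two accounting firms, three new accountants.
applicant_prefs = {"a1": ["f1", "f2"], "a2": ["f1", "f2"], "a3": ["f2", "f1"]}
firm_prefs = {"f1": ["a2", "a1", "a3"], "f2": ["a1", "a3", "a2"]}
print(deferred_acceptance(applicant_prefs, firm_prefs, capacities={"f1": 1, "f2": 2}))
```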

  1. Computational study of performance characteristics for truncated conical aerospike nozzles

    Science.gov (United States)

    Nair, Prasanth P.; Suryan, Abhilash; Kim, Heuy Dong

    2017-12-01

    Aerospike nozzles are advanced rocket nozzles that can maintain their aerodynamic efficiency over a wide range of altitudes. They belong to the class of altitude-compensating nozzles. A vehicle with an aerospike nozzle uses less fuel at low altitudes, where most missions have the greatest need for thrust, due to its altitude adaptability. Aerospike nozzles are better suited to Single Stage to Orbit (SSTO) missions than conventional nozzles. In the current study, the flow through 20% and 40% aerospike nozzles is analyzed in detail using computational fluid dynamics techniques. A steady-state analysis with implicit formulation is carried out. The Reynolds-averaged Navier-Stokes equations are solved with the Spalart-Allmaras turbulence model. The results are compared with experimental results from previous work. The transition from open wake to closed wake happens at a lower nozzle pressure ratio for the 20% aerospike nozzle than for the 40% one.

  2. An Intervention Study on Mental Computation for Second Graders in Taiwan

    Science.gov (United States)

    Yang, Der-Ching; Huang, Ke-Lun

    2014-01-01

    The authors compared the mental computation performance and mental strategies used by an experimental Grade 2 class and a control Grade 2 class before and after instructional intervention. Results indicate that students in the experimental group had better performance on mental computation. The use of mental strategies (counting, separation,…

  3. A Codesign Case Study in Computer Graphics

    DEFF Research Database (Denmark)

    Brage, Jens P.; Madsen, Jan

    1994-01-01

    The paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  4. Computational studies of tokamak plasmas

    International Nuclear Information System (INIS)

    Takizuka, Tomonori; Tsunematsu, Toshihide; Tokuda, Shinji

    1981-02-01

    Computational studies of tokamak plasmas are extensively advanced. Many computational codes have been developed by using several kinds of models, i.e., the finite element formulation of MHD equations, the time dependent multidimensional fluid model, and the particle model with the Monte-Carlo method. These codes are applied to the analyses of the equilibrium of an axisymmetric toroidal plasma (SELENE), the time evolution of the high-beta tokamak plasma (APOLLO), the low-n MHD stability (ERATO-J) and high-n ballooning mode stability (BOREAS) in the INTOR tokamak, the nonlinear MHD stability, such as the positional instability (AEOLUS-P), resistive internal mode (AEOLUS-I) etc., and the divertor functions. (author)

  5. The Effects of Computer Simulation and Animation (CSA) on Students' Cognitive Processes: A Comparative Case Study in an Undergraduate Engineering Course

    Science.gov (United States)

    Fang, N.; Tajvidi, M.

    2018-01-01

    This study focuses on the investigation of the effects of computer simulation and animation (CSA) on students' cognitive processes in an undergraduate engineering course. The revised Bloom's taxonomy, which consists of six categories in the cognitive process domain, was employed in this study. Five of the six categories were investigated,…

  6. Computing and Comparing Effective Properties for Flow and Transport in Computer-Generated Porous Media

    KAUST Repository

    Allen, Rebecca; Sun, Shuyu

    2017-01-01

    We compute effective properties (i.e., permeability, hydraulic tortuosity, and diffusive tortuosity) of three different digital porous media samples, including in-line array of uniform shapes, staggered-array of squares, and randomly distributed squares. The permeability and hydraulic tortuosity are computed by solving a set of rescaled Stokes equations obtained by homogenization, and the diffusive tortuosity is computed by solving a homogenization problem given for the effective diffusion coefficient that is inversely related to diffusive tortuosity. We find that hydraulic and diffusive tortuosity can be quantitatively different by up to a factor of ten in the same pore geometry, which indicates that these tortuosity terms cannot be used interchangeably. We also find that when a pore geometry is characterized by an anisotropic permeability, the diffusive tortuosity (and correspondingly the effective diffusion coefficient) can also be anisotropic. This finding has important implications for reservoir-scale modeling of flow and transport, as it is more realistic to account for the anisotropy of both the permeability and the effective diffusion coefficient.
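
    For readers unfamiliar with the two tortuosity notions being contrasted, commonly used working definitions are sketched below; the exact homogenization formulation in the paper may differ in normalization, so treat these as assumptions rather than the authors' equations:

```latex
% Hydraulic tortuosity from the velocity field u of the (rescaled) Stokes problem,
% and diffusive tortuosity from the effective diffusion coefficient D_eff
% (phi = porosity, D_0 = molecular diffusivity, x = mean-flow direction).
\[
  \tau_h \;=\; \frac{\langle \lvert \mathbf{u} \rvert \rangle}{\langle u_x \rangle},
  \qquad
  D_{\mathrm{eff}} \;=\; \frac{\phi}{\tau_d}\, D_0
  \;\;\Longleftrightarrow\;\;
  \tau_d \;=\; \frac{\phi\, D_0}{D_{\mathrm{eff}}}.
\]
```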

  8. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship of the characteristics of the examiners to the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  9. Study of three-dimensional Rayleigh-Taylor instability in compressible fluids through level set method and parallel computation

    International Nuclear Information System (INIS)

    Li, X.L.

    1993-01-01

    Computation of three-dimensional (3-D) Rayleigh-Taylor instability in compressible fluids is performed on a MIMD computer. A second-order TVD scheme is applied with a fully parallelized algorithm to the 3-D Euler equations. The computational program is implemented for a 3-D study of bubble evolution in the Rayleigh-Taylor instability with varying bubble aspect ratio and for large-scale simulation of a 3-D random fluid interface. The numerical solution is compared with the experimental results by Taylor
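
    As context for the level set method named in the title, a much-simplified two-dimensional, first-order upwind advection step for a level-set field is sketched below (NumPy, periodic boundaries assumed); the study itself applies a second-order TVD scheme to the full 3-D compressible Euler equations, so this is illustrative only. The fluid interface is the zero contour of phi after each step.

```python
import numpy as np

def advect_level_set(phi, u, v, dx, dy, dt):
    """One first-order upwind step of d(phi)/dt + u*d(phi)/dx + v*d(phi)/dy = 0."""
    # One-sided differences in each direction (periodic boundaries via np.roll).
    dphi_dx_back = (phi - np.roll(phi, 1, axis=0)) / dx
    dphi_dx_fwd  = (np.roll(phi, -1, axis=0) - phi) / dx
    dphi_dy_back = (phi - np.roll(phi, 1, axis=1)) / dy
    dphi_dy_fwd  = (np.roll(phi, -1, axis=1) - phi) / dy
    # Upwinding: take the difference on the side the flow is coming from.
    dphi_dx = np.where(u > 0, dphi_dx_back, dphi_dx_fwd)
    dphi_dy = np.where(v > 0, dphi_dy_back, dphi_dy_fwd)
    return phi - dt * (u * dphi_dx + v * dphi_dy)
```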

  10. Non-Determinism: An Abstract Concept in Computer Science Studies

    Science.gov (United States)

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  11. Computer-Assisted, Self-Interviewing (CASI) Compared to Face-to-Face Interviewing (FTFI) with Open-Ended, Non-Sensitive Questions

    OpenAIRE

    John Fairweather PhD; Tiffany Rinne PhD; Gary Steel PhD

    2012-01-01

    This article reports results from research on cultural models, and assesses the effects of computers on data quality by comparing open-ended questions asked in two formats—face-to-face interviewing (FTFI) and computer-assisted, self-interviewing (CASI). We expected that for our non-sensitive topic, FTFI would generate fuller and richer accounts because the interviewer could facilitate the interview process. Although the interviewer indeed facilitated these interviews, which resulted in more w...

  12. Structure and dynamics of amorphous polymers: computer simulations compared to experiment and theory

    International Nuclear Information System (INIS)

    Paul, Wolfgang; Smith, Grant D

    2004-01-01

    This contribution considers recent developments in the computer modelling of amorphous polymeric materials. Progress in our capabilities to build models for the computer simulation of polymers from the detailed atomistic scale up to coarse-grained mesoscopic models, together with the ever-improving performance of computers, has led to important insights from computer simulations into the structural and dynamic properties of amorphous polymers. Structurally, chain connectivity introduces a range of length scales from that of the chemical bond to the radius of gyration of the polymer chain covering 2-4 orders of magnitude. Dynamically, this range of length scales translates into an even larger range of time scales observable in relaxation processes in amorphous polymers ranging from about 10⁻¹³ to 10⁻³ s or even to 10³ s when glass dynamics is concerned. There is currently no single simulation technique that is able to describe all these length and time scales efficiently. On large length and time scales basic topology and entropy become the governing properties and this fact can be exploited using computer simulations of coarse-grained polymer models to study universal aspects of the structure and dynamics of amorphous polymers. On the largest length and time scales chain connectivity is the dominating factor leading to the strong increase in longest relaxation times described within the reptation theory of polymer melt dynamics. Recently, many of the universal aspects of this behaviour have been further elucidated by computer simulations of coarse-grained polymer models. On short length scales the detailed chemistry and energetics of the polymer are important, and one has to be able to capture them correctly using chemically realistic modelling of specific polymers, even when the aim is to extract generic physical behaviour exhibited by the specific chemistry. Detailed studies of chemically realistic models highlight the central importance of torsional dynamics

  13. The use of computers in education worldwide : results from a comparative survey in 18 countries

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.

    1991-01-01

    In 1989, the International Association for the Evaluation of Educational Achievement (IEA) Computers in Education study collected data on computer use in elementary, and lower- and upper-secondary education in 22 countries. Although all data sets from the participating countries had not been

  14. Thermochemistry of 6-propyl-2-thiouracil: An experimental and computational study

    Energy Technology Data Exchange (ETDEWEB)

    Szterner, Piotr; Galvão, Tiago L.P.; Amaral, Luísa M.P.F.; Ribeiro da Silva, Maria D.M.C., E-mail: mdsilva@fc.up.pt; Ribeiro da Silva, Manuel A.V.

    2014-07-01

    Highlights: • Thermochemistry of 6-propyl-2-thiouracil – experimental and computational study. • Vapor pressure study of 6-propyl-2-thiouracil by the Knudsen effusion technique. • Enthalpies of formation of 6-propyl-2-thiouracil by rotating bomb combustion calorimetry. • Accurate computational calculations (G3 and G4 composite methods) were performed. - Abstract: The standard (p° = 0.1 MPa) molar enthalpy of formation of 6-propyl-2-thiouracil was derived from its standard molar energy of combustion, in oxygen, to yield CO₂ (g), N₂ (g) and H₂SO₄·115H₂O (l), at T = 298.15 K, measured by rotating bomb combustion calorimetry. The vapor pressures as a function of temperature were measured by the Knudsen effusion technique and the standard molar enthalpy of sublimation, Δ_cr^g H_m°, at T = 298.15 K, was derived by the Clausius–Clapeyron equation. These two thermodynamic parameters yielded the standard molar enthalpy of formation, in the gaseous phase, at T = 298.15 K: −(142.5 ± 1.9) kJ·mol⁻¹. This value was compared with estimates obtained from very accurate computational calculations using the G3 and G4 composite methods.
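
    The two experimental quantities are combined in the usual thermochemical cycle. A hedged sketch of the working equations (the integrated Clausius-Clapeyron form below assumes the sublimation enthalpy is constant over the narrow temperature range of the effusion measurements):

```latex
% Sublimation enthalpy from the slope of ln p versus 1/T, and the gas-phase
% enthalpy of formation from the condensed-phase value plus the sublimation term.
\[
  \ln p \;=\; A \;-\; \frac{\Delta_{\mathrm{cr}}^{\mathrm{g}} H_{m}^{\circ}}{R\,T},
  \qquad
  \Delta_{\mathrm{f}} H_{m}^{\circ}(\mathrm{g}) \;=\;
  \Delta_{\mathrm{f}} H_{m}^{\circ}(\mathrm{cr}) \;+\;
  \Delta_{\mathrm{cr}}^{\mathrm{g}} H_{m}^{\circ}.
\]
```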

  15. Comparative merits of radioimmunoscintigraphy and computer tomography in diagnosis and follow-up of primary ovarial cancer

    International Nuclear Information System (INIS)

    Barzen, G.; Cordes, M.; Langer, M.; Felix, R.; Friedman, W.; Mayr, A.C.

    1990-01-01

    In this prospective study the diagnostic merit of radioimmunoscintigraphy (RIS) was compared with computed tomography (CT) and operation in the primary diagnostic procedure and follow-up of women with suspected ovarial cancer. In primary diagnosis, sensitivity, specificity, and diagnostic accuracy were 100%, 60% and 90% for RIS. In follow-up, sensitivity for local recurrence was slightly higher for CT than for RIS. Peritoneal carcinosis in the pelvis and lower abdominal region was detected better with RIS, but in the upper abdominal region peritoneal carcinosis was detected better with CT. If no differentiation between benign and malignant lesions is possible with CT, differentiation will in many cases be possible with RIS. (orig.) [de

  16. Verification study of the FORE-2M nuclear/thermal-hydraulilc analysis computer code

    International Nuclear Information System (INIS)

    Coffield, R.D.; Tang, Y.S.; Markley, R.A.

    1982-01-01

    The verification of the LMFBR core transient performance code, FORE-2M, was performed in two steps. Different components of the computation (individual models) were verified by comparing with analytical solutions and with results obtained from other conventionally accepted computer codes (e.g., TRUMP, LIFE, etc.). For verification of the integral computation method of the code, experimental data in TREAT, SEFOR and natural circulation experiments in EBR-II were compared with the code calculations. Good agreement was obtained for both of these steps. Confirmation of the code verification for undercooling transients is provided by comparisons with the recent FFTF natural circulation experiments. (orig.)

  17. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    The Reverse Threshold Model Theory (RTMT) model was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the “One Laptop Per Child” computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM across eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  18. Detection of Cement Leakage After Vertebroplasty with a Non-Flat-Panel Angio Unit Compared to Multidetector Computed Tomography - An Ex Vivo Study

    International Nuclear Information System (INIS)

    Baumann, Clemens; Fuchs, Heiko; Westphalen, Kerstin; Hierholzer, Johannes

    2008-01-01

    The purpose of this study was to investigate the detection of cement leakages after vertebroplasty using angiographic computed tomography (ACT) in a non-flat-panel angio unit compared to multidetector computed tomography (MDCT). Vertebroplasty was performed in 19 of 33 cadaver vertebrae (23 thoracic and 10 lumbar segments). In the angio suite, ACT (190°; 1.5° per image) was performed to obtain volumetric data. Another volumetric data set of the specimen was obtained by MDCT using a standard algorithm. Nine multiplanar reconstructions in standardized axial, coronal, and sagittal planes of every vertebra were generated from both data sets. Images were evaluated on the basis of a nominal scale with 18 criteria, comprising osseous properties (e.g., integrity of the end plate) and cement distribution (e.g., presence of intraspinal cement). MDCT images were regarded as the gold standard and analyzed by two readers in a consensus mode. Rotational acquisitions were analyzed by six blinded readers. Results were correlated with the gold standard using Cohen's κ-coefficient analysis. Furthermore, interobserver variability was calculated. Correlation with the gold standard ranged from no correlation (osseous margins of the neuroforamen, κ = 0.008) to intermediate (trace of vertebroplasty cannula; κ = 0.615) for criteria referring to osseous morphology. However, there was an excellent correlation for those criteria referring to cement distribution, with κ values ranging from 0.948 (paravertebral cement distribution) to 0.972 (intraspinal cement distribution). With a minimum of κ = 0.768 ('good correlation') and a maximum of κ = 0.91 ('excellent'), interobserver variability was low. In conclusion, ACT in an angio suite without a flat-panel detector depicts cement leakage after vertebroplasty as well as MDCT does. However, the method does not provide sufficient depiction of osseous morphology.
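
    The agreement statistic used throughout this comparison is Cohen's kappa. A minimal sketch of how it can be computed from a two-reader confusion matrix is shown below; it is illustrative only, and the example counts are hypothetical:

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix of two readers' ratings."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_observed = np.trace(confusion) / n                                  # exact agreement
    p_expected = (confusion.sum(axis=0) @ confusion.sum(axis=1)) / n**2   # chance agreement
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical example: two readers rating 30 criteria as present/absent.
print(round(cohens_kappa([[12, 3], [2, 13]]), 3))  # about 0.667 for these counts
```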

  19. Comparative evaluation of ultrasonography, computed tomography and magnetic resonance imaging in the follow-up of Budd-Chiari syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Legmann, P.; Levesque, M.; De Broucker, F.; Hay, J.M.; Maillard, J.N.

    1988-01-01

    A comparative evaluation of ultrasonography, computed tomography and MRI in 8 patients operated for Budd-Chiari syndrome is reported. The results obtained, evaluated separately for each technique and then compared with each other and with the data of superior coelio-mesenteric angiography and inferior cavography, show that the MRI data is very clearly superior to the data obtained by ultrasonography and computed tomography. MRI allows simultaneous assessment of the hepatic parenchyma, evaluation of portal hypertension and the porto-caval anastomosis, which are all essential elements in the follow-up of Budd-Chiari syndrome. However, in the light of the literature, the authors stress that ultrasonography associated with pulsed Doppler also ensures satisfactory vascular and parenchymal assessment of this disease in the majority of cases.

  20. Comparative evaluation of ultrasonography, computed tomography and magnetic resonance imaging in the follow-up of Budd-Chiari syndrome

    International Nuclear Information System (INIS)

    Legmann, P.; Levesque, M.; Broucker, F. de; Hay, J.M.; Maillard, J.N.

    1989-01-01

    The authors report a comparative evaluation of ultrasonography, computed tomography and MRI in 8 patients operated on for Budd-Chiari syndrome. The results obtained, evaluated separately for each technique and then compared with each other and with the data of superior coelio-mesenteric angiography and inferior cavography, showed that the MRI data are very clearly superior to the data obtained by ultrasonography and computed tomography. MRI allows simultaneous assessment of the hepatic parenchyma, and evaluation of portal hypertension and the porto-caval anastomosis, which are all essential elements in the follow-up of Budd-Chiari syndrome. However, in the light of the literature, the authors stress that ultrasonography associated with pulsed Doppler also ensures satisfactory vascular and parenchymal assessment of this disease in the majority of cases.

  1. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO (group III) - single file reciprocating motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full-sequence rotary file systems.

  2. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background: The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. Aim: The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods: Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO (group III) - single file reciprocating motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. Results: It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion: It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full-sequence rotary file systems.
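
    As a hedged illustration of the statistics named above (one-way ANOVA followed by Tukey's honestly significant difference test), the sketch below runs both tests on made-up canal-transportation values; the group data are not from this study.

        # Minimal sketch: one-way ANOVA followed by Tukey's HSD on hypothetical
        # canal-transportation values (mm) for three instrument groups.
        import numpy as np
        from scipy.stats import f_oneway
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        pt = np.array([0.21, 0.18, 0.17, 0.22, 0.19])   # hypothetical group I (ProTaper)
        os_ = np.array([0.11, 0.13, 0.12, 0.10, 0.14])  # hypothetical group II (OneShape)
        wo = np.array([0.12, 0.14, 0.13, 0.11, 0.15])   # hypothetical group III (WaveOne)

        f_stat, p_value = f_oneway(pt, os_, wo)
        print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

        values = np.concatenate([pt, os_, wo])
        groups = ["PT"] * len(pt) + ["OS"] * len(os_) + ["WO"] * len(wo)
        print(pairwise_tukeyhsd(values, groups, alpha=0.05))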

  3. Comparative analysis of quality control tests on computed tomography in accordance with national and international laws

    International Nuclear Information System (INIS)

    Ramos, Fernando S.; Vasconcelos, Rebeca S.; Goncalves, Marcel S.; Oliveira, Marcus V.L. de

    2014-01-01

    The objective of this study is to perform a comparative analysis between Brazilian legislation and international protocols with respect to quality control tests for computed tomography. We used seven references, published from 1998 to 2012: the Protocolo Brasileiro - Portaria 453/98 SVS/MS and the Guia de Radiodiagnostico Medico da ANVISA; Quality Assurance Programme for Computed Tomography: Diagnostic and Therapy Applications of the IAEA; the European Protocol - European Guidelines on Quality Criteria for Computed Tomography, EUR No. 16262 EN; Radiation Protection No. 162 - Criteria for Acceptability of Medical Radiology, Nuclear Medicine and Radiotherapy of the European Commission; the Protocolos de Control de Calidad en Radiodiagnostico IAEA/ARCAL XLIX; and the Protocolo Espanol de Control de Calidad en Radiodiagnostico. The comparative analysis of these documents was based on the tolerances/limits, frequencies and objectives of the recommended tests. Eighteen tests were found in the Brazilian legislation. The tests were grouped according to their nature (dosimetric/exposure tests, geometric tests and image quality tests). Among the evaluated protocols, divergence was identified between the tests contained in the documents and the assessment criteria set out in this work. Moreover, certain documents do not specify tolerances, well-defined methodologies or even testing frequencies. We conclude that the current legislation in Brazil differs in certain respects from the international protocols analyzed, although it includes a large number of quality control tests. However, it is necessary that the Brazilian legislation take technological advances into account over time.

  4. Computer usage and national energy consumption: Results from a field-metering study

    Energy Technology Data Exchange (ETDEWEB)

    Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery; Pratt, Stacy; Willem, Henry; Claybaugh, Erin; Beraki, Bereket; Nagaraju, Mythri; Price, Sarah; Young, Scott [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division]

    2014-12-01

    The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean of 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power
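
    The unit annual energy consumption (AEC) figures above combine metered usage hours with per-mode power draw. The sketch below shows that arithmetic for a hypothetical desktop; the wattages are assumptions for illustration, not values taken from the report.

        # Minimal sketch: annual energy consumption from daily On-hours and
        # per-mode power draw. The 60 W / 3 W figures are assumed, not measured.
        HOURS_PER_YEAR = 8760

        def annual_energy_kwh(on_hours_per_day, on_watts, low_power_watts):
            """kWh/yr assuming the unit is either On or in a low-power mode."""
            on_hours = on_hours_per_day * 365
            low_hours = HOURS_PER_YEAR - on_hours
            return (on_hours * on_watts + low_hours * low_power_watts) / 1000.0

        # Desktop used 7.3 h/day (the mean reported above), assumed 60 W On, 3 W otherwise
        print(round(annual_energy_kwh(7.3, 60, 3), 1), "kWh/yr")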

  5. Low-Dose and Standard-Dose Unenhanced Helical Computed Tomography for the Assessment of Acute Renal Colic: Prospective Comparative Study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bong Soo; Hwang, Im Kyung; Choi, Yo Won; Namkung, Sook; Kim, Heung Cheol; Hwang, Woo Cheol; Choi, Kuk Myung; Park, Ji Kang; Han, Tae Il; Kang, Weechang [Cheju National Univ. College of Medicine, Jeju (Korea, Republic of). Dept. of Diagnostic Radiology

    2005-11-01

    Purpose: To compare the efficacy of low-dose and standard-dose computed tomography (CT) for the diagnosis of ureteral stones. Material and Methods: Unenhanced helical CT was performed with both a standard dose (260 mAs, pitch 1.5) and a low dose (50 mAs, pitch 1.5) in 121 patients suspected of having acute renal colic. The two studies were prospectively and independently interpreted for the presence and location of ureteral stones, abnormalities unrelated to stone disease, and identification of secondary signs, i.e., hydronephrosis, perinephric stranding, and the tissue rim sign. The standard-dose CT images were interpreted by one reviewer and the low-dose CT images independently by two reviewers unaware of the standard-dose CT findings. The findings of the standard and low-dose CT scans were compared with the exact McNemar test. Interobserver agreements were assessed with kappa analysis. The effective radiation doses resulting from the two different protocols were calculated by means of commercially available software incorporating a Monte Carlo phantom model. Results: The sensitivity, specificity, and accuracy of standard-dose CT for detecting ureteral stones were 99%, 93%, and 98%, respectively, whereas for the two reviewers the sensitivity of low-dose CT was 93% and 95%, specificity 86%, and accuracy 92% and 94%. We found no significant differences between standard-dose and low-dose CT in the sensitivity and specificity for diagnosing ureteral stones (P > 0.05 for both). However, the sensitivity of low-dose CT for detection of 19 stones less than or equal to 2 mm in diameter was 79% and 68%, respectively, for the two reviewers. Low-dose CT was comparable to standard-dose CT in visualizing hydronephrosis and the tissue rim sign. Perinephric stranding was far less clear on low-dose CT. Low-dose CT had the same diagnostic performance as standard-dose CT in diagnosing alternative diseases. Interobserver agreement between the two low-dose CT reviewers in the diagnosis of
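
    The exact McNemar test mentioned above compares paired detection outcomes from the two protocols. A minimal sketch follows, using a hypothetical 2x2 table of stone-level results rather than the study's data.

        # Minimal sketch: exact McNemar test on paired detection outcomes
        # (standard-dose vs. low-dose CT). The 2x2 counts are hypothetical.
        from statsmodels.stats.contingency_tables import mcnemar

        #                 low-dose +  low-dose -
        table = [[100, 4],    # standard-dose +
                 [2, 15]]     # standard-dose -

        result = mcnemar(table, exact=True)
        print(f"statistic = {result.statistic}, p-value = {result.pvalue:.3f}")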

  6. Shaping ability of reciprocating motion of WaveOne and HyFlex in moderate to severe curved canals: A comparative study with cone beam computed tomography

    Science.gov (United States)

    Simpsy, Gurram Samuel; Sajjan, Girija S.; Mudunuri, Padmaja; Chittem, Jyothi; Prasanthi, Nalam N. V. D.; Balaga, Pankaj

    2016-01-01

    Introduction: The M-Wire and reciprocating motion of WaveOne and the controlled memory (CM) wire of HyFlex are recent innovations based on thermal treatment. Therefore, a study was planned to evaluate the shaping ability of the reciprocating motion of WaveOne and of HyFlex using cone beam computed tomography (CBCT). Methodology: Forty-five freshly extracted mandibular teeth were selected and stored in saline until use. All teeth were scanned pre- and post-operatively using CBCT (Kodak 9000). All teeth were accessed and divided into three groups. (1) Group 1 (control, n = 15): instrumented with ProTaper. (2) Group 2 (n = 15): instrumented with the primary file (8%/25) of WaveOne. (3) Group 3 (n = 15): instrumented with (4%/25) HyFlex CM. Sections at 1, 3, and 5 mm were obtained from the pre- and post-operative scans. Measurements were made using CS3D software and Adobe Photoshop software. Apical transportation and degree of straightening were measured and statistically analyzed. Results: HyFlex showed less apical transportation than the other groups at 1 and 3 mm. WaveOne showed a lower degree of straightening than the other groups. Conclusion: The present study concluded that all systems can be employed in routine endodontics, whereas HyFlex and WaveOne could also be employed in severely curved canals. PMID:27994323

  7. A computational study of the topology of vortex breakdown

    Science.gov (United States)

    Spall, Robert E.; Gatski, Thomas B.

    1991-01-01

    A fully three-dimensional numerical simulation of vortex breakdown using the unsteady, incompressible Navier-Stokes equations has been performed. Solutions to four distinct types of breakdown are identified and compared with experimental results. The computed solutions include weak helical, double helix, spiral, and bubble-type breakdowns. The topological structure of the various breakdowns as well as their interrelationship are studied. The data reveal that the asymmetric modes of breakdown may be subject to additional breakdowns as the vortex core evolves in the streamwise direction. The solutions also show that the freestream axial velocity distribution has a significant effect on the position and type of vortex breakdown.

  8. Computer-assisted surgery for screw insertion into the distal sesamoid bone in horses: an in vitro study.

    Science.gov (United States)

    Gygax, Diego; Lischer, Christoph; Auer, Joerg A

    2006-10-01

    To compare the precision of computer-assisted surgery with a conventional technique (CV) using a special guiding device for screw insertion into the distal sesamoid bone in horses. In vitro experimental study. Cadaveric forelimb specimens. Insertion of a 3.5 mm cortex screw in lag fashion along the longitudinal axis of intact (non-fractured) distal sesamoid bones was evaluated in 2 groups (8 limbs each): CV and computer-assisted surgery (CAS). For CV, the screw was inserted using a special guiding device and fluoroscopy, whereas for CAS, the screw was inserted using computer-assisted navigation. The accuracy of screw placement was verified by radiography, computed tomography, and specimen dissection. Surgical precision was better in CAS compared with CV. CAS improves the accuracy of lateromedial screw insertion, in lag fashion, into the distal sesamoid bone. The CAS technique should be considered for improved accuracy of screw insertion in fractures of the distal sesamoid bone.

  9. Writing Apprehension, Computer Anxiety and Telecomputing: A Pilot Study.

    Science.gov (United States)

    Harris, Judith; Grandgenett, Neal

    1992-01-01

    A study measured graduate students' writing apprehension and computer anxiety levels before and after using electronic mail, computer conferencing, and remote database searching facilities during an educational technology course. Results indicated that postcourse computer anxiety levels were significantly related to usage statistics. Precourse writing…

  10. Prospective comparative study of spiral computer tomography and magnetic resonance imaging for detection of hepatocellular carcinoma

    NARCIS (Netherlands)

    Stoker, J.; Romijn, M. G.; de Man, R. A.; Brouwer, J. T.; Weverling, G. J.; van Muiswinkel, J. M.; Zondervan, P. E.; Laméris, J. S.; Ijzermans, J. N. M.

    2002-01-01

    Background: Hepatocellular carcinoma (HCC) is often detected at a relatively late stage when tumour size prohibits curative surgery. Screening to detect HCC at an early stage is performed for patients at risk. Aim: The aim of this study was to compare prospectively the diagnostic accuracy and

  11. Ground-glass opacity: High-resolution computed tomography and 64-multi-slice computed tomography findings comparison

    International Nuclear Information System (INIS)

    Sergiacomi, Gianluigi; Ciccio, Carmelo; Boi, Luca; Velari, Luca; Crusco, Sonia; Orlacchio, Antonio; Simonetti, Giovanni

    2010-01-01

    Objective: To comparatively evaluate ground-glass opacity using the conventional high-resolution computed tomography technique and volumetric computed tomography with a 64-row multi-slice scanner, verifying the advantages of the volumetric acquisition and post-processing techniques allowed by the 64-row CT scanner. Methods: Thirty-four patients, in whom a ground-glass opacity pattern had been identified on previous high-resolution computed tomography during a clinical-radiological follow-up for their lung disease, were studied by means of 64-row multi-slice computed tomography. A comparative evaluation of image quality was performed for both CT modalities. Results: Good inter-observer agreement (κ value 0.78-0.90) was found in the detection of ground-glass opacity with both the high-resolution computed tomography technique and the volumetric computed tomography acquisition, with a moderate increase in intra-observer agreement (κ value 0.46) using volumetric computed tomography rather than high-resolution computed tomography. Conclusions: In our experience, volumetric computed tomography with a 64-row scanner shows good accuracy in the detection of ground-glass opacity, providing better spatial and temporal resolution and more advanced post-processing techniques than high-resolution computed tomography.

  12. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  13. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664

  14. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Directory of Open Access Journals (Sweden)

    Carolina Milesi

    2017-04-01

    Full Text Available While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.
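
    As a hedged sketch of the kind of logistic regression analysis described above, the snippet below models persistence (taking another CS course) on gender and an engagement score using synthetic data; none of the variables or coefficients come from the study.

        # Minimal sketch: logistic regression of persistence on gender and an
        # ESM-style engagement score. The data frame is synthetic.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(42)
        n = 165
        df = pd.DataFrame({
            "female": rng.integers(0, 2, n),
            "engagement": rng.normal(0.0, 1.0, n),   # standardized challenge/skill score
        })
        logit_p = -0.2 + 0.8 * df["engagement"] + 0.3 * df["female"] * df["engagement"]
        df["persisted"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

        model = smf.logit("persisted ~ female * engagement", data=df).fit(disp=False)
        print(model.summary().tables[1])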

  15. Detecting SYN flood attacks via statistical monitoring charts: A comparative study

    KAUST Repository

    Bouyeddou, Benamar

    2017-12-14

    Accurate detection of cyber-attacks plays a central role in safeguarding computer networks and information systems. This paper addresses the problem of detecting SYN flood attacks, which are the most popular Denial of Service (DoS) attacks. Here, we compare the detection capacity of three commonly used monitoring charts, namely a Shewhart chart, a cumulative sum (CUSUM) control chart, and an exponentially weighted moving average (EWMA) chart, in detecting SYN flood attacks. The comparison study is conducted using the publicly available benchmark 1999 DARPA Intrusion Detection Evaluation datasets.
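
    A minimal sketch of one of the monitoring charts compared above, the EWMA chart, applied to a per-interval SYN-packet count is given below; the traffic series, smoothing factor and control-limit width are illustrative assumptions, not the settings or data used in the paper.

        # Minimal sketch: EWMA monitoring chart on a per-second SYN-packet count.
        # Baseline statistics come from an assumed attack-free prefix of the series.
        import numpy as np

        def ewma_alarms(counts, lam=0.2, L=3.0, baseline_len=200):
            """Indices where the EWMA statistic exceeds its upper control limit."""
            mu0 = np.mean(counts[:baseline_len])
            sigma0 = np.std(counts[:baseline_len])
            ucl = mu0 + L * sigma0 * np.sqrt(lam / (2.0 - lam))   # asymptotic limit
            z, alarms = mu0, []
            for t, x in enumerate(counts):
                z = lam * x + (1.0 - lam) * z
                if z > ucl:
                    alarms.append(t)
            return alarms

        rng = np.random.default_rng(0)
        normal_traffic = rng.poisson(50, 300)   # synthetic benign SYN counts
        flood_traffic = rng.poisson(120, 50)    # synthetic SYN flood burst
        series = np.concatenate([normal_traffic, flood_traffic])
        print(ewma_alarms(series)[:5])          # first alarms appear in the flood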

  16. Magnetic-fusion energy and computers

    International Nuclear Information System (INIS)

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups

  17. Magnetic fusion energy and computers

    International Nuclear Information System (INIS)

    Killeen, J.

    1982-01-01

    The application of computers to magnetic fusion energy research is essential. In the last several years the use of computers in the numerical modeling of fusion systems has increased substantially. There are several categories of computer models used to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are also in use. To meet the needs of the fusion program, the National Magnetic Fusion Energy Computer Center has been established at the Lawrence Livermore National Laboratory. A large central computing facility is linked to smaller computer centers at each of the major MFE laboratories by a communication network. In addition to providing cost effective computing services, the NMFECC environment stimulates collaboration and the sharing of computer codes among the various fusion research groups

  18. A computational study on outliers in world music

    Science.gov (United States)

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as ‘outliers’. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the ‘uniqueness’ of the music of each country. PMID:29253027

  19. A computational study on outliers in world music.

    Science.gov (United States)

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as 'outliers'. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the 'uniqueness' of the music of each country.
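
    The outlier detection step described above can be sketched generically as a distance-based test on per-recording feature vectors; the snippet below uses random placeholder features and a Mahalanobis-distance cutoff, not the descriptors or method details of the actual corpus analysis.

        # Minimal sketch: distance-based outlier detection on per-recording feature
        # vectors. The features are random placeholders, not audio descriptors.
        import numpy as np
        from scipy.stats import chi2

        def mahalanobis_outliers(features, quantile=0.999):
            """Rows whose squared Mahalanobis distance from the corpus mean is extreme."""
            n, p = features.shape
            mean = features.mean(axis=0)
            cov_inv = np.linalg.pinv(np.cov(features, rowvar=False))
            diff = features - mean
            d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
            return np.where(d2 > chi2.ppf(quantile, df=p))[0]

        rng = np.random.default_rng(1)
        corpus = rng.normal(size=(500, 8))   # 500 recordings x 8 features (placeholders)
        corpus[7] += 6.0                     # inject one artificial outlier
        print(mahalanobis_outliers(corpus))  # expected to flag index 7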

  20. Integrating user studies into computer graphics-related courses.

    Science.gov (United States)

    Santos, B S; Dias, P; Silva, S; Ferreira, C; Madeira, J

    2011-01-01

    This paper discusses integrating user studies into computer graphics-related courses. Computer graphics and visualization are essentially about producing images for a target audience, be it the millions watching a new CG-animated movie or the small group of researchers trying to gain insight into the large amount of numerical data resulting from a scientific experiment. To ascertain the final images' effectiveness for their intended audience or the designed visualizations' accuracy and expressiveness, formal user studies are often essential. In human-computer interaction (HCI), such user studies play a similar fundamental role in evaluating the usability and applicability of interaction methods and metaphors for the various devices and software systems we use.

  1. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    International Nuclear Information System (INIS)

    Jacobs, Colin; Prokop, Mathias; Rikxoort, Eva M. van; Ginneken, Bram van; Murphy, Keelin; Schaefer-Prokop, Cornelia M.

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database contains 888 thoracic CT scans with a section thickness of 2.5 mm or lower. We report performance of two commercial and one academic CAD system. The influence of presence of contrast, section thickness, and reconstruction kernel on CAD performance was assessed. Four radiologists independently analyzed the false positive CAD marks of the best CAD system. The updated commercial CAD system showed the best performance with a sensitivity of 82 % at an average of 3.1 false positive detections per scan. Forty-five false positive CAD marks were scored as nodules by all four radiologists in our study. On the largest publicly available reference database for lung nodule detection in chest CT, the updated commercial CAD system locates the vast majority of pulmonary nodules at a low false positive rate. Potential for CAD is substantiated by the fact that it identifies pulmonary nodules that were not marked during the extensive four-fold LIDC annotation process. (orig.)

  2. Tingling/numbness in the hands of computer users: neurophysiological findings from the NUDATA study

    DEFF Research Database (Denmark)

    Overgaard, E.; Brandt, L. P.; Ellemann, K.

    2004-01-01

    OBJECTIVES: To investigate whether tingling/numbness of the hands and fingers among computer users is associated with elevated vibration threshold as a sign of early nerve compression. METHODS: Within the Danish NUDATA study, vibratory sensory testing with monitoring of the digital vibration...... once a week or daily within the last 3 months. Participants with more than slight muscular pain or disorders of the neck and upper extremities, excessive alcohol consumption, previous injuries of the upper extremities, or concurrent medical diseases were excluded. The two groups had a similar amount...... of work with mouse, keyboard, and computer. RESULTS: Seven of the 20 cases (35%) had elevated vibration thresholds, compared with 3 of the 20 controls (15%); this difference was not statistically significant (chi2=2.13, P=0.14). Compared with controls, cases had increased perception threshold for all...

  3. Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain

    Science.gov (United States)

    Arbib, Michael A.

    2016-03-01

    We make the case for developing a Computational Comparative Neuroprimatology to inform the analysis of the function and evolution of the human brain. First, we update the mirror system hypothesis on the evolution of the language-ready brain by (i) modeling action and action recognition and opportunistic scheduling of macaque brains to hypothesize the nature of the last common ancestor of macaque and human (LCA-m); and then we (ii) introduce dynamic brain modeling to show how apes could acquire gesture through ontogenetic ritualization, hypothesizing the nature of evolution from LCA-m to the last common ancestor of chimpanzee and human (LCA-c). We then (iii) hypothesize the role of imitation, pantomime, protosign and protospeech in biological and cultural evolution from LCA-c to Homo sapiens with a language-ready brain. Second, we suggest how cultural evolution in Homo sapiens led from protolanguages to full languages with grammar and compositional semantics. Third, we assess the similarities and differences between the dorsal and ventral streams in audition and vision as the basis for presenting and comparing two models of language processing in the human brain: models of (i) the auditory dorsal and ventral streams in sentence comprehension and (ii) the visual dorsal and ventral streams in defining 'what language is about' in both production and perception of utterances related to visual scenes provide the basis for (iii) a first step towards a synthesis and a look at challenges for further research.

  4. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    Science.gov (United States)

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  5. Comparing two iteration algorithms of Broyden electron density mixing through an atomic electronic structure computation

    International Nuclear Information System (INIS)

    Zhang Man-Hong

    2016-01-01

    By performing the electronic structure computation of a Si atom, we compare two iteration algorithms of Broyden electron density mixing in the literature. One was proposed by Johnson and implemented in the well-known VASP code. The other was given by Eyert. We solve the Kohn-Sham equation by using a conventional outward/inward integration of the differential equation and then connect the two parts of the solution at the classical turning points, which is different from the matrix eigenvalue solution method used in the VASP code. Compared to Johnson's algorithm, the one proposed by Eyert requires fewer iterations in total. (paper)
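
    Full Broyden mixing builds an approximate inverse Jacobian from the iteration history (the difference between Johnson's and Eyert's formulations lies in how that history is incorporated). The sketch below shows only the simple linear-mixing special case that such schemes accelerate, with a toy fixed-point map standing in for the Kohn-Sham density update; it is an illustration of the self-consistency loop, not either author's algorithm.

        # Minimal sketch: self-consistent iteration with simple linear density mixing.
        # Broyden-type schemes replace the scalar mixing factor by a history-built
        # approximate inverse Jacobian; the map F below is a toy stand-in.
        import numpy as np

        def toy_density_map(rho):
            """Placeholder for rho_out = F(rho_in); not a real Kohn-Sham solve."""
            return np.tanh(rho) + 0.1

        def scf_linear_mixing(rho0, alpha=0.5, tol=1e-8, max_iter=500):
            rho = rho0
            for it in range(max_iter):
                residual = toy_density_map(rho) - rho
                if np.linalg.norm(residual) < tol:
                    return rho, it
                rho = rho + alpha * residual   # Broyden mixing generalizes this step
            return rho, max_iter

        rho, iterations = scf_linear_mixing(np.zeros(5))
        print(iterations, rho[0])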

  6. Image based 3D city modeling : Comparative study

    Directory of Open Access Journals (Sweden)

    S. P. Singh

    2014-06-01

    Full Text Available A 3D city model is a digital representation of the Earth's surface and related objects such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Generally, four main image-based approaches are used for virtual 3D city model generation: sketch-based modeling, procedural-grammar-based modeling, close-range-photogrammetry-based modeling, and approaches based mainly on computer vision techniques. SketchUp, CityEngine, Photomodeler and Agisoft Photoscan are the main software packages representing these approaches, respectively. These packages have different approaches and methods suitable for image-based 3D city modeling. A literature study shows that, to date, no comprehensive comparative study of this kind is available for creating a complete 3D city model from images. This paper gives a comparative assessment of these four image-based 3D modeling approaches. The comparison is mainly based on data acquisition methods, data processing techniques and output 3D model products. For this research work, the study area is the campus of the civil engineering department, Indian Institute of Technology, Roorkee (India). This 3D campus acts as a prototype for a city. The study also explains the various governing parameters, factors and work experiences, and gives a brief introduction to the strengths and weaknesses of these four image-based techniques. Some practical comments are also given on what can and cannot be done with each software package. Finally, the study concludes that each software package has its own advantages and limitations; the choice of software depends on the user requirements of the 3D project. For normal visualization projects, SketchUp is a good option. For 3D documentation records, Photomodeler gives good

  7. A comparative study of computed tomographic techniques for the detection of emphysema in middle-aged and older patient populations

    International Nuclear Information System (INIS)

    Tanino, Michie; Nishimura, Masaharu; Betsuyaku, Tomoko; Takeyabu, Kimihiro; Tanino, Yoshinori; Kawakami, Yoshikazu; Miyamoto, Kenji

    2000-01-01

    Helical-scan computed tomography (CT) is now widely utilized as a mass screening procedure for lung cancer. By adding 3 slices of high-resolution CT (HRCT) to the standard screening procedure, we were able to compare the efficacy of helical-scan CT and HRCT in detecting pulmonary emphysema. Additionally, the prevalence of emphysema detected by HRCT was examined as a function of patient age and smoking history. The subjects (106 men and 28 women) were all community-based middle-aged and older volunteers who participated in a mass lung cancer screening program. Based on visual assessments of the CT films, emphysema was detected in 29 subjects (22%) by HRCT, but in only 4 (3%) by helical-scan CT. Although the prevalence of emphysema was higher among subjects with a higher smoking index, no correlations with age were observed. We concluded that the efficacy of helical scan CT in detecting pulmonary emphysema can be significantly improved with the inclusion of 3 slices of HRCT, and confirmed that cigarette smoking is linked to the development of pulmonary emphysema. (author)

  8. EFQPSK Versus CERN: A Comparative Study

    Science.gov (United States)

    Borah, Deva K.; Horan, Stephen

    2001-01-01

    This report presents a comparative study on Enhanced Feher's Quadrature Phase Shift Keying (EFQPSK) and Constrained Envelope Root Nyquist (CERN) techniques. These two techniques have been developed in recent times to provide high spectral and power efficiencies in a nonlinear amplifier environment. The purpose of this study is to gain insights into these techniques and to help system planners and designers with an appropriate set of guidelines for using these techniques. The comparative study presented in this report relies on effective simulation models and procedures. Therefore, a significant part of this report is devoted to understanding the mathematical and simulation models of the techniques and their set-up procedures. In particular, mathematical models of EFQPSK and CERN, effects of the sampling rate in discrete time signal representation, and modeling of nonlinear amplifiers and predistorters have been considered in detail. The results of this study show that both EFQPSK and CERN signals provide spectrally efficient communications compared to filtered conventional linear modulation techniques when a nonlinear power amplifier is used. However, there are important differences. The spectral efficiency of CERN signals, with a small amount of input backoff, is significantly better than that of EFQPSK signals if the nonlinear amplifier is an ideal clipper. However, to achieve such spectral efficiencies with a practical nonlinear amplifier, CERN processing requires a predistorter which effectively translates the amplifier's characteristics close to those of an ideal clipper. Thus, the spectral performance of CERN signals strongly depends on the predistorter. EFQPSK signals, on the other hand, do not need such predistorters since their spectra are almost unaffected by the nonlinear amplifier. This report also discusses several receiver structures for EFQPSK signals. It is observed that optimal receiver structures can be realized for both coded and uncoded EFQPSK

  9. Computing camera heading: A study

    Science.gov (United States)

    Zhang, John Jiaxiang

    2000-08-01

    An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operation, video special effects, multimedia, and lately even internet commerce. From image sequences of a real world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard. This is because rotations and translations can have similar effects on the images, and are thus hard to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, a theorem is obtained that locates all possible singularities of the residual function for the underlying optimization problem. This allows all computational trouble spots to be identified beforehand and reliable, accurate optimization methods to be designed. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.
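
    The key geometric fact behind the visual-angle approach is that the angle between the projection rays of two image points is unchanged by a pure camera rotation, so its change over time reflects translation only. The sketch below checks this numerically for made-up image points and a hypothetical 10-degree pan; it is not the paper's estimation algorithm.

        # Minimal sketch: the visual angle between two projection rays is invariant
        # under pure camera rotation. Image points and the rotation are made up.
        import numpy as np

        def ray(u, v, focal=1.0):
            """Unit projection ray for an image point (u, v) in normalized coordinates."""
            r = np.array([u, v, focal])
            return r / np.linalg.norm(r)

        def visual_angle(p, q):
            return np.arccos(np.clip(ray(*p) @ ray(*q), -1.0, 1.0))

        def rotate_y(deg):                      # pure camera pan
            a = np.radians(deg)
            return np.array([[np.cos(a), 0.0, np.sin(a)],
                             [0.0,       1.0, 0.0      ],
                             [-np.sin(a), 0.0, np.cos(a)]])

        def reproject(d):                       # image point of a ray direction
            return (d[0] / d[2], d[1] / d[2])

        p, q = (0.10, 0.05), (-0.20, 0.15)
        R = rotate_y(10.0)
        p2, q2 = reproject(R @ ray(*p)), reproject(R @ ray(*q))
        print(np.degrees(visual_angle(p, q)), np.degrees(visual_angle(p2, q2)))  # equal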

  10. Learning by Computer Simulation Does Not Lead to Better Test Performance on Advanced Cardiac Life Support Than Textbook Study.

    Science.gov (United States)

    Kim, Jong Hoon; Kim, Won Oak; Min, Kyeong Tae; Yang, Jong Yoon; Nam, Yong Taek

    2002-01-01

    For an effective acquisition and the practical application of rapidly increasing amounts of information, computer-based learning has already been introduced in medical education. However, there have been few studies that compare this innovative method to traditional learning methods in studying advanced cardiac life support (ACLS). Senior medical students were randomized to computer simulation and a textbook study. Each group studied ACLS for 150 minutes. Tests were done one week before, immediately after, and one week after the study period. Testing consisted of 20 questions. All questions were formulated in such a way that there was a single best answer. Each student also completed a questionnaire designed to assess computer skills as well as satisfaction with and benefit from the study materials. Test scores improved after both textbook study and computer simulation study in both groups but the improvement in scores was significantly higher for the textbook group only immediately after the study. There was no significant difference between groups in their computer skill and satisfaction with the study materials. The textbook group reported greater benefit from study materials than did the computer simulation group. Studying ACLS with a hard copy textbook may be more effective than computer simulation for the acquisition of simple information during a brief period. However, the difference in effectiveness is likely transient.

  11. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  12. High-resolution computed tomography findings of influenza virus pneumonia. A comparative study between seasonal and novel (H1N1) influenza virus pneumonia

    International Nuclear Information System (INIS)

    Tanaka, Nobuyuki; Kunihiro, Yoshie; Matsunaga, Naofumi; Hasegawa, Shunji; Ichiyama, Takashi; Emoto, Takuya; Suda, Hiroki

    2012-01-01

    The purpose of this study was to evaluate the high-resolution computed tomography (HRCT) findings of novel influenza virus (n-IFV) pneumonia and compare them with the findings for seasonal (s-IFV) pneumonia. We evaluated 29 cases of pure IFV pneumonia that occurred between 1990 and 2010. We evaluated the existence, extent, and patterns of HRCT findings and compared these features between s-IFV and n-IFV. Consolidation was less frequent in s-IFV than in n-IFV (40.0 vs. 84.2%, respectively; p=0.014). Consolidation with a loss of volume was frequent in n-IFV (62.5%). There was no significant difference in the occurrence of ground-glass opacity (GGO) between s-IFV and n-IFV (100 vs. 84.2%, respectively). GGO with reticular opacities was more frequent in s-IFV than in n-IFV (70.0 vs. 25.0%, respectively; p=0.024). The frequency of nodules was not significantly different between the two groups. The mosaic pattern was more frequent in s-IFV than in n-IFV patients (80.0 vs. 15.8%, respectively; p=0.0007). Mucoid impaction was more frequent in patients with n-IFV than with s-IFV (52.6 vs. 10.0%, respectively; p=0.025). Consolidation and mucoid impaction were more frequent in n-IFV, whereas GGO with reticular opacities and a mosaic pattern occurred more frequently in s-IFV; otherwise, there were no significant differences between the two groups. (author)

  13. Case Studies in Library Computer Systems.

    Science.gov (United States)

    Palmer, Richard Phillips

    Twenty descriptive case studies of computer applications in a variety of libraries are presented in this book. Computerized circulation, serial and acquisition systems in public, high school, college, university and business libraries are included. Each of the studies discusses: 1) the environment in which the system operates, 2) the objectives of…

  14. Tautomerism of 4-hydrazinoquinazolines: vibrational spectra and computational study

    Directory of Open Access Journals (Sweden)

    Tetiana Yu. Sergeieva

    2014-03-01

    Full Text Available The tautomerism of 4-hydrazinoquinazoline and its derivatives was investigated. Geometries and thermodynamic parameters were computed theoretically using Gaussian 03 software. All calculations were performed at the MP2 level of theory using the standard 6-31G(d) basis set. The energetics and relative stabilities of the tautomers were compared and analyzed in the gas phase. The effect of solvents (1,4-dioxane, acetic acid, ethanol and water) on the tautomeric equilibria was evaluated using PCM. It was determined that solvents induced only slight changes in the relative stabilities. In all cases 4-hydrazinoquinazoline exists predominantly as the amino form. The variation of dipole moments was studied. The anharmonic vibrational wavenumbers for unsubstituted 4-hydrazinoquinazoline were calculated at the MP2/6-31G(d) level and compared with experimental data. The modes of the IR spectra were assigned. The wavenumbers and intensities calculated here for the amino form are in good agreement with those observed experimentally.

  15. Hispanic women overcoming deterrents to computer science: A phenomenological study

    Science.gov (United States)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. The aptitudes participants commonly believed are needed for success in computer science are the Twenty

  16. Computer processing of dynamic scintigraphic studies

    International Nuclear Information System (INIS)

    Ullmann, V.

    1985-01-01

    The methods of computer processing of dynamic scintigraphic studies which were developed, studied or implemented by the authors within research task no. 30-02-03 in nuclear medicine in the five-year plan 1981 to 1985 are discussed. These are mainly the methods of computer processing of radionuclide angiography, phase radioventriculography, regional lung ventilation, dynamic sequential scintigraphy of the kidneys, and radionuclide uroflowmetry. The problems of the automatic definition of fields of interest and the methodology for determining absolute heart chamber volumes in radionuclide cardiology are discussed, and the design and uses of the multipurpose dynamic phantom of heart activity for radionuclide angiocardiography and ventriculography, developed within the said research task, are described. All methods are documented with many figures showing typical clinical (normal and pathological) and phantom measurements. (V.U.)

  17. [A computer-aided image diagnosis and study system].

    Science.gov (United States)

    Li, Zhangyong; Xie, Zhengxiang

    2004-08-01

    The revolution in information processing, particularly the digitizing of medicine, has changed medical study, work and management. This paper reports a method for designing a system for computer-aided image diagnosis and study. Combining ideas from graph-text systems and picture archiving and communication systems (PACS), the system was implemented and used for "prescription through computer", "managing images" and "reading images under computer and helping the diagnosis". Typical examples were also stored in a database and used to teach beginners. The system was developed with visual development tools based on object-oriented programming (OOP) and put into operation on the Windows 9X platform. The system has a friendly man-machine interface.

  18. A comparative study to investigate burnup in research reactor fuel using two independent experimental methods

    International Nuclear Information System (INIS)

    Iqbal, M.; Mehmood, T.; Ayazuddin, S.K.; Salahuddin, A.; Pervez, S.

    2001-01-01

    Two independent experimental methods have been used for a comparative study of fuel burnup measurement in a low-enriched-uranium, plate-type research reactor. In the first method, a gamma-ray activity ratio method was employed. An experimental setup was established for gamma-ray scanning using a previously calibrated high-purity germanium detector, with theoretical support provided by the computer software KORIGEN. In the second method, a reactivity difference technique was used: the reactivity worths of a fresh and a burned fuel element were estimated at the same location in the same core configuration. For the theoretically estimated curve, group cross-sections were generated using the computer code WIMS-D/4, and three-dimensional modeling was performed with the computer code CITATION. The burnups of different fuel elements measured using these methods were found to be in good agreement.

  19. Comparing Postsecondary Marketing Student Performance on Computer-Based and Handwritten Essay Tests

    Science.gov (United States)

    Truell, Allen D.; Alexander, Melody W.; Davis, Rodney E.

    2004-01-01

    The purpose of this study was to determine if there were differences in postsecondary marketing student performance on essay tests based on test format (i.e., computer-based or handwritten). Specifically, the variables of performance, test completion time, and gender were explored for differences based on essay test format. Results of the study…

  20. Comparative study of adaptive-noise-cancellation algorithms for intrusion detection systems

    International Nuclear Information System (INIS)

    Claassen, J.P.; Patterson, M.M.

    1981-01-01

    Some intrusion detection systems are susceptible to nonstationary noise resulting in frequent nuisance alarms and poor detection when the noise is present. Adaptive inverse filtering for single channel systems and adaptive noise cancellation for two channel systems have both demonstrated good potential in removing correlated noise components prior to detection. For such noise-susceptible systems the suitability of a noise reduction algorithm must be established in a trade-off study weighing algorithm complexity against performance. The performance characteristics of several distinct classes of algorithms are established through comparative computer studies using real signals. The relative merits of the different algorithms are discussed in light of the nature of intruder and noise signals.

  1. Learning by computer simulation does not lead to better test performance than textbook study in the diagnosis and treatment of dysrhythmias.

    Science.gov (United States)

    Kim, Jong Hoon; Kim, Won Oak; Min, Kyeong Tae; Yang, Jong Yoon; Nam, Yong Taek

    2002-08-01

    To compare computer-based learning with traditional learning methods in studying advanced cardiac life support (ACLS). Prospective, randomized study. University hospital. Senior medical students were randomized to computer simulation or textbook study. Each group studied ACLS for 150 minutes. Tests were performed 1 week before, immediately after, and 1 week after the study period. Testing consisted of 20 questions, all formulated so that there was a single best answer. Each student also completed a questionnaire designed to assess computer skills, as well as satisfaction with and benefit from the study materials. Test scores improved after both textbook study and computer simulation study, although the improvement in scores was significantly higher for the textbook group only immediately after the study. There was no significant difference between groups in computer skills or satisfaction with the study materials. The textbook group reported greater benefit from the study materials than did the computer simulation group. Studying ACLS with a hard-copy textbook may be more effective than computer simulation for acquiring simple information during a brief period. However, the difference in effectiveness is likely transient.

  2. A Computational Study of an Oscillating VR-12 Airfoil with a Gurney Flap

    Science.gov (United States)

    Rhee, Myung

    2004-01-01

    Computations of the flow over an oscillating airfoil with a Gurney flap are performed using a Reynolds-Averaged Navier-Stokes code and compared with recent experimental data. The experimental results have been generated for different sizes of Gurney flaps, while the computations focus mainly on a single configuration. The baseline airfoil without a Gurney flap is computed and compared with the experiments in both steady and unsteady cases for the purpose of initial testing of the code performance. The computations are carried out with different turbulence models. Effects of grid refinement are also examined for steady and unsteady cases, in addition to an assessment of solver effects. The comparisons of steady lift and drag computations indicate that the code is reasonably accurate for attached flow in the steady condition but largely overpredicts the lift and underpredicts the drag in higher-angle steady flow.

  3. Case studies in intelligent computing achievements and trends

    CERN Document Server

    Issac, Biju

    2014-01-01

    Although the field of intelligent systems has grown rapidly in recent years, there has been a need for a book that supplies a timely and accessible understanding of this important technology. Filling this need, Case Studies in Intelligent Computing: Achievements and Trends provides an up-to-date introduction to intelligent systems. This edited book captures the state of the art in intelligent computing research through case studies that examine recent developments, developmental tools, programming, and approaches related to artificial intelligence (AI). The case studies illustrate successful ma

  4. Portraits of PBL: Course Objectives and Students' Study Strategies in Computer Engineering, Psychology and Physiotherapy.

    Science.gov (United States)

    Dahlgren, Madeleine Abrandt

    2000-01-01

    Compares the role of course objectives in relation to students' study strategies in problem-based learning (PBL). Results comprise data from three PBL programs at Linkopings University (Sweden), in physiotherapy, psychology, and computer engineering. Faculty provided course objectives to function as supportive structures and guides for students'…

  5. On the Impact of Execution Models: A Case Study in Computational Chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram; Manzano Franco, Joseph B.; Vishnu, Abhinav; Hoisie, Adolfy

    2015-05-25

    Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.

  6. Computed tomographic study of aged schizophrenic patients

    International Nuclear Information System (INIS)

    Seno, Haruo; Fujimoto, Akihiko; Ishino, Hiroshi; Shibata, Masahiro; Kuroda, Hiroyuki; Kanno, Hiroshi.

    1997-01-01

    The widths of the interhemispheric fissure, lateral ventricles and third ventricle were measured using cranial computed tomography (CT; linear method) in 45 elderly inpatients with chronic schizophrenia and in 28 age-matched control subjects. Twenty-three patients were men and 22 were women. In addition, the Mini-Mental State Examination, Brief Psychiatric Rating Scale (BPRS) and a subclass of the BPRS were administered to all patients. On CT, there was a significant enlargement of the maximum width of the interhemispheric fissure (in both males and females) and a significant enlargement of the ventricular system (more severe in men than in women) in aged schizophrenics compared with normal controls. These findings are consistent with previous studies of non-aged schizophrenic patients. Regarding the relation between psychiatric symptoms and CT findings, the most striking finding is a significant negative correlation between third ventricle enlargement and positive and depressive symptoms in all patients. This result suggests that advanced third ventricle enlargement may decrease these symptoms in aged schizophrenics. (author)

  7. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
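
    The load-prediction dynamic scheduling used in setup (c) is not spelled out in the abstract. As a rough illustration of the underlying idea only, the hypothetical Python sketch below distributes chunks of simulation time steps from a shared queue to workers of different speeds, so the faster (GPU-like) worker naturally ends up processing more of them; the actual implementation used OpenMP and CUDA.

```python
# Minimal sketch of dynamic scheduling across heterogeneous workers: chunks of
# time steps are pulled from a shared queue, so faster devices take more work.
# This is only a stand-in for the paper's load-prediction scheduler.
import queue
import threading
import time

def run_chunk(chunk, speed):
    """Placeholder for computing one chunk of ECG time steps."""
    time.sleep(len(chunk) / speed)          # pretend work: faster device => less time
    return chunk

def worker(name, speed, tasks, done):
    while True:
        try:
            chunk = tasks.get_nowait()
        except queue.Empty:
            return
        done.append((name, run_chunk(chunk, speed)))

time_steps = list(range(1600))              # 1600 time steps, as in the study
chunks = [time_steps[i:i + 50] for i in range(0, 1600, 50)]

tasks = queue.Queue()
for c in chunks:
    tasks.put(c)

done = []
workers = [threading.Thread(target=worker, args=(f"cpu-{i}", 1000.0, tasks, done))
           for i in range(4)]               # four CPU-like workers
workers.append(threading.Thread(target=worker, args=("gpu", 8000.0, tasks, done)))
for t in workers:
    t.start()
for t in workers:
    t.join()

steps_per_worker = {}
for name, chunk in done:
    steps_per_worker[name] = steps_per_worker.get(name, 0) + len(chunk)
print(steps_per_worker)                     # the "gpu" worker handles most of the steps
```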

  8. US QCD computational performance studies with PERI

    International Nuclear Information System (INIS)

    Zhang, Y; Fowler, R; Huck, K; Malony, A; Porterfield, A; Reed, D; Shende, S; Taylor, V; Wu, X

    2007-01-01

    We report on some of the interactions between two SciDAC projects: The National Computational Infrastructure for Lattice Gauge Theory (USQCD), and the Performance Engineering Research Institute (PERI). Many modern scientific programs consistently report the need for faster computational resources to maintain global competitiveness. However, as the size and complexity of emerging high end computing (HEC) systems continue to rise, achieving good performance on such systems is becoming ever more challenging. In order to take full advantage of the resources, it is crucial to understand the characteristics of relevant scientific applications and the systems these applications are running on. Using tools developed under PERI and by other performance measurement researchers, we studied the performance of two applications, MILC and Chroma, on several high performance computing systems at DOE laboratories. In the case of Chroma, we discuss how the use of C++ and modern software engineering and programming methods are driving the evolution of performance tools

  9. Effect of field-of-view size on gray values derived from cone-beam computed tomography compared with the Hounsfield unit values from multidetector computed tomography scans.

    Science.gov (United States)

    Shokri, Abbas; Ramezani, Leila; Bidgoli, Mohsen; Akbarzadeh, Mahdi; Ghazikhanlu-Sani, Karim; Fallahi-Sichani, Hamed

    2018-03-01

    This study aimed to evaluate the effect of field-of-view (FOV) size on the gray values derived from cone-beam computed tomography (CBCT) compared with the Hounsfield unit values from multidetector computed tomography (MDCT) scans as the gold standard. A radiographic phantom was designed with 4 acrylic cylinders. One cylinder was filled with distilled water, and the other 3 were filled with 3 types of bone substitute: namely, Nanobone, Cenobone, and Cerabone. The phantom was scanned with 2 CBCT systems using 2 different FOV sizes, and 1 MDCT system was used as the gold standard. The mean gray values (MGVs) of each cylinder were calculated in each imaging protocol. In both CBCT systems, significant differences were noted in the MGVs of all materials between the 2 FOV sizes (P < .05) except for Cerabone in the Cranex3D system. Significant differences were found in the MGVs of each material compared with the others in both FOV sizes for each CBCT system. No significant difference was seen between the Cranex3D CBCT system and the MDCT system in the MGVs of bone substitutes on images obtained with a small FOV. The size of the FOV significantly changed the MGVs of all bone substitutes, except for Cerabone in the Cranex3D system. Both CBCT systems had the ability to distinguish the 3 types of bone substitutes based on a comparison of their MGVs. The Cranex3D CBCT system used with a small FOV had a significant correlation with MDCT results.

  10. Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain.

    Science.gov (United States)

    Arbib, Michael A

    2016-03-01

    We make the case for developing a Computational Comparative Neuroprimatology to inform the analysis of the function and evolution of the human brain. First, we update the mirror system hypothesis on the evolution of the language-ready brain by (i) modeling action and action recognition and opportunistic scheduling of macaque brains to hypothesize the nature of the last common ancestor of macaque and human (LCA-m); and then we (ii) introduce dynamic brain modeling to show how apes could acquire gesture through ontogenetic ritualization, hypothesizing the nature of evolution from LCA-m to the last common ancestor of chimpanzee and human (LCA-c). We then (iii) hypothesize the role of imitation, pantomime, protosign and protospeech in biological and cultural evolution from LCA-c to Homo sapiens with a language-ready brain. Second, we suggest how cultural evolution in Homo sapiens led from protolanguages to full languages with grammar and compositional semantics. Third, we assess the similarities and differences between the dorsal and ventral streams in audition and vision as the basis for presenting and comparing two models of language processing in the human brain: models of (i) the auditory dorsal and ventral streams in sentence comprehension and (ii) the visual dorsal and ventral streams in defining "what language is about" in both production and perception of utterances related to visual scenes, which together provide the basis for (iii) a first step towards a synthesis and a look at challenges for further research. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. The Impact of Computer and Mathematics Software Usage on Performance of School Leavers in the Western Cape Province of South Africa: A Comparative Analysis

    Science.gov (United States)

    Smith, Garth Spencer; Hardman, Joanne

    2014-01-01

    In this study the impact of computer immersion on school leavers' Senior Certificate mathematics scores was investigated across 31 schools in the EMDC East education district of Cape Town, South Africa, by comparing performance between two groups: a control group and an experimental group. The experimental group (14 high schools) had access…

  12. Comparing the use of computer-supported collaboration tools among university students with different life circumstances

    Directory of Open Access Journals (Sweden)

    Miikka J. Eriksson

    2014-11-01

    Full Text Available The proportion of higher education students who integrate learning with various life circumstances such as employment or raising children is increasing. This study aims to examine whether, and what kinds of, differences exist in the perceived use of synchronous and asynchronous computer-mediated communication tools between university students with children or in full-time employment and students without these commitments. The data were collected at a Finnish university by means of an online questionnaire. The results indicate that students with multiple commitments were using more virtual learning environments and less instant messaging (IM), especially when communicating with their peers. The low level of IM use might be an indication of not being able to or not wanting to create close ties with their peer students. The practical implication of the study is that pedagogical choices should support different kinds of learning strategies. Students with multiple commitments, and especially students with children, should be encouraged and assisted to create stronger ties with their peers, if they are willing to do so.

  13. Traumatic brain injury in a rural and an urban Tanzanian hospital--a comparative, retrospective analysis based on computed tomography.

    Science.gov (United States)

    Maier, Daniel; Njoku, Innocent; Schmutzhard, Erich; Dharsee, Jaffer; Doppler, Magdalena; Härtl, Roger; Winkler, Andrea Sylvia

    2014-01-01

    In a resource-poor environment such as rural East Africa, expensive medical devices such as computed tomographic (CT) scanners are rare. The CT scanner at the rural Haydom Lutheran Hospital (HLH) in Tanzania therefore offers a unique chance to observe possible differences with urban medical centers in the disease pattern of trauma-related cranial pathologies. The purpose of this study was to compare traumatic brain injuries (TBIs) between a rural and an urban area of Tanzania. HLH has 350 beds and one CT scanner. The urban Aga Khan Hospital is a private hospital with 80 beds and one CT scanner. This was a retrospective study. Data of 248 patients at HLH and of 432 patients at Aga Khan Hospital with TBI could be collected. The prevalence of TBI was significantly higher in the rural area compared to the urban area (34.2% vs. 21.9%, P workplace is primarily urban or rural. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Aberration studies and computer algebra

    International Nuclear Information System (INIS)

    Hawkes, P.W.

    1981-01-01

    The labour of calculating expressions for aberration coefficients is considerably lightened if a computer algebra language is used to perform the various substitutions and expansions involved. After a brief discussion of matrix representations of aberration coefficients, a particular language, which has shown itself to be well adapted to particle optics, is described and applied to the study of high frequency cavity lenses. (orig.)

  15. Clinical study on left atrial thrombi. Comparative study between echocardiography and CT scan

    Energy Technology Data Exchange (ETDEWEB)

    Shimada, E; Asano, H; Kurasawa, T; Mitsumoto, K; Yamane, Y [Tokyo Kosei-Nenkin Hospital (Japan)

    1981-09-01

    We studied left atrial thrombi (LAT) by both echocardiography and computed tomography (CT) and compared the features of the 2 methods. A total of 15 patients with mitral stenosis complicated by atrial fibrillation were selected as the subjects. LAT were noted on the M-mode echocardiograms in 2 patients including a questionably positive one, on the two-dimensional echocardiograms in 5, and on the CT scans in 6 of 15. A history of thromboembolism was rather frequent and was found in 7 of 15 patients. However, LAT was found in only 3 of these on the CT scans. A shaggy or fuzzy pattern on the M-mode echocardiogram cannot be regarded as representing thrombi, while a laminar pattern undoubtedly represented thrombi. Two-dimensional echocardiography has considerably contributed to the improved detection rate of LAT. Owing to the characteristic properties of ultrasound beams, however, it was impossible to investigate the entire left atrium, and the detection of thrombi in the appendage was especially difficult. Computed tomography, permitting transverse cross-sectional tomography, was capable of sectioning the heart even in the presence of air and bones. The measurement of CT values was suggestive of the properties of the substance or substances involved, and also allowed a presumption as to whether the thrombus had become fibrosed. Furthermore, it was possible to estimate more accurately, as well as three-dimensionally, the location, shape and dimensions of the thrombi by reconstructing the heart according to the CT values. It was concluded that echocardiography and computed tomography complement each other in further improving the detection rate of left atrial thrombi.

  16. Children as Educational Computer Game Designers: An Exploratory Study

    Science.gov (United States)

    Baytak, Ahmet; Land, Susan M.; Smith, Brian K.

    2011-01-01

    This study investigated how children designed computer games as artifacts that reflected their understanding of nutrition. Ten 5th grade students were asked to design computer games with the software "Game Maker" for the purpose of teaching 1st graders about nutrition. The results from the case study show that students were able to…

  17. FORMING SCHOOLCHILD’S PERSONALITY IN COMPUTER STUDY LESSONS AT PRIMARY SCHOOL

    Directory of Open Access Journals (Sweden)

    Natalia Salan

    2017-04-01

    Full Text Available The article considers the influence of the computer on the formation of primary schoolchildren's personality and its integration into learning activity. Based on state standards and the Law of Ukraine on Higher Education, the concepts "computer" and "information culture" are defined, and the modern understanding of the concept "basics of computer literacy" is identified. The main task of the school propaedeutic course in Computer Studies is defined. Interactive methods of activity are singled out: didactic games, designing, research, collaboration in pairs, group interaction, etc. The essential characteristics of didactic game technologies are distinguished, and the peculiarities of their use in Computer Study lessons at primary school are analyzed. Positive and negative aspects of using these technologies in Computer Study lessons are identified, and the expediency of using game technologies in organizing students' educational and cognitive activity in Computer Studies is substantiated. The idea of creating a school course "Computer Studies at primary school" arises from the wide introduction of computer technology into the educational system. Today's schoolchildren have to be able to use a computer as freely and easily as a pen, a pencil or a ruler, so it is advisable to start studying the basics of Computer Studies at primary school age. This course is intended for pupils of the 2nd-4th forms. Firstly, it provides mastery of practical computer skills and, secondly, it supports the development of children's logical and algorithmic thinking. In these lessons students acquire practical skills for working with information on the computer. Having mastered computer skills at primary school, children will be able to use them successfully in their work. In senior classes they will be able to apply the acquired knowledge of methods of working with information, ways of problem solving

  18. A computational study on oblique shock wave-turbulent boundary layer interaction

    Science.gov (United States)

    Joy, Md. Saddam Hossain; Rahman, Saeedur; Hasan, A. B. M. Toufique; Ali, M.; Mitsutake, Y.; Matsuo, S.; Setoguchi, T.

    2016-07-01

    A numerical computation of an oblique shock wave incident on a turbulent boundary layer was performed for a free stream flow of air at M∞ = 2.0 and Re1 = 10.5×10^6 m^-1. The oblique shock wave was generated from an 8° wedge. A Reynolds-averaged Navier-Stokes (RANS) simulation with the k-ω SST turbulence model was first utilized for the two-dimensional (2D) steady case. The results were compared with the experiment at the same flow conditions. Further, to capture the unsteadiness, a 2D Large Eddy Simulation (LES) with the sub-grid scale model WMLES was performed, which showed the unsteady effects. The frequency of the shock oscillation was computed and was found to be comparable with that of the experimental measurement.

  19. Studying fatigue damage evolution in uni-directional composites using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    , it will be possible to lower the costs of energy for wind energy based electricity. In the present work, a lab-source x-ray computed tomography equipment (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre failure during...... comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of the x-ray computed tomography to zoom into regions of interest, non-destructive, the fatigue damage evolution in a repeating ex-situ fatigue loaded test sample has...... improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made....

  20. Studies of left ventricular volume estimation from single photon emission computed tomography

    International Nuclear Information System (INIS)

    Hiraki, Yoshio; Shimizu, Mitsuharu; Joja, Ikuo; Aono, Kaname; Yanagi, Hidekiyo; Indo, Haruaki; Seno, Yoshimasa; Teramoto, Shigeru; Nagaya, Isao.

    1988-01-01

    We studied the comparative accuracy of 99m Tc cardiac blood pool Single Photon Emission Computed Tomography (SPECT) for the measurement of left ventricular volume in 20 patients undergoing SPECT and single plane contrast left ventriculography (LVG). Left ventricular volume was calculated based on the total number of voxels in left ventricle. End-diastolic left ventricular volume (EDV) and end-systolic left ventricular volume (ESV) calculated from SPECT were compared with those from LVG. SPECT volume values showed a high degree of correlation with those by LVG (r = 0.923 for EDV, r = 0.903 for ESV). We appreciated the usefulness and accuracy of SPECT in measuring left ventricular volume because of its three-dimensional information. (author)
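
    The volume calculation described here reduces to counting the voxels labeled as left ventricle and multiplying by the voxel volume. The sketch below illustrates that arithmetic on synthetic masks; the voxel size, thresholds and masks are made-up stand-ins, not the study's data.

```python
import numpy as np

# Toy illustration of voxel-counting volumetry on a segmented SPECT volume.
# 'lv_mask_*' would come from segmenting the reconstructed blood-pool images;
# here they are random stand-ins, and the voxel size is an assumed example value.
voxel_size_mm = (3.0, 3.0, 3.0)                   # assumed reconstruction voxel size
lv_mask_ed = np.random.rand(64, 64, 64) > 0.97    # stand-in end-diastolic LV mask
lv_mask_es = np.random.rand(64, 64, 64) > 0.985   # stand-in end-systolic LV mask

voxel_volume_ml = np.prod(voxel_size_mm) / 1000.0     # mm^3 -> ml
edv = lv_mask_ed.sum() * voxel_volume_ml              # end-diastolic volume
esv = lv_mask_es.sum() * voxel_volume_ml              # end-systolic volume
ef = (edv - esv) / edv * 100.0                        # ejection fraction

print(f"EDV = {edv:.1f} ml, ESV = {esv:.1f} ml, EF = {ef:.1f} %")
```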

  1. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed with each code. The results obtained (logic/probabilistic analysis as well as computation time) are compared.
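
    The codes compared here are fault-tree and system-reliability tools, which automate, among other things, the basic gate arithmetic for independent basic events. A toy example of that arithmetic, with purely hypothetical failure probabilities, is sketched below for orientation.

```python
# Minimal fault-tree arithmetic assuming independent basic events:
#   OR gate:  P = 1 - prod(1 - p_i)
#   AND gate: P = prod(p_i)
from math import prod

def p_or(*ps):
    return 1.0 - prod(1.0 - p for p in ps)

def p_and(*ps):
    return prod(ps)

# Hypothetical basic-event probabilities (per demand)
pump_a, pump_b, power = 1e-3, 1e-3, 5e-4

# Top event: loss of flow = (pump A fails AND pump B fails) OR power failure
top = p_or(p_and(pump_a, pump_b), power)
print(f"Top-event probability ~ {top:.2e}")
```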

  2. Dense Descriptors for Optical Flow Estimation: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Ahmadreza Baghaie

    2017-02-01

    Full Text Available Estimating the displacements of intensity patterns between sequential frames is a very well-studied problem, which is usually referred to as optical flow estimation. The first assumption among many of the methods in the field is the brightness constancy during movements of pixels between frames. This assumption is proven to be not true in general, and therefore, the use of photometric invariant constraints has been studied in the past. One other solution can be sought by use of structural descriptors rather than pixels for estimating the optical flow. Unlike sparse feature detection/description techniques and since the problem of optical flow estimation tries to find a dense flow field, a dense structural representation of individual pixels and their neighbors is computed and then used for matching and optical flow estimation. Here, a comparative study is carried out by extending the framework of SIFT-flow to include more dense descriptors, and comprehensive comparisons are given. Overall, the work can be considered as a baseline for stimulating more interest in the use of dense descriptors for optical flow estimation.
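
    As a toy illustration of the "match descriptors instead of raw intensities" idea discussed above, the sketch below builds a naive dense gradient-patch descriptor for every pixel and estimates flow by brute-force nearest-descriptor search in a small window. It is only a conceptual stand-in, not the SIFT-flow-style optimization used in the paper.

```python
import numpy as np

def dense_descriptors(img, r=2):
    """A toy dense descriptor: the flattened patch of image gradients around each pixel."""
    gy, gx = np.gradient(img.astype(float))
    feats = np.stack([gx, gy], axis=-1)
    pad = np.pad(feats, ((r, r), (r, r), (0, 0)), mode="edge")
    h, w = img.shape
    desc = np.zeros((h, w, (2 * r + 1) ** 2 * 2))
    for y in range(h):
        for x in range(w):
            desc[y, x] = pad[y:y + 2 * r + 1, x:x + 2 * r + 1].ravel()
    return desc

def toy_descriptor_flow(img1, img2, search=3):
    """Brute-force nearest-descriptor matching within a (2*search+1)^2 window."""
    d1, d2 = dense_descriptors(img1), dense_descriptors(img2)
    h, w = img1.shape
    flow = np.zeros((h, w, 2), dtype=int)
    for y in range(h):
        for x in range(w):
            best, best_cost = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        cost = np.sum((d1[y, x] - d2[yy, xx]) ** 2)
                        if cost < best_cost:
                            best, best_cost = (dx, dy), cost
            flow[y, x] = best
    return flow

# Tiny synthetic test: the second frame is the first shifted right by one pixel.
rng = np.random.default_rng(0)
f1 = rng.random((20, 20))
f2 = np.roll(f1, shift=1, axis=1)
print(np.median(toy_descriptor_flow(f1, f2)[:, :, 0]))   # ~1 (horizontal shift)
```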

  3. Tropical pulmonary eosinophilia: a comparative evaluation of plain chest radiography and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sandhu Manavijit; Mukhopadhyay Sima; Sharma, S.K. [All India Inst. of Medical Sciences, New Delhi (India). Dept. of Nuclear Medicine

    1996-02-01

    Plain chest radiography and computed tomography (CT) of the chest were performed on 10 patients with tropical pulmonary eosinophilia (TPE). Chest radiographs revealed bilateral diffuse lesions in the lungs of all the patients with relative sparing of lower lobes in one patient. However, computed tomography revealed bilateral diffuse lung lesions in all of the patients with relative sparing of lower lobes in three patients. In seven (70%) of the 10 patients, CT provided additional information. Computed tomography was found to be superior for the detection of reticulonodular pattern, bronchiectasis, air trapping, calcification and mediastinal adenopathy. No correlation was found between pulmonary function and gas exchange data using CT densities. There was also no correlation between the absolute eosinophil count (AEC) and the radiological severity of lesions. In six patients, high-resolution CT (HRCT) was performed in addition to conventional CT (CCT), and nodularity of lesions was better appreciated in these patients. It is concluded from this study that CT is superior to plain radiography for the evaluation of patients with TPE. 17 refs., 2 tabs., 4 figs.

  4. Geoid-to-Quasigeoid Separation Computed Using the GRACE/GOCE Global Geopotential Model GOCO02S - A Case Study of Himalayas and Tibet

    Directory of Open Access Journals (Sweden)

    Mohammad Bagherbandi Robert Tenzer

    2013-01-01

    Full Text Available The geoid-to-quasigeoid correction has been traditionally computed approximately as a function of the planar Bouguer gravity anomaly and the topographic height. Recent numerical studies based on newly developed theoretical models, however, indicate that the computation of this correction using the approximate formula yields large errors especially in mountainous regions with computation points at high elevations. In this study we investigate these approximation errors over a study area which comprises the Himalayas and Tibet, where this correction reaches its global maxima. Since the GPS-leveling and terrestrial gravity datasets in this part of the world are not (freely) available, global gravitational models (GGMs) are used to compute this correction utilizing the expressions for a spherical harmonic analysis of the gravity field. The computation of this correction can be done using the GGM coefficients taken from the Earth Gravitational Model 2008 (EGM08) complete to degree 2160 of spherical harmonics. Recent studies based on a regional accuracy assessment of GGMs have shown that the combined GRACE/GOCE solutions provide a substantial improvement of the Earth's gravity field at medium wavelengths of spherical harmonics compared to EGM08. We address this aspect in the numerical analysis by comparing the gravity field quantities computed using the satellite-only combined GRACE/GOCE model GOCO02S against the EGM08 results. The numerical results reveal that errors in the geoid-to-quasigeoid correction computed using the approximate formula can reach as much as ~1.5 m. We also demonstrate that the expected improvement of the GOCO02S gravity field quantities at medium wavelengths (within the frequency band approximately between 100 and 250) compared to EGM08 is as much as ±60 mGal and ±0.2 m in terms of gravity anomalies and geoid/quasigeoid heights respectively.
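
    For reference, the classical approximation referred to above is usually written as follows (standard textbook notation; the paper's more refined spherical-harmonic formulation is not reproduced here):

```latex
% Classical approximation of the geoid-to-quasigeoid separation
% (N = geoid height, \zeta = height anomaly, \Delta g_B = planar Bouguer
% gravity anomaly, H = topographic/orthometric height, \bar{\gamma} = mean normal gravity):
N - \zeta \approx \frac{\Delta g_B}{\bar{\gamma}}\, H
```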

  5. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    Science.gov (United States)

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  6. Computing on Knights and Kepler Architectures

    International Nuclear Information System (INIS)

    Bortolotti, G; Caberletti, M; Ferraro, A; Giacomini, F; Manzali, M; Maron, G; Salomoni, D; Crimi, G; Zanella, M

    2014-01-01

    A recent trend in scientific computing is the increasingly important role of co-processors, originally built to accelerate graphics rendering, and now used for general high-performance computing. The INFN Computing On Knights and Kepler Architectures (COKA) project focuses on assessing the suitability of co-processor boards for scientific computing in a wide range of physics applications, and on studying the best programming methodologies for these systems. Here we present in a comparative way our results in porting a Lattice Boltzmann code on two state-of-the-art accelerators: the NVIDIA K20X, and the Intel Xeon-Phi. We describe our implementations, analyze results and compare with a baseline architecture adopting Intel Sandy Bridge CPUs.

  7. Comparing staging by positron emission tomography with contrast-enhanced computed tomography and by pathology in head and neck squamous cell carcinoma.

    Science.gov (United States)

    Qualliotine, J R; Mydlarz, W K; Chan, J Y K; Zhou, X; Wang, H; Agrawal, N

    2015-12-01

    This study aimed to evaluate the ability of positron emission tomography with contrast-enhanced computed tomography to correctly stage head and neck squamous cell carcinomas, in comparison with pathological staging. Positron emission tomography computed tomography was used to determine the tumour-node-metastasis classification and overall cancer stage in 85 head and neck squamous cell carcinoma patients who underwent pre-operative imaging using this modality and primary surgery between July 2010 and January 2013. Staging by positron emission tomography computed tomography was retrospectively compared with staging using pathological specimens. Agreement between imaging stage and pathological stage was examined by univariate and multivariate analysis both overall and for each primary tumour site. This imaging modality was 87.5 per cent sensitive and 44.8 per cent specific in identifying regional cervical metastases, and had false positive and false negative rates of 18.8 per cent and 8.2 per cent, respectively. The positive predictive and negative predictive values were 75.4 per cent and 65.0 per cent, respectively. Univariate and multivariate analyses revealed a significant agreement between positron emission tomography computed tomography and pathological node classification in older patients and for the oral cavity primary tumour site. There was significant agreement between both methods in the overall classification only for tumours classified as T3 or greater. Positron emission tomography computed tomography should be used with caution for the pre-operative staging of head and neck cancers because of its high false positive and false negative rates.
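
    The reported sensitivity, specificity and predictive values are simple ratios over the 2x2 agreement table between imaging and pathological nodal status. The short sketch below shows that arithmetic with made-up counts, not the study's raw data.

```python
# Diagnostic accuracy arithmetic from a 2x2 table.
# The counts below are illustrative only (imaging vs. pathology nodal status).
tp, fp, fn, tn = 40, 12, 8, 25

sensitivity = tp / (tp + fn)   # true positives among pathology-positive necks
specificity = tn / (tn + fp)   # true negatives among pathology-negative necks
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value

print(f"sens={sensitivity:.1%} spec={specificity:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```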

  8. A comparative study of deep learning models for medical image classification

    Science.gov (United States)

    Dutta, Suvajit; Manideep, B. C. S.; Rai, Shalva; Vijayarajan, V.

    2017-11-01

    Deep Learning (DL) techniques are overtaking traditional neural network approaches when it comes to huge datasets and applications requiring complex functions with higher accuracy and lower time complexity. Neuroscience has already exploited DL techniques and has served as an inspirational source for researchers exploring the domain of machine learning. DL work covers the areas of vision, speech recognition, motion planning and NLP, moving back and forth among fields, and is concerned with building models that can successfully solve a variety of tasks requiring intelligence and distributed representation. Access to faster CPUs, the introduction of GPUs performing complex vector and matrix computations, agile network connectivity and enhanced software infrastructures for distributed computing have all strengthened the case for DL methodologies. The paper compares the following DL procedures with traditional approaches, which are performed manually, for classifying medical images. The medical images used for the study are Diabetic Retinopathy (DR) and computed tomography (CT) emphysema data; diagnosis from both DR and CT data is a difficult task for conventional image classification methods. The initial work was carried out with basic image processing along with K-means clustering for identification of image severity levels. After determining image severity levels, an ANN was applied to the data to obtain a baseline classification result, which was then compared with the results of DNNs (Deep Neural Networks); these performed efficiently because their multiple hidden layers increase accuracy, but the problem of vanishing gradients in DNNs led to the consideration of Convolutional Neural Networks (CNNs) as well for better results. The CNNs were found to provide better outcomes when compared to other learning models aimed at the classification of images. CNNs are
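
    As a concrete but purely illustrative example of the kind of CNN such studies rely on, the minimal Keras model below sketches a small image classifier; the input size, layer widths and five-class output are assumptions made here, not the authors' architecture.

```python
# A minimal convolutional network of the kind typically used for image
# classification tasks like DR grading (illustrative layer sizes only).
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),        # e.g. grayscale CT / fundus patches
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(5, activation="softmax"),    # e.g. 5 assumed severity grades
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, epochs=10, validation_split=0.1)  # with real data
```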

  9. A Comparative Study on Optimal Structural Dynamics Using Wavelet Functions

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Mahdavi

    2015-01-01

    Full Text Available Wavelet solution techniques have become the focus of interest among researchers in different disciplines of science and technology. In this paper, implementation of two different wavelet basis functions has been comparatively considered for dynamic analysis of structures. For this aim, computational technique is developed by using free scale of simple Haar wavelet, initially. Later, complex and continuous Chebyshev wavelet basis functions are presented to improve the time history analysis of structures. Free-scaled Chebyshev coefficient matrix and operation of integration are derived to directly approximate displacements of the corresponding system. In addition, stability of responses has been investigated for the proposed algorithm of discrete Haar wavelet compared against continuous Chebyshev wavelet. To demonstrate the validity of the wavelet-based algorithms, aforesaid schemes have been extended to the linear and nonlinear structural dynamics. The effectiveness of free-scaled Chebyshev wavelet has been compared with simple Haar wavelet and two common integration methods. It is deduced that either indirect method proposed for discrete Haar wavelet or direct approach for continuous Chebyshev wavelet is unconditionally stable. Finally, it is concluded that numerical solution is highly benefited by the least computation time involved and high accuracy of response, particularly using low scale of complex Chebyshev wavelet.
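
    The building block behind the Haar-based scheme is the pairwise average/difference decomposition of a sampled signal. The sketch below shows one level of the orthonormal Haar transform and its inverse on a synthetic displacement-like signal; it is only the basic transform, not the paper's free-scaled integration-operator formulation.

```python
import numpy as np

def haar_level(signal):
    """One level of the orthonormal Haar wavelet transform (even-length input)."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # scaling (average) coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # wavelet (difference) coefficients
    return approx, detail

def haar_inverse_level(approx, detail):
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

# A sampled displacement-like signal, transformed and reconstructed exactly.
t = np.linspace(0.0, 1.0, 64)
u = np.sin(2 * np.pi * 3 * t) + 0.2 * np.sin(2 * np.pi * 11 * t)
a, d = haar_level(u)
print(np.allclose(haar_inverse_level(a, d), u))   # True: perfect reconstruction
```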

  10. Comparative Analysis of Stability to Induced Deadlocks for Computing Grids with Various Node Architectures

    Directory of Open Access Journals (Sweden)

    Tatiana R. Shmeleva

    2018-01-01

    Full Text Available In this paper, we consider the classification and applications of switching methods and their advantages and disadvantages. A model of a computing grid was constructed in the form of a colored Petri net with a node which implements cut-through packet switching. The model consists of packet switching nodes, traffic generators and guns that form malicious traffic disguised as usual user traffic. The characteristics of the grid model were investigated under a working load with different intensities. The influence of malicious traffic, such as a traffic duel, on the quality-of-service parameters of the grid was estimated. A comparative analysis of the stability of computing grids was carried out for nodes which implement the store-and-forward and cut-through switching technologies. It is shown that grid performance is approximately the same under working load conditions, while under peak load conditions the grid with the node implementing the store-and-forward technology is more stable. The grid with nodes implementing SAF technology comes to a complete deadlock under an additional load of less than 10 percent. After a detailed study, it is shown that the traffic duel configuration does not affect the grid with cut-through nodes if the workload increases to the peak load, at which the grid comes to a complete deadlock. The execution intensity of the guns which generate malicious traffic is determined by a random function with the Poisson distribution. The modeling system CPN Tools is used for constructing models and measuring parameters. Grid performance and average package delivery time are estimated in the grid for various load options.

  11. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    Science.gov (United States)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented. The cost study is discussed based on utilization of a central experiment computer with optional auxiliary equipment. Groundrules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented. The groundrules and assumptions are analysed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  12. A comparative study on the customized design of mandibular reconstruction plates using finite element method

    Directory of Open Access Journals (Sweden)

    Abdulrahman Al-Ahmari

    2015-07-01

    Full Text Available Mandible defects and deformities are serious complications, and their precise reconstruction is one of the most challenging tasks in oral maxillofacial surgery. The commercially available standard mandible implants are manually bent before surgery to custom fit the patient's jaw. A slight mismatch in the plate and bone alignment may result in implant failure. However, with the integration of computer-aided design, rapid prototyping, and advanced imaging systems (computed tomography or magnetic resonance imaging), it is possible to produce a customized mandible implant that can precisely fit the patient's jaw. The aim of this article is to present a new design of customized mandible implant (sinewave plate) and compare it with the commonly used straight implant design. The finite element–simulated results reveal that the commonly used straight reconstruction plates are more prone to loosening of the screws due to higher strain concentration on the screw hole when compared to the newly designed sinewave reconstruction plate. Moreover, the straight plate is more sensitive to chewing load variations and develops almost 20% higher stresses when compared to the sinewave plate. The study reveals that the sinewave reconstruction plate can significantly enhance the stability and safety of the mandible implant.

  13. Algebraic computing program for studying the gauge theory

    International Nuclear Information System (INIS)

    Zet, G.

    2005-01-01

    An algebraic computing program running on Maple V platform is presented. The program is devoted to the study of the gauge theory with an internal Lie group as local symmetry. The physical quantities (gauge potentials, strength tensors, dual tensors etc.) are introduced either as equations in terms of previous defined quantities (tensors), or by manual entry of the component values. The components of the strength tensor and of its dual are obtained with respect to a given metric of the space-time used for describing the gauge theory. We choose a Minkowski space-time endowed with spherical symmetry and give some example of algebraic computing that are adequate for studying electroweak or gravitational interactions. The field equations are also obtained and their solutions are determined using the DEtools facilities of the Maple V computing program. (author)
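
    For orientation, the strength tensor that such a program computes from the gauge potentials follows the standard non-Abelian definition, shown below in the usual textbook conventions (the paper's own sign and coupling conventions may differ):

```latex
% Non-Abelian field-strength tensor and its dual (standard conventions;
% f^{abc} are the structure constants of the internal Lie group):
F^{a}_{\mu\nu} = \partial_{\mu} A^{a}_{\nu} - \partial_{\nu} A^{a}_{\mu}
               + g\, f^{abc} A^{b}_{\mu} A^{c}_{\nu},
\qquad
\tilde{F}^{a\,\mu\nu} = \tfrac{1}{2}\,\varepsilon^{\mu\nu\rho\sigma} F^{a}_{\rho\sigma}
```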

  14. A comparative analysis of multi-level computer-assisted decision making systems for traumatic injuries

    Directory of Open Access Journals (Sweden)

    Huynh Toan

    2009-01-01

    Full Text Available Abstract Background This paper focuses on the creation of a predictive computer-assisted decision making system for traumatic injury using machine learning algorithms. Trauma experts must make several difficult decisions based on a large number of patient attributes, usually in a short period of time. The aim is to compare the existing machine learning methods available for medical informatics, and develop reliable, rule-based computer-assisted decision-making systems that provide recommendations for the course of treatment for new patients, based on previously seen cases in trauma databases. Datasets of traumatic brain injury (TBI) patients are used to train and test the decision making algorithm. The work is also applicable to patients with traumatic pelvic injuries. Methods Decision-making rules are created by processing patterns discovered in the datasets, using machine learning techniques. More specifically, CART and C4.5 are used, as they provide grammatical expressions of knowledge extracted by applying logical operations to the available features. The resulting rule sets are tested against other machine learning methods, including AdaBoost and SVM. The rule creation algorithm is applied to multiple datasets, both with and without prior filtering to discover significant variables. This filtering is performed via logistic regression prior to the rule discovery process. Results For survival prediction using all variables, CART outperformed the other machine learning methods. When using only significant variables, neural networks performed best. A reliable rule-base was generated using combined C4.5/CART. The average predictive rule performance was 82% when using all variables, and approximately 84% when using significant variables only. The average performance of the combined C4.5 and CART system using significant variables was 89.7% in predicting the exact outcome (home or rehabilitation), and 93.1% in predicting the ICU length of stay for
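
    A minimal example of the CART-style rule-extraction workflow described under Methods is sketched below using scikit-learn on synthetic data. The variables and data are placeholders only; the study itself used real TBI and pelvic-injury registry variables and also evaluated C4.5, AdaBoost and SVM.

```python
# Minimal CART-style rule extraction with scikit-learn on synthetic data
# (illustrative only; not the study's registry variables or rule base).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {tree.score(X_te, y_te):.2f}")

# The fitted tree can be dumped as human-readable if/then rules:
print(export_text(tree, feature_names=[f"var_{i}" for i in range(8)]))
```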

  15. Scalar localization by cone-beam computed tomography of cochlear implant carriers: a comparative study between straight and periomodiolar precurved electrode arrays.

    Science.gov (United States)

    Boyer, Eric; Karkas, Alexandre; Attye, Arnaud; Lefournier, Virginie; Escude, Bernard; Schmerber, Sebastien

    2015-03-01

    To compare the incidence of dislocation of precurved versus straight flexible cochlear implant electrode arrays using cone-beam computed tomography (CBCT) image analyses. Consecutive nonrandomized case-comparison study. Tertiary referral center. Analyses of patients' CBCT images after cochlear implant surgery. Precurved and straight flexible electrode arrays from two different manufacturers were implanted. A round window insertion was performed in most cases. Two cases necessitated a cochleostomy. The patients' CBCT images were reconstructed in the coronal oblique, sagittal oblique, and axial oblique section. The insertion depth angle and the incidence of dislocation from the scala tympani to the scala vestibuli were determined. The CBCT images and the incidence of dislocation were analyzed in 54 patients (61 electrode arrays). Thirty-one patients were implanted with a precurved perimodiolar electrode array and 30 patients with a straight flexible electrode array. A total of nine (15%) scalar dislocations were observed in both groups. Eight (26%) scalar dislocations were observed in the precurved array group and one (3%) in the straight array group. Dislocation occurred at an insertion depth angle between 170 and 190 degrees in the precurved array group and at approximately 370 degrees in the straight array group. With precurved arrays, dislocation usually occurs in the ascending part of the basal turn of the cochlea. With straight flexible electrode arrays, the incidence of dislocation was lower, and it seems that straight flexible arrays have a higher chance of a confined position within the scala tympani than perimodiolar precurved arrays.

  16. The study of radiographic technique with low exposure using computed panoramic tomography

    International Nuclear Information System (INIS)

    Saito, Yasuhiro

    1987-01-01

    A new imaging system for the dental field that combines recent advances in both electronics and computer technologies was developed. This new imaging system is a computed panoramic tomography process based on a newly developed laser-scan system. In this study a quantitative image evaluation was performed comparing anatomical landmarks in computed panoramic tomography at a low exposure (LPT) and in conventional panoramic tomography at a routine exposure (CPT), and the following results were obtained: 1. The diagnostic value of the CPT decreased with decreasing exposure, particularly with regard to the normal anatomical landmarks of such microstructural parts as the periodontal space, lamina dura and the enamel-dentin border. 2. The LPT had high diagnostic value for all normal anatomical landmarks, averaging about twice the diagnostic value of the CPT. 3. The visual diagnostic value of the periodontal space, lamina dura, enamel-dentin border and the anatomical morphology of the teeth on the LPT was only slightly dependent on the spatial frequency enhancement rank. 4. The LPT formed images with almost the same range of density as the CPT. 5. Computed panoramic tomographs taken at a low exposure revealed more information on the trabecular bone pattern than conventional panoramic tomographs taken under routine conditions in the visual spatial frequency range (0.1 - 5.0 cycle/mm). (author) 67 refs

  17. Demographics of undergraduates studying games in the United States: a comparison of computer science students and the general population

    Science.gov (United States)

    McGill, Monica M.; Settle, Amber; Decker, Adrienne

    2013-06-01

    Our study gathered data to serve as a benchmark of demographics of undergraduate students in game degree programs. Due to the high number of programs that are cross-disciplinary with computer science programs or that are housed in computer science departments, the data is presented in comparison to data from computing students (where available) and the US population. Participants included students studying games at four nationally recognized postsecondary institutions. The results of the study indicate that there is no significant difference between the ratio of men to women studying in computing programs or in game degree programs, with women being severely underrepresented in both. Women, blacks, Hispanics/Latinos, and heterosexuals are underrepresented compared to the US population. Those with moderate and conservative political views and with religious affiliations are underrepresented in the game student population. Participants agree that workforce diversity is important and that their programs are adequately diverse, but only one-half of the participants indicated that diversity has been discussed in any of their courses.

  18. A Comparative Study of RCS Computation Codes

    National Research Council Canada - National Science Library

    Tong, Chia T; Wah, Ang T; Hwee, Lim K; Philip, Ou S; Heng, Yar K; Rowse, David; Amos, Matthew; Keen, Alan; Pegg, Neil; Thain, Andrew

    2005-01-01

    .... The first test object is a (fictitious) generic missile. It provides a test problem for benchmarking the performance of CEM codes on geometries containing real world deficiencies, such as thin bodies and sharp corners...

  19. Studi Perbandingan Layanan Cloud Computing (A Comparative Study of Cloud Computing Services)

    OpenAIRE

    Afdhal, Afdhal

    2013-01-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platforms and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing costs, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud d...

  20. Brain-computer interfacing under distraction: an evaluation study

    Science.gov (United States)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes; Müller, Klaus-Robert; Samek, Wojciech

    2016-10-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach. This paper systematically investigates BCI performance under 6 types of distractions that mimic out-of-lab environments. Main results. We report results of 16 participants and show that the performance of the standard common spatial patterns (CSP) + regularized linear discriminant analysis classification pipeline drops significantly in this ‘simulated’ out-of-lab setting. We then investigate three methods for improving the performance: (1) artifact removal, (2) ensemble classification, and (3) a 2-step classification approach. While artifact removal does not enhance the BCI performance significantly, both ensemble classification and the 2-step classification combined with CSP significantly improve the performance compared to the standard procedure. Significance. Systematically analyzing out-of-lab scenarios is crucial when bringing BCI into everyday life. Algorithms must be adapted to overcome nonstationary environments in order to tackle real-world challenges.

  1. A comparative phylogenetic study of genetics and folk music.

    Science.gov (United States)

    Pamjav, Horolma; Juhász, Zoltán; Zalán, Andrea; Németh, Endre; Damdin, Bayarlkhagva

    2012-04-01

    Computer-aided comparison of folk music from different nations is one of the newest research areas. We were intrigued to have identified some important similarities between phylogenetic studies and modern folk music. First of all, both of them use similar concepts and representation tools, such as multidimensional scaling, for modelling relationships between populations. This gave us the idea to investigate whether these connections are merely accidental or whether they mirror population migrations from the past. We raised the question: does the complex structure of musical connections display a clear picture, and can this system be interpreted through genetic analysis? This study is the first to systematically investigate the possible genetic background of the folk music connections between different populations. Paternal (42 populations) and maternal lineages (56 populations) were compared based on Fst genetic distances of the Y-chromosomal and mtDNA haplogroup frequencies. To test this hypothesis, the corresponding musical cultures were also compared using an automatic overlap analysis of parallel melody styles for 31 Eurasian nations. We found that close musical relations between populations indicate close genetic distances; maternal lineages have a more important role in folk music traditions than paternal lineages. Furthermore, the combination of these disciplines, establishing a new interdisciplinary research field of "music-genetics", can be an efficient tool to get a more comprehensive picture of the complex behaviour of populations in prehistoric times.

  2. Sampling variability in forensic likelihood-ratio computation: A simulation study

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Meuwly, Didier

    2015-01-01

    Recently, in the forensic biometric community, there is a growing interest in computing a metric called "likelihood ratio" when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score and therefore a likelihood-ratio
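
    A score-based likelihood ratio of this kind is typically obtained by modelling the score distributions under the same-source and different-source hypotheses and taking the ratio of their densities at the observed score. The sketch below illustrates that with simulated scores and simple Gaussian models (an assumption made here for brevity; the paper studies the sampling variability of such estimates).

```python
# Toy score-based likelihood ratio: fit simple Gaussian models to same-source
# and different-source comparison scores, then evaluate the LR at a new score.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
same_scores = rng.normal(loc=0.8, scale=0.10, size=1000)   # simulated genuine scores
diff_scores = rng.normal(loc=0.3, scale=0.15, size=1000)   # simulated impostor scores

f_same = norm(same_scores.mean(), same_scores.std())
f_diff = norm(diff_scores.mean(), diff_scores.std())

observed = 0.7
lr = f_same.pdf(observed) / f_diff.pdf(observed)
print(f"LR at score {observed}: {lr:.1f}  (log10 LR = {np.log10(lr):.2f})")
```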

  3. A comparative study of electrocardiogram multi-segment reconstruction and dual source computed tomography using a computer controlled coronary phantom

    International Nuclear Information System (INIS)

    Ohashi, Kazuya; Higashide, Ryo; Kunitomo, Hirosi; Ichikawa, Katsuhiro

    2011-01-01

    Currently, there are two main methods for improving the temporal resolution of coronary computed tomography (CT): electrocardiogram-gated multi-segment reconstruction (EMR) and dual source scanning using dual source CT (DSCT). We developed a motion phantom system for image quality assessment of cardiac CT to evaluate these two methods. This phantom system was designed to move an object at arbitrary speeds during a desired phase range in cyclic motion. By using this system, we obtained coronary CT images of moving objects that mimic coronary arteries. We investigated the difference in motion artifacts between EMR and DSCT using a 3-mm-diameter acrylic rod resembling a coronary artery. EMR was evaluated using 16-row multi-slice CT (16MSCT). To evaluate the image quality, we examined the degree of motion artifacts by analyzing the profiles around the rod and the displacement of a peak pixel in the rod image. In the 16MSCT, EMR caused marked increases in artifacts and displacement. In contrast, DSCT produced excellent images with fewer artifacts. The results demonstrate the validity of DSCT for improving true temporal resolution. (author)

  4. Does computer use affect the incidence of distal arm pain? A one-year prospective study using objective measures of computer use

    DEFF Research Database (Denmark)

    Mikkelsen, S.; Lassen, C. F.; Vilstrup, Imogen

    2012-01-01

    PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks mouse and keyboard time, sustained activity, speed and micropauses were recorded with a software program installed on the participants' computers. Participants reported weekly pain scores via the software program for elbow, forearm and wrist/hand as well as in a questionnaire at baseline and 1-year follow-up. Associations between pain development and computer work were examined for three pain ... were not risk factors for acute pain, nor did they modify the effects of mouse or keyboard time. Computer usage parameters were not associated with prolonged or chronic pain. A major limitation of the study was low keyboard times. CONCLUSION: Computer work was not related to the development...

  5. Multicenter study of quantitative computed tomography analysis using a computer-aided three-dimensional system in patients with idiopathic pulmonary fibrosis.

    Science.gov (United States)

    Iwasawa, Tae; Kanauchi, Tetsu; Hoshi, Toshiko; Ogura, Takashi; Baba, Tomohisa; Gotoh, Toshiyuki; Oba, Mari S

    2016-01-01

    To evaluate the feasibility of automated quantitative analysis with a three-dimensional (3D) computer-aided system (i.e., Gaussian histogram normalized correlation, GHNC) of computed tomography (CT) images from different scanners. Each institution's review board approved the research protocol. Informed patient consent was not required. The participants in this multicenter prospective study were 80 patients (65 men, 15 women) with idiopathic pulmonary fibrosis. Their mean age was 70.6 years. Computed tomography (CT) images were obtained by four different scanners set at different exposures. We measured the extent of fibrosis using GHNC, and used Pearson's correlation analysis, Bland-Altman plots, and kappa analysis to directly compare the GHNC results with manual scoring by radiologists. Multiple linear regression analysis was performed to determine the association between the CT data and forced vital capacity (FVC). For each scanner, the extent of fibrosis as determined by GHNC was significantly correlated with the radiologists' score. In multivariate analysis, the extent of fibrosis as determined by GHNC was significantly correlated with FVC (p < 0.001). There was no significant difference between the results obtained using different CT scanners. Gaussian histogram normalized correlation was feasible, irrespective of the type of CT scanner used.
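
    The agreement analysis mentioned in this record (Pearson correlation and Bland-Altman statistics between the automated GHNC scores and the radiologists' scores) reduces to a few lines of arithmetic. The scores below are invented placeholders, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical fibrosis extent (%) from the automated system and from radiologists.
    automated = np.array([12.0, 20.5, 33.0, 8.5, 27.0, 41.0, 15.5, 22.0])
    manual    = np.array([10.0, 22.0, 30.0, 9.0, 25.5, 44.0, 14.0, 24.5])

    r, p = pearsonr(automated, manual)

    diff = automated - manual
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits of agreement

    print(f"Pearson r = {r:.3f} (p = {p:.4f})")
    print(f"Bland-Altman bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")
    ```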

  6. Development of highly potent melanogenesis inhibitor by in vitro, in vivo and computational studies

    Directory of Open Access Journals (Sweden)

    Abbas Q

    2017-07-01

    Full Text Available Qamar Abbas,1 Zaman Ashraf,2 Mubashir Hassan,1 Humaira Nadeem,3 Muhammad Latif,4 Samina Afzal,5 Sung-Yum Seo1 1Department of Biology, College of Natural Sciences, Kongju National University, Gongju, Republic of Korea; 2Department of Chemistry, Allama Iqbal Open University, Islamabad, 3Riphah Institute of Pharmaceutical Sciences, Riphah International University, Islamabad, Pakistan; 4Center for Genetics and Inherited Diseases, Taibah University, Almadinah Almunawwarah, Kingdom of Saudi Arabia; 5Faculty of Pharmacy, Bahauddin Zakria University, Multan, Pakistan Abstract: The present work describes the synthesis of few hydroxylated amide derivatives as melanogenesis inhibitors. In vitro, in vivo and computational studies proved that compound 6d is a highly potent melanogenesis inhibitor compared to standard kojic acid. The title amides 4a–e and 6a–e were synthesized following simple reaction routes with excellent yields. Most of the synthesized compounds exhibited good mushroom tyrosinase inhibitory activity, but compound 6d showed excellent activity (IC50 0.15 µM compared to standard kojic acid (IC50 16.69 µM. Lineweaver–Burk plots were used for the determination of kinetic mechanism, and it was found that compounds 4c and 6d showed non-competitive inhibition while 6a and 6b showed mixed-type inhibition. The kinetic mechanism further revealed that compound 6d formed irreversible complex with the target enzyme tyrosinase. The Ki values determined for compounds 4c, 6a, 6b and 6d are 0.188, 0.84, 2.20 and 0.217 µM respectively. Results of human tyrosinase inhibitory activity in A375 human melanoma cells showed that compound 6d exhibited 91.9% inhibitory activity at a concentration of 50 µg/mL. In vivo cytotoxicity evaluation of compound 6d in zebrafish embryos showed that it is non-toxic to zebrafish. Melanin depigmentation assay performed in zebrafish indicated that compound 6d possessed greater potential in decreasing melanin contents
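
    The IC50 values quoted in this record are typically obtained by fitting a sigmoidal dose-response curve to percent-inhibition data. The sketch below shows a common four-parameter logistic fit; the concentrations and inhibition values are invented for illustration and are not the study's measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, bottom, top, ic50, hill):
        """Four-parameter logistic dose-response curve (inhibition rises with concentration)."""
        return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

    # Hypothetical tyrosinase inhibition data: concentration (uM) vs. % inhibition.
    conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
    inhibition = np.array([5.0, 12.0, 38.0, 62.0, 81.0, 92.0, 96.0])

    popt, _ = curve_fit(four_pl, conc, inhibition, p0=[0.0, 100.0, 0.2, 1.0])
    bottom, top, ic50, hill = popt
    print(f"Fitted IC50 = {ic50:.3f} uM (Hill slope = {hill:.2f})")
    ```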

  7. Ranking of radioimmunoscintigraphy compared with computed tomography in diagnosis of present and recurrent colorectal tumours

    International Nuclear Information System (INIS)

    Barzen, G.; Zwicker, C.; Neumann, K.; Richter, W.; Hierholzer, J.; Langer, M.; Felix, R.; Loehde, E.; Raakow, R.; Boese-Landgraf, W.

    1992-01-01

    Radioimmunoscintigraphy (RIS, the scintigraphic 'specific' imaging of benign and malignant diseases by means of radiolabelled monoclonal antibodies) has been performed in Germany in clinical studies since 1985 in patients suffering from colorectal cancer. After having been successfully proven in primary studies, RIS is now being used in the early diagnosis of recurrences and metastases. In the prospective study presented here, the clinical usefulness of RIS was assessed in comparison with well-tried diagnostic methods, including computed tomography, in patients suffering from colorectal cancer. It was shown that RIS in SPECT technique (single photon emission computed tomography) with 99mTc-labelled monoclonal CEA antibodies can visualise local recurrences when diagnostic findings are doubtful, with a sensitivity of 78% versus 50% for CT findings. (orig.)

  8. Computational study of a High Pressure Turbine Nozzle/Blade Interaction

    Science.gov (United States)

    Kopriva, James; Laskowski, Gregory; Sheikhi, Reza

    2015-11-01

    A downstream high pressure turbine blade has been designed for this study to be coupled with the upstream uncooled nozzle of Arts and Rouvroit [1992]. The computational domain is first held to a pitch-line section that includes no centrifugal forces (linear sliding-mesh). The stage geometry is intended to study the fundamental nozzle/blade interaction in a computationally cost-efficient manner. A blade/nozzle count of 2:1 is designed to maintain computational periodic boundary conditions for the coupled problem. Next, the geometry is extended to a fully 3D domain with endwalls to understand the impact of secondary flow structures. A set of systematic computational studies is presented to understand the impact of turbulence on the nozzle and downstream blade boundary layer development, the resulting heat transfer, and downstream wake mixing in the absence of cooling. Doing so will provide a much better understanding of stage mixing losses and wall heat transfer which, in turn, can allow for improved engine performance. Computational studies are performed using the WALE (Wall-Adapting Local Eddy-viscosity), IDDES (Improved Delayed Detached Eddy Simulation), and SST (Shear Stress Transport) models in Fluent.

  9. Comparative study on developmental stages of the clavicle by postmortem MRI and CT imaging

    DEFF Research Database (Denmark)

    Larsen, Sara Tangmose; Lynnerup, Niels; Jensen, K.E.

    2013-01-01

    Objectives: The developmental stages of the clavicles are important for forensic age estimation purposes in adolescents. This study compares the 4-stage system to evaluate the ossification of the medial end of the clavicle as visualized by magnetic resonance imaging (MRI) and computed tomography (CT). As several forensic institutes routinely perform CT scans, the large amount of available data may serve as reference sample for MRI in specific cases. Material and methods: This prospective study included an MRI and CT scan of 47 autopsy cases performed prior to medico-legal autopsy (age range...

  10. Kidney lower pole pelvicaliceal anatomy: comparative analysis between intravenous urogram and three-dimensional helical computed tomography.

    Science.gov (United States)

    Rachid Filho, Daibes; Favorito, Luciano A; Costa, Waldemar S; Sampaio, Francisco J B

    2009-12-01

    The aim of our study was to evaluate whether three-dimensional helical computed tomography (3D-HCT) offers any advantage over intravenous urogram (IVU) in the morphometric and morphological analysis of the lower pole spatial anatomy of the kidney. We analyzed 52 renal collecting systems in 30 patients, ranging in age from 23 to 80 years. The study compared the following features: (1) the angle formed between the lower infundibulum and the renal pelvis (i.e., the lower infundibulum-pelvic angle [IPA]), (2) the lower infundibulum diameter (ID), and (3) the spatial distribution and number of lower pole calices (i.e., caliceal distribution [CD]). The study started with the 3D-HCT images, which were obtained for subsequent reconstruction and analysis. Afterward, we obtained anteroposterior and oblique IVU images. For IPA (in degrees) we found a mean +/- standard deviation (SD) value of 75.79 +/- 15.3 with 3D-HCT and 77.4 +/- 17.17 with IVU, a difference that was not statistically significant. For ID (in mm) we found a mean +/- SD value of 7.5 +/- 2.92 with 3D-HCT and 8.15 +/- 3.27 with IVU. For CD we found a mean +/- SD value of 2.37 +/- 0.75 calices with 3D-HCT and 2.43 +/- 0.67 calices with IVU. On analyzing the difference between 3D-HCT and IVU, we found a mean +/- SD value of 0.06 +/- 0.51, and we verified that 74.5% of the examinations compared did not present a statistically significant difference, with a Wilcoxon p-value of 0.405. Although 3D-HCT is more precise for studying calculus location, tumors, and vessels, IVU was demonstrated to be as precise as 3D-HCT for studying the lower pole spatial anatomy. We did not observe any statistically significant difference in the measurements of IPA, ID, and CD obtained using 3D-HCT when compared with those obtained using IVU. Therefore, 3D-HCT does not present any advantage over IVU in the evaluation of lower pole caliceal anatomy.

  11. Computed laminography and reconstruction algorithm

    International Nuclear Information System (INIS)

    Que Jiemin; Cao Daquan; Zhao Wei; Tang Xiao

    2012-01-01

    Computed laminography (CL) is an alternative to computed tomography if large objects are to be inspected with high resolution. This is especially true for planar objects. In this paper, we set up a new scanning geometry for CL and study the algebraic reconstruction technique (ART) for CL imaging. We compare the results of ART with different weighting functions by computer simulation with a digital phantom. The results show that the ART algorithm is a good choice for the CL system. (authors)
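
    The algebraic reconstruction technique (ART) studied here updates the image estimate one projection equation at a time (the Kaczmarz method), optionally with a relaxation/weighting factor. The sketch below is a generic, minimal ART loop on a toy linear system; it is not the authors' CL-specific implementation or their weighting functions.

    ```python
    import numpy as np

    def art(A, b, n_iter=50, relax=0.5, x0=None):
        """Basic ART (Kaczmarz) solver for A @ x = b with relaxation factor `relax`."""
        m, n = A.shape
        x = np.zeros(n) if x0 is None else x0.astype(float).copy()
        row_norms = (A * A).sum(axis=1)
        for _ in range(n_iter):
            for i in range(m):
                if row_norms[i] == 0.0:
                    continue
                residual = b[i] - A[i] @ x
                x += relax * residual / row_norms[i] * A[i]   # project onto the i-th hyperplane
        return x

    # Toy example: a 4-pixel "image" observed through 6 weighted ray sums.
    rng = np.random.default_rng(0)
    x_true = np.array([1.0, 0.0, 0.5, 2.0])
    A = rng.random((6, 4))
    b = A @ x_true

    x_rec = art(A, b, n_iter=200)
    print("true :", x_true)
    print("recon:", np.round(x_rec, 3))
    ```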

  12. A comparative approach to computer aided design model of a dog femur.

    Science.gov (United States)

    Turamanlar, O; Verim, O; Karabulut, A

    2016-01-01

    Computer-assisted technologies offer new opportunities in medical imaging and rapid prototyping in biomechanical engineering. Three-dimensional (3D) modelling of soft tissues and bones is becoming more important. The accuracy of the analysis in modelling processes depends on the outline of the tissues derived from medical images. The aim of this study is to evaluate the accuracy of 3D models of a dog femur derived from computed tomography data, using the point cloud method and the boundary line method in several modelling software packages. Solidworks, Rapidform and 3DSMax were used to create the 3D models, and the outcomes were evaluated statistically. The most accurate 3D prototype of the dog femur was created with the stereolithography method using a rapid prototyping device. Furthermore, the linearity of the model volumes was investigated between the software models and the constructed models. The difference between the software models and the real models reflects the sensitivity of the software and the devices used.

  13. Diagnosis of lumbar disc hernia with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizumi, Atsuro; Ohira, Nobuhiro; Ojima, Tadashi; Oshida, Midori; Horaguchi, Mitsuru (Tohoku Rosai Hospital, Sendai (Japan))

    1982-07-01

    Results of computed tomography performed on patients with clinically diagnosed hernia were compared with those of myelography and operative findings. This comparative study suggested that computed tomography is quite different from other methods and very useful in diagnosis of hernia. Some cases of hernia were shown, and the characteristics of CT were reviewed.

  14. Numerical study comparing RANS and LES approaches on a circulation control airfoil

    International Nuclear Information System (INIS)

    Rumsey, Christopher L.; Nishino, Takafumi

    2011-01-01

    Highlights: → RANS compared with LES for circulation control airfoil. → RANS turbulence models need to account for streamline curvature. → RANS models yield higher lift than LES in spite of predicting similar jet separation. - Abstract: A numerical study over a nominally two-dimensional circulation control airfoil is performed using a large-eddy simulation code and two Reynolds-averaged Navier-Stokes codes. Different Coanda jet blowing conditions are investigated. In addition to investigating the influence of grid density, a comparison is made between incompressible and compressible flow solvers. The incompressible equations are found to yield negligible differences from the compressible equations up to at least a jet exit Mach number of 0.64. The effects of different turbulence models are also studied. Models that do not account for streamline curvature effects tend to predict jet separation from the Coanda surface too late, and can produce non-physical solutions at high blowing rates. Three different turbulence models that account for streamline curvature are compared with each other and with large eddy simulation solutions. All three models are found to predict the Coanda jet separation location reasonably well, but one of the models predicts specific flow field details near the Coanda surface prior to separation much better than the other two. All Reynolds-averaged Navier-Stokes computations produce higher circulation than large eddy simulation computations, with different stagnation point location and greater flow acceleration around the nose onto the upper surface. The precise reasons for the higher circulation are not clear, although it is not solely a function of predicting the jet separation location correctly.

  15. Numerical study comparing RANS and LES approaches on a circulation control airfoil

    Energy Technology Data Exchange (ETDEWEB)

    Rumsey, Christopher L., E-mail: c.l.rumsey@nasa.gov [Computational AeroSciences Branch, NASA Langley Research Center, Hampton, VA 23681-2199 (United States); Nishino, Takafumi [Advanced Supercomputing Division, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States)

    2011-10-15

    Highlights: > RANS compared with LES for circulation control airfoil. > RANS turbulence models need to account for streamline curvature. > RANS models yield higher lift than LES in spite of predicting similar jet separation. - Abstract: A numerical study over a nominally two-dimensional circulation control airfoil is performed using a large-eddy simulation code and two Reynolds-averaged Navier-Stokes codes. Different Coanda jet blowing conditions are investigated. In addition to investigating the influence of grid density, a comparison is made between incompressible and compressible flow solvers. The incompressible equations are found to yield negligible differences from the compressible equations up to at least a jet exit Mach number of 0.64. The effects of different turbulence models are also studied. Models that do not account for streamline curvature effects tend to predict jet separation from the Coanda surface too late, and can produce non-physical solutions at high blowing rates. Three different turbulence models that account for streamline curvature are compared with each other and with large eddy simulation solutions. All three models are found to predict the Coanda jet separation location reasonably well, but one of the models predicts specific flow field details near the Coanda surface prior to separation much better than the other two. All Reynolds-averaged Navier-Stokes computations produce higher circulation than large eddy simulation computations, with different stagnation point location and greater flow acceleration around the nose onto the upper surface. The precise reasons for the higher circulation are not clear, although it is not solely a function of predicting the jet separation location correctly.

  16. Optimization of scaffold design for bone tissue engineering: A computational and experimental study.

    Science.gov (United States)

    Dias, Marta R; Guedes, José M; Flanagan, Colleen L; Hollister, Scott J; Fernandes, Paulo R

    2014-04-01

    In bone tissue engineering, the scaffold must not only allow the diffusion of cells, nutrients and oxygen but also provide adequate mechanical support. One way to ensure the scaffold has the right properties is to use computational tools to design such a scaffold coupled with additive manufacturing to build the scaffolds to the resulting optimized design specifications. In this study, a topology optimization algorithm is proposed as a technique to design scaffolds that meet specific requirements for mass transport and mechanical load bearing. Several micro-structures obtained computationally are presented. Designed scaffolds were then built using selective laser sintering and the actual features of the fabricated scaffolds were measured and compared to the designed values. It was possible to obtain scaffolds with an internal geometry that reasonably matched the computational design (within 14% of porosity target, 40% for strut size and 55% for throat size in the building direction and 15% for strut size and 17% for throat size perpendicular to the building direction). These results support the use of this kind of computational algorithm to design optimized scaffolds with specific target properties and confirm the value of these techniques for bone tissue engineering. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.

  17. Gradient matching methods for computational inference in mechanistic models for systems biology: a review and comparative analysis

    Directory of Open Access Journals (Sweden)

    Benn eMacdonald

    2015-11-01

    Full Text Available Parameter inference in mathematical models of biological pathways, expressed as coupled ordinary differential equations (ODEs), is a challenging problem in contemporary systems biology. Conventional methods involve repeatedly solving the ODEs by numerical integration, which is computationally onerous and does not scale up to complex systems. Aimed at reducing the computational costs, new concepts based on gradient matching have recently been proposed in the computational statistics and machine learning literature. In a preliminary smoothing step, the time series data are interpolated; then, in a second step, the parameters of the ODEs are optimised so as to minimise some metric measuring the difference between the slopes of the tangents to the interpolants and the time derivatives from the ODEs. In this way, the ODEs never have to be solved explicitly. This review provides a concise methodological overview of the current state-of-the-art methods for gradient matching in ODEs, followed by an empirical comparative evaluation based on a set of widely used and representative benchmark data.
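
    The two-step gradient-matching idea summarised in this record can be sketched in a few lines: smooth the data with an interpolant, then choose ODE parameters that make the model right-hand side match the interpolant's slopes, without ever integrating the ODE. The example below applies that idea to synthetic logistic-growth data; the model, smoother and optimiser are illustrative assumptions, not one of the reviewed methods.

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline
    from scipy.optimize import minimize

    # Synthetic noisy observations of logistic growth dx/dt = r*x*(1 - x/K), x(0) = 1.
    rng = np.random.default_rng(0)
    r_true, K_true = 0.8, 10.0
    t = np.linspace(0.0, 10.0, 40)
    x_true = K_true / (1.0 + (K_true - 1.0) * np.exp(-r_true * t))
    x_obs = x_true + rng.normal(scale=0.2, size=t.size)

    # Step 1: smooth the data and take slopes of the interpolant.
    spline = UnivariateSpline(t, x_obs, s=1.0)
    x_smooth = spline(t)
    dx_smooth = spline.derivative()(t)

    # Step 2: match the ODE right-hand side to the interpolant slopes (no integration needed).
    def objective(theta):
        r, K = theta
        rhs = r * x_smooth * (1.0 - x_smooth / K)
        return np.sum((dx_smooth - rhs) ** 2)

    res = minimize(objective, x0=[0.5, 5.0], method="Nelder-Mead")
    print("estimated (r, K):", np.round(res.x, 3), "  true:", (r_true, K_true))
    ```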

  18. Computer Crimes: A Case Study of What Malaysia Can Learn from Others?

    Directory of Open Access Journals (Sweden)

    Janaletchumi Appudurai

    2007-06-01

    Full Text Available Rapid development of information technology (IT) has brought with it many new applications such as e-commerce and global business. The past few years have seen activities in the legislative arena covering issues such as digital signatures, the international recognition of electronic documents, and privacy and data protection. Both developed and developing countries have exhibited keenness to embrace the IT environment. Securing this electronic environment from intrusion, however, continues to be problematic. A particularly common form of computer crime is hacking. As more computer systems move to on-line processing and improved telecommunications, computer hackers are now a real threat. Legislation criminalizing intrusion and destruction activities directed at computers is needed. Malaysia joined the list of countries with computer-specific legislation with the enactment of its Computer Crime Act 1997 (CCA). This paper focuses on hacking as a criminal act and compares the Malaysian CCA with legislation from other countries. It also examines the current computer crime situation in Malaysia and exposes the difficulties and obstacles Malaysia faces in enforcing the Act. The paper concludes with recommendations for Malaysia in terms of policy, practices and penalties.

  19. A Comparative Study of Human Saposins

    Directory of Open Access Journals (Sweden)

    María Garrido-Arandia

    2018-02-01

    Full Text Available Saposins are small proteins implicated in trafficking and loading of lipids onto Cluster of Differentiation 1 (CD1) receptor proteins that in turn present lipid antigens to T cells and a variety of T-cell receptors, thus playing a crucial role in innate and adaptive immune responses in humans. Despite their low sequence identity, the four types of human saposins share a similar folding pattern consisting of four helices linked by three conserved disulfide bridges. However, their lipid-binding abilities, as well as their activities in extracting, transporting and loading a variety of sphingo- and phospholipids from biological membranes onto CD1 molecules, display two striking characteristics: a strong pH-dependence and a structural change between a compact, closed conformation and an open conformation. In this work, we present a comparative computational study of structural, electrostatic, and dynamic features of human saposins based upon their available experimental structures. By means of structural alignments, surface analyses, calculation of pH-dependent protonation states, Poisson-Boltzmann electrostatic potentials, and molecular dynamics simulations at three pH values representative of the biological media where saposins fulfill their function, our results shed light on their intrinsic features. The similarities and differences in this class of proteins depend on tiny variations of local structural details that allow saposins to be key players in triggering responses in the human immune system.

  20. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)
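
    Codes of this kind typically evaluate system unavailability from fault-tree models of the plant systems. The basic calculation they automate can be illustrated with a tiny minimal-cut-set evaluation using the rare-event approximation; the cut sets and failure probabilities below are invented for illustration and are unrelated to the two reference systems studied.

    ```python
    from itertools import combinations

    # Hypothetical basic-event failure probabilities (assumed independent).
    p = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 2e-4}

    # Hypothetical minimal cut sets of the top event (system failure).
    cut_sets = [("pump_A", "pump_B"), ("valve",), ("power",)]

    def cut_set_prob(cs):
        prob = 1.0
        for event in cs:
            prob *= p[event]
        return prob

    # Rare-event approximation: sum of cut-set probabilities.
    q_rare = sum(cut_set_prob(cs) for cs in cut_sets)

    # Second-order correction (inclusion-exclusion over pairs of cut sets).
    q_pairs = sum(cut_set_prob(tuple(set(a) | set(b))) for a, b in combinations(cut_sets, 2))

    print(f"Top-event probability (rare-event approx.): {q_rare:.3e}")
    print(f"With second-order correction:               {q_rare - q_pairs:.3e}")
    ```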

  1. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  2. Does computer use affect the incidence of distal arm pain? A one-year prospective study using objective measures of computer use

    DEFF Research Database (Denmark)

    Mikkelsen, Sigurd; Lassen, Christina Funch; Vilstrup, Imogen

    2012-01-01

    PURPOSE: To study how objectively recorded mouse and keyboard activity affects distal arm pain among computer workers. METHODS: Computer activities were recorded among 2,146 computer workers. For 52 weeks mouse and keyboard time, sustained activity, speed and micropauses were recorded with a software program installed on the participants' computers. Participants reported weekly pain scores via the software program for elbow, forearm and wrist/hand as well as in a questionnaire at baseline and 1-year follow-up. Associations between pain development and computer work were examined for three pain ... were not risk factors for acute pain, nor did they modify the effects of mouse or keyboard time. Computer usage parameters were not associated with prolonged or chronic pain. A major limitation of the study was low keyboard times. CONCLUSION: Computer work was not related to the development...

  3. 18F-fluorodeoxyglucose positron emission tomography-computed tomography for preoperative lymph node staging in patients undergoing radical cystectomy for bladder cancer: a prospective study.

    Science.gov (United States)

    Hitier-Berthault, Maryam; Ansquer, Catherine; Branchereau, Julien; Renaudin, Karine; Bodere, Françoise; Bouchot, Olivier; Rigaud, Jérôme

    2013-08-01

    The objective of our study was to analyze the diagnostic performance of 18F-fluorodeoxyglucose positron emission tomography-computed tomography for lymph node staging in patients with bladder cancer before radical cystectomy and to compare it with that of computed tomography. A total of 52 patients operated on between 2005 and 2010 were prospectively included in this prospective, mono-institutional, open, non-randomized pilot study. Patients who had received neoadjuvant chemotherapy or radiotherapy were excluded. 18F-fluorodeoxyglucose positron emission tomography-computed tomography in addition to computed tomography was carried out for lymph node staging of bladder cancer before radical cystectomy. Lymph node dissection during radical cystectomy was carried out. Findings from 18F-fluorodeoxyglucose positron emission tomography-computed tomography and computed tomography were compared with the results of definitive histological examination of the lymph node dissection. The diagnostic performance of the two imaging modalities was assessed and compared. The mean number of lymph nodes removed during lymph node dissection was 16.5 ± 10.9. Lymph node metastasis was confirmed on histological examination in 22 cases (42.3%). This had been suspected in five cases (9.6%) on computed tomography and in 12 cases (23.1%) on 18F-fluorodeoxyglucose positron emission tomography-computed tomography. Sensitivity, specificity, positive predictive value, negative predictive value, relative risk and accuracy were 9.1%, 90%, 40%, 57.4%, 0.91 and 55.7%, respectively, for computed tomography, and 36.4%, 86.7%, 66.7%, 65%, 2.72 and 65.4%, respectively, for 18F-fluorodeoxyglucose positron emission tomography-computed tomography. 18F-fluorodeoxyglucose positron emission tomography-computed tomography is more reliable than computed tomography for preoperative lymph node staging in patients with invasive bladder carcinoma undergoing radical cystectomy. © 2012 The Japanese
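
    The diagnostic performance figures quoted above all derive from a 2x2 table of imaging findings versus histology. The short sketch below recomputes the metrics for the PET-CT arm from counts reconstructed out of the reported percentages (52 patients, 22 node-positive, 12 suspected on PET-CT); treat the counts as an illustration of the arithmetic rather than published data.

    ```python
    # Reconstructed 2x2 table for the PET-CT arm: imaging result vs. confirmed node status.
    tp, fp, fn, tn = 8, 4, 14, 26   # true/false positives and negatives

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)            # positive predictive value
    npv = tn / (tn + fn)            # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)

    print(f"sensitivity={sensitivity:.1%}  specificity={specificity:.1%}")
    print(f"PPV={ppv:.1%}  NPV={npv:.1%}  accuracy={accuracy:.1%}")
    ```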

  4. Diagnosis of lumbar disc hernia with computed tomography

    International Nuclear Information System (INIS)

    Yoshizumi, Atsuro; Ohira, Nobuhiro; Ojima, Tadashi; Oshida, Midori; Horaguchi, Mitsuru

    1982-01-01

    Results of computed tomography performed on patients with clinically diagnosed hernia were compared with those of myelography and operative findings. This comparative study suggested that computed tomography is quite different from other methods and very useful in diagnosis of hernia. Some cases of hernia were shown, and the characteristics of CT were reviewed. (Ueda, J.)

  5. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  6. Computer use and stress, sleep disturbances, and symptoms of depression among young adults--a prospective cohort study.

    Science.gov (United States)

    Thomée, Sara; Härenstam, Annika; Hagberg, Mats

    2012-10-22

    We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate if high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. The study group was a cohort of young adults (n = 4163), 20-24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Both high and medium computer use compared to low computer use at baseline were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances for the men. For the women, high email/chat use was (positively) associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. Using the computer at night and consequently losing sleep was associated with most mental health outcomes for both men

  7. Computer Assisted Fluid Power Instruction: A Comparison of Hands-On and Computer-Simulated Laboratory Experiences for Post-Secondary Students

    Science.gov (United States)

    Wilson, Scott B.

    2005-01-01

    The primary purpose of this study was to examine the effectiveness of utilizing a combination of lecture and computer resources to train personnel to assume roles as hydraulic system technicians and specialists in the fluid power industry. This study compared computer simulated laboratory instruction to traditional hands-on laboratory instruction,…

  8. DISTANCE LEARNERS' PERCEPTIONS OF COMPUTER MEDIATED COMMUNICATION

    OpenAIRE

    Mujgan Bozkaya; Irem Erdem Aydin

    2011-01-01

    In this study, perspectives of the first year students in the completely online Information Management Associate Degree Program at Anadolu University regarding computer as a communication medium were investigated. Students' perspectives on computer-mediated communications were analyzed in the light of three different views in the area of computer-mediated communications: The first view suggests that face-to-face settings are better communication environments compared to computer-mediated envi...

  9. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    Science.gov (United States)

    Brown, Gail Laverne

    The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9--11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys, and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other levels indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skills levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.
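
    The pre/post comparisons described in this record rely on a paired t-test for the perceived-skill scores and McNemar's test for paired yes/no items. A minimal sketch with invented data follows; scipy and statsmodels are assumed to be available, and the variable names are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import ttest_rel
    from statsmodels.stats.contingency_tables import mcnemar

    # Hypothetical perceived skill levels (1-5) before and after the 30-hour training.
    pre  = np.array([2, 1, 3, 2, 2, 1, 3, 2, 1, 2])
    post = np.array([4, 3, 4, 3, 4, 2, 4, 3, 3, 4])
    t_stat, p_val = ttest_rel(post, pre)
    print(f"paired t-test: t = {t_stat:.2f}, p = {p_val:.4f}")

    # Hypothetical paired yes/no item: rows = before (no, yes), columns = after (no, yes).
    table = np.array([[12, 15],   # no->no, no->yes
                      [ 2,  5]])  # yes->no, yes->yes
    result = mcnemar(table, exact=True)
    print(f"McNemar: statistic = {result.statistic}, p = {result.pvalue:.4f}")
    ```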

  10. Programming PHREEQC calculations with C++ and Python a comparative study

    Science.gov (United States)

    Charlton, Scott R.; Parkhurst, David L.; Muller, Mike

    2011-01-01

    The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
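
    The tight-coupling pattern described here, passing input strings to an in-memory IPhreeqc instance and reading results back without intermediate files, looks roughly like the sketch below when the phreeqpy wrapper is used from Python. The method names (load_database, run_string, get_selected_output_array) and the database filename are assumptions based on the wrapper's commonly documented interface and may differ in a given installation.

    ```python
    # Sketch of tight coupling with IPhreeqc via the phreeqpy wrapper (assumed API).
    from phreeqpy.iphreeqc.phreeqc_dll import IPhreeqc

    phreeqc = IPhreeqc()
    phreeqc.load_database("phreeqc.dat")   # assumed path to a PHREEQC database file

    input_string = """
    SOLUTION 1
        temp  25
        pH    7.0
        Na    10
        Cl    10
    SELECTED_OUTPUT
        -pH true
        -totals Na Cl
    END
    """

    # All data exchange happens in memory: no input/output files, no external process.
    phreeqc.run_string(input_string)
    results = phreeqc.get_selected_output_array()
    for row in results:
        print(row)
    ```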

  11. NASA Computational Case Study: The Flight of Friendship 7

    Science.gov (United States)

    Simpson, David G.

    2012-01-01

    In this case study, we learn how to compute the position of an Earth-orbiting spacecraft as a function of time. As an exercise, we compute the position of John Glenn's Mercury spacecraft Friendship 7 as it orbited the Earth during the third flight of NASA's Mercury program.
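
    Computing an orbiting spacecraft's position as a function of time reduces to solving Kepler's equation for the eccentric anomaly and converting it to coordinates in the orbital plane. The sketch below does this for an illustrative near-circular low-Earth orbit; the orbital elements are assumptions loosely resembling a Mercury-era mission, not the case study's data.

    ```python
    import numpy as np

    MU_EARTH = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6378.137e3             # equatorial radius, m

    def position_in_orbit_plane(t, a, e, t_peri=0.0):
        """Position (x, y) in the orbital plane at time t for a Keplerian orbit."""
        n = np.sqrt(MU_EARTH / a**3)             # mean motion
        M = n * (t - t_peri)                     # mean anomaly
        E = M
        for _ in range(20):                      # Newton iteration for Kepler's equation
            E = E - (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
        x = a * (np.cos(E) - e)
        y = a * np.sqrt(1.0 - e**2) * np.sin(E)
        return x, y

    # Illustrative near-circular low-Earth orbit (roughly 200 km altitude).
    a = R_EARTH + 210e3
    e = 0.008
    period = 2 * np.pi * np.sqrt(a**3 / MU_EARTH)
    print(f"orbital period: {period / 60:.1f} min")

    for t in np.linspace(0.0, period, 5):
        x, y = position_in_orbit_plane(t, a, e)
        print(f"t = {t:7.1f} s   x = {x/1e3:9.1f} km   y = {y/1e3:9.1f} km")
    ```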

  12. Pair Interactions and Mode of Communication: Comparing Face-to-Face and Computer Mediated Communication

    Science.gov (United States)

    Tan, Lan Liana; Wigglesworth, Gillian; Storch, Neomy

    2010-01-01

    In today's second language classrooms, students are often asked to work in pairs or small groups. Such collaboration can take place face-to-face, but now more often via computer mediated communication. This paper reports on a study which investigated the effect of the medium of communication on the nature of pair interaction. The study involved…

  13. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
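
    The reliability calculation underlying this approach, propagating a discrete-time Markov chain over component states and reading off the probability of not yet having entered a failure state, can be sketched as below. The three-state chain and its transition probabilities are invented for illustration and are not tied to COSMIC-FFP or any particular tool architecture.

    ```python
    import numpy as np

    # Discrete-time Markov chain over component states: 0 = OK, 1 = degraded, 2 = failed (absorbing).
    P = np.array([
        [0.95, 0.04, 0.01],
        [0.00, 0.90, 0.10],
        [0.00, 0.00, 1.00],
    ])

    state = np.array([1.0, 0.0, 0.0])    # start in the OK state
    for step in range(1, 11):
        state = state @ P
        reliability = 1.0 - state[2]     # probability of not having failed yet
        print(f"step {step:2d}: reliability = {reliability:.4f}")
    ```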

  14. Comparative study on γ-ray spectrum by several filtering method

    International Nuclear Information System (INIS)

    Yuan Xinyu; Liu Liangjun; Zhou Jianliang

    2011-01-01

    A comparative study was conducted on gamma-ray spectra processed with several commonly used smoothing methods in order to illustrate their filtering effects. The results showed that energy-domain filtering widened peaks and increased peak overlap in the γ-ray spectrum. In the frequency domain, the filter and its parameters must be chosen carefully. Wavelet transformation preserves the signal in the high-frequency region well. By comparison, the improved threshold method combines the advantages of the hard and soft threshold methods and is suitable for detecting weak peaks. A new filter based on the gravity-model approach was also put forward, with its denoising level monitored by the standard deviation. This method not only preserved the signal and the net peak areas well, but also achieved better results with a simple computer program. (authors)
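
    A minimal way to reproduce the kind of comparison described in this record is to generate a synthetic peak-plus-noise spectrum and contrast two smoothing filters, checking how well each preserves peak height (and hence net peak area). The sketch below uses a moving average and a Savitzky-Golay filter as stand-ins; the paper's wavelet-threshold and gravity-model filters are not reproduced here.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    rng = np.random.default_rng(0)

    # Synthetic gamma-ray spectrum: two Gaussian peaks on a flat background plus counting noise.
    ch = np.arange(512)
    spectrum = (200 * np.exp(-0.5 * ((ch - 150) / 4.0) ** 2)
                + 80 * np.exp(-0.5 * ((ch - 300) / 6.0) ** 2)
                + 20.0)
    noisy = rng.poisson(spectrum).astype(float)

    # Filter 1: simple moving average (tends to broaden peaks).
    window = 9
    moving_avg = np.convolve(noisy, np.ones(window) / window, mode="same")

    # Filter 2: Savitzky-Golay (better at preserving peak height and width).
    savgol = savgol_filter(noisy, window_length=9, polyorder=3)

    for name, filtered in (("moving average", moving_avg), ("Savitzky-Golay", savgol)):
        peak_height = filtered[140:160].max()
        print(f"{name:15s} peak-1 height = {peak_height:7.1f} (true = {spectrum[150]:.1f})")
    ```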

  15. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  16. Computer use changes generalization of movement learning.

    Science.gov (United States)

    Wei, Kunlin; Yan, Xiang; Kong, Gaiqing; Yin, Cong; Zhang, Fan; Wang, Qining; Kording, Konrad Paul

    2014-01-06

    Over the past few decades, one of the most salient lifestyle changes for us has been the use of computers. For many of us, manual interaction with a computer occupies a large portion of our working time. Through neural plasticity, this extensive movement training should change our representation of movements (e.g., [1-3]), just like search engines affect memory [4]. However, how computer use affects motor learning is largely understudied. Additionally, as virtually all participants in studies of perception and actions are computer users, a legitimate question is whether insights from these studies bear the signature of computer-use experience. We compared non-computer users with age- and education-matched computer users in standard motor learning experiments. We found that people learned equally fast but that non-computer users generalized significantly less across space, a difference negated by two weeks of intensive computer training. Our findings suggest that computer-use experience shaped our basic sensorimotor behaviors, and this influence should be considered whenever computer users are recruited as study participants. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. A primary study on the increasing of efficiency in the computer cooling system by means of external air

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. H.; Kim, M. H. [Silla University, Busan (Korea, Republic of)

    2009-07-01

    In recent years, as the performance of personal computers has continued to increase (higher performance, higher image quality and higher resolution), computer components have come to produce large amounts of heat during operation. This study analyzes and investigates the capability and efficiency of the cooling system inside the computer, focusing on the central processing unit (CPU) and power-supply cooling fans. The cooling capability was increased by adding a structure that produces different air pressures in an air inflow tube. Consequently, when the temperatures of the CPU and of the space inside the computer were compared with those of a general personal computer, the temperatures of the tested CPU, the interior space and the heat sink were lower by 5 °C, 2.5 °C and 7 °C, respectively. In addition, the fan speed was as low as 250 revolutions per minute (RPM) after 1 hour of operation. This research explored the possibility of enhancing the effective cooling of high-performance computer systems.

  18. Computer mapping as an aid in air-pollution studies: Montreal region study

    Energy Technology Data Exchange (ETDEWEB)

    Granger, J M

    1972-01-01

    Through the use of computer-mapping programs, an operational technique has been designed which allows an almost-instant appraisal of the intensity of atmospheric pollution in an urban region on the basis of epiphytic sensitivity. The epiphytes considered are essentially lichens and mosses growing on trees. This study was applied to the Montreal region, with 349 sampling stations distributed nearly uniformly. Computer graphics of the findings are included in the appendix.

  19. [Usage patterns of internet and computer games : Results of an observational study of Tyrolean adolescents].

    Science.gov (United States)

    Riedl, David; Stöckl, Andrea; Nussbaumer, Charlotte; Rumpold, Gerhard; Sevecke, Kathrin; Fuchs, Martin

    2016-12-01

    The use of digital media such as the Internet and computer games has greatly increased. In the western world, almost all young people regularly use these technologies. Against this background, forms of use with possible negative consequences for young people have been recognized and scientifically examined. The aim of our study was therefore to investigate the prevalence of pathological use of these technologies in a sample of young Tyrolean people. 398 students (average age 15.2 years, SD ± 2.3 years, 34.2% female) were surveyed by means of the structured questionnaires CIUS (Internet use), CSV-S (computer gaming) and SWE (self-efficacy). Additionally, sociodemographic data were collected. In line with previous studies, 7.7% of the adolescents in our sample met the criteria for problematic internet use and 3.3% for pathological internet use. 5.4% of the sample reported pathological computer game use. The most important factor influencing our results was the gender of the subjects: intensive users of the Internet and computer games were more often young men, whereas young women showed significantly fewer signs of pathological computer game use. A significant percentage of Tyrolean adolescents showed difficulties in developing competent media use, indicating the growing significance of prevention measures such as media education. In a follow-up project, a sample of adolescents with mental disorders will be examined concerning their media use and compared with our school sample.

  20. A computational study of free-piston diesel engine combustion

    Energy Technology Data Exchange (ETDEWEB)

    Mikalsen, R.; Roskilly, A.P. [Sir Joseph Swan Institute for Energy Research, Newcastle University, Devonshire Building, Newcastle upon Tyne, NE1 7RU (United Kingdom)

    2009-07-15

    This paper investigates the in-cylinder gas motion, combustion process and nitrogen oxide formation in a free-piston diesel engine and compares the results to those of a conventional engine, using a computational fluid dynamics engine model. Enhanced radial gas flow (squish and reverse squish) around top dead centre is found for the free-piston engine compared to a conventional engine, however it is found that this has only minor influence on the combustion process. A higher heat release rate from the pre-mixed combustion phase due to an increased ignition delay was found, along with potential reductions in nitrogen oxides emissions formation for the free-piston engine. (author)

  1. A framework for the comparative study of language.

    Science.gov (United States)

    Uriagereka, Juan; Reggia, James A; Wilkinson, Gerald S

    2013-07-18

    Comparative studies of language are difficult because few language precursors are recognized. In this paper we propose a framework for designing experiments that test for structural and semantic patterns indicative of simple or complex grammars as originally described by Chomsky. We argue that a key issue is whether animals can recognize full recursion, which is the hallmark of context-free grammar. We discuss limitations of recent experiments that have attempted to address this issue, and point out that experiments aimed at detecting patterns that follow a Fibonacci series have advantages over other artificial context-free grammars. We also argue that experiments using complex sequences of behaviors could, in principle, provide evidence for fully recursive thought. Some of these ideas could also be approached using artificial life simulations, which have the potential to reveal the types of evolutionary transitions that could occur over time. Because the framework we propose has specific memory and computational requirements, future experiments could target candidate genes with the goal of revealing the genetic underpinnings of complex cognition.
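
    The Fibonacci-patterned stimuli mentioned in this record are usually built from a simple rewriting system (A -> AB, B -> A) whose string lengths follow the Fibonacci series. The sketch below generates such strings; it only illustrates the stimulus structure and is not the authors' experimental design.

    ```python
    def fibonacci_strings(n_generations):
        """Generate strings from the rewriting system A -> AB, B -> A."""
        s = "A"
        out = [s]
        for _ in range(n_generations):
            s = "".join("AB" if c == "A" else "A" for c in s)
            out.append(s)
        return out

    for i, s in enumerate(fibonacci_strings(7)):
        print(f"gen {i}: length {len(s):2d}  {s}")
    # Lengths: 1, 2, 3, 5, 8, 13, 21, 34 -- the Fibonacci series.
    ```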

  2. Reconstruction for interior region-of-interest inverse geometry computed tomography: preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Su; Kim, Tae Ho; Kim, Kyeong Hyeon; Yoon, Do Kun; Suh, Tae Suk [Dept. of Biomedical Engineering, Research Institute of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul (Korea, Republic of); Kang, Seong Hee [Dept. of Radiation Oncology, Seoul National University Hospital, Seoul (Korea, Republic of); Cho, Min Seok [Dept. of Radiation Oncology, Asan Medical Center, Seoul (Korea, Republic of); Noh, Yu Yoon [Dept. of Radiation Oncology, Eulji University Hospital, Daejeon (Korea, Republic of)

    2017-04-15

    The inverse geometry computed tomography (IGCT) system, composed of multiple sources and a small detector, has several merits compared to conventional cone-beam computed tomography (CBCT), such as a reduced scatter effect and large volumetric imaging within one rotation without cone-beam artifacts. Using this multi-source characteristic, we present a selective, multiple interior region-of-interest (ROI) imaging method based on a designed source on-off sequence for IGCT. ROI-IGCT showed comparable image quality and can provide multi-ROI images within a single rotation. Because the projection in ROI-IGCT is performed by selective irradiation, the unnecessary imaging dose to regions outside the ROI can be reduced. In this regard, it appears useful for diagnostic imaging or image guidance in radiotherapy.

  3. Comparative study of shale-gas production using single- and dual-continuum approaches

    KAUST Repository

    El-Amin, Mohamed; Amir, Sahar Z.; Salama, Amgad; Urozayev, Dias; Sun, Shuyu

    2017-01-01

    parameters. Several results are discussed such as pressure, production rate and cumulative production. We compare the results of the two models using the same dimensions and physical and computational parameters. We found that the DPDP and the SDFM models

  4. Comparative study of dobutamine stress echocardiography and dual single-photon emission computed tomography (Thallium-201 and I-123 BMIPP) for assessing myocardial viability after acute myocardial infarction

    International Nuclear Information System (INIS)

    Yasugi, Naoko; Hiroki, Tadayuki

    2002-01-01

    Discordance between the 123I-labelled 15-iodophenyl-3-R,S-methylpentadecanoic acid (BMIPP) and 201Tl findings may indicate myocardial viability (MV). This study compared dobutamine stress echocardiography (DSE) and single-photon emission computed tomography (SPECT) using the dual tracers for assessment of MV and prediction of functional recovery after acute myocardial infarction (AMI). DSE and dual SPECT were studied in 35 patients after AMI, of whom 28 underwent percutaneous coronary intervention in the acute stage. Dual SPECT was performed to compare the defect scores of BMIPP and 201Tl. The left ventricular wall motion score (WMS) was estimated during DSE and 6 months later to assess functional recovery of the infarct area. The rate of agreement on MV between dual SPECT and DSE was 89%; defect scores of 201Tl were significantly smaller in patients with functional recovery than in those without. Assessment of MV using DSE concords with the results of dual SPECT in the early stage of AMI. DSE may have a higher predictive value for long-term functional recovery of the infarct area. However, a finding of positive MV by dual SPECT without functional recovery may indicate residual stenosis of the infarct-related artery, although the number of cases was small. Combined assessment by dual SPECT and DSE may be useful for detecting MV and jeopardized myocardium. Furthermore, the results suggest that functional recovery of dysfunctional myocardium may depend on the size of the infarct and risk area. (author)

  5. Study on GPU Computing for SCOPE2 with CUDA

    International Nuclear Information System (INIS)

    Kodama, Yasuhiro; Tatsumi, Masahiro; Ohoka, Yasunori

    2011-01-01

    For improving the safety and cost effectiveness of nuclear power plants, a core calculation code, SCOPE2, has been developed, which adopts detailed calculation models such as the multi-group nodal SP3 transport method in three-dimensional pin-by-pin geometry to achieve high predictability. However, it is difficult to apply the code to loading pattern optimizations, since it requires much longer computation times than codes based on the nodal diffusion method, which is widely used in core design calculations. In this study, we examined the possibility of accelerating SCOPE2 with GPU computing, which has been recognized as one of the most promising directions in high-performance computing. In a previous study with an experimental programming framework, converting the algorithms into forms suited to GPU computation required considerable effort, and the conversion proved tremendously difficult because of the complexity of the algorithms and restrictions in implementation. In this study, to overcome this complexity, we utilized the CUDA programming environment provided by NVIDIA, which is a versatile and flexible extension of the C/C++ languages. Through a test implementation of GPU kernels for the neutron diffusion/simplified P3 equation solvers, it was confirmed that high performance could be achieved without degrading maintainability. (author)

  6. A computed tomographic prolective trohoc study of chronic schizophrenics

    International Nuclear Information System (INIS)

    Glueck, E.; Radue, E.W.; Mundt, C.; Gerhardt, P.

    1980-01-01

    The maximal width of the third ventricle, the maximal distance between the outer tips of the anterior horns, and the number of enlarged cerebral sulci on the two highest CT slices were measured in 68 chronic schizophrenic patients on cranial computed tomograms in order to detect a possible enlargement of the cerebrospinal fluid (CSF) filled intracranial spaces. These results were compared with values obtained from a control group which was formed in accordance with definite exclusion criteria and matched-pair parameters (sex, age and maximal inner diameter of the skull). In a prolective trohoc study no difference was found in the size of the CSF spaces of schizophrenics and the controls. The psychopathological condition of the patients, which was classified in a semistandardized dialogue, also showed no correlation with the ventricular size or the number of enlarged cerebral sulci. (orig.)

  7. Vanillin and isovanillin: Comparative vibrational spectroscopic studies, conformational stability and NLO properties by density functional theory calculations

    Science.gov (United States)

    Balachandran, V.; Parimala, K.

    This study is a comparative analysis of the FT-IR and FT-Raman spectra of vanillin (3-methoxy-4-hydroxybenzaldehyde) and isovanillin (3-hydroxy-4-methoxybenzaldehyde). The molecular structure, vibrational wavenumbers, infrared intensities, and Raman scattering activities were calculated for both molecules using the B3LYP density functional theory (DFT) with the standard 6-311++G∗∗ basis set. The computed frequencies are scaled using multiple scaling factors to yield good agreement with the observed values. The calculated harmonic vibrational frequencies are compared with the experimental FT-IR and FT-Raman spectra. The geometrical parameters and total energies of vanillin and isovanillin were obtained for all eight conformers (a-h) from the DFT/B3LYP method with the 6-311++G∗∗ basis set. The computational results identified the most stable conformer of vanillin and isovanillin as the "a" form. Non-linear optical properties such as the electric dipole moment (μ), polarizability (α), and hyperpolarizability (β) of the investigated molecules have been computed using B3LYP quantum chemical calculations. The calculated HOMO and LUMO energies show that charge transfer occurs within the molecules.
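
    For readers unfamiliar with how such NLO quantities are assembled, the total dipole moment, mean polarizability and total first hyperpolarizability are conventionally built from the Cartesian tensor components as below; these are the standard textbook relations, not expressions quoted from this paper.

```latex
\mu_{\mathrm{tot}} = \left(\mu_x^{2}+\mu_y^{2}+\mu_z^{2}\right)^{1/2}, \qquad
\langle\alpha\rangle = \tfrac{1}{3}\left(\alpha_{xx}+\alpha_{yy}+\alpha_{zz}\right)

\beta_{\mathrm{tot}} = \left(\beta_x^{2}+\beta_y^{2}+\beta_z^{2}\right)^{1/2}, \qquad
\beta_x = \beta_{xxx}+\beta_{xyy}+\beta_{xzz}, \quad
\beta_y = \beta_{yyy}+\beta_{yxx}+\beta_{yzz}, \quad
\beta_z = \beta_{zzz}+\beta_{zxx}+\beta_{zyy}
```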

  8. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Science.gov (United States)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each approach is compared with a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization, using various numbers of CPUs ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential enhancement from parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET so that the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field, can be used with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare their computational efficiencies and consider the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
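
    Since the abstract's central point is that independent trajectory optimizations parallelize well across commodity machines, a minimal sketch of that pattern may help. The snippet below uses Python's standard multiprocessing pool; the airport pairs and the optimize_route stub are invented placeholders and do not reproduce FACET's interfaces.

```python
# Illustrative only: distributing independent wind-optimal route computations
# across CPU cores with the standard library. The optimizer below is a stub;
# FACET's actual interfaces are not reproduced here.
from multiprocessing import Pool
import math

def optimize_route(pair):
    """Stand-in for a wind-optimal trajectory solve for one airport pair."""
    origin, dest = pair
    # A real implementation would solve the optimal-control problem in a wind field.
    return (origin, dest, math.hypot(ord(origin[0]) - ord(dest[0]), 1.0))

if __name__ == "__main__":
    pairs = [("KSFO", "EGLL"), ("KJFK", "RJTT"), ("YSSY", "KLAX")]   # placeholder pairs
    with Pool(processes=4) as pool:                # one worker per core
        results = pool.map(optimize_route, pairs)  # embarrassingly parallel
    print(results)
```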

  9. Study of basic computer competence among public health nurses in Taiwan.

    Science.gov (United States)

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. The results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  10. Comparative study of surrogate models for groundwater contamination source identification at DNAPL-contaminated sites

    Science.gov (United States)

    Hou, Zeyu; Lu, Wenxi

    2018-05-01

    Knowledge of groundwater contamination sources is critical for effectively protecting groundwater resources, estimating risks, mitigating disaster, and designing remediation strategies. Many methods for groundwater contamination source identification (GCSI) have been developed in recent years, including the simulation-optimization technique. This study proposes a support vector regression (SVR) model and a kernel extreme learning machine (KELM) model to broaden the range of available surrogate models. The surrogate model is key because it replaces the simulation model, greatly reducing the computational burden of the iterations required by the simulation-optimization technique to solve GCSI problems, especially for aquifers contaminated by dense nonaqueous phase liquids (DNAPLs). A comparative study between the Kriging, SVR, and KELM models is reported. Additionally, the influence of parameter optimization and of the structure of the training sample dataset on the approximation accuracy of the surrogate model is analyzed. It was found that the KELM model was the most accurate surrogate model, and its performance was significantly improved after parameter optimization. The approximation accuracy of the surrogate model with respect to the simulation model did not always improve with increasing numbers of training samples. Using the appropriate number of training samples was critical for improving the performance of the surrogate model and avoiding unnecessary computational workload. It was concluded that the KELM model developed in this work could reasonably predict system responses under given operating conditions. Replacing the simulation model with a KELM model considerably reduced the computational burden of the simulation-optimization process while maintaining high computational accuracy.
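
    As a rough illustration of the surrogate-model idea (not the authors' implementation), the sketch below fits a support vector regression surrogate to synthetic input-output samples of a "simulator" and then queries it in place of the expensive model; a KELM surrogate would simply replace the regressor.

```python
# Sketch of the surrogate-model idea with scikit-learn's SVR: fit a cheap
# regressor on (source parameters -> observed responses) samples from the
# simulation model, then call the surrogate inside the optimization loop.
# The data here are synthetic stand-ins, not output of a real DNAPL simulator.
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 4))            # e.g. source location, release strength
Y = np.column_stack([X @ [1.0, -0.5, 2.0, 0.3],      # stand-in simulator responses
                     np.sin(X[:, 0]) + X[:, 3]])

surrogate = MultiOutputRegressor(SVR(C=10.0, epsilon=0.01)).fit(X, Y)
candidate = np.array([[0.2, 0.8, 0.5, 0.1]])
print(surrogate.predict(candidate))                  # used instead of a full simulation run
```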

  11. Application of Cloud Computing Technology in a University. Case Study: Faculty of Information Technology, UKDW

    OpenAIRE

    Kurniawan, Erick

    2015-01-01

    Cloud computing technology is a new paradigm in the delivery of computing services. Cloud computing has many advantages over conventional systems. This article discusses cloud computing architecture in general and gives several examples of cloud computing services, together with their benefits, in a university environment. The case study considered is the deployment of cloud computing services in the Faculty of Information Technology, UKDW.

  12. Computer versus Compensatory Calendar Training in Individuals with Mild Cognitive Impairment: Functional Impact in a Pilot Study.

    Science.gov (United States)

    Chandler, Melanie J; Locke, Dona E C; Duncan, Noah L; Hanna, Sherrie M; Cuc, Andrea V; Fields, Julie A; Hoffman Snyder, Charlene R; Lunde, Angela M; Smith, Glenn E

    2017-09-06

    This pilot study examined the functional impact of computerized versus compensatory calendar training in cognitive rehabilitation participants with mild cognitive impairment (MCI). Fifty-seven participants with amnestic MCI completed randomly assigned calendar or computer training. A standard care control group was used for comparison. Measures of adherence, memory-based activities of daily living (mADLs), and self-efficacy were completed. The calendar training group demonstrated significant improvement in mADLs compared to controls, while the computer training group did not. Calendar training may be more effective in improving mADLs than computerized intervention. However, this study highlights how behavioral trials with fewer than 30-50 participants per arm are likely underpowered, resulting in seemingly null findings.
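
    The closing point about underpowered behavioral trials can be checked with a routine power calculation. The sketch below uses statsmodels and assumes a medium effect size (Cohen's d = 0.5) purely for illustration; it is not an analysis of this study's data.

```python
# Back-of-envelope check of the authors' point about small behavioral trials:
# with ~20-30 participants per arm, power to detect a medium effect is low.
# The effect size (Cohen's d = 0.5) and alpha are assumptions for illustration.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n_per_arm in (20, 30, 50, 100):
    power = analysis.power(effect_size=0.5, nobs1=n_per_arm, alpha=0.05, ratio=1.0)
    print(f"n per arm = {n_per_arm:3d}  ->  power = {power:.2f}")
```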

  13. Comparative evaluation of the accuracy of linear measurements between cone beam computed tomography and 3D microtomography

    Directory of Open Access Journals (Sweden)

    Francesca Mangione

    2013-09-01

    OBJECTIVE: The aim of this study was to evaluate the influence of artifacts on the accuracy of linear measurements estimated with a common cone beam computed tomography (CBCT) system used in dental clinical practice, by comparing it with a microCT system as the standard reference. MATERIALS AND METHODS: Ten bovine bone cylindrical samples, each containing one implant able to provide both reference points and image quality degradation, were scanned with the CBCT and microCT systems. Using the software of the two systems, two diameters were measured for each cylindrical sample at different levels, taking different points of the implants as references. The results were analyzed by ANOVA and a statistically significant difference was found. RESULTS AND DISCUSSION: Based on the results obtained, the measurements made with the two different instruments are not yet statistically comparable, although similar performance, with differences that were not statistically significant, was obtained for some samples. CONCLUSION: With the improvement of the hardware and software of CBCT systems, the two instruments should be able to provide similar performance in the near future.

  14. A comparative study of cranial, blunt trauma fractures as seen at medicolegal autopsy and by computed tomography

    DEFF Research Database (Denmark)

    Jacobsen, Christina; Bech, Birthe H; Lynnerup, Niels

    2009-01-01

    BACKGROUND: Computed Tomography (CT) has become a widely used supplement to medico legal autopsies at several forensic institutes. Amongst other things, it has proven to be very valuable in visualising fractures of the cranium. Also CT scan data are being used to create head models for biomechani....... Difficulties remained in the minute diagnosis of hairline fractures. These inconsistencies need to be resolved in order to use CT scan data of victims for individual head modelling and trauma analysis....

  15. A study on measurement of scattery ray of computed tomography

    International Nuclear Information System (INIS)

    Cho, Pyong Kon; Lee, Joon Hyup; Kim, Yoon Sik; Lee, Chang Yeop

    2003-01-01

    Computed tomographic equipment is essential for diagnosis by means of radiation. With the passage of time and the development of science, computed tomography equipment has been developed repeatedly, and examinations with this equipment are expected to increase in the future. In this connection, the authors measured the rate of scattered ray generation in front of the lead glass for patients within the control room of the computed tomography suite and outside the entrance door used by patients, and attempted to find a method for minimizing exposure to scattered rays. From November 2001, twenty five computed tomographic units already installed and in operation at 13 general hospitals and university hospitals in Seoul were included in this study. The exposure conditions recommended by the manufacturers for measuring scattered ray exposure were used. As objects, a DALI CT Radiation Dose Test Phantom for Head (φ 16 cm Plexiglas) and a Phantom for Stomach (φ 32 cm Plexiglas) were used. For measurement of scattered rays, a reader (Radiation Monitor Controller Model 2026) and a G-M survey meter were used, together with a Survey Meter of Radical Corporation, model 20 x 5-1800, Electrometer/Ion Chamber, S/N 21740. The spots for measurement of scattered rays included the position in front of the lead glass for patients within the control room, where most of the radiographic personnel's work is carried out, the position outside the entrance door used by patients and their guardians, and a spot 100 cm from the isocenter at the time of scanning the object. The work environment within the computed tomography rooms installed and operated by each hospital showed considerable differences depending on the circumstances of the pertinent hospital, and the status of scattered rays was as follows. 1) From the isocenter of the computed tomographic equipment to the lead glass for patients within the control room, the average distance was 377 cm. At that time the scattered rays showed diverse

  16. Decomposing dendrophilia. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Honing, Henkjan; Zuidema, Willem

    2014-09-01

    The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.

  17. Computer Game Play as an Imaginary Stage for Reading: Implicit Spatial Effects of Computer Games Embedded in Hard Copy Books

    Science.gov (United States)

    Smith, Glenn Gordon

    2012-01-01

    This study compared books with embedded computer games (via pentop computers with microdot paper and audio feedback) with regular books with maps, in terms of fifth graders' comprehension and retention of spatial details from stories. One group read a story in hard copy with embedded computer games, the other group read it in regular book format…

  18. A comparative study of turbulence models for dissolved air flotation flow analysis

    International Nuclear Information System (INIS)

    Park, Min A; Lee, Kyun Ho; Chung, Jae Dong; Seo, Seung Ho

    2015-01-01

    The dissolved air flotation (DAF) system is a water treatment process that removes contaminants by attaching micro bubbles to them, causing them to float to the water surface. In the present study, two-phase flow of air-water mixture is simulated to investigate changes in the internal flow analysis of DAF systems caused by using different turbulence models. Internal micro bubble distribution, velocity, and computation time are compared between several turbulence models for a given DAF geometry and condition. As a result, it is observed that the standard κ-ε model, which has been frequently used in previous research, predicts somewhat different behavior than other turbulence models

  19. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  20. An Experimental Study into the use of computers for teaching of ...

    African Journals Online (AJOL)

    This study was an experimental study which sought to establish how English language teachers used computers for teaching composition writing at Prince Edward High School in Harare. The findings of the study show that computers were rarely used in the teaching of composition despite the observation that the school ...

  1. A Comparative Study of Distribution System Parameter Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Williams, Tess L.; Gourisetti, Sri Nikhil Gup

    2016-07-17

    In this paper, we compare two parameter estimation methods for distribution systems: residual sensitivity analysis and state-vector augmentation with a Kalman filter. These two methods were originally proposed for transmission systems, and are still the most commonly used methods for parameter estimation. Distribution systems have much lower measurement redundancy than transmission systems. Therefore, estimating parameters is much more difficult. To increase the robustness of parameter estimation, the two methods are applied with combined measurement snapshots (measurement sets taken at different points in time), so that the redundancy for computing the parameter values is increased. The advantages and disadvantages of both methods are discussed. The results of this paper show that state-vector augmentation is a better approach for parameter estimation in distribution systems. Simulation studies are done on a modified version of IEEE 13-Node Test Feeder with varying levels of measurement noise and non-zero error in the other system model parameters.
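
    The state-vector augmentation idea reduces, in its simplest form, to treating the unknown parameter as an extra state that a linear Kalman filter refines as measurement snapshots arrive. The toy sketch below estimates a single constant parameter from noisy snapshots; it is only a one-dimensional illustration, not the paper's feeder model.

```python
# Toy illustration of the state-augmentation idea (not the paper's feeder model):
# an unknown, constant line parameter p (think of a line resistance) is treated
# as a state and refined by a scalar Kalman filter over measurement snapshots
# z_k = u_k * p + noise, where u_k is a known input such as a measured current.
import numpy as np

rng = np.random.default_rng(1)
p_true, R, Q = 0.8, 0.05**2, 1e-6              # true value, measurement noise, process noise

p_hat, P = 0.0, 1.0                            # initial estimate and variance
for _ in range(50):                            # 50 combined measurement snapshots
    u = rng.uniform(0.5, 1.5)                  # known input for this snapshot
    z = p_true * u + rng.normal(0.0, 0.05)     # noisy measurement
    P += Q                                     # predict (parameter modelled as constant)
    K = P * u / (u * P * u + R)                # Kalman gain for H = [u]
    p_hat += K * (z - u * p_hat)               # update the parameter estimate
    P *= (1.0 - K * u)
print(f"estimated parameter: {p_hat:.3f} (true {p_true})")
```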

  2. Effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing.

    Science.gov (United States)

    Yoo, Won-Gyu

    2015-01-01

    [Purpose] This study examined the effects of different computer typing speeds on the acceleration and peak contact pressure of the fingertips during computer typing. [Subjects] Twenty-one male computer workers voluntarily consented to participate in this study. They consisted of 7 workers who could type 200-300 characters/minute, 7 workers who could type 300-400 characters/minute, and 7 workers who could type 400-500 characters/minute. [Methods] The acceleration and peak contact pressure of the fingertips were measured for the different typing speed groups using an accelerometer and the CONFORMat system. [Results] The fingertip contact pressure was increased in the high typing speed group compared with the low and medium typing speed groups. The fingertip acceleration was increased in the high typing speed group compared with the low and medium typing speed groups. [Conclusion] The results of the present study indicate that a fast typing speed causes continuous pressure stress to be applied to the fingers, thereby creating pain in the fingers.

  3. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-01-01

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  4. Comparative use of the computer-aided angiography and rapid prototyping technology versus conventional imaging in the management of the Tile C pelvic fractures.

    Science.gov (United States)

    Li, Baofeng; Chen, Bei; Zhang, Ying; Wang, Xinyu; Wang, Fei; Xia, Hong; Yin, Qingshui

    2016-01-01

    Computed tomography (CT) scanning with three-dimensional (3D) reconstruction has been used to evaluate complex fractures in pre-operative planning. In this study, rapid prototyping of a life-size model based on 3D reconstructions including bone and vessel was applied to evaluate the feasibility and prospects of these new technologies in the surgical therapy of Tile C pelvic fractures by observing intra- and perioperative outcomes. The authors conducted a retrospective study on a group of 157 consecutive patients with Tile C pelvic fractures. Seventy-six patients were treated with conventional pre-operative preparation (group A) and 81 patients were treated with the help of computer-aided angiography and rapid prototyping technology (group B). Assessment of the two groups considered the following perioperative parameters: length of surgical procedure, intra-operative complications, intra- and postoperative blood loss, postoperative pain, postoperative nausea and vomiting (PONV), length of stay, and type of discharge. The two groups were homogeneous when compared in relation to mean age, sex, body weight, injury severity score, associated injuries and pelvic fracture severity score. Surgery in group B was completed in less time (105 ± 19 minutes vs. 122 ± 23 minutes) and with less blood loss (31.0 ± 8.2 g/L vs. 36.2 ± 7.4 g/L) than in group A. Patients in group B experienced less pain (2.5 ± 2.3 NRS score vs. 2.8 ± 2.0 NRS score), and PONV affected only 8% versus 10% of cases. Times to discharge were shorter (7.8 ± 2.0 days vs. 10.2 ± 3.1 days) in group B, and most patients were discharged home. In our study, patients with Tile C pelvic fractures treated with computer-aided angiography and rapid prototyping technology had a better perioperative outcome than patients treated with conventional pre-operative preparation. Further studies are necessary to investigate the advantages in terms of clinical results in the short and long run.

  5. Fitting models of continuous trait evolution to incompletely sampled comparative data using approximate Bayesian computation.

    Science.gov (United States)

    Slater, Graham J; Harmon, Luke J; Wegmann, Daniel; Joyce, Paul; Revell, Liam J; Alfaro, Michael E

    2012-03-01

    In recent years, a suite of methods has been developed to fit multiple rate models to phylogenetic comparative data. However, most methods have limited utility at broad phylogenetic scales because they typically require complete sampling of both the tree and the associated phenotypic data. Here, we develop and implement a new, tree-based method called MECCA (Modeling Evolution of Continuous Characters using ABC) that uses a hybrid likelihood/approximate Bayesian computation (ABC)-Markov-Chain Monte Carlo approach to simultaneously infer rates of diversification and trait evolution from incompletely sampled phylogenies and trait data. We demonstrate via simulation that MECCA has considerable power to choose among single versus multiple evolutionary rate models, and thus can be used to test hypotheses about changes in the rate of trait evolution across an incomplete tree of life. We finally apply MECCA to an empirical example of body size evolution in carnivores, and show that there is no evidence for an elevated rate of body size evolution in the pinnipeds relative to terrestrial carnivores. ABC approaches can provide a useful alternative set of tools for future macroevolutionary studies where likelihood-dependent approaches are lacking. © 2011 The Author(s). Evolution© 2011 The Society for the Study of Evolution.
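
    A bare-bones ABC rejection sampler conveys the flavour of the approach, although MECCA itself couples ABC with MCMC and likelihood components. In the sketch below, a Brownian-motion rate is inferred for a trait on a star phylogeny using the tip variance as the summary statistic; all numbers are invented.

```python
# Minimal ABC rejection sketch (far simpler than MECCA): infer a Brownian-motion
# rate sigma^2 for a continuous trait on a star phylogeny of depth T, using the
# sample variance of tip values as the summary statistic. All numbers are toy.
import numpy as np

rng = np.random.default_rng(42)
n_tips, T, sigma2_true = 50, 10.0, 0.3
observed = rng.normal(0.0, np.sqrt(sigma2_true * T), size=n_tips)   # "observed" tip values
s_obs = observed.var()

accepted = []
for _ in range(20000):
    sigma2 = rng.uniform(0.01, 2.0)                                  # draw from the prior
    sim = rng.normal(0.0, np.sqrt(sigma2 * T), size=n_tips)          # simulate tip values
    if abs(sim.var() - s_obs) < 0.2 * s_obs:                         # accept if summaries match
        accepted.append(sigma2)

print(f"posterior mean of sigma^2: {np.mean(accepted):.3f} (true {sigma2_true})")
```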

  6. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: the macroscopic, submicroscopic, and symbolic. Students interact with the computer models to answer assessment…

  7. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

    Directory of Open Access Journals (Sweden)

    Tatyana G Karabencheva-Christova

    Circular dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with the main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (Molecular Dynamics, Semiempirical Quantum Mechanics, Time-Dependent Density Functional Theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and to understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures, and molecular dynamics (MD) could help achieve better agreement between the computed and experimental protein spectra and provide unique insight into the dynamic nature of the mechanisms of chromophore interactions.

  8. Does the accuracy of single reading with CAD (computer-aided detection) compare with that of double reading?: A review of the literature

    International Nuclear Information System (INIS)

    Bennett, R.L.; Blanks, R.G.; Moss, S.M.

    2006-01-01

    Aim: To examine current evidence to determine whether the accuracy of single reading with computer-aided detection (CAD) compares with that of double reading. Methods: We performed a literature review to identify studies where both protocols had been investigated and compared. We identified eight studies that compared single reading with CAD against double reading, of which six reported on comparisons of both sensitivity and specificity. Results: Of the six studies identified, three showed no differences in either sensitivity or specificity. One showed single reading with CAD had a higher sensitivity at the same specificity, another that single reading with CAD had a higher specificity at the same sensitivity. However, one study, in a real-life setting, showed that single reading with CAD had a higher sensitivity but a lower specificity. Conclusion: As the majority of the studies were not in a real-life setting, used test sets, lacked sufficient training in the use of CAD and simulated double reading (using a protocol of recall if one suggests), current evidence is therefore limited as to the accuracy, in terms of sensitivity and specificity, of single reading with CAD in comparison with the most common practice in the UK of double reading using a protocol of consensus or arbitration.
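
    The quantities being traded off in these comparisons are just the usual sensitivity and specificity counts; the sketch below shows the arithmetic with invented reading outcomes, not data from any of the reviewed studies.

```python
# The quantities compared in these studies reduce to simple counts.
# The reading outcomes below are invented purely to show the arithmetic.
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

single_cad = sens_spec(tp=42, fn=8, tn=900, fp=50)   # single reading + CAD (invented)
double     = sens_spec(tp=44, fn=6, tn=910, fp=40)   # double reading (invented)
print(f"single+CAD: sens={single_cad[0]:.2f}, spec={single_cad[1]:.3f}")
print(f"double:     sens={double[0]:.2f}, spec={double[1]:.3f}")
```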

  9. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students have met two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and

  10. Computer use and stress, sleep disturbances, and symptoms of depression among young adults – a prospective cohort study

    Directory of Open Access Journals (Sweden)

    Thomée Sara

    2012-10-01

    Background: We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate whether high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. Methods: The study group was a cohort of young adults (n = 4163), 20-24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Results: Both high and medium computer use, compared to low computer use at baseline, were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances, for the men. For the women, high email/chat use was positively associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Conclusions: Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. Using the computer at night and consequently losing
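
    The prevalence ratios reported here are straightforward to compute; the sketch below shows a PR with a Wald-type confidence interval on the log scale, using invented counts rather than the cohort's data.

```python
# Prevalence ratio (PR) with a Wald-type confidence interval on the log scale,
# the effect measure used in this cohort study. The counts are invented.
import math

def prevalence_ratio(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    p1, p0 = cases_exp / n_exp, cases_unexp / n_unexp
    pr = p1 / p0
    se = math.sqrt(1/cases_exp - 1/n_exp + 1/cases_unexp - 1/n_unexp)  # SE of log(PR)
    lo, hi = pr * math.exp(-z * se), pr * math.exp(z * se)
    return pr, lo, hi

pr, lo, hi = prevalence_ratio(cases_exp=60, n_exp=400, cases_unexp=90, n_unexp=900)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```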

  11. Conventional frontal radiographs compared with frontal radiographs obtained from cone beam computed tomography.

    Science.gov (United States)

    Nur, Metin; Kayipmaz, Saadettin; Bayram, Mehmet; Celikoglu, Mevlut; Kilkis, Dogan; Sezgin, Omer Said

    2012-07-01

    To test the hypothesis that there is no difference between measurements performed on conventional frontal radiographs (FRs) and those performed on FRs obtained from cone beam computed tomography (CBCT) scans. This study consisted of conventional FRs and CBCT-constructed FRs obtained from 30 young adult patients. Twenty-three landmarks were identified on both types of cephalometric radiographs. Twenty-one widely used cephalometric variables (14 linear distances, 4 angles, and 3 ratios) were calculated. Paired t-tests were performed to compare the means of corresponding measurements on the two cephalometric radiographs of the same patient. Reproducibility of measurements ranged from 0.85 to 0.99 for CBCT-constructed FRs, and from 0.78 to 0.96 for conventional FRs. A statistically significant difference was observed between conventional FRs and CBCT-constructed FRs for all linear measurements (eurR-eurL, loR-loL, moR-moL, zygR-zygL, lapR-lapL, mxR-mxL, maR-maL, umR-umL, lmR-lmL, agR-agL, me-ans) (P < .05). However, no statistically significant differences were noted between conventional FRs and CBCT-constructed FRs for ratios and angular measurements (P > .05). The hypothesis was rejected. A difference has been noted between measurements performed on conventional FRs and those performed on CBCT-constructed FRs, particularly in terms of linear measurements.

  12. Using FlowLab, an educational computational fluid dynamics tool, to perform a comparative study of turbulence models

    International Nuclear Information System (INIS)

    Parihar, A.; Kulkarni, A.; Stern, F.; Xing, T.; Moeykens, S.

    2005-01-01

    Flow over an Ahmed body is a key benchmark case for validating the complex turbulent flow field around vehicles. In spite of the simple geometry, the flow field around an Ahmed body retains critical features of real, external vehicular flow. The present study is an attempt to implement such a real-life example into the course curriculum for undergraduate engineers. FlowLab, which is a Computational Fluid Dynamics (CFD) tool developed by Fluent Inc. for use in engineering education, allows students to conduct interactive application studies. This paper presents a synopsis of FlowLab, a description of one FlowLab exercise, and an overview of the educational experience gained by students through using FlowLab, which is understood through student surveys and examinations. FlowLab-based CFD exercises were implemented into the 57:020 Mechanics of Fluids and Transport Processes and 58:160 Intermediate Mechanics of Fluids courses at the University of Iowa in the fall of 2004, although this report focuses only on experiences with the Ahmed body exercise, which was used only in the intermediate-level fluids class, 58:160. This exercise was developed under National Science Foundation funding by the authors of this paper. The focus of this study does not include validating the various turbulence models used for the Ahmed body simulation, because a two-dimensional simplification was applied. With the two-dimensional simplification, students may set up, run, and post-process this model in a 50-minute class period using a single-CPU PC, as required for the 58:160 class at the University of Iowa. It is educational for students to understand the implication of a two-dimensional approximation for an essentially three-dimensional flow field, along with the consequent variation in both qualitative and quantitative results. Additionally, through this exercise, students may realize that the choice of turbulence model will affect the simulation predictions. (author)

  13. Restructuring of schools as a consequence of computer use?

    NARCIS (Netherlands)

    Plomp, T.; Pelgrum, W.J.

    1993-01-01

    The central question discussed is whether the use of computers leads to the restructuring of schools or classrooms. Several authors argue that intensive use of computers must lead to new classroom patterns or new forms of schooling. Data from the international comparative study of computers in

  14. Computational and experimental methods for enclosed natural convection

    International Nuclear Information System (INIS)

    Larson, D.W.; Gartling, D.K.; Schimmel, W.P. Jr.

    1977-10-01

    Two computational procedures and one optical experimental procedure for studying enclosed natural convection are described. The finite-difference and finite-element numerical methods are developed and several sample problems are solved. Results obtained from the two computational approaches are compared. A temperature-visualization scheme using laser holographic interferometry is described, and results from this experimental procedure are compared with results from both numerical methods

  15. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  16. Symbolic math for computation of radiation shielding

    International Nuclear Information System (INIS)

    Suman, Vitisha; Datta, D.; Sarkar, P.K.; Kushwaha, H.S.

    2010-01-01

    Radiation transport calculations for shielding studies in the field of accelerator technology often involve intensive numerical computations. Traditionally, the radiation transport equation is solved using a finite difference scheme or an advanced finite element method with specific initial and boundary conditions suited to the geometry of the problem. All these computations need CPU-intensive computer codes for accurate calculation of scalar and angular fluxes. Computation that treats the symbols of the analytical expression representing the transport equation as objects is an enhanced numerical technique in which the computation is completely algorithm- and data-oriented. An algorithm based on this symbolic math architecture has been developed using the Symbolic Math Toolbox of the MATLAB software. The present paper describes the symbolic math algorithm and its application in a case study in which the shielding calculation for a rectangular slab geometry is studied for a line source of specific activity. The application of symbolic math in this domain offers a new paradigm compared with existing computer codes such as DORT. (author)
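
    To give a flavour of the symbolic approach (in SymPy rather than the paper's MATLAB Symbolic Math Toolbox), the sketch below builds the uncollided point-kernel flux behind a slab shield as a symbolic expression, differentiates it with respect to shield thickness, and only then substitutes numbers; the numerical values are arbitrary.

```python
# A small taste of the symbolic approach (SymPy stand-in for the paper's MATLAB
# Symbolic Math Toolbox): build the point-kernel expression for the uncollided
# flux behind a slab shield symbolically, then substitute numbers at the end.
import sympy as sp

# source strength, attenuation coefficient, slab thickness, source-detector distance
S, mu, t, r = sp.symbols("S mu t r", positive=True)
phi = S * sp.exp(-mu * t) / (4 * sp.pi * r**2)          # uncollided point-kernel flux

dphi_dt = sp.diff(phi, t)                               # sensitivity to shield thickness
print(sp.simplify(dphi_dt / phi))                       # = -mu, independent of geometry
print(float(phi.subs({S: 1e9, mu: 0.5, t: 10.0, r: 100.0})))   # numerical evaluation
```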

  17. Study Of Visual Disorders In Egyptian Computer Operators

    International Nuclear Information System (INIS)

    Al-Awadi, M.Y.; Awad Allah, H.; Hegazy, M. T.; Naguib, N.; Akmal, M.

    2012-01-01

    The aim of the study was to evaluate the probable effects of exposure to electromagnetic waves radiated from visual display terminals (VDTs) on some visual functions. A total of 300 computer operators working in different institutes were selected randomly. They were asked to fill in a pre-tested questionnaire (written in Arabic) after their verbal consent was obtained. Among them, one hundred fifty operators exposed to visual display terminals were selected for the clinical study (group I). The control group included one hundred fifty participants, age-matched with group I but working in fields without exposure to visual display terminals (group II). None of the chosen individuals suffered from any apparent health problems or diseases that could affect their visual condition. All exposed candidates were using LCD-type VDTs of size 15 inch, 17 inch or larger. Data entry and analysis were done using SPSS version 17.0, applying appropriate statistical methods. The results showed that, among the 150 exposed subjects, a highly significant occurrence of dryness and a highly significant association between the occurrence of asthenopia and background variables (working hours using computers) were observed. Of the exposed subjects, 92% complained of tired eyes and eye strain, 37.33% of dry or sore eyes, 68% of headache, 68% of blurred distant vision, 45.33% of asthenopia and 89.33% of neck, shoulder and back aches. Meanwhile, in the control group, 18% complained of tired eyes, 21.33% of dry eyes and 12.67% of neck, shoulder and back aches. It could be concluded that the prevalence of computer vision syndrome was quite high among computer operators.

  18. A comparative experimental and computational study of methanol, ethanol, and n-butanol flames

    Energy Technology Data Exchange (ETDEWEB)

    Veloo, Peter S.; Wang, Yang L.; Egolfopoulos, Fokion N. [Department of Aerospace and Mechanical Engineering, University of Southern California, Los Angeles, CA 90089-1453 (United States); Westbrook, Charles K. [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States)

    2010-10-15

    Laminar flame speeds and extinction strain rates of premixed methanol, ethanol, and n-butanol flames were determined experimentally in the counterflow configuration at atmospheric pressure and elevated unburned mixture temperatures. Additional measurements were conducted also to determine the laminar flame speeds of their n-alkane/air counterparts, namely methane, ethane, and n-butane in order to compare the effect of alkane and alcohol molecular structures on high-temperature flame kinetics. For both propagation and extinction experiments the flow velocities were determined using the digital particle image velocimetry method. Laminar flame speeds were derived through a non-linear extrapolation approach based on direct numerical simulations of the experiments. Two recently developed detailed kinetics models of n-butanol oxidation were used to simulate the experiments. The experimental results revealed that laminar flame speeds of ethanol/air and n-butanol/air flames are similar to those of their n-alkane/air counterparts, and that methane/air flames have consistently lower laminar flame speeds than methanol/air flames. The laminar flame speeds of methanol/air flames are considerably higher compared to both ethanol/air and n-butanol/air flames under fuel-rich conditions. Numerical simulations of n-butanol/air freely propagating flames, revealed discrepancies between the two kinetic models regarding the consumption pathways of n-butanol and its intermediates. (author)

  19. Detection of blood aspiration in deadly head gunshots comparing postmortem computed tomography (PMCT) and autopsy.

    Science.gov (United States)

    Scaparra, E; Peschel, O; Kirchhoff, C; Reiser, M; Kirchhoff, S M

    2016-11-01

    The aim of our study was to analyze the reliability of postmortem computed tomography (PMCT) versus autopsy in detecting signs of blood aspiration in a distinct group of patients following deadly head, mouth or floor-of-mouth gunshot injuries. In this study, in 41 cases PMCT was compared to autopsy reports, the gold standard of postmortem exams, regarding detection of blood aspiration. PMCT was evaluated for the presence and level of typical signs of blood aspiration in the major airways and lung using a semi-quantitative scale ranging from level 0 (no aspiration) to 3 (significant aspiration), also taking the density values of the described potential aspiratory changes into account. Overall, in 29 (70.7%) of the 41 enrolled cases PMCT and autopsy revealed the same level of aspiration. A difference of one level between PMCT and autopsy resulted for 5 (12.2%) of the remaining 12 cases. More than one level difference between both methods resulted for 7 cases (17.2%). Autopsy described no signs of aspiration in 10 cases, compared to 31 cases with reported blood aspiration. In contrast, PMCT revealed no signs of blood aspiration in 15 cases, whereas 26 cases were rated as positive for signs of aspiration in the major airways. In 18 of these 26 cases considered positive for blood aspiration by autopsy and PMCT, clear signs of aspiration were also described bilaterally by both methods. The presented study provides evidence for the assumption that PMCT seems to be helpful in the detection of blood aspiration in cases of deadly head gunshots. In conclusion, it seems reasonable to suggest performing PMCT in addition to traditional postmortem exams in cases of suspected aspiration to rule out false-negative cases and to possibly allow for a more detailed and rather evidence based examination reconnoitering the cause of death. However, the adequate use of PMCT in this context needs further evaluation and the definition of an objective scale for aspiration detection on PMCT needs

  20. F-18 Sodium Fluoride Positron Emission Tomography/Computed Tomography for Detection of Thyroid Cancer Bone Metastasis Compared with Bone Scintigraphy.

    Science.gov (United States)

    Lee, Hyunjong; Lee, Won Woo; Park, So Yeon; Kim, Sang Eun

    2016-01-01

    The aim of the study was to compare the diagnostic performances of F-18 sodium fluoride positron emission tomography/computed tomography (bone PET/CT) and bone scintigraphy (BS) for the detection of thyroid cancer bone metastasis. We retrospectively enrolled 6 thyroid cancer patients (age = 44.7 ± 9.8 years, M:F = 1:5, papillary:follicular = 2:4) with suspected bone metastatic lesions in the whole body iodine scintigraphy or BS, who subsequently underwent bone PET/CT. Pathologic diagnosis was conducted for 4 lesions of 4 patients. Of the 17 suspected bone lesions, 10 were metastatic and 7 benign. Compared to BS, bone PET/CT exhibited superior sensitivity (10/10 = 100% vs. 2/10 = 20%, p = 0.008), and accuracy (14/17 = 82.4% vs. 7/17 = 41.2%, p 0.05). Bone PET/CT may be more sensitive and accurate than BS for the detection of thyroid cancer bone metastasis.

  1. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small signal stability assessment (SSSA) are unable to properly reflect the uncertainties present in real power systems. Hence, the probabilistic analysis of small signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, based on the two-point estimation method and on the so-called Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that, although an analytical approach can reduce the amount of computation in probabilistic SSSA, the different degrees of approximation that are adopted lead to misleading results. Conversely, Monte Carlo based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)
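
    A toy Monte Carlo version of probabilistic SSSA can be written in a few lines: sample the uncertain parameter, form the linearized state matrix, and count the operating points whose dominant mode is poorly damped. The two-state matrix and the distribution of k below are invented for illustration and bear no relation to the test systems in the paper.

```python
# Toy Monte Carlo flavour of probabilistic SSSA (nothing like a full power-system
# model): an uncertain loading parameter k shifts the linearized state matrix,
# and we count sampled operating points whose dominant mode is poorly damped.
import numpy as np

rng = np.random.default_rng(7)

def state_matrix(k):
    # 2-state toy "swing-like" linearization; k plays the role of loading level
    return np.array([[0.0, 1.0],
                     [-5.0 * k, -0.4]])

n_samples, weak = 5000, 0
for _ in range(n_samples):
    k = rng.normal(0.9, 0.2)                     # uncertain parameter (arbitrary spread)
    eig = np.linalg.eigvals(state_matrix(k))
    damping = -eig.real / np.abs(eig)            # damping ratio of each mode
    if damping.min() < 0.10:                     # weakly damped or unstable
        weak += 1

print(f"estimated P(damping < 10%) = {weak / n_samples:.3f}")
```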

  2. Distal radius plate of CFR-PEEK has minimal effect compared to titanium plates on bone parameters in high-resolution peripheral quantitative computed tomography: a pilot study.

    Science.gov (United States)

    de Jong, Joost J A; Lataster, Arno; van Rietbergen, Bert; Arts, Jacobus J; Geusens, Piet P; van den Bergh, Joop P W; Willems, Paul C

    2017-02-27

    Carbon-fiber-reinforced poly-ether-ether-ketone (CFR-PEEK) has superior radiolucency compared to other orthopedic implant materials, e.g. titanium or stainless steel, thus allowing metal-artifact-free postoperative monitoring by computed tomography (CT). Recently, high-resolution peripheral quantitative CT (HRpQCT) proved to be a promising technique to monitor the recovery of volumetric bone mineral density (vBMD), micro-architecture and biomechanical parameters in stable, conservatively treated distal radius fractures. When using HRpQCT to monitor unstable distal radius fractures that require volar distal radius plating for fixation, radiolucent CFR-PEEK plates may be a better alternative to currently used titanium plates to allow for reliable assessment. In this pilot study, we assessed the effect of a volar distal radius plate made from CFR-PEEK on bone parameters obtained from HRpQCT in comparison to two titanium plates. Plates were instrumented in separate cadaveric human forearms (n = 3). After instrumentation and after removal of the plates, duplicate HRpQCT scans were made of the region covered by the plate. HRpQCT images were visually checked for artifacts. vBMD, micro-architectural and biomechanical parameters were calculated and compared between the uninstrumented and instrumented radii. No visible image artifacts were observed in the CFR-PEEK plate instrumented radius, and errors in bone parameters ranged from -3.2 to 2.6%. In the radii instrumented with the titanium plates, severe image artifacts were observed and errors in bone parameters ranged between -30.2 and 67.0%. We recommend using CFR-PEEK plates in longitudinal in vivo studies that monitor the healing process of unstable distal radius fractures treated operatively by plating or bone graft ingrowth.

  3. Student Study Choices in the Principles of Economics: A Case Study of Computer Usage

    OpenAIRE

    Grimes, Paul W.; Sanderson, Patricia L.; Ching, Geok H.

    1996-01-01

    Principles of Economics students at Mississippi State University were provided the opportunity to use computer assisted instruction (CAI) as a supplemental study activity. Students were free to choose the extent of their computer work. Throughout the course, weekly surveys were conducted to monitor the time each student spent with their textbook, computerized tutorials, workbook, class notes, and study groups. The surveys indicated that only a minority of the students actively pursued CAI....

  4. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. computer/typewriter use ≥4 vs. computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
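
    The pooled odds ratios quoted above come from random-effects meta-analysis; a DerSimonian-Laird pooling of log odds ratios can be reproduced in a few lines, as sketched below with invented example studies rather than the data of this review.

```python
# DerSimonian-Laird random-effects pooling of odds ratios, the kind of
# calculation behind the pooled ORs quoted above. The inputs are invented
# example studies, not the data from this meta-analysis.
import numpy as np

or_i = np.array([1.2, 1.6, 0.9, 1.4, 1.8])            # study odds ratios (invented)
ci_hi = np.array([2.0, 2.9, 1.5, 2.4, 3.3])           # upper 95% CI limits (invented)

y = np.log(or_i)                                       # log odds ratios
se = (np.log(ci_hi) - y) / 1.96                        # SE recovered from the CI width
v = se**2

w_fixed = 1.0 / v                                      # fixed-effect weights
y_bar = np.sum(w_fixed * y) / w_fixed.sum()
q = np.sum(w_fixed * (y - y_bar)**2)                   # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) /
           (w_fixed.sum() - np.sum(w_fixed**2) / w_fixed.sum()))

w = 1.0 / (v + tau2)                                   # random-effects weights
pooled = np.sum(w * y) / w.sum()
se_pooled = np.sqrt(1.0 / w.sum())
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se_pooled):.2f}-"
      f"{np.exp(pooled + 1.96*se_pooled):.2f}, tau^2 = {tau2:.3f})")
```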

  5. Computed Tomography Study Of Complicated Bacterial Meningitis ...

    African Journals Online (AJOL)

    To monitor the structural intracranial complications of bacterial meningitis using computed tomography (CT) scan. Retrospective study of medical and radiological records of patients who underwent CT scan over a 4 year period. A University Teaching Hospital in a developing country. Thirty three patients with clinically and ...

  6. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-02-01

    The role of Nuclear Engineering Education in the application of computers to controlled fusion research can be a very important one. In the near future the use of computers in the numerical modelling of fusion systems should increase substantially. A recent study group has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies are called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. In order to meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR laboratories by a communications network. The crucial element that is needed for success is trained personnel. The number of people with knowledge of plasma science and engineering that are trained in numerical methods and computer science is quite small, and must be increased substantially in the next few years. Nuclear Engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing. (U.S.)

  7. Comparative analysis of bone mineral contents with dual-energy quantitative computed tomography

    International Nuclear Information System (INIS)

    Choi, T. J.; Yoon, S. M.; Kim, O. B.; Lee, S. M.; Suh, S. J.

    1997-01-01

    Dual-Energy Quantitative Computed Tomography (DEQCT) was compared with a bone-equivalent K2HPO4 standard solution and with the ash weight of animal cadaveric trabecular bone in the measurement of bone mineral content (BMC). The attenuation coefficient of tissues depends strongly on the radiation energy and on the density and effective atomic number of the composition. The bone mineral content in DEQCT was determined from empirical constants and the mass attenuation coefficients of bone, fat and soft-tissue equivalent solutions in two photon spectra. In these experiments, the BMC from DEQCT with 80 and 120 kVp X rays was compared to the ash weight of animal trabecular bone. We obtained mass attenuation coefficients of 0.2409, 0.5608 and 0.2206 cm2/g in the 80 kVp spectrum, and 0.2046, 0.3273 and 0.1971 cm2/g in the 120 kVp X-ray spectrum, for water, bone and fat equivalent materials, respectively. The BMC from DEQCT obtained with the empirical constants K1 = 0.3232 and K2 = 0.2450 and these mass attenuation coefficients was very close to the ash weight of animal trabecular bone. The BMC of the empirical DEQCT and that of the manufacturer's DEQCT were correlated with ash weight, with correlation coefficients of r = 0.998 and r = 0.996, respectively. The BMC of the empirical DEQCT using the experimental mass attenuation coefficients and that of the manufacturer's method were both very close to the ash weight of animal trabecular bone. (author)

  8. Comparative analysis of cervical spine management in a subset of severe traumatic brain injury cases using computer simulation.

    Directory of Open Access Journals (Sweden)

    Kimbroe J Carter

    Full Text Available BACKGROUND: No randomized control trial to date has studied the use of cervical spine management strategies in cases of severe traumatic brain injury (TBI at risk for cervical spine instability solely due to damaged ligaments. A computer algorithm is used to decide between four cervical spine management strategies. A model assumption is that the emergency room evaluation shows no spinal deficit and a computerized tomogram of the cervical spine excludes the possibility of fracture of cervical vertebrae. The study's goal is to determine cervical spine management strategies that maximize brain injury functional survival while minimizing quadriplegia. METHODS/FINDINGS: The severity of TBI is categorized as unstable, high risk and stable based on intracranial hypertension, hypoxemia, hypotension, early ventilator associated pneumonia, admission Glasgow Coma Scale (GCS and age. Complications resulting from cervical spine management are simulated using three decision trees. Each case starts with an amount of primary and secondary brain injury and ends as a functional survivor, severely brain injured, quadriplegic or dead. Cervical spine instability is studied with one-way and two-way sensitivity analyses providing rankings of cervical spine management strategies for probabilities of management complications based on QALYs. Early collar removal received more QALYs than the alternative strategies in most arrangements of these comparisons. A limitation of the model is the absence of testing against an independent data set. CONCLUSIONS: When clinical logic and components of cervical spine management are systematically altered, changes that improve health outcomes are identified. In the absence of controlled clinical studies, the results of this comparative computer assessment show that early collar removal is preferred over a wide range of realistic inputs for this subset of traumatic brain injury. Future research is needed on identifying factors in
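
    The ranking of strategies in this kind of decision model comes down to computing expected quality-adjusted life years (QALYs) over the outcome probabilities of each branch and sweeping the complication probabilities in one-way sensitivity analyses. The following is a minimal, hypothetical sketch of that calculation; the probabilities, QALY weights and strategy names are illustrative and are not the paper's values.

        # Hypothetical QALY weights for the four model outcomes
        QALY = {"functional_survivor": 20.0, "severe_brain_injury": 6.0,
                "quadriplegia": 4.0, "death": 0.0}

        def expected_qalys(outcome_probs):
            """Expected QALYs of one strategy given its outcome probability distribution."""
            assert abs(sum(outcome_probs.values()) - 1.0) < 1e-9
            return sum(p * QALY[o] for o, p in outcome_probs.items())

        # Illustrative strategies with made-up outcome distributions
        strategies = {
            "early_collar_removal": {"functional_survivor": 0.60, "severe_brain_injury": 0.25,
                                     "quadriplegia": 0.01, "death": 0.14},
            "collar_until_cleared": {"functional_survivor": 0.55, "severe_brain_injury": 0.29,
                                     "quadriplegia": 0.005, "death": 0.155},
        }

        # One-way sensitivity: vary the quadriplegia risk of early removal and re-rank
        for extra_risk in (0.0, 0.01, 0.02, 0.05):
            probs = dict(strategies["early_collar_removal"])
            probs["quadriplegia"] += extra_risk
            probs["functional_survivor"] -= extra_risk
            ranking = sorted(
                [("early_collar_removal", expected_qalys(probs)),
                 ("collar_until_cleared", expected_qalys(strategies["collar_until_cleared"]))],
                key=lambda kv: kv[1], reverse=True)
            print(extra_risk, ranking)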

  9. Experimental and computational studies on a gasifier based stove

    International Nuclear Information System (INIS)

    Varunkumar, S.; Rajan, N.K.S.; Mukunda, H.S.

    2012-01-01

    Highlights: ► A simple method to calculate the fraction of HHC was devised. ► ηg for the stove is the same as that of a downdraft gasifier. ► Gas from the stove contains 5.5% CH₄ equivalent of HHC. ► Effect of vessel size on utilization efficiency brought out clearly. ► Contribution of radiative heat transfer from the char bed to efficiency is 6%. - Abstract: The work reported here is concerned with a detailed thermochemical evaluation of the flaming mode behaviour of a gasifier based stove. Determination of the gas composition over the fuel bed, and of surface and gas temperatures in the gasification process, constitute the principal experimental features. A simple atomic balance for the gasification reaction combined with the gas composition from the experiments is used to determine the CH₄ equivalent of higher hydrocarbons and the gasification efficiency (ηg). The components of utilization efficiency, namely gasification–combustion and heat transfer, are explored. Reactive flow computational studies using the measured gas composition over the fuel bed are used to simulate the thermochemical flow field and heat transfer to the vessel; hitherto ignored vessel size effects in the extraction of heat from the stove are established clearly. The overall flaming mode efficiency of the stove is 50–54%; the convective and radiative components of heat transfer are established to be 45–47% and 5–7% respectively. The efficiency estimates from reacting computational fluid dynamics (RCFD) compare well with experiments.

  10. Comparative study of the geostatistical ore reserve estimation method over the conventional methods

    International Nuclear Information System (INIS)

    Kim, Y.C.; Knudsen, H.P.

    1975-01-01

    Part I contains a comprehensive treatment of the comparative study of the geostatistical ore reserve estimation method over the conventional methods. The conventional methods chosen for comparison were: (a) the polygon method, (b) the inverse of the distance squared method, and (c) a method similar to (b) but allowing different weights in different directions. Briefly, the overall result from this comparative study is in favor of the use of geostatistics in most cases because the method has lived up to its theoretical claims. A good exposition on the theory of geostatistics, the adopted study procedures, conclusions and recommended future research are given in Part I. Part II of this report contains the results of the second and the third study objectives, which are to assess the potential benefits that can be derived by the introduction of the geostatistical method to the current state-of-the-art in uranium reserve estimation method and to be instrumental in generating the acceptance of the new method by practitioners through illustrative examples, assuming its superiority and practicality. These are given in the form of illustrative examples on the use of geostatistics and the accompanying computer program user's guide
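
    As a reference point for the conventional estimators mentioned above, the inverse-of-the-distance-squared method assigns each unsampled block a weighted average of nearby sample grades, with weights proportional to 1/d². A minimal sketch follows; the sample coordinates and grades are hypothetical and only illustrate the estimator, not the report's data.

        import math

        def idw_estimate(samples, target, power=2.0, eps=1e-12):
            """Inverse-distance-weighted grade estimate at `target`.

            samples: list of ((x, y), grade) tuples; power=2 gives the
            inverse-of-the-distance-squared method.
            """
            num, den = 0.0, 0.0
            for (x, y), grade in samples:
                d = math.hypot(x - target[0], y - target[1])
                if d < eps:                    # target coincides with a sample
                    return grade
                w = 1.0 / d ** power
                num += w * grade
                den += w
            return num / den

        # Hypothetical drill-hole samples: ((easting, northing), grade)
        samples = [((0, 0), 0.12), ((50, 10), 0.08), ((20, 60), 0.15), ((80, 80), 0.05)]
        print(round(idw_estimate(samples, (30, 30)), 4))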

  11. Brain-computer interfacing under distraction: an evaluation study

    DEFF Research Database (Denmark)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes

    2016-01-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach...

  12. Follow up study of Alzheimer's type dementia with computed tomography

    International Nuclear Information System (INIS)

    Hirata, Nobuhide

    1987-01-01

    In 54 patients who were diagnosed as having Alzheimer's type dementia based on the Diagnostic and Statistical Manual of Mental Disorders, III, cranial computed tomography (CT) scans were obtained before and after their follow-up study ranging from 6 to 24 months (mean 15.4 ± 4.7 months). Cerebrospinal percentage and CT density in various regions of interest were examined. Six patients died during the study. Comparison of the group of the deceased (Group I) with the group of survivors (Group II) revealed: (1) there was no difference in average age and the degree of mental disorder at first presentation; (2) Group I had decreased activities of daily living; and (3) CT density was significantly decreased in the bilateral lateral and frontal lobes in Group I. As for Group II, decreased CT numbers were noticeable during the follow-up period in the frontal lobe, parietal lobe, and caudate nucleus in the group evaluated as aggravated, as compared with the group evaluated as unchanged. (Namekawa, K.)

  13. Diagnostic significance of rib series in minor thorax trauma compared to plain chest film and computed tomography.

    Science.gov (United States)

    Hoffstetter, Patrick; Dornia, Christian; Schäfer, Stephan; Wagner, Merle; Dendl, Lena M; Stroszczynski, Christian; Schreyer, Andreas G

    2014-01-01

    Rib series (RS) are a special radiological technique to improve the visualization of the bony parts of the chest. The aim of this study was to evaluate the diagnostic accuracy of rib series in minor thorax trauma. Retrospective study of 56 patients who received RS; 39 patients were additionally evaluated by plain chest film (PCF). All patients underwent a computed tomography (CT) of the chest. RS and PCF were re-read independently by three radiologists, and the results were compared with CT as the gold standard. Sensitivity, specificity, and negative and positive predictive values were calculated. Significance of the differences in findings was determined by the McNemar test, and interobserver variability by Cohen's kappa test. 56 patients were evaluated (34 men, 22 women, mean age 61 years). In 22 patients one or more rib fractures could be identified by CT. In 18 of these cases (82%) the correct diagnosis was made by RS, and in 16 cases (73%) the correct number of involved ribs was detected. These differences were significant (p = 0.03). Specificity was 100%; negative and positive predictive values were 85% and 100%. Kappa values for interobserver agreement were 0.92-0.96. Sensitivity of PCF was 46% and was significantly lower (p = 0.008) compared to CT. Rib series does not seem to be a useful examination in evaluating minor thorax trauma. CT seems to be the method of choice to detect rib fractures, but the clinical value of the radiological proof has to be discussed and investigated in larger follow-up studies.
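
    The accuracy figures quoted above follow directly from the 2×2 contingency table of rib-series findings against the CT gold standard. A small sketch of that calculation; the counts below are hypothetical and only illustrate the formulae, not the study's exact table.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV and NPV from a 2x2 table."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Hypothetical counts: rib series vs. CT gold standard
        print(diagnostic_metrics(tp=18, fp=0, fn=4, tn=34))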

  14. Security Problems in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rola Motawie

    2016-12-01

    Full Text Available Cloud is a pool of computing resources which are distributed among cloud users. Cloud computing has many benefits like scalability, flexibility, cost savings, reliability, maintenance and mobile accessibility. Since cloud-computing technology is growing day by day, it comes with many security problems. Securing the data in the cloud environment is most critical challenges which act as a barrier when implementing the cloud. There are many new concepts that cloud introduces, such as resource sharing, multi-tenancy, and outsourcing, create new challenges for the security community. In this work, we provide a comparable study of cloud computing privacy and security concerns. We identify and classify known security threats, cloud vulnerabilities, and attacks.

  15. Progression criteria for cancer antigen 15.3 and carcinoembryonic antigen in metastatic breast cancer compared by computer simulation of marker data

    DEFF Research Database (Denmark)

    Sölétormos, G; Hyltoft Petersen, P; Dombernowsky, P

    2000-01-01

    BACKGROUND: We investigated the utility of computer simulation models for performance comparisons of different tumor marker assessment criteria to define progression or nonprogression of metastatic breast cancer. METHODS: Clinically relevant values for progressive cancer antigen 15.3 and carcinoembryonic antigen concentrations were combined with representative values for background variations in a computer simulation model. Fifteen criteria for assessment of longitudinal tumor marker data were obtained from the literature and computerized. Altogether, 7200 different patients, each based on 50... ...of progression. CONCLUSIONS: The computer simulation model is a fast, effective, and inexpensive approach for comparing the diagnostic potential of assessment criteria during clinically relevant conditions of steady-state and progressive disease. The model systems can be used to generate tumor marker assessment...
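
    The simulation approach can be illustrated with a short sketch: longitudinal marker values are generated around either a steady-state or a progressing "true" concentration, background variation is added, and a candidate assessment criterion is applied to each simulated series. Everything below (growth rate, variation, the confirmed-25%-increase criterion) is hypothetical and only mirrors the general idea, not the published model.

        import random

        def simulate_series(n=50, baseline=30.0, growth=0.0, cv=0.15, seed=None):
            """Simulate n longitudinal marker values with relative growth and CV noise."""
            rng = random.Random(seed)
            series = []
            true_value = baseline
            for _ in range(n):
                series.append(true_value * (1.0 + rng.gauss(0.0, cv)))
                true_value *= (1.0 + growth)          # per-sample relative growth
            return series

        def criterion_confirmed_increase(series, threshold=0.25):
            """Flag progression when two consecutive values exceed baseline by `threshold`."""
            baseline = series[0]
            for prev, cur in zip(series, series[1:]):
                if prev > baseline * (1 + threshold) and cur > baseline * (1 + threshold):
                    return True
            return False

        progressive = [simulate_series(growth=0.05, seed=i) for i in range(1000)]
        steady = [simulate_series(growth=0.00, seed=10_000 + i) for i in range(1000)]
        sensitivity = sum(map(criterion_confirmed_increase, progressive)) / len(progressive)
        false_pos = sum(map(criterion_confirmed_increase, steady)) / len(steady)
        print(f"sensitivity ~ {sensitivity:.2f}, false-positive rate ~ {false_pos:.2f}")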

  16. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study on the acceptance of computer systems in surgeries. 11,000 returned questionnaires from surgeons--users and nonusers--were analysed. We found that most of the surgeons used their computers in a limited way, i.e. as a device for accounting. Concerning the level of utilisation, there are differences between men and women, West and East, and young and old. In this study we also analysed the computer-using behaviour of gynaecologic surgeons. As a result, two-thirds of all nonusers do not intend to use a computer in the future.

  17. E-commerce, paper and energy use: a case study concerning a Dutch electronic computer retailer

    Energy Technology Data Exchange (ETDEWEB)

    Hoogeveen, M.J.; Reijnders, L. [Open University Netherlands, Heerlen (Netherlands)

    2002-07-01

    Impacts of the application of e-commerce on paper and energy use are analysed in a case study concerning a Dutch electronic retailer (e-tailer) of computers. The estimated use of paper associated with the e-tailer concerned was substantially reduced if compared with physical retailing or traditional mail-order retailing. However, the overall effect of e-tailing on paper use strongly depends on customer behaviour. Some characteristics of e-commerce, as practised by the e-tailer concerned, such as diminished floor space requirements, reduced need for personal transport and simplified logistics, improve energy efficiency compared with physical retailing. Substitution of paper information by online information has an energetic effect that is dependent on the time of online information perusal and the extent to which downloaded information is printed. Increasing distances from producers to consumers, outsourcing, and increased use of computers, associated equipment and electronic networks are characteristics of e-commerce that may have an upward effect on energy use. In this case study, the upward effects thereof on energy use were less than the direct energy efficiency gains. However, the indirect effects associated with increased buying power and the rebound effect on transport following from freefalling travel time greatly exceeded direct energy efficiency gains. (author)

  18. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  19. Computer self-efficacy - is there a gender gap in tertiary level introductory computing classes?

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available This paper explores the relationship between introductory computing students, self-efficacy, and gender. Since the use of computers has become more common, there has been speculation that the confidence and ability to use them differ between genders. Self-efficacy is an important and useful concept used to describe how a student may perceive their own ability or confidence in using and learning new technology. A survey of students in an introductory computing class has been completed intermittently since the late 1990s. Although some questions have been adapted to meet changing technology, the aim of the survey has remained unchanged. In this study self-efficacy is measured using two self-rating questions. Students are asked to rate their confidence using a computer and also asked to give their perception of their computing knowledge. This paper examines these two aspects of a person's computer self-efficacy in order to identify any differences that may occur between genders in two introductory computing classes, one in 1999 and the other in 2012. Results from the 1999 survey are compared with those from the survey completed in 2012 and investigated to ascertain whether the perception that males were more likely to display higher computer self-efficacy levels than their female classmates does or did exist in a class of this type. Results indicate that while overall there has been a general increase in self-efficacy levels in 2012 compared with 1999, there is no significant gender gap.

  20. Comparative Study of the Effectiveness of Three Learning Environments: Hyper-Realistic Virtual Simulations, Traditional Schematic Simulations and Traditional Laboratory

    Science.gov (United States)

    Martinez, Guadalupe; Naranjo, Francisco L.; Perez, Angel L.; Suero, Maria Isabel; Pardo, Pedro J.

    2011-01-01

    This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output.…

  1. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and its right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved in the course of the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies and with the increasing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. A new metasubject result of education associated with

  2. Two-dimensional speckle-tracking strain echocardiography in long-term heart transplant patients: a study comparing deformation parameters and ejection fraction derived from echocardiography and multislice computed tomography.

    Science.gov (United States)

    Syeda, Bonni; Höfer, Peter; Pichler, Philipp; Vertesich, Markus; Bergler-Klein, Jutta; Roedler, Susanne; Mahr, Stephane; Goliasch, Georg; Zuckermann, Andreas; Binder, Thomas

    2011-07-01

    Longitudinal strain determined by speckle tracking is a sensitive parameter to detect systolic left ventricular dysfunction. In this study, we assessed regional and global longitudinal strain values in long-term heart transplants and compared deformation indices with ejection fraction as determined by transthoracic echocardiography (TTE) and multislice computed tomographic coronary angiography (MSCTA). TTE and MSCTA were prospectively performed in 31 transplant patients (10.6 years post-transplantation) and in 42 control subjects. Grey-scale apical views were recorded for speckle tracking (EchoPAC 7.0, GE) of the 16 segments of the left ventricle. The presence of coronary artery disease (CAD) was assessed by MSCTA. Strain analysis was performed in 1168 segments [496 in transplant patients (42.5%), 672 in control subjects (57.5%)]. Global longitudinal peak systolic strain was significantly lower in the transplant recipients than in the healthy population (-13.9 ± 4.2 vs. -17.4 ± 5.8%, P < 0.05). Ejection fraction (MSCTA/Simpson's method) was 60.7 ± 10.1%/60.2 ± 6.7% in transplant recipients vs. 64.7 ± 6.4%/63.0 ± 6.2% in the healthy population, P=ns. Even though 'healthy' heart transplants without CAD exhibit normal ejection fraction, deformation indices are reduced in this population when compared with control subjects. Our findings suggest that strain analysis is more sensitive than assessment of ejection fraction for the detection of abnormalities of systolic function.

  3. Using Computational and Mechanical Models to Study Animal Locomotion

    OpenAIRE

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locom...

  4. Assessment of the ABC/2 Method of Epidural Hematoma Volume Measurement as Compared to Computer-Assisted Planimetric Analysis.

    Science.gov (United States)

    Hu, Ting-Ting; Yan, Ling; Yan, Peng-Fei; Wang, Xuan; Yue, Ge-Fen

    2016-01-01

    Epidural hematoma volume (EDHV) is an independent predictor of prognosis in patients with epidural hematoma (EDH) and plays a central role in treatment decision making. This study's objective was to determine the accuracy and reliability of the widely used volume measurement method ABC/2 in estimating EDHV by comparing it to the computer-assisted planimetric method. A data set of computerized tomography (CT) scans of 35 patients with EDH was evaluated to determine the accuracy of the ABC/2 method, using the computer-assisted planimetric technique to establish the reference criterion of EDHV for each patient. Another data set was constructed by randomly selecting 5 patients and then replicating each case twice to yield 15 patients. Intra- and interobserver reliability were evaluated by asking four observers to independently estimate EDHV for the latter data set using the ABC/2 method. Estimation of EDHV using the ABC/2 method showed high intra- and interobserver reliability (intra-class correlation coefficient = .99). These estimates were closely correlated with planimetric measures (r = .99). But the ABC/2 method generally overestimated EDHV, especially in the nonellipsoid-like group. The difference between the ABC/2 measures and planimetric measures was statistically significant (p < .05). The ABC/2 method could be used for EDHV measurement, which would contribute to treatment decision making as well as clinical outcome prediction. However, clinicians should be aware that the ABC/2 method results in a general volume overestimation. Future studies focusing on justification of the technique to improve its accuracy would be of practical value. © The Author(s) 2015.
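
    The ABC/2 formula approximates the hematoma as an ellipsoid: A and B are the largest perpendicular diameters on the slice showing the largest hematoma area, and C is the number of slices on which the hematoma appears multiplied by the slice thickness. Below is a minimal sketch comparing it with a planimetric (traced area × slice thickness) sum; all measurements are hypothetical.

        def abc_over_2(a_cm, b_cm, n_slices, slice_thickness_cm):
            """Ellipsoid approximation of hematoma volume in mL (1 mL = 1 cm^3)."""
            c_cm = n_slices * slice_thickness_cm
            return a_cm * b_cm * c_cm / 2.0

        def planimetric(areas_cm2, slice_thickness_cm):
            """Reference volume: sum of traced hematoma areas times slice thickness."""
            return sum(areas_cm2) * slice_thickness_cm

        # Hypothetical epidural hematoma measurements
        estimate = abc_over_2(a_cm=5.0, b_cm=3.0, n_slices=8, slice_thickness_cm=0.5)
        reference = planimetric([2.1, 5.4, 8.9, 10.2, 9.6, 7.0, 3.8, 1.5], 0.5)
        print(f"ABC/2 = {estimate:.1f} mL, planimetric = {reference:.1f} mL, "
              f"overestimation = {100 * (estimate - reference) / reference:.0f}%")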

  5. Factors affecting the adoption of cloud computing: an exploratory study

    OpenAIRE

    Morgan, Lorraine; Conboy, Kieran

    2013-01-01

    peer-reviewed While it is widely acknowledged that cloud computing has the potential to transform a large part of the IT industry, issues surrounding the adoption of cloud computing have received relatively little attention. Drawing on three case studies of service providers and their customers, this study will contribute to the existing cloud technologies literature that does not address the complex and multifaceted nature of adoption. The findings are analyzed using the adoption of innov...

  6. [The elaboration of aggressiveness in adolescence: Comparative structural study based on the Rorschach test].

    Science.gov (United States)

    Schiltz, L; Diwo, R; de Tychey, C

    2015-09-01

    In adolescence, a component of a successful identity quest consists in elaborating the aggressiveness, be it endured or acted out, in an imaginary and symbolic manner. We will present a comparative study between anxious and violent adolescents, based on the Rorschach test. As the handling of aggressiveness by means of various defense mechanisms and coping strategies contributes to the construction of a sense of reality and of coherent representations of oneself and the others, the Rorschach test is a pertinent tool to study the vicissitudes of the identity quest of middle adolescence. On the other hand, many studies demonstrate that it is also a valuable tool for diagnosing the risk of evolution towards character pathology and personality disorders belonging to cluster B of the DSM, or towards emotional disorders and suicidal tendencies. Thus, it can help initiate appropriate therapeutic measures in a spirit of tertiary prevention. We present a comparative study between a sample of 20 adolescents suffering from anxiety and inhibition of aggressiveness (subgroup anxiety) and a second sample of 20 adolescents suffering from exteriorized aggressiveness and violent behavior (subgroup violence). Inclusion in the subgroups was based on clinical interviews and a thorough psychological assessment, using the criteria of categorical psychopathology. The comparative study between the two subgroups is based on an original rating scale constructed in the phenomenological and structural tradition, reflecting the global judgment of the experienced clinical psychologist. It permits using the Rorschach test as a research tool by making the step from qualitative analysis towards quantification and the use of inferential and multidimensional statistics. It also allows computing correlations between the Rorschach test and psychometric scales or other projective tests, using specific rating scales of the same type. After showing the descriptive demographic data, we

  7. Computer tomographic and angiographic studies of histologically confirmed intrahepatic masses

    International Nuclear Information System (INIS)

    Janson, R.; Lackner, K.; Paquet, K.J.; Thelen, M.; Thurn, P.

    1980-01-01

    The computer tomographic and angiographic findings in 53 patients with intrahepatic masses were compared. The histological findings show that 17 were due to echinococcus, 12 were due to hepatic carcinoma, ten were metastases, five patients had focal nodular hyperplasia, three an alveolar echinococcus and there were three cases with a haemangioma of the liver and a further three liver abscesses. Computer tomography proved superior in peripherally situated lesions, and in those in the left lobe of the liver. Arteriography was better at demonstrating lesions below 2 cm in size, particularly vascular tumours. As a pre-operative measure, angiography is to be preferred since it is able to demonstrate anatomic anomalies and variations in the blood supply, as well as invasion of the portal vein or of the inferior vena cava. (orig.)


  9. Comparative study of PCA in classification of multichannel EMG signals.

    Science.gov (United States)

    Geethanjali, P

    2015-06-01

    Electromyographic (EMG) signals are abundantly used in the field of rehabilitation engineering for controlling prosthetic devices, and it is essential to find a fast and accurate EMG pattern recognition system to avoid intrusive delay. The main objective of this paper is to study the influence of principal component analysis (PCA), a transformation technique, in pattern recognition of six hand movements using four-channel surface EMG signals from ten healthy subjects. For this reason, time domain (TD) statistical features as well as auto regression (AR) coefficients are extracted from the four-channel EMG signals. The extracted statistical features as well as AR coefficients are transformed using PCA to 25, 50 and 75% of the corresponding original feature vector space. The classification accuracy of PCA-transformed and non-PCA-transformed TD statistical features as well as AR coefficients is studied with simple logistic regression (SLR), decision tree (DT) with the J48 algorithm, logistic model tree (LMT), k nearest neighbor (kNN) and neural network (NN) classifiers in the identification of six different movements. The Kruskal-Wallis (KW) statistical test shows that there is a significant reduction (P < 0.05) in classification accuracy with PCA-transformed features compared to non-PCA-transformed features. SLR with non-PCA-transformed time domain (TD) statistical features performs better in accuracy and computational power compared to the other features considered in this study. In addition, the motion control of three drives for six movements of the hand is implemented off-line with SLR using TD statistical features on a TMS320LF2407 digital signal controller (DSC).
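
    A minimal sketch of this kind of pipeline using scikit-learn: time-domain features are projected by PCA to a fraction of the original dimensionality and fed to a simple logistic-regression classifier. The feature matrix here is random placeholder data; in practice X would hold the TD/AR features extracted from the four EMG channels and y the movement labels. This is an illustrative setup, not the authors' implementation.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_samples, n_features, n_classes = 600, 16, 6   # e.g. 4 TD features x 4 channels
        X = rng.normal(size=(n_samples, n_features))    # placeholder for real EMG features
        y = rng.integers(0, n_classes, size=n_samples)  # placeholder movement labels

        for fraction in (0.25, 0.50, 0.75, 1.00):
            n_components = max(1, int(round(fraction * n_features)))
            steps = [StandardScaler()]
            if fraction < 1.0:
                steps.append(PCA(n_components=n_components))
            steps.append(LogisticRegression(max_iter=1000))
            accuracy = cross_val_score(make_pipeline(*steps), X, y, cv=5).mean()
            print(f"{int(fraction * 100):3d}% of feature space -> accuracy {accuracy:.2f}")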

  10. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    Science.gov (United States)

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  11. Scrotal Irradiation in Primary Testicular Lymphoma: Review of the Literature and In Silico Planning Comparative Study

    International Nuclear Information System (INIS)

    Brouwer, Charlotte L.; Wiesendanger, Esther M.; Hulst, Peter C. van der; Imhoff, Gustaaf W. van; Langendijk, Johannes A.; Beijert, Max

    2013-01-01

    We examined adjuvant irradiation of the scrotum in primary testicular lymphoma (PTL) by means of a literature review in MEDLINE, a telephone survey among Dutch institutes, and an in silico planning comparative study on scrotal irradiation in PTL. We did not find any uniform adjuvant irradiation technique assuring a safe planning target volume (PTV) coverage in published reports, and the definition of the clinical target volume is unclear. Histopathologic studies of PTL show a high invasion rate of the tunica albuginea, the epididymis, and the spermatic cord. In retrospective studies, a prescribed dose of at least 30 Gy involving the scrotum is associated with best survival. The majority of Dutch institutes irradiate the whole scrotum without using a planning computed tomography scan, with a single electron beam and a total dose of 30 Gy. The in silico planning comparative study showed that all evaluated approaches met a D95% scrotal dose of at least 85% of the prescription dose, without exceeding the dose limits of critical organs. Photon irradiation with 2 oblique beams using wedges resulted in the best PTV coverage, with a mean value of 95% of the prescribed dose, with lowest maximum dose. Adjuvant photon or electron irradiation of the whole scrotum including the contralateral testicle with a minimum dose of 30 Gy is recommended in PTL. Computed tomography-based radiation therapy treatment planning with proper patient positioning and position verification guarantees optimal dose coverage.

  12. A computer-simulated liver phantom (virtual liver phantom) for multidetector computed tomography evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Funama, Yoshinori [Kumamoto University, Department of Radiological Sciences, School of Health Sciences, Kumamoto (Japan); Awai, Kazuo; Nakayama, Yoshiharu; Liu, Da; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Miyazaki, Osamu; Goto, Taiga [Hitachi Medical Corporation, Tokyo (Japan); Hori, Shinichi [Gate Tower Institute of Image Guided Therapy, Osaka (Japan)

    2006-04-15

    The purpose of this study was to develop a computer-simulated liver phantom for hepatic CT studies. A computer-simulated liver phantom was mathematically constructed on a computer workstation. The computer-simulated phantom was calibrated using real CT images acquired by an actual four-detector CT. We added an inhomogeneous texture to the simulated liver by referring to CT images of chronically damaged human livers. The mean CT number of the simulated liver was 60 HU and we added numerous 5- to 10-mm structures with 60±10 HU/mm. To mimic liver tumors we added nodules measuring 8, 10, and 12 mm in diameter with CT numbers of 60±10, 60±15, and 60±20 HU. Five radiologists visually evaluated the similarity of the texture of the computer-simulated liver phantom and a real human liver to confirm the appropriateness of the virtual liver images, using a five-point scale. The total score was 44 in two radiologists, and 42, 41, and 39 in one radiologist each. They evaluated that the textures of the virtual liver were comparable to those of human liver. Our computer-simulated liver phantom is a promising tool for the evaluation of the image quality and diagnostic performance of hepatic CT imaging. (orig.)
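
    The construction described above can be mimicked with a few lines of array code: a uniform liver background at 60 HU, superimposed small structures to give the inhomogeneous texture, and low-contrast nodules of chosen diameter and HU offset. The sketch below is a 2D, purely illustrative version under these assumptions, not the workstation implementation used in the study.

        import numpy as np

        rng = np.random.default_rng(42)
        size_mm, pixel_mm = 200, 1.0
        n = int(size_mm / pixel_mm)
        phantom = np.full((n, n), 60.0)                      # uniform liver background, 60 HU

        yy, xx = np.mgrid[0:n, 0:n]

        def add_disc(image, center, diameter_mm, delta_hu):
            """Add a circular structure of given diameter and HU offset."""
            r = diameter_mm / (2 * pixel_mm)
            mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= r ** 2
            image[mask] += delta_hu

        # Inhomogeneous texture: many 5- to 10-mm structures with +/-10 HU
        for _ in range(400):
            add_disc(phantom,
                     center=rng.integers(0, n, size=2),
                     diameter_mm=rng.uniform(5, 10),
                     delta_hu=rng.uniform(-10, 10))

        # Low-contrast "tumors": 8, 10 and 12 mm nodules with fixed HU offsets
        for diameter, delta in ((8, 10), (10, 15), (12, 20)):
            add_disc(phantom, center=rng.integers(20, n - 20, size=2),
                     diameter_mm=diameter, delta_hu=delta)

        print(phantom.mean(), phantom.std())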

  13. Computed tomography of the brain in cases with venous vasculitis compared with an age-matched reference group

    International Nuclear Information System (INIS)

    Hannerz, J.; Ericson, K.; Bergstrand, G.; Berggren, B.M.; Edman, G.; Karolinska Sjukhuset, Stockholm; Karolinska Sjukhuset, Stockholm

    1988-01-01

    Patients with a particular, steroid-sensitive headache and often characteristic pathology at orbital phlebography, have been suggested to suffer from venous vasculitis. Fifty such patients were examined with computed tomography (CT) of the brain. The findings were compared with those of an age-matched reference group selected at random to represent normal subjects. The CT examinations were analyzed with respect to size of lateral ventricles and signs of atrophy. In both groups, there was a significant increase of atrophy with age. There was also a significantly higher degree of atrophy in the patient group as compared with the reference group. The findings indicate that the supposedly underlying venous vasculitis is related to early aging and atrophy of the brain. (orig.)

  14. Experimental and computational development of a natural breast phantom for dosimetry studies

    International Nuclear Information System (INIS)

    Nogueira, Luciana B.; Campos, Tarcisio P.R.

    2013-01-01

    This paper describes the experimental and computational development of a natural breast phantom, anthropomorphic and anthropometric, for studies in dosimetry of brachytherapy and teletherapy of the breast. The natural breast phantom developed corresponds to fibroadipose breasts of women aged 30 to 50 years, presenting medium radiographic density. The experimental breast phantom was constituted of three tissue-equivalents (TE's): glandular TE, adipose TE and skin TE. These TE's were developed according to the chemical composition of the human breast and present a radiological response to exposure. Once the construction of the experimental breast phantom was completed, it was mounted on a thorax phantom previously developed by the research group NRI/UFMG. Then the computational breast phantom was constructed by performing computed tomography (CT) in axial slices of the chest phantom. From the images generated by CT, a voxel-based computational model of the thorax phantom was developed with the SISCODES computational program, the computational breast phantom being represented by the same TE's as the experimental breast phantom. The images generated by CT allowed evaluating the radiological equivalence of the tissues. The breast phantom is being used in studies of experimental dosimetry both in brachytherapy and in teletherapy of the breast. Dosimetry studies with the MCNP-5 code using the computational model of the breast phantom are in progress. (author)

  15. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  16. Defragging Computer/Videogame Implementation and Assessment in the Social Studies

    Science.gov (United States)

    McBride, Holly

    2014-01-01

    Students in this post-industrial technological age require opportunities for the acquisition of new skills, especially in the marketplace of innovation. A pedagogical strategy that is becoming more and more popular within social studies classrooms is the use of computer and video games as enhancements to everyday lesson plans. Computer/video games…

  17. A computational study on the influence of insect wing geometry on bee flight mechanics

    Directory of Open Access Journals (Sweden)

    Jeffrey Feaster

    2017-12-01

    Full Text Available Two-dimensional computational fluid dynamics (CFD is applied to better understand the effects of wing cross-sectional morphology on flow field and force production. This study investigates the influence of wing cross-section on insect scale flapping flight performance, for the first time, using a morphologically representative model of a bee (Bombus pensylvanicus wing. The bee wing cross-section was determined using a micro-computed tomography scanner. The results of the bee wing are compared with flat and elliptical cross-sections, representative of those used in modern literature, to determine the impact of profile variation on aerodynamic performance. The flow field surrounding each cross-section and the resulting forces are resolved using CFD for a flight speed range of 1 to 5 m/s. A significant variation in vortex formation is found when comparing the ellipse and flat plate with the true bee wing. During the upstroke, the bee and approximate wing cross-sections have a much shorter wake structure than the flat plate or ellipse. During the downstroke, the flat plate and elliptical cross-sections generate a single leading edge vortex, while the approximate and bee wings generate numerous, smaller structures that are shed throughout the stroke. Comparing the instantaneous aerodynamic forces on the wing, the ellipse and flat plate sections deviate progressively with velocity from the true bee wing. Based on the present findings, a simplified cross-section of an insect wing can misrepresent the flow field and force production. We present the first aerodynamic study using a true insect wing cross-section and show that the wing corrugation increases the leading edge vortex formation frequency for a given set of kinematics.

  18. Studies on variable swirl intake system for DI diesel engine using computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Jebamani Rathnaraj David

    2008-01-01

    Full Text Available It is known that a helical port is more effective than a tangential port to attain the required swirl ratio with minimum sacrifice in the volumetric efficiency. The swirl port is designed for lesser swirl ratio to reduce emissions at higher speeds. But this condition increases the air-fuel mixing time and particulate smoke emissions at lower speeds. Optimum swirl ratio is necessary according to the engine operating condition for optimum combustion and emission reduction. Hence the engine needs variable swirl to enhance the combustion in the cylinder according to its operating conditions, for example at partial load or low speed condition it requires stronger swirl, while the air quantity is more important than the swirl under very high speed or full load and maximum torque conditions. The swirl and charging quantity can easily trade off and can be controlled by the opening of the valve. Hence in this study the steady flow rig experiment is used to evaluate the swirl of a helical intake port design for different operating conditions. The variable swirl plate set up of the W06DTIE2 engine is used to experimentally study the swirl variation for different openings of the valve. The sliding of the swirl plate results in the variation of the area of inlet port entry. Therefore in this study a swirl optimized combustion system varying according to the operating conditions by a variable swirl plate mechanism is studied experimentally and compared with the computational fluid dynamics predictions. In this study the Fluent computational fluid dynamics code has been used to evaluate the flow in the port-cylinder system of a DI diesel engine in a steady flow rig. The computational grid is generated directly from 3-D CAD data and in-cylinder flow simulations, with inflow boundary conditions from experimental measurements, are made using the Fluent computational fluid dynamics code. The results are in very good agreement with experimental results.

  19. The Use of Computer-Assisted Home Exercises to Preserve Physical Function after a Vestibular Rehabilitation Program: A Randomized Controlled Study

    DEFF Research Database (Denmark)

    Brandt, Michael Smærup; Læssøe, Uffe; Grönvall, Erik

    2016-01-01

    Objective. The purpose of this study was to evaluate whether elderly patients with vestibular dysfunction are able to preserve physical functional level, reduction in dizziness, and the patient's quality of life when assistive computer technology is used in comparison with printed instructions. Materials and Methods. Single-blind, randomized, controlled follow-up study. Fifty-seven elderly patients with chronic dizziness were randomly assigned to a computer-assisted home exercise program or to home exercises as described in printed instructions and followed for three months after discharge... Outcomes included physical function, dizziness, and quality of life three months following discharge from hospital. In this specific setup, no greater effect was found by introducing a computer-assisted training program, when compared to standard home training guided by printed instructions. This trial is registered with NCT01344408.

  20. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  1. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report describes the design of the architecture and the performance measurement of a parallel computing environment using Monte Carlo simulation for particle therapy planning on a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed a speed approximately 28 times faster than that of a single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)
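
    The roughly 28-fold speed-up reported above comes from distributing independent particle histories over many cores. Below is a minimal, generic sketch of that pattern using Python's multiprocessing; the toy "simulation" only scores random-walk energy losses and stands in for the actual Monte Carlo dose engine, and the worker count and history totals are arbitrary.

        import random
        from multiprocessing import Pool

        def run_histories(args):
            """Score a batch of independent toy particle histories (stand-in for the MC engine)."""
            n_histories, seed = args
            rng = random.Random(seed)
            deposited = 0.0
            for _ in range(n_histories):
                energy = 1.0
                while energy > 0.01:                       # follow the particle until absorbed
                    lost = energy * rng.uniform(0.0, 0.5)  # random energy loss per step
                    deposited += lost
                    energy -= lost
                deposited += energy                        # deposit the remainder locally
            return deposited

        if __name__ == "__main__":
            total_histories, n_workers = 200_000, 8
            batches = [(total_histories // n_workers, seed) for seed in range(n_workers)]
            with Pool(n_workers) as pool:                  # one process per worker
                total = sum(pool.map(run_histories, batches))
            print(f"energy deposited per history: {total / total_histories:.3f}")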

  2. Intraoperative ultrasound in determining the extent of resection of parenchymal brain tumors - a comparative study with computed tomography and histopathology

    International Nuclear Information System (INIS)

    Chacko, A.G.; Rajshekhar, V.; Kumar, N.K.S.; Athyal, R.; Chacko, G.

    2003-01-01

    Radical excision of parenchymal brain tumours is generally associated with a better long-term outcome; however, it is difficult to ascertain the extent of resection at surgery. We used intra-operative ultrasound [IOUS] to help detect residual tumour and define the tumour-brain interface. Thirty-five patients with parenchymal brain lesions including 11 low-grade and 22 high-grade tumours and 2 inflammatory granulomata were included in the study. The IOUS was used to localize tumours not seen on the surface, define their margins and assess the extent of resection at the end of surgery. Multiple samples from the tumour-brain interface which were reported as tumour or normal tissue on IOUS were submitted to histopathology. The IOUS findings were compared with a postoperative contrast-enhanced computed tomogram [CT] and with histopathology. All tumours irrespective of histology were hyperechoic on IOUS. IOUS was useful in localizing those tumours not seen on the surface of the brain. In 71.4% of cases IOUS was useful in defining their margins; however, in the remaining cases the margins were ill-defined. The tumour margins were ill-defined in those treated previously by radiation. With regard to the extent of excision, after excluding the cases who were irradiated, it was found that in the 28 patients who had parenchymal neoplasms, there was concordance between the ultrasound findings and the postoperative CT scan in 23 cases. Of the 79 samples taken from the tumor-brain interface which were reported as tumour on ultrasound, 66 had histopathological evidence of tumour while 13 samples were negative for tumour. On the other hand, in the tissue sent from 17 sites where the IOUS showed no residual tumour, 2 were positive for tumour on histopathology while 15 were negative. In conclusion, IOUS is a cheap and useful real-time tool for localizing tumours not seen on the brain surface, for defining their margins and for determining the extent of resection. (author)

  3. Security in hybrid cloud computing

    OpenAIRE

    Koudelka, Ondřej

    2016-01-01

    This bachelor thesis deals with the area of hybrid cloud computing, specifically with its security. The major aim of the thesis is to analyze and compare the chosen hybrid cloud providers. As a minor aim, this thesis compares the security challenges of hybrid cloud as opposed to other deployment models. In order to accomplish said aims, this thesis defines the terms cloud computing and hybrid cloud computing in its theoretical part. Furthermore, the security challenges for cloud computing a...

  4. The use of combined single photon emission computed tomography and X-ray computed tomography to assess the fate of inhaled aerosol.

    Science.gov (United States)

    Fleming, John; Conway, Joy; Majoral, Caroline; Tossici-Bolt, Livia; Katz, Ira; Caillibotte, Georges; Perchet, Diane; Pichelin, Marine; Muellinger, Bernhard; Martonen, Ted; Kroneberg, Philipp; Apiou-Sbirlea, Gabriela

    2011-02-01

    Gamma camera imaging is widely used to assess pulmonary aerosol deposition. Conventional planar imaging provides limited information on its regional distribution. In this study, single photon emission computed tomography (SPECT) was used to describe deposition in three dimensions (3D) and combined with X-ray computed tomography (CT) to relate this to lung anatomy. Its performance was compared to planar imaging. Ten SPECT/CT studies were performed on five healthy subjects following carefully controlled inhalation of radioaerosol from a nebulizer, using a variety of inhalation regimes. The 3D spatial distribution was assessed using a central-to-peripheral ratio (C/P) normalized to lung volume and for the right lung was compared to planar C/P analysis. The deposition by airway generation was calculated for each lung and the conducting airways deposition fraction compared to 24-h clearance. The 3D normalized C/P ratio correlated more closely with 24-h clearance than the 2D ratio for the right lung [coefficient of variation (COV) 9% compared to 15%]. Combining 3D imaging with computer analysis is a useful approach for applications requiring regional information on deposition.
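
    The central-to-peripheral ratio used above is a concentration ratio: counts per unit lung volume in a central region divided by counts per unit volume in a peripheral region, which normalizes the index to lung volume. A small sketch of that normalization follows; the counts and region volumes are hypothetical.

        def cp_ratio(central_counts, central_volume_ml, peripheral_counts, peripheral_volume_ml):
            """Volume-normalized central-to-peripheral deposition ratio."""
            central_conc = central_counts / central_volume_ml
            peripheral_conc = peripheral_counts / peripheral_volume_ml
            return central_conc / peripheral_conc

        # Hypothetical SPECT counts and CT-derived region volumes for one lung
        print(round(cp_ratio(central_counts=1.8e5, central_volume_ml=900,
                             peripheral_counts=2.4e5, peripheral_volume_ml=2100), 2))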

  5. Comparative study of thallium-201 single-photon emission computed tomography and electrocardiography in Duchenne and other types of muscular dystrophy

    International Nuclear Information System (INIS)

    Yamamoto, S.; Matsushima, H.; Suzuki, A.; Sotobata, I.; Indo, T.; Matsuoka, Y.

    1988-01-01

    Single-photon emission computed tomography (SPECT) using thallium-201 was compared with 12-lead electrocardiography (ECG) in patients with Duchenne (29), facioscapulohumeral (7), limb-girdle (6) and myotonic (5) dystrophies, by dividing the left ventricular (LV) wall into 5 segments. SPECT showed thallium defects (37 patients, mostly in the posteroapical wall), malrotation (23), apical aneurysm (5) and dilatation (7). ECG showed abnormal QRS (36 patients), particularly as a posterolateral pattern (13). Both methods of assessment were normal in only 7 patients. The Duchenne type frequently showed both a thallium defect (particularly in the posteroapical wall) and an abnormal QRS (predominantly in the posterolateral wall); the 3 other types showed abnormalities over the 5 LV wall segments in both tests. The percent of agreement between the 2 tests was 64, 66, 70, 72 and 72 for the lateral, apical, anteroseptal, posterior and inferior walls, respectively. The 2 tests were discordant in 31% of the LV wall, with SPECT (+) but ECG (-) in 21% (mostly in the apicoinferior wall) and SPECT (-) but ECG (+) in 10% (mostly in the lateral wall). Some patients showed large SPECT hypoperfusion despite minimal electrocardiographic changes. ECG thus appeared to underestimate LV fibrosis and to reflect posteroapical rather than posterolateral dystrophy in its posterolateral QRS pattern. In this disease, extensive SPECT hypoperfusion was also shown, irrespective of clinical subtype and skeletal involvement

  6. Computer-designed surgical guide template compared with free-hand operation for mesiodens extraction in premaxilla using “trapdoor” method

    Science.gov (United States)

    Hu, Ying Kai; Xie, Qian Yang; Yang, Chi; Xu, Guang Zhou

    2017-01-01

    Abstract The aim of this study was to introduce a novel method of mesiodens extraction using a vascularized pedicled bone flap by piezosurgery and to compare the differences between a computer-aided design surgical guide template and free-hand operation. A total of 8 patients with mesiodens, 4 with a surgical guide (group I), and 4 without it (group II) were included in the study. The surgical design was to construct a trapdoor pedicle on the superior mucoperiosteal attachment with application of piezosurgery. The bone lid was repositioned after mesiodens extraction. Group I patients underwent surgeries based on the preoperative planning with surgical guide templates, while group II patients underwent free-hand operation. The outcome variables were success rate, intraoperative time, anterior nasal spine (ANS) position, changes of nasolabial angle (NLA), and major complications. Data from the 2 groups were compared by SPSS 17.0, using Wilcoxon test. The operative time was significantly shorter in group I patients. All the mesiodentes were extracted successfully and no obvious differences of preoperative and postoperative ANS position and NLA value were found in both groups. The patients were all recovered uneventfully. Surgical guide templates can enhance clinical accuracy and reduce operative time by facilitating accurate osteotomies. PMID:28658139

  7. A comparative evaluation of Cone Beam Computed Tomography (CBCT) and Multi-Slice CT (MSCT). Part II: On 3D model accuracy

    International Nuclear Information System (INIS)

    Liang Xin; Lambrichts, Ivo; Sun Yi; Denis, Kathleen; Hassan, Bassam; Li Limin; Pauwels, Ruben; Jacobs, Reinhilde

    2010-01-01

    Aim: The study aim was to compare the geometric accuracy of three-dimensional (3D) surface model reconstructions between five Cone Beam Computed Tomography (CBCT) scanners and one Multi-Slice CT (MSCT) system. Materials and methods: A dry human mandible was scanned with five CBCT systems (NewTom 3G, Accuitomo 3D, i-CAT, Galileos, Scanora 3D) and one MSCT scanner (Somatom Sensation 16). A 3D surface bone model was created from each of the six systems. The reference (gold standard) 3D model was obtained with a high resolution laser surface scanner. The 3D models from the six systems were compared with the gold standard using a point-based rigid registration algorithm. Results: The mean deviation from the gold standard for MSCT was 0.137 mm, and for CBCT it was 0.282, 0.225, 0.165, 0.386 and 0.206 mm for the i-CAT, Accuitomo, NewTom, Scanora and Galileos, respectively. Conclusion: The results show that the accuracy of CBCT 3D surface model reconstructions is somewhat lower than, but acceptable compared with, that of MSCT relative to the gold standard.
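
    A point-based rigid registration of the kind used to compare each 3D model with the laser-scanned reference can be written compactly with the Kabsch/SVD solution for the optimal rotation and translation, after which the mean deviation is simply the average distance between corresponding points. The following is a self-contained sketch on synthetic points, not the authors' implementation.

        import numpy as np

        def rigid_register(source, target):
            """Least-squares rigid transform (R, t) mapping source points onto target points."""
            cs, ct = source.mean(axis=0), target.mean(axis=0)
            H = (source - cs).T @ (target - ct)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:           # avoid reflections
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            t = ct - R @ cs
            return R, t

        def mean_deviation(source, target, R, t):
            """Mean point-to-point distance after applying the registration."""
            aligned = source @ R.T + t
            return np.linalg.norm(aligned - target, axis=1).mean()

        # Synthetic test: rotate/translate a point cloud, add noise, register it back
        rng = np.random.default_rng(0)
        reference = rng.uniform(-30, 30, size=(500, 3))                # "laser scan" points (mm)
        angle = np.deg2rad(10)
        R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                           [np.sin(angle),  np.cos(angle), 0],
                           [0, 0, 1]])
        model = reference @ R_true.T + np.array([5.0, -2.0, 1.0])
        model += rng.normal(scale=0.2, size=model.shape)               # segmentation error

        R, t = rigid_register(model, reference)
        print(f"mean deviation: {mean_deviation(model, reference, R, t):.3f} mm")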

  8. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    Science.gov (United States)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  9. Comparative study on skull CT scan and bone scintigraphy in chronic hemodialysed patients

    International Nuclear Information System (INIS)

    Ochi, Hironobu; Inoue, Yuichi; Fukuda, Teruo; Shibakiri, Ippei; Tsuda, Kazuyoshi

    1981-01-01

    A comparative study of computed tomography (XCT) scans utilizing an EMI head unit and radionuclide bone scans was performed in 17 patients with chronic renal failure on maintenance hemodialysis. The bone scintigram was positive in 7 out of 17 patients. The EMI number of the skull in the positive bone scintigram group was significantly lower than that in the negative bone scintigram group. The radionuclide bone scan is the most useful method to detect early bone change, and the XCT scan will determine the grade of the bone mineral content. XCT is especially useful to follow patients under medical treatment (active vitamin D3 therapy) in order to assess the therapeutic effect. (author)

  10. Experimental and computational study on thermoelectric generators using thermosyphons with phase change as heat exchangers

    International Nuclear Information System (INIS)

    Araiz, M.; Martínez, A.; Astrain, D.; Aranguren, P.

    2017-01-01

    Highlights: • Thermosyphon with phase change heat exchanger computational model. • Construction and experimentation of a prototype. • ±9% of maximum deviation from experimental values of the main outputs. • Influence of the auxiliary equipment on the net power generation. - Abstract: An important issue in thermoelectric generators is the thermal design of the heat exchangers, since it can improve their performance by increasing the heat absorbed or dissipated by the thermoelectric modules. Owing to its several advantages over conventional dissipation systems, a thermosyphon heat exchanger with phase change is proposed for the cold side of thermoelectric generators. Some of these advantages are high heat-transfer rates; the absence of moving parts and the lack of auxiliary consumption (because fans or pumps are not required); and the fact that these systems are wickless. A computational model is developed to design and predict the behaviour of these heat exchangers. Furthermore, a prototype has been built and tested in order to demonstrate its performance and validate the computational model. The model predicts the thermal resistance of the heat exchanger with a relative error in the interval [−8.09%; 7.83%] in 95% of the cases. Finally, the use of thermosyphons with phase change in thermoelectric generators has been studied in a waste-heat recovery application, showing that including them on the cold side of the generators improves the net thermoelectric production by 36% compared with that obtained with finned dissipators under forced convection.

  11. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems.

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Today, owing to developing communication technologies, computer games and other audio-visual media have become very attractive social phenomena with a great effect on children and adolescents. The increasing popularity of these games among children and adolescents has raised public concern about their possible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. This was a descriptive-correlational study of 384 randomly chosen male guidance-school students. They were asked to answer the researchers' questionnaire about computer games and Achenbach's Youth Self-Report (YSR). The results of this study indicated a direct, statistically significant correlation (at the 95% confidence level) between the amount of time adolescents spent playing games and anxiety/depression, withdrawal/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students' place of living and their parents' job, and the use of computer games. Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents.

  13. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration

    2017-01-01

    The Higgs boson physics is one of the most important and promising fields of study in modern High Energy Physics. To perform precision measurements of the Higgs boson properties, fast and efficient instruments for Monte Carlo event simulation are required. Due to the increasing amount of data and the growing complexity of the simulation software tools, the computing resources currently available for Monte Carlo simulation on the LHC GRID are not sufficient. One possibility to address this shortfall of computing resources is the use of institutes' computer clusters, commercial computing resources and supercomputers. In this paper, a brief description of the Higgs boson physics and of the Monte Carlo generation and event simulation techniques is presented. A description of modern high performance computing systems and tests of their performance are also discussed. These studies have been performed on the Worldwide LHC Computing Grid and the Kurchatov Institute Data Processing Center, including Tier...

  14. Office workers' computer use patterns are associated with workplace stressors.

    Science.gov (United States)

    Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J

    2014-11-01

    This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s-5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
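
    As a rough illustration of the kind of linear regression analysis described above, the sketch below fits usage duration against a single stressor score; the data and variable names are synthetic illustrations, not the study's dataset:

        # Minimal sketch (hypothetical data): is daily computer-use duration
        # associated with an overcommitment score? Mirrors the spirit of the
        # linear regression analyses described above, nothing more.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        overcommitment = rng.normal(0, 1, 93)   # standardised questionnaire score
        minutes_per_day = 420 + 30 * overcommitment + rng.normal(0, 60, 93)  # recorded usage

        result = stats.linregress(overcommitment, minutes_per_day)
        print(f"slope = {result.slope:.1f} min per SD, p = {result.pvalue:.3f}")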

  15. Mg co-ordination with potential carcinogenic molecule acrylamide: Spectroscopic, computational and cytotoxicity studies

    Science.gov (United States)

    Singh, Ranjana; Mishra, Vijay K.; Singh, Hemant K.; Sharma, Gunjan; Koch, Biplob; Singh, Bachcha; Singh, Ranjan K.

    2018-03-01

    Acrylamide (acr) is a potentially toxic molecule produced in thermally processed foodstuffs. The acr-Mg complex was synthesized chemically and characterized by spectroscopic techniques. The binding sites of acr with Mg were identified by experimental and computational methods. Both experimental and theoretical results suggest that Mg coordinates with the oxygen atom of the C=O group of acr. In-vitro cytotoxicity studies revealed a significant decrease in the toxicity of the acr-Mg complex compared with pure acr. The decrease in toxicity on complexation with Mg may be a useful step for future research aimed at reducing the toxicity of acr.

  16. An Exploratory Study of Pauses in Computer-Assisted EFL Writing

    Science.gov (United States)

    Xu, Cuiqin; Ding, Yanren

    2014-01-01

    The advance of computer input log and screen-recording programs over the last two decades has greatly facilitated research into the writing process in real time. Using Inputlog 4.0 and Camtasia 6.0 to record the writing process of 24 Chinese EFL writers in an argumentative task, this study explored L2 writers' pausing patterns in computer-assisted…

  17. Computational methods for stellarator configurations

    International Nuclear Information System (INIS)

    Betancourt, O.

    1992-01-01

    This project had two main objectives. The first was to continue to develop computational methods for the study of three-dimensional magnetic confinement configurations. The second was to collaborate and interact with researchers in the field who can use these techniques to study and design fusion experiments. The first objective has been achieved with the development of the spectral code BETAS and the formulation of a new variational approach for the study of magnetic island formation in a self-consistent fashion. The code can compute the correct island width corresponding to the saturated island, a result demonstrated by comparing the computed islands with the results of unstable tearing modes in tokamaks and with experimental results in the IMS Stellarator. In addition to studying three-dimensional nonlinear effects in tokamak configurations, these self-consistently computed island equilibria will be used to study transport effects due to magnetic island formation and nonlinearly bifurcated equilibria. The second objective was achieved through direct collaboration with Steve Hirshman at Oak Ridge and D. Anderson and R. Talmage at Wisconsin, as well as through participation in the Sherwood and APS meetings.

  18. Musculoskeletal Problems Associated with University Students Computer Users: A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Rakhadani PB

    2017-07-01

    Full Text Available While several studies have examined the prevalence and correlates of musculoskeletal problems among university students, scanty information exists in the South African context. The objective of this study was to determine the prevalence, causes and consequences of musculoskeletal problems among University of Venda student computer users. This cross-sectional study involved 694 university students at the University of Venda. A self-designed questionnaire was used to collect information on sociodemographic characteristics, problems associated with computer use, and causes of musculoskeletal problems associated with computer use. The majority (84.6%) of the participants used the computer for the internet, followed by word processing (20.3%) and games (18.7%). The students reported neck pain when using the computer (52.3%), as well as pain in the shoulder (47.0%), fingers (45.0%), lower back (43.1%), general body (42.9%), elbow (36.2%), wrist (33.7%), hip and foot (29.1%) and knee (26.2%). Reported causes of musculoskeletal pain associated with computer usage were sitting position, low chairs, a lot of time spent on the computer, uncomfortable laboratory chairs, and stress. Eye problems (51.9%), muscle cramp (344.0%), headache (45.3%), blurred vision (38.0%), feeling of illness (39.9%) and missed lectures (29.1%) were consequences of musculoskeletal problems linked to computer use. The majority of students reported having mild pain (43.7%), followed by moderate (24.2%) and severe (8.4%) pain. Years of computer use were significantly associated with neck, shoulder and wrist pain. Using the computer for the internet was significantly associated with neck pain (OR=0.60; 95% CI 0.40-0.93); games with neck (OR=0.60; 95% CI 0.40-0.85) and hip/foot pain (OR=0.60; 95% CI 0.40-0.92); programming with elbow (OR=1.78; 95% CI 1.10-2.94) and wrist pain (OR=2.25; 95% CI 1.36-3.73); while word processing was significantly associated with lower back pain (OR=1.45; 95% CI 1.03-2.04). Undergraduate study had a significant association with elbow pain (OR=2
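
    The odds ratios and 95% confidence intervals quoted above can be reproduced from 2x2 contingency counts; the sketch below shows the standard Woolf (logit) calculation with made-up counts, not the survey's data:

        # Minimal sketch: odds ratio with a 95% confidence interval from a 2x2 table,
        # the kind of quantity reported above (e.g. OR = 1.78, 95% CI 1.10-2.94).
        # The counts below are invented for illustration only.
        import math

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """a,b = exposed with/without outcome; c,d = unexposed with/without outcome."""
            or_ = (a * d) / (b * c)
            se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf's logit method
            lo = math.exp(math.log(or_) - z * se_log_or)
            hi = math.exp(math.log(or_) + z * se_log_or)
            return or_, lo, hi

        print("OR = %.2f (95%% CI %.2f-%.2f)" % odds_ratio_ci(40, 60, 25, 75))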

  19. Computer use and addiction in Romanian children and teenagers--an observational study.

    Science.gov (United States)

    Chiriţă, V; Chiriţă, Roxana; Stefănescu, C; Chele, Gabriela; Ilinca, M

    2006-01-01

    The computer has provided some wonderful opportunities for our children. Although research on the effects of children's computer use is still ambiguous, some initial indications of positive and negative effects are beginning to emerge. Children commonly use computers for playing games, completing school assignments, email, and connecting to the Internet. This may sometimes come at the expense of other activities such as homework or normal social interchange. Although most children seem to correct the problem naturally, parents and educators must monitor for signs of misuse. Studies of general computer users suggest that some children may experience psychological problems such as social isolation, depression, loneliness, and time mismanagement related to their computer use and failure at school. The purpose of this study was to investigate issues related to computer use by school students from 11 to 18 years old. The survey included a representative sample of 439 school students aged 11 to 18. All of the students came from 3 gymnasium schools and 5 high schools in Iaşi, Romania. The students answered a questionnaire comprising 34 questions related to computer activities; the children's parents answered a second questionnaire on the same subject. Most questions asked respondents to rate on a scale the frequency of occurrence of a certain event or issue; some questions solicited an open answer or a choice from a list. These were aimed at highlighting: (1) the frequency of computer use by the students; (2) the interference of excessive use with school performance and social life; (3) the identification of possible computer addiction. The data were processed using the SPSS statistics software, version 11.0. Results show that the school students prefer to spend a considerable amount of time with their computers, over 3 hours/day. More than 65.7% of the students have a computer at home. More than 70% of the parents admit they do not or only occasionally

  20. A blinded prospective study comparing four current noninvasive approaches in the differential diagnosis of medical versus surgical jaundice

    International Nuclear Information System (INIS)

    O'Connor, K.W.; Snodgrass, P.J.; Swonder, J.E.; Mahoney, S.; Burt, R.; Cockerill, E.M.; Lumeng, L.

    1983-01-01

    A prospective study was undertaken to compare the diagnostic accuracy of clinical evaluation, ultrasound, computed tomography, and technetium 99m-HIDA or -PIPIDA biliary scans in distinguishing between intrahepatic and extrahepatic jaundice. A final diagnosis was established in each of the 50 patients who completed the study, among whom 29 had intrahepatic cholestasis and 21 had extrahepatic obstruction. In the diagnosis of extrahepatic obstruction, the sensitivities of clinical evaluation, ultrasound, computed tomography, and nuclear medicine biliary scan were 95%, 55%, 63%, and 41%, respectively; the specificities were 76%, 93%, 93%, and 88%; and the overall accuracies were 84%, 78%, 81%, and 68%. These data support the conclusion that when the clinical evaluation is carefully performed, it is the single most effective noninvasive means of detecting extrahepatic biliary obstruction in a jaundiced patient. Although ultrasound, computed tomography, and radionuclide biliary scan are less sensitive, they are highly reliable if they indicate that extrahepatic obstruction is present. A flow chart of invasive and noninvasive approaches for evaluation of the jaundiced patient is presented
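
    The sensitivity, specificity and overall accuracy figures compared above all derive from simple counts of true and false positives and negatives; a minimal sketch of that arithmetic (with illustrative counts, not the study's data) follows:

        # Minimal sketch of the diagnostic indices compared above (sensitivity,
        # specificity, overall accuracy) computed from true/false positive and
        # negative counts; the numbers are illustrative only.
        def diagnostic_indices(tp, fn, tn, fp):
            sensitivity = tp / (tp + fn)            # detected extrahepatic obstruction
            specificity = tn / (tn + fp)            # correctly excluded obstruction
            accuracy = (tp + tn) / (tp + fn + tn + fp)
            return sensitivity, specificity, accuracy

        # e.g. a modality that finds 20 of 21 obstructed and clears 22 of 29 non-obstructed
        sens, spec, acc = diagnostic_indices(tp=20, fn=1, tn=22, fp=7)
        print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, accuracy {acc:.0%}")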

  1. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the design principles of a computer-based psychophysiological system for general use. They show the effectiveness of combining the universal computation and control capabilities of a personal computer with problem-oriented, specialized facilities for stimulus presentation and detection of the test subject's reactions. They define the hardware and software configuration of the microcomputer psychophysiological system "Psychomat", describe its functional capabilities and basic medico-technical characteristics, and review organizational issues in supporting its full-scale production.

  2. Studies in Mathematics, Volume 22. Studies in Computer Science.

    Science.gov (United States)

    Pollack, Seymour V., Ed.

    The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…

  3. Metrics for comparing dynamic earthquake rupture simulations

    Science.gov (United States)

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near-fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes' results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code's results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
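
    As a simple illustration of one way to compare two codes quantitatively, the sketch below computes a normalized RMS misfit between two time series after interpolation onto a common time grid; this is an assumed example metric, not the metric set proposed in the article:

        # Minimal sketch (not the paper's metrics): normalized RMS misfit between two
        # codes' slip-rate histories at the same fault location, interpolated onto a
        # common time grid before comparison.
        import numpy as np

        def normalized_rms_misfit(t1, y1, t2, y2, n=1000):
            t = np.linspace(max(t1[0], t2[0]), min(t1[-1], t2[-1]), n)  # common grid
            a, b = np.interp(t, t1, y1), np.interp(t, t2, y2)
            return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2))

        # Two synthetic "code outputs" that differ slightly in sampling and phase.
        t1 = np.linspace(0, 10, 500)
        y1 = np.exp(-t1) * np.sin(4 * t1)
        t2 = np.linspace(0, 10, 650)
        y2 = np.exp(-t2) * np.sin(4 * t2 + 0.05)
        print(f"normalized RMS misfit: {normalized_rms_misfit(t1, y1, t2, y2):.3f}")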

  4. A novel computer algorithm for modeling and treating mandibular fractures: A pilot study.

    Science.gov (United States)

    Rizzi, Christopher J; Ortlip, Timothy; Greywoode, Jewel D; Vakharia, Kavita T; Vakharia, Kalpesh T

    2017-02-01

    To describe a novel computer algorithm that can model mandibular fracture repair. To evaluate the algorithm as a tool to model mandibular fracture reduction and hardware selection. Retrospective pilot study combined with cross-sectional survey. A computer algorithm utilizing Aquarius Net (TeraRecon, Inc, Foster City, CA) and Adobe Photoshop CS6 (Adobe Systems, Inc, San Jose, CA) was developed to model mandibular fracture repair. Ten different fracture patterns were selected from nine patients who had already undergone mandibular fracture repair. The preoperative computed tomography (CT) images were processed with the computer algorithm to create virtual images that matched the actual postoperative three-dimensional CT images. A survey comparing the true postoperative image with the virtual postoperative images was created and administered to otolaryngology resident and attending physicians. They were asked to rate on a scale from 0 to 10 (0 = completely different; 10 = identical) the similarity between the two images in terms of the fracture reduction and fixation hardware. Ten mandible fracture cases were analyzed and processed. There were 15 survey respondents. The mean score for overall similarity between the images was 8.41 ± 0.91; the mean score for similarity of fracture reduction was 8.61 ± 0.98; and the mean score for hardware appearance was 8.27 ± 0.97. There were no significant differences between attending and resident responses. There were no significant differences based on fracture location. This computer algorithm can accurately model mandibular fracture repair. Images created by the algorithm are highly similar to true postoperative images. The algorithm can potentially assist a surgeon planning mandibular fracture repair. Level of Evidence: 4. Laryngoscope, 127:331-336, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  5. Thermodynamic properties of 2,7-di-tert-butylfluorene – An experimental and computational study

    International Nuclear Information System (INIS)

    Oliveira, Juliana A.S.A.; Freitas, Vera L.S.; Notario, Rafael; Ribeiro da Silva, Maria D.M.C.; Monte, Manuel J.S.

    2016-01-01

    Highlights: • Enthalpies and Gibbs energies of formation of 2,7-di-tert-butylfluorene were determined. • Vapour pressures were measured at different temperatures. • Phase transition thermodynamic properties were determined. - Abstract: This work presents a comprehensive experimental and computational study of the thermodynamic properties of 2,7-di-tert-butylfluorene. The standard (p° = 0.1 MPa) molar enthalpy of formation in the crystalline phase was derived from the standard molar energy of combustion, measured by static bomb combustion calorimetry. The enthalpies and temperatures of transition between condensed phases were determined from DSC experiments. The vapour pressures of the crystalline and liquid phases were measured between (349.14 and 404.04) K, using two different experimental methods. From these results the standard molar enthalpies, entropies and Gibbs energies of sublimation and of vaporization were derived. The enthalpy of sublimation was also determined using Calvet microcalorimetry. The thermodynamic stability of 2,7-di-tert-butylfluorene in the crystalline and gaseous phases was evaluated by the determination of the standard Gibbs energies of formation, at the temperature 298.15 K, and compared with the ones reported in the literature for fluorene. A computational study at the G3(MP2)//B3LYP and G3 levels has been carried out. A conformational analysis has been performed and the enthalpy of formation of 2,7-di-tert-butylfluorene has been calculated, using atomization and isodesmic reactions. The calculated enthalpies of formation have been compared to the experimental values.

  6. Annotated Computer Output for Illustrative Examples of Clustering Using the Mixture Method and Two Comparable Methods from SAS.

    Science.gov (United States)

    1987-06-26

    Annotated computer output for illustrative examples of clustering using the mixture method and two comparable methods from SAS (DTIC accession number AD-A184 687). Related reports: Cornell University Biometrics Unit Technical Reports BU-920-M (an introduction to the use of mixture models in clustering) and BU-921-M (annotated computer output for the mixture method and two comparable methods from SAS), both issued with the Mathematical Sciences Institute.

  7. Benchmark study of some thermal and structural computer codes for nuclear shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Kanae, Yoshioki; Shimada, Hirohisa; Shimoda, Atsumu; Halliquist, J.O.

    1984-01-01

    There are many computer codes that could be applied to the design and analysis of nuclear material shipping casks. One of the problems the designer of a shipping cask faces is deciding which computer codes to use. To address this, thermal and structural benchmark tests for nuclear shipping casks were carried out to clarify the adequacy of the calculation results. The calculation results are compared with experimental ones. This report describes the results and discussion of the benchmark test. (author)

  8. A comparative study of image low level feature extraction algorithms

    Directory of Open Access Journals (Sweden)

    M.M. El-gayar

    2013-07-01

    Full Text Available Feature extraction and matching is at the base of many computer vision problems, such as object recognition or structure from motion. Current methods for assessing the performance of popular image matching algorithms are presented; they rely on costly descriptors for detection and matching. Specifically, the method assesses the type of images under which each of the algorithms reviewed performs at its maximum or highest efficiency. Efficiency is measured in terms of the number of matches found by the algorithm and the number of type I and type II errors encountered when the algorithm is tested against a specific pair of images. Existing comparative studies assess the performance of the algorithms based on results obtained under different criteria such as speed, sensitivity, occlusion, and others. This study addresses the limitations of the existing comparative tools and delivers a generalized criterion to determine beforehand the level of efficiency expected from a matching algorithm given the type of images evaluated. The algorithms and the respective images used within this work are divided into two groups, feature-based and texture-based, and from this broad classification a set of the most widely used algorithms is assessed: color histogram, FAST (Features from Accelerated Segment Test), SIFT (Scale Invariant Feature Transform), PCA-SIFT (Principal Component Analysis-SIFT), F-SIFT (fast-SIFT) and SURF (Speeded Up Robust Features). The performance of the fast-SIFT (F-SIFT) feature detection method is compared for scale changes, rotation, blur, illumination changes and affine transformations. All the experiments use repeatability measurement and the number of correct matches as the evaluation measures. SIFT shows its stability in most situations, although it is slow. F-SIFT is the fastest, with performance as good as SURF, while SIFT and PCA-SIFT show advantages under rotation and illumination changes.
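
    As a hedged illustration of the kind of repeatability test described above, the sketch below detects and matches features before and after a known rotation using OpenCV; it assumes opencv-python 4.4 or later (which exposes SIFT_create and ORB_create) and uses a synthetic image rather than the paper's test sets:

        # Minimal sketch (not the paper's test harness): count cross-checked matches
        # that survive a known 30-degree rotation, a crude proxy for repeatability.
        import cv2
        import numpy as np

        img = (np.random.rand(256, 256) * 255).astype(np.uint8)   # stand-in image
        h, w = img.shape
        rot = cv2.warpAffine(img, cv2.getRotationMatrix2D((w / 2, h / 2), 30, 1.0), (w, h))

        for name, detector, norm in [("SIFT", cv2.SIFT_create(), cv2.NORM_L2),
                                     ("ORB",  cv2.ORB_create(),  cv2.NORM_HAMMING)]:
            k1, d1 = detector.detectAndCompute(img, None)
            k2, d2 = detector.detectAndCompute(rot, None)
            if d1 is None or d2 is None:
                print(name, "found no features on this image")
                continue
            matches = cv2.BFMatcher(norm, crossCheck=True).match(d1, d2)
            print(f"{name}: {len(k1)} keypoints, {len(matches)} matches after rotation")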

  9. A Comparative Study of Simulated and Measured Gear-Flap Flow Interaction

    Science.gov (United States)

    Khorrami, Mehdi R.; Mineck, Raymond E.; Yao, Chungsheng; Jenkins, Luther N.; Fares, Ehab

    2015-01-01

    The ability of two CFD solvers to accurately characterize the transient, complex, interacting flowfield associated with a realistic gear-flap configuration is assessed via comparison of simulated flow with experimental measurements. The simulated results, obtained with NASA's FUN3D and Exa's PowerFLOW® for a high-fidelity, 18% scale semi-span model of a Gulfstream aircraft in landing configuration (39 deg flap deflection, main landing gear on and off) are compared to two-dimensional and stereo particle image velocimetry measurements taken within the gear-flap flow interaction region during wind tunnel tests of the model. As part of the benchmarking process, direct comparisons of the mean and fluctuating velocity fields are presented in the form of planar contour plots and extracted line profiles at measurement planes in various orientations stationed in the main gear wake. The measurement planes in the vicinity of the flap side edge and downstream of the flap trailing edge are used to highlight the effects of gear presence on tip vortex development and the ability of the computational tools to accurately capture such effects. The present study indicates that both computed datasets contain enough detail to construct a relatively accurate depiction of gear-flap flow interaction. Such a finding increases confidence in using the simulated volumetric flow solutions to examine the behavior of pertinent aerodynamic mechanisms within the gear-flap interaction zone.

  10. Improved computer-assisted nuclear imaging in renovascular hypertension

    International Nuclear Information System (INIS)

    Gross, M.L.; Nally, J.V.; Potvini, W.J.; Clarke, H.S. Jr.; Higgins, J.T.; Windham, J.P.

    1985-01-01

    A computer-assisted program with digital background subtraction has been developed to analyze the initial 90 second Tc-99m DTPA renal flow scans in an attempt to quantitate the early isotope delivery to and uptake by the kidney. This study was designed to compare the computer-assisted 90 second DTPA scan with the conventional 30 minute I-131 Hippuran scan. Six patients with angiographically-proven unilateral renal artery stenosis were studied. The time activity curves for both studies were derived from regions of interest selected from the computer acquired dynamic images. The following parameters were used to assess renal blood flow: differential maximum activity, minimum/maximum activity ratio, and peak width. The computer-assisted DTPA study accurately predicted (6/6) the stenosed side documented angiographically, whereas the conventional Hippuran scan was clearly predictive in only 2/6. In selected cases successfully corrected surgically, the DTPA study proved superior in assessing the degree of patency of the graft. The best discriminatory factors when compared to a template synthesized from curves obtained from normal subjects were differential maximum activity and peak width. The authors conclude that: 1) the computer-assisted 90 second DTPA renal blood flow scan was superior to the conventional I-131 Hippuran scan in demonstrating unilateral reno-vascular disease; 2) the DTPA study was highly predictive of the angiographic findings; and 3) this non-invasive study should prove useful in the diagnosis and serial evaluation following surgery and/or angioplasty for renal artery stenosis

  11. Does It Matter if I Take My Mathematics Test on Computer? A Second Empirical Study of Mode Effects in NAEP

    Science.gov (United States)

    Bennett, Randy Elliot; Braswell, James; Oranje, Andreas; Sandene, Brent; Kaplan, Bruce; Yan, Fred

    2008-01-01

    This article describes selected results from the Math Online (MOL) study, one of three field investigations sponsored by the National Center for Education Statistics (NCES) to explore the use of new technology in NAEP. Of particular interest in the MOL study was the comparability of scores from paper- and computer-based tests. A nationally…

  12. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
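
    The core sliding-window idea can be sketched in a few lines: average the log2 ratios of neighbouring probes in sequentially displaced windows and flag the most extreme 10%. The code below is an illustrative simplification with simulated data, not swatCGH itself (which adds adaptive thresholds, multiple segmentation inputs and grid execution):

        # Minimal sketch of the sliding-window idea (not swatCGH): smooth probe
        # log2 ratios with a moving window, then flag the top 10% most extreme
        # windows on a simulated chromosome.
        import numpy as np

        def flag_top_windows(log2_ratios, window=11, top_fraction=0.10):
            kernel = np.ones(window) / window
            window_means = np.convolve(log2_ratios, kernel, mode="valid")
            threshold = np.quantile(np.abs(window_means), 1.0 - top_fraction)
            return np.where(np.abs(window_means) >= threshold)[0]   # window start indices

        rng = np.random.default_rng(2)
        ratios = rng.normal(0, 0.2, 1000)
        ratios[400:450] += 0.8                                       # a simulated gain
        print("flagged window starts:", flag_top_windows(ratios)[:10], "...")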

  13. Computational study of nonlinear plasma waves. I. Simulation model and monochromatic wave propagation

    International Nuclear Information System (INIS)

    Matsuda, Y.; Crawford, F.W.

    1975-01-01

    An economical low-noise plasma simulation model originated by Denavit is applied to a series of problems associated with electrostatic wave propagation in a one-dimensional, collisionless, Maxwellian plasma, in the absence of a magnetic field. The model is described and tested, first in the absence of an applied signal, and then with a small-amplitude perturbation. These tests serve to establish the low-noise features of the model, and to verify the theoretical linear dispersion relation at wave energy levels as low as 10⁻⁶ of the plasma thermal energy. Better quantitative results are obtained, for comparable computing time, than can be obtained by conventional particle simulation models or by direct solution of the Vlasov equation. The method is then used to study propagation of an essentially monochromatic plane wave. Results on amplitude oscillation and nonlinear frequency shift are compared with available theories.

  14. On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things

    Science.gov (United States)

    Huang, Chao

    2017-12-01

    Two new concepts of the information era are cloud computing and the internet of things; although they are defined differently, they are closely related. Building a smart campus by means of cloud computing, the internet of things and other internet technologies is a new way to realize leap-forward development of a campus. This paper, centering on the construction of the smart campus, analyzes and compares the differences between the network in a traditional campus and that in a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the internet of things.

  15. Heuristic Synthesis of Reversible Logic – A Comparative Study

    Directory of Open Access Journals (Sweden)

    Chua Shin Cheng

    2014-01-01

    Full Text Available Reversible logic circuits have historically been motivated by theoretical research in low-power computing, and have recently attracted interest as components of quantum algorithms, optical computing and nanotechnology. However, due to the intrinsic properties of reversible logic, traditional irreversible logic design and synthesis methods cannot be applied. Thus, a new set of algorithms has been developed to correctly synthesize reversible logic circuits. This paper presents a comprehensive literature review and comparative study of heuristic-based reversible logic synthesis. It reviews a range of heuristic-based reversible logic synthesis techniques reported by researchers (BDD-based, cycle-based, search-based, non-search-based, rule-based, transformation-based, and ESOP-based). All techniques are described in detail and summarized in a table based on their features, limitations, the libraries used and the metrics they consider. Benchmark comparisons of gate count and quantum cost are analysed for each synthesis technique. Comparing the outputs of the synthesis algorithms over the years, it can be observed that different approaches have been used for the synthesis of reversible circuits; however, the improvements are not significant. Quantum cost and gate count have improved over the years, but debate continues on certain issues, such as garbage outputs, that remain unresolved. This paper provides information on all heuristic-based reversible logic synthesis methods proposed over the years. All techniques are explained in detail and are thus informative for new reversible logic researchers, bridging the knowledge gap in this area.

  16. A Comparative Study of Wireless Sensor Networks and Their Routing Protocols

    Directory of Open Access Journals (Sweden)

    Subhajit Pal

    2010-11-01

    Full Text Available Recent developments in the area of micro-sensor devices have accelerated advances in the sensor networks field, leading to many new protocols specifically designed for wireless sensor networks (WSNs). Wireless sensor networks with hundreds to thousands of sensor nodes can gather information from an unattended location and transmit the gathered data to a particular user, depending on the application. These sensor nodes have some constraints due to their limited energy, storage capacity and computing power. Data are routed from one node to another using different routing protocols. There are a number of routing protocols for wireless sensor networks. In this review article, we discuss the architecture of wireless sensor networks. Further, we categorize the routing protocols according to some key factors and summarize their modes of operation. Finally, we provide a comparative study of these various protocols.

  17. Off-Line and Dynamic Production Scheduling – A Comparative Case Study

    Directory of Open Access Journals (Sweden)

    Bożek Andrzej

    2016-03-01

    Full Text Available A comprehensive case study of the development of manufacturing scheduling solutions is given. It includes a highly generalized scheduling problem as well as several scheduling modes, methods and problem models. The considered problem combines a flexible job shop structure, lot streaming with variable sublots, transport times, setup times, and machine calendars. The tabu search metaheuristic and constraint programming methods have been used for off-line scheduling. Two dynamic scheduling methods have also been implemented, i.e., dispatching rules for completely reactive scheduling and a multi-agent system for predictive-reactive scheduling. In these implementations three distinct models of the problem have been used, based on graph representation, optimal constraint satisfaction, and the Petri net formalism. Each of these solutions has been verified in computational experiments. The results are compared and some findings about advantages, disadvantages, and suggestions on using the solutions are formulated.

  18. Comparative performance of the conjugate gradient and SOR [Successive Over Relaxation] methods for computational thermal hydraulics

    International Nuclear Information System (INIS)

    King, J.B.; Anghaie, S.; Domanus, H.M.

    1987-01-01

    Finite difference approximations to the continuity, momentum, and energy equations in thermal hydraulics codes result in a system of N equations in N unknowns for a problem having N field points. In a three-dimensional problem, N increases as the problem becomes larger or more complex, and more rapidly as the computational mesh size is reduced. As a consequence, the execution time required to solve the problem increases, which may lead to placing limits on the problem resolution or accuracy. A conventional method of solution for these systems of equations is the Successive Over Relaxation (SOR) technique. However, for a wide range of problems the execution time may be reduced by using a more efficient linear equation solver. One such method is the conjugate gradient method, which was implemented in the COMMIX-1B thermal hydraulics code. It was found that the execution time required to solve the resulting system of equations was reduced by a factor of about 2 for some problems. This paper summarizes the characteristics of these iterative solution procedures and compares their performance in modeling a variety of reactor thermal hydraulic problems, using the COMMIX-1B computer code.
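
    To make the comparison concrete, the sketch below solves a small symmetric positive-definite system (a 1-D Poisson matrix) with both a textbook conjugate gradient routine and SOR and reports the iteration counts; it is illustrative only and is not the COMMIX-1B implementation:

        # Minimal sketch comparing iteration counts of conjugate gradient and SOR on a
        # small SPD system; illustrative only, not the COMMIX-1B solver.
        import numpy as np

        def cg(A, b, tol=1e-8, max_iter=10000):
            x = np.zeros_like(b)
            r = b - A @ x
            p = r.copy()
            for k in range(1, max_iter + 1):
                Ap = A @ p
                alpha = (r @ r) / (p @ Ap)
                x += alpha * p
                r_new = r - alpha * Ap
                if np.linalg.norm(r_new) < tol:
                    return x, k
                p = r_new + (r_new @ r_new) / (r @ r) * p
                r = r_new
            return x, max_iter

        def sor(A, b, omega=1.8, tol=1e-8, max_iter=100000):
            x = np.zeros_like(b)
            for k in range(1, max_iter + 1):
                for i in range(len(b)):
                    sigma = A[i] @ x - A[i, i] * x[i]          # off-diagonal contribution
                    x[i] += omega * ((b[i] - sigma) / A[i, i] - x[i])
                if np.linalg.norm(b - A @ x) < tol:
                    return x, k
            return x, max_iter

        n = 100
        A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1-D Poisson matrix (SPD)
        b = np.ones(n)
        print("CG iterations: ", cg(A, b)[1])
        print("SOR iterations:", sor(A, b)[1])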

  19. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: A methodological and comparative survey

    Science.gov (United States)

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-01-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. J. Comp. Neurol. 523:1281–1295, 2015. © 2015 Wiley Periodicals, Inc. PMID:25728683

  20. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This was a longitudinal, prospective study of patients with a diagnosis of acute abdomen. A total of 105 cases of acute abdomen were identified, and after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the physicians' diagnostic hypothesis in 50% of the cases (p 0.05); 78.57% of the patients had a surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared with the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and non-surgical groups, an accuracy of 89.28% was obtained (p < 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathology and has great accuracy for the surgical indication; it increases physicians' diagnostic confidence, reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  1. A comparative study of cone-beam computed tomography and digital periapical radiography in detecting mandibular molars root perforations

    Energy Technology Data Exchange (ETDEWEB)

    Haghanifar, Sina; Moudi, Ehsan; Mesgarani, Abbas; Abbaszadeh, Naghi [Dental Material Research Center, Dental Faculty, Babol University of Medical Sciences, Babol (Iran, Islamic Republic of); Bijani, Ali [Non-Communicable Pediatric Diseases Research Center, Babol University of Medical Sciences, Babol (Iran, Islamic Republic of)

    2014-06-15

    The aim of this in vitro study was to determine the sensitivity and specificity of cone-beam computed tomography (CBCT) and digital periapical radiography in the detection of mesial root perforations of mandibular molars. In this in vitro study, 48 mandibular molars were divided into 4 groups. First, the mesial canals of all the 48 teeth were endodontically prepared. In 2 groups (24 teeth each), the roots were axially perforated in the mesiolingual canal 1-3 mm below the furcation region, penetrating the root surface ("root perforation"). Then, in one of these 2 groups, the mesial canals were filled with gutta-percha and AH26 sealer. Mesial canals in one of the other 2 groups without perforation (control groups) were filled with the same materials. The CBCT and periapical radiographs with 3 different angulations were evaluated by 2 oral and maxillofacial radiologists. The specificity and sensitivity of the two methods were calculated, and P<0.05 was considered significant. The sensitivity and specificity of CBCT scans in the detection of obturated root canal perforations were 79% and 96%, respectively, and in the case of three-angled periapical radiographs, they were 92% and 100%, respectively. In non-obturated root canals, the sensitivity and specificity of CBCT scans in perforation detection were 92% and 100%, respectively, and for three-angled periapical radiographs, they were 50% and 96%, respectively. For perforation detection in filled-root canals, periapical radiography with three different horizontal angulations would be trustworthy, but it is recommended that CBCT be used for perforation detection before obturating root canals.

  2. Quality control and radioprotection in dental cone beam computed tomography - case study

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Ligiane C.N.; Ferreira, Nadya M.P.D., E-mail: lnadya@ime.eb.br [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    The radiological protection in medical and odontologic radiology follows Order (Portaria) 453/98 of the Ministry of Health, which presents the minimum set of constancy tests for X-ray equipment. These tests follow the procedures set forth in Resolution no. 64 of the National Agency for Sanitary Vigilance. This work presents a study on dental cone beam computed tomography (CBCT), evaluating the physical parameters that influence performance and image quality and presenting the tests appropriate to this new system. The authors analyzed the CT-specific tests of Resolution no. 64, assessed their feasibility, and checked whether their interpretations are compatible with CBCT. Once a test was determined to be feasible, it was compared with those presented in the manual provided by the equipment manufacturer. The CT scanner used was the Mini-Cat Tomography Scanner from Xoran Technologies, distributed by KAVO. The study verified that four tests could be reproduced on CBCT: noise, accuracy and uniformity of the CT number of water, and spatial resolution. Considering the experimental data, the manufacturer's methodology and tolerances were more appropriate for the first two tests. For the uniformity test of the CT number, we recommend using the quality control phantom. Three new tests were suggested for the quality control of cone beam systems: linearity, artifacts and beam alignment. (author)

  4. The effect of Vaccinium uliginosum extract on tablet computer-induced asthenopia: randomized placebo-controlled study.

    Science.gov (United States)

    Park, Choul Yong; Gu, Namyi; Lim, Chi-Yeon; Oh, Jong-Hyun; Chang, Minwook; Kim, Martha; Rhee, Moo-Yong

    2016-08-18

    To investigate the alleviating effect of Vaccinium uliginosum extract (DA9301) on tablet computer-induced asthenopia. This was a randomized, placebo-controlled, double-blind, parallel study (trial registration number: 2013-95). A total of 60 volunteers were randomized into DA9301 (n = 30) and control (n = 30) groups. The DA9301 group received DA9301 oral pills (1000 mg/day) for 4 weeks and the control group received placebo. Asthenopia was evaluated by administering a questionnaire containing 10 questions regarding ocular symptoms (responses scored on a scale of 0-6; maximum total score: 60) before (baseline) and 4 weeks after receiving the pills (DA9301 or placebo). The participants completed the questionnaire before and after tablet computer (iPad Air, Apple Inc.) watching at each visit. The change in total asthenopia score (TAS) was calculated and compared between the groups. TAS increased significantly after tablet computer watching at baseline in the DA9301 group (from 20.35 to 23.88; p = 0.031). However, after receiving DA9301 for 4 weeks, TAS remained stable after tablet computer watching. In the control group, TAS changes induced by tablet computer watching were not significant either at baseline or at 4 weeks after receiving placebo. Further analysis revealed that the scores for "tired eyes" (p = 0.001), "sore/aching eyes" (p = 0.038), "irritated eyes" (p = 0.010), "watery eyes" (p = 0.005), "dry eyes" (p = 0.003), "eye strain" (p = 0.006), "blurred vision" (p = 0.034), and "visual discomfort" (p = 0.018) significantly improved in the DA9301 group. We found that oral intake of DA9301 (1000 mg/day for 4 weeks) was effective in alleviating asthenopia symptoms induced by tablet computer watching. The study is registered at www.clinicaltrials.gov (registration number: NCT02641470, date of registration December 30, 2015).

  5. Comparative Study of Aerodynamic Interference During AFT Dispense of Munitions

    National Research Council Canada - National Science Library

    Burkinshaw, Matthew G

    2007-01-01

    .... A computational fluid dynamics (CFD) study was performed followed by a wind tunnel experiment. The study consisted of a strut-mounted cone simulating a parent vehicle and a sting mounted cone-cylinder store situated directly behind the cone...

  6. Layering of stomach contents in drowning cases in post-mortem computed tomography compared to forensic autopsy.

    Science.gov (United States)

    Gotsmy, Walther; Lombardo, Paolo; Jackowski, Christian; Brencicova, Eva; Zech, Wolf-Dieter

    2018-04-24

    In forensic autopsy, the analysis of stomach contents is important when investigating drowning cases. Three-layering of stomach contents may be interpreted as a diagnostic hint of drowning due to the swallowing of larger amounts of water or other drowning media. The authors have observed frequent discrepancies in the number of stomach content layers in drowning cases between post-mortem computed tomography (PMCT) and autopsy in forensic casework. Therefore, the goal of this study was to compare the layering of stomach contents in drowning cases between PMCT and forensic autopsy. Drowning cases (n = 55; 40 male, 15 female, mean age 45.3 years; mean amount of stomach content 223 ml) that received PMCT prior to forensic autopsy were retrospectively analyzed by a forensic pathologist and a radiologist. The number of layers of stomach content on PMCT was compared with the number of layers at forensic autopsy. In 28 of the 55 evaluated drowning cases, a discrepancy between the layering of stomach contents at autopsy and on PMCT was observed: 1 layer at autopsy (n = 28): 50% discrepancy with PMCT; 2 layers (n = 20): 45% discrepancy; and 3 layers (n = 7): 71.4% discrepancy. The sensitivity of correctly determining layering (as observed at forensic autopsy) on PMCT was 52% (positive predictive value 44.8%). Specificity was 46.6% (negative predictive value 53.8%). In a control group (n = 35) of non-drowning cases, three-layering of stomach contents was not observed. Discrepancies in the number of stomach content layers observed between PMCT and forensic autopsy are a frequent finding, possibly due to the stomach content sampling technique at autopsy and movement of the corpse prior to PMCT and autopsy. Three-layering on PMCT, if indeed present, may be interpreted as a hint of drowning.

  7. (18)F-fluoride positron emission tomography/computed tomography and bone scintigraphy for diagnosis of bone metastases in newly diagnosed, high-risk prostate cancer patients: study protocol for a multicentre, diagnostic test accuracy study.

    Science.gov (United States)

    Fonager, Randi F; Zacho, Helle D; Langkilde, Niels C; Petersen, Lars J

    2016-01-11

    For decades, planar bone scintigraphy has been the standard practice for detection of bone metastases in prostate cancer and has been endorsed by recent oncology/urology guidelines. It is a sensitive method with modest specificity. (18)F-fluoride positron emission tomography/computed tomography has shown improved sensitivity and specificity over bone scintigraphy, but because of methodological issues such as retrospective design and verification bias, the existing level of evidence with (18)F-fluoride positron emission tomography/computed tomography is limited. The primary objective is to compare the diagnostic properties of (18)F-fluoride positron emission tomography/computed tomography versus bone scintigraphy on an individual patient basis. One hundred forty consecutive, high-risk prostate cancer patients will be recruited from several hospitals in Denmark. Sample size was calculated using Hayen's method for diagnostic comparative studies. This study will be conducted in accordance with recommendations of standards for reporting diagnostic accuracy studies. Eligibility criteria comprise the following: 1) biopsy-proven prostate cancer, 2) PSA ≥ 50 ng/ml (equals a prevalence of bone metastasis of ≈ 50% in the study population on bone scintigraphy), 3) patients must be eligible for androgen deprivation therapy, 4) no current or prior cancer (within the past 5 years), 5) ability to comply with imaging procedures, and 6) patients must not receive any investigational drugs. Planar bone scintigraphy and (18)F-fluoride positron emission tomography/computed tomography will be performed within a window of 14 days at baseline. All scans will be repeated after 26 weeks of androgen deprivation therapy, and response of individual lesions will be used for diagnostic classification of the lesions on baseline imaging among responding patients. A response is defined as PSA normalisation or ≥ 80% reduction compared with baseline levels, testosterone below castration levels

  8. 4 cases of 'ataxic hemiparesis'. A comparative study of computed tomography and electrophysiological findings

    Energy Technology Data Exchange (ETDEWEB)

    Eguchi, Kiyoshi; Kamei, Hidekazu; Kitamura, Eiko; Komatsuzaki, Satoshi; Yamane, Kiyomi; Takemiya, Toshiko; Kobayashi, Itsuro; Maruyama, Shoichi

    1984-10-01

    Ataxic hemiparesis is described as a syndrome in which pyramidal and cerebellar signs occur ipsilaterally. Fisher, who suggested the designation "ataxic hemiparesis" for this syndrome, confirmed by pathological study that the causative lesion was in the basis pontis, at the level of the junction of the upper one third and lower two thirds, on the side opposite the neurological deficit; he also reported that CT might fail to show the lesion. We observed 4 patients with ataxic hemiparesis and examined them with auditory brainstem response (ABR), somatosensory evoked potential (SEP), and blink reflex as electrophysiological studies. Their CT and electrophysiological findings were compared with each other to define the responsible lesion more clearly. Essentially, abnormal electrophysiological findings were recognized only in the case of pontine hemorrhage, and these findings recovered to normal as the clinical and CT findings improved. In the other cases, the electrophysiological findings were not prominent, and CT revealed lesions in the deep frontal region, the internal capsule and the cerebellar hemispheres, respectively. These results suggest that many cases with extra-pontine lesions can develop the syndrome of ataxic hemiparesis. However, the relation between the lesions responsible for ataxic hemiparesis and the electrophysiological findings is still uncertain. Further evidence, including clinicopathological studies, will be required to clarify this relation and to obtain a more accurate anatomical interpretation of ataxic hemiparesis arising from lesions outside the pontine region. (author).

  9. Comparing classifiers for pronunciation error detection

    NARCIS (Netherlands)

    Strik, H.; Truong, K.; Wet, F. de; Cucchiarini, C.

    2007-01-01

    Providing feedback on pronunciation errors in computer assisted language learning systems requires that pronunciation errors be detected automatically. In the present study we compare four types of classifiers that can be used for this purpose: two acoustic-phonetic classifiers (one of which employs

  10. Using Volunteer Computing to Study Some Features of Diagonal Latin Squares

    Science.gov (United States)

    Vatutin, Eduard; Zaikin, Oleg; Kochemazov, Stepan; Valyaev, Sergey

    2017-12-01

    This study concerns several features of diagonal Latin squares (DLSs) of small order. The authors propose an algorithm for computing the minimal and maximal numbers of transversals of DLSs. In this algorithm, all DLSs of a given order are generated, and for each square all of its transversals and diagonal transversals are constructed. The algorithm was implemented and applied to DLSs of order at most 7 on a personal computer; the experiment for order 8 was performed in the volunteer computing project Gerasim@home. In addition, the problem of finding pairs of orthogonal DLSs of order 10 was considered and reduced to the Boolean satisfiability problem. The resulting problem turned out to be very hard, so it was decomposed into a family of subproblems, which were solved in the volunteer computing project SAT@home. As a result, several dozen pairs of this kind were found.
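
    As a rough illustration of the per-square computation described above, the Python sketch below counts the transversals of a small Latin square by brute-force enumeration of column permutations. It is not the authors' optimised algorithm, and it handles plain transversals only (one cell per row and column, all symbols distinct); the order-4 example square is chosen here purely for illustration.

      from itertools import permutations

      def count_transversals(square):
          """Count transversals of a Latin square given as a list of rows.

          A transversal selects one cell from each row and each column such
          that the selected symbols are pairwise distinct.  Brute force over
          the n! column permutations, feasible only for small orders.
          """
          n = len(square)
          count = 0
          for cols in permutations(range(n)):
              symbols = {square[row][cols[row]] for row in range(n)}
              if len(symbols) == n:
                  count += 1
          return count

      # A diagonal Latin square of order 4 (both main diagonals hold 0..3).
      dls4 = [
          [0, 1, 2, 3],
          [2, 3, 0, 1],
          [3, 2, 1, 0],
          [1, 0, 3, 2],
      ]
      print(count_transversals(dls4))  # -> 8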

  11. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform to a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern; the middleware does the heavy lifting of adaptive application partitioning, resource monitoring, and computation offloading. These elastic mobile applications run as usual mobile applications but can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications involving costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared with local execution only.
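
    The energy trade-off reported in the evaluation (offloading pays off when the transfer cost is small relative to the saved computation) can be expressed with a simple cost model. The sketch below is a generic illustration, not part of the MACS middleware; all parameter names and the example numbers are assumptions.

      def should_offload(cycles, local_speed_hz, cloud_speed_hz,
                         data_bytes, bandwidth_bps,
                         p_compute_w, p_idle_w, p_transfer_w):
          """Energy-based offloading decision (generic cost model, not MACS).

          Local energy  = local execution time * active CPU power.
          Remote energy = transfer time * radio power + cloud time * idle power.
          Offload when the remote path costs the device less energy.
          """
          t_local = cycles / local_speed_hz
          t_cloud = cycles / cloud_speed_hz
          t_transfer = data_bytes * 8 / bandwidth_bps

          e_local = t_local * p_compute_w
          e_remote = t_transfer * p_transfer_w + t_cloud * p_idle_w
          return e_remote < e_local

      # Example: 5e9 CPU cycles, a 10x faster cloud, 2 MB of state over 5 Mbit/s.
      print(should_offload(cycles=5e9, local_speed_hz=1e9, cloud_speed_hz=1e10,
                           data_bytes=2e6, bandwidth_bps=5e6,
                           p_compute_w=0.9, p_idle_w=0.3, p_transfer_w=1.3))  # True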

  12. A comparative study of NiZn ferrites modified by the addition of cobalt

    Directory of Open Access Journals (Sweden)

    Pereira S.L.

    1999-01-01

    Off-stoichiometric NiZn ferrite was obtained by a hydrothermal process and compacted into toroidal form under different pressures. Two cobalt-doped (0.5%) samples, A1 and A2, were sintered at 1573 K in air for 3 h. The magnetic properties were studied by vibrating sample magnetometry, Mössbauer spectroscopy, and complex impedance spectroscopy. X-ray diffraction and Hg porosimetry were used to determine the average grain size and the type of packing in the samples. Both samples exhibited superparamagnetic behavior in the hysteresis loop. This effect does not agree with the Mössbauer results, which were fitted using Normos, a commercial computer program. All sample parameters were compared.

  13. Comparative study of incompressible and isothermal compressible flow solvers for cavitating flow dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sun Ho [Korea Maritime and Ocean University, Busan (Korea, Republic of); Rhee, Shin Hyung [Seoul National University, Seoul (Korea, Republic of)

    2015-08-15

    Incompressible flow solvers are generally used for numerical analysis of cavitating flows, but they are limited in handling compressibility effects on the vapor phase. To study compressibility effects on the vapor phase and the cavity interface, pressure-based incompressible and isothermal compressible flow solvers based on a cell-centered finite volume method were developed using the OpenFOAM libraries. To validate the solvers, cavitating flow around a hemispherical head-form body was simulated and the results were compared against experimental data. The cavity shedding behavior, the length of the re-entrant jet, the drag history, and the Strouhal number were compared between the two solvers. The results confirmed that including compressibility effects in the cavitating flow computations improved the reproduction of the cavitation dynamics.
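
    The Strouhal number comparison mentioned above is typically derived from the dominant shedding frequency of a force signal. The NumPy sketch below is illustrative only (the signal, body length, and inflow velocity are invented); it estimates St = f L / U from a sampled drag history via an FFT peak.

      import numpy as np

      def strouhal_from_drag(drag, dt, length, velocity):
          """Estimate St = f * L / U from a drag time history.

          The shedding frequency f is taken as the FFT peak of the
          mean-removed signal, excluding the DC bin.
          """
          drag = np.asarray(drag, dtype=float)
          spectrum = np.abs(np.fft.rfft(drag - drag.mean()))
          freqs = np.fft.rfftfreq(drag.size, d=dt)
          f_shed = freqs[1:][np.argmax(spectrum[1:])]  # skip the DC component
          return f_shed * length / velocity

      # Synthetic example: 25 Hz shedding, body length 0.02 m, inflow 8 m/s.
      t = np.arange(0.0, 2.0, 1e-3)
      drag = 5.0 + 0.4 * np.sin(2 * np.pi * 25.0 * t) + 0.05 * np.random.randn(t.size)
      print(strouhal_from_drag(drag, dt=1e-3, length=0.02, velocity=8.0))  # about 0.0625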

  14. Comparative diagnostic performance of multidetector computed tomography and MRI for characterization of pancreatic cystic lesions

    International Nuclear Information System (INIS)

    Moon, Sung Min; Shin, Sang Soo; Park, Jin Gyoon; Jeong, Yong Yeon

    2015-01-01

    To compare the diagnostic performance of multidetector computed tomography (MDCT) and magnetic resonance imaging (MRI) in the characterization of pancreatic cystic lesions. We conducted a retrospective study of 34 patients with histopathologically proven cystic pancreatic lesions who underwent both preoperative MDCT and MRI. CT and MRI were evaluated independently for differentiating mucinous vs. non-mucinous lesions, differentiating aggressive vs. non-aggressive lesions, analyzing morphological features, and evaluating specific leading diagnoses. Sensitivity, specificity, and accuracy were determined, and agreement between the two modalities in the analysis of lesion morphology was assessed using kappa values. The sensitivity, specificity, and accuracy of MRI for differentiating mucinous vs. non-mucinous lesions were higher than those of CT (p = 0.03). For differentiating aggressiveness, the sensitivity of MRI was better than that of CT, but the specificity of CT was better than that of MRI. In the evaluation of morphological features, MRI showed better performance in the characterization of septa and wall; otherwise, the two modalities showed similarly good performance. MRI was better than CT in determining a specific diagnosis (58.8% vs. 47.2%, respectively). CT and MRI are both reasonable diagnostic methods for characterization of pancreatic cystic lesions; however, MRI enables more confident assessment than CT in differentiating mucinous vs. non-mucinous lesions and in characterizing the septa and wall.
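
    The performance figures and kappa statistics quoted here follow the standard definitions; as a reminder, the sketch below computes sensitivity, specificity, accuracy, and Cohen's kappa from 2x2 counts. The example numbers are invented and are not taken from this study.

      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity and accuracy from a 2x2 confusion table."""
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "accuracy": (tp + tn) / (tp + fp + fn + tn),
          }

      def cohens_kappa(a, b, c, d):
          """Cohen's kappa for agreement between two readers or modalities.

          a = both positive, b = first only, c = second only, d = both negative.
          """
          n = a + b + c + d
          p_observed = (a + d) / n
          p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
          return (p_observed - p_expected) / (1 - p_expected)

      # Invented counts, for illustration only.
      print(diagnostic_metrics(tp=14, fp=3, fn=2, tn=15))
      print(cohens_kappa(a=20, b=4, c=3, d=7))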

  15. Computed tomography study of otitis media; A tomografia computadorizada no estudo das otites medias

    Energy Technology Data Exchange (ETDEWEB)

    Bahia, Paulo Roberto Valle; Marchiori, Edson [Universidade Federal, Rio de Janeiro, RJ (Brazil). Dept. de Radiologia

    1997-03-01

    In this work we studied the computed tomography (CT) findings of 89 patients clinically suspected of having otitis media. The results were compared with the clinical diagnosis, otoscopy, surgical findings, and previous data. Our analysis included seven patients with acute otitis media and 83 patients with chronic otitis media. The patients with acute otitis media underwent CT to evaluate possible spread to the central nervous system. The diagnosis of cholesteatoma, its extent, and its complications were the main indications for CT in chronic otitis media. The main findings in cholesteatomatous otitis were occupation of the epitympanum, destruction of the bony walls, and erosion of the ossicular chain. CT demonstrated high sensitivity for the diagnosis of cholesteatoma. (author) 25 refs., 10 figs.

  16. A comparative study of two fast nonlinear free-surface water wave models

    DEFF Research Database (Denmark)

    Ducrozet, Guillaume; Bingham, Harry B.; Engsig-Karup, Allan Peter

    2012-01-01

    simply directly solves the three-dimensional problem. Both models have been well validated on standard test cases and shown to exhibit attractive convergence properties and an optimal scaling of the computational effort with increasing problem size. These two models are compared for solution of a typical...... used in OceanWave3D, the closer the results come to the HOS model....

  17. Prospective study comparing three-dimensional computed tomography and magnetic resonance imaging for evaluating the renal vascular anatomy in potential living renal donors.

    Science.gov (United States)

    Bhatti, Aftab A; Chugtai, Aamir; Haslam, Philip; Talbot, David; Rix, David A; Soomro, Naeem A

    2005-11-01

    To prospectively compare the accuracy of multislice spiral computed tomographic angiography (CTA) and magnetic resonance angiography (MRA) in evaluating the renal vascular anatomy in potential living renal donors. Thirty-one donors underwent multislice spiral CTA and gadolinium-enhanced MRA. In addition to axial images, multiplanar reconstruction and maximum intensity projections were used to display the renal vascular anatomy. Twenty-four donors had a left laparoscopic donor nephrectomy (LDN), whereas seven had right open donor nephrectomy (ODN); LDN was only considered if the renal vascular anatomy was favourable on the left. CTA and MRA images were analysed by two radiologists independently. The radiological and surgical findings were correlated after the surgery. CTA showed 33 arteries and 32 veins (100% sensitivity) whereas MRA showed 32 arteries and 31 veins (97% sensitivity). CTA detected all five accessory renal arteries whereas MRA only detected one. CTA also identified all three accessory renal veins whereas MRA identified two. CTA had a sensitivity of 97% and 47% for left lumbar and left gonadal veins, whereas MRA had a sensitivity of 74% and 46%, respectively. Multislice spiral CTA with three-dimensional reconstruction was more accurate than MRA for both renal arterial and venous anatomy.

  18. Diagnosis of asbestosis by a time expanded wave form analysis, auscultation and high resolution computed tomography: a comparative study.

    Science.gov (United States)

    al Jarad, N; Strickland, B; Bothamley, G; Lock, S; Logan-Sinclair, R; Rudd, R M

    1993-01-01

    BACKGROUND--Crackles are a prominent clinical feature of asbestosis and may be an early sign of the condition. Auscultation, however, is subjective, and interexaminer disagreement is a problem. Computerised lung sound analysis can visualise, store, and analyse lung sounds, and disagreement on the presence of crackles is minimal. High resolution computed tomography (HRCT) is superior to chest radiography in detecting early signs of asbestosis. The aim of this study was to compare clinical auscultation, time expanded wave form analysis (TEW), chest radiography, and HRCT in detecting signs of asbestosis in asbestos workers. METHODS--Fifty-three asbestos workers (51 men and two women) were investigated. Chest radiography and HRCT were assessed by two independent readers for detection of interstitial opacities. HRCT was performed in the supine position, with additional sections at the bases in the prone position. Auscultation for persistent fine inspiratory crackles was performed by two independent examiners unacquainted with the diagnosis. TEW analysis was obtained from a 33-second recording of lung sounds over the lung bases. TEW and auscultation were also performed in a control group of 13 subjects who had a normal chest radiograph, of whom 10 were current smokers and three previous smokers. In asbestos workers the extent of pulmonary opacities on the chest radiograph was scored according to the International Labour Office (ILO) scale. Patients were divided into two groups: 21 patients in whom the chest radiograph was scored > 1/0 (group 1) and 32 patients in whom it was not (group 2). RESULTS--In group 2, crackles were detected by auscultation in seven (22%) patients and by TEW in 14 (44%). HRCT detected definite interstitial opacities in 11 (34%) and gravity dependent subpleural lines in two (6%) patients. All but two patients with evidence of interstitial disease or gravity dependent subpleural lines on HRCT had crackles detected by TEW. In patients with an ILO score of > 1/0 auscultation and TEW revealed mid to late

  19. A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class

    Science.gov (United States)

    Yuen, Timothy T.; Robbins, Kay A.

    2014-01-01

    Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…

  20. F-18 sodium fluoride positron emission tomography/computed tomography for detection of thyroid cancer bone metastasis compared with bone scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyun Jong; Lee, Won Woo; Park, So Yeon; Kim, Sang Eun [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2016-04-15

    The aim of the study was to compare the diagnostic performance of F-18 sodium fluoride positron emission tomography/computed tomography (bone PET/CT) and bone scintigraphy (BS) for the detection of thyroid cancer bone metastasis. We retrospectively enrolled 6 thyroid cancer patients (age = 44.7 ± 9.8 years, M:F = 1:5, papillary:follicular = 2:4) with suspected bone metastatic lesions on whole-body iodine scintigraphy or BS who subsequently underwent bone PET/CT. Pathologic diagnosis was obtained for 4 lesions in 4 patients. Of the 17 suspected bone lesions, 10 were metastatic and 7 benign. Compared with BS, bone PET/CT exhibited superior sensitivity (10/10 = 100% vs. 2/10 = 20%, p = 0.008) and accuracy (14/17 = 82.4% vs. 7/17 = 41.2%, p < 0.025). The specificity of bone PET/CT (4/7 = 57.1%) was not significantly different from that of BS (5/7 = 71.4%, p > 0.05). Bone PET/CT may be more sensitive and accurate than BS for the detection of thyroid cancer bone metastasis.
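
    Paired comparisons of sensitivity and accuracy on the same set of lesions, such as those quoted above, are commonly tested with McNemar's exact test on the discordant pairs; whether the authors used exactly this test is an assumption. The sketch below implements the test from scratch, and the example counts are chosen for illustration (8 lesions detected only by PET/CT and none only by BS give p of about 0.008, consistent with the value reported above).

      from math import comb

      def mcnemar_exact(b, c):
          """Two-sided exact McNemar test for paired binary outcomes.

          b = cases positive on modality A only, c = positive on modality B only.
          Under the null hypothesis the discordant pairs follow Binomial(b + c, 0.5).
          """
          n = b + c
          k = min(b, c)
          p = 2 * sum(comb(n, i) * 0.5 ** n for i in range(k + 1))  # two-sided
          return min(p, 1.0)

      # Illustrative example: 8 discordant lesions, all favouring one modality.
      print(mcnemar_exact(b=8, c=0))  # 0.0078125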