WorldWideScience

Sample records for computational database screening

  1. A database for CO2 Separation Performances of MOFs based on Computational Materials Screening.

    Science.gov (United States)

    Altintas, Cigdem; Avci, Gokay; Daglar, Hilal; Nemati Vesali Azar, Ayda; Velioglu, Sadiye; Erucar, Ilknur; Keskin, Seda

    2018-05-03

    Metal organic frameworks (MOFs) have been considered great candidates for CO2 capture. Given the very large number of available MOFs, high-throughput computational screening plays a critical role in identifying the top-performing materials for target applications in a time-effective manner. In this work, we used molecular simulations to screen the most recent and complete MOF database to identify the most promising materials for CO2 separation from flue gas (CO2/N2) and landfill gas (CO2/CH4) under realistic operating conditions. We first validated our approach by comparing the results of our molecular simulations for the CO2 uptakes and the CO2/N2 and CO2/CH4 selectivities of various types of MOFs with the available experimental data. We then computed binary CO2/N2 and CO2/CH4 mixture adsorption data for the entire MOF database and used these results to calculate several adsorbent selection metrics, such as selectivity, working capacity, adsorbent performance score, regenerability, and separation potential. MOFs were ranked based on the combination of these metrics, and the top-performing MOF adsorbents that can achieve high-performance CO2/N2 and CO2/CH4 separations were identified. Molecular simulations for the adsorption of a ternary CO2/N2/CH4 mixture were performed for these top materials in order to provide a more realistic performance assessment of MOF adsorbents. Structure-performance analysis showed that MOFs with ΔQ > 30 kJ/mol, 3.8 Å ≤ PLD ≤ 5 Å, 5 Å ≤ LCD ≤ 7.5 Å, 0.5 ≤ ϕ ≤ 0.75, SA ≤ 1,000 m²/g, and ρ > 1 g/cm³ are the best candidates for selective separation of CO2 from flue gas and landfill gas. This information will be very useful for designing novel MOFs with the desired structural features that can lead to high CO2 separation potentials. Finally, an online, freely accessible database, https://cosmoserc.ku.edu.tr, was established, for the first time in the literature, which reports all computed adsorbent metrics of 3,816 MOFs for CO2/N2 and CO2/CH4 separations.
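
    As a rough illustration of the adsorbent selection metrics named in this abstract, the sketch below uses common literature definitions (selectivity, working capacity, their product as an adsorbent performance score, and regenerability). All uptake numbers are invented, and the formulas are standard simplifications, not necessarily the authors' exact implementation.

```python
# Hypothetical illustration of common adsorbent selection metrics.
# Uptake values (mol/kg) and compositions below are invented examples.

def selectivity(x_co2, x_other, y_co2, y_other):
    """Adsorption selectivity: (adsorbed-phase ratio) / (gas-phase ratio)."""
    return (x_co2 / x_other) / (y_co2 / y_other)

def working_capacity(n_ads, n_des):
    """Uptake difference between adsorption and desorption conditions."""
    return n_ads - n_des

def regenerability(n_ads, n_des):
    """Fraction of adsorbed gas recovered on desorption, in percent."""
    return 100.0 * (n_ads - n_des) / n_ads

# Example: CO2/N2 at a flue-gas-like composition y = 0.15/0.85
s = selectivity(x_co2=2.4, x_other=0.12, y_co2=0.15, y_other=0.85)
wc = working_capacity(n_ads=2.4, n_des=0.4)
aps = s * wc   # adsorbent performance score: selectivity x working capacity
r = regenerability(n_ads=2.4, n_des=0.4)
print(round(s, 1), round(wc, 2), round(aps, 1), round(r, 1))
```

    Ranking MOFs by a combination of such metrics, rather than by any single one, is what lets a screen discard materials that adsorb strongly but regenerate poorly.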

  2. iScreen: world's first cloud-computing web server for virtual screening and de novo drug design based on TCM database@Taiwan.

    Science.gov (United States)

    Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian

    2011-06-01

    Rapidly advancing research on traditional Chinese medicine (TCM) has greatly intrigued pharmaceutical industries worldwide. To take the initiative in the next generation of drug development, we constructed a cloud-computing system for TCM intelligent screening (iScreen) based on TCM Database@Taiwan. iScreen is a compact web server for TCM docking followed by customized de novo drug design. We further implemented a protein preparation tool that both extracts the protein of interest from a raw input file and estimates the size of the ligand binding site. In addition, iScreen is designed with a user-friendly graphical interface for users who have less experience with command-line systems. For customized docking, multiple docking services, including standard, in-water, pH-environment, and flexible docking modes, are implemented. Users can download the first 200 TCM compounds from the best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors for a user's interest. iScreen is the world's first web server that employs the world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research into a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/.

  4. High-Throughput Computational Screening of the Metal Organic Framework Database for CH4/H2 Separations.

    Science.gov (United States)

    Altintas, Cigdem; Erucar, Ilknur; Keskin, Seda

    2018-01-31

    Metal organic frameworks (MOFs) have been considered one of the most exciting porous materials discovered in the last decade. Large surface areas, high pore volumes, and tailorable pore sizes make MOFs highly promising for a variety of applications, mainly gas separations. The number of MOFs has been increasing very rapidly, and experimental identification of the materials exhibiting high gas separation potential is simply impractical. High-throughput computational screening studies, in which thousands of MOFs are evaluated to identify the best candidates for a target gas separation, are crucial in directing experimental efforts to the most useful materials. In this work, we used molecular simulations to screen the most complete and recent collection of MOFs from the Cambridge Structural Database to unlock their CH4/H2 separation performances. This is the first study in the literature that examines the potential of all existing MOFs for adsorption-based CH4/H2 separation. A total of 4,350 MOFs were ranked based on several adsorbent evaluation metrics, including selectivity, working capacity, adsorbent performance score, sorbent selection parameter, and regenerability. A large number of MOFs were identified to have extraordinarily large CH4/H2 selectivities compared with traditional adsorbents such as zeolites and activated carbons. We examined the relations between structural properties of MOFs, such as pore sizes, porosities, and surface areas, and their selectivities. Correlations between the heat of adsorption, adsorbability, metal type of MOFs, and selectivities were also studied. On the basis of these relations, a simple mathematical model that can predict the CH4/H2 selectivity of MOFs was suggested, which will be very useful in guiding the design and development of new MOFs with extraordinarily high CH4/H2 separation performances.

  5. A prediction model-based algorithm for computer-assisted database screening of adverse drug reactions in the Netherlands.

    Science.gov (United States)

    Scholl, Joep H G; van Hunsel, Florence P A M; Hak, Eelko; van Puijenbroek, Eugène P

    2018-02-01

    The statistical screening of pharmacovigilance databases containing spontaneously reported adverse drug reactions (ADRs) is mainly based on disproportionality analysis. The aim of this study was to improve the efficiency of full database screening using a prediction model-based approach. A logistic regression-based prediction model containing 5 candidate predictors was developed and internally validated using the Summary of Product Characteristics as the gold standard for the outcome. All drug-ADR associations, with the exception of those related to vaccines, with a minimum of 3 reports formed the training data for the model. Performance was based on the area under the receiver operating characteristic curve (AUC). Results were compared with the current method of database screening based on the number of previously analyzed associations. A total of 25,026 unique drug-ADR associations formed the training data for the model. The final model contained all 5 candidate predictors (number of reports, disproportionality, reports from healthcare professionals, reports from marketing authorization holders, Naranjo score). The AUC for the full model was 0.740 (95% CI; 0.734-0.747). The internal validity was good based on the calibration curve and bootstrapping analysis (AUC after bootstrapping = 0.739). Compared with the old method, the AUC increased from 0.649 to 0.740, and the proportion of potential signals increased by approximately 50% (from 12.3% to 19.4%). A prediction model-based approach can be a useful tool to create priority-based listings for signal detection in databases of spontaneously reported ADRs. © 2017 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
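
    The scoring-and-ranking idea behind such a model can be sketched as follows. The predictor names mirror the five candidates listed in the abstract, but the coefficients, intercept, and example associations are all invented for illustration; this is not the authors' fitted model.

```python
# Sketch: score drug-ADR associations with a logistic model, then measure
# discrimination with the AUC (rank-sum / Mann-Whitney formulation).
import math

def score(assoc, coef, intercept=-2.0):
    """Logistic-model probability that an association is a true signal."""
    z = intercept + sum(coef[k] * assoc[k] for k in coef)
    return 1.0 / (1.0 + math.exp(-z))

def auc(labels_scores):
    """AUC as the probability a positive outranks a negative (ties = 0.5)."""
    pos = [s for y, s in labels_scores if y == 1]
    neg = [s for y, s in labels_scores if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

coef = {"n_reports": 0.05, "disproportionality": 0.8,
        "hcp_reports": 0.4, "mah_reports": 0.2, "naranjo": 0.3}
data = [  # (listed in SmPC?, predictor values) - fabricated examples
    (1, {"n_reports": 40, "disproportionality": 2.1, "hcp_reports": 1, "mah_reports": 0, "naranjo": 5}),
    (1, {"n_reports": 12, "disproportionality": 1.2, "hcp_reports": 1, "mah_reports": 1, "naranjo": 3}),
    (0, {"n_reports": 3, "disproportionality": 0.4, "hcp_reports": 0, "mah_reports": 1, "naranjo": 1}),
    (0, {"n_reports": 5, "disproportionality": 0.9, "hcp_reports": 0, "mah_reports": 0, "naranjo": 2}),
]
scored = [(y, score(x, coef)) for y, x in data]
print(auc(scored))
```

    In practice the scores, not the labels, are the product: sorting all associations by predicted probability yields the priority-based listing the abstract describes.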

  6. Database computing in HEP

    International Nuclear Information System (INIS)

    Day, C.T.; Loken, S.; MacFarlane, J.F.; May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Baden, A.

    1992-01-01

    The major SSC experiments are expected to produce up to 1 petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes of CDF data samples based on relational and object-oriented databases.

  7. Danish Quality Database for Mammography Screening

    DEFF Research Database (Denmark)

    Mikkelsen, Ellen Margrethe; Njor, Sisse Helle; Vejborg, Ilse Merete Munk

    2016-01-01

    diagnosed with breast cancer between screening rounds, 7) invasive breast tumors, 8) node-negative cancers, 9) invasive tumors ≤10 mm, 10) ratio of surgery for benign vs malignant lesions, and 11) breast-conserving therapy. DESCRIPTIVE DATA: As of August 10, 2015, the database included data from 888...

  8. Computational 2D Materials Database

    DEFF Research Database (Denmark)

    Rasmussen, Filip Anselm; Thygesen, Kristian Sommer

    2015-01-01

    We present a comprehensive first-principles study of the electronic structure of 51 semiconducting monolayer transition-metal dichalcogenides and -oxides in the 2H and 1T hexagonal phases. The quasiparticle (QP) band structures with spin-orbit coupling are calculated in the G(0)W(0) approximation, and comparison is made with different density functional theory descriptions. Pitfalls related to the convergence of GW calculations for two-dimensional (2D) materials are discussed together with possible solutions. The monolayer band edge positions relative to vacuum are used to estimate the band alignment... and used as input to a 2D hydrogenic model to estimate exciton binding energies. Throughout the paper we focus on trends and correlations in the electronic structure rather than detailed analysis of specific materials. All the computed data is available in an open database...

  9. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.
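
    The kind of method-versus-benchmark comparison such a database supports can be sketched as follows; the molecules, reference enthalpies, and "method A" results below are invented stand-ins, not CCCBDB entries.

```python
# Sketch: summarize how far a computational method's predictions fall
# from benchmark experimental values, here via mean absolute deviation.

def mean_absolute_deviation(computed, experimental):
    """Average |computed - experimental| over a benchmark molecule set."""
    diffs = [abs(computed[m] - experimental[m]) for m in experimental]
    return sum(diffs) / len(diffs)

experimental = {"H2O": -241.8, "CO2": -393.5, "CH4": -74.6}  # kJ/mol (reference)
method_a =     {"H2O": -239.9, "CO2": -395.1, "CH4": -73.8}  # hypothetical ab initio results
print(round(mean_absolute_deviation(method_a, experimental), 2))
```

    Repeating this over a fixed molecule set for several methods is what makes the comparisons between ab initio approaches meaningful.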

  10. Computer screens and brain cancer

    International Nuclear Information System (INIS)

    Wood, A.W.

    1995-01-01

    Concern has been expressed in Australia, both in the media and at the federal government level, over possible links between screen-based computer use and cancer, brain tumours in particular. The screen emissions assumed to be the sources of the putative hazard are the magnetic fields responsible for horizontal and vertical scanning of the display. Time-varying fluctuations in these magnetic fields induce electrical current flows in exposed tissues. This paper estimates that the induced current densities in the brain of the computer user are up to 1 mA/m² (due to the vertical flyback). Corresponding values for other electrical appliances or installations are in general much less than this. The epidemiological literature shows no obvious signs of a sudden increase in brain tumour incidence, but the widespread use of computers is a relatively recent phenomenon. The occupational use of other equipment based on cathode ray tubes (such as TV repair) has a much longer history and has been statistically linked to brain tumour in some studies. A number of factors make this an unreliable indicator of the risk from computer screens, however. 42 refs., 3 tabs., 2 figs.
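
    The paper's figure comes from dosimetry of the actual flyback waveform. For orientation only, a much cruder textbook estimate for a sinusoidal field in a homogeneous conducting loop is J = σπf·r·B; every input value below is an assumption, and sawtooth flyback fields give substantially higher dB/dt (and hence current density) than a sinusoid of the same frequency.

```python
# Back-of-the-envelope Faraday-induction estimate of induced current density.
# Not the paper's dosimetry: sinusoidal-field approximation, invented inputs.
import math

sigma = 0.2   # tissue conductivity, S/m (assumed)
f = 50.0      # field frequency, Hz (assumed)
r = 0.07      # conducting-loop radius in the head, m (assumed)
B = 1e-6      # peak flux density at the user's position, T (assumed)

J = sigma * math.pi * f * r * B   # induced current density, A/m^2
print(f"{J * 1000:.2e} mA/m^2")
```

    The gap between such an estimate and the quoted 1 mA/m² illustrates how strongly the result depends on the waveform and field strength assumed.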

  11. Computer Application Of Object Oriented Database Management ...

    African Journals Online (AJOL)

    Object Oriented Systems (OOS) have been widely adopted in software engineering because of their superiority with respect to data extensibility. The present trend in the software engineering process (SEP) towards concurrent computing raises novel concerns for the facilities and technology available in database ...

  12. The CSB Incident Screening Database: description, summary statistics and uses.

    Science.gov (United States)

    Gomez, Manuel R; Casper, Susan; Smith, E Allen

    2008-11-15

    This paper briefly describes the Chemical Incident Screening Database currently used by the CSB to identify and evaluate chemical incidents for possible investigations, and summarizes descriptive statistics from this database that can potentially help to estimate the number, character, and consequences of chemical incidents in the US. The report compares some of the information in the CSB database to roughly similar information available from databases operated by EPA and the Agency for Toxic Substances and Disease Registry (ATSDR), and explores the possible implications of these comparisons with regard to the dimension of the chemical incident problem. Finally, the report explores in a preliminary way whether a system modeled after the existing CSB screening database could be developed to serve as a national surveillance tool for chemical incidents.

  13. Bacterial contamination of computer touch screens.

    Science.gov (United States)

    Gerba, Charles P; Wuollet, Adam L; Raisanen, Peter; Lopez, Gerardo U

    2016-03-01

    The goal of this study was to determine the occurrence of opportunistic bacterial pathogens on the surfaces of computer touch screens used in hospitals and grocery stores. Opportunistic pathogenic bacteria were isolated from touch screens in hospitals (Clostridium difficile and vancomycin-resistant Enterococcus) and in grocery stores (methicillin-resistant Staphylococcus aureus). Enteric bacteria were more common on grocery store touch screens than on hospital computer touch screens. Published by Elsevier Inc.

  14. Validity of data in the Danish colorectal cancer screening database

    DEFF Research Database (Denmark)

    Thomsen, Mette Kielsholm; Njor, Sisse Helle; Rasmussen, Morten

    2017-01-01

    Background: In Denmark, a nationwide screening program for colorectal cancer was implemented in March 2014. Along with this, a clinical database for program monitoring and research purposes was established. Objective: The aim of this study was to estimate the agreement and validity of diagnosis and procedure codes in the Danish Colorectal Cancer Screening Database (DCCSD). Methods: All individuals with a positive immunochemical fecal occult blood test (iFOBT) result who were invited to screening in the first 3 months since program initiation were identified. From these, a sample of 150 individuals was selected using stratified random sampling by age, gender and region of residence. Data from the DCCSD were compared with data from hospital records, which were used as the reference. Agreement, sensitivity, specificity and positive and negative predictive values were estimated for categories of codes...
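
    The validity measures named in this abstract all derive from a 2x2 confusion table of database codes against the hospital-record reference. A minimal sketch, with invented counts:

```python
# Validity measures from a 2x2 table: tp/fp/fn/tn counts are fabricated.

def validity(tp, fp, fn, tn):
    """Standard agreement and validity measures for a binary code."""
    return {
        "sensitivity": tp / (tp + fn),   # reference-positive found in database
        "specificity": tn / (tn + fp),   # reference-negative absent in database
        "ppv": tp / (tp + fp),           # database code confirmed by reference
        "npv": tn / (tn + fn),           # absent code confirmed by reference
        "agreement": (tp + tn) / (tp + fp + fn + tn),
    }

m = validity(tp=90, fp=5, fn=10, tn=45)
print({k: round(v, 3) for k, v in m.items()})
```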

  15. Computer Screen Use Detection Using Smart Eyeglasses

    Directory of Open Access Journals (Sweden)

    Florian Wahl

    2017-05-01

    Screen use can influence the circadian phase and cause eye strain. Smart eyeglasses with an integrated color light sensor can detect screen use. We present a screen use detection approach based on a light sensor embedded in the bridge of smart eyeglasses. By calculating the light intensity at the user's eyes for different screens and content types, we found only computer screens to have a significant impact on the circadian phase. Our screen use detection is based on ratios between color channels and uses a linear support vector machine to detect screen use. We validated our detection approach in three studies. A test bench was built to detect screen use under different ambient light sources and intensities in a controlled environment. In a lab study, we evaluated recognition performance for different ambient light intensities. Using participant-independent models, we achieved an ROC AUC above 0.9 for ambient light intensities below 200 lx. In a study of typical activities of daily living (ADLs), screen use was detected with an average ROC AUC of 0.83, assuming screen use for 30% of the time.
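
    The pipeline structure (channel ratios as intensity-invariant features, fed to a linear decision function) can be sketched as below. The weights are invented stand-ins for a trained linear SVM and the RGB readings are fabricated; only the overall shape mirrors the paper.

```python
# Sketch: color-channel-ratio features + linear classifier for screen use.
# Weights and bias are hypothetical, standing in for a trained linear SVM.

def channel_ratios(r, g, b):
    """Ratios between color channels, robust to overall light intensity."""
    total = r + g + b
    return (r / total, g / total, b / total)

def is_screen(rgb, w=(-2.0, -1.0, 6.0), bias=-1.0):
    """Linear decision: screen backlights skew blue vs. warm indoor light."""
    x = channel_ratios(*rgb)
    return sum(wi * xi for wi, xi in zip(w, x)) + bias > 0

print(is_screen((80, 90, 200)))   # bluish spectrum
print(is_screen((200, 120, 60)))  # warm indoor light
```

    Normalizing by total intensity is what lets the same decision boundary work across the ambient-light levels the studies vary.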

  16. Screening Information - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Data name: Screening Information. DOI: 10.18908/lsdba.nb...dc00114-003. Description of data contents: information from polymorphism screening experiments. Derived from E... Fields include the sequence used for polymorphism screening, Screened Position (position of the polymorphism in that sequence), Screened Symbol (gene name related to that sequence), and Screened OMIM-ID (OMIM ID related to that sequence).

  17. Distributed Database Access in the LHC Computing Grid with CORAL

    CERN Document Server

    Molnár, Z; Düllmann, D; Giacomo, G; Kalkhof, A; Valassi, A; CERN. Geneva. IT Department

    2009-01-01

    The CORAL package is the LCG Persistency Framework foundation for accessing relational databases. From the start, CORAL has been designed to facilitate the deployment of the LHC experiment database applications in a distributed computing environment. In particular, we cover: improvements to database service scalability through client connection management; platform-independent, multi-tier scalable database access through connection multiplexing and caching; and a secure authentication and authorisation scheme integrated with existing grid services. We summarize the deployment experience from several experiment productions using the distributed database infrastructure now available in LCG. Finally, we present perspectives for future developments in this area.

  18. CamMedNP: building the Cameroonian 3D structural natural products database for virtual screening.

    Science.gov (United States)

    Ntie-Kang, Fidele; Mbah, James A; Mbaze, Luc Meva'a; Lifongo, Lydia L; Scharfe, Michael; Hanna, Joelle Ngo; Cho-Ngwa, Fidelis; Onguéné, Pascal Amoa; Owono Owono, Luc C; Megnassan, Eugene; Sippl, Wolfgang; Efange, Simon M N

    2013-04-16

    Computer-aided drug design (CADD) often involves virtual screening (VS) of large compound datasets, and the availability of such datasets is vital for drug discovery protocols. We present CamMedNP, a new database beginning with more than 2,500 compounds of natural origin, along with some of their derivatives obtained through hemisynthesis. These are pure compounds which have been previously isolated and characterized using modern spectroscopic methods and published by several research teams spread across Cameroon. In the present study, 224 distinct medicinal plant species belonging to 55 plant families from the Cameroonian flora have been considered. About 80% of these have been previously published and/or referenced in internationally recognized journals. For each compound, the optimized 3D structure, drug-like properties, plant source, collection site and currently known biological activities are given, as well as literature references. We have evaluated the "drug-likeness" of this database using Lipinski's "Rule of Five". A diversity analysis has been carried out in comparison with the ChemBridge diverse database. CamMedNP could be highly useful for database screening and natural product lead generation programs.
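
    The Rule-of-Five evaluation mentioned above amounts to a simple threshold filter over four descriptors. A minimal sketch, with invented descriptor values (a real workflow would compute them from the database's 3D structures, e.g. with a cheminformatics toolkit):

```python
# Lipinski "Rule of Five" drug-likeness filter on precomputed descriptors.
# Descriptor values below are hypothetical examples.

def lipinski_pass(mw, logp, h_donors, h_acceptors):
    """Drug-like if at most one of the four thresholds is violated."""
    violations = sum([mw > 500,        # molecular weight, g/mol
                      logp > 5,        # octanol-water partition coefficient
                      h_donors > 5,    # hydrogen-bond donors
                      h_acceptors > 10])  # hydrogen-bond acceptors
    return violations <= 1

print(lipinski_pass(mw=320.4, logp=2.1, h_donors=2, h_acceptors=5))   # drug-like
print(lipinski_pass(mw=712.9, logp=6.3, h_donors=7, h_acceptors=12))  # not
```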

  19. Simple re-instantiation of small databases using cloud computing.

    Science.gov (United States)

    Tan, Tin Wee; Xie, Chao; De Silva, Mark; Lim, Kuan Siong; Patro, C Pawan K; Lim, Shen Jean; Govindarajan, Kunde Ramamoorthy; Tong, Joo Chuan; Choo, Khar Heng; Ranganathan, Shoba; Khan, Asif M

    2013-01-01

    Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation, and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on-demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines on two popular full-virtualization cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Herein, we demonstrate that a relatively inexpensive solution can be implemented for archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear.

  20. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    In this work, we present an innovative image recognition technique based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images into graphs and then compute Minimum Spanning Trees (MSTs) in them. We apply this framework and present an application that automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
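
    Once an image has been turned into a weighted graph, the MST step is standard. A sketch using Kruskal's algorithm (the abstract does not say which MST algorithm the authors use, and the graph below is invented; in the application, weights would come from image-derived transit data):

```python
# Minimum spanning tree via Kruskal's algorithm with union-find.
# Nodes stand in for points of interest; edge weights are fabricated.

def kruskal(n, edges):
    """MST of an n-node graph; edges are (weight, u, v) tuples."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    tree, total = [], 0
    for w, u, v in sorted(edges):          # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                       # no cycle: keep this edge
            parent[ru] = rv
            tree.append((u, v))
            total += w
    return tree, total

edges = [(4, 0, 1), (1, 1, 2), (3, 0, 2), (2, 2, 3), (5, 1, 3)]
tree, total = kruskal(4, edges)
print(tree, total)
```

    The resulting tree connects every point of interest at minimum total cost, which is exactly the "efficient construction plan" the application computes.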

  1. The SQL Server Database for Non Computer Professional Teaching Reform

    Science.gov (United States)

    Liu, Xiangwei

    2012-01-01

    This paper summarizes the teaching methods for the SQL Server database course taken by non-computer majors and analyzes the current state of the course. Based on the characteristics of the curriculum for non-computer majors, it puts forward several teaching reform methods and puts them into practice, improving students' analytical ability, practical ability and…

  2. CERN database services for the LHC computing grid

    Energy Technology Data Exchange (ETDEWEB)

    Girone, M [CERN IT Department, CH-1211 Geneva 23 (Switzerland)], E-mail: maria.girone@cern.ch

    2008-07-15

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed.

  3. CERN database services for the LHC computing grid

    International Nuclear Information System (INIS)

    Girone, M

    2008-01-01

    Physics meta-data stored in relational databases play a crucial role in the Large Hadron Collider (LHC) experiments and also in the operation of the Worldwide LHC Computing Grid (WLCG) services. A large proportion of non-event data such as detector conditions, calibration, geometry and production bookkeeping relies heavily on databases. Also, the core Grid services that catalogue and distribute LHC data cannot operate without a reliable database infrastructure at CERN and elsewhere. The Physics Services and Support group at CERN provides database services for the physics community. With an installed base of several TB-sized database clusters, the service is designed to accommodate growth for data processing generated by the LHC experiments and LCG services. During the last year, the physics database services went through a major preparation phase for LHC start-up and are now fully based on Oracle clusters on Intel/Linux. Over 100 database server nodes are deployed today in some 15 clusters serving almost 2 million database sessions per week. This paper will detail the architecture currently deployed in production and the results achieved in the areas of high availability, consolidation and scalability. Service evolution plans for the LHC start-up will also be discussed

  4. Computational screening of mixed metal halide ammines

    DEFF Research Database (Denmark)

    Jensen, Peter Bjerre; Lysgaard, Steen; Quaade, Ulrich

    2013-01-01

    Metal halide ammines, e.g. Mg(NH3)6Cl2 and Sr(NH3)8Cl2, can reversibly store ammonia with high volumetric hydrogen storage capacities. Storage in the halide ammines is very safe, and the salts are therefore highly relevant as a carbon-free energy carrier in a future transportation infrastructure... selection. The GA evolves from an initial (random) population, selecting the candidates with the highest fitness, a function based on e.g. stability, release temperature and storage capacity. The search space includes all alkaline, alkaline earth, 3d and 4d metals and the four lightest halides. In total, the search space consists of millions of combinations, which makes a GA ideal for reducing the number of necessary calculations. We are screening for a one-step release from either a hexa- or octa-ammine, and we have found promising candidates, which will be further investigated, both computationally...
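
    The evolve-select-mutate loop described above can be sketched in miniature. Everything below is a toy: candidates are (metal, halide) pairs, and the fitness function is an arbitrary stand-in for the DFT-based stability/release-temperature/capacity score the study actually uses.

```python
# Toy genetic-algorithm screen over (metal, halide) candidates.
# The fitness function is invented; a real screen scores DFT properties.
import random

random.seed(0)
METALS = ["Mg", "Ca", "Sr", "Mn", "Fe", "Ni", "Zn"]
HALIDES = ["F", "Cl", "Br", "I"]

def fitness(candidate):
    """Arbitrary stand-in score in [0, 5]; higher is 'better'."""
    metal, halide = candidate
    return (METALS.index(metal) % 3) + HALIDES.index(halide)

def evolve(pop, generations=10):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]           # keep the fittest half
        children = [(random.choice(parents)[0],  # crossover: mix genes
                     random.choice(parents)[1]) for _ in parents]
        pop = parents + children
        for i, (m, h) in enumerate(pop):         # occasional mutation
            if random.random() < 0.1:
                pop[i] = (random.choice(METALS), h)
    return max(pop, key=fitness)

pop = [(random.choice(METALS), random.choice(HALIDES)) for _ in range(8)]
best = evolve(pop)
print(best, fitness(best))
```

    The point of the GA is exactly what the abstract states: only a small, evolving fraction of the millions of combinations ever needs an expensive evaluation.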

  5. Computer system for International Reactor Pressure Vessel Materials Database support

    International Nuclear Information System (INIS)

    Arutyunjan, R.; Kabalevsky, S.; Kiselev, V.; Serov, A.

    1997-01-01

    This report describes the computer tools developed at the IAEA to support the International Reactor Pressure Vessel Materials Database. Work focused on search, retrieval, analysis, presentation and export of raw, qualified and processed materials data. The developed software has the following main functions: it provides tools for querying and searching any type of data in the database; the capability to update existing information in the database; the capability to present and print selected data; the possibility to export, on a yearly basis, the run-time IRPVMDB with raw, qualified and processed materials data to Database members; and the capability to export any selected sets of raw, qualified or processed materials data.

  6. Digital Dental X-ray Database for Caries Screening

    Science.gov (United States)

    Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila

    2016-06-01

    A standard database is an essential requirement for comparing the performance of image analysis techniques, and a main issue in dental image analysis has been the lack of an available image database, which this paper provides. Periapical dental X-ray images, suitable for analysis and approved by many dental experts, were collected. This type of dental radiograph imaging is common and inexpensive and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 periapical X-ray images covering the upper and lower jaws. This digital dental database is constructed to provide a common source with which researchers can apply and compare image analysis techniques and improve the performance of each technique.

  7. Ionic Liquids for Absorption and Separation of Gases: An Extensive Database and a Systematic Screening Method

    DEFF Research Database (Denmark)

    Zhao, Yongsheng; Gani, Rafiqul; Afzal, Raja Muhammad

    2017-01-01

    requirements remains a challenging task. In this study, an extensive database of estimated Henry's law constants of twelve gases in more than ten thousand ILs at 313.15 K is established using the COSMO-RS method. Based on the database, a new systematic and efficient screening method for IL selection...
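
    The ranking step such a Henry's-law-constant database enables can be sketched as follows. For physical absorption, a lower Henry's constant means higher solubility, and the ratio of constants gives an ideal selectivity. The IL names and values below are invented placeholders, not entries from the database, and the ratio is a standard simplification rather than the study's full screening method.

```python
# Sketch: rank ionic liquids for CO2/N2 separation by ideal selectivity
# computed from (fabricated) Henry's law constants at 313.15 K, in bar.

henry = {
    "IL-A": {"CO2": 35.0, "N2": 1600.0},
    "IL-B": {"CO2": 55.0, "N2": 1400.0},
    "IL-C": {"CO2": 28.0, "N2": 900.0},
}

def ideal_selectivity(il, solute="CO2", inert="N2"):
    """Higher is better: inert stays in the gas, solute dissolves."""
    return henry[il][inert] / henry[il][solute]

ranked = sorted(henry, key=ideal_selectivity, reverse=True)
print(ranked[0], round(ideal_selectivity(ranked[0]), 1))
```

    Note that solubility alone (the CO2 constant) and selectivity (the ratio) can rank ILs differently, which is why a systematic method weighs several criteria.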

  8. Benchmarking Ligand-Based Virtual High-Throughput Screening with the PubChem Database

    Directory of Open Access Journals (Sweden)

    Mariusz Butkiewicz

    2013-01-01

    With the rapidly increasing availability of High-Throughput Screening (HTS) data in the public domain, such as the PubChem database, methods for ligand-based computer-aided drug discovery (LB-CADD) have the potential to accelerate and reduce the cost of probe development and drug discovery efforts in academia. We assemble nine data sets from realistic HTS campaigns representing major families of drug target proteins for benchmarking LB-CADD methods. Each data set is public domain through PubChem and carefully collated through confirmation screens validating active compounds. These data sets provide the foundation for benchmarking a new cheminformatics framework, BCL::ChemInfo, which is freely available for non-commercial use. Quantitative structure-activity relationship (QSAR) models are built using Artificial Neural Networks (ANNs), Support Vector Machines (SVMs), Decision Trees (DTs), and Kohonen networks (KNs). Problem-specific descriptor optimization protocols are assessed, including Sequential Feature Forward Selection (SFFS) and various information content measures. Measures of predictive power and confidence are evaluated through cross-validation, and a consensus prediction scheme is tested that combines orthogonal machine learning algorithms into a single predictor. Enrichments ranging from 15 to 101 for a TPR cutoff of 25% are observed.
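
    One common way to compute the enrichment quoted above: rank compounds by predicted activity, take the shortest top slice that recovers the TPR cutoff's share of true actives, and compare its hit rate to random picking. The scores below are invented, and the benchmark's exact definition may differ in detail.

```python
# Sketch: hit-rate enrichment at a true-positive-rate cutoff.
# scored: list of (is_active, predicted_score) pairs; all values fabricated.

def enrichment(scored, tpr_cutoff=0.25):
    """Hit rate in the top slice recovering tpr_cutoff of actives,
    divided by the overall (random-picking) hit rate."""
    ranked = sorted(scored, key=lambda p: p[1], reverse=True)
    n_active = sum(y for y, _ in ranked)
    need = tpr_cutoff * n_active
    found = taken = 0
    for y, _ in ranked:
        taken += 1
        found += y
        if found >= need:
            break
    return (found / taken) / (n_active / len(ranked))

scored = [(1, 0.9), (0, 0.8), (1, 0.7), (0, 0.6), (0, 0.5),
          (0, 0.4), (0, 0.3), (0, 0.2), (0, 0.1), (0, 0.05)]
print(enrichment(scored))
```

    An enrichment of 15-101 thus means a screener testing only the model's top picks finds actives 15-101 times more often than by random selection.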

  9. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
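The core idea above, that likelihood-based fitting needs only aggregates of local calculations rather than raw records, can be sketched with a toy logistic regression: each site returns its local log-likelihood gradient, and the coordinator sums these to take an optimization step. The two-site data and the gradient-ascent loop are illustrative, not the paper's R software:

```python
import math

def local_gradient(beta, rows):
    """Per-site gradient of the logistic log-likelihood.
    Only this aggregate leaves the site; patient rows never do."""
    g = [0.0] * len(beta)
    for x, y in rows:
        p = 1.0 / (1.0 + math.exp(-sum(b * xi for b, xi in zip(beta, x))))
        for j, xi in enumerate(x):
            g[j] += (y - p) * xi
    return g

site_a = [((1.0, 0.5), 1), ((1.0, -1.0), 0)]   # (features, outcome) per patient
site_b = [((1.0, 2.0), 1), ((1.0, 0.1), 0)]

beta, lr = [0.0, 0.0], 0.1
for _ in range(200):
    # Coordinator: sum per-site gradients and step; because the pooled
    # gradient is the sum of site gradients, this matches a pooled-data fit.
    grads = [local_gradient(beta, site) for site in (site_a, site_b)]
    total = [sum(g[j] for g in grads) for j in range(len(beta))]
    beta = [b + lr * t for b, t in zip(beta, total)]

print(beta)
```

The same decomposition is what makes the site-stratified Cox model and distributed singular value decomposition mentioned above possible: the intermediate quantities exchanged are low-dimensional summaries that local privacy officials can inspect.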

  10. Needs assessment for next generation computer-aided mammography reference image databases and evaluation studies.

    Science.gov (United States)

    Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias

    2011-11-01

    Breast cancer is globally a major threat for women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, not any CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as data source for evaluation of mammography CAD systems, and the application of statistical evaluation methods were found highly diverse. Results reported from different studies are therefore hardly comparable. 
Drawbacks of the DDSM
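The evaluation schemes the study tallies (train/test, leave-one-out, N-fold cross-validation) differ mainly in how test items are partitioned; a minimal N-fold splitter (illustrative, not any DDSM study's code) makes the distinction concrete, with leave-one-out being N-fold where N equals the number of cases:

```python
def n_fold_splits(items, n):
    """Yield (train, test) partitions; each item lands in exactly one test fold."""
    folds = [items[i::n] for i in range(n)]
    for i, test in enumerate(folds):
        train = [x for j, f in enumerate(folds) if j != i for x in f]
        yield train, test

cases = list(range(10))
for train, test in n_fold_splits(cases, 5):
    # Folds are disjoint and jointly cover the whole case set.
    assert set(train) | set(test) == set(cases)
    assert not set(train) & set(test)

print(sum(len(test) for _, test in n_fold_splits(cases, 5)))
```

Reproducibility of the kind the study finds lacking would additionally require publishing the exact case identifiers assigned to each fold, not just the scheme.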

  11. Computer Simulation of Breast Cancer Screening

    National Research Council Canada - National Science Library

    Boone, John

    1999-01-01

    Breast cancer will affect approximately 12.5% of the women in the United States, and currently mammographic screening is considered the best way to reduce mortality from this disease through early detection...

  12. Computer-Aided Solvent Screening for Biocatalysis

    DEFF Research Database (Denmark)

    Abildskov, Jens; Leeuwen, M.B. van; Boeriu, C.G.

    2013-01-01

constrained properties related to chemical reaction equilibrium, substrate and product solubility, water solubility, boiling points, toxicity and others. Two examples are provided, covering the screening of solvents for lipase-catalyzed transesterification of octanol and inulin with vinyl laurate. Esterification of acrylic acid with octanol is also addressed. Solvents are screened and candidates identified, confirming existing experimental results. Although the examples involve lipases, the method is quite general, so there seems to be no preclusion against application to other biocatalysts.

  13. Computer-assisted indexing for the INIS database

    International Nuclear Information System (INIS)

    Nevyjel, A.

    2006-01-01

INIS has identified computer-assisted indexing as an area where information technology could assist best in maintaining database quality and indexing consistency, while containing production costs. Subject analysis is a very important but also very expensive process in the production of the INIS database. Given the current necessity to process an increased number of records, including subject analysis, without additional staff, INIS and its member states need improvements in their processing efficiency. Computer-assisted subject analysis is a promising way to achieve this. The quality of the INIS database is defined by its inputting rules. The Thesaurus is a terminological control device used in translating from the natural language of documents, indexers or users into a more constrained system language. It is a controlled and dynamic vocabulary of semantically and generically related terms. It is the essential tool for subject analysis as well as for advanced search engines. To support the identification of descriptors in the free text (title, abstract, free keywords), 'hidden terms' have been introduced as an extension of the Thesaurus; these identify phrases or character strings of free text and point to the valid descriptor that should be suggested. In the process of computer-assisted subject analysis the bibliographic records (including title and abstract) are analyzed by the software, resulting in a list of suggested descriptors. Within the working platform (graphical user interface) the suggested descriptors are sorted by importance (by their relevance for the content of the document) and the subject specialist clearly sees the highlighted context from which the terms were selected. The system allows the subject specialist to accept or reject descriptors from the suggested list and to assign additional descriptors when necessary. First experiences show that a performance enhancement of about 80-100% can be achieved in the subject analysis process. 
(author)
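The 'hidden term' lookup described above, where free-text phrases are matched against an extension table pointing each surface form at a valid thesaurus descriptor, can be sketched as follows. The table entries are invented for illustration and are not actual INIS Thesaurus data:

```python
# Hypothetical hidden-term table: surface phrase -> valid thesaurus descriptor.
HIDDEN_TERMS = {
    "atomic pile": "NUCLEAR REACTORS",
    "heavy water": "DEUTERIUM OXIDE",
    "spent fuel":  "SPENT FUELS",
}

def suggest_descriptors(text):
    """Return descriptors whose hidden terms occur in the free text
    (title, abstract, free keywords), as candidates for the indexer."""
    text = text.lower()
    return sorted({d for phrase, d in HIDDEN_TERMS.items() if phrase in text})

title = "Storage of spent fuel from heavy water moderated atomic piles"
print(suggest_descriptors(title))
```

A real system would add ranking by relevance, as the abstract describes, and leave the final accept/reject decision with the subject specialist.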

  14. An Algorithm for Computing Screened Coulomb Scattering in Geant4

    OpenAIRE

    Mendenhall, Marcus H.; Weller, Robert A.

    2004-01-01

An algorithm has been developed for the Geant4 Monte-Carlo package for the efficient computation of screened Coulomb interatomic scattering. It explicitly integrates the classical equations of motion for scattering events, resulting in precise tracking of both the projectile and the recoil target nucleus. The algorithm permits the user to plug in an arbitrary screening function, such as Lens-Jensen screening, which is good for backscattering calculations, or Ziegler-Biersack-Littmark screening...

  15. The Candidate Cancer Gene Database: a database of cancer driver genes from forward genetic screens in mice.

    Science.gov (United States)

    Abbott, Kenneth L; Nyre, Erik T; Abrahante, Juan; Ho, Yen-Yi; Isaksson Vogel, Rachel; Starr, Timothy K

    2015-01-01

    Identification of cancer driver gene mutations is crucial for advancing cancer therapeutics. Due to the overwhelming number of passenger mutations in the human tumor genome, it is difficult to pinpoint causative driver genes. Using transposon mutagenesis in mice many laboratories have conducted forward genetic screens and identified thousands of candidate driver genes that are highly relevant to human cancer. Unfortunately, this information is difficult to access and utilize because it is scattered across multiple publications using different mouse genome builds and strength metrics. To improve access to these findings and facilitate meta-analyses, we developed the Candidate Cancer Gene Database (CCGD, http://ccgd-starrlab.oit.umn.edu/). The CCGD is a manually curated database containing a unified description of all identified candidate driver genes and the genomic location of transposon common insertion sites (CISs) from all currently published transposon-based screens. To demonstrate relevance to human cancer, we performed a modified gene set enrichment analysis using KEGG pathways and show that human cancer pathways are highly enriched in the database. We also used hierarchical clustering to identify pathways enriched in blood cancers compared to solid cancers. The CCGD is a novel resource available to scientists interested in the identification of genetic drivers of cancer. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Do Computers Write on Electric Screens?

    Directory of Open Access Journals (Sweden)

    Samuel Goyet

    2016-09-01

Full Text Available How do we, humans, communicate with computers, or computational machines? What activities do humans and machines share, and what are the meeting points between the two? Finally, how can we build concepts of these meeting points that leave space for the proper mode of existence of both humans and machines, without subduing one to the other? Computers are machines that operate on a scale different from that of humans: the calculations done by machines are too fast and intangible for humans. This is why computer activity has to be textualized, put into a form that humans can understand, for instance a graphical interface or a command line. More generally, this article tackles the problem of the interface between humans and machines, and the way the relation between humans and machines has been conceptualized. It is inspired both by the philosophy of modes of existence – since computers are machines with their own mode of existence – and by semiotics, since computer activities have to be converted into some sort of signs that humans can read.

  17. SCREENING CHEMICALS FOR ESTROGEN RECEPTOR BIOACTIVITY USING A COMPUTATIONAL MODEL

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) is considering the use high-throughput and computational methods for regulatory applications in the Endocrine Disruptor Screening Program (EDSP). To use these new tools for regulatory decision making, computational methods must be a...

  18. BUSINESS MODELLING AND DATABASE DESIGN IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Mihai-Constantin AVORNICULUI

    2015-04-01

Full Text Available Electronic commerce has grown constantly from one year to the next over the last decade; few areas register comparable growth. It covers the exchange of computerized data, but also electronic messaging, linear data banks and electronic payment transfer. Cloud computing, a relatively new concept and term, is a model for accessing, over the internet, distributed systems of configurable computing resources that can be provisioned on demand quickly and with minimal management effort or intervention from the client and the provider. Behind an electronic commerce system in the cloud there is a database which contains the information necessary for the transactions in the system. Business modelling brings many benefits, which makes the design of the database used by electronic commerce systems in the cloud considerably easier.

  19. Artist Material BRDF Database for Computer Graphics Rendering

    Science.gov (United States)

    Ashbaugh, Justin C.

    The primary goal of this thesis was to create a physical library of artist material samples. This collection provides necessary data for the development of a gonio-imaging system for use in museums to more accurately document their collections. A sample set was produced consisting of 25 panels and containing nearly 600 unique samples. Selected materials are representative of those commonly used by artists both past and present. These take into account the variability in visual appearance resulting from the materials and application techniques used. Five attributes of variability were identified including medium, color, substrate, application technique and overcoat. Combinations of these attributes were selected based on those commonly observed in museum collections and suggested by surveying experts in the field. For each sample material, image data is collected and used to measure an average bi-directional reflectance distribution function (BRDF). The results are available as a public-domain image and optical database of artist materials at art-si.org. Additionally, the database includes specifications for each sample along with other information useful for computer graphics rendering such as the rectified sample images and normal maps.

  20. Computed tomographic characteristics of interval and post screen carcinomas in lung cancer screening

    International Nuclear Information System (INIS)

    Scholten, Ernst T.; Horeweg, Nanda; Koning, Harry J. de; Vliegenthart, Rozemarijn; Oudkerk, Matthijs; Mali, Willem P.T.M.; Jong, Pim A. de

    2015-01-01

To analyse computed tomography (CT) findings of interval and post-screen carcinomas in lung cancer screening. Consecutive interval and post-screen carcinomas from the Dutch-Belgian lung cancer screening trial were included. The prior screening and the diagnostic chest CT were reviewed by two experienced radiologists in consensus with knowledge of the tumour location on the diagnostic CT. Sixty-one participants (53 men) were diagnosed with an interval or post-screen carcinoma. Twenty-two (36 %) were in retrospect visible on the prior screening CT. Detection error occurred in 20 cancers and interpretation error in two cancers. Errors involved intrabronchial tumour (n = 5), bulla with wall thickening (n = 5), lymphadenopathy (n = 3), pleural effusion (n = 1) and intraparenchymal solid nodules (n = 8). These were missed because of a broad pleural attachment (n = 4), extensive reticulation surrounding a nodule (n = 1) and extensive scarring (n = 1). No definite explanation other than human error was found in two cases. None of the interval or post-screen carcinomas involved a subsolid nodule. Interval or post-screen carcinomas that were visible in retrospect were mostly due to detection errors of solid nodules, bulla wall thickening or endobronchial lesions. Interval or post-screen carcinomas without explanation other than human errors are rare. (orig.)

  1. Computed tomographic characteristics of interval and post screen carcinomas in lung cancer screening

    Energy Technology Data Exchange (ETDEWEB)

    Scholten, Ernst T. [University Medical Centre, Department of Radiology, Utrecht (Netherlands); Kennemer Gasthuis, Department of Radiology, Haarlem (Netherlands); Horeweg, Nanda [Erasmus University Medical Centre, Department of Public Health, Rotterdam (Netherlands); Erasmus University Medical Centre, Department of Pulmonary Medicine, Rotterdam (Netherlands); Koning, Harry J. de [Erasmus University Medical Centre, Department of Public Health, Rotterdam (Netherlands); Vliegenthart, Rozemarijn [University of Groningen, University Medical Centre Groningen, Department of Radiology, Groningen (Netherlands); University of Groningen, University Medical Centre Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Oudkerk, Matthijs [University of Groningen, University Medical Centre Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Mali, Willem P.T.M.; Jong, Pim A. de [University Medical Centre, Department of Radiology, Utrecht (Netherlands)

    2015-01-15

To analyse computed tomography (CT) findings of interval and post-screen carcinomas in lung cancer screening. Consecutive interval and post-screen carcinomas from the Dutch-Belgian lung cancer screening trial were included. The prior screening and the diagnostic chest CT were reviewed by two experienced radiologists in consensus with knowledge of the tumour location on the diagnostic CT. Sixty-one participants (53 men) were diagnosed with an interval or post-screen carcinoma. Twenty-two (36 %) were in retrospect visible on the prior screening CT. Detection error occurred in 20 cancers and interpretation error in two cancers. Errors involved intrabronchial tumour (n = 5), bulla with wall thickening (n = 5), lymphadenopathy (n = 3), pleural effusion (n = 1) and intraparenchymal solid nodules (n = 8). These were missed because of a broad pleural attachment (n = 4), extensive reticulation surrounding a nodule (n = 1) and extensive scarring (n = 1). No definite explanation other than human error was found in two cases. None of the interval or post-screen carcinomas involved a subsolid nodule. Interval or post-screen carcinomas that were visible in retrospect were mostly due to detection errors of solid nodules, bulla wall thickening or endobronchial lesions. Interval or post-screen carcinomas without explanation other than human errors are rare. (orig.)

  2. Computational Screening of MOFs for Acetylene Separation.

    Science.gov (United States)

    Nemati Vesali Azar, Ayda; Keskin, Seda

    2018-01-01

Efficient separation of acetylene (C2H2) from CO2 and CH4 is important to meet the requirement of high-purity acetylene in various industrial applications. Metal organic frameworks (MOFs) are great candidates for adsorption-based C2H2/CO2 and C2H2/CH4 separations due to their unique properties such as wide range of pore sizes and tunable chemistries. Experimental studies on the limited number of MOFs revealed that MOFs offer remarkable C2H2/CO2 and C2H2/CH4 selectivities based on single-component adsorption data. We performed the first large-scale molecular simulation study to investigate separation performances of 174 different MOF structures for C2H2/CO2 and C2H2/CH4 mixtures. Using the results of molecular simulations, several adsorbent performance evaluation metrics, such as selectivity, working capacity, adsorbent performance score, sorbent selection parameter, and regenerability were computed for each MOF. Based on these metrics, the best adsorbent candidates were identified for both separations. Results showed that the top three most promising MOF adsorbents exhibit C2H2/CO2 selectivities of 49, 47, 24 and C2H2/CH4 selectivities of 824, 684, 638 at 1 bar, 298 K and these are the highest C2H2 selectivities reported to date in the literature. Structure-performance analysis revealed that the best MOF adsorbents have pore sizes between 4 and 11 Å, surface areas in the range of 600-1,200 m2/g and porosities between 0.4 and 0.6 for selective separation of C2H2 from CO2 and CH4. These results will guide the future studies for the design of new MOFs with high C2H2 separation potentials.
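The adsorbent evaluation metrics named in this and the preceding CO2 records can be computed directly from simulated mixture uptakes. The definitions below follow common usage in the MOF screening literature (selectivity from mixture loadings and feed composition, working capacity as adsorption minus desorption loading, performance score as their product, regenerability as the deliverable fraction); the numbers are illustrative, not the paper's:

```python
def adsorbent_metrics(n_ads_1, n_des_1, n_ads_2, n_des_2, y1, y2):
    """Metrics for gas 1 over gas 2 from adsorption/desorption loadings (mol/kg)
    at bulk mole fractions y1, y2."""
    selectivity = (n_ads_1 / n_ads_2) * (y2 / y1)    # mixture adsorption selectivity
    working_cap = n_ads_1 - n_des_1                   # deliverable amount of gas 1
    aps = selectivity * working_cap                   # adsorbent performance score
    regenerability = 100.0 * working_cap / n_ads_1    # R%, high values ease reuse
    return selectivity, working_cap, aps, regenerability

# Equimolar C2H2/CO2 feed: hypothetical uptakes at 1 bar (adsorption)
# and 0.1 bar (desorption) for one candidate MOF.
s, wc, aps, r = adsorbent_metrics(4.0, 1.0, 0.5, 0.2, y1=0.5, y2=0.5)
print(f"S = {s:.1f}, WC = {wc:.1f} mol/kg, APS = {aps:.1f}, R = {r:.0f}%")
```

Ranking thousands of MOFs then reduces to evaluating these few numbers per material and sorting, which is what makes high-throughput screening tractable.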

  3. Computational Screening of MOFs for Acetylene Separation

    Science.gov (United States)

    Nemati Vesali Azar, Ayda; Keskin, Seda

    2018-02-01

    Efficient separation of acetylene (C2H2) from CO2 and CH4 is important to meet the requirement of high-purity acetylene in various industrial applications. Metal organic frameworks (MOFs) are great candidates for adsorption-based C2H2/CO2 and C2H2/CH4 separations due to their unique properties such as wide range of pore sizes and tunable chemistries. Experimental studies on the limited number of MOFs revealed that MOFs offer remarkable C2H2/CO2 and C2H2/CH4 selectivities based on single-component adsorption data. We performed the first large-scale molecular simulation study to investigate separation performances of 174 different MOF structures for C2H2/CO2 and C2H2/CH4 mixtures. Using the results of molecular simulations, several adsorbent performance evaluation metrics, such as selectivity, working capacity, adsorbent performance score, sorbent selection parameter and regenerability were computed for each MOF. Based on these metrics, the best adsorbent candidates were identified for both separations. Results showed that the top three most promising MOF adsorbents exhibit C2H2/CO2 selectivities of 49, 47, 24 and C2H2/CH4 selectivities of 824, 684, 638 at 1 bar, 298 K and these are the highest C2H2 selectivities reported to date in the literature. Structure-performance analysis revealed that the best MOF adsorbents have pore sizes between 4-11 Å, surface areas in the range of 600-1,200 m2/g and porosities between 0.4-0.6 for selective separation of C2H2 from CO2 and CH4. These results will guide the future studies for the design of new MOFs with high C2H2 separation potentials.

  4. Computational Screening of MOFs for Acetylene Separation

    Directory of Open Access Journals (Sweden)

    Ayda Nemati Vesali Azar

    2018-02-01

Full Text Available Efficient separation of acetylene (C2H2) from CO2 and CH4 is important to meet the requirement of high-purity acetylene in various industrial applications. Metal organic frameworks (MOFs) are great candidates for adsorption-based C2H2/CO2 and C2H2/CH4 separations due to their unique properties such as wide range of pore sizes and tunable chemistries. Experimental studies on the limited number of MOFs revealed that MOFs offer remarkable C2H2/CO2 and C2H2/CH4 selectivities based on single-component adsorption data. We performed the first large-scale molecular simulation study to investigate separation performances of 174 different MOF structures for C2H2/CO2 and C2H2/CH4 mixtures. Using the results of molecular simulations, several adsorbent performance evaluation metrics, such as selectivity, working capacity, adsorbent performance score, sorbent selection parameter, and regenerability were computed for each MOF. Based on these metrics, the best adsorbent candidates were identified for both separations. Results showed that the top three most promising MOF adsorbents exhibit C2H2/CO2 selectivities of 49, 47, 24 and C2H2/CH4 selectivities of 824, 684, 638 at 1 bar, 298 K and these are the highest C2H2 selectivities reported to date in the literature. Structure-performance analysis revealed that the best MOF adsorbents have pore sizes between 4 and 11 Å, surface areas in the range of 600–1,200 m2/g and porosities between 0.4 and 0.6 for selective separation of C2H2 from CO2 and CH4. These results will guide the future studies for the design of new MOFs with high C2H2 separation potentials.

  5. Radiation levels from computer monitor screens within Benue State ...

    African Journals Online (AJOL)

An investigation of the possible presence of soft X-rays from computer monitor screens at distances of 0.5 m and 1.0 m was carried out within Benue State University, Makurdi, using ten different monitor models. Radiation measurements were carried out using a portable digital radiation meter, INSPECTOR 06250 (SE International Inc.

  6. Discovery of technical methanation catalysts based on computational screening

    DEFF Research Database (Denmark)

    Sehested, Jens; Larsen, Kasper Emil; Kustov, Arkadii

    2007-01-01

    Methanation is a classical reaction in heterogeneous catalysis and significant effort has been put into improving the industrially preferred nickel-based catalysts. Recently, a computational screening study showed that nickel-iron alloys should be more active than the pure nickel catalyst and at ...

  7. An algorithm for computing screened Coulomb scattering in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Mendenhall, Marcus H. [Vanderbilt University Free Electron Laser Center, P.O. Box 351816 Station B, Nashville, TN 37235-1816 (United States)]. E-mail: marcus.h.mendenhall@vanderbilt.edu; Weller, Robert A. [Department of Electrical Engineering and Computer Science, Vanderbilt University, P.O. Box 351821 Station B, Nashville, TN 37235-1821 (United States)]. E-mail: robert.a.weller@vanderbilt.edu

    2005-01-01

    An algorithm has been developed for the GEANT4 Monte-Carlo package for the efficient computation of screened Coulomb interatomic scattering. It explicitly integrates the classical equations of motion for scattering events, resulting in precise tracking of both the projectile and the recoil target nucleus. The algorithm permits the user to plug in an arbitrary screening function, such as Lens-Jensen screening, which is good for backscattering calculations, or Ziegler-Biersack-Littmark screening, which is good for nuclear straggling and implantation problems. This will allow many of the applications of the TRIM and SRIM codes to be extended into the much more general GEANT4 framework where nuclear and other effects can be included.
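The "plug in an arbitrary screening function" design described above can be illustrated with the Ziegler-Biersack-Littmark (ZBL) universal screening function, which multiplies the bare Coulomb potential. The four-term exponential fit and the universal screening length are the standard ZBL expressions; the callable-based interface here is a sketch, not GEANT4's actual API:

```python
import math

def zbl_screening(x):
    """ZBL universal screening function phi(r/a_U); phi(0)=1 recovers bare Coulomb."""
    return (0.18175 * math.exp(-3.19980 * x) +
            0.50986 * math.exp(-0.94229 * x) +
            0.28022 * math.exp(-0.40290 * x) +
            0.02817 * math.exp(-0.20162 * x))

def screened_coulomb(r_nm, z1, z2, phi=zbl_screening):
    """Screened interatomic potential V(r) in eV; any phi(x) can be plugged in,
    e.g. a Lens-Jensen form for backscattering problems."""
    a0 = 0.0529177                                # Bohr radius, nm
    a_u = 0.8854 * a0 / (z1**0.23 + z2**0.23)     # ZBL universal screening length
    e2 = 1.439964                                 # e^2 / (4*pi*eps0), eV*nm
    return z1 * z2 * e2 / r_nm * phi(r_nm / a_u)

# He on Si at 0.01 nm separation; pass phi= to swap in another screening model.
print(f"{screened_coulomb(0.01, 2, 14):.1f} eV")
```

Because the screening enters only through the callable, the scattering integrator can stay generic while the user chooses the physics, which is the extensibility the abstract emphasizes over fixed-potential codes like TRIM/SRIM.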

  8. An algorithm for computing screened Coulomb scattering in GEANT4

    International Nuclear Information System (INIS)

    Mendenhall, Marcus H.; Weller, Robert A.

    2005-01-01

    An algorithm has been developed for the GEANT4 Monte-Carlo package for the efficient computation of screened Coulomb interatomic scattering. It explicitly integrates the classical equations of motion for scattering events, resulting in precise tracking of both the projectile and the recoil target nucleus. The algorithm permits the user to plug in an arbitrary screening function, such as Lens-Jensen screening, which is good for backscattering calculations, or Ziegler-Biersack-Littmark screening, which is good for nuclear straggling and implantation problems. This will allow many of the applications of the TRIM and SRIM codes to be extended into the much more general GEANT4 framework where nuclear and other effects can be included

  9. [Adverse Effect Predictions Based on Computational Toxicology Techniques and Large-scale Databases].

    Science.gov (United States)

    Uesawa, Yoshihiro

    2018-01-01

Understanding the features of chemical structures related to the adverse effects of drugs is useful for identifying potential adverse effects of new drugs. Applications include prediction from the limited information available in post-marketing surveillance, assessment of the potential toxicities of metabolites and of illegal drugs with unclear characteristics, screening of lead compounds at the drug discovery stage, and identification of leads for the discovery of new pharmacological mechanisms. The present paper describes techniques used in computational toxicology to investigate the content of large-scale spontaneous report databases of adverse effects, illustrated with examples. Furthermore, volcano plotting, a new visualization method for clarifying the relationships between drugs and adverse effects via comprehensive analyses, is introduced. These analyses may produce a great amount of data that can be applied to drug repositioning.
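The volcano-plot coordinates for drug/adverse-effect pairs in a spontaneous-report database can be sketched with standard disproportionality statistics: an effect size on the x-axis (here a reporting odds ratio from a 2x2 report table) and a significance value on the y-axis. The counts are invented, and the ROR with a Wald normal approximation is a common choice in pharmacovigilance, not necessarily the paper's exact method:

```python
import math

def volcano_point(a, b, c, d):
    """(log2 ROR, -log10 p) from a 2x2 spontaneous-report table:
    a = target drug & target event, b = target drug & other events,
    c = other drugs & event,        d = other drugs & other events."""
    ror = (a * d) / (b * c)                     # reporting odds ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # Wald SE of ln(ROR)
    z = abs(math.log(ror)) / se
    p = math.erfc(z / math.sqrt(2))             # two-sided normal approximation
    return math.log2(ror), -math.log10(p)

x, y = volcano_point(a=40, b=200, c=100, d=5000)
print(f"log2(ROR) = {x:.2f}, -log10(p) = {y:.1f}")
```

Plotting one such point per drug-event pair yields the volcano shape: strong, well-supported signals land in the upper corners, which is what makes the comprehensive view useful for repositioning hypotheses.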

  10. The influence of the negative-positive ratio and screening database size on the performance of machine learning-based virtual screening.

    Science.gov (United States)

    Kurczab, Rafał; Bojarski, Andrzej J

    2017-01-01

    The machine learning-based virtual screening of molecular databases is a commonly used approach to identify hits. However, many aspects associated with training predictive models can influence the final performance and, consequently, the number of hits found. Thus, we performed a systematic study of the simultaneous influence of the proportion of negatives to positives in the testing set, the size of screening databases and the type of molecular representations on the effectiveness of classification. The results obtained for eight protein targets, five machine learning algorithms (SMO, Naïve Bayes, Ibk, J48 and Random Forest), two types of molecular fingerprints (MACCS and CDK FP) and eight screening databases with different numbers of molecules confirmed our previous findings that increases in the ratio of negative to positive training instances greatly influenced most of the investigated parameters of the ML methods in simulated virtual screening experiments. However, the performance of screening was shown to also be highly dependent on the molecular library dimension. Generally, with the increasing size of the screened database, the optimal training ratio also increased, and this ratio can be rationalized using the proposed cost-effectiveness threshold approach. To increase the performance of machine learning-based virtual screening, the training set should be constructed in a way that considers the size of the screening database.
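The experimental variable studied above, the negative-to-positive training ratio, amounts to subsampling decoys before model training. A minimal training-set builder (illustrative; not the authors' code, and the compound names are placeholders) shows how one actives list yields a family of training sets of increasing ratio:

```python
import random

def make_training_set(actives, decoys, ratio, seed=0):
    """Return labeled pairs with roughly ratio * len(actives) negatives,
    capped by the number of available decoys."""
    rng = random.Random(seed)                     # fixed seed for reproducibility
    n_neg = min(ratio * len(actives), len(decoys))
    negs = rng.sample(decoys, n_neg)
    return [(m, 1) for m in actives] + [(m, 0) for m in negs]

actives = [f"active_{i}" for i in range(10)]
decoys = [f"decoy_{i}" for i in range(1000)]

for ratio in (1, 5, 10, 50):                      # the study varies this systematically
    data = make_training_set(actives, decoys, ratio)
    print(ratio, len(data))
```

Training the same classifier on each set and scoring it against screening libraries of different sizes is the experiment whose outcome, an optimal ratio that grows with library size, the abstract summarizes.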

  11. Go Figure: Computer Database Adds the Personal Touch.

    Science.gov (United States)

    Gaffney, Jean; Crawford, Pat

    1992-01-01

    A database for recordkeeping for a summer reading club was developed for a public library system using an IBM PC and Microsoft Works. Use of the database resulted in more efficient program management, giving librarians more time to spend with patrons and enabling timely awarding of incentives. (LAE)

  12. Computer-aided diagnosis workstation and database system for chest diagnosis based on multi-helical CT images

    Science.gov (United States)

    Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou

    2006-03-01

Multi-helical CT scanners have advanced remarkably in the speed at which chest CT images can be acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read, and it is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using Virtual Private Network routers and biometric fingerprint and face authentication systems for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed, diagnostic accuracy and the safety of medical information.

  13. SWEETLEAD: an in silico database of approved drugs, regulated chemicals, and herbal isolates for computer-aided drug discovery.

    Directory of Open Access Journals (Sweden)

    Paul A Novick

Full Text Available In the face of drastically rising drug discovery costs, strategies promising to reduce development timelines and expenditures are being pursued. Computer-aided virtual screening and repurposing approved drugs are two such strategies that have shown recent success. Herein, we report the creation of a highly curated in silico database of chemical structures representing approved drugs, chemical isolates from traditional medicinal herbs, and regulated chemicals, termed the SWEETLEAD database. The motivation for SWEETLEAD stems from the observation of conflicting information in publicly available chemical databases and the lack of a highly curated database of chemical structures for globally approved drugs. A consensus building scheme surveying information from several publicly accessible databases was employed to identify the correct structure for each chemical. Resulting structures were filtered for the active pharmaceutical ingredient and standardized, and differing formulations of the same drug were combined in the final database. The publicly available release of SWEETLEAD (https://simtk.org/home/sweetlead) provides an important tool to enable the successful completion of computer-aided repurposing and drug discovery campaigns.

  14. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  15. Depression screening using the Patient Health Questionnaire-9 administered on a touch screen computer.

    Science.gov (United States)

    Fann, Jesse R; Berry, Donna L; Wolpin, Seth; Austin-Seymour, Mary; Bush, Nigel; Halpenny, Barbara; Lober, William B; McCorkle, Ruth

    2009-01-01

    To (1) evaluate the feasibility of touch screen depression screening in cancer patients using the Patient Health Questionnaire-9 (PHQ-9), (2) evaluate the construct validity of the PHQ-9 using the touch screen modality, and (3) examine the prevalence and severity of depression using this screening modality. The PHQ-9 was placed in a web-based survey within a study of the clinical impact of computerized symptom and quality of life screening. Patients in medical oncology, radiation oncology, and hematopoietic stem cell transplantation (HSCT) clinics used the program on a touch screen computer in waiting rooms prior to therapy (T1) and during therapy (T2). Responses of depressed mood or anhedonia (PHQ-2 cardinal depression symptoms) triggered additional items. PHQ-9 scores were provided to the oncology team in real time. Among 342 patients enrolled, 33 (9.6%) at T1 and 69 (20.2%) at T2 triggered the full PHQ-9 by endorsing at least one cardinal symptom. Feasibility was high, with at least 97% completing the PHQ-2 and at least 96% completing the PHQ-9 when triggered and a mean completion time of about 2 min. The PHQ-9 had good construct validity. Medical oncology patients had the highest percent of positive screens (12.9%) at T1, while HSCT patients had the highest percent (30.5%) at T2. Using this method, 21 (6.1%) at T1 and 54 (15.8%) at T2 of the total sample had moderate to severe depression. The PHQ-9 administered on a touch screen computer is feasible and provides valid depression data in a diverse cancer population. (c) 2008 John Wiley & Sons, Ltd.
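
    The branching described above (the two PHQ-2 cardinal items triggering the full PHQ-9) can be sketched as follows. The trigger rule (any endorsement, score ≥ 1) is an assumption consistent with the abstract; the severity bands are the standard published PHQ-9 cutoffs, not values taken from this study:

    ```python
    def administer_phq9(responses):
        """Branching PHQ-9 administration as on the touch-screen survey.

        responses: nine item scores, each 0-3.  Items 0 and 1 are the
        PHQ-2 cardinal symptoms (depressed mood, anhedonia); endorsing
        either one (score >= 1, an assumption here) triggers the full
        PHQ-9.  Severity bands are the standard PHQ-9 cutoffs.
        """
        phq2 = responses[:2]
        if not any(score >= 1 for score in phq2):
            return (False, sum(phq2), "none")  # full PHQ-9 not triggered
        total = sum(responses)
        if total >= 20:
            severity = "severe"
        elif total >= 15:
            severity = "moderately severe"
        elif total >= 10:
            severity = "moderate"
        elif total >= 5:
            severity = "mild"
        else:
            severity = "minimal"
        return (True, total, severity)
    ```

    A patient endorsing neither cardinal item answers only two questions, which is what kept mean completion times near two minutes.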

  16. DNA algorithms of implementing biomolecular databases on a biological computer.

    Science.gov (United States)

    Chang, Weng-Long; Vasilakos, Athanasios V

    2015-01-01

    In this paper, DNA algorithms are proposed to perform eight operations of relational algebra (calculus), which include Cartesian product, union, set difference, selection, projection, intersection, join, and division, on biomolecular relational databases.
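
    The eight operators above have straightforward conventional counterparts. As a point of reference for what the DNA algorithms compute (and not the paper's biomolecular encoding), a minimal Python sketch of three of them, with relations modeled as lists of dicts:

    ```python
    # Conventional (silicon) reference semantics for three of the eight
    # relational-algebra operators the paper implements in DNA.

    def select(relation, predicate):
        """Selection: keep rows satisfying the predicate."""
        return [row for row in relation if predicate(row)]

    def project(relation, attrs):
        """Projection: keep only the named attributes."""
        return [{a: row[a] for a in attrs} for row in relation]

    def natural_join(r, s):
        """Natural join: combine rows agreeing on all shared attributes."""
        common = set(r[0]) & set(s[0]) if r and s else set()
        return [{**x, **y} for x in r for y in s
                if all(x[a] == y[a] for a in common)]
    ```

    Union, set difference, intersection, Cartesian product and division follow the same pattern over these row lists.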

  17. Computer Aided Design for Soil Classification Relational Database ...

    African Journals Online (AJOL)

  18. Computer-aided diagnosis for screening of breast cancer on mammograms. Current status and future potential

    International Nuclear Information System (INIS)

    Doi, Kunio

    2007-01-01

    Described here are the history, current status and future potential of computer-aided diagnosis (CAD), with particular emphasis on screening mammography for breast cancer. Systematic basic and clinical studies on CAD began about 20 years ago, and the significance of CAD is now well recognized, because human errors inevitably occur when doctors visually check such large numbers of screening images. Improvement of diagnostic accuracy by CAD has been demonstrated by statistical analysis of ROC (receiver operating characteristic) curves. For mammography, this review covers detection of early-stage breast cancer signs such as microcalcifications by computer alone, by CAD combined with one or more doctors' readings, and in practical clinical CAD diagnosis. For the differential diagnosis of malignancy, characteristic image features of microcalcifications and masses were analyzed: for microcalcifications, the Az value (area under the ROC curve) was higher for CAD than for doctors alone (0.80 vs 0.61), and for masses the doctors' Az value of 0.93 improved to 0.96 with CAD. In this diagnosis, similar images retrieved from a digital database are useful, and the database can learn through repeated input of individual cases via a neural network. Detection of lesions and especially their differential diagnosis will become more important as databases develop, and CAD will also be useful as an educational means throughout a doctor's career. (R.T.)

  19. Computational Screening of Materials for Water Splitting Applications

    DEFF Research Database (Denmark)

    Castelli, Ivano Eligio

    Designing new materials for energy production in a photoelectrochemical cell, where water is split into hydrogen and oxygen by solar light, is one possible solution to the problem of increasing energy demand and storage. A screening procedure based on ab-initio density functional theory calculations...... Project database, which is based on the experimental ICSD database, and the bandgaps were calculated with focus on finding materials with potential as light harvesters. 24 materials have been proposed for the one-photon water splitting and 23 for the two-photon mechanism. Another method to obtain energy...... from the Sun is using a photovoltaic cell that converts solar light into electricity. The absorption spectra of 70 experimentally known compounds that are expected to be useful for light-to-electricity generation have been calculated. 17 materials have been predicted to be promising for a single

  20. Decision trees and integrated features for computer aided mammographic screening

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, W.P. Jr.; Groshong, B.; Allmen, M.; Woods, K.

    1997-02-01

    Breast cancer is a serious problem, which in the United States causes 43,000 deaths a year, eventually striking 1 in 9 women. Early detection is the only effective countermeasure, and mass mammography screening is the only reliable means for early detection. Mass screening has many shortcomings which could be addressed by a computer-aided mammographic screening system. Accordingly, we have applied the pattern recognition methods developed in earlier investigations of spiculated lesions in mammograms to the detection of microcalcifications and circumscribed masses, generating new, more rigorous and uniform methods for the detection of both of those signs. We have also improved the pattern recognition methods themselves, through the development of a new approach to combinations of multiple classifiers.

  1. Individual Stochastic Screening for the Development of Computer Graphics

    Directory of Open Access Journals (Sweden)

    Maja Turčić

    2012-12-01

    Full Text Available With the emergence of new tools and media, art and design have developed into digital, computer-generated works. This article reconstructs the sequence of steps used to create certain art graphics, because their original authors never published the procedures. The goal is to uncover the mathematics of an image and its programming libretto, with the purpose of organizing a structural base for computer graphics. We elaborate the procedures used to produce graphics known throughout the history of art that are nowadays also found in design and security graphics. The results are closely related graphics obtained by varying the parameters that generate them. The aim is to control the graphics, i.e. to use controlled stochasticity to achieve desired solutions. Since the artists of the past never published the procedures of their screening methods, their ideas have remained "only" works of art. In this article we present the development of an algorithm that, more or less successfully, simulates those screening solutions. It has been shown that mathematically defined graphical elements can serve as screening elements. New technological and mathematical solutions are introduced so that reproduction with individual screening elements can be used in printing.
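
    As a toy illustration of the core idea, that a mathematically defined rule can act as a screening element, here is a minimal random-threshold (stochastic) halftoning sketch. This is a generic illustration only, not the article's parameterized algorithm:

    ```python
    import random

    def stochastic_screen(gray, width, height, seed=42):
        """Minimal stochastic screening: each pixel is inked (1) with
        probability equal to the target tone coverage `gray` (0.0-1.0),
        so dot placement is random while average coverage matches the
        tone.  Illustrative only; the article's screening elements are
        parameterized graphical shapes, not single pixels.
        """
        rng = random.Random(seed)
        return [[1 if rng.random() < gray else 0 for _ in range(width)]
                for _ in range(height)]
    ```

    Varying the seed yields the "closely related graphics obtained by changing parameters" effect: same tone, different individual dot arrangement.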

  2. Statistical screening of input variables in a complex computer code

    International Nuclear Information System (INIS)

    Krieger, T.J.

    1982-01-01

    A method is presented for "statistical screening" of input variables in a complex computer code. The object is to determine the "effective" or important input variables by estimating the relative magnitudes of their associated sensitivity coefficients. This is accomplished by performing a numerical experiment consisting of a relatively small number of computer runs with the code, followed by a statistical analysis of the results. A formula for estimating the sensitivity coefficients is derived. Reference is made to an earlier work in which the method was applied to a complex reactor code with good results.
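
    The overall workflow, a small number of randomized runs followed by statistical estimation of sensitivity coefficients, can be sketched generically as below. This is a Monte Carlo least-squares sketch under the assumption of independently perturbed inputs, not Krieger's exact estimation formula:

    ```python
    import random

    def screen_inputs(code, nominal, spread, n_runs=50, seed=0):
        """Estimate sensitivity coefficients of a black-box `code`
        (a function of a dict of inputs) from a small random sample.

        Each input is perturbed independently and uniformly about its
        nominal value; for such a design the per-variable slope
        cov(x_i, y) / var(x_i) estimates the sensitivity dy/dx_i.
        """
        rng = random.Random(seed)
        names = list(nominal)
        runs = []
        for _ in range(n_runs):
            x = {k: nominal[k] + rng.uniform(-spread[k], spread[k])
                 for k in names}
            runs.append((x, code(x)))
        ys = [y for _, y in runs]
        ybar = sum(ys) / n_runs
        sens = {}
        for k in names:
            xs = [x[k] for x, _ in runs]
            xbar = sum(xs) / n_runs
            cov = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(xs, ys))
            var = sum((xi - xbar) ** 2 for xi in xs)
            sens[k] = cov / var
        return sens
    ```

    Ranking the absolute values of the returned coefficients identifies the "effective" inputs worth carrying into a full sensitivity study.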

  3. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  4. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    Science.gov (United States)

    2015-09-15

    arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming... a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  5. Cloud Computing Databases: Latest Trends and Architectural Concepts

    OpenAIRE

    Tarandeep Singh; Parvinder S. Sandhu

    2011-01-01

    Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. There is a varied set of cloud services...

  6. Computer Aided Design for Soil Classification Relational Database ...

    African Journals Online (AJOL)

    The paper focuses on the problems associated with classification, storage and retrieval of information on soil data, such as the incompatibility of soil data semantics; inadequate documentation, and lack of indexing; hence it is pretty difficult to efficiently access large database. Consequently, information on soil is very difficult ...

  7. Database organization for computer-aided characterization of laser diode

    International Nuclear Information System (INIS)

    Oyedokun, Z.O.

    1988-01-01

    Computer-aided data logging involves a huge amount of data which must be properly managed for optimized storage space and easy access, retrieval and utilization. An organization method is developed to enhance the advantages of computer-based data logging in the testing of semiconductor injection lasers; it optimizes storage space, permits authorized users easy access, and inhibits unauthorized penetration. The method is based on a unique file-identification protocol, a tree structure, and command-file-oriented access procedures.

  8. Computer screen photo-excited surface plasmon resonance imaging.

    Science.gov (United States)

    Filippini, Daniel; Winquist, Fredrik; Lundström, Ingemar

    2008-09-12

    Angle- and spectrally-resolved surface plasmon resonance (SPR) imaging of gold and silver thin films with protein deposits is demonstrated using a regular computer screen as light source and a web camera as detector. The screen provides multiple-angle illumination, p-polarized light and controlled spectral radiances to excite surface plasmons in a Kretschmann configuration. A model of the SPR reflectances incorporating the particularities of the source and detector explains the observed signals, and the generation of distinctive SPR landscapes is demonstrated. The sensitivity and resolution of the method, determined in air and solution, are 0.145 nm pixel⁻¹, 0.523 nm, 5.13×10⁻³ RIU degree⁻¹ and 6.014×10⁻⁴ RIU, respectively; these are encouraging results at this proof-of-concept stage, considering the ubiquity of the instrumentation.

  9. Data Linkage Graph: computation, querying and knowledge discovery of life science database networks

    Directory of Open Access Journals (Sweden)

    Lange Matthias

    2007-12-01

    Full Text Available To support the interpretation of measured molecular facts, such as gene expression experiments or EST sequencing, the functional or systems-biology context has to be considered. In doing so, relationships to existing biological knowledge must be discovered. In general, biological knowledge is represented worldwide in a network of databases. In this paper we present a method for knowledge extraction from life science databases that spares scientists from screen-scraping and web-clicking approaches.

  10. Stereoselective virtual screening of the ZINC database using atom pair 3D-fingerprints.

    Science.gov (United States)

    Awale, Mahendra; Jin, Xian; Reymond, Jean-Louis

    2015-01-01

    Tools to explore large compound databases in search of analogs of query molecules provide strategically important support in drug discovery, helping to identify available analogs of any given reference or hit compound by ligand-based virtual screening (LBVS). We recently showed that large databases can be formatted for very fast searching with various 2D-fingerprints using the city-block distance as similarity measure, in particular a 2D-atom pair fingerprint (APfp) and the related category extended atom pair fingerprint (Xfp), which efficiently encode molecular shape and pharmacophores but do not perceive stereochemistry. Here we investigated related 3D-atom pair fingerprints to enable rapid stereoselective searches in the ZINC database (23.2 million 3D structures). Molecular fingerprints counting atom pairs at increasing through-space distance intervals were designed using either all atoms (16-bit 3DAPfp) or different atom categories (80-bit 3DXfp). These 3D-fingerprints retrieved molecular shape and pharmacophore analogs (defined by OpenEye ROCS scoring functions) of 110,000 compounds from the Cambridge Structural Database with equal or better accuracy than the 2D-fingerprints APfp and Xfp, and showed comparable performance in recovering actives from decoys in the DUD database. LBVS by 3DXfp or 3DAPfp similarity was stereoselective and gave very different analogs when starting from different diastereomers of the same chiral drug. Results were also different from LBVS with the parent 2D-fingerprints Xfp or APfp. 3D- and 2D-fingerprints also gave very different results in LBVS of folded molecules, where through-space distances between atom pairs are much shorter than topological distances. 3DAPfp and 3DXfp are suitable for stereoselective searches for shape and pharmacophore analogs of query molecules in large databases. Web browsers for searching ZINC by 3DAPfp and 3DXfp similarity are accessible at www.gdb.unibe.ch and should provide useful assistance to drug
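
    The two core ingredients, a count vector of atom pairs binned by through-space distance and the city-block distance as similarity measure, can be sketched as below. The bin count and width are illustrative stand-ins, not the exact 16-bit 3DAPfp or 80-bit 3DXfp designs:

    ```python
    import math

    def apfp_3d(coords, n_bins=16, bin_width=1.0):
        """Toy 3D atom-pair fingerprint: count atom pairs falling in
        successive through-space distance intervals.

        coords: list of (x, y, z) atom positions in angstroms.
        Pairs beyond the last interval are accumulated in the final bin.
        """
        fp = [0] * n_bins
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                d = math.dist(coords[i], coords[j])
                fp[min(int(d / bin_width), n_bins - 1)] += 1
        return fp

    def city_block(a, b):
        """City-block (Manhattan) distance between two count vectors."""
        return sum(abs(x - y) for x, y in zip(a, b))
    ```

    Because the fingerprint is a short fixed-length count vector and the city-block distance is a simple sum of absolute differences, millions of database entries can be ranked against a query very quickly.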

  11. The Use of a Relational Database in Qualitative Research on Educational Computing.

    Science.gov (United States)

    Winer, Laura R.; Carriere, Mario

    1990-01-01

    Discusses the use of a relational database as a data management and analysis tool for nonexperimental qualitative research, and describes the use of the Reflex Plus database in the Vitrine 2001 project in Quebec to study computer-based learning environments. Information systems are also discussed, and the use of a conceptual model is explained.…

  12. The establishment of the Blacknest seismological database on the Rutherford Laboratory system 360/195 computer

    International Nuclear Information System (INIS)

    Blamey, C.

    1977-01-01

    In order to assess the problems which might arise from monitoring a comprehensive test ban treaty by seismological methods, an experimental monitoring operation is being conducted. This work has involved the establishment of a database on the Rutherford Laboratory 360/195 system computer. The database can be accessed in the UK over the public telephone network and in the USA via ARPANET. (author)

  13. Computer Cataloging of Electronic Journals in Unstable Aggregator Databases: The Hong Kong Baptist University Library Experience.

    Science.gov (United States)

    Li, Yiu-On; Leung, Shirley W.

    2001-01-01

    Discussion of aggregator databases focuses on a project at the Hong Kong Baptist University library to integrate full-text electronic journal titles from three unstable aggregator databases into its online public access catalog (OPAC). Explains the development of the electronic journal computer program (EJCOP) to generate MARC records for…

  14. Structure-based virtual screening and characterization of a novel IL-6 antagonistic compound from synthetic compound database

    Directory of Open Access Journals (Sweden)

    Wang J

    2016-12-01

    Full Text Available Jing Wang,1,* Chunxia Qiao,1,* He Xiao,1 Zhou Lin,1 Yan Li,1 Jiyan Zhang,1 Beifen Shen,1 Tinghuan Fu,2 Jiannan Feng1 1Department of Molecular Immunology, Beijing Institute of Basic Medical Sciences, 2First Affiliated Hospital of PLA General Hospital, Beijing, People's Republic of China *These authors contributed equally to this work Abstract: Based on the three-dimensional (3D) structure of the (hIL-6·hIL-6R·gp130)2 complex and the binding orientation of hIL-6, three compounds with high affinity for hIL-6R and the bioactivity to block hIL-6 in vitro were screened theoretically from chemical databases, including the 3D-Available Chemicals Directory (ACD) and the MDL Drug Data Report (MDDR), by means of a computer-guided virtual screening method. Using distance geometry, molecular modeling and molecular dynamics trajectory analysis methods, the binding modes and binding energies of the three compounds were evaluated theoretically. Enzyme-linked immunosorbent assay analysis demonstrated that all three compounds could specifically block IL-6 binding to IL-6R. However, only compound 1 could effectively antagonize the function of hIL-6 and inhibit the proliferation of XG-7 cells in a dose-dependent manner, while showing no cytotoxicity to SP2/0 or L929 cells. These data demonstrate that compound 1 could be a promising hIL-6 antagonist candidate. Keywords: virtual screening, structural optimization, human interleukin-6, small molecular antagonist, XG-7 cells, apoptosis

  15. Computer-aided visualization of database structural relationships

    International Nuclear Information System (INIS)

    Cahn, D.F.

    1980-04-01

    Interactive computer graphic displays can be extremely useful in augmenting understandability of data structures. In complexly interrelated domains such as bibliographic thesauri and energy information systems, node and link displays represent one such tool. This paper presents examples of data structure representations found useful in these domains and discusses some of their generalizable components. 2 figures

  16. Some Aspects of Process Computers Configuration Control in Nuclear Power Plant Krsko - Process Computer Signal Configuration Database (PCSCDB)

    International Nuclear Information System (INIS)

    Mandic, D.; Kocnar, R.; Sucic, B.

    2002-01-01

    During the operation of NEK and other nuclear power plants it has been recognized that certain issues related to the usage of digital equipment and associated software in NPP technological process protection, control and monitoring are not adequately addressed in the existing programs and procedures. The term and the process of Process Computer Configuration Control join three 10CFR50 Appendix B quality requirements of process computer application in an NPP: Design Control; Document Control; and Identification and Control of Materials, Parts and Components. This paper describes the Process Computer Signal Configuration Database (PCSCDB), which was developed and implemented in order to resolve some aspects of process computer configuration control related to the signals or database points that exist in the life cycle of the different Process Computer Systems (PCS) in Nuclear Power Plant Krsko. PCSCDB is a controlled master database related to the definition and description of the configurable database points associated with all Process Computer Systems in NEK. PCSCDB holds attributes related to the configuration of addressable and configurable real-time database points, and attributes related to signal life cycle references and history data, such as: input/output signals; manually input database points; program constants; setpoints; calculated (by application program or SCADA calculation tools) database points; control flags (for example, enabling or disabling a certain program feature); signal acquisition design references to the DCM (Document Control Module, application software for document control within the Management Information System - MIS) and MECL (Master Equipment and Component List, MIS application software for identification and configuration control of plant equipment and components); usage of particular database points in particular application software packages and in man-machine interface features (display mimics, printout reports, ...); signals history (EEAR Engineering

  17. Efficacy of computer-aided detection system for screening mammography

    International Nuclear Information System (INIS)

    Saito, Mioko; Ohnuki, Koji; Yamada, Takayuki; Saito, Haruo; Ishibashi, Tadashi; Ohuchi, Noriaki; Takahashi, Shoki

    2002-01-01

    A study was conducted to evaluate the efficacy of a computer-aided detection (CAD) system for screening mammography (MMG). Screening mammograms of 2,231 women aged over 50 yr were examined. Medio-lateral oblique (MLO) images were obtained, and two expert observers interpreted the mammograms by consensus. First, each mammogram was interpreted without the assistance of CAD, followed immediately by a re-evaluation of areas marked by the CAD system. Data were recorded to measure the effect of CAD on the recall rate, cancer detection rate and detection rate of masses, microcalcifications and other findings. The CAD system increased the recall rate from 2.3% to 2.6%. Six recalled cases were diagnosed as breast cancer pathologically, and CAD detected all of these lesions. Seven additional cases in which CAD detected abnormal findings had no malignancy. The detection rate of CAD for microcalcifications was high (95.0%). However, the detection rate for mass lesions and other findings was low (29.2% and 25.0% respectively). The false positivity rate was 0.13/film for microcalcifications, and 0.25/film for mass lesions. The efficacy of the CAD system for detecting microcalcifications on screening mammograms was confirmed. However, the low detection rate of mass lesions and relatively high rate of false positivity need to be further improved. (author)

  18. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    International Nuclear Information System (INIS)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook

    2007-08-01

    This report is the final report of the I-NERI project 'Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL); 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL); 3) initial scaling and scoping analysis (ANL, KAERI, INL); 4) filtering of TH/safety tools (KAERI, INL); 5) evaluation of TH/safety database information (KAERI, INL, ANL); and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R and D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600 MWth block-type GT-MHR and the 400 MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high pressure conduction cooling: loss of primary flow), LPCC/Air-Ingress (low pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRTs generation. For the PIRTs phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V and V matrix has been developed. Through the key scoping analysis using the available database, the modeling

  20. Virtual screening using the ligand ZINC database for novel lipoxygenase-3 inhibitors.

    Science.gov (United States)

    Monika; Kour, Janmeet; Singh, Kulwinder

    2013-01-01

    The leukotrienes constitute a group of arachidonic acid-derived compounds with biologic activities suggesting important roles in inflammation and immediate hypersensitivity. Epidermis-type lipoxygenase-3 (ALOXE3), a distinct subclass within the multigene family of mammalian lipoxygenases, is a novel isoenzyme involved in the metabolism of leukotrienes and plays a very important role in skin barrier functions. Lipoxygenase-selective inhibitors such as azelastine and zileuton are currently used to reduce inflammatory response. Nausea, pharyngolaryngeal pain, headache, nasal burning and somnolence are the most frequently reported adverse effects of these drugs. Therefore, there is still a need to develop more potent lipoxygenase inhibitors. In this paper, we report the screening of various compounds from the ZINC database (which contains over 21 million compounds) using the Molegro Virtual Docker software against the ALOXE3 protein. Screening was performed using a molecular constraints tool to filter compounds with physico-chemical properties similar to the 1N8Q-bound ligand protocatechuic acid. The analysis resulted in 4319 Lipinski-compliant hits, which were docked and scored to identify structurally novel ligands that make similar interactions to those of known ligands or may have different interactions with other parts of the binding site. Our screening approach identified four molecules, ZINC84299674, ZINC76643455, ZINC84299122 and ZINC75626957, with MolDock scores of -128.901, -120.22, -116.873 and -102.116 kcal/mol, respectively. Their energy scores were better than that of the 1N8Q-bound co-crystallized ligand protocatechuic acid (MolDock score of -77.225 kcal/mol). All the ligands docked within the binding pocket, forming interactions with amino acid residues.

  1. Computer application for database management and networking of service radio physics

    International Nuclear Information System (INIS)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-01-01

    Databases in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the center's computer network. A computer acting as the server provides the database to the treatment units so that quality control measurements and incidents can be recorded daily. Common problems such as shortcuts that stop working after data migration, possible duplication of data, and erroneous data or data loss caused by network connection errors led us to manage connections and database access centrally, easing maintenance and use for all service personnel.

  2. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    2013-01-01

    Finding a location for a new facility such that the facility attracts the maximal number of customers is a challenging problem. Existing studies either model customers as static sites and thus do not consider customer movement, or they focus on theoretical aspects and do not provide solutions...... that are shown empirically to be scalable. Given a road network, a set of existing facilities, and a collection of customer route traversals, an optimal segment query returns the optimal road network segment(s) for a new facility. We propose a practical framework for computing this query, where each route...... traversal is assigned a score that is distributed among the road segments covered by the route according to a score distribution model. The query returns the road segment(s) with the highest score. To achieve low latency, it is essential to prune the very large search space. We propose two algorithms...
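
    The scoring framework summarized above can be sketched as follows, assuming a uniform score-distribution model in which each traversal's score is split evenly among the segments it covers (the paper's framework admits other distribution models):

    ```python
    from collections import defaultdict

    def optimal_segments(traversals, score=1.0):
        """Return the optimal road segment(s) for a new facility.

        traversals: iterable of routes, each a list of segment ids
        covered by one customer route traversal.  Each traversal's
        score is split uniformly among its segments; the segment(s)
        with the highest total score win.
        """
        totals = defaultdict(float)
        for route in traversals:
            if not route:
                continue
            share = score / len(route)
            for seg in route:
                totals[seg] += share
        best = max(totals.values())
        return sorted(s for s, v in totals.items() if v == best), dict(totals)
    ```

    This brute-force sketch scores every segment; the paper's contribution is pruning the very large search space so the same answer is computed with low latency over road-network databases.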

  4. Database for High Throughput Screening Hits (dHITS): a simple tool to retrieve gene specific phenotypes from systematic screens done in yeast.

    Science.gov (United States)

    Chuartzman, Silvia G; Schuldiner, Maya

    2018-03-25

In the last decade, several collections of Saccharomyces cerevisiae yeast strains have been created. In these collections every gene is modified in a similar manner, such as by a deletion or the addition of a protein tag. Such libraries have enabled a diversity of systematic screens, giving rise to large amounts of information regarding gene functions. However, papers describing such screens often focus on a single gene or a small set of genes, and all other loci affecting the phenotype of choice ('hits') are mentioned only in tables provided as supplementary material, which are often hard to retrieve or search. To help unify and make such data accessible, we have created a Database of High Throughput Screening Hits (dHITS). The dHITS database provides information about the screens in which genes of interest were found, as well as the other genes that came up in those screens, all in a readily accessible and downloadable format. The ability to query large lists of genes at the same time provides a platform to easily analyse hits obtained from transcriptional analyses or other screens. We hope that this platform will serve as a tool that facilitates the investigation of protein functions for the yeast community. © 2018 The Authors Yeast Published by John Wiley & Sons Ltd.
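A dHITS-style gene-to-screens query can be illustrated with a toy in-memory mapping. The screen names and gene sets below are invented for the example; the real database is far larger and is queried through its web interface:

```python
# Hypothetical mini-version of a dHITS-style lookup: map each screen to its
# hit genes, then ask, for a list of genes, which screens they appeared in.
hits_by_screen = {
    "UV_sensitivity_2010": {"RAD52", "RAD51", "MRE11"},
    "ER_stress_2015": {"IRE1", "HAC1", "RAD52"},
}

def screens_for_genes(genes):
    """Return, for each queried gene, the screens in which it was a hit."""
    return {g: [s for s, hits in hits_by_screen.items() if g in hits]
            for g in genes}

print(screens_for_genes(["RAD52", "HAC1"]))
```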

  5. Screen for genes involved in radiation survival of Escherichia coli and construction of a reference database

    Energy Technology Data Exchange (ETDEWEB)

    Sargentini, Neil J., E-mail: nsargentini@atsu.edu; Gularte, Nicholas P.; Hudman, Deborah A.

    2016-11-15

Highlights: • 3907 Keio knockout mutants of E. coli screened for UV and X-radiation sensitivity. • 76 mutants showed significantly increased radiation sensitivity. • A database of 9 screening studies listed 352 genes only once; 103 genes, 2–7 times. • 33 genes from this study are uncommon and potentially novel. • Common and uncommon genes differ in gene function profile. - Abstract: A set of 3907 single-gene knockout (Keio collection) strains of Escherichia coli K-12 was examined for strains with increased susceptibility to killing by X- or UV-radiation. After screening with a high-throughput resazurin-based assay and determining radiation survival with triplicate clonogenic assays, we identified 76 strains (and associated deleted genes) showing statistically significant increased radiation sensitivity compared to a control strain. To determine gene novelty, we constructed a reference database comprising the genes found in nine similar studies, including ours. This database contains 455 genes: 103 common genes (found 2–7 times) and 352 uncommon genes (found once). Our 76 genes include 43 common genes and 33 uncommon (potentially novel) genes, i.e., appY, atoS, betB, bglJ, clpP, cpxA, cysB, cysE, ddlA, dgkA, dppF, dusB, elfG, eutK, fadD, glnA, groL, guaB, intF, prpR, queA, rplY, seqA, sufC, yadG, yagJ, yahD, yahO, ybaK, ybfA, yfaL, yhjV, and yiaL. Of our 33 uncommon gene mutants, 4 (12%) were sensitive only to UV-radiation, 10 (30%) only to X-radiation, and 19 (58%) to both radiations. Our uncommon mutants vs. our common mutants showed more radiation specificity, i.e., 12% vs. 9% (sensitive only to UV-); 30% vs. 16% (X-) and 58% vs. 74% (both radiations). Considering just our radiation-sensitive mutants, the median UV-radiation survival (75 J m⁻²) for 23 uncommon mutants was 6.84E-3 compared to 1.85E-3 for 36 common mutants (P = 0.025). Similarly, the average X-radiation survival for 29 uncommon mutants was 1.08E-2, compared to 6.19E

  6. Using AMDD method for Database Design in Mobile Cloud Computing Systems

    OpenAIRE

    Silviu Claudiu POPA; Mihai-Constantin AVORNICULUI; Vasile Paul BRESFELEAN

    2013-01-01

The development of wireless telecommunications technologies gave birth to new kinds of e-commerce, the so-called Mobile e-Commerce or m-Commerce. Mobile Cloud Computing (MCC) represents a new IT research area that combines mobile computing and cloud computing techniques. Behind a mobile cloud commerce system there is a database containing all the information necessary for transactions. By means of the Agile Model Driven Development (AMDD) method, we are able to achieve many benefits that smoo...

  7. Computational screening of organic materials towards improved photovoltaic properties

    Science.gov (United States)

    Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan; Borunda, Mario

    2015-03-01

The world today faces an energy crisis that obstructs the development of human civilization. One of the most promising solutions is solar energy harvested by economical solar cells. As the third generation of solar cell materials, organic photovoltaic (OPV) materials are now under active development from both theoretical and experimental points of view. In this study, we constructed a parameter to select desirable molecules based on the performance of their optical spectra. We applied it to a large collection of potential OPV materials from the CEPDB database set up by the Harvard Clean Energy Project. Time-dependent density functional theory (TD-DFT) modeling was used to calculate the absorption spectra of the molecules. Based on the parameter, we then screened out the top-performing molecules for potential OPV use and suggested experimental efforts toward their synthesis. In addition, from those molecules we summarized the functional groups that give molecules certain spectral capabilities. We hope this information can provide hints for the molecular design of OPV materials.
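A spectrum-based screening parameter of the kind described can be sketched as an overlap between a molecule's absorption spectrum and the solar spectrum. Everything below is a stand-in: the Gaussian "solar" curve, the toy spectra, and the normalization are illustrative choices, not the authors' actual figure of merit:

```python
import numpy as np

wl = np.linspace(300.0, 900.0, 601)            # wavelength grid, nm
solar = np.exp(-((wl - 550.0) / 120.0) ** 2)   # toy stand-in for the solar spectrum

def overlap_score(absorbance):
    """Hypothetical screening parameter: overlap of a molecule's normalized
    absorption spectrum with the (toy) solar spectrum on a uniform grid."""
    a = absorbance / absorbance.sum()          # normalize the spectrum
    return float((a * solar).sum())

good = np.exp(-((wl - 560.0) / 40.0) ** 2)     # absorbs near the solar maximum
poor = np.exp(-((wl - 320.0) / 40.0) ** 2)     # absorbs mostly in the UV
print(overlap_score(good) > overlap_score(poor))  # True
```

A molecule whose absorption tracks the solar output scores higher, which is the intuition behind ranking candidate OPV chromophores by their computed spectra.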

  8. THE NASA AMES POLYCYCLIC AROMATIC HYDROCARBON INFRARED SPECTROSCOPIC DATABASE: THE COMPUTED SPECTRA

    International Nuclear Information System (INIS)

    Bauschlicher, C. W.; Ricca, A.; Boersma, C.; Mattioda, A. L.; Cami, J.; Peeters, E.; Allamandola, L. J.; Sanchez de Armas, F.; Puerta Saborido, G.; Hudgins, D. M.

    2010-01-01

The astronomical emission features, formerly known as the unidentified infrared bands, are now commonly ascribed to polycyclic aromatic hydrocarbons (PAHs). The laboratory experiments and computational modeling done at the NASA Ames Research Center to create a collection of PAH IR spectra relevant to test and refine the PAH hypothesis have been assembled into a spectroscopic database. This database now contains over 800 PAH spectra spanning 2-2000 μm (5000-5 cm⁻¹). These data are now available on the World Wide Web at www.astrochem.org/pahdb. This paper presents an overview of the computational spectra in the database and the tools developed to analyze and interpret astronomical spectra using the database. A description of the online and offline user tools available on the Web site is also presented.

  9. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner dos Santos

    2016-01-01

The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN) in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to the safety, reliability, availability and maintainability of these facilities. The database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they are able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, in accordance with the protocols installed in the IPEN intranet. The open-source relational database management system MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database, named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)
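An entity-relationship store of components and their failure events, as described here, can be sketched in a few lines of SQL. The schema below is invented for illustration (it is not the actual PSADB schema), and SQLite is used as an in-memory stand-in for the MySQL server the project uses:

```python
import sqlite3

# Minimal sketch of a component-reliability store: components and their
# failure events, joined to count failures per component.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE component (id INTEGER PRIMARY KEY, name TEXT, system TEXT);
CREATE TABLE failure_event (
    id INTEGER PRIMARY KEY,
    component_id INTEGER REFERENCES component(id),
    date TEXT, description TEXT);
""")
db.execute("INSERT INTO component VALUES (1, 'primary pump', 'cooling')")
db.execute("INSERT INTO failure_event VALUES (1, 1, '2015-03-02', 'seal leak')")
rows = db.execute("""
    SELECT c.name, COUNT(f.id) FROM component c
    LEFT JOIN failure_event f ON f.component_id = c.id
    GROUP BY c.id""").fetchall()
print(rows)  # [('primary pump', 1)]
```

Failure counts per component are exactly the kind of aggregate that feeds component failure rates in a Probabilistic Safety Analysis.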

  10. A reliable computational workflow for the selection of optimal screening libraries.

    Science.gov (United States)

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

The experimental screening of compound collections is a common starting point in many drug discovery projects. The success of such screening campaigns critically depends on the quality of the screened library. Many libraries are currently available from different vendors, yet selecting the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented, and a new blood-brain barrier permeation model was developed and validated (85 and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors that performed best at selecting either diverse or focused sets of compounds from three databases (DrugBank, CMC and ChEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software yet due to the usage of generic
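The Lipinski rule-based filter mentioned in the ADME/T step is simple enough to sketch directly. The property values passed in below are illustrative inputs (in practice they would be computed from structures), and the "at most one violation" cutoff is the commonly used variant, not necessarily the exact threshold the workflow applies:

```python
def passes_lipinski(mw, logp, hbd, hba):
    """Lipinski rule-of-five filter: molecular weight, logP, hydrogen-bond
    donors and acceptors, allowing at most one rule violation."""
    violations = sum([mw > 500, logp > 5, hbd > 5, hba > 10])
    return violations <= 1

print(passes_lipinski(mw=320.4, logp=2.1, hbd=2, hba=5))   # True
print(passes_lipinski(mw=710.0, logp=6.3, hbd=6, hba=12))  # False
```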

  11. Computer programme for control and maintenance and object-oriented database: application to the realisation of a particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    Diaz, A.

    1996-01-01

The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and VME-standard front-end computers within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the information needed to display parameters from the front-end computers on the graphic screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system relies largely on the object-oriented database O 2 , which integrates a full description of the equipment and the code necessary to manage it; this code is generated by the database. This innovation permits easy maintenance of the system and removes the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  12. A computer database system to calculate staff radiation doses and maintain records

    International Nuclear Information System (INIS)

    Clewer, P.

    1985-01-01

    A database has been produced to record the personal dose records of all employees monitored for radiation exposure in the Wessex Health Region. Currently there are more than 2000 personnel in 115 departments but the capacity of the database allows for expansion. The computer is interfaced to a densitometer for film badge reading. The hardware used by the database, which is based on a popular microcomputer, is described, as are the various programs that make up the software. The advantages over the manual card index system that it replaces are discussed. (author)

  13. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

A computer program that performs obstetric calculations using data from ultrasonography was developed for personal computers in the Clipper language. It was designed for fast assessment of fetal development and for prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance. It was very useful in patient management with its convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women

  14. Nuclear plant operations, maintenance, and configuration management using three-dimensional computer graphics and databases

    International Nuclear Information System (INIS)

    Tutos, N.C.; Reinschmidt, K.F.

    1987-01-01

Stone and Webster Engineering Corporation has developed the Plant Digital Model concept as a new approach to Configuration Management of nuclear power plants. The Plant Digital Model development is a step-by-step process, based on existing manual procedures and computer applications, and is fully controllable by the plant managers and engineers. The Plant Digital Model is based on IBM computer graphics and relational database management systems, and can therefore be easily integrated with existing plant databases and corporate management information systems

  15. An algorithm of discovering signatures from DNA databases on a computer cluster.

    Science.gov (United States)

    Lee, Hsiao Ping; Sheu, Tzu-Fang

    2014-10-05

Signatures are short sequences that are unique and not similar to any other sequence in a database, and can be used as a basis for identifying different species. Although several signature discovery algorithms have been proposed in the past, they require the entire database to be loaded into memory, restricting the amount of data they can process and making them unable to handle large databases. These algorithms also use sequential models and have slower discovery speeds, leaving room for efficiency improvements. In this research, we introduce a divide-and-conquer strategy for signature discovery and propose a parallel signature discovery algorithm that runs on a computer cluster. The algorithm applies the divide-and-conquer strategy to overcome the existing algorithms' inability to process large databases, and uses a parallel computing mechanism to effectively improve the efficiency of signature discovery. Even with only the memory of ordinary personal computers, the algorithm can process large databases, such as the human whole-genome EST database, that the existing algorithms could not handle. The proposed algorithm is not limited by the amount of usable memory and can rapidly find signatures in large databases, making it useful in applications such as Next Generation Sequencing and other large-database analysis and processing. The implementation of the proposed algorithm is available at http://www.cs.pu.edu.tw/~fang/DDCSDPrograms/DDCSD.htm.
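The underlying signature-discovery problem can be stated compactly: find the k-mers that occur in exactly one database sequence. The naive in-memory sketch below shows only the problem definition; the paper's contribution is solving it divide-and-conquer and in parallel on a cluster, so the whole database never has to fit in memory at once:

```python
def signatures(db, k=8):
    """Return {k-mer: sequence name} for k-mers unique to one sequence."""
    seen = {}  # k-mer -> owning sequence name, or None once shared
    for name, seq in db.items():
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            if kmer in seen and seen[kmer] != name:
                seen[kmer] = None          # appears in a second sequence
            elif kmer not in seen:
                seen[kmer] = name
    return {kmer: name for kmer, name in seen.items() if name is not None}

db = {"seqA": "ACGTAC", "seqB": "TTTACG"}
print(signatures(db, k=3))
# {'CGT': 'seqA', 'GTA': 'seqA', 'TTT': 'seqB', 'TTA': 'seqB'}
```

The shared k-mers ACG and TAC are discarded; the remaining k-mers each identify one sequence, which is what makes them usable as species-level signatures.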

  16. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access is allowed only to previously registered professionals. Data updating, editing and searching tasks are controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity-Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  18. Constructing a population-based research database from routine maternal screening records: a resource for studying alloimmunization in pregnant women.

    Directory of Open Access Journals (Sweden)

    Brian K Lee

Full Text Available BACKGROUND: Although screening for maternal red blood cell antibodies during pregnancy is a standard procedure, the prevalence and clinical consequences of non-anti-D immunization are poorly understood. The objective was to create a national database of maternal antibody screening results that can be linked with population health registers to create a research resource for investigating these issues. STUDY DESIGN AND METHODS: Each birth in the Swedish Medical Birth Register was uniquely identified and linked to the text stored in routine maternal antibody screening records in the time window from 9 months prior to 2 weeks after the delivery date. These text records were subjected to a computerized search for specific antibodies using regular expressions. To illustrate the research potential of the resulting database, selected antibody prevalence rates are presented as tables and figures, and the complete data (from more than 60 specific antibodies) are presented as online moving graphical displays. RESULTS: More than one million (1,191,761) births with valid screening information from 1982-2002 constitute the study population. Computerized coverage of screening increased steadily over time and varied by region as electronic records were adopted. To ensure data quality, we restricted the analysis to birth records in areas and years with sustained coverage of at least 80%, representing 920,903 births from 572,626 mothers in 17 of the 24 counties in Sweden. During the study period, non-anti-D and anti-D antibodies occurred in 76.8/10,000 and 14.1/10,000 pregnancies respectively, with marked differences between specific antibodies over time. CONCLUSION: This work demonstrates the feasibility of creating a nationally representative research database from routine maternal antibody screening records from an extended calendar period. By linkage with population registers of maternal and child health, such data are a valuable resource for addressing important
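The regular-expression mining step described above can be illustrated in a few lines. The pattern and record texts below are invented for the example, and the antibody list is abbreviated; the study's actual expressions had to cope with far messier free text:

```python
import re

# Illustrative version of the record-mining step: pull specific antibody
# names out of free-text screening comments with a regular expression.
pattern = re.compile(r"\banti-(D|c|E|K|Fya|Jka)\b")

records = [
    "Pos screen: anti-D titre 1:16",
    "anti-K and anti-Fya detected",
    "no irregular antibodies",
]
found = [pattern.findall(r) for r in records]
print(found)  # [['D'], ['K', 'Fya'], []]
```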

  19. Advances in computational metabolomics and databases deepen the understanding of metabolisms.

    Science.gov (United States)

    Tsugawa, Hiroshi

    2018-01-29

Mass spectrometry (MS)-based metabolomics is the most popular platform for metabolome analyses. Computational techniques for the processing of MS raw data, for example feature detection, peak alignment, and the exclusion of false-positive peaks, have been established. The next stage of untargeted metabolomics is to decipher the mass fragmentation of small molecules for the global identification of human, animal, plant, and microbiota metabolomes, resulting in a deeper understanding of metabolisms. This review is an update on the latest computational metabolomics, including known/expected structure databases, chemical ontology classifications, and mass spectrometry cheminformatics for the interpretation of mass fragmentations and the elucidation of unknown metabolites. The importance of metabolome 'databases' and 'repositories' is also discussed, because novel biological discoveries are often attributable to the accumulation of data, to relational databases, and to their statistics. Lastly, a practical guide for metabolite annotation is presented as the summary of this review. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. WISDOM-II: Screening against multiple targets implicated in malaria using computational grid infrastructures

    Directory of Open Access Journals (Sweden)

    Kenyon Colin

    2009-05-01

Full Text Available Abstract Background Despite continuous efforts of the international community to reduce the impact of malaria on developing countries, no significant progress has been made in recent years and the discovery of new drugs is more than ever needed. Out of the many proteins involved in the metabolic activities of the Plasmodium parasite, some are promising targets for rational drug discovery. Motivation Recent years have witnessed the emergence of grids, highly distributed computing infrastructures particularly well suited to embarrassingly parallel computations like docking. In 2005, a first attempt at using grids for large-scale virtual screening focused on plasmepsins and ended in the identification of previously unknown scaffolds, which were confirmed in vitro to be active plasmepsin inhibitors. Following this success, a second deployment took place in the fall of 2006, focussing on one well-known target, dihydrofolate reductase (DHFR), and on a new promising one, glutathione-S-transferase. Methods In silico drug design, especially vHTS, is a widely accepted technology for lead identification and lead optimization. This approach therefore builds upon the progress made in computational chemistry, to achieve more accurate in silico docking, and in information technology, to design and operate large-scale grid infrastructures. Results On the computational side, a sustained infrastructure has been developed: large-scale docking, different strategies for result analysis, on-the-fly storage of the results in MySQL databases, and molecular dynamics refinement with MM-PBSA and MM-GBSA rescoring. The modeling results obtained are very promising. Based on these results, in vitro assays are underway for all the targets against which screening was performed.
Conclusion The current paper describes the rational drug discovery activity at large scale, especially molecular docking using FlexX software

  1. Visibility of microcalcifications in computed and screen-film mammography

    International Nuclear Information System (INIS)

    Cowen, Arnold R.; Launders, Jason H.; Jadav, Mark; Brettle, David S.

    1997-01-01

Due to the clinically and technically demanding nature of breast x-ray imaging, mammography still remains one of the few essentially film-based radiological imaging techniques in modern medical imaging. There are a range of possible benefits available if a practical and economical direct digital imaging technique can be introduced to routine clinical practice. There has been much debate regarding the minimum specification required for direct digital acquisition. One such direct digital system available is computed radiography (CR), which has a modest specification when compared with modern screen-film mammography (SFM) systems. This paper details two psychophysical studies in which the detection of simulated microcalcifications with CR has been directly compared to that with SFM. The first study found that under scatter-free conditions the minimum detectable size of microcalcification was approximately 130 μm for both SFM and CR. The second study found that SFM had a 4.6% higher probability of observers being able to correctly identify the shape of 350 μm diameter test details; there was no significant difference for either larger or smaller test details. From the results of these studies it has been demonstrated that the modest specification of CR, in terms of limiting resolution, does not translate into a dramatic difference in the perception of details at the limit of detectability. When judging the imaging performance of a system it is more important to compare the signal-to-noise ratio transfer spectrum characteristics, rather than simply the modulation transfer function. (author)

  2. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    Science.gov (United States)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  3. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

For systematic management and easy use of teaching files in the radiology department, the authors set up a database management system for teaching files using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) with an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for image acquisition and storage. We developed the database program in FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 ∼ 545 X 414, 256 gray levels) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Image acquisition and storage could be done simply on the reading viewbox, without special devices. The image quality on the computer monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purpose of a teaching file system. Without high-cost appliances, we could build an image database system for teaching files using a personal computer by a relatively inexpensive method

  4. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

    Science.gov (United States)

    Amat-Ur-Rasool, Hafsa; Ahmed, Mehboob

    2015-01-01

Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons, resulting in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction of the ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level up to a safe limit by blocking acetylcholinesterase (AChE), the enzyme naturally responsible for its degradation. This research presents in-silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results for database-retrieved (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA)-approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be the most appropriate target site, as compared to the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol and naringenin are worth mentioning as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e., second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against the cognitive symptoms of AD.
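As a sketch of how docking and ADME results might be combined in a screen like this, the snippet below keeps candidates that pass a (hypothetical) blood-brain barrier flag and ranks them by docking score. The numeric values and the pass/fail flags are invented for illustration and are not results from the paper:

```python
# Toy shortlist step: filter on a hypothetical BBB-permeation flag, then
# sort by docking score (more negative = tighter predicted binding).
candidates = [
    {"name": "ligand_A", "dock_kcal": -9.1,  "bbb_pass": True},
    {"name": "ligand_B", "dock_kcal": -8.2,  "bbb_pass": True},
    {"name": "ligand_C", "dock_kcal": -10.5, "bbb_pass": False},
]
shortlist = sorted((c for c in candidates if c["bbb_pass"]),
                   key=lambda c: c["dock_kcal"])
print([c["name"] for c in shortlist])  # ['ligand_A', 'ligand_B']
```

Note that the best docker (ligand_C) is dropped for failing the ADME gate, which is the point of screening on safety profiles alongside binding affinity.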

  5. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

    Directory of Open Access Journals (Sweden)

    Hafsa Amat-Ur-Rasool

    Full Text Available Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons that results in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction of the ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level within a safe limit by blocking acetylcholinesterase (AChE), the enzyme naturally responsible for its degradation. This research presents in silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results of database-retrieved (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA)-approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be a more appropriate target site than the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol and naringenin are worth mentioning as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e. second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against the cognitive symptoms of AD.

  6. The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system

    Science.gov (United States)

    Zerkin, V. V.; Pritychenko, B.

    2018-04-01

    The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web application development are described. New capabilities for data set uploads, renormalization, covariance matrices, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and the Russian Federation.

  7. Computer-aided diagnosis system for bone scintigrams from Japanese patients: importance of training database

    DEFF Research Database (Denmark)

    Horikoshi, Hiroyuki; Kikuchi, Akihiro; Onoguchi, Masahisa

    2012-01-01

    Computer-aided diagnosis (CAD) software for bone scintigrams has recently been introduced as a clinical quality assurance tool. The purpose of this study was to compare the diagnostic accuracy of two CAD systems, one based on a European and one on a Japanese training database, in a group of bone scans from Japanese patients. The two CAD systems are trained to interpret bone scans using training databases consisting of bone scans with the desired interpretation, metastatic disease or not. One system was trained using 795 bone scans from European patients and the other with 904 bone scans from Japanese patients. The CAD software trained with the Japanese database showed higher performance than the corresponding CAD software trained with a European database for the analysis of bone scans from Japanese patients. These results could at least partly be caused by the physical differences between Japanese and European patients, resulting in less influence of attenuation...

  8. Toward clinically usable CAD for lung cancer screening with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Matthew S.; Lo, Pechin; Goldin, Jonathan G.; Barnoy, Eran; Kim, Grace Hyun J.; McNitt-Gray, Michael F.; Aberle, Denise R. [David Geffen School of Medicine at UCLA, Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, Los Angeles, CA (United States)

    2014-11-15

    The purpose of this study was to define clinically appropriate, computer-aided lung nodule detection (CAD) requirements and protocols based on recent screening trials. In the following paper, we describe a CAD evaluation methodology based on a publicly available, annotated computed tomography (CT) image data set, and demonstrate the evaluation of a new CAD system with the functionality and performance required for adoption in clinical practice. A new automated lung nodule detection and measurement system was developed that incorporates intensity thresholding, a Euclidean Distance Transformation, and segmentation based on watersheds. System performance was evaluated against the Lung Imaging Database Consortium (LIDC) CT reference data set. The test set comprised thin-section CT scans from 108 LIDC subjects. The median (±IQR) sensitivity per subject was 100 (±37.5) for nodules ≥ 4 mm and 100 (±8.33) for nodules ≥ 8 mm. The corresponding false positive rates were 0 (±2.0) and 0 (±1.0), respectively. The concordance correlation coefficient between the CAD nodule diameter and the LIDC reference was 0.91, and for volume it was 0.90. The new CAD system shows high nodule sensitivity with a low false positive rate. Automated volume measurements have strong agreement with the reference standard. Thus, it provides comprehensive, clinically-usable lung nodule detection and assessment functionality. (orig.)

  9. Toward clinically usable CAD for lung cancer screening with computed tomography

    International Nuclear Information System (INIS)

    Brown, Matthew S.; Lo, Pechin; Goldin, Jonathan G.; Barnoy, Eran; Kim, Grace Hyun J.; McNitt-Gray, Michael F.; Aberle, Denise R.

    2014-01-01

    The purpose of this study was to define clinically appropriate, computer-aided lung nodule detection (CAD) requirements and protocols based on recent screening trials. In the following paper, we describe a CAD evaluation methodology based on a publicly available, annotated computed tomography (CT) image data set, and demonstrate the evaluation of a new CAD system with the functionality and performance required for adoption in clinical practice. A new automated lung nodule detection and measurement system was developed that incorporates intensity thresholding, a Euclidean Distance Transformation, and segmentation based on watersheds. System performance was evaluated against the Lung Imaging Database Consortium (LIDC) CT reference data set. The test set comprised thin-section CT scans from 108 LIDC subjects. The median (±IQR) sensitivity per subject was 100 (±37.5) for nodules ≥ 4 mm and 100 (±8.33) for nodules ≥ 8 mm. The corresponding false positive rates were 0 (±2.0) and 0 (±1.0), respectively. The concordance correlation coefficient between the CAD nodule diameter and the LIDC reference was 0.91, and for volume it was 0.90. The new CAD system shows high nodule sensitivity with a low false positive rate. Automated volume measurements have strong agreement with the reference standard. Thus, it provides comprehensive, clinically-usable lung nodule detection and assessment functionality. (orig.)

  10. GAMCAT - a personal computer database on alpha particles and gamma rays from radioactive decay

    International Nuclear Information System (INIS)

    Tepel, J.W.; Mueller, H.W.

    1990-01-01

    The GAMCAT database is a compilation of data describing the alpha particles and gamma rays that occur in the radioactive decay of all known nuclides, adapted for IBM Personal Computers and compatible systems. These compiled data have been previously published, and are now available as a compact database. Entries can be retrieved by defining the properties of the parent nuclei as well as alpha-particle and gamma-ray energies or any combination of these parameters. The system provides fast access to the data and has been completely written in C to run on an AT-compatible computer, with a hard disk and 640K of memory under DOS 2.11 or higher. GAMCAT is available from the Fachinformationszentrum Karlsruhe. (orig.)

  11. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    Science.gov (United States)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
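
    The addressing scheme described above (a relation name plus up to 5 integer keys per record) can be illustrated with a small sketch. DB90 itself is Fortran with C file I/O; the Python class below is only an illustration of the keying idea, and all names in it are hypothetical.

```python
# Sketch of DB90-style record addressing: each record is identified by
# a relation name plus up to 5 integer key values, which also lets a
# caller retrieve records in any desired order via key design.
class KeyedStore:
    MAX_KEYS = 5

    def __init__(self):
        self._data = {}

    def put(self, relation, keys, record):
        if len(keys) > self.MAX_KEYS:
            raise ValueError("at most 5 integer keys per record")
        self._data[(relation, tuple(keys))] = record

    def get(self, relation, keys):
        return self._data[(relation, tuple(keys))]

    def select(self, relation):
        # All records of one relation, in sorted key order
        return [v for (rel, k), v in sorted(self._data.items())
                if rel == relation]

store = KeyedStore()
store.put("loads", (1, 1), {"fx": 10.0})
store.put("loads", (1, 2), {"fx": 12.5})
```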

  12. Estimation of radiation exposure from lung cancer screening program with low-dose computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Su Yeon; Jun, Jae Kwan [Graduate School of Cancer Science and Policy, National Cancer Center, Seoul (Korea, Republic of)

    2016-12-15

    The National Lung Screening Trial (NLST) demonstrated that screening with low-dose computed tomography (LDCT) reduced lung cancer mortality in a high-risk population. Recently, the United States Preventive Services Task Force (USPSTF) gave a B recommendation for annual LDCT screening of individuals at high risk. With these promising results, Korea developed a lung cancer screening guideline and is planning a pilot study for implementation of national lung cancer screening. With widespread adoption of lung cancer screening with LDCT, there are concerns about the harms of screening, including high false-positive rates and radiation exposure. Over the 3 rounds of screening in the NLST, 96.4% of positive results were false-positives. Although the initial screening is performed at low dose, subsequent diagnostic examinations following positive results additively contribute to a patient's lifetime exposure. When implementing a large-scale screening program, there is a lack of established risk assessment of the effects of radiation exposure from a long-term screening program. Thus, the purpose of this study was to estimate the cumulative radiation exposure of an annual LDCT lung cancer screening program over a 20-year period.
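
    The cumulative-exposure arithmetic behind such an estimate can be sketched as follows. The per-examination doses and the positive-screen probability used here are illustrative assumptions for the sketch, not values reported by the study.

```python
# Illustrative expected cumulative effective dose from an annual LDCT
# screening program: each round contributes the screening dose plus,
# with some probability, a diagnostic follow-up dose (in the NLST,
# 96.4% of positive screens were false-positives, so follow-up work-up
# matters). All numeric inputs below are assumptions, not study data.
LDCT_DOSE_MSV = 1.5       # assumed effective dose per screening LDCT
FOLLOWUP_DOSE_MSV = 8.0   # assumed dose per diagnostic follow-up CT
P_POSITIVE = 0.25         # assumed probability of a positive screen

def cumulative_dose(years):
    """Expected cumulative dose (mSv) over `years` of annual screening."""
    per_round = LDCT_DOSE_MSV + P_POSITIVE * FOLLOWUP_DOSE_MSV
    return per_round * years

twenty_year_dose = cumulative_dose(20)
```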

  13. Heuristic lipophilicity potential for computer-aided rational drug design: Optimizations of screening functions and parameters

    Science.gov (United States)

    Du, Qishi; Mezey, Paul G.

    1998-09-01

    In this research we test and compare three possible atom-based screening functions used in the heuristic molecular lipophilicity potential (HMLP). Screening function 1 is a power distance-dependent function, b_i / |R_i - r|^γ; screening function 2 is an exponential distance-dependent function, b_i exp(-|R_i - r| / d0); and screening function 3 is a weighted distance-dependent function, sign(b_i) exp(ξ |R_i - r| / |b_i|). For every screening function, the parameters (γ, d0, and ξ) are optimized using 41 common organic molecules of four types of compounds: aliphatic alcohols, aliphatic carboxylic acids, aliphatic amines, and aliphatic alkanes. The results of the calculations show that screening function 3 cannot give chemically reasonable results; however, both the power screening function and the exponential screening function give chemically satisfactory results. There are two notable differences between screening functions 1 and 2. First, the exponential screening function has larger values at short distance than the power screening function, so more influence from the nearest neighbors is involved using screening function 2 than screening function 1. Second, the power screening function has larger values at long distance than the exponential screening function, so screening function 1 is affected by atoms at long distance more than screening function 2. For screening function 2, the suitable range of the parameter d0 is 1.5 < d0 < 3.0, and d0 = 2.0 is recommended. The HMLP developed in this research provides a potential tool for computer-aided three-dimensional drug design.
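
    The three screening functions can be written out directly; the forms below are reconstructed from a garbled source, so the exact sign convention of ξ in function 3 is an assumption. Here b_i is the atomic lipophilicity index and d = |R_i - r| the atom-to-point distance.

```python
import math

def screen_power(b_i, d, gamma=2.0):
    # Screening function 1: power distance dependence, b_i / d**gamma
    return b_i / d**gamma

def screen_exponential(b_i, d, d0=2.0):
    # Screening function 2: exponential distance dependence
    # (d0 = 2.0 is the value recommended in the abstract)
    return b_i * math.exp(-d / d0)

def screen_weighted(b_i, d, xi=-1.0):
    # Screening function 3: weighted distance dependence, as
    # reconstructed: sign(b_i) * exp(xi * d / |b_i|). The sign of xi
    # is an assumption; the study found this form chemically
    # unreasonable in any case.
    return math.copysign(1.0, b_i) * math.exp(xi * d / abs(b_i))
```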

  14. Cost-effectiveness of computed tomographic colonography screening for colorectal cancer in the medicare population

    NARCIS (Netherlands)

    A.B. Knudsen (Amy); I. Lansdorp-Vogelaar (Iris); C.M. Rutter (Carolyn); J.E. Savarino (James); M. van Ballegooijen (Marjolein); K.M. Kuntz (Karen); A. Zauber (Ann)

    2010-01-01

    textabstractBackground The Centers for Medicare and Medicaid Services (CMS) considered whether to reimburse computed tomographic colonography (CTC) for colorectal cancer screening of Medicare enrollees. To help inform its decision, we evaluated the reimbursement rate at which CTC screening could be

  15. International Association for the Study of Lung Cancer Computed Tomography Screening Workshop 2011 Report

    NARCIS (Netherlands)

    Field, John K.; Smith, Robert A.; Aberle, Denise R.; Oudkerk, Matthijs; Baldwin, David R.; Yankelevitz, David; Pedersen, Jesper Holst; Swanson, Scott James; Travis, William D.; Wisbuba, Ignacio I.; Noguchi, Masayuki; Mulshine, Jim L.

    The International Association for the Study of Lung Cancer (IASLC) Board of Directors convened a computed tomography (CT) Screening Task Force to develop an IASLC position statement, after the National Cancer Institute press statement from the National Lung Screening Trial showed that lung cancer

  16. Difficulties encountered managing nodules detected during a computed tomography lung cancer screening program.

    Science.gov (United States)

    Veronesi, Giulia; Bellomi, Massimo; Scanagatta, Paolo; Preda, Lorenzo; Rampinelli, Cristiano; Guarize, Juliana; Pelosi, Giuseppe; Maisonneuve, Patrick; Leo, Francesco; Solli, Piergiorgio; Masullo, Michele; Spaggiari, Lorenzo

    2008-09-01

    The main challenge of screening a healthy population with low-dose computed tomography is to balance the excessive use of diagnostic procedures with the risk of delayed cancer detection. We evaluated the pitfalls, difficulties, and sources of mistakes in the management of lung nodules detected in volunteers in the Cosmos single-center screening trial. A total of 5201 asymptomatic high-risk volunteers underwent screening with multidetector low-dose computed tomography. Nodules detected at baseline or new nodules at annual screening received repeat low-dose computed tomography at 1 year if less than 5 mm, repeat low-dose computed tomography 3 to 6 months later if between 5 and 8 mm, and fluorodeoxyglucose positron emission tomography if more than 8 mm. Growing nodules at the annual screening received low-dose computed tomography at 6 months and computed tomography-positron emission tomography or surgical biopsy according to doubling time, type, and size. During the first year of screening, 106 patients underwent lung biopsy and 91 lung cancers were identified (70% were stage I). Diagnosis was delayed (false-negative) in 6 patients (stage IIB in 1 patient, stage IIIA in 3 patients, and stage IV in 2 patients), including 2 small cell cancers and 1 central lesion. Surgical biopsy revealed benign disease (false-positives) in 15 cases (14%). Positron emission tomography sensitivity was 88% for prevalent cancers and 70% for cancers diagnosed after first annual screening. No needle biopsy procedures were performed in this cohort of patients. Low-dose computed tomography screening is effective for the early detection of lung cancers, but nodule management remains a challenge. Computed tomography-positron emission tomography is useful at baseline, but its sensitivity decreases significantly the subsequent year. Multidisciplinary management and experience are crucial for minimizing misdiagnoses.
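
    The size-based triage rules for baseline (or newly appearing) nodules described above can be expressed as a small function; this is only a restatement of the protocol in the abstract, not clinical guidance.

```python
# Cosmos-trial management of nodules detected at baseline or newly at
# annual screening: < 5 mm -> repeat LDCT at 1 year; 5-8 mm -> repeat
# LDCT in 3-6 months; > 8 mm -> FDG-PET.
def triage_baseline_nodule(diameter_mm):
    if diameter_mm < 5:
        return "repeat low-dose CT at 1 year"
    elif diameter_mm <= 8:
        return "repeat low-dose CT in 3-6 months"
    else:
        return "FDG-PET"
```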

  17. Computer systems and methods for the query and visualization of multidimensional databases

    Science.gov (United States)

    Stolte, Chris; Tang, Diane L; Hanrahan, Patrick

    2015-03-03

    A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes multiple operand names, each operand corresponding to one or more fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first operands with the columns shelf and to associate one or more second operands with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first operands, and each pane has a y-axis defined based on data for the one or more second operands.

  18. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    Science.gov (United States)

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
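
    The core idea above (complex queries as an integral part of data analysis) can be illustrated with a minimal sketch: fMRI time-series samples stored in a relational table so that an analysis-style query (here, a per-voxel mean) runs inside the database. The schema and names are illustrative, not the system described in the paper, and SQLite stands in for the database management systems it discusses.

```python
import sqlite3

# In-memory relational store of BOLD time-series samples
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE timeseries (
        subject TEXT, voxel INTEGER, t INTEGER, bold REAL
    )
""")
samples = [("s01", 7, t, 100.0 + t) for t in range(4)]
conn.executemany("INSERT INTO timeseries VALUES (?, ?, ?, ?)", samples)

# Analysis expressed as a query: mean BOLD signal for one voxel
row = conn.execute(
    "SELECT AVG(bold) FROM timeseries WHERE subject = ? AND voxel = ?",
    ("s01", 7),
).fetchone()
mean_bold = row[0]
```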

  19. Computer systems and methods for the query and visualization of multidimensional databases

    Science.gov (United States)

    Stolte, Chris [Palo Alto, CA; Tang, Diane L [Palo Alto, CA; Hanrahan, Patrick [Portola Valley, CA

    2011-02-01

    In response to a user request, a computer generates a graphical user interface on a computer display. A schema information region of the graphical user interface includes multiple operand names, each operand name associated with one or more fields of a multi-dimensional database. A data visualization region of the graphical user interface includes multiple shelves. Upon detecting a user selection of the operand names and a user request to associate each user-selected operand name with a respective shelf in the data visualization region, the computer generates a visual table in the data visualization region in accordance with the associations between the operand names and the corresponding shelves. The visual table includes a plurality of panes, each pane having at least one axis defined based on data for the fields associated with a respective operand name.

  20. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand.

  1. Hand held control unit for controlling a display screen-oriented computer game, and a display screen-oriented computer game having one or more such control units

    NARCIS (Netherlands)

    2001-01-01

    A hand-held control unit is used to control a display screen-oriented computer game. The unit comprises a housing with a front side, a set of control members lying generally flush with the front side for, through actuation thereof, controlling actions of in-game display items, and an output for

  2. FaceWarehouse: a 3D facial expression database for visual computing.

    Science.gov (United States)

    Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun

    2014-03-01

    We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of her different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, kiss, etc. For every RGBD raw data record, a set of facial feature points on the color image such as eye corners, mouth contour, and the nose tip are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, for every person in our database, there is a much richer matching collection of expressions, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
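
    The bilinear model described above (a rank-3 core tensor with identity and expression attributes) amounts to contracting the core with an identity weight vector and an expression weight vector to produce a mesh. The sketch below uses toy sizes and random data purely to show the contraction; it is not the FaceWarehouse data or code.

```python
import numpy as np

# Toy core tensor: (vertices*3, n_identities, n_expressions)
rng = np.random.default_rng(0)
n_verts, n_id, n_exp = 5, 4, 3
core = rng.standard_normal((n_verts * 3, n_id, n_exp))

def synthesize(w_id, w_exp):
    """Mesh vertices for given identity and expression weights
    (bilinear contraction of the core tensor)."""
    mesh = np.einsum("vie,i,e->v", core, w_id, w_exp)
    return mesh.reshape(n_verts, 3)

# An "average identity" showing the first (e.g. neutral) expression
mesh = synthesize(np.ones(n_id) / n_id, np.array([1.0, 0.0, 0.0]))
```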

  3. Performance of popular open source databases for HEP related computing problems

    International Nuclear Information System (INIS)

    Kovalskyi, D; Sfiligoi, I; Wuerthwein, F; Yagil, A

    2014-01-01

    Databases are used in many software components of HEP computing, from monitoring and job scheduling to data storage and processing. It is not always clear at the beginning of a project if a problem can be handled by a single server, or if one needs to plan for a multi-server solution. Before a scalable solution is adopted, it helps to know how well it performs in a single server case to avoid situations when a multi-server solution is adopted mostly due to sub-optimal performance per node. This paper presents comparison benchmarks of popular open source database management systems. As a test application we use a user job monitoring system based on the Glidein workflow management system used in the CMS Collaboration.

  4. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
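
    The combination idea above (a learned model mapping a CAD score plus clinical features to one disease-risk score) can be sketched with a tiny logistic-regression model fitted by gradient descent. The synthetic data, feature names, and choice of classifier here are assumptions for illustration; the study's actual features and learner may differ.

```python
import numpy as np

# Synthetic screening data: one CXR CAD score plus two clinical features
rng = np.random.default_rng(1)
n = 200
cad_score = rng.uniform(0, 1, n)
clinical = rng.uniform(0, 1, (n, 2))      # hypothetical clinical features
X = np.column_stack([cad_score, clinical])
logits_true = 4 * cad_score + 2 * clinical[:, 0] - 3
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits_true))).astype(float)

# Logistic regression by batch gradient descent
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))    # predicted risk
    w -= 0.5 * (X.T @ (p - y) / n)        # gradient step on weights
    b -= 0.5 * np.mean(p - y)             # gradient step on intercept

risk = 1 / (1 + np.exp(-(X @ w + b)))     # combined risk per subject
```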

  5. In person versus computer screening for intimate partner violence among pregnant patients.

    Science.gov (United States)

    Chang, Judy C; Dado, Diane; Schussler, Sara; Hawker, Lynn; Holland, Cynthia L; Burke, Jessica G; Cluss, Patricia A

    2012-09-01

    To compare in person versus computerized screening for intimate partner violence (IPV) in a hospital-based prenatal clinic and explore women's assessment of the screening methods. We compared patient IPV disclosures on a computerized questionnaire to audio-taped first obstetric visits with an obstetric care provider and performed semi-structured interviews with patient participants who reported experiencing IPV. Two hundred fifty patient participants and 52 provider participants were in the study. Ninety-one (36%) patients disclosed IPV either via computer or in person. Of those who disclosed IPV, 60 (66%) disclosed via both methods, but 31 (34%) disclosed IPV via only one of the two methods. Twenty-three women returned for interviews. They recommended using both types together. While computerized screening was felt to be non-judgmental and more anonymous, in person screening allowed for tailored questioning and more emotional connection with the provider. Computerized screening allowed disclosure without fear of immediate judgment. In person screening allows more flexibility in the wording of questions regarding IPV and the opportunity for interpersonal rapport. Both computerized or self-completed screening and in person screening are recommended. Providers should address IPV using non-judgmental, descriptive language, include assessments for psychological IPV, and repeat screening in person, even if no patient disclosure occurs via computer.

  6. Visual ergonomic aspects of glare on computer displays: glossy screens and angular dependence

    Science.gov (United States)

    Brunnström, Kjell; Andrén, Börje; Konstantinides, Zacharias; Nordström, Lukas

    2007-02-01

    Recently, flat panel computer displays and notebook computers designed with a so-called glare panel, i.e. a highly glossy screen, have emerged on the market. The shiny look of the display appeals to customers, and there are arguments that contrast, colour saturation, etc. improve with a glare panel. LCD displays often suffer from angular-dependent picture quality. This has become even more pronounced with the introduction of prism light guide plates into displays for notebook computers. The TCO label is the leading labelling system for computer displays. Currently about 50% of all computer displays on the market are certified according to the TCO requirements. The requirements are periodically updated to keep up with technical development and the latest research in e.g. visual ergonomics. The gloss level of the screen and the angular dependence have recently been investigated in user studies. A study of the effect of highly glossy screens compared to matt screens has been performed. The results show a slight advantage for the glossy screen when no disturbing reflections are present; however, the difference was not statistically significant. When disturbing reflections are present, the advantage turns into a larger disadvantage, and this difference is statistically significant. Another study, of angular dependence, has also been performed. The results indicate a linear relationship between picture quality and the centre luminance of the screen.

  7. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity.

  8. Computer-aided detection of masses in full-field digital mammography using screen-film mammograms for training

    International Nuclear Information System (INIS)

    Kallenberg, Michiel; Karssemeijer, Nico

    2008-01-01

    It would be of great value if available databases of screen-film mammography (SFM) images could be used to train full-field digital mammography (FFDM) computer-aided detection (CAD) systems, as compilation of new databases is costly. In this paper, we investigate this possibility. Firstly, we develop a method that converts an FFDM image into an SFM-like representation. In this conversion method, we establish a relation between exposure and optical density by simulation of an automatic exposure control unit. Secondly, we investigate the effects of using the SFM images as training samples compared to training with FFDM images. Our FFDM database consisted of 266 cases, of which 102 were biopsy-proven malignant masses and 164 normals. The images were acquired with systems of two different manufacturers. We found that, when we trained our FFDM CAD system with a small number of images, training with FFDM images, using a five-fold cross-validation procedure, outperformed training with SFM images. However, when the full SFM database, consisting of 348 abnormal cases (including 204 priors) and 810 normal cases, was used for training, SFM training outperformed FFDM training. These results show that an existing CAD system for detection of masses in SFM can be used for FFDM images without retraining.

  9. Screening of synthetic and natural product databases: Identification of novel androgens and antiandrogens.

    Science.gov (United States)

    Bobach, Claudia; Tennstedt, Stephanie; Palberg, Kristin; Denkert, Annika; Brandt, Wolfgang; de Meijere, Armin; Seliger, Barbara; Wessjohann, Ludger A

    2015-01-27

    The androgen receptor is an important pharmaceutical target for a variety of diseases. This paper presents an in silico/in vitro screening procedure to identify new androgen receptor ligands. The two-step virtual screening procedure uses a three-dimensional pharmacophore model and a docking/scoring routine. About 39,000 filtered compounds were docked with PLANTS and scored by ChemPLP. Subsequent to virtual screening, 94 compounds, including 28 steroidal and 66 nonsteroidal compounds, were tested in an androgen receptor fluorescence polarization ligand displacement assay. As a result, 30 compounds were identified that show a relative binding affinity of more than 50% in comparison to 100 nM dihydrotestosterone and were classified as androgen receptor binders. For 11 androgen receptor binders of interest, IC50 and Ki values were determined. The compound with the highest affinity exhibits a Ki value of 10.8 nM. Subsequent testing of the 11 compounds in a PC-3 and LNCaP multi-readout proliferation assay provides insights into the potential mode of action. Further steroid receptor ligand displacement assays and docking studies on estrogen receptors α and β, the glucocorticoid receptor, and the progesterone receptor gave information about the specificity of the 11 most active compounds. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
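
    The abstract reports both IC50 and Ki values from a competitive ligand displacement assay. The paper is not quoted here on how the conversion was done; a common approach for a competitive binder is the Cheng-Prusoff relation, sketched below with purely illustrative numbers (none taken from the study):

```python
def ki_from_ic50(ic50_nM, tracer_conc_nM, tracer_kd_nM):
    """Cheng-Prusoff conversion for a competitive binder:
    Ki = IC50 / (1 + [L]/Kd), where [L] is the concentration of the
    labelled tracer ligand and Kd its dissociation constant."""
    return ic50_nM / (1.0 + tracer_conc_nM / tracer_kd_nM)

# Hypothetical values for illustration only: a compound with IC50 = 50 nM
# measured against a 10 nM fluorescent tracer whose Kd is 2 nM.
print(round(ki_from_ic50(50.0, 10.0, 2.0), 2))  # 50/(1 + 5) = 8.33
```

    Note that Ki is always at most the measured IC50, since the correction factor accounts for competition with the tracer.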

  10. Practical considerations and effects of metallic screen fluorescence and backscatter control in gamma computed radiography

    International Nuclear Information System (INIS)

    Mango, Steven

    2016-01-01

    It is a fairly common misconception that the role of metallic screens used with computed radiography is primarily that of scatter control, and that any amplification of the image signal is minimal. To the contrary, this paper shows how the physical interaction between gamma rays and front metallic screens can yield a significant boost in signal and whether that increased signal is, in fact, beneficial or detrimental to image quality. For rear metallic screens, this signal boost is differentiated from backscatter, and image quality considerations should be more carefully thought out because of the separation between the screen and the imaging layer provided by the imaging plate support. Various physical interactions are explained, and a series of practical experiments show the various changes in signal level and image quality with various thicknesses of lead and copper screens. Recommendations are made for the configuration of the imaging plate and screens for optimum image quality and for the control and monitoring of scatter.

  11. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    International Nuclear Information System (INIS)

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-01-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  12. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wattson, Daniel A., E-mail: dwattson@partners.org [Harvard Radiation Oncology Program, Boston, Massachusetts (United States); Hunink, M.G. Myriam [Departments of Radiology and Epidemiology, Erasmus Medical Center, Rotterdam, the Netherlands and Center for Health Decision Science, Harvard School of Public Health, Boston, Massachusetts (United States); DiPiro, Pamela J. [Department of Imaging, Dana-Farber Cancer Institute, Boston, Massachusetts (United States); Das, Prajnan [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Hodgson, David C. [Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); Mauch, Peter M.; Ng, Andrea K. [Department of Radiation Oncology, Brigham and Women's Hospital and Dana-Farber Cancer Institute, Boston, Massachusetts (United States)]

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  13. Low-dose chest computed tomography for lung cancer screening among Hodgkin lymphoma survivors: a cost-effectiveness analysis.

    Science.gov (United States)

    Wattson, Daniel A; Hunink, M G Myriam; DiPiro, Pamela J; Das, Prajnan; Hodgson, David C; Mauch, Peter M; Ng, Andrea K

    2014-10-01

    Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening may be cost effective for all smokers but possibly not

  14. China National Lung Cancer Screening Guideline with Low-dose Computed Tomography (2018 version)

    Directory of Open Access Journals (Sweden)

    Qinghua ZHOU

    2018-02-01

    Background and objective Lung cancer is the leading cause of cancer-related death in China. The results from a randomized controlled trial using annual low-dose computed tomography (LDCT) in specific high-risk groups demonstrated a 20% reduction in lung cancer mortality. The aim of this study was to establish China's national lung cancer screening guideline for clinical practice. Methods The China lung cancer early detection and treatment expert group (CLCEDTEG) established the China National Lung Cancer Screening Guideline with multidisciplinary representation, including 4 thoracic surgeons, 4 thoracic radiologists, 2 medical oncologists, 2 pulmonologists, 2 pathologists, and 2 epidemiologists. Members have engaged in interdisciplinary collaborations regarding lung cancer screening and the clinical care of patients at risk for lung cancer. The expert group reviewed the literature, including screening trials in the United States, Europe, and China, and discussed local best clinical practices in China. A consensus-based guideline, the China National Lung Cancer Screening Guideline (CNLCSG), was recommended by CLCEDTEG, appointed by the National Health and Family Planning Commission, based on the results of the National Lung Screening Trial, a systematic review of evidence related to LDCT screening, and the protocol of a lung cancer screening program conducted in rural China. Results Annual lung cancer screening with LDCT is recommended for high-risk individuals aged 50-74 years who have at least a 20 pack-year smoking history and who currently smoke or have quit within the past five years. Individualized decision making should be conducted before LDCT screening. LDCT screening also represents an opportunity to educate patients as to the health risks of smoking; thus, education should be integrated into the screening process in order to assist smoking cessation. Conclusion A lung cancer screening guideline is recommended for the high-risk population in China.

  15. [China National Lung Cancer Screening Guideline with Low-dose Computed Tomography (2018 version)].

    Science.gov (United States)

    Zhou, Qinghua; Fan, Yaguang; Wang, Ying; Qiao, Youlin; Wang, Guiqi; Huang, Yunchao; Wang, Xinyun; Wu, Ning; Zhang, Guozheng; Zheng, Xiangpeng; Bu, Hong; Li, Yin; Wei, Sen; Chen, Liang'an; Hu, Chengping; Shi, Yuankai; Sun, Yan

    2018-02-20

    Lung cancer is the leading cause of cancer-related death in China. The results from a randomized controlled trial using annual low-dose computed tomography (LDCT) in specific high-risk groups demonstrated a 20% reduction in lung cancer mortality. The aim of this study was to establish China's national lung cancer screening guideline for clinical practice. The China lung cancer early detection and treatment expert group (CLCEDTEG) established the China National Lung Cancer Screening Guideline with multidisciplinary representation, including 4 thoracic surgeons, 4 thoracic radiologists, 2 medical oncologists, 2 pulmonologists, 2 pathologists, and 2 epidemiologists. Members have engaged in interdisciplinary collaborations regarding lung cancer screening and the clinical care of patients at risk for lung cancer. The expert group reviewed the literature, including screening trials in the United States, Europe, and China, and discussed local best clinical practices in China. A consensus-based guideline, the China National Lung Cancer Screening Guideline (CNLCSG), was recommended by CLCEDTEG, appointed by the National Health and Family Planning Commission, based on the results of the National Lung Screening Trial, a systematic review of evidence related to LDCT screening, and the protocol of a lung cancer screening program conducted in rural China. Annual lung cancer screening with LDCT is recommended for high-risk individuals aged 50-74 years who have at least a 20 pack-year smoking history and who currently smoke or have quit within the past five years. Individualized decision making should be conducted before LDCT screening. LDCT screening also represents an opportunity to educate patients as to the health risks of smoking; thus, education should be integrated into the screening process in order to assist smoking cessation. A lung cancer screening guideline is recommended for the high-risk population in China. Additional research, including LDCT combined with biomarkers, is

  16. A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.

    Science.gov (United States)

    Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo

    2015-01-01

    The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical space. Alternatively, Cloud Computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from a PACS, converting the information contained in each image file into a NoSQL database, and using cloud computing to store the digital images.
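
    The anonymization step described above can be sketched as a simple metadata filter applied before a record leaves the local PACS for a public cloud. The attribute names below are illustrative, DICOM-style field names, not the poster's actual schema:

```python
# Illustrative DICOM-style attribute names -- assumptions, not the poster's schema.
IDENTIFYING_FIELDS = {"PatientName", "PatientID", "PatientBirthDate", "PatientAddress"}

def anonymize_record(record):
    """Return a copy of an image-metadata dict with patient-identifying
    fields removed, suitable for storage in a NoSQL document database
    hosted in a public cloud."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

record = {"PatientName": "DOE^JANE", "PatientID": "12345",
          "Modality": "CT", "StudyDate": "20150101"}
print(anonymize_record(record))  # {'Modality': 'CT', 'StudyDate': '20150101'}
```

    A real deployment would follow a de-identification profile (and might pseudonymize rather than drop identifiers, so images can be re-linked locally), but the filtering principle is the same.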

  17. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Lewis, G.; Howman-Giles, R.

    1999-01-01

    Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images may be presented and reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage are rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on personal computer database software is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database.

  18. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    Science.gov (United States)

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

    To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screening) and to determine the most useful cut-off values for risk. Computer simulations were used to study integrated, sequential, and contingent screening strategies with various cut-offs, leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110,948 pregnancies from the province of Québec for the year 2001. Outcome measures were cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure-related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening in the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in the first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure-related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100
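
    The cost effectiveness ratios compared above follow the standard definition: the extra cost of one strategy over another divided by the extra health outcome gained. A minimal sketch, with purely illustrative figures (not taken from SURUSS or this analysis):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: additional cost per additional
    unit of effect (here, cases of Down's syndrome detected) when moving
    from one screening strategy to another. A negative value with a
    positive effect difference means the new strategy is dominant
    (cheaper and more effective)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Purely illustrative figures: strategy B costs $C600,000 more than
# strategy A and detects 20 more cases.
print(icer(2_600_000, 110, 2_000_000, 90))  # 30000.0
```

    In the article's case the sign is reversed: contingent screening both saves money and averts adverse outcomes relative to maternal-age screening, which is why it is described as dominant.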

  19. School Students and Computer Games with Screen Violence

    Science.gov (United States)

    Fedorov, A. V.

    2005-01-01

    In this article, the author describes how school students from low-income strata of the population in Russia spend hours in computer rooms and Internet clubs, where, for a relatively small fee, they can play interactive video games. To determine which games they prefer, the author conducted a content analysis of eighty-seven…

  20. Low-dose computed tomography for lung cancer screening: comparison of performance between annual and biennial screen

    Energy Technology Data Exchange (ETDEWEB)

    Sverzellati, Nicola; Silva, M. [University of Parma, Radiology, Department of Surgical Sciences, Parma (Italy); Calareso, G.; Marchiano, A. [Fondazione IRCCS Istituto Nazionale dei Tumori, Department of Radiology, Milan (Italy); Galeone, C. [University of Milano-Bicocca, Department of Statistics and Quantitative Methods, Division of Biostatistics, Epidemiology and Public Health, Laboratory of Healthcare Research and Pharmacoepidemiology, Milan (Italy); Sestini, S.; Pastorino, U. [Fondazione IRCCS Istituto Nazionale dei Tumori, Department of Surgery, Section of Thoracic Surgery, Milan (Italy); Sozzi, G. [Fondazione IRCCS Istituto Nazionale dei Tumori, Tumor Genomics Unit, Department of Experimental Oncology and Molecular Medicine, Milan (Italy)

    2016-11-15

    To compare the performance metrics of two different strategies of lung cancer screening by low-dose computed tomography (LDCT), namely annual (LDCT1) or biennial (LDCT2) screening. Recall rate, detection rate, interval cancers, sensitivity, specificity, and positive and negative predictive values (PPV and NPV, respectively) were compared between the LDCT1 and LDCT2 arms of the MILD trial over the first seven rounds (T0-T6; median follow-up 7.3 years) and four rounds (T0-T3; median follow-up 7.3 years), respectively. 1152 LDCT1 and 1151 LDCT2 participants underwent a total of 6893 and 4715 LDCT scans, respectively. The overall recall rate was higher in the LDCT2 arm (6.97 %) than in the LDCT1 arm (5.81 %) (p = 0.01), which was counterbalanced by the overall lower number of LDCT scans. No difference was observed for the overall detection rate (0.56 % in both arms). The two LDCT arms had similar specificity (99.2 % in both arms), sensitivity (73.5 % in LDCT2 vs. 68.5 % in LDCT1, p = 0.62), PPV (42.4 % in LDCT2 vs. 40.6 % in LDCT1, p = 0.83) and NPV (99.8 % in LDCT2 vs. 99.7 % in LDCT1, p = 0.71). Biennial screening may save about one third of LDCT scans with performance indicators similar to those of annual screening. (orig.)

  1. Low-dose computed tomography for lung cancer screening: comparison of performance between annual and biennial screen

    International Nuclear Information System (INIS)

    Sverzellati, Nicola; Silva, M.; Calareso, G.; Marchiano, A.; Galeone, C.; Sestini, S.; Pastorino, U.; Sozzi, G.

    2016-01-01

    To compare the performance metrics of two different strategies of lung cancer screening by low-dose computed tomography (LDCT), namely annual (LDCT1) or biennial (LDCT2) screening. Recall rate, detection rate, interval cancers, sensitivity, specificity, and positive and negative predictive values (PPV and NPV, respectively) were compared between the LDCT1 and LDCT2 arms of the MILD trial over the first seven rounds (T0-T6; median follow-up 7.3 years) and four rounds (T0-T3; median follow-up 7.3 years), respectively. 1152 LDCT1 and 1151 LDCT2 participants underwent a total of 6893 and 4715 LDCT scans, respectively. The overall recall rate was higher in the LDCT2 arm (6.97 %) than in the LDCT1 arm (5.81 %) (p = 0.01), which was counterbalanced by the overall lower number of LDCT scans. No difference was observed for the overall detection rate (0.56 % in both arms). The two LDCT arms had similar specificity (99.2 % in both arms), sensitivity (73.5 % in LDCT2 vs. 68.5 % in LDCT1, p = 0.62), PPV (42.4 % in LDCT2 vs. 40.6 % in LDCT1, p = 0.83) and NPV (99.8 % in LDCT2 vs. 99.7 % in LDCT1, p = 0.71). Biennial screening may save about one third of LDCT scans with performance indicators similar to those of annual screening. (orig.)
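
    The indicators compared between the two arms all derive from a 2x2 table of screening outcomes. A minimal sketch, with hypothetical counts chosen only to roughly echo the reported percentages (not the actual MILD trial data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening performance indicators from a 2x2 outcome table."""
    return {
        "sensitivity": tp / (tp + fn),  # screen-detected cancers / all cancers
        "specificity": tn / (tn + fp),  # negative screens / all without cancer
        "ppv": tp / (tp + fp),          # cancers among positive screens
        "npv": tn / (tn + fn),          # no cancer among negative screens
    }

# Hypothetical counts for illustration only.
m = screening_metrics(tp=25, fp=35, fn=9, tn=4931)
print({k: round(v, 3) for k, v in m.items()})
# {'sensitivity': 0.735, 'specificity': 0.993, 'ppv': 0.417, 'npv': 0.998}
```

    Note why PPV is so much lower than specificity here: with a low-prevalence disease, even a small false-positive rate produces many false alarms relative to true cancers.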

  2. A Computational model for compressed sensing RNAi cellular screening

    Directory of Open Access Journals (Sweden)

    Tan Hua

    2012-12-01

    Abstract Background RNA interference (RNAi) is becoming an increasingly important and effective genetic tool for studying the function of target genes by suppressing specific genes of interest. This systems approach helps identify signaling pathways and cellular phase types by tracking intensity and/or morphological changes of cells. The traditional RNAi screening scheme, in which one siRNA is designed to knock down one specific mRNA target, needs a large library of siRNAs and turns out to be time-consuming and expensive. Results In this paper, we propose a conceptual model, called compressed sensing RNAi (csRNAi), which employs a unique combination of groups of small interfering RNAs (siRNAs) to knock down a much larger set of genes. This strategy is based on the fact that one gene can be partially bound by several siRNAs and, conversely, one siRNA can bind to a few genes with distinct binding affinity. This model constructs a many-to-many correspondence between siRNAs and their targets, with far fewer siRNAs than mRNA targets compared with the conventional scheme. Mathematically, this problem involves an underdetermined system of equations (linear or nonlinear), which is ill-posed in general. However, the recently developed compressed sensing (CS) theory can solve this problem. We present a mathematical model to describe the csRNAi system based on both CS theory and biological concerns. To build this model, we first search nucleotide motifs in a target gene set. Then we propose a machine-learning-based method to find effective siRNAs with novel features, such as image features and speech features, to describe an siRNA sequence. Numerical simulations show that we can reduce the siRNA library to one third of that in the conventional scheme. In addition, the features used to describe siRNAs substantially outperform the existing ones. Conclusions This csRNAi system is very promising in saving both time and cost for large-scale RNAi
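
    The csRNAi idea rests on recovering a sparse unknown from an underdetermined linear system. A generic compressed-sensing sketch (basis pursuit via linear programming, not the authors' actual model) illustrates the principle that far fewer measurements than unknowns can suffice when the signal is sparse:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Recover a sparse x from the underdetermined system A x = b by
    L1 minimisation (basis pursuit). Split x = u - v with u, v >= 0,
    then minimise sum(u) + sum(v) subject to A(u - v) = b."""
    m, n = A.shape
    c = np.ones(2 * n)                       # objective = L1 norm of x
    A_eq = np.hstack([A, -A])                # encodes A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Toy example: 40 random measurements of a 3-sparse, 100-dimensional signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 42, 77]] = [1.0, -2.0, 0.5]
x_hat = basis_pursuit(A, A @ x_true)
print(np.allclose(x_hat, x_true, atol=1e-4))
```

    In the screening analogy, each row of A would encode which genes a pooled siRNA group perturbs, b the observed phenotype readouts, and x the (sparse) set of genes actually responsible.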

  3. Cost-effectiveness of implementing computed tomography screening for lung cancer in Taiwan.

    Science.gov (United States)

    Yang, Szu-Chun; Lai, Wu-Wei; Lin, Chien-Chung; Su, Wu-Chou; Ku, Li-Jung; Hwang, Jing-Shiang; Wang, Jung-Der

    2017-06-01

    A screening program for lung cancer requires more empirical evidence. Based on the experience of the National Lung Screening Trial (NLST), we developed a method to adjust for lead-time bias and quality-of-life changes when estimating the cost-effectiveness of implementing computed tomography (CT) screening in Taiwan. The target population was high-risk (≥30 pack-years) smokers between 55 and 75 years of age. From a nation-wide, 13-year follow-up cohort, we estimated quality-adjusted life expectancy (QALE), loss-of-QALE, and lifetime healthcare expenditures per case of lung cancer, stratified by pathology and stage. Cumulative stage distributions for CT screening and no screening were assumed equal to those for CT screening and radiography screening in the NLST, in order to estimate the savings in loss-of-QALE and the additional lifetime healthcare expenditures after CT screening. Costs attributable to screen-negative subjects, false-positive cases and radiation-induced lung cancer were included to obtain the incremental cost-effectiveness ratio from the public payer's perspective. The incremental costs were US$22,755 per person. After dividing this by the savings in loss-of-QALE (1.16 quality-adjusted life years (QALYs)), the incremental cost-effectiveness ratio was US$19,683 per QALY. This ratio would fall to US$10,947 per QALY if the stage distribution for CT screening were the same as that of screen-detected cancers in the NELSON trial. Low-dose CT screening for lung cancer among high-risk smokers would be cost-effective in Taiwan. As only about 5% of women in Taiwan are smokers, future research is necessary to identify high-risk groups among non-smokers and to increase coverage. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  4. Screens

    OpenAIRE

    2016-01-01

    This sixth volume in the series The Key Debates: Mutations and Appropriations in European Film Studies investigates the question of screens in the context both of the dematerialization brought about by digitalization and of the multiplication of media screens. Scholars offer a range of perspectives and theories on topics such as the archaeology of the screen, film and media theories, contemporary art, and the pragmatics of new ways of screening (from home video to street screening).

  5. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to

  6. Analysis on Cloud Computing Database in Cloud Environment – Concept and Adoption Paradigm

    Directory of Open Access Journals (Sweden)

    Elena-Geanina ULARU

    2012-08-01

    With the development of the Internet's new technical functionalities, new concepts have started to take shape. These concepts play an important role, especially in the development of corporate IT. One such concept is "the Cloud". Various marketing campaigns have started to focus on the Cloud and to promote it in different but confusing ways. These campaigns do little to explain what cloud computing is and why it is becoming increasingly necessary. This lack of understanding of the new technology leads to poorly informed decisions about moving databases and applications to the cloud. Only by focusing on business processes and objectives can an enterprise achieve the full benefits of the cloud and mitigate the potential risks. In this article we formulate our own complete definition of the cloud and analyze the essential aspects of cloud adoption for a banking financial reporting application.

  7. A Computer Knowledge Database of accidents at work in the construction industry

    Science.gov (United States)

    Hoła, B.; Szóstak, M.

    2017-10-01

    At least 60,000 fatal accidents at work occur on building sites all over the world each year, which means that, on average, an employee dies during the execution of work every 10 minutes. In 2015, 5,776 accidents at work happened on Polish building sites, of which 69 resulted in the death of an employee. Accidents are an enormous social and economic burden for companies, communities and countries. The vast majority of accidents at work can be prevented by appropriate and effective preventive measures. A Computer Knowledge Database (CKD) was therefore formulated; it enables data and information on accidents at work in the construction industry to be collected and processed in order to obtain the necessary knowledge. The knowledge gained will form the basis for drawing conclusions of a preventive nature

  8. The UK Lung Cancer Screening Trial: a pilot randomised controlled trial of low-dose computed tomography screening for the early detection of lung cancer.

    Science.gov (United States)

    Field, John K; Duffy, Stephen W; Baldwin, David R; Brain, Kate E; Devaraj, Anand; Eisen, Tim; Green, Beverley A; Holemans, John A; Kavanagh, Terry; Kerr, Keith M; Ledson, Martin; Lifford, Kate J; McRonald, Fiona E; Nair, Arjun; Page, Richard D; Parmar, Mahesh Kb; Rintoul, Robert C; Screaton, Nicholas; Wald, Nicholas J; Weller, David; Whynes, David K; Williamson, Paula R; Yadegarfar, Ghasem; Hansell, David M

    2016-05-01

    Lung cancer kills more people than any other cancer in the UK (5-year survival high-risk UK population, determine optimum recruitment, screening, reading and care pathway strategies; and (2) assess the psychological consequences and the health-economic implications of screening. A pilot randomised controlled trial comparing intervention with usual care. A population-based risk questionnaire identified individuals who were at high risk of developing lung cancer (≥ 5% over 5 years). Thoracic centres with expertise in lung cancer imaging, respiratory medicine, pathology and surgery: Liverpool Heart & Chest Hospital, Merseyside, and Papworth Hospital, Cambridgeshire. Individuals aged 50-75 years, at high risk of lung cancer, in the primary care trusts adjacent to the centres. A thoracic LDCT scan. Follow-up computed tomography (CT) scans as per protocol. Referral to multidisciplinary team clinics was determined by nodule size criteria. Population-based recruitment based on risk stratification; management of the trial through web-based database; optimal characteristics of CT scan readers (radiologists vs. radiographers); characterisation of CT-detected nodules utilising volumetric analysis; prevalence of lung cancer at baseline; sociodemographic factors affecting participation; psychosocial measures (cancer distress, anxiety, depression, decision satisfaction); and cost-effectiveness modelling. A total of 247,354 individuals were approached to take part in the trial; 30.7% responded positively to the screening invitation. Recruitment of participants resulted in 2028 in the CT arm and 2027 in the control arm. A total of 1994 participants underwent CT scanning: 42 participants (2.1%) were diagnosed with lung cancer; 36 out of 42 (85.7%) of the screen-detected cancers were identified as stage 1 or 2, and 35 (83.3%) underwent surgical resection as their primary treatment. Lung cancer was more common in the lowest socioeconomic group. Short-term adverse psychosocial

  9. Advances in Parallel Computing and Databases for Digital Pathology in Cancer Research

    Science.gov (United States)

    2016-11-13

    databases. The advent of NewSQL and NoSQL (Not Only SQL) databases has led to the development of new technologies that are well suited for applications... NoSQL graph databases are tuned to support graph operations and NoSQL key-value databases excel at rapid ingest of unstructured data. Recent NewSQL

  10. Television viewing, computer use and total screen time in Canadian youth.

    Science.gov (United States)

    Mark, Amy E; Boyce, William F; Janssen, Ian

    2006-11-01

    Research has linked excessive television viewing and computer use in children and adolescents to a variety of health and social problems. Current recommendations are that screen time in children and adolescents should be limited to no more than 2 h per day. To determine the percentage of Canadian youth meeting the screen time guideline recommendations. The representative study sample consisted of 6942 Canadian youth in grades 6 to 10 who participated in the 2001/2002 World Health Organization Health Behaviour in School-Aged Children survey. Only 41% of girls and 34% of boys in grades 6 to 10 watched 2 h or less of television per day. Once the time of leisure computer use was included and total daily screen time was examined, only 18% of girls and 14% of boys met the guidelines. The prevalence of those meeting the screen time guidelines was higher in girls than boys. Fewer than 20% of Canadian youth in grades 6 to 10 met the total screen time guidelines, suggesting that increased public health interventions are needed to reduce the number of leisure time hours that Canadian youth spend watching television and using the computer.

  11. Radioactivity on the surfaces of computer monitors and television screens due to radon progeny plate-out

    International Nuclear Information System (INIS)

    Abdel-Nady, A.; Morsy, A.A.

    2002-01-01

    Computer monitors and television screens can collect radon progeny. Radon decays to form metastable progeny, namely Po-218, Po-214 and Po-210, which are found mostly in positively charged aerosol particles. These particles are attracted by the large negative field of a video display terminal (VDT), leading to a buildup of radioactivity on the VDT screen. The charged aerosol particles might drift in the electric field between the VDT and the operator and be accelerated into the operator's face. To measure these phenomena, a set of ultra-sensitive TASTRAK track detectors was used to measure the plate-out of positively charged radioactive radon progeny. The track detectors were fixed on the outer monitor screen. For an occupational computer worker spending 6 hours a day, 200 days per year, the mean dose equivalent was found to be 1.77 mSv/year for a normal CRT monitor and 0.25 mSv/year for an LCD monitor

  12. Computational Screening of Light-absorbing Materials for Photoelectrochemical Water Splitting

    DEFF Research Database (Denmark)

    Castelli, Ivano E.; Kuhar, Korina; Pandey, Mohnish

    2018-01-01

    Efficient conversion of solar energy into electricity or fuels requires the identification of new semiconductors with optimal optical and electronic properties. We discuss the current and future role that computational screening is expected to play in this challenge. We discuss the identification...

  13. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans

    NARCIS (Netherlands)

    Ritchie, A.J.; Sanghera, C.; Jacobs, C.; Zhang, W.; Mayo, J.; Schmidt, H.; Gingras, M.; Pasian, S.; Stewart, L.; Tsai, S.; Manos, D.; Seely, J.M.; Burrowes, P.; Bhatia, R.; Atkar-Khattra, S.; Ginneken, B. van; Tammemagi, M.; Tsao, M.S.; Lam, S.; et al.,

    2016-01-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in

  14. Computer Decision Support to Improve Autism Screening and Care in Community Pediatric Clinics

    Science.gov (United States)

    Bauer, Nerissa S.; Sturm, Lynne A.; Carroll, Aaron E.; Downs, Stephen M.

    2013-01-01

    An autism module was added to an existing computer decision support system (CDSS) to facilitate adherence to recommended guidelines for screening for autism spectrum disorders in primary care pediatric clinics. User satisfaction was assessed by survey and informal feedback at monthly meetings between clinical staff and the software team. To assess…

  15. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories of those related to (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for the category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open type combined database systems, which have been in progress generally as planned. Those for the category (2) include the techniques for color document inputting and information retrieval, meaning compiling, understanding highly overlapping data, and controlling data centered by drawings, which have been in progress generally as planned. Those for the category (3) include the techniques for improving resistance of the networks to obstruction, and security of the data in the networks, which have been in progress generally as planned. Those for the category (4) include the techniques for rule processing for development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives are finally achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  17. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  18. Search for β2 adrenergic receptor ligands by virtual screening via grid computing and investigation of binding modes by docking and molecular dynamics simulations.

    Directory of Open Access Journals (Sweden)

    Qifeng Bai

    Full Text Available We designed a program called MolGridCal that can be used to screen small-molecule databases by grid computing, based on the JPPF grid environment. Building on the MolGridCal program, we propose an integrated strategy for virtual screening and binding-mode investigation that combines molecular docking, molecular dynamics (MD) simulations and free energy calculations. To test the effectiveness of MolGridCal, we screened potential ligands for the β2 adrenergic receptor (β2AR) from a database containing 50,000 small molecules. MolGridCal can not only send tasks to the grid server automatically, but can also distribute tasks using the screensaver function. In the virtual screening results, the known β2AR agonist BI-167107 ranks among the top 2% of the screened candidates, indicating that the MolGridCal program gives reasonable results. To further study the binding mode and refine the results of MolGridCal, more accurate docking and scoring methods were used to estimate the binding affinity of the top three molecules (the agonist BI-167107, the neutral antagonist alprenolol and the inverse agonist ICI 118,551). The results indicate that the agonist BI-167107 has the best binding affinity. MD simulation and free energy calculation were employed to investigate the dynamic interaction mechanism between the ligands and β2AR. The results show that the agonist BI-167107 also has the lowest binding free energy. This study provides a new way to perform virtual screening effectively by integrating molecular docking based on grid computing, MD simulations and free energy calculations. The source codes of MolGridCal are freely available at http://molgridcal.codeplex.com.
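
    The top-2% check described above reduces to ranking the screened library by docking score and locating the known active. A minimal sketch with synthetic scores (the score values and molecule names are invented for illustration, not MolGridCal output):

    ```python
    import random

    # Synthetic docking scores for a 50,000-compound library (kcal/mol;
    # more negative = stronger predicted binding). Values are invented.
    random.seed(0)
    library = {f"mol{i}": random.uniform(-9.0, -3.0) for i in range(50_000)}
    library["BI-167107"] = -11.5   # known beta2AR agonist, assumed strong score

    ranked = sorted(library, key=library.get)   # most negative score first
    rank = ranked.index("BI-167107") + 1
    in_top_2_percent = rank <= 0.02 * len(library)
    print(rank, in_top_2_percent)
    ```

    With these synthetic values the known agonist ranks first, so it trivially lands in the top 2%; in a real screen the interesting question is how far down the list true actives fall.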

  19. Optimisation and assessment of three modern touch screen tablet computers for clinical vision testing.

    Directory of Open Access Journals (Sweden)

    Humza J Tahir

    Full Text Available Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to 2 other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration.

  20. Optimisation and assessment of three modern touch screen tablet computers for clinical vision testing.

    Science.gov (United States)

    Tahir, Humza J; Murray, Ian J; Parry, Neil R A; Aslam, Tariq M

    2014-01-01

    Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to 2 other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration.
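
    The gamma function mentioned in both records relates normalized drive voltage V to luminance L via L = L_max·(V/V_max)^γ. A minimal sketch of estimating γ from photometer readings by a log-log least-squares fit (the measurement values below are hypothetical, not data from the study):

    ```python
    import math

    def fit_gamma(voltages, luminances):
        """Estimate display gamma from paired (normalized voltage, luminance)
        measurements by least squares in log-log space:
        log L = gamma * log V + log L_max."""
        xs = [math.log(v) for v in voltages]
        ys = [math.log(lum) for lum in luminances]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                / sum((x - mx) ** 2 for x in xs))

    # Hypothetical photometer readings for a display with gamma ~ 2.2
    volts = [0.2, 0.4, 0.6, 0.8, 1.0]
    lums = [250.0 * v ** 2.2 for v in volts]   # cd/m^2, synthetic
    print(round(fit_gamma(volts, lums), 2))    # -> 2.2
    ```

    The fitted exponent, checked after the recommended warm-up period, is what a calibration protocol would compare against the display's nominal gamma.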

  1. Genome-wide screen for universal individual identification SNPs based on the HapMap and 1000 Genomes databases.

    Science.gov (United States)

    Huang, Erwen; Liu, Changhui; Zheng, Jingjing; Han, Xiaolong; Du, Weian; Huang, Yuanjian; Li, Chengshi; Wang, Xiaoguang; Tong, Dayue; Ou, Xueling; Sun, Hongyu; Zeng, Zhaoshu; Liu, Chao

    2018-04-03

    Differences among SNP panels for individual identification, in both SNP selection and the populations examined, have left few SNPs in common, compromising the panels' universal applicability. To screen for universal SNPs, we performed genome-wide SNP mining across multiple populations based on the HapMap and 1000 Genomes databases. SNPs with high minor allele frequencies (MAF) in 37 populations were selected. As the MAF threshold rose from ≥0.35 to ≥0.43, the number of selected SNPs decreased from 2769 to 0. A total of 117 SNPs with MAF ≥0.39 showed no linkage disequilibrium with each other in any population. For 116 of the 117 SNPs, the cumulative match probability (CMP) ranged from 2.01 × 10^-48 to 1.93 × 10^-50 and the cumulative exclusion probability (CEP) ranged from 0.9999999996653 to 0.9999999999945. In 134 tested Han samples, 110 of the 117 SNPs retained high MAF and conformed to Hardy-Weinberg equilibrium, with CMP = 4.70 × 10^-47 and CEP = 0.999999999862. By analyzing the same number of autosomal SNPs as in the HID-Ion AmpliSeq Identity Panel, i.e. 90 randomized out of the 110 SNPs, our panel yielded a preferable CMP and CEP. Taken together, the 110-SNP panel is advantageous for forensic testing, and this study provides a large set of highly informative SNPs for compiling final universal panels.
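
    Under the standard assumptions of Hardy-Weinberg equilibrium and independent loci, a cumulative match probability of the order reported above follows directly from the allele frequencies. A minimal sketch (illustrative only, not the authors' code):

    ```python
    def match_probability(maf):
        """Probability that two unrelated individuals share a genotype at one
        biallelic locus under Hardy-Weinberg equilibrium with minor allele
        frequency `maf`: the sum of squared genotype frequencies."""
        p, q = maf, 1.0 - maf
        return (p * p) ** 2 + (2 * p * q) ** 2 + (q * q) ** 2

    def cumulative_match_probability(mafs):
        """Product of per-locus match probabilities, assuming independence."""
        cmp_ = 1.0
        for maf in mafs:
            cmp_ *= match_probability(maf)
        return cmp_

    # 117 independent SNPs, all at the panel's minimum MAF of 0.39
    print(cumulative_match_probability([0.39] * 117))  # on the order of 10^-49
    ```

    At MAF 0.39 the per-locus match probability is about 0.388, and 0.388^117 lands around 10^-49, the same order as the CMP range the study reports.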

  2. HighResNPS.com – an Internet Database for Liquid Chromatography - High Resolution Mass Spectrometry Screening for New Psychoactive Substances

    DEFF Research Database (Denmark)

    Dalsgaard, Petur Weihe; Mollerup, Christian Brinch; Mardal, Marie

    Background/Introduction: The number of new psychoactive substances (NPS) is constantly increasing, which makes it challenging to keep screening libraries updated with the relevant analytical targets. Liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS) screening methods ... with most screening platforms after minor formatting. Results: Currently, 11 users from 9 laboratories in 7 countries have contributed 318 entries to the database, with experimental data containing at least one fragment ion; 66% of the uploaded data were based on reference standards. Synthetic ... Results/Discussions: The overlapping entries of the database verify that similar fragment ions can be observed for identical compounds across different LC-HRMS systems. The inclusion of fragment ions from other labs can reduce false-positive identifications when no reference standard is available in-house. HighResNPS can serve ...

  3. Improving the Computational Performance of Ontology-Based Classification Using Graph Databases

    Directory of Open Access Journals (Sweden)

    Thomas J. Lampoltshammer

    2015-07-01

    Full Text Available The increasing availability of very high-resolution remote sensing imagery (i.e., from satellites, airborne laser scanning, or aerial photography) represents both a blessing and a curse for researchers. The manual classification of these images, or of other similar geo-sensor data, is time-consuming and leads to subjective and non-deterministic results. Because of this, (semi-)automated classification approaches are in high demand in the affected research areas. Ontologies provide a proper way of automating classification for various kinds of sensor data, including remotely sensed data. However, the processing of data entities (so-called individuals) is one of the most cost-intensive computational operations in ontology reasoning. Therefore, an approach based on graph databases is proposed to overcome the high time consumption of the classification task. The introduced approach shifts the classification task from the classical Protégé environment and its common reasoners to the proposed graph-based approaches. For validation, the authors tested the approach on a simulation scenario based on a real-world example. The results demonstrate a very promising improvement in classification speed: up to 80,000 times faster than the Protégé-based approach.
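
    The shift from reasoner-based to graph-based classification can be illustrated with a toy class hierarchy. The sketch below uses plain dictionaries in place of a graph database, and all class names are hypothetical, not taken from the paper:

    ```python
    # Store the ontology's subclass axioms as graph edges (class -> parent).
    # Classifying an individual then becomes a single edge traversal instead
    # of a full reasoner run over all individuals.
    subclass_of = {
        "Building": "ManMadeObject",
        "Road": "ManMadeObject",
        "ManMadeObject": "GeoObject",
        "Tree": "NaturalObject",
        "NaturalObject": "GeoObject",
    }

    def classify(direct_class):
        """Return every class an individual belongs to by following
        subclass edges transitively from its directly asserted class."""
        classes = [direct_class]
        while classes[-1] in subclass_of:
            classes.append(subclass_of[classes[-1]])
        return classes

    print(classify("Building"))  # -> ['Building', 'ManMadeObject', 'GeoObject']
    ```

    A real graph database (e.g., via a path query) applies the same idea, with the speedup coming from indexed edge traversal rather than per-individual reasoning.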

  4. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    Science.gov (United States)

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  5. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and biostatistical calculations. The popularization of DNA studies employed in the judicial system has led to the necessity of developing appropriate computer programs. Such programs must, above all, address two critical problems, i.e. the broadly understood data processing and data storage, and biostatistical calculations. Moreover, in case of terrorist attacks and mass natural disasters, the ability to identify victims by searching related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. The DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to 1.1 and 1.2 versions. There were, however, slight differences between those versions and the original one. The DNAStat version 2.0 was launched in 2007 and the major program improvement was an introduction of the group calculation options with the potential application to personal identification of mass disasters and terrorism victims. The last 2.1 version has the option of language selection--Polish or English, which will enhance the usage and application of the program also in other countries.

  6. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing the test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. University center for memory disorders. Fifty-two patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, the positive and negative predictive values were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that the normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
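
    The reported positive predictive value follows from Bayes' rule applied to the stated sensitivity, specificity and prevalence. A minimal sketch (the standard formula, not code from the study):

    ```python
    def positive_predictive_value(sensitivity, specificity, prevalence):
        """Bayes' rule: P(impaired | positive screen)."""
        true_pos = sensitivity * prevalence
        false_pos = (1.0 - specificity) * (1.0 - prevalence)
        return true_pos / (true_pos + false_pos)

    # Values reported for the computer-based screening instrument
    ppv = positive_predictive_value(0.83, 0.96, 0.10)
    print(round(ppv, 2))  # -> 0.7
    ```

    With sensitivity 0.83, specificity 0.96 and 10% prevalence, the formula gives PPV ≈ 0.70, matching the abstract; it also shows why PPV would drop sharply at lower prevalence.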

  7. Parapsychology and the neurosciences: a computer-based content analysis of abstracts in the database "MEDLINE" from 1975 to 1995.

    Science.gov (United States)

    Fassbender, P

    1997-04-01

    A computer-based content analysis of 109 abstracts retrieved by the subject heading "parapsychology" from the database MEDLINE for the years 1975-1995 is presented. Data were analyzed in four categories of terms denoting (1) research methods, (2) neurosciences, (3) humanities/psychodynamics, and (4) parapsychology. Results indicated a growing interest in neuroscientific and neuropsychological explanations and theories.

  8. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    Science.gov (United States)

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from literature. An essential feature of these databases is the continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool which allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to improve and simplify annotation data imports of user provided data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain specific databases for metabolic engineering.
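
    The synonym and alternate-identifier resolution described above can be sketched as a lookup from external names to internal frame identifiers. The identifiers and mapping below are hypothetical illustrations, not CycTools' actual API:

    ```python
    # Hypothetical index from synonyms / alternate IDs to internal frame IDs,
    # standing in for the lookup a curation tool would build from the database.
    synonym_index = {
        "GRMZM2G026930": "G-1234",   # gene model ID -> internal frame ID
        "adh1": "G-1234",            # common gene name
        "EC-1.1.1.1": "RXN-0001",    # EC number -> reaction frame
    }

    def resolve(identifier):
        """Map a user-supplied identifier to the internal frame ID,
        falling back to a case-insensitive match; None if unresolvable."""
        if identifier in synonym_index:
            return synonym_index[identifier]
        lowered = identifier.lower()
        for synonym, frame in synonym_index.items():
            if synonym.lower() == lowered:
                return frame
        return None

    print(resolve("ADH1"))        # -> G-1234
    print(resolve("unknown-id"))  # -> None
    ```

    Automating this step is what lets curators import annotation files without knowing each database's internal identifier scheme.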

  9. Community-Based Multidisciplinary Computed Tomography Screening Program Improves Lung Cancer Survival.

    Science.gov (United States)

    Miller, Daniel L; Mayfield, William R; Luu, Theresa D; Helms, Gerald A; Muster, Alan R; Beckler, Vickie J; Cann, Aaron

    2016-05-01

    Lung cancer is the most common cause of cancer deaths in the United States. Overall survival is less than 20%, with the majority of patients presenting with advanced disease. The National Lung Screening Trial, performed mainly in academic medical centers, showed that cancer mortality can be reduced with computed tomography (CT) screening compared with chest radiography in high-risk patients. To determine whether this survival advantage could be duplicated in a community-based multidisciplinary thoracic oncology program, we initiated a CT screening program for lung cancer within an established health care system. In 2008, we launched a lung cancer CT screening program within the WellStar Health System (WHS), consisting of five hospitals, three health parks, 140 outpatient medical offices, and 12 imaging centers that provide care in a five-county area of approximately 1.4 million people in Metro-Atlanta. The screening criteria incorporated were the International Early Lung Cancer Action Program criteria (2008 to 2010) and the National Comprehensive Cancer Network guidelines (2011 to 2013) for moderate- and high-risk patients. A total of 1,267 persons underwent CT lung cancer screening in WHS from 2008 through 2013; 53% were men, 87% were 50 years of age or older, and 83% were current or former smokers. Noncalcified indeterminate pulmonary nodules were found in 518 patients (41%). Thirty-six patients (2.8%) underwent a diagnostic procedure for positive findings on their CT scan; 30 proved to have cancer (28 [2.2%] primary lung cancer and 2 metastatic cancer) and 6 had benign disease. Fourteen patients (50%) had their lung cancer discovered on the initial CT scan, 11 on subsequent scans showing growth of an indeterminate pulmonary nodule, and 3 on scans showing a new indeterminate pulmonary nodule. Only 15 (54%) of these 28 patients would have qualified as National Lung Screening Trial high-risk patients; 75% had stage I or II disease. 
Overall 5-year survival was 64% and 5-year

  10. Noninvasive Computed Tomography–based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial

    Science.gov (United States)

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M.; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A.; Bartholmai, Brian J.

    2015-01-01

    Rationale: Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. Objectives: To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. Methods: We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. Measurements and Main Results: A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. Conclusions: CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas. PMID:26052977

  11. Noninvasive Computed Tomography-based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial.

    Science.gov (United States)

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Karwoski, Ronald A; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A; Bartholmai, Brian J; Peikert, Tobias

    2015-09-15

    Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image-analysis software tool, successfully risk-stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes, we retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY, blinded to clinical data. Based on their parametric CANARY signatures, all lung adenocarcinoma nodules were risk-stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analyses of all stage I cases. CANARY allows noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas.

  12. Reduction of the performance of a noise screen due to screen-induced wind-speed gradients: numerical computations and wind-tunnel experiments

    NARCIS (Netherlands)

    Salomons, E.M.

    1999-01-01

    Downwind sound propagation over a noise screen is investigated by numerical computations and scale model experiments in a wind tunnel. For the computations, the parabolic equation method is used, with a range-dependent sound-speed profile based on wind-speed profiles measured in the wind tunnel and

  13. An eye movement study for identification of suitable font characters for presentation on a computer screen.

    Science.gov (United States)

    Banerjee, Jayeeta; Majumdar, Dhurjati; Majumdar, Deepti; Pal, Madhu Sudan

    2010-06-01

    We are experiencing a shift of media from the printed page to the computer screen, a transition that is modifying how we read and understand text. It is difficult to draw conclusions about the suitability of font characters from subjective evaluation alone. The present study evaluates the effect of font type on human cognitive workload during perception of individual alphabets on a computer screen. Twenty-six young subjects volunteered for this study. Subjects were shown individual characters of different font types while their eye movements were recorded with a binocular eye movement recorder. The results showed that eye movement parameters such as pupil diameter, number of fixations, and fixation duration were lowest for the font type Verdana. The present study therefore recommends the use of Verdana for the presentation of individual alphabets on electronic displays in order to reduce cognitive workload.

  14. Modeling nanoscale gas sensors under realistic conditions: Computational screening of metal-doped carbon nanotubes

    DEFF Research Database (Denmark)

    García Lastra, Juan Maria; Mowbray, Duncan; Thygesen, Kristian Sommer

    2010-01-01

    We use computational screening to systematically investigate the use of transition-metal-doped carbon nanotubes for chemical-gas sensing. For a set of relevant target molecules (CO, NH3, and H2S) and the main components of air (N2, O2, and H2O), we calculate the binding energy and change in conductance upon molecular adsorption. From these results we estimate the change in the nanotube resistance per doping site as a function of the target molecule concentration, assuming charge transport in the diffusive regime. Our analysis points to Ni-doped nanotubes as candidates for CO sensors working under typical atmospheric conditions.

  15. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  16. Binary Decision Trees for Preoperative Periapical Cyst Screening Using Cone-beam Computed Tomography.

    Science.gov (United States)

    Pitcher, Brandon; Alaqla, Ali; Noujeim, Marcel; Wealleans, James A; Kotsakis, Georgios; Chrepa, Vanessa

    2017-03-01

    Cone-beam computed tomographic (CBCT) analysis allows for 3-dimensional assessment of periradicular lesions and may facilitate preoperative periapical cyst screening. The purpose of this study was to develop and assess the predictive validity of a cyst screening method based on CBCT volumetric analysis alone or combined with designated radiologic criteria. Three independent examiners evaluated 118 presurgical CBCT scans from cases that underwent apicoectomies and had an accompanying gold-standard histopathological diagnosis of either a cyst or a granuloma. Lesion volume, density, and specific radiologic characteristics were assessed using specialized software. Logistic regression models with histopathological diagnosis as the dependent variable were constructed for cyst prediction, and receiver operating characteristic curves were used to assess the predictive validity of the models. A conditional inference binary decision tree based on a recursive partitioning algorithm was constructed to facilitate preoperative screening. Interobserver agreement was excellent for volume and density but varied from poor to good for the radiologic criteria. Volume and root displacement were strong predictors for cyst screening in all analyses. The binary decision tree classifier determined that if the volume of the lesion was >247 mm 3 , there was an 80% probability of a cyst; below that threshold, cyst probability was 60% (78% accuracy). The good accuracy and high specificity of the decision tree classifier render it a useful preoperative cyst screening tool that can aid clinical decision making, but not a substitute for definitive histopathological diagnosis after biopsy. Confirmatory studies are required to validate the present findings. Published by Elsevier Inc.
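    The published classifier reduces, at its top split, to a single volume threshold. As a purely illustrative sketch (not a diagnostic tool), the rule and the probabilities quoted in the abstract can be written as:

```python
def cyst_probability(volume_mm3: float, threshold_mm3: float = 247.0) -> float:
    """One-split decision rule using the figures quoted in the abstract:
    lesions larger than ~247 mm^3 were reported to have ~80% cyst
    probability, smaller lesions ~60%. Illustration only."""
    return 0.80 if volume_mm3 > threshold_mm3 else 0.60

# A large lesion falls in the high-probability branch.
p_large = cyst_probability(300.0)
p_small = cyst_probability(100.0)
```

A real conditional inference tree would be fitted with a recursive partitioning package and would also consider root displacement, which the study found to be a strong co-predictor.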

  17. Feasibility of Tablet Computer Screening for Opioid Abuse in the Emergency Department

    Directory of Open Access Journals (Sweden)

    Weiner, Scott G.

    2014-12-01

    Introduction: Tablet computer-based screening may have the potential for detecting patients at risk for opioid abuse in the emergency department (ED). Study objectives were (a) to determine whether the revised Screener and Opioid Assessment for Patients with Pain (SOAPP®-R), a 24-question, previously paper-based screening tool for opioid abuse potential, could be administered on a tablet computer to an ED patient population, and (b) to demonstrate that >90% of patients can complete the electronic screener without assistance in under 5 minutes. One hundred percent of subjects completed the screener. Median time to completion was 148 seconds (interquartile range 117.5-184.3), and 95% (n=78) completed in <5 minutes. 93% (n=76) rated ease of completion as very easy. Conclusions: It is feasible to administer a screening tool to a cohort of ED patients on a tablet computer. The screener administration time is minimal, and patient ease of use with this modality is high. [West J Emerg Med. 2015;16(1):18-23.]

  18. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    Science.gov (United States)

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  19. Implementation of Secondary Index on Cloud Computing NoSQL Database in Big Data Environment

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2015-01-01

    This paper introduces the combination of the NoSQL database HBase and the enterprise search platform Solr to provide a secondary-index function with fast query. To verify the effectiveness and efficiency of the proposed approach, an assessment using a cost-performance ratio was carried out for several competitive benchmark databases and the proposed one. The proposed approach outperforms the other databases and fulfills the secondary-index function with fast query in a NoSQL database. Moreover, according to the cross-sectional analysis, the proposed combination of HBase and Solr is capable of excellent query response in a big data environment.
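    The problem the paper addresses can be illustrated without HBase or Solr: a key-value store supports fast lookup only by row key, so an external index mapping field values back to row keys is maintained alongside it. The sketch below is a minimal in-memory stand-in; the class and field names are illustrative, not HBase or Solr APIs.

```python
from collections import defaultdict

class IndexedStore:
    """Toy primary store (standing in for HBase) plus a secondary index
    (standing in for Solr) over one chosen field."""

    def __init__(self, indexed_field):
        self.rows = {}                     # primary store: row key -> record
        self.indexed_field = indexed_field
        self.index = defaultdict(set)      # secondary index: value -> row keys

    def put(self, row_key, record):
        old = self.rows.get(row_key)
        if old is not None:                # keep the index consistent on update
            self.index[old[self.indexed_field]].discard(row_key)
        self.rows[row_key] = record
        self.index[record[self.indexed_field]].add(row_key)

    def get(self, row_key):
        return self.rows[row_key]          # O(1) primary-key lookup

    def query(self, value):
        # without the secondary index this would be a full table scan
        return [self.rows[k] for k in sorted(self.index[value])]

store = IndexedStore(indexed_field="city")
store.put("u1", {"name": "Ann", "city": "Taipei"})
store.put("u2", {"name": "Bo", "city": "Kaohsiung"})
store.put("u3", {"name": "Cy", "city": "Taipei"})
```

In the real system the index lives in a separate search service and must be kept in sync with the store, which is exactly the engineering problem the paper evaluates.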

  20. Utility of screening computed tomography of chest, abdomen and pelvis in patients after heart transplantation

    International Nuclear Information System (INIS)

    Dasari, Tarun W.; Pavlovic-Surjancev, Biljana; Dusek, Linda; Patel, Nilamkumar; Heroux, Alain L.

    2011-01-01

    Introduction: Malignancy is a late cause of mortality in heart transplant recipients. It is unknown whether screening computed tomography would lead to early detection of such malignancies or of serious vascular anomalies after heart transplantation. Methods: This is a single-center observational study of patients undergoing surveillance computed tomography of the chest, abdomen, and pelvis at least 5 years after transplantation. Abnormal findings included pulmonary nodules, lymphadenopathy, intra-thoracic and intra-abdominal masses, and vascular anomalies such as abdominal aortic aneurysm. The clinical follow-up of each of these major abnormal findings is summarized. Results: A total of 63 patients underwent computed tomography of the chest, abdomen, and pelvis at least 5 years after transplantation. Of these, 54 (86%) were male and 9 (14%) were female. Mean age was 52 ± 9.2 years. Computed tomography revealed only 1 lung cancer (squamous cell). Nonspecific pulmonary nodules were seen in 6 patients (9.5%). The most common incidental finding was abdominal aortic aneurysm (N = 6 (9.5%)), which necessitated follow-up computed tomography (N = 5) or surgery (N = 1). Mean time from transplantation to detection of abdominal aortic aneurysm was 14.6 ± 4.2 years, and mean age at detection was 74.5 ± 3.2 years. Conclusion: Screening computed tomography in patients 5 years after transplantation revealed only one malignancy but led to increased detection of abdominal aortic aneurysms. Its utility for detecting malignancy is thus low, and based on this study we do not recommend routine computed tomography after heart transplantation.

  1. Computational approaches to screen candidate ligands with anti- Parkinson's activity using R programming.

    Science.gov (United States)

    Jayadeepa, R M; Niveditha, M S

    2012-01-01

    It is estimated that by 2050 over 100 million people will be affected by Parkinson's disease (PD). We propose various computational approaches to screen suitable candidate ligands with anti-Parkinson's activity from phytochemicals. Five different types of dopamine receptors have been identified in the brain, D1-D5; dopamine receptor D3 was selected as the target. The D3 receptor exists in areas of the brain outside the basal ganglia, such as the limbic system, and thus may play a role in the cognitive and emotional changes noted in Parkinson's disease. A ligand library of 100 molecules with anti-Parkinson's activity was collected through a literature survey. Nature is the best combinatorial chemist and possibly has answers to all diseases of mankind; the failure of some synthetic drugs and their side effects have prompted many researchers to return to ancient healing methods that use herbal medicines. Hence, the candidate ligands were selected from herbal sources. Lipinski rules were applied to screen suitable molecules for the study; the resulting 88 molecules were energy-minimized and subjected to docking using AutoDock Vina. The top eleven molecules were selected according to the docking score generated by AutoDock Vina. The commercial drug ropinirole was docked in the same way and its score compared with those of the 11 phytochemicals, and the screened molecules were subjected to toxicity analysis to verify the toxic properties of the phytochemicals. R programming was applied to remove bias from the top eleven molecules. Using cluster analysis and a confusion matrix, two phytochemicals, rosmarinic acid and ginkgolide A, were computationally selected for further studies on Parkinson's disease.
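    The Lipinski pre-filter used above is mechanical and easy to sketch. The rule-of-five thresholds below are the standard published ones; the descriptor values in the example library are hypothetical, chosen only to illustrate the filtering step.

```python
def passes_lipinski(mol_weight, logp, h_donors, h_acceptors):
    """Lipinski's rule of five: orally bioavailable drug-like compounds
    tend to have MW <= 500 Da, logP <= 5, <= 5 H-bond donors and
    <= 10 H-bond acceptors. Here all four criteria are required
    (in practice one violation is often tolerated)."""
    return (mol_weight <= 500 and logp <= 5
            and h_donors <= 5 and h_acceptors <= 10)

# Hypothetical descriptor tuples: (MW, logP, H-bond donors, H-bond acceptors).
ligands = {
    "candidate_a": (360.3, 2.4, 3, 6),
    "candidate_b": (812.0, 6.1, 7, 14),   # too large and too lipophilic
}
screened = [name for name, desc in ligands.items() if passes_lipinski(*desc)]
```

In the study this step reduced the 100-molecule library to 88 candidates before energy minimization and docking.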

  2. A brief measure of Smokers' knowledge of lung cancer screening with low-dose computed tomography

    Directory of Open Access Journals (Sweden)

    Lisa M. Lowenstein

    2016-12-01

    We describe the development and psychometric properties of a new, brief measure of smokers' knowledge of lung cancer screening with low-dose computed tomography (LDCT). Content experts identified key facts smokers should know when making an informed decision about lung cancer screening. Sample questions were drafted and iteratively refined based on feedback from content experts and cognitive testing with ten smokers. The resulting 16-item knowledge measure was completed by 108 heavy smokers in Houston, Texas, recruited from 12/2014 to 09/2015. Item difficulty, item discrimination, internal consistency, and test-retest reliability were assessed. Group differences based upon education level and smoking history were explored. Several items were dropped due to ceiling effects or overlapping constructs, resulting in a 12-item knowledge measure. Additional items with high item uncertainty were retained because of their importance in informed decision making about lung cancer screening. Internal consistency reliability of the final scale was acceptable (KR-20 = 0.66), and test-retest reliability of the overall scale was 0.84 (intraclass correlation). Knowledge scores differed across education levels (F = 3.36, p = 0.04), while no differences were observed between current and former smokers (F = 1.43, p = 0.24) or among participants who met or did not meet the 30-pack-year screening eligibility criterion (F = 0.57, p = 0.45). The new measure provides a brief, valid, and reliable indicator of smokers' knowledge of key concepts central to making an informed decision about lung cancer screening with LDCT, and can be part of a broader assessment of the quality of smokers' decision making about lung cancer screening.
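    The KR-20 statistic quoted above is computed directly from dichotomous (0/1) item responses via the standard formula KR-20 = (k/(k-1))(1 - Σ p_i(1-p_i)/σ²_total). The sketch below uses a made-up response matrix purely to demonstrate the calculation.

```python
def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous items.
    `responses` is a list of respondents, each a list of k item scores (0/1)."""
    n = len(responses)
    k = len(responses[0])
    # item difficulty p_i = proportion of respondents answering item i correctly
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    item_var = sum(pi * (1 - pi) for pi in p)
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    total_var = sum((t - mean) ** 2 for t in totals) / n  # population variance
    return (k / (k - 1)) * (1 - item_var / total_var)

# Four hypothetical respondents answering a 3-item test.
score = kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]])
```

Values around 0.66, as reported for the 12-item scale, are usually read as acceptable for a short knowledge measure.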

  3. Performance of computer-aided detection in false-negative screening mammograms of breast cancers

    International Nuclear Information System (INIS)

    Han, Boo Kyung; Kim, Ji Young; Shin, Jung Hee; Choe, Yeon Hyeon

    2004-01-01

    To analyze retrospectively the abnormalities visible on false-negative screening mammograms of patients with breast cancer, and to determine the performance of computer-aided detection (CAD) in detecting these cancers. Of 108 consecutive cases of breast cancer diagnosed over a period of 6 years for which previous screening mammograms were available, 32 retrospectively visible abnormalities (at the locations where cancer later developed) were found on mammograms originally reported as negative. These 32 patients ranged in age from 38 to 72 years (mean, 52 years). We analyzed the previous mammographic findings and assessed the ability of CAD to mark the cancers on the previous mammograms according to clinical presentation, type of abnormality, and mammographic parenchymal density. On these 32 previous mammograms of breast cancers (20 asymptomatic, 12 symptomatic), the retrospectively visible abnormalities were identified as densities in 22, calcifications in 8, and densities with calcifications in 2. CAD marked abnormalities in 20 (63%) of the 32 cancers with false-negative screening mammograms: 14 (70%) of the 20 subsequent screening-detected cancers, 5 (50%) of the 10 interval cancers, and 1 (50%) of the 2 cancers palpable after the screening interval. CAD marked 12 (50%) of the 24 densities and 9 (90%) of the 10 calcifications, and marked abnormalities in 7 (50%) of the 14 predominantly fatty breasts and 13 (72%) of the 18 dense breasts. CAD-assisted diagnosis could potentially decrease the number of false-negative mammograms caused by failure to recognize the cancer in a screening program, although its usefulness in the prevention of interval cancers appears to be limited.

  4. CGPD: Cancer Genetics and Proteomics Database - A Dataset for Computational Analysis and Online Cancer Diagnostic Centre

    Directory of Open Access Journals (Sweden)

    Muhammad Rizwan Riaz

    2014-06-01

    The Cancer Genetics and Proteomics Database (CGPD) is a repository of genetics and proteomics data for Homo sapiens genes involved in cancer. Genes are categorized in the database by cancer type; to date it covers 72 genes across 13 types of cancer. Primers, promoters, and peptides of these genes are also made available. The primers provided for each gene, together with their features and reaction conditions, are useful in PCR amplification, especially in cloning experiments. CGPD also contains an Online Cancer Diagnostic Center (OCDC), as well as transcription and translation tools to support ongoing research work. The database is publicly available at http://www.cgpd.comyr.com.

  5. Abdominal ultrasound-scanning versus non-contrast computed tomography as screening method for abdominal aortic aneurysm

    DEFF Research Database (Denmark)

    Liisberg, Mads; Diederichsen, Axel C.; Lindholt, Jes S.

    2017-01-01

    Background: Validation of non-contrast-enhanced computed tomography (nCT) against ultrasound sonography (US) as a screening method for abdominal aortic aneurysm (AAA). Methods: Consecutively attending men (n = 566) from the pilot study of the randomized Danish CardioVascular Screening trial... nCT seems superior to US concerning sensitivity, and is able to detect aneurysmal lesions not detectable with US. Finally, the prevalence of AAA in Denmark seems to remain relatively high in this small pilot study group.

  6. A comparison of symptoms after viewing text on a computer screen and hardcopy.

    Science.gov (United States)

    Chu, Christina; Rosenfield, Mark; Portello, Joan K; Benzoni, Jaclyn A; Collier, Juanita D

    2011-01-01

    Computer vision syndrome (CVS) is a complex of eye and vision problems experienced during or related to computer use. Ocular symptoms may include asthenopia, accommodative and vergence difficulties, and dry eye. CVS occurs in up to 90% of computer workers and, given the almost universal use of these devices, it is important to identify whether these symptoms are specific to computer operation or simply a manifestation of performing any sustained near-vision task. This study compared ocular symptoms immediately following a sustained near task performed either on a computer screen or on printed hardcopy. Thirty young, visually normal subjects read text aloud either from a desktop computer screen or from a printed hardcopy page at a viewing distance of 50 cm for a continuous 20-minute period. Identical text, matched for size and contrast, was used in the two sessions, and target viewing angle and luminance were similar for the two conditions. Immediately after completing the reading task, subjects completed a written questionnaire about their level of ocular discomfort during the task. Comparing the computer and hardcopy conditions, significant differences in median symptom scores were found for blurred vision during the task (t = 147.0; p = 0.03) and for the mean symptom score (t = 102.5; p = 0.04); in both cases, symptoms were higher during computer use. Symptoms following sustained computer use were significantly worse than those reported after hardcopy fixation under similar viewing conditions. A better understanding of the physiology underlying CVS is critical for more accurate diagnosis and treatment, allowing practitioners to optimize visual comfort and efficiency during computer operation.

  7. Recommendations from the European Society of Thoracic Surgeons (ESTS) regarding computed tomography screening for lung cancer in Europe

    DEFF Research Database (Denmark)

    Pedersen, Jesper Holst; Rzyman, Witold; Veronesi, Giulia

    2017-01-01

    In order to provide recommendations regarding the implementation of computed tomography (CT) screening in Europe, the ESTS established a working group of eight experts in the field. Against the background of the current situation regarding CT screening in Europe and the available evidence, ten recommendations...

  8. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    Science.gov (United States)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

    The goal of this task was to create a design and prototype implementation of a database environment particularly suited to handling the image, vision, and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities designed to execute efficiently on parallel computers. A key feature of the environment is an interface that allows a scientist to specify high-level directives about how query execution should occur.

  9. Coupling computer-interpretable guidelines with a drug-database through a web-based system – The PRESGUID project

    Directory of Open Access Journals (Sweden)

    Fieschi Marius

    2004-03-01

    Abstract Background: Clinical practice guidelines (CPGs) available today are not extensively used, owing to a lack of proper integration into clinical settings and knowledge-related information resources, and a lack of decision support at the point of care in a particular clinical context. Objective: The PRESGUID project (PREScription and GUIDelines) aims to improve the assistance provided by guidelines. The project proposes an online service enabling physicians to consult computerized CPGs linked to drug databases for easier integration into the healthcare process. Methods: Computable CPGs are structured as decision trees and coded in XML format. Recommendations related to drug classes are tagged with ATC codes. We use a mapping module to couple the computerized guidelines with a drug database that contains detailed information about each usable medication. In this way, therapeutic recommendations are backed up with current and up-to-date information from the database. Results: Two authoritative CPGs, originally diffused as static textual documents, have been implemented to validate the computerization process and to illustrate the usefulness of the resulting automated CPGs and their coupling with a drug database. We discuss the advantages of this approach for practitioners and the implications for both guideline developers and drug database providers. Other CPGs will be implemented and evaluated in real conditions by clinicians working in different health institutions.

  10. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
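    GeauxDock is built on the Monte Carlo algorithm; the core loop of any such engine is a Metropolis accept/reject search over poses. The sketch below is a toy one-dimensional stand-in, not GeauxDock's actual code: the "pose" is a single coordinate and the "energy" a simple function, whereas a real docking kernel perturbs rigid-body and torsional degrees of freedom and evaluates a physics- and knowledge-based scoring function.

```python
import math
import random

def metropolis_minimize(energy, x0, steps=5000, step_size=0.1,
                        temperature=1.0, seed=42):
    """Toy Metropolis Monte Carlo search: propose a random perturbation,
    always accept it if the energy drops, and accept uphill moves with
    probability exp(-dE/T). Tracks the best pose seen."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for _ in range(steps):
        cand = x + rng.uniform(-step_size, step_size)
        e_cand = energy(cand)
        de = e_cand - e
        if de <= 0 or rng.random() < math.exp(-de / temperature):
            x, e = cand, e_cand          # accept the move
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# A toy binding "funnel" with its minimum at x = 2.
best_x, best_e = metropolis_minimize(lambda x: (x - 2.0) ** 2, x0=0.0)
```

The heterogeneous-computing contribution of the paper lies in evaluating many such trajectories and energy terms in parallel on CPUs, Xeon Phi, and GPUs.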

  11. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    Directory of Open Access Journals (Sweden)

    Ye Fang

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of the large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Units (GPUs). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs, yielding near-ideal scaling. Further, using Xeon Phi gives a 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  12. Automated Cervical Screening and Triage, Based on HPV Testing and Computer-Interpreted Cytology.

    Science.gov (United States)

    Yu, Kai; Hyun, Noorie; Fetterman, Barbara; Lorey, Thomas; Raine-Bennett, Tina R; Zhang, Han; Stamps, Robin E; Poitras, Nancy E; Wheeler, William; Befano, Brian; Gage, Julia C; Castle, Philip E; Wentzensen, Nicolas; Schiffman, Mark

    2018-04-11

    State-of-the-art cervical cancer prevention includes human papillomavirus (HPV) vaccination among adolescents and screening/treatment of cervical precancer (CIN3/AIS and, less strictly, CIN2) among adults. HPV testing provides sensitive detection of precancer but, to reduce overtreatment, secondary "triage" is needed to predict women at highest risk. Those with the highest-risk HPV types or abnormal cytology are commonly referred to colposcopy; however, expert cytology services are critically lacking in many regions. To permit completely automatable cervical screening/triage, we designed and validated a novel triage method, a cytologic risk score algorithm based on computer-scanned liquid-based slide features (FocalPoint, BD, Burlington, NC). We compared it with abnormal cytology in predicting precancer among 1839 women testing HPV positive (HC2, Qiagen, Germantown, MD) in 2010 at Kaiser Permanente Northern California (KPNC). Precancer outcomes were ascertained by record linkage. As additional validation, we compared the algorithm prospectively with cytology results among 243 807 women screened at KPNC (2016-2017). All statistical tests were two-sided. Among HPV-positive women, the algorithm matched the triage performance of abnormal cytology. Combined with HPV16/18/45 typing (Onclarity, BD, Sparks, MD), the automatable strategy referred 91.7% of HPV-positive CIN3/AIS cases to immediate colposcopy while deferring 38.4% of all HPV-positive women to one-year retesting (compared with 89.1% and 37.4%, respectively, for typing and cytology triage). In the 2016-2017 validation, the predicted risk scores strongly correlated with cytology (P < .001). High-quality cervical screening and triage performance is achievable using this completely automated approach. Automated technology could permit extension of high-quality cervical screening/triage coverage to currently underserved regions.

  13. Machine Learning Approaches Toward Building Predictive Models for Small Molecule Modulators of miRNA and Its Utility in Virtual Screening of Molecular Databases.

    Science.gov (United States)

    Periwal, Vinita; Scaria, Vinod

    2017-01-01

    The ubiquitous role of microRNAs (miRNAs) in a number of pathological processes has suggested that they could act as potential drug targets. RNA-binding small molecules offer an attractive means of modulating miRNA function. The availability of public-domain bioassay data sets covering a variety of biological assays and molecules provides a new opportunity to build models from them and to apply those models in in silico virtual screening, prioritizing or assigning potential functions to small molecules. Here, we describe a machine learning-based computational strategy for creating predictive models from high-throughput biological screens and using them in virtual screening for small molecules with the potential to inhibit microRNAs. Such models could be used for computational prioritization of small molecules before performing high-throughput biological assays.
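    The paper trains classifiers on bioassay data; as a much simpler, hedged stand-in for that pipeline, the sketch below shows the similarity-based ranking commonly used as a baseline in ligand-based virtual screening. Compounds are represented as sets of "on" fingerprint bits and ranked by Tanimoto similarity to a known active; all fingerprints and molecule names here are hypothetical.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two binary fingerprints, represented
    as sets of 'on' bit positions: |A ∩ B| / |A ∪ B|."""
    a, b = set(fp_a), set(fp_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_by_similarity(query_fp, library):
    """Rank a small-molecule library by fingerprint similarity to a known
    active - a ligand-based stand-in for the trained classifiers the
    abstract describes."""
    return sorted(library, key=lambda item: tanimoto(query_fp, item[1]),
                  reverse=True)

active = {1, 4, 7, 9}                 # fingerprint of a known miRNA modulator
library = [
    ("mol_a", {1, 4, 7, 8}),          # fairly similar to the active
    ("mol_b", {2, 3, 5}),             # dissimilar
    ("mol_c", {1, 4, 9}),             # most similar
]
ranked = rank_by_similarity(active, library)
```

A trained model replaces the single-query similarity with a decision function learned from many labeled actives and inactives, but the screening loop, scoring every library compound and ranking, is the same.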

  14. Requirements for a system to analyze HEP events using database computing

    International Nuclear Information System (INIS)

    May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Day, C.T.; Loken, S.; MacFarlane, J.F.; Baden, A.

    1992-01-01

    We describe the requirements for the design and prototyping of an object-oriented database for analyzing data in high energy physics. Our goal is to satisfy the data processing and analysis needs of a generic high energy physics experiment proposed for the Superconducting SuperCollider (SSC), which requires the collection and analysis of between 10 and 100 million sets of vectors (events), each approximately one megabyte in length. We sketch how this analysis would proceed using an object-oriented database that supports the basic data types used in HEP.

  15. The Effect of Relational Database Technology on Administrative Computing at Carnegie Mellon University.

    Science.gov (United States)

    Golden, Cynthia; Eisenberger, Dorit

    1990-01-01

    Carnegie Mellon University's decision to standardize its administrative system development efforts on relational database technology and structured query language is discussed and its impact is examined in one of its larger, more widely used applications, the university information system. Advantages, new responsibilities, and challenges of the…

  16. PRIDE and "Database on Demand" as valuable tools for computational proteomics.

    Science.gov (United States)

    Vizcaíno, Juan Antonio; Reisinger, Florian; Côté, Richard; Martens, Lennart

    2011-01-01

    The Proteomics Identifications Database (PRIDE, http://www.ebi.ac.uk/pride ) provides users with the ability to explore and compare mass spectrometry-based proteomics experiments that reveal details of the protein expression found in a broad range of taxonomic groups, tissues, and disease states. A PRIDE experiment typically includes identifications of proteins, peptides, and protein modifications. Additionally, many of the submitted experiments also include the mass spectra that provide the evidence for these identifications. Finally, one of the strongest advantages of PRIDE in comparison with other proteomics repositories is the amount of metadata it contains, a key point to put the above-mentioned data in biological and/or technical context. Several informatics tools have been developed in support of the PRIDE database. The most recent one is called "Database on Demand" (DoD), which allows custom sequence databases to be built in order to optimize the results from search engines. We describe the use of DoD in this chapter. Additionally, in order to show the potential of PRIDE as a source for data mining, we also explore complex queries using federated BioMart queries to integrate PRIDE data with other resources, such as Ensembl, Reactome, or UniProt.

  17. Lung cancer screening beyond low-dose computed tomography: the role of novel biomarkers.

    Science.gov (United States)

    Hasan, Naveed; Kumar, Rohit; Kavuru, Mani S

    2014-10-01

    Lung cancer is the most common and lethal malignancy in the world. The landmark National Lung Screening Trial (NLST) showed a 20% relative reduction in mortality in high-risk individuals screened with low-dose computed tomography. However, the poor specificity and low prevalence of lung cancer in the NLST pose major limitations to its widespread use. Furthermore, a lung nodule on CT scan requires a nuanced and individualized approach to management. In this regard, advances in high-throughput technology (molecular diagnostics, multi-gene chips, proteomics, and bronchoscopic techniques) have led to the discovery of lung cancer biomarkers that have shown potential to complement the current screening standards. Early detection of lung cancer can be achieved by analysis of biomarkers from tissue samples within the respiratory tract, such as sputum, saliva, nasal/bronchial airway epithelial cells, and exhaled breath condensate, or through peripheral biofluids such as blood, serum, and urine. Autofluorescence bronchoscopy has been employed in the research setting to identify pre-invasive lesions not visible on CT scan. Although these modalities are not yet commercially available in the clinical setting, they will be in the near future, and clinicians who care for patients with lung cancer should be aware of them. In this review, we present the current state of biomarker development, discuss their clinical relevance, and predict their future role in lung cancer management.

  18. Redesign of a computerized clinical reminder for colorectal cancer screening: a human-computer interaction evaluation

    Directory of Open Access Journals (Sweden)

    Saleem Jason J

    2011-11-01

    Abstract. Background: Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods: In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, a workload assessment instrument, and a workflow integration survey. We also collected qualitative data on both designs. Results: Design enhancements to the VHA's existing CRC screening clinical reminder positively affected aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements, with specific suggestions for improving the reminder further. Conclusions: This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice.

  19. Implementation of depression screening in antenatal clinics through tablet computers: results of a feasibility study.

    Science.gov (United States)

    Marcano-Belisario, José S; Gupta, Ajay K; O'Donoghue, John; Ramchandani, Paul; Morrison, Cecily; Car, Josip

    2017-05-10

    Mobile devices may facilitate depression screening in the waiting area of antenatal clinics. This can present implementation challenges, of which we focused on survey layout and technology deployment. We assessed the feasibility of using tablet computers to administer a socio-demographic survey, the Whooley questions, and the Edinburgh Postnatal Depression Scale (EPDS) to 530 pregnant women attending National Health Service (NHS) antenatal clinics across England. We randomised participants to one of two layout versions of these surveys: (i) a scrolling layout, where each survey was presented on a single screen; or (ii) a paging layout, where only one question appeared on the screen at any given time. Overall, 85.10% of eligible pregnant women agreed to take part. Of these, 90.95% completed the study procedures. Approximately 23% of participants answered Yes to at least one Whooley question, and approximately 13% scored 10 points or more on the EPDS. We observed no association between survey layout and the responses given to the Whooley questions, the median EPDS scores, the number of participants at increased risk of self-harm, or the number of participants asking for technical assistance. However, we observed a difference in the number of participants at each EPDS scoring interval (p = 0.008), which provides an indication of a woman's risk of depression. A scrolling layout resulted in faster completion times (median = 4 min 46 s) than a paging layout (median = 5 min 33 s) (p = 0.024), although the clinical significance of this difference (47.5 s) is yet to be determined. Tablet computers can be used for depression screening in the waiting area of antenatal clinics. This requires careful consideration of clinical workflows and of technology-related issues such as connectivity and security. The association between survey layout and EPDS scoring intervals needs to be explored further to determine if it corresponds to a survey layout effect.

  20. Initial screening test for blunt cerebrovascular injury: Validity assessment of whole-body computed tomography.

    Science.gov (United States)

    Laser, Adriana; Kufera, Joseph A; Bruns, Brandon R; Sliker, Clint W; Tesoriero, Ronald B; Scalea, Thomas M; Stein, Deborah M

    2015-09-01

    Our whole-body computed tomography (WBCT) protocol, used to image patients with polytrauma, consists of a noncontrast head computed tomography (CT) followed by a multidetector computed tomography (40- or 64-slice) that includes an intravenous, contrast-enhanced scan from the face through the pelvis. WBCT is used to screen for blunt cerebrovascular injury (BCVI) during initial CT imaging of the patient with polytrauma and allows for early initiation of therapy with the goal of avoiding stroke. WBCT has not been directly compared with CT angiography (CTA) of the neck as a screening tool for BCVI. We hypothesize that WBCT is a valid modality to diagnose BCVI compared with neck CTA, thus screening patients with polytrauma for BCVI and limiting the need for subsequent CTA. A retrospective review of the trauma registry was conducted for all patients diagnosed with BCVI from June 2009 to June 2013 at our institution. All injuries, identified and graded on initial WBCT, were compared with neck CTA imaging performed within the first 72 hours. Sensitivity was calculated for WBCT with CTA as the reference standard. Proportions of agreement also were calculated between the grades of injury for both imaging modalities. A total of 319 injured vessels were identified in 227 patients. On initial WBCT, 80 (25%) of the injuries were grade I, 75 (24%) grade II, 45 (14%) grade III, 41 (13%) grade IV, and 58 (18%) were classified as indeterminate: 27 vertebral and 31 carotid lesions. Twenty (6%) of the 319 injuries were not detected on WBCT but identified on subsequent CTA (9 grade I, 7 grade II, 4 grade III); 6 vertebral and 14 carotid. For each vessel type and for all vessels combined, WBCT demonstrated sensitivity rates of over 90% to detect BCVI among the population of patients with at least one vessel injured. There was concordant grading of injuries between WBCT and initial diagnostic CTA in 154 (48% of all injuries). Lower-grade injuries were more discordant than higher-grade injuries.

  1. Computer-assisted static/dynamic renal imaging: a screening test for renovascular hypertension

    International Nuclear Information System (INIS)

    Keim, H.J.; Johnson, P.M.; Vaughan, E.D. Jr.; Beg, K.; Follett, D.A.; Freeman, L.M.; Laragh, J.H.

    1979-01-01

    Computer-assisted static/dynamic renal imaging with [197Hg]chlormerodrin and [99mTc]pertechnetate was evaluated prospectively as a screening test for renovascular hypertension. Results are reported for 51 patients: 33 with benign essential hypertension and 18 with renovascular hypertension, and for 21 normal controls. All patients underwent renal arteriography. Patients with significant obesity, renal insufficiency, or renoparenchymal disease were excluded from this study. Independent visual analyses of renal gamma images and time-activity transit curves identified 17 of the 18 patients with renovascular hypertension; one study was equivocal. There were five equivocal and three false-positive results in the essential hypertension and normal control groups. The sensitivity of the method was 94% and the specificity 85%. Since the prevalence of the renovascular subset of hypertension is approximately 5%, the predictive value is only 25%. Inclusion of computer-generated data did not improve this result. Accordingly, this method is not recommended as a primary screening test for renovascular hypertension.
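    The predictive-value arithmetic in this abstract follows directly from Bayes' rule: a positive predictive value is the share of true positives among all positive results. A short check reproduces the reported figure from the abstract's own numbers (sensitivity 94%, specificity 85%, prevalence about 5%).

    ```python
    def positive_predictive_value(sensitivity, specificity, prevalence):
        """PPV via Bayes' rule: true positives over all positives."""
        true_pos = sensitivity * prevalence            # diseased and test-positive
        false_pos = (1 - specificity) * (1 - prevalence)  # healthy but test-positive
        return true_pos / (true_pos + false_pos)

    ppv = positive_predictive_value(0.94, 0.85, 0.05)
    # ppv is approximately 0.25, matching the abstract's "only 25%"
    ```

    This is why a test with respectable sensitivity and specificity can still perform poorly as a primary screen: at 5% prevalence, the 15% false-positive rate among the large healthy majority swamps the true positives.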

  2. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans.

    Science.gov (United States)

    Ritchie, Alexander J; Sanghera, Calvin; Jacobs, Colin; Zhang, Wei; Mayo, John; Schmidt, Heidi; Gingras, Michel; Pasian, Sergio; Stewart, Lori; Tsai, Scott; Manos, Daria; Seely, Jean M; Burrowes, Paul; Bhatia, Rick; Atkar-Khattra, Sukhinder; van Ginneken, Bram; Tammemagi, Martin; Tsao, Ming Sound; Lam, Stephen

    2016-05-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in which a technician assisted by computer vision (CV) software acts as a first reader with the aim to improve speed, consistency, and quality of scan interpretation. Without knowledge of the diagnosis, a technician reviewed 828 randomly batched scans (136 with lung cancers, 556 with benign nodules, and 136 without nodules) from the baseline Pan-Canadian Early Detection of Lung Cancer Study that had been annotated by the CV software CIRRUS Lung Screening (Diagnostic Image Analysis Group, Nijmegen, The Netherlands). The scans were classified as either normal (no nodules ≥1 mm or benign nodules) or abnormal (nodules or other abnormality). The results were compared with the diagnostic interpretation by Pan-Canadian Early Detection of Lung Cancer Study radiologists. The overall sensitivity and specificity of the technician in identifying an abnormal scan were 97.8% (95% confidence interval: 96.4-98.8) and 98.0% (95% confidence interval: 89.5-99.7), respectively. Of the 112 prevalent nodules that were found to be malignant in follow-up, 92.9% were correctly identified by the technician plus CV compared with 84.8% by the study radiologists. The average time taken by the technician to review a scan after CV processing was 208 ± 120 seconds. Prescreening CV software and a technician as first reader is a promising strategy for improving the consistency and quality of screening interpretation of LDCT scans. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  3. Blunt cerebrovascular injury screening with 64-channel multidetector computed tomography: more slices finally cut it.

    Science.gov (United States)

    Paulus, Elena M; Fabian, Timothy C; Savage, Stephanie A; Zarzaur, Ben L; Botta, Vandana; Dutton, Wesley; Croce, Martin A

    2014-02-01

    Aggressive screening to diagnose blunt cerebrovascular injury (BCVI) results in early treatment, leading to improved outcomes and reduced stroke rates. While computed tomographic angiography (CTA) has been widely adopted for BCVI screening, evidence of its diagnostic sensitivity is marginal. Previous work from our institution using 32-channel multidetector CTA in 684 patients demonstrated an inadequate sensitivity of 51% (Ann Surg. 2011;253:444-450). Digital subtraction angiography (DSA) continues to be the reference standard of diagnosis but has significant drawbacks of invasiveness and resource demands. There have been continued advances in CT technology, and this is the first report of an extensive experience with 64-channel multidetector CTA. Patients screened for BCVI using CTA and DSA (reference) at a Level 1 trauma center during the 12-month period ending in May 2012 were identified. Results of CTA and DSA, complications, and strokes were retrospectively reviewed and compared. A total of 594 patients met criteria for BCVI screening and underwent both CTA and DSA. One hundred twenty-eight patients (22% of those screened) had 163 injured vessels: 99 (61%) carotid artery injuries and 64 (39%) vertebral artery injuries. Sixty-four-channel CTA demonstrated an overall sensitivity per vessel of 68% and specificity of 92%. The 52 false-negative findings on CTA comprised 34 carotid artery injuries and 18 vertebral artery injuries; 32 (62%) were Grade I injuries. Overall, positive predictive value was 36.2%, and negative predictive value was 97.5%. Six procedure-related complications (1%) occurred with DSA, including two iatrogenic dissections and one stroke. Sixty-four-channel CTA demonstrated a significantly improved sensitivity of 68% versus the 51% previously reported for the 32-channel CTA (p = 0.0075). Sixty-two percent of the false-negative findings occurred with low-grade injuries. Considering complications, cost, and resource demand associated with

  4. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    Science.gov (United States)

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and to the contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly available.

  5. Cyclone: java-based querying and computing with Pathway/Genome databases.

    Science.gov (United States)

    Le Fèvre, François; Smidtas, Serge; Schächter, Vincent

    2007-05-15

    Cyclone aims at facilitating the use of BioCyc, a collection of Pathway/Genome Databases (PGDBs). Cyclone provides a fully extensible Java Object API to analyze and visualize these data. Cyclone can read and write PGDBs, and can write its own data in the CycloneML format. This format is automatically generated from the BioCyc ontology by Cyclone itself, ensuring continued compatibility. Cyclone objects can also be stored in a relational database CycloneDB. Queries can be written in SQL, and in an intuitive and concise object-oriented query language, Hibernate Query Language (HQL). In addition, Cyclone interfaces easily with Java software including the Eclipse IDE for HQL edition, the Jung API for graph algorithms or Cytoscape for graph visualization. Cyclone is freely available under an open source license at: http://sourceforge.net/projects/nemo-cyclone. For download and installation instructions, tutorials, use cases and examples, see http://nemo-cyclone.sourceforge.net.

  6. A computer network system for mutual usage four databases of nuclear materials (Data-Free-Way)

    International Nuclear Information System (INIS)

    Fujita, M.; Kurihara, Y.; Shindou, M.; Yokoyama, N.; Tachi, Y.; Kano, S.; Iwata, S.

    1996-01-01

    A distributed database system for advanced nuclear materials, named 'Data-Free-Way', has been developed by the National Research Institute for Metals (NRIM), the Japan Atomic Energy Research Institute (JAERI), and the Power Reactor and Nuclear Fuel Development Corporation (PNC) under a cooperation agreement between these three organizations. The paper describes the features and functions of the system, including its input data, together with the method used to share the databases among the three organizations, as well as examples of easily accessible searches of material properties. Results of an analysis of tensile and creep property data on type 316 stainless steel, collected by the different organizations and stored in the present system, are also introduced as an example of attractive utilization of the system. Moreover, with the near-future system in mind, trials of WWW servers at several 'Data-Free-Way' sites to supply information on nuclear materials to the Internet are introduced. (author)

  7. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    Science.gov (United States)

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  8. The power of an ontology-driven developmental toxicity database for data mining and computational modeling

    Science.gov (United States)

    Modeling of developmental toxicology presents a significant challenge to computational toxicology due to endpoint complexity and lack of data coverage. These challenges largely account for the relatively few modeling successes using the structure–activity relationship (SAR) parad...

  9. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    Science.gov (United States)

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

    Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested to assess theoretically these computed tools: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encloses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    Science.gov (United States)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review of electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of the pulse current occurring in electrochemical machining of aviation materials have been studied. By integrating the experimental results with comprehensive modeling of electrochemical machining process data, a subsystem for computer-aided design of electrochemical machining of gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  11. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource, indeed it guides all business decisions in most of the computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This issue is especially true in cloud computing environments, where data owners cannot control fundamental data aspects, like the physical storage of data and the control of its accesses. Blockchain has recently emerged as a fascinati...

  12. Comparison of District-level Smoking Prevalence and Their Income Gaps from Two National Databases: the National Health Screening Database and the Community Health Survey in Korea, 2009-2014.

    Science.gov (United States)

    Kim, Ikhan; Bahk, Jinwook; Kim, Yeon Yong; Lee, Jeehye; Kang, Hee Yeon; Lee, Juyeon; Yun, Sung Cheol; Park, Jong Heon; Shin, Soon Ae; Khang, Young Ho

    2018-02-05

    We compared the age-standardized prevalence of cigarette smoking and its income gaps at the district level in Korea using the National Health Screening Database (NHSD) and the Community Health Survey (CHS). Between 2009 and 2014, 39,049,485 subjects participating in the NHSD and 989,292 participants in the CHS were analyzed. The age-standardized prevalence of smoking and its interquintile income differences were calculated for 245 districts of Korea. We examined between-period correlations for the age-standardized smoking prevalence at the district level and investigated the district-level differences in smoking prevalence and income gaps between the two databases. The between-period correlation coefficients of smoking prevalence for both genders were 0.92-0.97 in the NHSD and 0.58-0.69 in the CHS. When using the NHSD, we found significant income gaps in all districts for men and in 244 districts for women. However, when the CHS was analyzed, only 167 and 173 districts for men and women, respectively, showed significant income gaps. While the correlation coefficients of district-level smoking prevalence from the two databases were 0.87 for men and 0.85 for women, a relatively weak correlation between the income gaps from the two databases was found. Based on the two databases, income gaps in smoking prevalence were evident for nearly all districts of Korea. Because of the large sample size for each district, the NHSD may provide stable district-level smoking prevalence and income gap estimates and should therefore be considered a valuable data source for monitoring district-level smoking prevalence and its socioeconomic inequality. © 2018 The Korean Academy of Medical Sciences.
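    The district-level comparisons above rest on direct age standardization: weighting stratum-specific prevalences by a standard population's age distribution so that districts with different age structures become comparable. A minimal sketch, with age bands and weights invented for illustration (not taken from either Korean database):

    ```python
    def age_standardized_prevalence(stratum_prev, standard_pop):
        """Direct standardization: weight each age stratum's prevalence by the
        standard population's share of that stratum.

        stratum_prev: {age_band: prevalence in the study district}
        standard_pop: {age_band: count in the chosen standard population}
        """
        total = sum(standard_pop.values())
        return sum(stratum_prev[band] * count
                   for band, count in standard_pop.items()) / total

    # Hypothetical district: smoking is commoner among the younger stratum,
    # but the standard population is weighted toward the older stratum.
    prev = {"30-39": 0.4, "40-49": 0.2}
    std = {"30-39": 100, "40-49": 300}
    rate = age_standardized_prevalence(prev, std)  # (0.4*100 + 0.2*300) / 400 = 0.25
    ```

    Interquintile income gaps are then simply the difference between this standardized rate computed within the lowest and highest income quintiles of a district.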

  13. MOLA: a bootable, self-configuring system for virtual screening using AutoDock4/Vina on computer clusters

    Directory of Open Access Journals (Sweden)

    Abreu Rui MV

    2010-10-01

    Abstract. Background: Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large-scale virtual screening is time demanding and usually requires dedicated computer clusters. A number of software tools perform virtual screening using AutoDock4, but they require access to dedicated Linux computer clusters, and no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical user interface tool that automates parallel virtual screening using AutoDock4 and/or Vina in bootable, non-dedicated computer clusters. Implementation: MOLA automates several tasks including ligand preparation, parallel AutoDock4/Vina job distribution, and result analysis. When the virtual screening project finishes, an OpenOffice spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All result files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized live-CD GNU/Linux operating system, developed by us, that bypasses the original operating system installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters other computers as slave nodes via Ethernet connections. Conclusion: MOLA is an ideal virtual screening tool for non-experienced users with a limited number of multi-platform, heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual screening project finishes, the computers can just be restarted to their original operating system. The originality of MOLA lies in the fact that any platform-independent computer available can be added to the cluster, without ever using the computer's hard-disk drive and without interfering with the installed operating system. With a cluster of 10 processors, and a
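    The job-distribution core of such a tool can be sketched with a worker pool that fans docking jobs out and then ranks the results by binding energy, the way MOLA's spreadsheet output does. The dock() stub below stands in for an AutoDock4/Vina invocation, and its "energies" are fabricated from the ligand name purely so the example runs; it is not MOLA's implementation.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def dock(ligand):
        """Placeholder for an AutoDock4/Vina run on one ligand.
        The returned 'binding energy' is mocked from the name length."""
        return ligand, -float(len(ligand))

    def parallel_screen(ligands, workers=4):
        """Distribute docking jobs across a pool of workers and rank
        results by (mock) binding energy, most negative first."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            results = list(pool.map(dock, ligands))
        return sorted(results, key=lambda r: r[1])
    ```

    In a real deployment each worker would be a slave node launching a docking subprocess rather than a thread, but the pattern (embarrassingly parallel jobs, then a single ranked merge) is the same.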

  14. Evaluation of Computational Docking to Identify Pregnane X Receptor Agonists in the ToxCast Database

    OpenAIRE

    Kortagere, Sandhya; Krasowski, Matthew D.; Reschly, Erica J.; Venkatesh, Madhukumar; Mani, Sridhar; Ekins, Sean

    2010-01-01

    Background The pregnane X receptor (PXR) is a key transcriptional regulator of many genes [e.g., cytochrome P450s (CYP2C9, CYP3A4, CYP2B6), MDR1] involved in xenobiotic metabolism and excretion. Objectives As part of an evaluation of different approaches to predict compound affinity for nuclear hormone receptors, we used the molecular docking program GOLD and a hybrid scoring scheme based on similarity weighted GoldScores to predict potential PXR agonists in the ToxCast database of pesticides...

  15. The Emdros Text Database Engine as a Platform for Persuasive Computing

    DEFF Research Database (Denmark)

    Sandborg-Petersen, Ulrik

    2013-01-01

    This paper describes the nature and scope of Emdros, a text database engine for annotated text. Three case studies of persuasive learning systems using Emdros as an important architectural component are described, and their status as to participation in the three legs of BJ Fogg's Functional Triad... of Persuasive Design is assessed. Various properties of Emdros are discussed, both with respect to competing systems and with respect to the three case studies. It is argued that these properties together enable Emdros to form part of the foundation for a large class of systems whose primary function involves

  16. Contaminant screening of wastewater with HPLC-IM-qTOF-MS and LC+LC-IM-qTOF-MS using a CCS database.

    Science.gov (United States)

    Stephan, Susanne; Hippler, Joerg; Köhler, Timo; Deeb, Ahmad A; Schmidt, Torsten C; Schmitz, Oliver J

    2016-09-01

    Non-target analysis has become an important tool in the field of water analysis, since a broad variety of pollutants from different sources are released to the water cycle. For the identification of compounds in such complex samples, liquid chromatography coupled to high-resolution mass spectrometry is often used. The introduction of ion mobility spectrometry provides an additional separation dimension and allows determination of the collision cross sections (CCS) of the analytes as a further physicochemical constant supporting identification. A CCS database with more than 500 standard substances, including drug-like compounds and pesticides, was used for the CCS database search in this work. A non-target analysis of a wastewater sample was initially performed with high-performance liquid chromatography (HPLC) coupled to an ion mobility-quadrupole-time-of-flight mass spectrometer (IM-qTOF-MS). A database search including exact mass (±5 ppm) and CCS (±1%) delivered 22 different compounds. Furthermore, the same sample was analyzed with a two-dimensional LC method, called LC+LC, developed in our group for coupling to IM-qTOF-MS. This four-dimensional separation platform revealed 53 different compounds in the examined wastewater sample, identified via exact mass and CCS. It is demonstrated that the CCS database can also help to distinguish between isobaric structures, exemplified for cyclophosphamide and ifosfamide. Graphical Abstract Scheme of sample analysis and database screening.
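The database search described above (exact mass within ±5 ppm and CCS within ±1%) can be sketched as follows; the library entries and measured values are illustrative placeholders, not data from the study:

```python
# Illustrative sketch of a CCS database search: a detected feature matches a
# library entry when its m/z is within +/-5 ppm of the entry's exact mass AND
# its CCS is within +/-1 % of the library CCS. Values below are invented.

def match_feature(mz, ccs, library, ppm_tol=5.0, ccs_tol_pct=1.0):
    """Return names of library entries consistent with measured m/z and CCS."""
    hits = []
    for name, lib_mz, lib_ccs in library:
        ppm_error = abs(mz - lib_mz) / lib_mz * 1e6
        ccs_error_pct = abs(ccs - lib_ccs) / lib_ccs * 100.0
        if ppm_error <= ppm_tol and ccs_error_pct <= ccs_tol_pct:
            hits.append(name)
    return hits

library = [
    ("cyclophosphamide", 261.0432, 157.9),  # illustrative m/z and CCS values
    ("ifosfamide",       261.0432, 160.5),  # isobaric with cyclophosphamide
]

# An isobaric pair cannot be distinguished by exact mass alone, but the
# 1 % CCS window separates the two entries:
print(match_feature(261.0435, 158.2, library))  # -> ['cyclophosphamide']
```

This mirrors how the CCS dimension resolves the cyclophosphamide/ifosfamide ambiguity mentioned in the abstract.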

  17. The computer coordination method and research of inland river traffic based on ship database

    Science.gov (United States)

    Liu, Shanshan; Li, Gen

    2018-04-01

    A computer-coordinated management method for inland river ship traffic is proposed in this paper. Ships' positions, speeds and other navigation information are obtained via VTS, static and dynamic ship databases are built, and a program for computer-coordinated management of inland river traffic is written in VB. The meeting situations of ships are automatically simulated and calculated, and long-distance collision-avoidance information is provided to the ships, so that long-distance collision avoidance can be realized. The results show that ships avoid or reduce encounters, and that this method can effectively manage ship collision avoidance at the macro level.

  18. ECOS: a configurable, multi-terabyte database supporting engineering and technical computing at Sizewell B

    International Nuclear Information System (INIS)

    Binns, F.; Fish, A.

    1992-01-01

    One of the three main classes of computing support systems is concerned with the technical and engineering aspects of Sizewell-B power station. These aspects are primarily concerned with engineering means to optimise plant use to maximise power output by increasing availability and efficiency. At Sizewell-B the Engineering Computer system (ECOS) will provide the necessary support facilities, and is described. ECOS is being used by the station commissioning team and for monitoring the state of some plant already in service. (Author)

  19. A computer-tailored intervention to promote informed decision making for prostate cancer screening among African American men.

    Science.gov (United States)

    Allen, Jennifer D; Mohllajee, Anshu P; Shelton, Rachel C; Drake, Bettina F; Mars, Dana R

    2009-12-01

    African American men experience a disproportionate burden of prostate cancer (CaP) morbidity and mortality. National screening guidelines advise men to make individualized screening decisions through a process termed informed decision making (IDM). In this pilot study, a computer-tailored decision-aid designed to promote IDM was evaluated using a pre-/posttest design. African American men aged 40 years and older were recruited from a variety of community settings (n = 108). At pretest, 43% of men reported having made a screening decision; at posttest 47% reported this to be the case (p = .39). Significant improvements were observed between pre- and posttest on scores of knowledge, decision self-efficacy, and decisional conflict. Men were also more likely to want an active role in decision making after using the tool. These results suggest that use of a computer-tailored decision aid is a promising strategy to promote IDM for CaP screening among African American men.

  20. Computer-aided diagnostics of screening mammography using content-based image retrieval

    Science.gov (United States)

    Deserno, Thomas M.; Soiron, Michael; de Oliveira, Júlia E. E.; de A. Araújo, Arnaldo

    2012-03-01

    Breast cancer is one of the main causes of death among women in occidental countries. In the last years, screening mammography has been established worldwide for early detection of breast cancer, and computer-aided diagnostics (CAD) is being developed to assist physicians reading mammograms. A promising method for CAD is content-based image retrieval (CBIR). Recently, we have developed a classification scheme for suspicious tissue patterns based on the support vector machine (SVM). In this paper, we continue moving towards automatic CAD of screening mammography. The experiments are based on a total of 10,509 radiographs that have been collected from different sources. Of these, 3,375 images are provided with one and 430 radiographs with more than one chain-code annotation of cancerous regions. In different experiments, this data is divided into 12 and 20 classes, distinguishing between four categories of tissue density, three categories of pathology, and, in the 20-class problem, two categories of lesion type. Balancing the number of images in each class yields 233 and 45 images remaining in each of the 12 and 20 classes, respectively. Using a two-dimensional principal component analysis, features are extracted from small patches of 128 x 128 pixels and classified by means of an SVM. Overall, the accuracy of the raw classification was 61.6% and 52.1% for the 12- and the 20-class problem, respectively. The confusion matrices are assessed for detailed analysis. Furthermore, an implementation of an SVM-based CBIR system for CADx in screening mammography is presented. In conclusion, with a smarter patch extraction, the CBIR approach might reach precision rates that are helpful for the physicians. This, however, needs more comprehensive evaluation on clinical data.
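The patch-classification pipeline described above can be sketched minimally as follows. This is not the study's implementation: the patches are synthetic, the patch size is reduced (16 x 16 instead of 128 x 128), and a nearest-centroid rule stands in for the SVM, so only the structure of the approach (PCA features, then a classifier) is illustrated:

```python
import numpy as np

# Sketch: extract PCA features from small image patches, then classify.
# Synthetic two-class data; nearest-centroid replaces the paper's SVM.

rng = np.random.default_rng(0)

def make_patch(cls, n=16):
    """Class 0: structured texture plus noise; class 1: pure noise."""
    patch = rng.normal(scale=0.3, size=(n, n))
    if cls == 0:
        xx = np.arange(n) / n
        patch += np.sin(2 * np.pi * xx)[None, :]
    return patch

labels = np.array([0] * 40 + [1] * 40)
X = np.array([make_patch(c).ravel() for c in labels])

# PCA: project the centered patches onto the top-5 principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
feats = Xc @ Vt[:5].T

# Nearest-centroid classifier in PCA space (SVM stand-in for brevity).
centroids = np.array([feats[labels == c].mean(axis=0) for c in (0, 1)])
dists = ((feats[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
pred = dists.argmin(axis=1)
accuracy = float((pred == labels).mean())
print(f"training accuracy: {accuracy:.2f}")
```

In practice the SVM (e.g. with a nonlinear kernel) handles the multi-class tissue/pathology categories far better than a centroid rule, which is why the paper uses it.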

  1. The NASA Ames Polycyclic Aromatic Hydrocarbon Infrared Spectroscopic Database : The Computed Spectra

    NARCIS (Netherlands)

    Bauschlicher, C. W.; Boersma, C.; Ricca, A.; Mattioda, A. L.; Cami, J.; Peeters, E.; de Armas, F. Sanchez; Saborido, G. Puerta; Hudgins, D. M.; Allamandola, L. J.

    The astronomical emission features, formerly known as the unidentified infrared bands, are now commonly ascribed to polycyclic aromatic hydrocarbons (PAHs). The laboratory experiments and computational modeling done at the NASA Ames Research Center to create a collection of PAH IR spectra relevant

  2. Computational screening of functionalized zinc porphyrins for dye sensitized solar cells

    DEFF Research Database (Denmark)

    Ørnsø, Kristian Baruël; García Lastra, Juan Maria; Thygesen, Kristian Sommer

    2013-01-01

    An efficient dye sensitized solar cell (DSSC) is one possible solution to meet the world's rapidly increasing energy demands and associated climate challenges. This requires inexpensive and stable dyes with well-positioned frontier energy levels for maximal solar absorption, efficient charge separation, and high output voltage. Here we demonstrate an extensive computational screening of zinc porphyrins functionalized with electron donating side groups and electron accepting anchoring groups. The trends in frontier energy levels versus side groups are analyzed and a no-loss DSSC level alignment quality is estimated. Out of the initial 1029 molecules, we find around 50 candidates with level alignment qualities within 5% of the optimal limit. We show that the level alignment of five zinc porphyrin dyes which were recently used in DSSCs with high efficiencies can be further improved by simple side...

  3. Value of three-dimensional computed tomography in screening cerebral aneurysms

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Tamaki; Sugiura, Yusuke; Suzuki, Atsushi; Yamagata, Yoshitaka [Hyogo Medical Coll. (Japan)

    1997-10-01

    We performed three-dimensional computed tomography (3D-CT) in 6 patients with cerebral aneurysms. Prior cerebral angiography showed a total of 17 aneurysms. 3D-CT alone detected 10 cerebral aneurysms (59%). It was possible to identify aneurysms larger than 10 mm even when located near the circle of Willis. It was difficult to identify aneurysms smaller than 7 mm regardless of their location. 3D-CT was of limited value in detecting cerebral aneurysms, particularly those located near the circle of Willis with its complex vascular network. As cases of oculomotor palsy may be caused by lesions other than cerebral aneurysm, we advocate that 3D-CT be performed after magnetic resonance imaging (MRI) when screening cases of suspected cerebral aneurysm. (author)

  4. Screening of photosynthetic pigments for herbicidal activity with a new computational molecular approach.

    Science.gov (United States)

    Krishnaraj, R Navanietha; Chandran, Saravanan; Pal, Parimal; Berchmans, Sheela

    2013-12-01

    There is immense interest among researchers in identifying new herbicides that are effective against weeds without affecting the environment. In this work, photosynthetic pigments are used as the ligands to predict their herbicidal activity. The enzyme 5-enolpyruvylshikimate-3-phosphate (EPSP) synthase is a good target for herbicides. Homology modeling of the target enzyme was done using Modeller 9.11 and the model was validated. Docking studies were performed with the AutoDock Vina algorithm to predict the binding of the natural pigments β-carotene, chlorophyll a, chlorophyll b, phycoerythrin and phycocyanin to the target. β-carotene, phycoerythrin and phycocyanin have higher binding energies, indicating herbicidal activity of these pigments. This work reports a procedure to screen herbicides with a computational molecular approach. These pigments may serve as potential bioherbicides in the future.

  5. Computational screening of new inorganic materials for highly efficient solar energy conversion

    DEFF Research Database (Denmark)

    Kuhar, Korina

    2017-01-01

    Despite the vast amounts of energy at our disposal, we are not able to harvest this solar energy efficiently. Currently, there are a few ways of converting solar power into usable energy, such as photovoltaics (PV) or photoelectrochemical generation of fuels (PC). PV processes in solar cells convert solar energy into electricity, and PC uses harvested energy to conduct chemical reactions, such as splitting water into oxygen and, more importantly, hydrogen, also known as the fuel of the future. Further progress in both PV and PC fields is mostly limited by the flaws in the materials that we have access to. In this work a high-throughput computational search for suitable absorbers for PV and PC applications is presented. A set of descriptors has been developed, such that each descriptor targets an important property or issue of a good solar energy conversion material. The screening study...

  6. Temporal analysis of laser beam propagation in the atmosphere using computer-generated long phase screens.

    Science.gov (United States)

    Dios, Federico; Recolons, Jaume; Rodríguez, Alejandro; Batet, Oscar

    2008-02-04

    Temporal analysis of the irradiance at the detector plane is intended as the first step in the study of the mean fade time in a free optical communication system. In the present work this analysis has been performed for a Gaussian laser beam propagating in the atmospheric turbulence by means of computer simulation. To this end, we have adapted a previously known numerical method to the generation of long phase screens. The screens are displaced in a transverse direction as the wave is propagated, in order to simulate the wind effect. The amplitude of the temporal covariance and its power spectrum have been obtained at the optical axis, at the beam centroid and at a certain distance from these two points. Results have been worked out for weak, moderate and strong turbulence regimes and when possible they have been compared with theoretical models. These results show a significant contribution of beam wander to the temporal behaviour of the irradiance, even in the case of weak turbulence. We have also found that the spectral bandwidth of the covariance is hardly dependent on the Rytov variance.
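The core ingredient of such simulations, a turbulent phase screen, can be generated with the standard FFT method sketched below. Note this is the ordinary square-screen variant for illustration, not the authors' long-screen method, and the grid size, Fried parameter r0, and pixel pitch are illustrative values:

```python
import numpy as np

# FFT-based Kolmogorov phase-screen sketch: filter white complex noise by the
# square root of a Kolmogorov phase power spectrum and inverse-transform.

def kolmogorov_screen(n=256, r0=0.1, dx=0.01, seed=0):
    rng = np.random.default_rng(seed)
    f = np.fft.fftfreq(n, d=dx)              # spatial frequencies [1/m]
    fxx, fyy = np.meshgrid(f, f)
    frad = np.hypot(fxx, fyy)
    frad[0, 0] = np.inf                       # suppress the undefined piston term
    psd = 0.023 * r0 ** (-5.0 / 3.0) * frad ** (-11.0 / 3.0)
    df = 1.0 / (n * dx)
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    screen = np.fft.ifft2(noise * np.sqrt(psd) * df) * n ** 2
    return screen.real                        # phase in radians

scr = kolmogorov_screen()
# Wind is simulated by translating a long screen across the beam; with a
# square screen, a cyclic shift such as np.roll(scr, 1, axis=1) plays that
# role at each time step.
print(scr.shape)
```

The long-screen method used in the paper avoids the periodicity artifacts that cyclic shifting of a square FFT screen introduces over long time series.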

  7. Computer-aided detection of pulmonary nodules: a comparative study using the public LIDC/IDRI database

    International Nuclear Information System (INIS)

    Jacobs, Colin; Prokop, Mathias; Rikxoort, Eva M. van; Ginneken, Bram van; Murphy, Keelin; Schaefer-Prokop, Cornelia M.

    2016-01-01

    To benchmark the performance of state-of-the-art computer-aided detection (CAD) of pulmonary nodules using the largest publicly available annotated CT database (LIDC/IDRI), and to show that CAD finds lesions not identified by the LIDC's four-fold double reading process. The LIDC/IDRI database contains 888 thoracic CT scans with a section thickness of 2.5 mm or lower. We report performance of two commercial and one academic CAD system. The influence of presence of contrast, section thickness, and reconstruction kernel on CAD performance was assessed. Four radiologists independently analyzed the false positive CAD marks of the best CAD system. The updated commercial CAD system showed the best performance with a sensitivity of 82 % at an average of 3.1 false positive detections per scan. Forty-five false positive CAD marks were scored as nodules by all four radiologists in our study. On the largest publicly available reference database for lung nodule detection in chest CT, the updated commercial CAD system locates the vast majority of pulmonary nodules at a low false positive rate. Potential for CAD is substantiated by the fact that it identifies pulmonary nodules that were not marked during the extensive four-fold LIDC annotation process. (orig.)

  8. A computational design approach for virtual screening of peptide interactions across K+ channel families

    Directory of Open Access Journals (Sweden)

    Craig A. Doupnik

    2015-01-01

    Full Text Available Ion channels represent a large family of membrane proteins, many of which are well-established targets in pharmacotherapy. The 'druggability' of heteromeric channels comprised of different subunits remains obscure, due largely to a lack of channel-specific probes necessary to delineate their therapeutic potential in vivo. Our initial studies, reported here, investigated the family of inwardly rectifying potassium (Kir) channels, given the availability of high-resolution crystal structures for the eukaryotic constitutively active Kir2.2 channel. We describe a 'limited' homology modeling approach that can yield chimeric Kir channels having an outer vestibule structure representing nearly any known vertebrate or invertebrate channel. These computationally derived channel structures were tested in silico for 'docking' to NMR structures of tertiapin (TPN), a 21-amino-acid peptide found in bee venom. TPN is a highly selective and potent blocker of the epithelial rat Kir1.1 channel, but does not block human or zebrafish Kir1.1 channel isoforms. Our Kir1.1 channel-TPN docking experiments recapitulated published in vitro findings for TPN-sensitive and TPN-insensitive channels. Additionally, in silico site-directed mutagenesis identified 'hot spots' within the channel outer vestibule that mediate energetically favorable docking scores and correlate with sites previously identified by in vitro thermodynamic mutant-cycle analysis. These 'proof-of-principle' results establish a framework for virtual screening of re-engineered peptide toxins for interactions with computationally derived Kir channels that currently lack channel-specific blockers. When coupled with electrophysiological validation, this virtual screening approach may accelerate the drug discovery process, and can be readily applied to other ion channel families where high-resolution structures are available.

  9. [Stroke mortality in Poland--role of observational studies based on computer databases].

    Science.gov (United States)

    Mazurek, Maciej

    2005-01-01

    Stroke is a leading cause of death worldwide and remains one of the major public health problems. Most European countries have experienced declines in stroke mortality in contrast to central and eastern European countries including Poland. The World Health Organization Data Bank is an invaluable source of information especially for mortality trends. Stroke mortality in Poland and some problems with accuracy of ICD coding for the identification of patients with acute stroke are discussed. Computerized databases are increasingly being used to identify patients with acute stroke for epidemiological, quality of care, and cost studies. More accurate methods of collecting and analysis of the data should be implemented to gain more information from these bases.

  10. Evolution of computational models in BioModels Database and the Physiome Model Repository.

    Science.gov (United States)

    Scharm, Martin; Gebhardt, Tom; Touré, Vasundra; Bagnacani, Andrea; Salehzadeh-Yazdi, Ali; Wolkenhauer, Olaf; Waltemath, Dagmar

    2018-04-12

    A useful model is one that is being (re)used. The development of a successful model does not finish with its publication. During reuse, models are being modified, i.e. expanded, corrected, and refined. Even small changes in the encoding of a model can, however, significantly affect its interpretation. Our motivation for the present study is to identify changes in models and make them transparent and traceable. We analysed 13734 models from BioModels Database and the Physiome Model Repository. For each model, we studied the frequencies and types of updates between its first and latest release. To demonstrate the impact of changes, we explored the history of a Repressilator model in BioModels Database. We observed continuous updates in the majority of models. Surprisingly, even the early models are still being modified. We furthermore detected that many updates target annotations, which improves the information one can gain from models. To support the analysis of changes in model repositories we developed MoSt, an online tool for visualisations of changes in models. The scripts used to generate the data and figures for this study are available from GitHub https://github.com/binfalse/BiVeS-StatsGenerator and as a Docker image at https://hub.docker.com/r/binfalse/bives-statsgenerator/ . The website https://most.bio.informatik.uni-rostock.de/ provides interactive access to model versions and their evolutionary statistics. The reuse of models is still impeded by a lack of trust and documentation. A detailed and transparent documentation of all aspects of the model, including its provenance, will improve this situation. Knowledge about a model's provenance can avoid the repetition of mistakes that others already faced. More insights are gained into how the system evolves from initial findings to a profound understanding. We argue that it is the responsibility of the maintainers of model repositories to offer transparent model provenance to their users.

  11. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
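The proxy-model idea above (a neural network trained on simulator output to predict production response from rock, fluid, and design parameters) can be sketched minimally as follows. The cascade feedforward architecture is simplified to a single hidden layer, and the synthetic target stands in for commercial-simulator output; input and output names are invented:

```python
import numpy as np

# Minimal proxy-model sketch: a one-hidden-layer feedforward network fit by
# full-batch gradient descent to a synthetic "simulator" response.

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))                      # e.g. porosity, perm, rate
y = (X @ np.array([0.5, -0.3, 0.4]) + 0.1).reshape(-1, 1)  # stand-in response

W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                # hidden layer
    err = (H @ W2 + b2) - y                 # prediction error
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)        # backpropagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float((((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2).mean())
print(f"final MSE: {mse:.4f}")
```

Once trained on simulator runs, such a network answers "what-if" screening queries in milliseconds, which is the time saving the study exploits.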

  12. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

    The current radiation effect assessment system is required the skillful technique about the application for various code and high level of special knowledge classified by field. Therefore, as a matter of fact, it is very difficult for the radiation users' who don't have enough special knowledge to assess or recognize the radiation effect properly. For this, we already have developed the five Computer codes(windows-based), that is the radiation effect assessment system, in radiation utilizing field including the nuclear power generation. It needs the computer program that non-specialist can use the five computer codes to have already developed with ease. So, we embodied the A.I-based specialist system that can infer the assessment system by itself, according to the characteristic of given problem. The specialist program can guide users, search data, inquire of administrator directly. Conceptually, with circumstance which user to apply the five computer code may encounter actually, we embodied to consider aspects as follows. First, the accessibility of concept and data to need must be improved. Second, the acquirement of reference theory and use of corresponding computer code must be easy. Third, Q and A function needed for solution of user's question out of consideration previously. Finally, the database must be renewed continuously. Actually, to express this necessity, we develop the client program to organize reference data, to build the access methodology(query) about organized data, to load the visible expression function of searched data. And It is embodied the instruction method(effective theory acquirement procedure and methodology) to acquire the theory referring the five computer codes. It is developed the data structure access program(DBMS) to renew continuously data with ease. For Q and A function, it is embodied the Q and A board within client program because the user of client program can search the content of question and answer. (authors)

  13. Efficient linking of birth certificate and newborn screening databases for laboratory investigation of congenital cytomegalovirus infection and preterm birth: Florida, 2008.

    Science.gov (United States)

    DePasquale, John M; Freeman, Karen; Amin, Minal M; Park, Sohyun; Rivers, Samantha; Hopkins, Richard; Cannon, Michael J; Dy, Bonifacio; Dollard, Sheila C

    2012-02-01

    The objectives of this study are (1) to design an accurate method for linking newborn screening (NBS) and state birth certificate databases to create a de-identified study database; (2) to assess maternal cytomegalovirus (CMV) seroprevalence by measuring CMV IgG in newborn dried blood spots; (3) to assess congenital CMV infection among newborns and possible association with preterm birth. NBS and birth databases were linked and patient records were de-identified. A stratified random sample of records based on gestational age was selected and used to retrieve blood spots from the state NBS laboratory. Serum containing maternal antibodies was eluted from blood spots and tested for the presence of CMV IgG. DNA was extracted from blood spots and tested for the presence of CMV DNA. Analyses were performed with bivariable and multivariable logistic regression models. Linkage rates and specimen collection exceeded 98% of the total possible, yielding a final database with 3,101 newborn blood spots. CMV seroprevalence was 91% among Black mothers, 83% among Hispanic mothers, 59% among White mothers, and decreased with increasing amounts of education. The prevalence of CMV infection in newborns was 0.45% and did not vary significantly by gestational age. Successful methods for database linkage, newborn blood spot collection, and de-identification of records can serve as a model for future congenital exposure surveillance projects. Maternal CMV seroprevalence was strongly associated with race/ethnicity and educational level. Congenital CMV infection rates were lower than those reported by other studies and lacked statistical power to examine associations with preterm birth.
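The two linkage steps described above (match records across databases, then strip identifiers) can be sketched as follows. This is an illustrative reconstruction, not the study's method: the field names, matching keys, and salt are all invented, and real linkage typically uses several fields plus probabilistic matching:

```python
import hashlib

# Sketch: deterministically link newborn-screening and birth-certificate
# records on shared fields, then replace identifiers with a salted one-way
# hash so the resulting study database is de-identified.

SALT = b"study-specific-secret"  # hypothetical; kept separate from the data

def link_and_deidentify(nbs_records, birth_records):
    births = {(r["dob"], r["mrn"]): r for r in birth_records}
    linked = []
    for r in nbs_records:
        match = births.get((r["dob"], r["mrn"]))
        if match:
            sid = hashlib.sha256(SALT + r["mrn"].encode()).hexdigest()[:12]
            linked.append({"study_id": sid,               # no direct identifier
                           "gest_age": match["gest_age"],
                           "spot_id": r["spot_id"]})
    return linked

nbs = [{"dob": "2008-03-01", "mrn": "A123", "spot_id": "S1"}]
births = [{"dob": "2008-03-01", "mrn": "A123", "gest_age": 36}]
print(link_and_deidentify(nbs, births))
```

The salted hash lets the same newborn be recognized across extracts without storing the medical record number itself.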

  14. Development and Usability Testing of a Computer-Tailored Decision Support Tool for Lung Cancer Screening: Study Protocol.

    Science.gov (United States)

    Carter-Harris, Lisa; Comer, Robert Skipworth; Goyal, Anurag; Vode, Emilee Christine; Hanna, Nasser; Ceppa, DuyKhanh; Rawl, Susan M

    2017-11-16

    Awareness of lung cancer screening remains low in the screening-eligible population, and when patients visit their clinician never having heard of lung cancer screening, engaging in shared decision making to arrive at an informed decision can be a challenge. Therefore, methods to effectively support both patients and clinicians in these important discussions are essential. To facilitate shared decision making about lung cancer screening, effective methods to prepare patients to have these discussions with their clinician are needed. Our objective is to develop a computer-tailored decision support tool that meets the certification criteria of the International Patient Decision Aid Standards instrument version 4.0 and that will support shared decision making in lung cancer screening decisions. Using a 3-phase process, we will develop and test a prototype of a computer-tailored decision support tool in a sample of lung cancer screening-eligible individuals. In phase I, we assembled a community advisory board comprising 10 screening-eligible individuals to develop the prototype. In phase II, we recruited a sample of 13 screening-eligible individuals to test the prototype for usability, acceptability, and satisfaction. In phase III, we are conducting a pilot randomized controlled trial (RCT) with 60 screening-eligible participants who have never been screened for lung cancer. Outcomes tested include lung cancer and screening knowledge; lung cancer screening health beliefs (perceived risk, perceived benefits, perceived barriers, and self-efficacy); perception of being prepared to engage in a patient-clinician discussion about lung cancer screening; occurrence of a patient-clinician discussion about lung cancer screening; and stage of adoption for lung cancer screening. Phases I and II are complete. Phase III is underway. As of July 15, 2017, 60 participants have been enrolled in the study and have completed the baseline survey, intervention, and first

  15. International Association for the Study of Lung Cancer Computed Tomography Screening Workshop 2011 report

    DEFF Research Database (Denmark)

    Field, John K; Smith, Robert A; Aberle, Denise R

    2011-01-01

    national screening programs; (iii) develop guidelines for the clinical work-up of "indeterminate nodules" resulting from CT screening programmers; (iv) guidelines for pathology reporting of nodules from lung cancer CT screening programs; (v) recommendations for surgical and therapeutic interventions...... of suspicious nodules identified through lung cancer CT screening programs; and (vi) integration of smoking cessation practices into future national lung cancer CT screening programs....

  16. Image processing algorithm of computer-aided diagnosis in lung cancer screening by CT

    International Nuclear Information System (INIS)

    Yamamoto, Shinji

    2004-01-01

    In this paper, an image processing algorithm for computer-aided diagnosis of lung cancer by X-ray CT is described, which has been developed by my research group over the past ten years or so. CT lung images gathered at the mass screening stage are almost all normal, and lung cancer nodules are found at a rate of less than 10%. To pick up such rare nodules with high accuracy, a very sensitive detection algorithm is required, one that can detect local and very slight variations in the image. On the other hand, such a sensitive detection algorithm has the side effect that many normal shadows are also detected as abnormal. In this paper I describe how to balance these conflicting requirements and realize a practical computer-aided diagnosis tool with the image processing algorithm developed by my research group. In particular, I focus on the principle and characteristics of the Quoit filter, which was newly developed by my group as a highly sensitive filter. (author)

  17. Geochemical databases. Part 1. Pmatch: a program to manage thermochemical data. Part 2. The experimental validation of geochemical computer models

    International Nuclear Information System (INIS)

    Pearson, F.J. Jr.; Avis, J.D.; Nilsson, K.; Skytte Jensen, B.

    1993-01-01

    This work is carried out under a cost-sharing contract with the European Atomic Energy Community in the framework of its programme on Management and Storage of Radioactive Wastes. Part 1, PMATCH: A Program to Manage Thermochemical Data, describes the development and use of a computer program by means of which new thermodynamic data from the literature may be referenced to a common frame and thereby become internally consistent with an existing database. The report presents the relevant thermodynamic expressions, and their use in the program is discussed. When there are not sufficient thermodynamic data available to describe a species' behaviour under all conceivable conditions, the problems arising are thoroughly discussed and the available data are handled by approximating expressions. Part 2, The Experimental Validation of Geochemical Computer Models, presents the results of experimental investigations of the equilibria established in aqueous suspensions of mixtures of carbonate minerals (calcium, magnesium, manganese and europium carbonates), compared with theoretical calculations made by means of the geochemical JENSEN program. The study revealed that the geochemical computer program worked well and that its database was of sufficient validity. However, it was observed that experimental difficulties could hardly be avoided when, as here, a gaseous component took part in the equilibria. Whereas the magnesium and calcium carbonates did not demonstrate mutual solid solubility, mixing manganese and calcium carbonates produced abnormal effects, resulting in a diminished solubility of both manganese and calcium. With tracer amounts of europium added to a suspension of calcite in sodium carbonate solutions, long-term experiments revealed a transition after 1-2 months whereby the tracer became more strongly adsorbed onto the calcite. The transition is interpreted as the nucleation and formation of a surface phase incorporating the 'species' NaEu(CO3)2.

  18. Analysis on Cloud Computing Database in Cloud Environment – Concept and Adoption Paradigm

    OpenAIRE

    Elena-Geanina ULARU; Florina PUICAN; Manole VELICANU

    2012-01-01

    With the development of the Internet's new technical functionalities, new concepts have started to take shape. These concepts have an important role, especially in the development of corporate IT. One such concept is "the Cloud". Various marketing campaigns have started to focus on the Cloud and began to promote it in different but confusing ways. These campaigns do little to explain what cloud computing is and why it is becoming increasingly necessary. The lack of understanding in this new tech...

  19. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had a 2% or greater lung cancer risk over 3 years, as estimated by a risk prediction tool, were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals at high risk for developing lung cancer using LDCT and the average initial cost of curative-intent treatment were lower than the average per-person cost of treating advanced-stage lung cancer, which infrequently results in a cure.
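The per-person figures above are means with normal-approximation 95% confidence intervals; the mechanics can be sketched as follows (the cost values are invented, not study data):

```python
import math
import statistics

def mean_ci95(values):
    """Sample mean with a normal-approximation 95% CI (mean +/- 1.96 * SE)."""
    m = statistics.mean(values)
    se = statistics.stdev(values) / math.sqrt(len(values))
    return m, m - 1.96 * se, m + 1.96 * se

# Hypothetical per-person screening costs (CAD); illustrative only
costs = [310, 420, 455, 520, 610, 380, 495, 440]
m, lo, hi = mean_ci95(costs)
print(f"${m:.0f} (95% CI, ${lo:.0f}-${hi:.0f})")
```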

  20. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension holds the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
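The daughter-product extension described above can be illustrated schematically: a parent's effective screening factor picks up branching-weighted contributions from its significant daughters. All nuclide names, branching fractions, and factors below are invented placeholders, not NCRP values, and the recursion is only a sketch of the idea:

```python
# Schematic effective screening factor: parent factor plus contributions
# from significant daughters, weighted by branching fraction.
# All numbers are illustrative placeholders, not NCRP data.
screening_factor = {"Parent-X": 2.0e-3, "Daughter-Y": 5.0e-2, "Daughter-Z": 1.0e-4}
daughters = {"Parent-X": [("Daughter-Y", 0.85), ("Daughter-Z", 0.15)]}

def effective_screening_factor(nuclide):
    """Fold branching-weighted daughter factors into the parent's factor."""
    sf = screening_factor[nuclide]
    for child, branching in daughters.get(nuclide, []):
        sf += branching * effective_screening_factor(child)
    return sf

print(effective_screening_factor("Parent-X"))
```

The point of such a combined factor is that a parent whose daughters dominate the dose need not be tracked separately from them.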

  1. Computed Tomographic Virtual Colonoscopy to Screen for Colorectal Neoplasia in asymptomatic adults

    International Nuclear Information System (INIS)

    Pickhardt, Perry J.; Choi, J Richard; Hwang, Inku and others

    2004-01-01

    We evaluated the performance characteristics of computed tomographic (CT) virtual colonoscopy for the detection of colorectal neoplasia in an average-risk screening population. A total of 1233 asymptomatic adults (mean age, 57.8 years) underwent same-day virtual and optical colonoscopy. Radiologists used the three-dimensional endoluminal display for the initial detection of polyps on CT virtual colonoscopy. For the initial examination of each colonic segment, the colonoscopists were unaware of the findings on virtual colonoscopy, which were revealed to them before any subsequent reexamination. The sensitivity and specificity of virtual colonoscopy and the sensitivity of optical colonoscopy were calculated with the use of the findings of the final, unblinded optical colonoscopy as the reference standard. The sensitivity of virtual colonoscopy for adenomatous polyps was 93.8 percent for polyps at least 10 mm in diameter, 93.9 percent for polyps at least 8 mm in diameter, and 88.7 percent for polyps at least 6 mm in diameter. The sensitivity of optical colonoscopy for adenomatous polyps was 87.5 percent, 91.5 percent, and 92.3 percent for the three sizes of polyps, respectively. The specificity of virtual colonoscopy for adenomatous polyps was 96.0 percent for polyps at least 10 mm in diameter, 92.2 percent for polyps at least 8 mm in diameter, and 79.6 percent for polyps at least 6 mm in diameter. Two polyps were malignant; both were detected on virtual colonoscopy, and one of them was missed on optical colonoscopy before the results on virtual colonoscopy were revealed. CT virtual colonoscopy with the use of a three-dimensional approach is an accurate screening method for the detection of colorectal neoplasia in asymptomatic average-risk adults and compares favorably with optical colonoscopy in terms of the detection of clinically relevant lesions.
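The sensitivities and specificities above follow from standard confusion-matrix arithmetic against the unblinded reference standard. A minimal sketch, with hypothetical counts chosen only to illustrate the formulas (not the study's actual cell counts):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one polyp-size threshold
sensitivity, specificity = sens_spec(tp=45, fn=3, tn=1150, fp=48)
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```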

  2. Usability testing of a respiratory interface using computer screen and facial expressions videos.

    Science.gov (United States)

    Oliveira, Ana; Pinho, Cátia; Monteiro, Sandra; Marcos, Ana; Marques, Alda

    2013-12-01

    Computer screen videos (CSVs) and users' facial expressions videos (FEVs) are recommended for evaluating system performance. However, software combining both methods is often not accessible in clinical research fields. The Observer-XT software is commonly used in clinical research to assess human behaviours. Thus, this study reports on the combination of CSVs and FEVs to evaluate a graphical user interface (GUI). Eight physiotherapists entered clinical information in the GUI while CSVs and FEVs were collected. The frequency and duration of a list of behaviours found in the FEVs were analysed using Observer-XT-10.5. Simultaneously, the frequency and duration of usability problems in the CSVs were manually registered. The CSV and FEV timelines were also matched to verify combinations. The analysis of FEVs revealed that the behaviour category most frequently observed was eye contact with the screen (ECS, 32±9), whilst verbal communication achieved the highest duration (14.8±6.9 min). Regarding the CSVs, 64 problems, related to the interface (73%) and the user (27%), were found. In total, 135 usability problems were identified by combining both methods. The majority were reported through verbal communication (45.8%) and ECS (40.8%). "False alarms" and "misses" did not cause quantifiable reactions, and the facial expression problems were mainly related to the lack of familiarity (55.4%) felt by users when interacting with the interface. These findings encourage the use of Observer-XT-10.5 to conduct small usability sessions, as it identifies emergent groups of problems by combining methods. However, to validate final versions of systems, further validation should be conducted using specialized software. © 2013 Published by Elsevier Ltd.

  3. The Study of Learners' Preference for Visual Complexity on Small Screens of Mobile Computers Using Neural Networks

    Science.gov (United States)

    Wang, Lan-Ting; Lee, Kun-Chou

    2014-01-01

    Vision plays an important role in educational technologies because it produces and communicates important functions in teaching and learning. In this paper, learners' preference for visual complexity on the small screens of mobile computers is studied by neural networks. The visual complexity in this study is divided into five…

  4. Improving computer-aided detection assistance in breast cancer screening by removal of obviously false-positive findings

    NARCIS (Netherlands)

    Mordang, Jan-Jurre; Gubern-Merida, Albert; Bria, Alessandro; Tortorella, Francesco; den Heeten, Gerard; Karssemeijer, Nico

    2017-01-01

    Purpose: Computer-aided detection (CADe) systems for mammography screening still mark many false positives. This can cause radiologists to lose confidence in CADe, especially when many false positives are obviously not suspicious to them. In this study, we focus on obvious false positives generated

  5. Improving computer-aided detection assistance in breast cancer screening by removal of obviously false-positive findings

    NARCIS (Netherlands)

    Mordang, J.J.; Gubern Merida, A.; Bria, A.; Tortorella, F.; Heeten, G.; Karssemeijer, N.

    2017-01-01

    PURPOSE: Computer-aided detection (CADe) systems for mammography screening still mark many false positives. This can cause radiologists to lose confidence in CADe, especially when many false positives are obviously not suspicious to them. In this study, we focus on obvious false positives generated

  6. Density functional theory based screening of ternary alkali-transition metal borohydrides: A computational material design project

    DEFF Research Database (Denmark)

    Hummelshøj, Jens Strabo; Landis, David; Voss, Johannes

    2009-01-01

    We present a computational screening study of ternary metal borohydrides for reversible hydrogen storage based on density functional theory. We investigate the stability and decomposition of alloys containing 1 alkali metal atom, Li, Na, or K (M1); and 1 alkali, alkaline earth or 3d/4d transition...

  7. The BioFragment Database (BFDb): An open-data platform for computational chemistry analysis of noncovalent interactions

    Science.gov (United States)

    Burns, Lori A.; Faver, John C.; Zheng, Zheng; Marshall, Michael S.; Smith, Daniel G. A.; Vanommeslaeghe, Kenno; MacKerell, Alexander D.; Merz, Kenneth M.; Sherrill, C. David

    2017-10-01

    Accurate potential energy models are necessary for reliable atomistic simulations of chemical phenomena. In the realm of biomolecular modeling, large systems like proteins comprise many noncovalent interactions (NCIs) that contribute to the protein's stability and structure. This work presents two high-quality chemical databases of common fragment interactions in biomolecular systems, extracted from high-resolution Protein Data Bank crystal structures: 3380 sidechain-sidechain interactions and 100 backbone-backbone interactions that inaugurate the BioFragment Database (BFDb). Absolute interaction energies are generated with a computationally tractable explicitly correlated coupled cluster with perturbative triples [CCSD(T)-F12] "silver standard" (0.05 kcal/mol average error) for NCI that demands only a fraction of the cost of the conventional "gold standard," CCSD(T) at the complete basis set limit. By sampling extensively from biological environments, BFDb spans the natural diversity of protein NCI motifs and orientations. In addition to supplying a thorough assessment of lower-scaling force-field (2), semi-empirical (3), density functional (244), and wavefunction (45) methods (comprising >1M interaction energies), BFDb provides interactive tools for running and manipulating the resulting large datasets and offers a valuable resource for potential energy model development and validation.
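Interaction energies of the kind BFDb tabulates are conventionally obtained supermolecularly: the energy of the complex minus the energies of the isolated monomers. A minimal sketch (the energies below are invented placeholders, not BFDb entries):

```python
# Supermolecular interaction energy: E_int = E_AB - E_A - E_B.
# Hartree values are invented placeholders, not BFDb entries.
HARTREE_TO_KCAL = 627.509

def interaction_energy(e_complex, e_mono_a, e_mono_b):
    """Interaction energy in kcal/mol from total energies in hartree."""
    return (e_complex - e_mono_a - e_mono_b) * HARTREE_TO_KCAL

e_int = interaction_energy(-155.012345, -78.001234, -77.001111)
print(f"{e_int:.2f} kcal/mol")
```

In practice such differences are also corrected for basis set superposition error (e.g. counterpoise), which this sketch omits.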

  8. Meta-analysis of two computer-assisted screening methods for diagnosing oral precancer and cancer.

    Science.gov (United States)

    Ye, Xiaojing; Zhang, Jing; Tan, Yaqin; Chen, Guanying; Zhou, Gang

    2015-11-01

    The early diagnosis of oral precancer and cancer is crucial and could have the highest impact on improving survival rates. A meta-analysis was conducted to compare the accuracy of the OralCDx brush biopsy and DNA-image cytometry in diagnosing both conditions. Bibliographic databases were systematically searched for original relevant studies on the early diagnosis of oral precancer and oral cancer. Study characteristics were evaluated to determine the accuracy of the two screening strategies. Thirteen studies (eight of OralCDx brush biopsy and five of DNA-image cytometry) were identified as having reported on 1981 oral mucosa lesions. The meta-analysis found that the areas under the summary receiver operating characteristic curves of the OralCDx brush biopsy and DNA-image cytometry were 0.8879 and 0.9885, respectively. The pooled sensitivity, specificity, and diagnostic odds ratio of the OralCDx brush biopsy were 86% (95% CI 81-90), 81% (95% CI 78-85), and 20.36 (95% CI 2.72-152.67), respectively, while those of DNA-image cytometry were 89% (95% CI 83-94), 99% (95% CI 97-100), and 446.08 (95% CI 73.36-2712.43), respectively. Results of a pairwise comparison between the modalities demonstrated that the specificity, area under the curve (AUC), and Q(∗) index of DNA-image cytometry were significantly higher than those of the OralCDx brush biopsy (Z=2.821, p<0.05). In conclusion, the meta-analysis of the published studies indicated that DNA-image cytometry is more accurate than the OralCDx brush biopsy in diagnosing oral precancer and oral cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.
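The diagnostic odds ratios quoted above combine sensitivity and specificity into a single figure, DOR = LR+ / LR-. A quick sketch using the pooled point estimates (pooled meta-analytic DORs are estimated across studies, so this back-of-envelope calculation will not reproduce the reported values exactly):

```python
def diagnostic_odds_ratio(sens, spec):
    """DOR = positive likelihood ratio / negative likelihood ratio."""
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    return lr_pos / lr_neg

print(round(diagnostic_odds_ratio(0.86, 0.81), 1))  # OralCDx point estimates
print(round(diagnostic_odds_ratio(0.89, 0.99), 1))  # DNA-image cytometry
```

The DOR's extreme sensitivity to specificity near 100% is visible here, which is why DNA-image cytometry's pooled DOR is more than an order of magnitude larger.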

  9. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    Science.gov (United States)

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.
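The 0.043 Å to 0.017 Å improvement quoted above is a mean unsigned error over Ru-ligand bond lengths; the metric itself is simple to sketch (the bond lengths below are invented, not the paper's data):

```python
def mean_unsigned_error(calc, ref):
    """Average absolute deviation between computed and reference values."""
    return sum(abs(c - r) for c, r in zip(calc, ref)) / len(ref)

# Hypothetical Ru-ligand bond lengths (angstrom): semiempirical vs DFT reference
pm6 = [2.06, 2.11, 1.98, 2.09]
dft = [2.05, 2.06, 2.00, 2.07]
print(round(mean_unsigned_error(pm6, dft), 3))
```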

  10. A Comparison between Effect of Viewing Text on Computer Screen and iPad® on Visual Symptoms and Functions

    Directory of Open Access Journals (Sweden)

    Pittaya Phamonvaechavan

    2017-07-01

    Objective: To compare the ocular symptoms following sustained near vision between a laptop computer and an iPad®. Methods: Forty normal subjects read text from a laptop computer screen and an iPad® screen for a continuous 20-min period. Similar text was used in both sessions, matched for size and contrast. After finishing viewing the text, subjects immediately completed a written questionnaire categorizing symptom scores into three groups: Dry eye, Pain, and Blurred vision. The accommodative amplitude and fusional convergence amplitude at near vision were also assessed before and after reading. Results: In all three categories, mean symptom scores were higher during iPad® use. When comparing the computer and iPad® conditions, mean scores were statistically significantly different in Pain score (6.30 vs 8.70; p=0.025) and Blurred vision score (10.13 vs 12.03; p=0.041), but there was no statistically significant difference in Dry eye score (6.30 vs 6.60; p=0.71). There were significant changes in accommodative amplitude and fusional convergence amplitude at near vision when compared before and after the near-vision tasks in both conditions. Conclusion: Pain and Blurred vision symptoms following sustained iPad® use were significantly worse than those reported after computer use under similar viewing conditions. However, both the computer screen and the iPad® cause ocular symptoms that have an impact on quality of life.

  11. Comparison of low contrast detectability of computed radiography and screen/film mammography systems

    International Nuclear Information System (INIS)

    Noriah Jamal; Kwan Hoong Ng; McLean, D.

    2006-01-01

    The objective of this study was to compare the low contrast detectability of computed radiography (CR) and screen/film (SF) mammography systems. The Nijmegen contrast detail test object (CDMAM type 3.4) was imaged at 28 kV in automatic exposure control mode. Six medical imaging physicists read each CDMAM phantom image. Contrast detail curves were plotted to compare the low contrast detectability of the CR (soft copy and hard copy) and SF mammography systems. The effects of varying exposure parameters, namely kV, object position inside the breast phantom, and entrance surface exposure (ESE), on the contrast detail curve were also investigated using soft copy CR. The significance of the difference in contrast between CR and SF, and for each exposure parameter, was tested using the non-parametric Kruskal-Wallis test. We found that the low contrast detectability of the CR (soft copy and hard copy) system was not significantly different from that of the SF system (p>0.05, Kruskal-Wallis test). For CR soft copy, no significant relationship (p>0.05, Kruskal-Wallis test) was seen for variation of kV, object position inside the breast phantom, or ESE. This indicates that CR is comparable with SF for the useful detection and visualization of low contrast objects, such as small low contrast areas corresponding to breast pathology.

  12. Comparison of low-contrast detectability of computed radiography and screen/ film mammography systems

    International Nuclear Information System (INIS)

    Noriah Jamal; Kwan-Hoong Ng; McLean, D.; McLean, D.

    2008-01-01

    The objective of this study is to compare the low-contrast detectability of computed radiography (CR) and screen/film (SF) mammography systems. The Nijmegen contrast detail test object (CDMAM type 3.4) was imaged at 28 kV in automatic exposure control mode. Six medical imaging physicists read each CDMAM phantom image. Contrast detail curves were plotted to compare the low-contrast detectability of the CR (soft copy and hard copy) and SF mammography systems. The effects of varying exposure parameters, namely kV, object position inside the breast phantom, and entrance surface exposure (ESE), on the contrast detail curve were also investigated using soft copy CR. The significance of the difference in contrast between CR and SF, and for each exposure parameter, was tested using the non-parametric Kruskal-Wallis test. The low-contrast detectability of the CR (soft copy and hard copy) system was found to be not significantly different from that of the SF system (p>0.05, Kruskal-Wallis test). For CR soft copy, no significant relationship (p>0.05, Kruskal-Wallis test) was seen for variation of kV, object position inside the breast phantom, or ESE. This indicates that CR is comparable with SF for the useful detection and visualization of low-contrast objects, such as small low-contrast areas corresponding to breast pathology. (Author)

  13. Computational Screening of MOF-Based Mixed Matrix Membranes for CO2/N2 Separations

    Directory of Open Access Journals (Sweden)

    Zeynep Sumer

    2016-01-01

    Atomically detailed simulations were used to examine the CO2/N2 separation potential of metal organic framework (MOF)-based mixed matrix membranes (MMMs) in this study. Gas permeability and selectivity of 700 new MMMs composed of 70 different MOFs and 10 different polymers were calculated for CO2/N2 separation. This is the largest number of MOF-based MMMs for which computational screening has been done to date. Selecting appropriate MOFs as filler particles in polymers resulted in MMMs that have higher CO2/N2 selectivities and higher CO2 permeabilities compared to pure polymer membranes. We showed that, for polymers that have low CO2 permeabilities but high CO2 selectivities, the identity of the MOF used as filler is not important: all MOFs enhanced the CO2 permeabilities of this type of polymer without changing their selectivities. Several MOF-based MMMs were identified that exceed the upper bound established for polymers. The methods we introduce in this study will create many opportunities to select MOF/polymer combinations with useful properties for CO2 separation applications.
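Gas permeabilities of filler/polymer composites such as these MMMs are often estimated with the Maxwell model from the pure-component permeabilities; the abstract does not state which permeation model was used here, and the permeability values below are purely illustrative:

```python
def maxwell_permeability(p_cont, p_disp, phi):
    """Maxwell model: effective permeability of a dilute dispersed filler
    (permeability p_disp, volume fraction phi) in a continuous polymer (p_cont)."""
    num = p_disp + 2 * p_cont - 2 * phi * (p_cont - p_disp)
    den = p_disp + 2 * p_cont + phi * (p_cont - p_disp)
    return p_cont * num / den

# Illustrative CO2 and N2 permeabilities (Barrer) for a polymer and a MOF filler
phi = 0.2
p_co2 = maxwell_permeability(p_cont=10.0, p_disp=5000.0, phi=phi)
p_n2 = maxwell_permeability(p_cont=0.4, p_disp=150.0, phi=phi)
print(round(p_co2 / p_n2, 1))  # CO2/N2 selectivity of the composite
```

At phi = 0 the model reduces to the pure polymer, which is a useful sanity check.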

  14. Abdominal computed tomography scan as a screening tool in blunt trauma

    International Nuclear Information System (INIS)

    Brasel, K.J.; Borgstrom, D.C.; Kolewe, K.A.

    1997-01-01

    Background. One of the most difficult problems in blunt trauma is evaluation for potential intraabdominal injury. Admission for serial abdominal exams remains the standard of care after intraabdominal injury has been initially excluded. We hypothesized that a normal abdominal computed tomography (CT) scan in a subgroup of minimally injured patients would obviate admission for serial abdominal examinations, allowing safe discharge from the emergency department (ED). Methods. We reviewed our blunt trauma experience with patients admitted solely for serial abdominal examinations after a normal CT. Patients were identified from the trauma registry at a Level 1 trauma center from July 1991 through June 1995. Patients with abnormal CTs, extra-abdominal injuries necessitating admission, hemodynamic abnormalities, a Glasgow Coma Scale score of less than 13, or injury severity scores (ISSs) greater than 15 were excluded. Records of 238 patients remained; we reviewed them to determine the presence of missed abdominal injury. Results. None of the 238 patients had a missed abdominal injury. The average ISS of these patients was 3.2 (range, 0 to 10). Discharging these patients from the ED would result in a yearly cost savings of $32,874 to our medical system. Conclusions. Abdominal CT scan is a safe and cost-effective screening tool in patients with blunt trauma. A normal CT scan in minimally injured patients allows safe discharge from the ED. (authors)

  15. Opportunistic screening for osteoporosis on routine computed tomography? An external validation study

    Energy Technology Data Exchange (ETDEWEB)

    Buckens, Constantinus F. [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Universitair Medisch Centrum Utrecht, Department of Radiology, Utrecht (Netherlands); Dijkhuis, Gawein; Jong, Pim A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Keizer, Bart de [University Medical Center Utrecht, Department of Nuclear Medicine, Utrecht (Netherlands); Verhaar, Harald J. [University Medical Center Utrecht, Department of Geriatric Medicine, Utrecht (Netherlands)

    2015-07-15

    Opportunistic screening for osteoporosis using computed tomography (CT) examinations that happen to visualise the spine can be used to identify patients with osteoporosis. We sought to verify the diagnostic performance of vertebral Hounsfield unit (HU) measurements on routine CT examinations for diagnosing osteoporosis in a separate, external population. Consecutive patients who underwent a CT examination of the chest or abdomen and had also received a dual-energy X-ray absorptiometry (DXA) test were retrospectively included. CTs were evaluated for vertebral fractures and vertebral attenuation (density) values were measured. Diagnostic performance measures and the area under the receiver operating characteristic curve (AUC) for diagnosing osteoporosis were calculated. Three hundred and two patients with a mean age of 57.9 years were included, of which 82 (27 %) had osteoporosis according to DXA and 65 (22 %) had vertebral fractures. The diagnostic performance of vertebral HU measurements was modest, with a maximal AUC of 0.74 (0.68 - 0.80). At that optimal threshold the sensitivity was 62 % (51 - 72 %) and the specificity was 79 % (74 - 84 %). We confirmed that simple trabecular vertebral density measurements on routine CT contain diagnostic information related to bone mineral density as measured by DXA, albeit with substantially lower diagnostic accuracy than previously reported. (orig.)

  16. Opportunistic screening for osteoporosis on routine computed tomography? An external validation study

    International Nuclear Information System (INIS)

    Buckens, Constantinus F.; Dijkhuis, Gawein; Jong, Pim A. de; Keizer, Bart de; Verhaar, Harald J.

    2015-01-01

    Opportunistic screening for osteoporosis using computed tomography (CT) examinations that happen to visualise the spine can be used to identify patients with osteoporosis. We sought to verify the diagnostic performance of vertebral Hounsfield unit (HU) measurements on routine CT examinations for diagnosing osteoporosis in a separate, external population. Consecutive patients who underwent a CT examination of the chest or abdomen and had also received a dual-energy X-ray absorptiometry (DXA) test were retrospectively included. CTs were evaluated for vertebral fractures and vertebral attenuation (density) values were measured. Diagnostic performance measures and the area under the receiver operating characteristic curve (AUC) for diagnosing osteoporosis were calculated. Three hundred and two patients with a mean age of 57.9 years were included, of which 82 (27 %) had osteoporosis according to DXA and 65 (22 %) had vertebral fractures. The diagnostic performance of vertebral HU measurements was modest, with a maximal AUC of 0.74 (0.68 - 0.80). At that optimal threshold the sensitivity was 62 % (51 - 72 %) and the specificity was 79 % (74 - 84 %). We confirmed that simple trabecular vertebral density measurements on routine CT contain diagnostic information related to bone mineral density as measured by DXA, albeit with substantially lower diagnostic accuracy than previously reported. (orig.)
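An AUC like the 0.74 reported above can be computed directly from the two groups' attenuation values via the Mann-Whitney rank statistic, without fitting a curve. A sketch with invented HU values (not study data):

```python
def auc_mann_whitney(pos_scores, neg_scores):
    """AUC = P(score_pos > score_neg), counting ties as 0.5."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented vertebral attenuation values (HU); lower HU suggests osteoporosis,
# so HU is negated to make the diseased group score higher.
osteoporotic_hu = [95, 110, 130, 150]
normal_hu = [140, 160, 175, 190, 205]
auc = auc_mann_whitney([-h for h in osteoporotic_hu], [-h for h in normal_hu])
print(round(auc, 2))
```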

  17. Cloud-Based NoSQL Open Database of Pulmonary Nodules for Computer-Aided Lung Cancer Diagnosis and Reproducible Research.

    Science.gov (United States)

    Ferreira Junior, José Raniery; Oliveira, Marcelo Costa; de Azevedo-Marques, Paulo Mazzoncini

    2016-12-01

    Lung cancer is the leading cause of cancer-related deaths in the world, and its main manifestation is pulmonary nodules. Detection and classification of pulmonary nodules are challenging tasks that must be done by qualified specialists, but image interpretation errors make those tasks difficult. In order to aid radiologists in those difficult tasks, it is important to integrate computer-based tools with the lesion detection, pathology diagnosis, and image interpretation processes. However, computer-aided diagnosis research faces the problem of not having enough shared medical reference data for the development, testing, and evaluation of computational methods for diagnosis. In order to minimize this problem, this paper presents a public nonrelational document-oriented cloud-based database of pulmonary nodules characterized by 3D texture attributes, identified by experienced radiologists and classified in nine different subjective characteristics by the same specialists. Our goal with the development of this database is to improve computer-aided lung cancer diagnosis and pulmonary nodule detection and classification research through the deployment of this database in a cloud Database as a Service framework. Pulmonary nodule data were provided by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), image descriptors were acquired by a volumetric texture analysis, and the database schema was developed using a document-oriented Not only Structured Query Language (NoSQL) approach. The proposed database now contains 379 exams, 838 nodules, and 8237 images, of which 4029 are CT scans and 4208 are manually segmented nodules, and it is allocated in a MongoDB instance on a cloud infrastructure.
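A document-oriented record of the kind stored in such a MongoDB instance might look roughly as follows; the field names and values are invented for illustration, not the paper's actual schema:

```python
import json

# Hypothetical NoSQL document for one pulmonary nodule; the field names
# and values are illustrative, not the paper's actual schema.
nodule_doc = {
    "exam_id": "LIDC-0001",
    "nodule_id": 42,
    "modality": "CT",
    "texture_attributes_3d": {"energy": 0.91, "entropy": 4.37, "contrast": 12.5},
    "radiologist_ratings": {"malignancy": 3, "spiculation": 2, "lobulation": 1},
    "segmentation": "manual",
}

# Document databases store such records as self-describing JSON-like objects,
# so the schema can vary per document without migrations.
serialized = json.dumps(nodule_doc, sort_keys=True)
print(len(json.loads(serialized)["radiologist_ratings"]))
```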

  18. Online image databases as multi-purpose resources: discovery of a new host ant of Rickia wasmannii Cavara (Ascomycota, Laboulbeniales) by screening AntWeb.org

    Directory of Open Access Journals (Sweden)

    Ferenc Báthori

    2017-12-01

    Public awareness has been raised of the importance of natural history and academic collections for science and society at a time when reduced financial support and staff cuts are prevalent. In the field of biology, new species and new interspecies associations are constantly discovered by making use of museum collections, digitalised materials, or citizen science programs. In our study, the Myrmica Latreille, 1804 image collection of AntWeb.org was screened for fungal ectoparasites. A total of 397 imaged specimens from 133 species were visually investigated. A single specimen of M. hellenica Finzi, 1926, collected in Greece by U. Sahlberg, showed a conspicuous fungal infection. The parasite was identified using microscopic methods as Rickia wasmannii Cavara, an ectoparasitic fungal species specialised on Myrmica ants. This finding represents a new country record and a new Myrmica species in the host spectrum of R. wasmannii. According to our results, online entomological databases can be screened relatively easily for ectoparasitic fungal infections from new hosts and new regions. However, depending on the quality of the insect voucher photos, additional investigation of the material may be needed to confirm the identity of the parasite.

  19. Identification of the Beer Component Hordenine as Food-Derived Dopamine D2 Receptor Agonist by Virtual Screening a 3D Compound Database

    Science.gov (United States)

    Sommer, Thomas; Hübner, Harald; El Kerdawy, Ahmed; Gmeiner, Peter; Pischetsrieder, Monika; Clark, Timothy

    2017-03-01

    The dopamine D2 receptor (D2R) is involved in food reward and compulsive food intake. The present study developed a virtual screening (VS) method to identify food components that may modulate D2R signalling. In contrast to their common applications in drug discovery, VS methods are rarely applied to the discovery of bioactive food compounds. Here, databases were created that exclusively contain substances occurring in food and natural sources (about 13,000 different compounds in total) as the basis for combined pharmacophore searching, hit-list clustering and molecular docking into D2R homology models. Of 17 compounds finally tested in radioligand assays to determine their binding affinities, seven were classified as hits (hit rate = 41%). Functional properties of the five most active compounds were further examined in β-arrestin recruitment and cAMP inhibition experiments. D2R-promoted G-protein activation was observed for hordenine, a constituent of barley and beer, with ligand efficacy approximately identical to that of dopamine (76%) and a Ki value of 13 μM. Moreover, hordenine antagonised D2-mediated β-arrestin recruitment, indicating functional selectivity. Application of our databases provides new perspectives for the discovery of bioactive food constituents using VS methods. Based on its presence in beer, we suggest that hordenine significantly contributes to the mood-elevating effects of beer.

  20. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.
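    The multiple-scale-factor scheme can be sketched in a few lines: different classes of vibrational mode receive different multipliers before comparison with experimental fundamentals. The factor values and the single frequency cutoff below are illustrative assumptions, not the PAHdb values.

```python
# Illustrative only: the scale factors and the C-H stretch cutoff are
# made-up stand-ins, not the values adopted in PAHdb Version 3.00.
def scale_frequencies(freqs_cm1, factor_ch_stretch=0.96, factor_other=0.98,
                      ch_stretch_min=2900.0):
    """Apply one scale factor to C-H stretch modes and another to the rest."""
    return [f * (factor_ch_stretch if f >= ch_stretch_min else factor_other)
            for f in freqs_cm1]

harmonic = [1600.0, 3050.0]  # computed harmonic frequencies (cm^-1)
print([round(f, 1) for f in scale_frequencies(harmonic)])  # → [1568.0, 2928.0]
```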

  1. Evaluation of the potential benefit of computer-aided diagnosis (CAD) for lung cancer screening using photofluorography

    International Nuclear Information System (INIS)

    Matsumoto, Tsuneo; Nakamura, Hiroshi; Nakanishi, Takashi; Doi, Kunio; Kano, Akiko.

    1993-01-01

    To evaluate the potential benefit of computer-aided diagnosis (CAD) in lung cancer screening using photofluorographic films, we performed an observer test with 12 radiologists. We used 60 photofluorographic films obtained from a lung cancer screening program in Yamaguchi Prefecture (30 contained cancerous nodules and 30 had no nodules). In these cases, our current automated detection scheme achieved a sensitivity of 80%, but yielded an average of 11 false positives per image. The observer study consisted of three viewing conditions: 1) only the original image (single reading), 2) the original image and computer output obtained from the current CAD scheme (CAD 1), and 3) the original image and computer output obtained from a simulated improved CAD scheme with the same 80% true-positive rate, but with an average of one false positive per image (CAD 2). Compared with double reading using independent interpretations, which takes the higher score of the two single readings, CAD 2 was more sensitive in subtle cases. The specificity of CAD was superior to that of double reading. Although CAD 1 (Az=0.805) was inferior to double reading (Az=0.837) in terms of the ROC curve, CAD 2 (Az=0.872) significantly improved the ROC curve and also significantly reduced observation time (p<0.05). If the number of false positives can be reduced, computer-aided diagnosis may play an important role in lung cancer screening programs. (author)
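    The Az values quoted above are areas under ROC curves. A minimal sketch of the empirical estimator (the Mann-Whitney form: the probability that a cancer case outscores a normal case, ties counting half), using made-up reader confidence scores:

```python
def auc(pos_scores, neg_scores):
    """Empirical Az: fraction of positive/negative pairs where the positive
    case outscores the negative one, with ties counted as half a win."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

cancer = [0.9, 0.8, 0.6]  # hypothetical reader confidence on cancer cases
normal = [0.4, 0.7, 0.2]  # hypothetical confidence on normal cases
print(round(auc(cancer, normal), 3))  # → 0.889
```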

  2. Computer/Mobile Device Screen Time of Children and Their Eye Care Behavior: The Roles of Risk Perception and Parenting.

    Science.gov (United States)

    Chang, Fong-Ching; Chiu, Chiung-Hui; Chen, Ping-Hung; Miao, Nae-Fang; Chiang, Jeng-Tung; Chuang, Hung-Yi

    2018-03-01

    This study assessed the computer/mobile device screen time and eye care behavior of children and examined the roles of risk perception and parental practices. Data were obtained from a sample of 2,454 child-parent dyads recruited from 30 primary schools in Taipei city and New Taipei city, Taiwan, in 2016. Self-administered questionnaires were collected from students and parents. Fifth-grade students spent more time on new media (computer/smartphone/tablet: 16 hours a week) than on traditional media (television: 10 hours a week). The average daily screen time (3.5 hours) for these children exceeded the American Academy of Pediatrics recommendation (≤2 hours). Multivariate analysis showed that, after controlling for demographic factors, parents with higher levels of risk perception and parental efficacy were more likely to mediate their child's eye care behavior. Children who reported lower academic performance, were from non-intact families, reported lower levels of risk perception of mobile device use, had parents who spent more time using computers and mobile devices, and had lower levels of parental mediation were more likely to spend more time using computers and mobile devices; whereas children who reported higher academic performance, higher levels of risk perception, and higher levels of parental mediation were more likely to engage in higher levels of eye care behavior. Risk perception by children and parental practices are associated with the amount of screen time that children regularly engage in and their level of eye care behavior.

  3. Computer-aided beam arrangement based on similar cases in radiation treatment-planning databases for stereotactic lung radiation therapy

    International Nuclear Information System (INIS)

    Magome, Taiki; Shioyama, Yoshiyuki; Arimura, Hidetaka

    2013-01-01

    The purpose of this study was to develop a computer-aided method for determination of beam arrangements based on similar cases in a radiotherapy treatment-planning database for stereotactic lung radiation therapy. Similar-case-based beam arrangements were automatically determined in the following two steps. First, the five most similar cases were retrieved, based on geometrical features related to the location, size and shape of the planning target volume, lung and spinal cord. Second, five beam arrangements for an objective case were automatically determined by registering the five similar cases with the objective case, with respect to lung regions, by means of a linear registration technique. For evaluation of the beam arrangements, five treatment plans were manually created by applying the beam arrangements determined in the second step to the objective case. The most usable beam arrangement was selected by sorting the five treatment plans based on eight plan evaluation indices, including the D95, mean lung dose and spinal cord maximum dose. We applied the proposed method to 10 test cases, using a radiation treatment-planning (RTP) database of 81 lung cancer cases, and compared the eight plan evaluation indices between the original treatment plan and the corresponding most usable similar-case-based treatment plan. As a result, the proposed method may provide usable beam arrangements, which showed no statistically significant differences from the original beam arrangements (P>0.05) in terms of the eight plan evaluation indices. Therefore, the proposed method could be employed as an educational tool for less experienced treatment planners. (author)
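    The first step, retrieving the most similar cases by geometric features, amounts to a nearest-neighbour search over the planning database. A minimal sketch under assumed two-component feature vectors and plain Euclidean distance (the real method uses more features and a specific similarity measure):

```python
import math

def top_k_similar(query_features, case_db, k=5):
    """Rank stored plans by Euclidean distance in geometric-feature space
    and return the IDs of the k nearest cases."""
    scored = sorted(
        case_db.items(),
        key=lambda item: math.dist(query_features, item[1]),
    )
    return [case_id for case_id, _ in scored[:k]]

# Hypothetical feature vectors: (PTV volume in cc, PTV-to-cord distance in cm)
db = {"case_A": (30.0, 5.0), "case_B": (31.0, 5.2), "case_C": (80.0, 1.0)}
print(top_k_similar((30.4, 5.1), db, k=2))  # → ['case_A', 'case_B']
```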

  4. Development of a computer-based automated pure tone hearing screening device: a preliminary clinical trial.

    Science.gov (United States)

    Gan, Kok Beng; Azeez, Dhifaf; Umat, Cila; Ali, Mohd Alauddin Mohd; Wahab, Noor Alaudin Abdul; Mukari, Siti Zamratol Mai-Sarah

    2012-10-01

    Hearing screening is important for the early detection of hearing loss. The requirements of specialized equipment, skilled personnel, and quiet environments for valid screening results limit its application in schools and health clinics. This study aimed to develop an automated hearing screening kit (auto-kit) with the capability of real-time noise level monitoring to ensure that the screening is performed in an environment that conforms to the standard. The auto-kit consists of a laptop, a 24-bit resolution sound card, headphones, a microphone, and a graphical user interface, which is calibrated according to the American National Standards Institute S3.6-2004 standard. The auto-kit can present four test tones (500, 1000, 2000, and 4000 Hz) at 25 or 40 dB HL screening cut-off levels. The clinical results at the 40 dB HL screening cut-off level showed that the auto-kit has a sensitivity of 92.5% and a specificity of 75.0%. Because the 500 Hz test tone is not included in the standard hearing screening procedure, it can be excluded from the auto-kit test procedure. The exclusion of the 500 Hz test tone improved the specificity of the auto-kit from 75.0% to 92.3%, which suggests that the auto-kit could be a valid hearing screening device. In conclusion, the auto-kit may be a valuable hearing screening tool, especially in countries where resources are limited.
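    The sensitivity and specificity figures quoted above follow directly from the screening-versus-gold-standard confusion counts. A small sketch with hypothetical counts chosen only so the rates reproduce the reported 92.5% and 75.0% (they are not the study's raw data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Screening validity metrics against the audiometric gold standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion counts, not taken from the study.
sens, spec = sensitivity_specificity(tp=37, fn=3, tn=36, fp=12)
print(round(sens, 3), round(spec, 3))  # → 0.925 0.75
```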

  5. Comparison of digital tomosynthesis and computed tomography for lung nodule detection in SOS screening program.

    Science.gov (United States)

    Grosso, Maurizio; Priotto, Roberto; Ghirardo, Donatella; Talenti, Alberto; Roberto, Emanuele; Bertolaccini, Luca; Terzi, Alberto; Chauvie, Stéphane

    2017-08-01

    To compare lung nodule detection by digital tomosynthesis (DTS) and computed tomography (CT) in the context of the SOS (Studio OSservazionale) prospective screening program for lung cancer detection. One hundred and thirty-two of the 1843 subjects enrolled in the SOS study underwent CT because non-calcified nodules with diameters larger than 5 mm and/or multiple nodules were present on DTS. Two expert radiologists reviewed the exams, classifying the nodules based on their radiological appearance and their dimensions. The LUNG-RADS classification was applied to compare receiver operating characteristic curves between CT and DTS with respect to the final diagnosis. CT was used as the gold standard. DTS and CT detected 208 and 179 nodules in the 132 subjects, respectively. Of these 208 nodules, 189 (91%) were solid, partially solid, or ground glass opacity. CT confirmed 140/189 (74%) of these nodules but found 4 nodules that were not detected by DTS. DTS and CT were concordant in 62% of the cases applying the 5-point LUNG-RADS scale. The concordance rose to 86% on a suspicious/non-suspicious binary scale. The areas under the receiver operating characteristic curves were 0.89 (95% CI 0.83-0.94) and 0.80 (95% CI 0.72-0.89) for CT and DTS, respectively. The mean effective dose was 0.09 ± 0.04 mSv for DTS and 4.90 ± 1.20 mSv for CT. The use of a common classification for nodule detection in DTS and CT helps in comparing the two technologies. DTS detected and correctly classified 74% of the nodules seen by CT but missed 4 nodules identified by CT. Concordance between DTS and CT rose to 86% of the nodules when considering LUNG-RADS on a binary scale.

  6. The interplay of attention economics and computer-aided detection marks in screening mammography

    Science.gov (United States)

    Schwartz, Tayler M.; Sridharan, Radhika; Wei, Wei; Lukyanchenko, Olga; Geiser, William; Whitman, Gary J.; Haygood, Tamara Miner

    2016-03-01

    Introduction: According to attention economists, overabundant information leads to decreased attention for individual pieces of information. Computer-aided detection (CAD) alerts radiologists to findings potentially associated with breast cancer but is notorious for creating an abundance of false-positive marks. We suspected that increased CAD marks do not lengthen mammogram interpretation time, as radiologists will selectively disregard these marks when present in larger numbers. We explore the relevance of attention economics in mammography by examining how the number of CAD marks affects interpretation time. Methods: We performed a retrospective review of bilateral digital screening mammograms obtained between January 1, 2011 and February 28, 2014, using only weekend interpretations to decrease distractions and the likelihood of trainee participation. We stratified data according to reader and used ANOVA to assess the relationship between number of CAD marks and interpretation time. Results: Ten radiologists, with median experience after residency of 12.5 years (range, 6 to 24), interpreted 1849 mammograms. When accounting for number of images, Breast Imaging Reporting and Data System category, and breast density, an increasing number of CAD marks was correlated with longer interpretation time only for the three radiologists with the fewest years of experience (median, 7 years). Conclusion: For the 7 most experienced readers, increasing CAD marks did not lengthen interpretation time. We surmise that as CAD marks increase, the attention given to individual marks decreases. Experienced radiologists may rapidly dismiss larger numbers of CAD marks as false-positive, having learned that devoting extra attention to such marks does not improve clinical detection.

  7. Computational fragment-based screening using RosettaLigand: the SAMPL3 challenge

    Science.gov (United States)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2012-05-01

    The SAMPL3 fragment-based virtual screening challenge provides a valuable opportunity for researchers to test their programs, methods and screening protocols in a blind testing environment. We participated in the SAMPL3 challenge and evaluated our virtual fragment screening protocol, which involves RosettaLigand as the core component, by screening a 500-fragment Maybridge library against bovine pancreatic trypsin. Our study reaffirmed that the real test for any virtual screening approach is in a blind testing environment. The analyses presented in this paper also showed that virtual screening performance can be improved if a set of known active compounds is available and parameters and methods that yield better enrichment are selected. Our study also highlighted that to achieve accurate orientation and conformation of ligands within a binding site, selecting an appropriate method to calculate partial charges is important. Another finding is that using multiple receptor ensembles in docking does not always yield better enrichment than individual receptors. On the basis of our results and retrospective analyses from the SAMPL3 fragment screening challenge, we anticipate that the chances of success in a fragment screening process could be increased significantly with careful selection of receptor structures, treatment of protein flexibility, sufficient conformational sampling within the binding pocket and accurate assignment of ligand and protein partial charges.
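    The enrichment referred to throughout measures how much better a score-ranked hit list performs than random selection at a given cutoff. A minimal sketch of the usual enrichment factor, with made-up activity labels standing in for docking results:

```python
def enrichment_factor(ranked_labels, fraction=0.1):
    """EF at a cutoff: actives recovered in the top fraction of the ranked
    list, relative to the active rate of the whole library."""
    n_top = max(1, int(len(ranked_labels) * fraction))
    hits_top = sum(ranked_labels[:n_top])
    hits_total = sum(ranked_labels)
    return (hits_top / n_top) / (hits_total / len(ranked_labels))

# 1 = active fragment, 0 = inactive, ordered by docking score (hypothetical).
ranked = [1, 1, 0, 1, 0, 0, 0, 0, 1, 0]
print(enrichment_factor(ranked, 0.2))  # top 20%: EF = (2/2)/(4/10) = 2.5
```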

  8. Implementing a computer-assisted telephone interview (CATI) system to increase colorectal cancer screening: a process evaluation.

    Science.gov (United States)

    White, Mary Jo; Stark, Jennifer R; Luckmann, Roger; Rosal, Milagros C; Clemow, Lynn; Costanza, Mary E

    2006-06-01

    Computer-assisted telephone interviewing (CATI) systems used by telephone counselors (TCs) may be an efficient mechanism for counseling patients on cancer and recommended preventive screening tests, extending a primary care provider's reach to his or her patients. The implementation process of such a system for promoting colorectal cancer (CRC) screening is reported in this paper. The process evaluation assessed three components of the intervention: message production, program implementation and audience reception. Of 1181 potentially eligible patients, 1025 (87%) were reached by the TCs and 725 of those patients (71%) were eligible to receive counseling. Five hundred eighty-two (80%) patients agreed to counseling. It is feasible to design and use CATI systems for prevention counseling of patients in primary care practices. CATI systems have the potential to be used as a referral service by primary care providers and health care organizations for patient education.

  9. A computer-tailored intervention to promote informed decision making for prostate cancer screening among African-American men

    Science.gov (United States)

    Allen, Jennifer D.; Mohllajee, Anshu P.; Shelton, Rachel C.; Drake, Bettina F.; Mars, Dana R.

    2010-01-01

    African-American men experience a disproportionate burden of prostate cancer (CaP) morbidity and mortality. National screening guidelines advise men to make individualized screening decisions through a process termed “informed decision making” (IDM). In this pilot study, a computer-tailored decision aid designed to promote IDM was evaluated using a pre/post test design. African-American men aged 40+ (n=108) were recruited from a variety of community settings. At pre-test, 43% of men reported having made a screening decision; at post-test, 47% reported this to be the case (p=0.39). Significant improvements were observed in knowledge scores (0–100%) (54% vs 72%; p<0.05). Men were also more likely to want an active role in decision making after using the tool (67% vs 75%; p=0.03). These results suggest that use of a computer-tailored decision aid is a promising strategy to promote IDM for CaP screening among African-American men. PMID:19477736

  10. Cost-effectiveness of computed tomography colonography in colorectal cancer screening: a systematic review.

    Science.gov (United States)

    Hanly, Paul; Skally, Mairead; Fenlon, Helen; Sharp, Linda

    2012-10-01

    The European Code Against Cancer recommends that individuals aged ≥50 should participate in colorectal cancer screening. CT colonography (CTC) is one of several screening tests available. We systematically reviewed evidence on, and identified key factors influencing, the cost-effectiveness of CTC screening. PubMed, Medline, and the Cochrane library were searched for cost-effectiveness or cost-utility analyses of CTC-based screening, published in English, January 1999 to July 2010. Data were abstracted on setting, model type and horizon, screening scenario(s), comparator(s), participants, uptake, CTC performance and cost, effectiveness, ICERs, and whether extra-colonic findings and medical complications were considered. Sixteen studies were identified from the United States (n = 11), Canada (n = 2), and France, Italy, and the United Kingdom (1 each). Markov state-transition (n = 14) or microsimulation (n = 2) models were used. Eleven considered direct medical costs only; five included indirect costs. Fourteen compared CTC with no screening; fourteen compared CTC with colonoscopy-based screening; fewer compared CTC with sigmoidoscopy (8) or fecal tests (4). Outcomes assessed were life-years gained/saved (13), QALYs (2), or both (1). Three considered extra-colonic findings; seven considered complications. CTC appeared cost-effective versus no screening and, in general, versus flexible sigmoidoscopy and fecal occult blood testing. Results were mixed comparing CTC to colonoscopy. Parameters most influencing cost-effectiveness included: CTC costs, screening uptake, threshold for polyp referral, and extra-colonic findings. Evidence on the cost-effectiveness of CTC screening is heterogeneous, due largely to between-study differences in comparators and parameter values. Future studies should: compare CTC with currently favored tests, especially fecal immunochemical tests; consider extra-colonic findings; and conduct comprehensive sensitivity analyses.

  11. Automated Oracle database testing

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Ensuring database stability and steady performance in the modern world of agile computing is a major challenge. Changes at any level of the computing infrastructure (OS parameters and packages, kernel versions, database parameters and patches, or even schema changes) can potentially harm production services. This presentation shows how automatic and regular testing of Oracle databases can be achieved in such an agile environment.
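    The shape of such automated testing can be sketched as a health check that runs after every infrastructure change. In this sketch, sqlite3 stands in for an Oracle test instance purely to keep the example self-contained; the check itself (schema present, basic DML round-trips) is a generic assumption, not the presenter's actual test suite.

```python
import sqlite3

def smoke_test(conn):
    """Minimal database health check: create schema if missing, insert a row,
    and verify it can be read back."""
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS jobs (id INTEGER PRIMARY KEY, state TEXT)"
    )
    cur.execute("INSERT INTO jobs (state) VALUES ('queued')")
    conn.commit()
    (count,) = cur.execute("SELECT COUNT(*) FROM jobs").fetchone()
    return count >= 1

# Run the check against a throwaway in-memory database.
print(smoke_test(sqlite3.connect(":memory:")))  # → True
```

    In practice such a check would be scheduled (e.g. after each patch or parameter change) so that regressions surface before they reach production services.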

  12. Quality study of portal images acquired by computed radiography and screen-film system under megavoltage ray

    International Nuclear Information System (INIS)

    Cao Guoquan; Jin Xiance; Wu Shixiu; Xie Congying; Zhang Li; Yu Jianyi; Li Yueqing

    2007-01-01

    Objective: To evaluate the quality of portal images acquired by a computed radiography (CR) system and a conventional screen-film system, respectively. Methods: Images of a home-devised lead phantom with a leakage of 6.45% were acquired on imaging plates (IP) and X-ray films, and modulation transfer function (MTF) curves of both image types were measured using the edge method. Portal images of 40 nasopharyngeal cancer patients were acquired by the IP and screen-film systems, respectively. Two doctors with similar experience evaluated the degree of damage to the petrosal bone, and receiver operating characteristic (ROC) curves for the CR and conventional images were drawn from the two doctors' evaluations. Results: The identification frequencies of the CR system and the screen-film system were 1.159 and 0.806 Lp/mm, respectively. For doctor one, the areas under the ROC curve for CR images and conventional images were 0.802 and 0.742, respectively. For doctor two, the areas under the ROC curve for CR images and conventional images were 0.751 and 0.600, respectively. Both the MTF and ROC curves of CR were better than those of the screen-film system. Conclusion: The image quality of CR portal imaging is much better than that of the screen-film system. The use of CR with a linear accelerator for portal imaging is clinically promising. (authors)
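    The edge method used above derives the MTF by differentiating the edge spread function into a line spread function and taking the normalized magnitude of its Fourier transform. A minimal numpy sketch (assuming numpy is available); a perfect step edge yields a delta-function LSF and hence a flat MTF:

```python
import numpy as np

def mtf_from_edge(esf):
    """Edge method: differentiate the edge spread function (ESF) to get the
    line spread function (LSF), then return the FFT magnitude normalized to
    its zero-frequency value."""
    lsf = np.diff(np.asarray(esf, dtype=float))
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]

ideal_edge = [0, 0, 0, 1, 1, 1]   # a perfect step edge profile
print(mtf_from_edge(ideal_edge))  # → [1. 1. 1.]
```

    With a real, blurred edge profile the returned curve falls off with spatial frequency, and the identification frequency can be read from where it crosses a chosen threshold.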

  13. Denver screening protocol for blunt cerebrovascular injury reduces the use of multi-detector computed tomography angiography.

    Science.gov (United States)

    Beliaev, Andrei M; Barber, P Alan; Marshall, Roger J; Civil, Ian

    2014-06-01

    Blunt cerebrovascular injury (BCVI) occurs in 0.2-2.7% of blunt trauma patients and has up to 30% mortality. Conventional screening does not recognize up to 20% of BCVI patients. To improve the diagnosis of BCVI, both an expanded battery of screening criteria and multi-detector computed tomography angiography (CTA) have been suggested. The aim of this study is to investigate whether the use of CTA restricted to Denver protocol screen-positive patients would reduce the unnecessary use of CTA as a pre-emptive screening tool. This is a registry-based study of blunt trauma patients admitted to Auckland City Hospital from 1998 to 2012. The diagnosis of BCVI was confirmed or excluded with CTA, magnetic resonance angiography and, if these imaging studies were inconclusive, four-vessel digital subtraction angiography. Thirty (61%) BCVI and 19 (39%) non-BCVI patients met the eligibility criteria. The Denver protocol applied to our cohort of patients had a sensitivity of 97% (95% confidence interval (CI): 83-100%) and a specificity of 42% (95% CI: 20-67%). With a prevalence of BCVI in blunt trauma patients of 0.2% and 2.7%, the post-test odds of a screen-positive test were 0.03 (95% CI: 0.002-0.005) and 0.046 (95% CI: 0.314-0.068), respectively. Application of CTA to Denver protocol screen-positive trauma patients can decrease the use of CTA as a pre-emptive screening tool by 95-97% and reduce its hazards. © 2013 Royal Australasian College of Surgeons.
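    The post-test odds reported above come from the standard Bayesian update: pre-test odds multiplied by the positive likelihood ratio. A minimal sketch using the abstract's sensitivity (97%) and specificity (42%) at the 2.7% prevalence:

```python
def post_test_odds(prevalence, sensitivity, specificity):
    """Bayes update: post-test odds = pre-test odds x positive likelihood
    ratio, where LR+ = sensitivity / (1 - specificity)."""
    pre_odds = prevalence / (1.0 - prevalence)
    lr_positive = sensitivity / (1.0 - specificity)
    return pre_odds * lr_positive

# Denver-protocol figures from the abstract at the upper prevalence bound.
odds = post_test_odds(prevalence=0.027, sensitivity=0.97, specificity=0.42)
print(round(odds, 4))  # → 0.0464
```

    This reproduces the 0.046 figure quoted for the 2.7% prevalence; the same function applied at 0.2% gives a proportionally smaller value.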

  14. Chest Computed Tomographic Image Screening for Cystic Lung Diseases in Patients with Spontaneous Pneumothorax Is Cost Effective.

    Science.gov (United States)

    Gupta, Nishant; Langenderfer, Dale; McCormack, Francis X; Schauer, Daniel P; Eckman, Mark H

    2017-01-01

    Patients without a known history of lung disease presenting with a spontaneous pneumothorax are generally diagnosed as having primary spontaneous pneumothorax. However, occult diffuse cystic lung diseases such as Birt-Hogg-Dubé syndrome (BHD), lymphangioleiomyomatosis (LAM), and pulmonary Langerhans cell histiocytosis (PLCH) can also first present with a spontaneous pneumothorax, and their early identification by high-resolution computed tomographic (HRCT) chest imaging has implications for subsequent management. The objective of our study was to evaluate the cost-effectiveness of HRCT chest imaging to facilitate early diagnosis of LAM, BHD, and PLCH. We constructed a Markov state-transition model to assess the cost-effectiveness of screening HRCT to facilitate early diagnosis of diffuse cystic lung diseases in patients presenting with an apparent primary spontaneous pneumothorax. Baseline data for prevalence of BHD, LAM, and PLCH and rates of recurrent pneumothoraces in each of these diseases were derived from the literature. Costs were extracted from 2014 Medicare data. We compared a strategy of HRCT screening followed by pleurodesis in patients with LAM, BHD, or PLCH versus conventional management with no HRCT screening. In our base case analysis, screening for the presence of BHD, LAM, or PLCH in patients presenting with a spontaneous pneumothorax was cost effective, with a marginal cost-effectiveness ratio of $1,427 per quality-adjusted life-year gained. Sensitivity analysis showed that screening HRCT remained cost effective for a diffuse cystic lung disease prevalence as low as 0.01%. HRCT image screening for BHD, LAM, and PLCH in patients with apparent primary spontaneous pneumothorax is cost effective. Clinicians should consider performing a screening HRCT in patients presenting with apparent primary spontaneous pneumothorax.
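    The ratio reported above is an incremental cost per QALY comparison between the screening and no-screening strategies. A sketch of the calculation with hypothetical strategy totals (not the study's model outputs):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra
    quality-adjusted life-year of the new strategy over the old one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient totals for two strategies.
screening = {"cost": 1150.0, "qaly": 20.10}
no_screen = {"cost": 1000.0, "qaly": 20.00}
ratio = icer(screening["cost"], screening["qaly"],
             no_screen["cost"], no_screen["qaly"])
print(round(ratio))  # → 1500 (dollars per QALY gained)
```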

  15. Comparison between film-screen and computed radiography systems in Brazilian mammography

    International Nuclear Information System (INIS)

    Vieira, L.A.; Oliveira, J.R.; Carvalho, L.A.P.; César, A.C.Z.; Nogueira, M.S.

    2015-01-01

    Since 2004 the Public Health Office of the State of Minas Gerais in Brazil has run the Image Quality Control Program in Mammography. It evaluates image quality based on an accredited phantom of the Brazilian College of Radiology (CBR). This phantom follows international standards for test objects such as masses, specks, fibers, contrast details and spatial resolution. The contrast index (CI) is assessed through optical density (OD) measurements. Although OD is defined within the film-screen (FS) scope, among all mammographic systems under the health office's surveillance almost 80% are computed radiography (CR) based. A need has therefore emerged to adapt the protocol to consider OD as a conformity parameter. Objective: To verify the accessibility of OD on CR printed films and the feasibility of calculating the contrast index, in comparison with FS. Results: A total of 56 images were evaluated with three different CBR phantoms. They were equally divided between FS and CR systems, and a densitometer was used to read out their OD values. The correlation between their contrast-to-noise ratios (CNR) was found to be on the order of 0.77 (±0.14). The samples were not significantly different (within 5% uncertainty) for every phantom. The CNR correlation coefficient was 0.871. For OD, the correlation coefficient was 0.989 and a log-fit function showed good agreement with the detector response. The OD-normalized standard deviation difference between CR and FS for each phantom was 36.6%, 2.8% and 20.2%. A CI range for CR lying between 0.13 and 0.69 was found. Conclusions: Different phantoms were successfully tested in both CR and FS to evaluate the feasibility of using the contrast index as a conformity parameter, since their correlations are strictly related to the calibration curve provided by the phantom manufacturer. The relative CR-FS OD σ-difference provides a spreading indicator, where the first and last phantoms are considerably out of expectation. Such differences are

  16. Computer-aided detection of breast masses: Four-view strategy for screening mammography

    International Nuclear Information System (INIS)

    Wei Jun; Chan Heangping; Zhou Chuan; Wu Yita; Sahiner, Berkman; Hadjiiski, Lubomir M.; Roubidoux, Marilyn A.; Helvie, Mark A.

    2011-01-01

    Purpose: To improve the performance of a computer-aided detection (CAD) system for mass detection by using four-view information in screening mammography. Methods: The authors developed a four-view CAD system that emulates radiologists' reading by using the craniocaudal and mediolateral oblique views of the ipsilateral breast to reduce false positives (FPs) and the corresponding views of the contralateral breast to detect asymmetry. The CAD system consists of four major components: (1) Initial detection of breast masses on individual views, (2) information fusion of the ipsilateral views of the breast (referred to as two-view analysis), (3) information fusion of the corresponding views of the contralateral breast (referred to as bilateral analysis), and (4) fusion of the four-view information with a decision tree. The authors collected two data sets for training and testing of the CAD system: A mass set containing 389 patients with 389 biopsy-proven masses and a normal set containing 200 normal subjects. All cases had four-view mammograms. The true locations of the masses on the mammograms were identified by an experienced MQSA radiologist. The authors randomly divided the mass set into two independent sets for cross validation training and testing. The overall test performance was assessed by averaging the free response receiver operating characteristic (FROC) curves of the two test subsets. The FP rates during the FROC analysis were estimated by using the normal set only. The jackknife free-response ROC (JAFROC) method was used to estimate the statistical significance of the difference between the test FROC curves obtained with the single-view and the four-view CAD systems. Results: Using the single-view CAD system, the breast-based test sensitivities were 58% and 77% at the FP rates of 0.5 and 1.0 per image, respectively. With the four-view CAD system, the breast-based test sensitivities were improved to 76% and 87% at the corresponding FP rates, respectively

  17. Rapid Screening of Bovine Milk Oligosaccharides in a Whey Permeate Product and Domestic Animal Milks by Accurate Mass Database and Tandem Mass Spectral Library

    Science.gov (United States)

    Lee, Hyeyoung; Cuthbertson, Daniel J.; Otter, Don E.; Barile, Daniela

    2018-01-01

    A bovine milk oligosaccharide (BMO) library, prepared from cow colostrum, with 34 structures was generated and used to rapidly screen oligosaccharides in domestic animal milks and a whey permeate powder. The novel library was entered into a custom Personal Compound Database and Library (PCDL) and included accurate mass, retention time, and tandem mass spectra. Oligosaccharides in minute samples were separated using nanoliquid chromatography (nanoLC) coupled to a high-resolution, sensitive quadrupole time-of-flight (Q-ToF) MS system. Using the PCDL, 18 oligosaccharides were found in a BMO-enriched product obtained from whey permeate processing. The usefulness of the analytical system and BMO library was further validated using milks from domestic sheep and buffaloes. Through BMO PCDL searching, 15 and 13 oligosaccharides in the BMO library were assigned in sheep and buffalo milks, respectively, thus demonstrating significant overlap between the oligosaccharides in bovine (cow and buffalo) and ovine (sheep) milks. This method was shown to be an efficient, reliable, and rapid tool to identify oligosaccharide structures using automated spectral matching. PMID:27428379
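    At its core, accurate-mass database searching of this kind reduces to matching an observed mass against library masses within a parts-per-million tolerance (retention time and tandem spectra then confirm the assignment). A minimal sketch with illustrative names and masses, not values taken from the PCDL:

```python
def match_by_mass(observed_mass, library, tol_ppm=10.0):
    """Return library entries whose mass lies within a ppm window of the
    observed value, as in accurate-mass database searching."""
    return [name for name, mass in library.items()
            if abs(observed_mass - mass) / mass * 1e6 <= tol_ppm]

# Illustrative neutral monoisotopic masses (Da); isomers share a mass and
# must be distinguished by retention time or tandem spectra.
lib = {"3'-sialyllactose": 633.2116,
       "6'-sialyllactose": 633.2116,
       "LNnT": 707.2484}
print(match_by_mass(633.2150, lib, tol_ppm=10.0))
# → ["3'-sialyllactose", "6'-sialyllactose"]
```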

  18. Feasibility of a computer-delivered driver safety behavior screening and intervention program initiated during an emergency department visit.

    Science.gov (United States)

    Murphy, Mary; Smith, Lucia; Palma, Anton; Lounsbury, David; Bijur, Polly; Chambers, Paul; Gallagher, E John

    2013-01-01

    Injuries from motor vehicle crashes are a significant public health problem. The emergency department (ED) provides a setting that may be used to screen for behaviors that increase risk for motor vehicle crashes and provide brief interventions to people who might otherwise not have access to screening and intervention. The purpose of the present study was to (1) assess the feasibility of using a computer-assisted screening program to educate ED patients about risky driving behaviors, (2) evaluate patient acceptance of the computer-based traffic safety educational intervention during an ED visit, and (3) assess postintervention changes in risky driving behaviors. The study was a pre/post educational intervention involving medically stable adult ED patients in a large urban academic ED serving over 100,000 patients annually. Patients completed a self-administered, computer-based program that queried patients on risky driving behaviors (texting, talking, and other forms of distracted driving) and alcohol use. The computer provided patients with educational information on the dangers of these behaviors, and data were collected on patient satisfaction with the program. Staff called patients 1 month post-ED visit for a repeat query. One hundred forty-nine patients participated, and 111 completed 1-month follow-up (75%); the mean age was 39 (range: 21-70), 59 percent were Hispanic, and 52 percent were male. Ninety-seven percent of patients reported that the program was easy to use and that they were comfortable receiving this education via computer during their ED visit. All driving behaviors significantly decreased in comparison to baseline, with the following reductions reported: talking on the phone, 30 percent; aggressive driving, 30 percent; texting while driving, 19 percent; drowsy driving, 16 percent; driving while multitasking, 12 percent; and drinking and driving, 9 percent. Overall, patients were very satisfied with receiving educational information about these behaviors via computer

  19. Evaluation report on research and development of a database system for mutual computer operation; Denshi keisanki sogo un'yo database system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    This paper describes the evaluation of the research and development of a database system for mutual computer operation, with respect to distributed database technology, multimedia technology, high-reliability technology, and interoperable network system technology. A large number of forward-looking research results were derived, such as the issues of distribution and utilization patterns of the distributed database, structuring of data for multimedia information, retrieval systems, flexible and high-level utilization of the network, and the issues in database protection. These achievements have been widely disclosed to the public. The most notable feature of this project is its aim of forming a network system that can be operated mutually in a multi-vendor environment. Therefore, the research and development were executed in a spirit of openness to the public and international cooperation. These efforts are represented by the organization of the rule establishment committee, the execution of mutual interconnection experiments (including demonstration evaluation), and the development of implementation rules based on the ISO's Open Systems Interconnection (OSI). These results have been compiled in the JIS as the basic reference model for open systems interconnection, and the targets shown in the basic plan have been achieved sufficiently. (NEDO)

  1. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (£, year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost-increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study

  2. The efficacy of using computer-aided detection (CAD) for detection of breast cancer in mammography screening

    DEFF Research Database (Denmark)

    Henriksen, Emilie L; Carlsen, Jonathan F; Vejborg, Ilse Mm

    2018-01-01

    Background Early detection of breast cancer (BC) is crucial in lowering the mortality. Purpose To present an overview of studies concerning computer-aided detection (CAD) in screening mammography for early detection of BC and compare diagnostic accuracy and recall rates (RR) of single reading (SR......) with SR + CAD and double reading (DR) with SR + CAD. Material and Methods PRISMA guidelines were used as a review protocol. Articles on clinical trials concerning CAD for detection of BC in a screening population were included. The literature search resulted in 1522 records. A total of 1491 records were...... excluded by abstract and 18 were excluded by full text reading. A total of 13 articles were included. Results All but two studies from the SR vs. SR + CAD group showed an increased sensitivity and/or cancer detection rate (CDR) when adding CAD. The DR vs. SR + CAD group showed no significant differences...

  3. Balancing curability and unnecessary surgery in the context of computed tomography screening for lung cancer.

    Science.gov (United States)

    Flores, Raja; Bauer, Thomas; Aye, Ralph; Andaz, Shahriyour; Kohman, Leslie; Sheppard, Barry; Mayfield, William; Thurer, Richard; Smith, Michael; Korst, Robert; Straznicka, Michaela; Grannis, Fred; Pass, Harvey; Connery, Cliff; Yip, Rowena; Smith, James P; Yankelevitz, David; Henschke, Claudia; Altorki, Nasser

    2014-05-01

    Surgical management is a critical component of computed tomography (CT) screening for lung cancer. We report the results for US sites in a large ongoing screening program, the International Early Lung Cancer Action Program (I-ELCAP). We identified all patients who underwent surgical resection. We compared the results before (1993-2005) and after (2006-2011) termination of the National Lung Screening Trial to identify emerging trends. Among 31,646 baseline and 37,861 annual repeat CT screenings, 492 patients underwent surgical resection; 437 (89%) were diagnosed with lung cancer; 396 (91%) had clinical stage I disease. In the 54 (11%) patients with nonmalignant disease, resection was sublobar in 48 and lobectomy in 6. The estimated cure rate based on the 15-year Kaplan-Meier survival for all 428 patients (excluding 9 typical carcinoids) with lung cancer was 84% (95% confidence interval [CI], 80%-88%) and 88% (95% CI, 83%-92%) for clinical stage I disease resected within 1 month of diagnosis. Video-assisted thoracoscopic surgery and sublobar resection increased significantly, from 10% to 34% (P < .0001) and 22% to 34% (P = .01) respectively; there were no significant differences in the percentage of malignant diagnoses (90% vs 87%, P = .36), clinical stage I (92% vs 89%, P = .33), pathologic stage I (85% vs 82%, P = .44), tumor size (P = .61), or cell type (P = .81). The frequency and extent of surgery for nonmalignant disease can be minimized in a CT screening program and provide a high cure rate for those diagnosed with lung cancer and undergoing surgical resection. Copyright © 2014 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.

  4. Computed tomographic colonography to screen for colorectal cancer, extracolonic cancer, and aortic aneurysm: model simulation with cost-effectiveness analysis.

    Science.gov (United States)

    Hassan, Cesare; Pickhardt, Perry J; Pickhardt, Perry; Laghi, Andrea; Kim, Daniel H; Kim, Daniel; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio

    2008-04-14

    In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 subjects from the United States who were 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies and being less costly, with a savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. At sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening. When detection of extracolonic findings such as AAA and extracolonic cancer are considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost

  5. Stuck on Screens: Patterns of Computer and Gaming Station Use in Youth Seen in a Psychiatric Clinic

    Science.gov (United States)

    Baer, Susan; Bogusz, Elliot; Green, David A.

    2011-01-01

    Objective: Computer and gaming-station use has become entrenched in the culture of our youth. Parents of children with psychiatric disorders report concerns about overuse, but research in this area is limited. The goal of this study is to evaluate computer/gaming-station use in adolescents in a psychiatric clinic population and to examine the relationship between use and functional impairment. Method: 102 adolescents, ages 11–17, from out-patient psychiatric clinics participated. Amount of computer/gaming-station use, type of use (gaming or non-gaming), and presence of addictive features were ascertained along with emotional/functional impairment. Multivariate linear regression was used to examine correlations between patterns of use and impairment. Results: Mean screen time was 6.7±4.2 hrs/day. Presence of addictive features was positively correlated with emotional/functional impairment. Time spent on computer/gaming-station use was not correlated overall with impairment after controlling for addictive features, but non-gaming time was positively correlated with risky behavior in boys. Conclusions: Youth with psychiatric disorders are spending much of their leisure time on the computer/gaming-station and a substantial subset show addictive features of use which is associated with impairment. Further research to develop measures and to evaluate risk is needed to identify the impact of this problem. PMID:21541096

  6. Spatial distribution of clinical computer systems in primary care in England in 2016 and implications for primary care electronic medical record databases: a cross-sectional population study.

    Science.gov (United States)

    Kontopantelis, Evangelos; Stevens, Richard John; Helms, Peter J; Edwards, Duncan; Doran, Tim; Ashcroft, Darren M

    2018-02-28

    UK primary care databases (PCDs) are used by researchers worldwide to inform clinical practice. These databases have been primarily tied to single clinical computer systems, but little is known about the adoption of these systems by primary care practices or their geographical representativeness. We explore the spatial distribution of clinical computing systems and discuss the implications for the longevity and regional representativeness of these resources. Cross-sectional study. English primary care clinical computer systems. 7526 general practices in August 2016. Spatial mapping of family practices in England in 2016 by clinical computer system at two geographical levels, the lower Clinical Commissioning Group (CCG, 209 units) and the higher National Health Service regions (14 units). Data for practices included numbers of doctors, nurses and patients, and area deprivation. Of 7526 practices, Egton Medical Information Systems (EMIS) was used in 4199 (56%), SystmOne in 2552 (34%) and Vision in 636 (9%). Great regional variability was observed for all systems, with EMIS having a stronger presence in the West of England, London and the South; SystmOne in the East and some regions in the South; and Vision in London, the South, Greater Manchester and Birmingham. PCDs based on single clinical computer systems are geographically clustered in England. For example, Clinical Practice Research Datalink and The Health Improvement Network, the most popular primary care databases in terms of research outputs, are based on the Vision clinical computer system, used by <10% of practices and heavily concentrated in three major conurbations and the South. Researchers need to be aware of the analytical challenges posed by clustering, and barriers to accessing alternative PCDs need to be removed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking.

    Science.gov (United States)

    Alessi, Lauren J; Warmus, Holly R; Schaffner, Erin K; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M

    2018-03-01

    Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. We sought to identify sepsis within the Electronic Health Record (EHR) of a quaternary children's hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods are organized in a plan-do-study-act cycle. During the "plan" phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the "do" phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool, and a regimented response to high-risk patients. Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). Of the 6,159 admissions between July and December 2016, the EHR definitions identified 1,433 (23.3%) with sepsis, of which 159 (11.1%) had septic shock. Hospital mortality was 2.2% for all sepsis patients and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45-8.2% and mortality range of 8.2-25% (Table 2). Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach, and proving useful for tracking performance improvement.
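    The electronic definitions quoted above (sepsis = blood culture plus antibiotic within 24 hours; septic shock = sepsis plus a vasoactive medication) can be sketched against simplified event records. This is an illustrative sketch only: the event schema, patient IDs, and timestamps below are hypothetical, not the hospital's actual EHR data model.

```python
from datetime import datetime, timedelta

# Hypothetical simplified EHR events: (patient_id, event_type, timestamp)
events = [
    ("p1", "blood_culture", datetime(2016, 7, 1, 8, 0)),
    ("p1", "antibiotic",    datetime(2016, 7, 1, 20, 0)),
    ("p1", "vasoactive",    datetime(2016, 7, 2, 3, 0)),
    ("p2", "blood_culture", datetime(2016, 7, 3, 9, 0)),
    ("p2", "antibiotic",    datetime(2016, 7, 5, 9, 0)),  # >24 h apart
]

def classify(events, window=timedelta(hours=24)):
    """Label each patient: sepsis if any blood culture and antibiotic fall
    within the window of each other; septic shock if sepsis plus any
    vasoactive medication."""
    by_patient = {}
    for pid, etype, ts in events:
        by_patient.setdefault(pid, []).append((etype, ts))
    labels = {}
    for pid, evs in by_patient.items():
        cultures = [t for e, t in evs if e == "blood_culture"]
        abx = [t for e, t in evs if e == "antibiotic"]
        vaso = any(e == "vasoactive" for e, _ in evs)
        sepsis = any(abs(a - c) <= window for c in cultures for a in abx)
        labels[pid] = ("septic_shock" if sepsis and vaso
                       else "sepsis" if sepsis else "none")
    return labels

print(classify(events))  # {'p1': 'septic_shock', 'p2': 'none'}
```

    A real implementation would query structured medication and microbiology tables rather than an in-memory list, but the windowed-join logic is the same.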

  8. Interobserver agreement and performance score comparison in quality control using a breast phantom: screen-film mammography vs computed radiography

    International Nuclear Information System (INIS)

    Shimamoto, Kazuhiro; Ikeda, Mitsuru; Satake, Hiroko; Ishigaki, Satoko; Sawaki, Akiko; Ishigaki, Takeo

    2002-01-01

    Our objective was to evaluate interobserver agreement and to compare the performance score in quality control of screen-film mammography and computed radiography (CR) using a breast phantom. Eleven radiologists interpreted a breast phantom image (CIRS model X) by four viewing methods: (a) original screen-film; (b) soft-copy reading of the digitized film image; (c) hard-copy reading of CR using an imaging plate; and (d) soft-copy reading of CR. For the soft-copy reading, a 17-in. CRT monitor (1024 x 1536 x 8 bits) was used. The phantom image was evaluated using a scoring system outlined in the instruction manual, and observers judged each object using a three-point rating scale: (a) clearly seen; (b) barely seen; and (c) not seen. For statistical analysis, the kappa statistic was employed. For "mass" depiction, interobserver agreement using CR was significantly lower than when using screen-film (p<0.05). There was no significant difference in the kappa value for detecting "microcalcification"; however, the performance score of "microcalcification" on CR hard-copy was significantly lower than on the other three viewing methods (p<0.05). Viewing methods (film or CR, soft-copy or hard-copy) could affect how the phantom image is judged. Paying special attention to viewing conditions is recommended for quality control of CR mammograms. (orig.)

  9. Use of redundant sets of landmark information by humans (Homo sapiens) in a goal-searching task in an open field and on a computer screen.

    Science.gov (United States)

    Sekiguchi, Katsuo; Ushitani, Tomokazu; Sawa, Kosuke

    2018-05-01

    Landmark-based goal-searching tasks that were similar to those for pigeons (Ushitani & Jitsumori, 2011) were provided to human participants to investigate whether they could learn and use multiple sources of spatial information that redundantly indicate the position of a hidden target in both an open field (Experiment 1) and on a computer screen (Experiments 2 and 3). During the training in each experiment, participants learned to locate a target in 1 of 25 objects arranged in a 5 × 5 grid, using two differently colored, arrow-shaped (Experiments 1 and 2) or asymmetrically shaped (Experiment 3) landmarks placed adjacent to the goal and pointing to the goal location. The absolute location and directions of the landmarks varied across trials, but the constant configuration of the goal and the landmarks enabled participants to find the goal using both global configural information and local vector information (pointing to the goal by each individual landmark). On subsequent test trials, the direction was changed for one of the landmarks to conflict with the global configural information. Results of Experiment 1 indicated that participants used vector information from a single landmark but not configural information. Further examinations revealed that the use of global (metric) information was enhanced remarkably by goal searching with nonarrow-shaped landmarks on the computer monitor (Experiment 3) but much less so with arrow-shaped landmarks (Experiment 2). The General Discussion focuses on a comparison between humans in the current study and pigeons in the previous study. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Computer-Aided Evaluation of Screening Mammograms Based on Local Texture Models

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Somol, Petr; Haindl, Michal; Daneš, J.

    2009-01-01

    Roč. 18, č. 4 (2009), s. 765-773 ISSN 1057-7149 R&D Projects: GA ČR GA102/07/1594; GA ČR GA102/08/0593; GA MŠk 1M0572 Grant - others:GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : Screening mammography * texture information * local statistical model * Gaussian mixture Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.848, year: 2009

  11. Designing and testing computer based screening engine for severe sepsis/septic shock.

    Science.gov (United States)

    Herasevich, V; Afessa, B; Chute, C G; Gajic, O

    2008-11-06

    This study addresses the role of a sepsis "sniffer", an automatic screening tool for the timely identification of patients with severe sepsis/septic shock, based on electronic medical records. During the two-month prospective implementation in a medical intensive care unit, 37 of 320 consecutive patients developed severe sepsis/septic shock. The sniffer demonstrated a sensitivity of 48%, a specificity of 86%, and a positive predictive value of 32%. Further improvements are needed prior to the implementation of the sepsis sniffer in clinical practice and research.
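    As a rough consistency check (not taken from the paper), the confusion-matrix counts implied by the reported figures (37 septic patients out of 320; sensitivity 48%, specificity 86%) approximately reproduce the stated positive predictive value of 32%:

```python
# Reconstruct approximate confusion-matrix counts from the reported rates
positives, total = 37, 320
negatives = total - positives          # 283 patients without severe sepsis

tp = round(0.48 * positives)           # true positives flagged by the sniffer
fp = round((1 - 0.86) * negatives)     # false alarms among non-septic patients

ppv = tp / (tp + fp)                   # positive predictive value
print(tp, fp, round(ppv, 2))           # → 18 40 0.31
```

    The reconstructed PPV of ~31% matches the reported 32% to within rounding of the published rates.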

  12. The effects of induced oblique astigmatism on symptoms and reading performance while viewing a computer screen.

    Science.gov (United States)

    Rosenfield, Mark; Hue, Jennifer E; Huang, Rae R; Bababekova, Yuliya

    2012-03-01

    Computer vision syndrome (CVS) is a complex of eye and vision problems related to computer use which has been reported in up to 90% of computer users. Ocular symptoms may include asthenopia, accommodative and vergence difficulties and dry eye. Previous studies have reported that uncorrected astigmatism may have a significant impact on symptoms of CVS. However, its effect on task performance is unclear. This study recorded symptoms after a 10 min period of reading from a computer monitor either through the habitual distance refractive correction or with a supplementary -1.00 or -2.00D oblique cylinder added over these lenses in 12 young, visually-normal subjects. Additionally, the distance correction condition was repeated to assess the repeatability of the symptom questionnaire. Subjects' reading speed and accuracy were monitored during the course of the 10 min trial. There was no significant difference in reading rate or the number of errors between the three astigmatic conditions. However, a significant change in symptoms was reported with the median total symptom scores for the 0, 1 and 2D astigmatic conditions being 2.0, 6.5 and 40.0, respectively (p computer operation. Ophthalmic & Physiological Optics © 2011 The College of Optometrists.

  13. The Danish fetal medicine database

    DEFF Research Database (Denmark)

    Ekelund, Charlotte Kvist; Kopp, Tine Iskov; Tabor, Ann

    2016-01-01

    trimester ultrasound scan performed at all public hospitals in Denmark are registered in the database. Main variables/descriptive data: Data on maternal characteristics, ultrasonic, and biochemical variables are continuously sent from the fetal medicine units’Astraia databases to the central database via...... analyses are sent to the database. Conclusion: It has been possible to establish a fetal medicine database, which monitors first-trimester screening for chromosomal abnormalities and second-trimester screening for major fetal malformations with the input from already collected data. The database...

  14. Mass prophylactic screening of the organized female population using the Thermograph-Computer System

    International Nuclear Information System (INIS)

    Vepkhvadze, R.Ya.; Khvedelidze, E.Sh.

    1984-01-01

    Organizational aspects of the Thermograph-Computer System usage have been analyzed. It has been shown that the results of thermodiagnosis completely coincide with the clinical conclusions, whereas the roentgenological method revealed disease in only 19 of 36 patients. Using the Thermograph-Computer System, 120 women can be examined during the day's operating hours for the early diagnosis of mammary gland diseases. A mobile thermodiagnostic room simultaneously served as an inspection room to discover visual forms of tumor diseases, including diseases of the cervix uteri, and may be used for mass preventive examination of the organized female population

  15. DESIGN AND DEVELOP A COMPUTER AIDED DESIGN FOR AUTOMATIC EXUDATES DETECTION FOR DIABETIC RETINOPATHY SCREENING

    Directory of Open Access Journals (Sweden)

    C. A. SATHIYAMOORTHY

    2016-04-01

    Diabetic retinopathy is a severe and widespread eye disease which can lead to blindness. One of the main signs preceding vision loss is exudates, and blindness could be prevented by applying an early screening process. In existing systems, a Fuzzy C-Means clustering technique is used for detecting exudates for analysis. The main objective of this paper is to improve the efficiency of exudate detection in diabetic retinopathy images. To do this, a Three-Stage (TS) approach is introduced for detecting and extracting exudates automatically from retinal images for diabetic retinopathy screening. The TS approach operates on the image at three levels: pre-processing the image, enhancing the image, and detecting the exudates accurately. After successful detection, the detected exudates are classified using the GLCM method to assess accuracy. The TS approach was implemented in MATLAB, and its performance was evaluated by comparing the results with those of the existing approach and with hand-drawn ground-truth images from an expert ophthalmologist.
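    The three-stage idea (pre-process, enhance, detect) can be caricatured with a toy bright-lesion detector. This is emphatically not the paper's method (which uses Fuzzy C-Means and GLCM classification); it is only a hedged NumPy sketch on synthetic data, exploiting the fact that exudates appear as high-intensity regions in fundus images.

```python
import numpy as np

def detect_bright_lesions(img, thresh_pct=99.0):
    """Toy three-stage pipeline: (1) normalize, (2) percentile contrast
    stretch, (3) threshold the brightest pixels into a candidate mask."""
    g = img.astype(float)
    g = (g - g.min()) / (np.ptp(g) + 1e-9)            # stage 1: normalize
    lo, hi = np.percentile(g, (2, 98))
    g = np.clip((g - lo) / (hi - lo + 1e-9), 0, 1)    # stage 2: stretch
    return g >= np.percentile(g, thresh_pct)          # stage 3: bright mask

# Synthetic "green channel" with one simulated bright exudate patch
rng = np.random.default_rng(0)
fundus_green = rng.random((64, 64)) * 0.5
fundus_green[10:14, 20:24] = 1.0
mask = detect_bright_lesions(fundus_green)
print(bool(mask[10:14, 20:24].all()))  # the patch is recovered → True
```

    A real pipeline would also remove the optic disc (which is similarly bright) and use texture or clustering features, as the paper does.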

  16. Hybrid quadrupole-orbitrap mass spectrometry analysis with accurate-mass database and parallel reaction monitoring for high-throughput screening and quantification of multi-xenobiotics in honey.

    Science.gov (United States)

    Li, Yi; Zhang, Jinzhen; Jin, Yue; Wang, Lin; Zhao, Wen; Zhang, Wenwen; Zhai, Lifei; Zhang, Yaping; Zhang, Yongxin; Zhou, Jinhui

    2016-01-15

    This study reports a rapid, automated screening and quantification method for the determination of multi-xenobiotic residues in honey using ultra-high performance liquid chromatography-hybrid quadrupole-Orbitrap mass spectrometry (UHPLC-Q-Orbitrap) with a user-built accurate-mass database plus parallel reaction monitoring (PRM). The database contains multi-xenobiotic information including formulas, adduct types, theoretical exact mass and retention time, characteristic fragment ions, ion ratios, and mass accuracies. A simple sample preparation method was developed to reduce xenobiotic loss in the honey samples. The screening method was validated based on retention time deviation, mass accuracy via full scan-data-dependent MS/MS (full scan-ddMS2), multi-isotope ratio, characteristic ion ratio, sensitivity, and positive/negative switching performance between the spiked sample and corresponding standard solution. The quantification method based on the PRM mode is a promising new quantitative tool which we validated in terms of selectivity, linearity, recovery (accuracy), repeatability (precision), decision limit (CCα), detection capability (CCβ), matrix effects, and carry-over. The optimized methods proposed in this study enable the automated screening and quantification of 157 compounds in less than 15 min in honey. The results of this study, as they represent a convenient protocol for large-scale screening and quantification, also provide a research approach for analysis of various contaminants in other matrices. Copyright © 2015 Elsevier B.V. All rights reserved.
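    Accurate-mass database matching of the kind described above typically compares measured and theoretical m/z values in parts per million. The following is a minimal sketch assuming a ppm tolerance and a hypothetical example ion (protonated caffeine, [M+H]+ ≈ 195.0877), neither of which is taken from the paper.

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass accuracy in parts per million relative to the database entry."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def matches(measured_mz, theoretical_mz, tol_ppm=5.0):
    """True if the measured peak falls within the ppm tolerance window."""
    return abs(ppm_error(measured_mz, theoretical_mz)) <= tol_ppm

# Hypothetical example: protonated caffeine, monoisotopic [M+H]+ = 195.0877
print(round(ppm_error(195.0882, 195.0877), 1))  # → 2.6 (ppm)
print(matches(195.0882, 195.0877))              # → True
```

    In practice the database lookup also checks retention time, isotope ratios, and characteristic fragment ions, as the abstract describes; the ppm window is only the first filter.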

  17. NIRS database of the original research database

    International Nuclear Information System (INIS)

    Morita, Kyoko

    1991-01-01

    Recently, library staff arranged and compiled the original research papers written by researchers over the 33 years since the National Institute of Radiological Sciences (NIRS) was established. This paper describes how the internal database of original research papers was created. It is a small example of a hand-made database, which has been accumulated by staff members with knowledge of computers and computer programming. (author)

  18. Single reading with computer-aided detection performed by selected radiologists in a breast cancer screening program

    Energy Technology Data Exchange (ETDEWEB)

    Bargalló, Xavier, E-mail: xbarga@clinic.cat [Department of Radiology (CDIC), Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain); Santamaría, Gorane; Amo, Montse del; Arguis, Pedro [Department of Radiology (CDIC), Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain); Ríos, José [Biostatistics and Data Management Core Facility, IDIBAPS, (Hospital Clinic) C/ Mallorca, 183. Floor -1. Office #60. 08036 Barcelona (Spain); Grau, Jaume [Preventive Medicine and Epidemiology Unit, Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain); Burrel, Marta; Cores, Enrique; Velasco, Martín [Department of Radiology (CDIC), Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain)

    2014-11-15

    Highlights: • The cancer detection rate of the screening program improved using a single-reading protocol by experienced radiologists assisted by CAD. • The cancer detection rate improved at the cost of an increased recall rate. • CAD, used by breast radiologists, did not help to detect more cancers. - Abstract: Objectives: To assess the impact of shifting from a standard double reading plus arbitration protocol to a single reading by experienced radiologists assisted by computer-aided detection (CAD) in a breast cancer screening program. Methods: This was a prospective study approved by the ethics committee. Data from 21,321 consecutive screening mammograms in incident rounds (2010–2012) were read following a single reading plus CAD protocol and compared with data from 47,462 consecutive screening mammograms in incident rounds (2004–2010) that were interpreted following a double reading plus arbitration protocol. For the single reading, radiologists were selected on the basis of an appraisal of their previous performance. Results: Period 2010–2012 vs. period 2004–2010: Cancer detection rate (CDR): 6.1‰ (95% confidence interval: 5.1–7.2) vs. 5.25‰; Recall rate (RR): 7.02% (95% confidence interval: 6.7–7.4) vs. 7.24% (selected readers before arbitration) and vs. 3.94% (all readers after arbitration); Positive predictive value of recall: 8.69% vs. 13.32%. Average size of invasive cancers: 14.6 ± 9.5 mm vs. 14.3 ± 9.5 mm. Stage: 0 (22.3/26.1%); I (59.2/50.8%); II (19.2/17.1%); III (3.1/3.3%); IV (0/1.9%). Specialized breast radiologists performed better than general radiologists. Conclusions: The cancer detection rate of the screening program improved using a single-reading protocol by experienced radiologists assisted by CAD, at the cost of a moderate increase in the recall rate, mainly related to the lack of arbitration.

  19. Computational challenges and human factors influencing the design and use of clinical research participant eligibility pre-screening tools

    Directory of Open Access Journals (Sweden)

    Pressler Taylor R

    2012-05-01

    Full Text Available Abstract Background Clinical trials are the primary mechanism for advancing clinical care and evidence-based practice, yet challenges with the recruitment of participants for such trials are widely recognized as a major barrier to these types of studies. Data warehouses (DWs) store large amounts of heterogeneous clinical data that can be used to enhance recruitment practices, but multiple challenges exist when using a data warehouse for such activities, owing to the manner of collection, management, integration, analysis, and dissemination of the data. A critical step in leveraging the DW for recruitment purposes is being able to match trial eligibility criteria to discrete and semi-structured data types in the data warehouse, though trial eligibility criteria tend to be written without concern for their computability. We present the multi-modal evaluation of a web-based tool that can be used for pre-screening patients for clinical trial eligibility and assess the ability of this tool to be practically used for clinical research pre-screening and recruitment. Methods The study used a validation study, usability testing, and a heuristic evaluation to evaluate and characterize the operational characteristics of the software as well as human factors affecting its use. Results Clinical trials from the Division of Cardiology and the Department of Family Medicine were used for this multi-modal evaluation. From the results of the validation study, the software demonstrated a positive predictive value (PPV) of 54.12% and 0.7%, respectively, and a negative predictive value (NPV) of 73.3% and 87.5%, respectively, for the two types of clinical trials. Heuristic principles concerning error prevention and documentation were characterized as the major usability issues during the heuristic evaluation. 
Conclusions This software is intended to provide an initial list of eligible patients to a
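
The reported predictive values follow from simple confusion-matrix arithmetic. A minimal sketch; the counts below are hypothetical, chosen only so that the output reproduces the first trial's reported PPV of 54.12% and NPV of 73.3% (the actual counts are not given in the abstract):

```python
# Sketch: PPV and NPV from confusion-matrix counts.
# tp/fp/tn/fn values are made up for illustration.

def ppv_npv(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    ppv = tp / (tp + fp)  # positive predictive value
    npv = tn / (tn + fn)  # negative predictive value
    return ppv, npv

ppv, npv = ppv_npv(tp=46, fp=39, tn=44, fn=16)
print(f"PPV={ppv:.2%}  NPV={npv:.1%}")  # PPV=54.12%  NPV=73.3%
```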

  20. A TALE-inspired computational screen for proteins that contain approximate tandem repeats.

    Science.gov (United States)

    Perycz, Malgorzata; Krwawicz, Joanna; Bochtler, Matthias

    2017-01-01

    TAL (transcription activator-like) effectors (TALEs) are bacterial proteins that are secreted from bacteria to plant cells to act as transcriptional activators. TALEs and related proteins (RipTALs, BurrH, MOrTL1 and MOrTL2) contain approximate tandem repeats that differ in conserved positions that define specificity. Using PERL, we screened ~47 million protein sequences for TALE-like architecture characterized by approximate tandem repeats (between 30 and 43 amino acids in length) and sequence variability in conserved positions, without requiring sequence similarity to TALEs. Candidate proteins were scored according to their propensity for nuclear localization, secondary structure, repeat sequence complexity, as well as covariation and predicted structural proximity of variable residues. Biological context was tentatively inferred from co-occurrence of other domains and interactome predictions. Approximate repeats with TALE-like features that merit experimental characterization were found in a protein of chestnut blight fungus, a eukaryotic plant pathogen.
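
The core of such a screen, stripped of the scoring stages, is a scan for approximate tandem repeats. A minimal Python sketch (not the authors' PERL implementation): for each candidate period between 30 and 43 residues, the sequence is compared against itself shifted by that period; the 0.5 identity threshold and the TALE-like 33-residue toy repeat unit are illustrative assumptions:

```python
# Sketch: detect approximate tandem repeats by self-comparison at each
# candidate period. Threshold and toy sequence are illustrative only.

def best_tandem_period(seq: str, min_p: int = 30, max_p: int = 43,
                       min_identity: float = 0.5):
    """Return (period, identity) of the best approximate tandem repeat,
    or None if no period clears the identity threshold."""
    best = None
    for p in range(min_p, min(max_p, len(seq) - 1) + 1):
        # Fraction of positions where seq agrees with itself shifted by p
        same = sum(a == b for a, b in zip(seq, seq[p:]))
        identity = same / (len(seq) - p)
        if identity >= min_identity and (best is None or identity > best[1]):
            best = (p, identity)
    return best

# A toy protein of three near-identical copies of a TALE-like 33-aa repeat,
# with one substitution in the last copy
unit = "LTPEQVVAIASHDGGKQALETVQRLLPVLCQAH"
seq = unit + unit + unit.replace("H", "N", 1)
print(best_tandem_period(seq))  # period 33, identity just under 1.0
```

A full screen would then score candidates for nuclear localization, secondary structure, repeat complexity, and covariation of the variable positions, as the abstract describes.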

  1. Positioning graphical objects on computer screens: a three-phase model.

    Science.gov (United States)

    Pastel, Robert

    2011-02-01

    This experiment identifies and models phases during the positioning of graphical objects (called cursors in this article) on computer displays. The human-computer interaction community has traditionally used Fitts' law to model selection in graphical user interfaces, whereas human factors experiments have found the single-component Fitts' law inadequate to model positioning of real objects. Participants (N=145) repeatedly positioned variably sized square cursors within variably sized rectangular targets using computer mice. The times for the cursor to just touch the target, for the cursor to enter the target, and for participants to indicate positioning completion were observed. The positioning tolerances were varied from very precise and difficult to imprecise and easy. The time for the cursor to touch the target was proportional to the initial cursor-target distance. The time for the cursor to completely enter the target after touching was proportional to the logarithm of the cursor size divided by the target tolerance. The time for participants to indicate positioning after entering was inversely proportional to the tolerance. A three-phase model defined by three regions (distant, proximate, and inside the target) was proposed and could model the positioning tasks. The three-phase model provides a framework for ergonomists to evaluate new positioning techniques and can explain their deficiencies. The model provides a means to analyze tasks and enhance interaction during positioning.
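
The three proportionalities above can be combined into a simple additive timing model. The coefficients below are arbitrary placeholders, since the abstract reports the functional forms rather than fitted parameter values:

```python
# Sketch of the three-phase positioning model: travel + entry + verification.
# Coefficients a, b, c are illustrative, not fitted values from the study.
import math

def positioning_time(distance: float, cursor: float, tolerance: float,
                     a: float = 1.0, b: float = 0.2, c: float = 0.3) -> float:
    """Predicted positioning time under the three-phase model.

    Phase 1 (distant):   travel time proportional to cursor-target distance.
    Phase 2 (proximate): entry time proportional to log(cursor / tolerance),
                         assuming cursor >= tolerance.
    Phase 3 (inside):    verification time inversely proportional to tolerance.
    """
    travel = a * distance
    entry = b * math.log2(cursor / tolerance)
    verify = c / tolerance
    return travel + entry + verify

print(positioning_time(distance=100.0, cursor=10.0, tolerance=2.0))
```

As expected from the model, predicted time grows with distance and shrinks as the tolerance loosens.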

  2. Computation of Resonance-Screened Cross Section by the Dorix-Speng System

    Energy Technology Data Exchange (ETDEWEB)

    Haeggblom, H

    1968-09-15

    The report describes a scheme for computation of group cross sections for fast reactors in energy regions where the resonance structure of the cross sections may be dense. A combination of the programmes Dorix and Speng is then used. Dorix calculates group cross sections for each resonance absorber separately. The interaction between resolved resonances in the same isotope is treated using a method described in a separate report. The interaction between correlated and non-correlated resonances in the unresolved region is also considered. By a Dorix calculation we obtain effective microscopic cross sections which are then read in on a library tape. This library contains both point-by-point data and group cross sections and is used in the Speng programme for computation of spectrum and/or macroscopic cross sections. The resonance interaction between different isotopes is computed in Speng by the same method as was used in the Dorix programme for non-correlated unresolved resonances. Consideration is also given to the width of the resonances compared to the energy loss by a neutron colliding with some of the scattering elements.

  3. Computation of Resonance-Screened Cross Section by the Dorix-Speng System

    International Nuclear Information System (INIS)

    Haeggblom, H.

    1968-09-01

    The report describes a scheme for computation of group cross sections for fast reactors in energy regions where the resonance structure of the cross sections may be dense. A combination of the programmes Dorix and Speng is then used. Dorix calculates group cross sections for each resonance absorber separately. The interaction between resolved resonances in the same isotope is treated using a method described in a separate report. The interaction between correlated and non-correlated resonances in the unresolved region is also considered. By a Dorix calculation we obtain effective microscopic cross sections which are then read in on a library tape. This library contains both point-by-point data and group cross sections and is used in the Speng programme for computation of spectrum and/or macroscopic cross sections. The resonance interaction between different isotopes is computed in Speng by the same method as was used in the Dorix programme for non-correlated unresolved resonances. Consideration is also given to the width of the resonances compared to the energy loss by a neutron colliding with some of the scattering elements

  4. Nuclear Criticality Technology and Safety Project parameter study database

    International Nuclear Information System (INIS)

    Toffer, H.; Erickson, D.G.; Samuel, T.J.; Pearson, J.S.

    1993-03-01

    A computerized, knowledge-screened, comprehensive database of the nuclear criticality safety documentation has been assembled as part of the Nuclear Criticality Technology and Safety (NCTS) Project. The database is focused on nuclear criticality parameter studies. The database has been computerized using dBASE III Plus and can be used on a personal computer or a workstation. More than 1300 documents have been reviewed by nuclear criticality specialists over the last 5 years to produce over 800 database entries. Nuclear criticality specialists will be able to access the database and retrieve information about topical parameter studies, authors, and chronology. The database places the accumulated knowledge in the nuclear criticality area over the last 50 years at the fingertips of a criticality analyst

  5. The efficacy of using computer-aided detection (CAD) for detection of breast cancer in mammography screening: a systematic review.

    Science.gov (United States)

    Henriksen, Emilie L; Carlsen, Jonathan F; Vejborg, Ilse Mm; Nielsen, Michael B; Lauridsen, Carsten A

    2018-01-01

    Background Early detection of breast cancer (BC) is crucial in lowering the mortality. Purpose To present an overview of studies concerning computer-aided detection (CAD) in screening mammography for early detection of BC and compare diagnostic accuracy and recall rates (RR) of single reading (SR) with SR + CAD and double reading (DR) with SR + CAD. Material and Methods PRISMA guidelines were used as a review protocol. Articles on clinical trials concerning CAD for detection of BC in a screening population were included. The literature search resulted in 1522 records. A total of 1491 records were excluded by abstract and 18 were excluded by full text reading. A total of 13 articles were included. Results All but two studies from the SR vs. SR + CAD group showed an increased sensitivity and/or cancer detection rate (CDR) when adding CAD. The DR vs. SR + CAD group showed no significant differences in sensitivity and CDR. Adding CAD to SR increased the RR and decreased the specificity in all but one study. For the DR vs. SR + CAD group only one study reported a significant difference in RR. Conclusion All but two studies showed an increase in RR, sensitivity and CDR when adding CAD to SR. Compared to DR no statistically significant differences in sensitivity or CDR were reported. Additional studies based on organized population-based screening programs, with longer follow-up time, high-volume readers, and digital mammography are needed to evaluate the efficacy of CAD.

  6. Virtual Colonoscopy Screening With Ultra Low-Dose CT and Less-Stressful Bowel Preparation: A Computer Simulation Study

    Science.gov (United States)

    Wang, Jing; Wang, Su; Li, Lihong; Fan, Yi; Lu, Hongbing; Liang, Zhengrong

    2008-10-01

    Computed tomography colonography (CTC) or CT-based virtual colonoscopy (VC) is an emerging tool for detection of colonic polyps. Compared to conventional fiber-optic colonoscopy, VC has demonstrated the potential to become a mass screening modality in terms of safety, cost, and patient compliance. However, current CTC delivers excessive X-ray radiation to the patient during data acquisition. The radiation is a major concern for screening applications of CTC. In this work, we performed a simulation study to demonstrate a possible ultra low-dose CT technique for VC. The ultra low-dose abdominal CT images were simulated by adding noise to the sinograms of the patient CTC images acquired with normal-dose scans at 100 mA s levels. The simulated noisy sinogram or projection data were first processed by a Karhunen-Loeve domain penalized weighted least-squares (KL-PWLS) restoration method and then reconstructed by a filtered backprojection algorithm for the ultra low-dose CT images. The patient-specific virtual colon lumen was constructed and navigated by a VC system after electronic colon cleansing of the orally-tagged residual stool and fluid. With KL-PWLS noise reduction, the colon lumen could be successfully constructed and colonic polyps detected at ultra low-dose levels below 50 mA s. Polyps were detected more easily with KL-PWLS noise reduction than with conventional noise filters, such as the Hanning filter. These promising results indicate the feasibility of an ultra low-dose CTC pipeline for colon screening with less-stressful bowel preparation by fecal tagging with oral contrast.

  7. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    Science.gov (United States)

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. The practical electroencephalogram (EEG) measurement is always time-varying and fluctuating, so conventional statistical techniques are not adequate for analyses. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements. © The Author(s) 2016.

  8. Osteoporosis markers on low-dose lung cancer screening chest computed tomography scans predict all-cause mortality

    Energy Technology Data Exchange (ETDEWEB)

    Buckens, C.F. [University Medical Center Utrecht, Radiology Department, Utrecht (Netherlands); University Medical Center Utrecht, Julius Center for Health Sciences and Primary Care, Utrecht (Netherlands); Graaf, Y. van der [University Medical Center Utrecht, Julius Center for Health Sciences and Primary Care, Utrecht (Netherlands); Verkooijen, H.M.; Mali, W.P.; Jong, P.A. de [University Medical Center Utrecht, Radiology Department, Utrecht (Netherlands); Isgum, I.; Mol, C.P. [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands); Verhaar, H.J. [University Medical Center Utrecht, Department of Geriatric Medicine, Utrecht (Netherlands); Vliegenthart, R.; Oudkerk, M. [Medical Center Groningen, Department of Radiology, Utrecht (Netherlands); Aalst, C.M. van; Koning, H.J. de [Erasmus MC Rotterdam, Department of Public Health, Rotterdam (Netherlands)

    2015-01-15

    Further survival benefits may be gained from low-dose chest computed tomography (CT) by assessing vertebral fractures and bone density. We sought to assess the association of CT-measured vertebral fractures and bone density with all-cause mortality in lung cancer screening participants. Following a case-cohort design, lung cancer screening trial participants (N = 3,673) who died (N = 196) during a median follow-up of 6 years (inter-quartile range: 5.7-6.3) were identified and added to a random sample of N = 383 from the trial. We assessed vertebral fractures using Genant's semiquantitative method on sagittal reconstructions and measured bone density (Hounsfield Units (HU)) in vertebrae. Cox proportional hazards modelling was used to determine if vertebral fractures or bone density were independently predictive of mortality. The prevalence of vertebral fractures was 35 % (95 % confidence interval 30-40 %) among survivors and 51 % (44-58 %) amongst cases. After adjusting for age, gender, smoking status, pack years smoked, coronary and aortic calcium volume and pulmonary emphysema, the adjusted hazard ratio (HR) for vertebral fracture was 2.04 (1.43-2.92). For each 10 HU decline in trabecular bone density, the adjusted HR was 1.08 (1.02-1.15). Vertebral fractures and bone density are independently associated with all-cause mortality. (orig.)

  9. Comparison of standard reading and computer aided detection (CAD) on a national proficiency test of screening mammography

    International Nuclear Information System (INIS)

    Ciatto, Stefano; Del Turco, Marco Rosselli; Risso, Gabriella; Catarzi, Sandra; Bonardi, Rita; Viterbo, Valeria; Gnutti, Pierangela; Guglielmoni, Barbara; Pinelli, Lelio; Pandiscia, Anna; Navarra, Francesco; Lauria, Adele; Palmiero, Rosa; Indovina, Pietro Luigi

    2003-01-01

    Objective: To evaluate the role of computer aided detection (CAD) in improving the interpretation of screening mammograms Material and methods: Ten radiologists underwent a proficiency test of screening mammography first by conventional reading and then with the help of CAD. Radiologists were blinded to test results for the whole study duration. Results of conventional and CAD reading were compared in terms of sensitivity and recall rate. Double reading was simulated combining conventional readings of four expert radiologists and compared with CAD reading. Results: Considering all ten readings, cancer was identified in 146 or 153 of 170 cases (85.8 vs. 90.0%; χ²=0.99, df=1, P=0.31) and recalls were 106 or 152 of 1330 cases (7.9 vs. 11.4%; χ²=8.69, df=1, P=0.003) at conventional or CAD reading, respectively. CAD reading was essentially the same (sensitivity 97.0 vs. 96.0%; χ²=7.1, df=1, P=0.93; recall rate 10.7 vs. 10.6%; χ²=1.5, df=1, P=0.96) as compared with simulated conventional double reading. Conclusion: CAD reading seems to improve the sensitivity of conventional reading while reducing specificity, both effects being of limited size. CAD reading had almost the same performance of simulated conventional double reading, suggesting a possible use of CAD which needs to be confirmed by further studies inclusive of cost-effective analysis
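
The recall-rate comparison above (106 vs. 152 recalls out of 1330 readings) is consistent with the reported χ²=8.69, P=0.003 if a Yates-corrected chi-squared test is assumed. A minimal stdlib sketch, using the df = 1 identity P(X > x) = erfc(√(x/2)):

```python
# Sketch: Yates-corrected chi-squared test on a 2x2 contingency table.
import math

def chi2_2x2_yates(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    """Chi-squared statistic (with Yates continuity correction) and
    p-value for the 2x2 table [[a, b], [c, d]] (df = 1)."""
    n = a + b + c + d
    chi2 = n * (abs(a * d - b * c) - n / 2) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # df=1 survival function
    return chi2, p

# Rows: conventional vs. CAD reading; columns: recalled vs. not recalled
chi2, p = chi2_2x2_yates(106, 1224, 152, 1178)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # close to the reported 8.69, 0.003
```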

  10. Osteoporosis markers on low-dose lung cancer screening chest computed tomography scans predict all-cause mortality

    International Nuclear Information System (INIS)

    Buckens, C.F.; Graaf, Y. van der; Verkooijen, H.M.; Mali, W.P.; Jong, P.A. de; Isgum, I.; Mol, C.P.; Verhaar, H.J.; Vliegenthart, R.; Oudkerk, M.; Aalst, C.M. van; Koning, H.J. de

    2015-01-01

    Further survival benefits may be gained from low-dose chest computed tomography (CT) by assessing vertebral fractures and bone density. We sought to assess the association of CT-measured vertebral fractures and bone density with all-cause mortality in lung cancer screening participants. Following a case-cohort design, lung cancer screening trial participants (N = 3,673) who died (N = 196) during a median follow-up of 6 years (inter-quartile range: 5.7-6.3) were identified and added to a random sample of N = 383 from the trial. We assessed vertebral fractures using Genant's semiquantitative method on sagittal reconstructions and measured bone density (Hounsfield Units (HU)) in vertebrae. Cox proportional hazards modelling was used to determine if vertebral fractures or bone density were independently predictive of mortality. The prevalence of vertebral fractures was 35 % (95 % confidence interval 30-40 %) among survivors and 51 % (44-58 %) amongst cases. After adjusting for age, gender, smoking status, pack years smoked, coronary and aortic calcium volume and pulmonary emphysema, the adjusted hazard ratio (HR) for vertebral fracture was 2.04 (1.43-2.92). For each 10 HU decline in trabecular bone density, the adjusted HR was 1.08 (1.02-1.15). Vertebral fractures and bone density are independently associated with all-cause mortality. (orig.)

  11. Computational redesign of bacterial biotin carboxylase inhibitors using structure-based virtual screening of combinatorial libraries.

    Science.gov (United States)

    Brylinski, Michal; Waldrop, Grover L

    2014-04-02

    As the spread of antibiotic resistant bacteria steadily increases, there is an urgent need for new antibacterial agents. Because fatty acid synthesis is only used for membrane biogenesis in bacteria, the enzymes in this pathway are attractive targets for antibacterial agent development. Acetyl-CoA carboxylase catalyzes the committed and regulated step in fatty acid synthesis. In bacteria, the enzyme is composed of three distinct protein components: biotin carboxylase, biotin carboxyl carrier protein, and carboxyltransferase. Fragment-based screening revealed that amino-oxazole inhibits biotin carboxylase activity and also exhibits antibacterial activity against Gram-negative organisms. In this report, we redesigned previously identified lead inhibitors to expand the spectrum of bacteria sensitive to the amino-oxazole derivatives by including Gram-positive species. Using 9,411 small organic building blocks, we constructed a diverse combinatorial library of 1.2×10⁸ amino-oxazole derivatives. A subset of 9×10⁶ of these compounds were subjected to structure-based virtual screening against seven biotin carboxylase isoforms using similarity-based docking by eSimDock. Potentially broad-spectrum antibiotic candidates were selected based on the consensus ranking by several scoring functions including non-linear statistical models implemented in eSimDock and traditional molecular mechanics force fields. The analysis of binding poses of the top-ranked compounds docked to biotin carboxylase isoforms suggests that: (1) binding of the amino-oxazole anchor is stabilized by a network of hydrogen bonds to residues 201, 202 and 204; (2) halogenated aromatic moieties attached to the amino-oxazole scaffold enhance interactions with a hydrophobic pocket formed by residues 157, 169, 171 and 203; and (3) larger substituents reach deeper into the binding pocket to form additional hydrogen bonds with the side chains of residues 209 and 233. These structural insights into drug

  12. Computer-delivered indirect screening and brief intervention for drug use in the perinatal period: A randomized trial.

    Science.gov (United States)

    Ondersma, Steven J; Svikis, Dace S; Thacker, Casey; Resnicow, Ken; Beatty, Jessica R; Janisse, James; Puder, Karoline

    2018-04-01

    Under-reporting of drug use in the perinatal period is well-documented, and significantly limits the reach of proactive intervention approaches. The Wayne Indirect Drug Use Screener (WIDUS) focuses on correlates of drug use rather than use itself. This trial tested a computer-delivered, brief intervention designed for use with indirect screen-positive cases, seeking to motivate reductions in drug use without presuming its presence. Randomized clinical trial with 500 WIDUS-positive postpartum women recruited between August 14, 2012 and November 19, 2014. Participants were randomly assigned to either a time control condition or a single-session, tailored, indirect brief intervention. The primary outcome was days of drug use over the 6-month follow-up period; secondary outcomes included urine and hair analyses results at 3- and 6-month follow-up. All outcomes were measured by blinded evaluators. Of the 500 participants (252 intervention and 248 control), 36.1% of participants acknowledged drug use in the 3 months prior to pregnancy, but 89% tested positive at the 6-month follow-up. Participants rated the intervention as easy to use (4.9/5) and helpful (4.4/5). Analyses revealed no between-group differences in drug use (52% in the intervention group, vs. 53% among controls; OR 1.03). Exploratory analyses also showed that intervention effects were not moderated by baseline severity, WIDUS score, or readiness to change. The present trial showed no evidence of efficacy for an indirect, single-session, computer-delivered, brief intervention designed as a complement to indirect screening. More direct approaches that still do not presume active drug use may be possible and appropriate. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Computational screening and molecular dynamics simulation of disease associated nsSNPs in CENP-E

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Ambuj [Bioinformatics Division, School of Bio Sciences and Technology, Vellore Institute of Technology University, Vellore 632014, Tamil Nadu (India); Purohit, Rituraj, E-mail: riturajpurohit@gmail.com [Bioinformatics Division, School of Bio Sciences and Technology, Vellore Institute of Technology University, Vellore 632014, Tamil Nadu (India)

    2012-10-15

    Aneuploidy and chromosomal instability (CIN) are hallmarks of most solid tumors. Mutations in centromere proteins have been observed to promote aneuploidy and tumorigenesis. Recent studies reported that centromere-associated protein E (CENP-E) is involved in inducing cancers. In this study we investigated the pathogenic effect of 132 nsSNPs reported in CENP-E using a computational platform. The Y63H point mutation was found to be associated with cancer using the SIFT, PolyPhen, PhD-SNP, MutPred, CanPredict and Dr. Cancer tools. Further, we investigated the binding affinity of the ATP molecule for the CENP-E motor domain. Complementarity scores obtained from docking studies showed a significant loss in the ATP binding affinity of the mutant structure. Molecular dynamics simulation was carried out to examine the structural consequences of the Y63H mutation. Root mean square deviation (RMSD), root mean square fluctuation (RMSF), radius of gyration (Rg), solvent accessible surface area (SASA), energy value, hydrogen bond (NH bond), eigenvector projection, trace of covariance matrix and atom density analyses showed a notable loss in stability for the mutant structure. The Y63H mutation was also shown to disrupt the native conformation of the ATP binding region in the CENP-E motor domain. Docking studies for the remaining 18 mutations at residue position 63, as well as two other computationally predicted disease-associated mutations, S22L and P69S, were also carried out to investigate their effect on the ATP binding affinity of the CENP-E motor domain. Our study provides a promising computational methodology for studying the tumorigenic consequences of uncharacterized nsSNPs and a clear clue for wet-lab scientists.
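
Of the trajectory metrics listed above, RMSD is the simplest to illustrate: per frame, it is the root-mean-square deviation of atom positions from a reference structure. A minimal sketch with toy coordinates, omitting the rigid-body superposition that a full MD analysis would perform first:

```python
# Sketch: RMSD between two conformations of the same atoms.
# Coordinates are toy values; real analyses superpose structures first.
import math

def rmsd(coords_a, coords_b):
    """RMSD between two equal-length lists of (x, y, z) positions."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

ref = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]
frame = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (1.5, 1.5, 0.2)]
print(f"RMSD = {rmsd(ref, frame):.3f} Å")
```

Plotting this value over every frame of wild-type and mutant trajectories gives the kind of stability comparison the study describes.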

  14. Computational screening and molecular dynamics simulation of disease associated nsSNPs in CENP-E

    International Nuclear Information System (INIS)

    Kumar, Ambuj; Purohit, Rituraj

    2012-01-01

    Aneuploidy and chromosomal instability (CIN) are hallmarks of most solid tumors. Mutations in centromere proteins have been observed to promote aneuploidy and tumorigenesis. Recent studies reported that centromere-associated protein E (CENP-E) is involved in inducing cancers. In this study we investigated the pathogenic effect of 132 nsSNPs reported in CENP-E using a computational platform. The Y63H point mutation was found to be associated with cancer using the SIFT, PolyPhen, PhD-SNP, MutPred, CanPredict and Dr. Cancer tools. Further, we investigated the binding affinity of the ATP molecule for the CENP-E motor domain. Complementarity scores obtained from docking studies showed a significant loss in the ATP binding affinity of the mutant structure. Molecular dynamics simulation was carried out to examine the structural consequences of the Y63H mutation. Root mean square deviation (RMSD), root mean square fluctuation (RMSF), radius of gyration (Rg), solvent accessible surface area (SASA), energy value, hydrogen bond (NH bond), eigenvector projection, trace of covariance matrix and atom density analyses showed a notable loss in stability for the mutant structure. The Y63H mutation was also shown to disrupt the native conformation of the ATP binding region in the CENP-E motor domain. Docking studies for the remaining 18 mutations at residue position 63, as well as two other computationally predicted disease-associated mutations, S22L and P69S, were also carried out to investigate their effect on the ATP binding affinity of the CENP-E motor domain. Our study provides a promising computational methodology for studying the tumorigenic consequences of uncharacterized nsSNPs and a clear clue for wet-lab scientists.

  15. Value of computed tomography as a screening examination of pancreatic cancer

    International Nuclear Information System (INIS)

    Honda, Hiroshi; Watanabe, Katsushi; Nishikawa, Kiyoshi

    1983-01-01

    The abdominal CT films of 50 patients were reviewed by ten radiologists to evaluate the role of CT examination in screening for pancreatic cancer. The 50 patients consisted of 10 with pancreatic cancer, 8 with other pancreatic abnormalities, and 32 with a normal pancreas. The ten radiologists were divided into two groups according to their experience in evaluating CT examinations: an experienced group and an inexperienced group. In the detection of pancreatic abnormality, the experienced group showed a sensitivity of 72.2% and a specificity of 86.2%; the inexperienced group showed a sensitivity of 70.9% and a specificity of 72.0%. In the detection of pancreatic cancer, the experienced group showed a sensitivity of 62.0% and a specificity of 83.4%; the inexperienced group showed a sensitivity of 66.0% and a specificity of 81.8%. In the localization of pancreatic cancer, there was no difference between the two groups. Pancreatic abnormality can be detected with high accuracy, but characterizing an abnormality as pancreatic cancer is difficult. Experience in evaluating CT examinations raises the detectability of pancreatic abnormality but not the detectability of pancreatic cancer. These results suggest the difficulty of diagnosing pancreatic cancer. (author)

  16. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis

    Science.gov (United States)

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A.; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-01-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications. PMID:27325228

  17. A review of the archaeological evidence for food plants from the British Isles: an example of the use of the Archaeobotanical Computer Database (ABCD)

    Directory of Open Access Journals (Sweden)

    Philippa Tomlinson

    1996-09-01

    Full Text Available The Archaeobotanical Computer Database is an electronic compilation of information about remains of plants from archaeological deposits throughout the British Isles. For the first time, this wealth of published data, much of it post-dating Godwin's (1975) History of the British Flora, has been brought together in a form in which the user can explore the history of a particular species or group of plants, or investigate the flora and vegetation of a particular archaeological period or part of the British Isles. The database contains information about the sites, deposits and samples from which the remains in question have been recovered, together with details of the plant parts identified and their mode of preservation. It also provides some interpretative guidance concerning the integrity of contexts and the reliability of dating as an aid to judging the quality of the data available. In this paper the compilers of the ABCD make use of the database in order to review the archaeological evidence for food plants in the British Isles. The paper begins with a definition of its scope, examining the concept of a "food plant" and the taphonomy of plant remains on British archaeological sites. It then summarises the principal changes in food plants from the prehistoric period to post-medieval times. The body of the paper is a detailed discussion of the evidence for the use of berries, other fruits, vegetables, pulses, herbs and flavourings, oil plants, cereals and nuts. Finally, the paper compares the archaeological evidence with that known from documentary sources. Readers will be able to view the archaeological evidence as distribution maps and will be able to explore aspects of the database online, enabling queries by taxa, site or worker. Instructions on obtaining electronic copies of the database tables and registering as an ABCD user are also included.

  18. Development and initial testing of a computer-based patient decision aid to promote colorectal cancer screening for primary care practice

    Directory of Open Access Journals (Sweden)

    Fowler Beth

    2005-11-01

    Full Text Available Abstract Background Although colorectal cancer screening is recommended by major policy-making organizations, rates of screening remain low. Our aim was to develop a patient-directed, computer-based decision aid about colorectal cancer screening and investigate whether it could increase patient interest in screening. Methods We used content from evidence-based literature reviews and our previous decision aid research to develop a prototype. We performed two rounds of usability testing with representative patients to revise the content and format. The final decision aid consisted of an introductory segment, four test-specific segments, and information to allow comparison of the tests across several key parameters. We then conducted a before-after uncontrolled trial of 80 patients 50–75 years old recruited from an academic internal medicine practice. Results Mean viewing time was 19 minutes. The decision aid improved patients' intent to ask providers for screening from a mean score of 2.8 (1 = not at all likely to ask, 4 = very likely to ask) before viewing the decision aid to 3.2 afterwards (difference, 0.4). Conclusion We conclude that a computer-based decision aid can increase patient intent to be screened and increase interest in screening. Practice Implications: This decision aid can be viewed by patients prior to provider appointments to increase motivation to be screened and to help them decide about which modality to use for screening. Further work is required to integrate the decision aid with other practice change strategies to raise screening rates to target levels.

  19. Computationally assisted screening and design of cell-interactive peptides by a cell-based assay using peptide arrays and a fuzzy neural network algorithm.

    Science.gov (United States)

    Kaga, Chiaki; Okochi, Mina; Tomita, Yasuyuki; Kato, Ryuji; Honda, Hiroyuki

    2008-03-01

    We developed a method of effective peptide screening that combines experiments and computational analysis. The method is based on the concept that screening efficiency can be enhanced from even limited data by use of a model derived from computational analysis that serves as a guide to screening and combining the model with subsequent repeated experiments. Here we focus on cell-adhesion peptides as a model application of this peptide-screening strategy. Cell-adhesion peptides were screened by use of a cell-based assay of a peptide array. Starting with the screening data obtained from a limited, random 5-mer library (643 sequences), a rule regarding structural characteristics of cell-adhesion peptides was extracted by fuzzy neural network (FNN) analysis. According to this rule, peptides with unfavored residues in certain positions that led to inefficient binding were eliminated from the random sequences. In the restricted, second random library (273 sequences), the yield of cell-adhesion peptides having an adhesion rate more than 1.5-fold that of the basal array support was significantly higher (31%) than in the unrestricted random library (20%). In the restricted third library (50 sequences), the yield of cell-adhesion peptides increased to 84%. We conclude that a repeated cycle of experiments screening limited numbers of peptides can be assisted by the rule-extracting feature of FNN.
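
    The iterative restrict-and-rescreen loop described in this abstract can be sketched as follows. This is a minimal illustration: the residue rule, flagged positions, and scoring are hypothetical stand-ins (the actual rule in the study was extracted by a fuzzy neural network from cell-adhesion assay data).

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Hypothetical rule of the kind an FNN might extract: residues at
# these 0-based positions were associated with poor adhesion, so
# peptides containing them are excluded from the next library.
UNFAVORED = {0: set("DE"), 2: set("P")}

def random_library(n, length=5, rule=None, seed=0):
    """Generate n random peptides; with a rule, reject sequences
    carrying an unfavored residue at a flagged position."""
    rng = random.Random(seed)
    library = []
    while len(library) < n:
        pep = "".join(rng.choice(AMINO_ACIDS) for _ in range(length))
        if rule and any(pep[i] in bad for i, bad in rule.items()):
            continue  # restricted library: skip unfavored sequences
        library.append(pep)
    return library

# Round 1: unrestricted random library (assayed experimentally).
first = random_library(643)
# Round 2: library restricted by the rule learned from round 1; in the
# study, successive restricted rounds raised the hit yield from 20% to 84%.
second = random_library(273, rule=UNFAVORED)
```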

  20. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    Full Text Available The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10’s activity in multiple rodent models, which is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to those of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  1. Exploring Natural Products from the Biodiversity of Pakistan for Computational Drug Discovery Studies: Collection, Optimization, Design and Development of A Chemical Database (ChemDP).

    Science.gov (United States)

    Mirza, Shaher Bano; Bokhari, Habib; Fatmi, Muhammad Qaiser

    2015-01-01

    Pakistan possesses a rich and vast source of natural products (NPs). Some of these secondary metabolites have been identified as potent therapeutic agents. However, the medicinal usage of most of these compounds has not yet been fully explored. The discovery of new scaffolds of NPs as inhibitors of certain enzymes or receptors using advanced computational drug discovery approaches is also limited by the unavailability of accurate 3D structures of NPs. An organized database incorporating all relevant information can therefore facilitate exploration of the medicinal importance of metabolites from Pakistani biodiversity. The Chemical Database of Pakistan (ChemDP; release 01) is a fully referenced, evolving, web-based, virtual database which has been designed and developed to introduce natural products (NPs) and their derivatives from the biodiversity of Pakistan to the global scientific community. The prime aim is to provide quality structures of compounds with relevant information for computer-aided drug discovery studies. For this purpose, over 1000 NPs have been identified from more than 400 published articles, for which 2D and 3D molecular structures have been generated with a special focus on their stereochemistry, where applicable. The PM7 semiempirical quantum chemistry method has been used to energy-optimize the 3D structures of the NPs. The 2D and 3D structures can be downloaded as .sdf, .mol, .sybyl, .mol2, and .pdb files, formats readable by many chemoinformatics/bioinformatics software packages. Each entry in ChemDP contains over 100 data fields representing various molecular, biological, physico-chemical and pharmacological properties, which have been properly documented in the database for end users. These pieces of information have been either manually extracted from the literature or computationally calculated using various computational tools. Cross-referencing to a major data repository, i.e., ChemSpider, has been made available for overlapping

  2. Trialling computer touch-screen technology to assess psychological distress in patients with gynaecological cancer

    Directory of Open Access Journals (Sweden)

    Georgia Halkett

    2010-12-01

    Full Text Available Background Cancer impacts on the psychological well-being of many cancer patients. Appropriate tools can be used to assist health professionals in identifying patient needs and psychological distress. Recent research suggests that touchscreen technology can be used to administer surveys. The aim of this study was to evaluate the use of a touchscreen system in comparison to written questionnaires in a large tertiary hospital in Western Australia (WA). Method Patients who were scheduled to commence treatment for gynaecological cancer participated in this study. Patients were assigned to complete either a written questionnaire or the same survey using the touchscreen technology. Both methods of survey contained the same scales. All participants were asked to complete a follow-up patient satisfaction survey. Semi-structured interviews were conducted with health professionals to elicit views about the implementation of the technology and the available referral pathways. Data were analysed using descriptive statistics and content analysis. Results Thirty patients completed the touchscreen questionnaires and an equal number completed the survey on paper. Participants who used the touchscreens were not significantly more satisfied than other participants. Four themes were noted in the interviews with health professionals: usability of technology, patients’ acceptance of technology, advantages of psychological screening and the value of the instruments included. Conclusion Although previous studies report that computerised assessments are a feasible option for assessing cancer patients’ needs, the data collected in this study demonstrate that the technology was not reliable and posed significant practical problems. The technology did not serve these patients better than pen and paper.

  3. Exploration of the associations of touch-screen tablet computer usage and musculoskeletal discomfort.

    Science.gov (United States)

    Chiang, Hsin-Yu Ariel; Liu, Chien-Hsiou

    2016-03-10

    Tablet users may be at high risk of developing physical discomfort because of their usage behaviors and tablet design. This study investigated the usage of tablets, variations in head and neck posture associated with different tablet tilt angles, and the association of tablet use with users' musculoskeletal discomfort. A questionnaire survey of users' subjective perceptions and measurements of users' postures with a 3D motion analysis system were used to explore the effects of tablet use. The questionnaire results indicated that over half of the participants reported physical discomfort after using tablets, with the most prevalent discomfort in the neck and shoulders, and greater intensity of discomfort in the back, although only a few participants experienced it. Chi-squared tests indicated that significantly more participants who tended to use tablet computers to play games reported having musculoskeletal discomfort after using a tablet. In addition, preferences for tablet tilt angles varied across tasks (reading and game playing). The results from the 3D motion analysis revealed that head and neck flexion angles were significantly reduced when the tablets were positioned at relatively steep tilt angles. Neck flexion angle was significantly higher in game playing. These data add information regarding the usage of tablets and its association with physical discomfort. Steep tilt angles (such as 60°) may cause tablet users to decrease their head and neck flexion angles, which could lead to a more neutral, effortless, and ergonomically correct posture. Maintaining proper neck posture during active activities such as game playing is recommended to avoid neck discomfort.

  4. Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

    Science.gov (United States)

    Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M

    2014-04-01

    Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining high quality patient care promoted by professional societies in radiology and accreditation organizations such as the American College of Radiology (ACR) and Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
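
    The auto-population idea above can be sketched with Python and SQLite standing in for the PHP/MySQL stack described in the abstract. The report template fields, the regular expression, and the table layout below are hypothetical, not the department's actual schema.

```python
import re
import sqlite3

# Hypothetical structured-report text; actual RIS extraction and
# template layout differ per department.
REPORT = """PROCEDURE: CT-guided lung biopsy
OPERATOR: Dr. Example
COMPLICATION: none
DIAGNOSTIC YIELD: positive"""

def parse_report(text):
    """Pull 'LABEL: value' fields out of a structured report."""
    fields = {}
    for line in text.splitlines():
        m = re.match(r"([A-Z ]+):\s*(.+)", line)
        if m:
            fields[m.group(1).strip().lower()] = m.group(2).strip()
    return fields

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE procedure_log (
    procedure_name TEXT, operator TEXT,
    complication TEXT, diagnostic_yield TEXT)""")

f = parse_report(REPORT)
conn.execute("INSERT INTO procedure_log VALUES (?, ?, ?, ?)",
             (f["procedure"], f["operator"], f["complication"],
              f["diagnostic yield"]))

# Real-time report: complication rate and diagnostic yield by procedure.
rate = conn.execute("""SELECT procedure_name,
       AVG(complication != 'none') AS complication_rate,
       AVG(diagnostic_yield = 'positive') AS yield_rate
       FROM procedure_log GROUP BY procedure_name""").fetchall()
```

Structured templates make this kind of regex extraction reliable, which matches the abstract's observation that template use significantly improved auto-population accuracy.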

  5. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine

    Science.gov (United States)

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-01

    The influenza A virus is a major threat to human health, and its various subtypes make drug development difficult. With the development of state-of-the-art computational chemistry, computational molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking for influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding abilities with neuraminidase from H1N1 (A/PR/8/34). Further details showed that the structural features of the molecules might be helpful for further drug design and development. Experiments in vitro and in vivo validated the anti-influenza effects of quercetin and chlorogenic acid, which showed protective effects comparable to those of zanamivir. Taken together, we propose that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1.

  6. IPTV Service Framework Based on Secure Authentication and Lightweight Content Encryption for Screen-Migration in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2015-01-01

    Full Text Available These days, the advancing capabilities of smart devices (e.g., smartphones, tablets, PCs) and the increase in internet bandwidth enable IPTV service providers to extend their services to smart mobile devices. Users can receive their IPTV service on any smart device by accessing the internet via a wireless network from anywhere, at any time, which is convenient for users. However, wireless network communication has well-known critical security threats and vulnerabilities affecting user smart devices and IPTV services, such as user identity theft, replay attacks, man-in-the-middle attacks, and so forth. A secure authentication mechanism for user devices and a multimedia protection mechanism are necessary to protect both user devices and IPTV services. As a result, we propose a framework for IPTV service based on a secure authentication mechanism and a lightweight content encryption method for screen migration in cloud computing. We use a cryptographic nonce combined with the user ID and password to authenticate the user device at any mobile terminal it passes by. In addition, we use lightweight content encryption to protect content and reduce the content-decoding overhead at mobile terminals. Our proposed authentication mechanism reduces computational processing by 30% compared to other authentication mechanisms, and our lightweight content encryption reduces encryption delay to 0.259 seconds.
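
    The nonce-plus-credentials authentication can be illustrated with a standard HMAC challenge-response. The key derivation and message layout below are illustrative assumptions, not the paper's exact protocol.

```python
import hashlib
import hmac
import os

def derive_key(user_id: str, password: str) -> bytes:
    # Shared secret derived from user ID and password (illustrative;
    # a real deployment would use a proper KDF such as PBKDF2).
    return hashlib.sha256(f"{user_id}:{password}".encode()).digest()

def server_challenge() -> bytes:
    return os.urandom(16)  # a fresh nonce defeats replay attacks

def client_response(key: bytes, nonce: bytes) -> bytes:
    # The device proves knowledge of the secret without sending it.
    return hmac.new(key, nonce, hashlib.sha256).digest()

def server_verify(key: bytes, nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

key = derive_key("alice", "s3cret")
nonce = server_challenge()
response = client_response(key, nonce)
assert server_verify(key, nonce, response)
# Replaying the old response against a new nonce fails:
assert not server_verify(key, server_challenge(), response)
```

Because each session uses a fresh random nonce, a captured response is useless for the replay attacks the abstract lists among the wireless threats.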

  7. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study.

    Science.gov (United States)

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien; Hwang, Juey-Jen; Ho, Yi-Lwun

    2017-09-26

    Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician's ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. ©Ying-Hsien Chen, Chi-Sheng Hung, Ching-Chang Huang, Yu-Chien Hung, Juey-Jen Hwang, Yi-Lwun Ho. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 26.09.2017.
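
    The reported accuracy figures follow from a standard confusion matrix. The raw counts below are inferred from the abstract's percentages (21 of 22 AF residents detected, 879 of 900 non-AF residents correctly cleared) and are therefore an assumption, not data taken directly from the paper.

```python
# Inferred confusion-matrix counts (assumption, see above).
tp, fn = 21, 1      # AF residents: detected / missed by the algorithm
tn, fp = 879, 21    # non-AF residents: correctly cleared / flagged

sensitivity = tp / (tp + fn)   # true-positive rate
specificity = tn / (tn + fp)   # true-negative rate

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
# → sensitivity 95.5%, specificity 97.7%
```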

  8. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study

    Science.gov (United States)

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien

    2017-01-01

    Background Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. Objective The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. Methods We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Results Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician’s ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. Conclusions AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. PMID:28951384

  9. Database Dictionary for Ethiopian National Ground-Water Database (ENGDA) Data Fields

    Science.gov (United States)

    Kuniansky, Eve L.; Litke, David W.; Tucci, Patrick

    2007-01-01

    Introduction This document describes the data fields that are used for both field forms and the Ethiopian National Ground-water Database (ENGDA) tables associated with information stored about production wells, springs, test holes, test wells, and water-level or water-quality observation wells. Several different words are used in this database dictionary and in the ENGDA database to describe a narrow shaft constructed in the ground. The most general term is borehole, which is applicable to any type of hole. A well is a borehole specifically constructed to extract water from the ground; however, for this data dictionary and for the ENGDA database, the words well and borehole are used interchangeably. A production well is defined as any well used for water supply and includes hand-dug wells, small-diameter bored wells equipped with hand pumps, or large-diameter bored wells equipped with large-capacity motorized pumps. Test holes are borings made to collect information about the subsurface, with continuous or non-continuous core and/or where geophysical logs are collected. Test holes are not converted into wells. A test well is a well constructed for hydraulic testing of an aquifer in order to plan a larger ground-water production system. A water-level or water-quality observation well is a well that is used to collect information about an aquifer and is not used for water supply. A spring is any naturally flowing, local, ground-water discharge site. The database dictionary is designed to help define all fields on both the field data collection forms (provided in attachment 2 of this report) and the ENGDA software screen entry forms (described in Litke, 2007). The data entered into each screen entry field are stored in relational database tables within the computer database. The database dictionary is organized around field data collection and the field forms, because these are what the majority of people will use. After each field, however, the
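
    The site-type terminology defined above maps naturally onto a relational schema. The sketch below (table and field names are hypothetical, not ENGDA's actual schema) shows one way to encode the distinctions so that entry forms cannot record an undefined site type.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE site (
    site_id   INTEGER PRIMARY KEY,
    -- A spring is a natural discharge point; every other type is a
    -- borehole in the dictionary's sense.
    site_type TEXT NOT NULL CHECK (site_type IN
        ('production well', 'test hole', 'test well',
         'observation well', 'spring')),
    name      TEXT,
    depth_m   REAL  -- NULL for springs
)""")

conn.execute("INSERT INTO site VALUES (1, 'production well', 'Village well A', 45.0)")
conn.execute("INSERT INTO site VALUES (2, 'spring', 'Hillside spring', NULL)")

# The CHECK constraint rejects site types the dictionary does not define.
try:
    conn.execute("INSERT INTO site VALUES (3, 'shaft', 'x', NULL)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```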

  10. A malaria diagnostic tool based on computer vision screening and visualization of Plasmodium falciparum candidate areas in digitized blood smears.

    Directory of Open Access Journals (Sweden)

    Nina Linder

    Full Text Available INTRODUCTION: Microscopy is the gold standard for diagnosis of malaria; however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. METHODS: Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. RESULTS: The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers respectively using the diagnostic tool. Parasitemia was separately calculated by the automated system and the correlation coefficient between manual and automated parasitemia counts was 0.97. CONCLUSION: We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for
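
    Of the image features named above, the local binary pattern is simple enough to sketch directly. This toy implementation (plain 8-neighbour LBP over a grayscale patch, no rotation invariance) only illustrates the kind of feature vector fed to the support vector machine, not the study's exact pipeline.

```python
import numpy as np

def lbp_image(gray):
    """8-neighbour local binary pattern for each interior pixel:
    every neighbour >= centre contributes one bit of an 8-bit code."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = gray.shape
    centre = gray[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return codes

def lbp_histogram(gray):
    """256-bin normalized histogram: the texture feature vector for a
    candidate region, suitable as classifier input."""
    hist = np.bincount(lbp_image(gray).ravel(), minlength=256)
    return hist / hist.sum()

# Stand-in for a parasite candidate region cropped from a scanned smear.
patch = np.random.default_rng(0).integers(0, 256, (32, 32))
features = lbp_histogram(patch)
```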

  11. Screening Belief: The Life of Pi, Computer Generated Imagery, and Religious Imagination

    Directory of Open Access Journals (Sweden)

    Rachel Wagner

    2016-07-01

    Full Text Available Ang Lee’s The Life of Pi is based on Yann Martel’s novel of the same name. The film expands upon the novel’s fantastic story through the integration of new visual metaphors that invite religious reflection, and is reinforced by religious rituals within and beyond the film itself. Martel’s novel invites readers to believe Pi’s story without seeing it. Viewers of the film, by contrast, are invited to believe Pi’s story precisely because they are seeing it so vividly. Ang Lee constructs a filmic world using such elaborately developed CGI (computer-generated imagery that the film exhibits only a vestigial relationship to the real-life animals and locations used in its creation. Indeed, it is impossible to make sense of the film’s extensive use of religious themes and rituals without understanding its use of immersive visual effects. For Ang Lee, the manufacture of a seamless, aesthetically appealing CGI world was a means of visually affirming the broadly conceived notions of interconnectedness and purpose that he borrowed from Christianity, Islam, Hinduism, and Jewish mysticism.

  12. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    Science.gov (United States)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's or project's size and budget. It covers the complete risk management paradigm, providing standardized evaluation criteria for common management reporting. ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  13. Clinical application of low-dose CT combined with computer-aided detection in lung cancer screening

    International Nuclear Information System (INIS)

    Xu Zushan; Hou Hongjun; Xu Yan; Ma Daqing

    2010-01-01

    Objective: To investigate the clinical value of chest low-dose CT (LDCT) combined with a computer-aided detection (CAD) system for lung cancer screening in a high-risk population. Methods: Two hundred and nineteen healthy candidates underwent 64-slice LDCT scans. All images were reviewed in consensus by two radiologists with 15 years of thoracic CT diagnosis experience. Then the image data were analyzed with CAD alone. Finally, images were reviewed by two radiologists with 5 years of CT diagnosis experience with and without CT Viewer software. The sensitivity and false-positive rate of CAD for pulmonary nodule detection were calculated. SPSS 11.5 software and the Chi-square test were used for the statistics. Results: Of 219 candidates, 104 (47.5%) were detected with lung nodules. There were 366 true nodules confirmed by the senior radiologists. The CAD system detected 271 (74.0%) true nodules and 424 false-positive nodules. The false-positive rate was 1.94 per case. The two junior radiologists identified 292 (79.8%) and 286 (78.1%) nodules without CAD, and 336 (91.8%) and 333 (91.0%) nodules with CAD, respectively. There were significant differences for the radiologists in identifying nodules with versus without the CAD system (P<0.01). Conclusions: CAD is more sensitive than radiologists for identifying nodules in the central area or in the hilar region of the lung, while radiologists are more sensitive for peripheral and sub-pleural nodules, ground-glass opacity nodules, and nodules smaller than 4 mm. CAD cannot be used alone. The detection rate can be improved by combining radiologist review with CAD in LDCT screening. (authors)
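
    The detection-rate arithmetic in this abstract can be checked directly from the counts it reports:

```python
true_nodules = 366            # confirmed by the senior radiologists
cad_true, cad_false = 271, 424
candidates = 219

cad_sensitivity = cad_true / true_nodules   # fraction of true nodules found
fp_per_case = cad_false / candidates        # false positives per candidate

junior_alone = 292 / true_nodules           # reader 1 without CAD
junior_with_cad = 336 / true_nodules        # reader 1 with CAD

print(f"CAD sensitivity {cad_sensitivity:.1%}, "
      f"{fp_per_case:.2f} false positives per case")
# → CAD sensitivity 74.0%, 1.94 false positives per case
```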

  14. Features of undiagnosed breast cancers at screening breast MR imaging and potential utility of computer-aided evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mirinae; Cho, Nariya; Bea, Min Sun; Koo, Hye Ryoung; Kim, Won Hwa; Lee, Su Hyun; Chu, A Jung [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-02-15

    To retrospectively evaluate the features of undiagnosed breast cancers on prior screening breast magnetic resonance (MR) images in patients who were subsequently diagnosed with breast cancer, as well as the potential utility of MR computer-aided evaluation (CAE). Between March 2004 and May 2013, of the 72 consecutive pairs of prior negative MR images and subsequent MR images with diagnosed cancers (median interval, 32.8 months; range, 5.4-104.6 months), 36 (50%) had visible findings (mean size, 1.0 cm; range, 0.3-5.2 cm). The visible findings were divided into either actionable or underthreshold groups by a blinded review by 5 radiologists. MR imaging features, reasons for missed cancer, and MR-CAE features according to actionability were evaluated. Of the 36 visible findings on prior MR images, 33.3% (12 of 36) of the lesions were determined to be actionable and 66.7% (24 of 36) were underthreshold; 85.7% (6 of 7) of masses and 31.6% (6 of 19) of non-mass enhancements were classified as actionable lesions. Findings mimicking physiologic enhancement (27.8%, 10 of 36) and small lesion size (27.8%, 10 of 36) were the most common reasons for missed cancer. Actionable findings tended to show more washout or plateau kinetic patterns on MR-CAE than underthreshold findings, as 100% of actionable findings and 46.7% of underthreshold findings showed washout or plateau (p = 0.008). MR-CAE has the potential to reduce the number of undiagnosed breast cancers on screening breast MR images, the majority of which are missed because they mimic physiologic enhancement or are small in size.

  15. Computer application for database management and network connections in a radiophysics service; Aplicacion informatica para la gestion de bases de datos y conexiones en red de un servicio de radiofisica

    Energy Technology Data Exchange (ETDEWEB)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-07-01

    Databases in quality control prove to be a powerful tool for recording, management, and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the center's computer network. A computer acting as the server provides the database to the treatment units for daily recording of quality control measurements and incidents. To avoid common problems such as shortcuts that stop working after data migration, possible use of duplicate data, and erroneous data loss caused by errors in network connections, we proceeded to centralize the management of connections and database access, easing maintenance and making the system usable by all service personnel.

  16. Accessing and using chemical databases

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Pavlov, Todor; Niemelä, Jay Russell

    2013-01-01

    Computer-based representation of chemicals makes it possible to organize data in chemical databases - collections of chemical structures and associated properties. Databases are widely used wherever efficient processing of chemical information is needed, including search, storage, retrieval, and dissemination. Structure and functionality of chemical databases are considered. The typical kinds of information found in a chemical database are considered: identification, structural, and associated data. Functionality of chemical databases is presented, with examples of search and access types. More details are included about the OASIS database and platform and the Danish (Q)SAR Database online. Various types of chemical database resources are discussed, together with a list of examples.

  17. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children.

    Science.gov (United States)

    Segev, Aviv; Mimouni-Bloch, Aviva; Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers, and the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age would be paramount for differentiating normal from excessive use, and that this context would allow a better understanding of the link to psychopathology. In a cross-sectional study, 185 parents and children aged 3-18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, and computer use, as well as psychiatric screening questionnaires. Using a regression model, we identified three groups (normal use, over-use and under-use) and examined known factors as putative differentiators between the over-users and the other groups. After modeling computer screen time according to age, the factors linked to over-use were: decreased socialization (OR 3.24, confidence interval [CI] 1.23-8.55, p = 0.018), difficulty disengaging from the computer (OR 1.56, CI 1.07-2.28, p = 0.022) and age, though borderline significant (OR 1.1 per year, CI 0.99-1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small screens and smartphones was not associated with psychopathology. The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use, as well as defining over-use as a possible marker for psychiatric difficulties, must be performed within the context of age. If verified by additional studies, future research should integrate

  18. Effect of smoking cessation on quantitative computed tomography in smokers at risk in a lung cancer screening population

    Energy Technology Data Exchange (ETDEWEB)

    Jobst, Bertram J.; Eichinger, Monika; Wielpuetz, Mark O. [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Translational Lung Research Centre Heidelberg (TLRC), Member of the German Lung Research Centre (DZL), Heidelberg (Germany); Thoraxklinik at the University of Heidelberg, Department of Diagnostic and Interventional Radiology with Nuclear Medicine, Heidelberg (Germany); German Cancer Research Center (DKFZ), Department of Radiology, Heidelberg (Germany); Weinheimer, Oliver; Trauth, Mila; Kauczor, Hans-Ulrich [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Translational Lung Research Centre Heidelberg (TLRC), Member of the German Lung Research Centre (DZL), Heidelberg (Germany); Thoraxklinik at the University of Heidelberg, Department of Diagnostic and Interventional Radiology with Nuclear Medicine, Heidelberg (Germany); Becker, Nikolaus; Motsch, Erna; Gross, Marie-Luise; Eigentopf, Anke [German Cancer Research Centre (DKFZ Heidelberg), Division of Cancer Epidemiology, Heidelberg (Germany); Tremper, Jan; Delorme, Stefan [German Cancer Research Center (DKFZ), Department of Radiology, Heidelberg (Germany)

    2018-02-15

    To longitudinally evaluate effects of smoking cessation on quantitative CT in a lung cancer screening cohort of heavy smokers over 4 years. After 4 years, low-dose chest CT was available for 314 long-term ex-smokers (ES), 404 continuous smokers (CS) and 39 recent quitters (RQ) who had quit smoking within 2 years after baseline CT. CT acquired at baseline and after 3 and 4 years was subjected to well-evaluated densitometry software, computing mean lung density (MLD) and the 15th percentile of the lung density histogram (15TH). At baseline, active smokers showed significantly higher MLD and 15TH (-822±35 and -936±25 HU, respectively) compared to ES (-831±31 and -947±22 HU, p<0.01-0.001). After 3 years, CS again had significantly higher MLD and 15TH (-801±29 and -896±23 HU) than ES (-808±27 and -906±20 HU, p<0.01-0.001) but also RQ (-813±20 and -909±15 HU, p<0.05-0.001). Quantitative CT parameters did not change significantly after 4 years. Importantly, smoking status independently predicted MLD at baseline and year 3 (p<0.001) in multivariate analysis. On quantitative CT, lung density is higher in active smokers than ex-smokers, and decreases sustainably after smoking cessation, reflecting smoking-induced inflammation. Interpretations of quantitative CT data within clinical trials should consider smoking status. (orig.)
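
The two densitometry indices used in the study, MLD and 15TH, are straightforward summary statistics of the segmented lung-voxel HU histogram. A minimal sketch (pure Python, toy HU values; the nearest-rank percentile convention is one common choice and is an assumption here):

```python
def mld_and_15th(hu_values):
    """Mean lung density (MLD) and the 15th percentile (15TH) of the
    lung-voxel HU histogram, the two densitometry indices in the study."""
    vals = sorted(hu_values)
    mld = sum(vals) / len(vals)
    # nearest-rank 15th percentile (assumed convention for illustration)
    rank = max(int(0.15 * len(vals) + 0.5) - 1, 0)
    return mld, vals[rank]

# Toy HU sample standing in for segmented lung voxels (hypothetical values);
# real analyses segment the lungs from the CT volume first.
density = [-950, -930, -910, -880, -850, -830, -820, -800, -780, -760]
mld, p15 = mld_and_15th(density)
print(mld, p15)  # -851.0 -930
```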

  19. Effect of smoking cessation on quantitative computed tomography in smokers at risk in a lung cancer screening population

    International Nuclear Information System (INIS)

    Jobst, Bertram J.; Eichinger, Monika; Wielpuetz, Mark O.; Weinheimer, Oliver; Trauth, Mila; Kauczor, Hans-Ulrich; Becker, Nikolaus; Motsch, Erna; Gross, Marie-Luise; Eigentopf, Anke; Tremper, Jan; Delorme, Stefan

    2018-01-01

    To longitudinally evaluate effects of smoking cessation on quantitative CT in a lung cancer screening cohort of heavy smokers over 4 years. After 4 years, low-dose chest CT was available for 314 long-term ex-smokers (ES), 404 continuous smokers (CS) and 39 recent quitters (RQ) who had quit smoking within 2 years after baseline CT. CT acquired at baseline and after 3 and 4 years was subjected to well-evaluated densitometry software, computing mean lung density (MLD) and the 15th percentile of the lung density histogram (15TH). At baseline, active smokers showed significantly higher MLD and 15TH (-822±35 and -936±25 HU, respectively) compared to ES (-831±31 and -947±22 HU, p<0.01-0.001). After 3 years, CS again had significantly higher MLD and 15TH (-801±29 and -896±23 HU) than ES (-808±27 and -906±20 HU, p<0.01-0.001) but also RQ (-813±20 and -909±15 HU, p<0.05-0.001). Quantitative CT parameters did not change significantly after 4 years. Importantly, smoking status independently predicted MLD at baseline and year 3 (p<0.001) in multivariate analysis. On quantitative CT, lung density is higher in active smokers than ex-smokers, and decreases sustainably after smoking cessation, reflecting smoking-induced inflammation. Interpretations of quantitative CT data within clinical trials should consider smoking status. (orig.)

  20. Comparing genomes: databases and computational tools for comparative analysis of prokaryotic genomes - DOI: 10.3395/reciis.v1i2.Sup.105en

    Directory of Open Access Journals (Sweden)

    Marcos Catanho

    2007-12-01

    Full Text Available Since the 1990s, the complete genetic code of more than 600 living organisms has been deciphered, including bacteria, yeasts, protozoan parasites, invertebrates and vertebrates (including Homo sapiens), and plants. More than 2,000 other genome projects, representing medical, commercial, environmental and industrial interests or comprising model organisms important for the development of scientific research, are currently in progress. The achievement of complete genome sequences of numerous species, combined with the tremendous progress in computation over the last few decades, has allowed the use of new holistic approaches in the study of genome structure, organization and evolution, as well as in the field of gene prediction and functional classification. Numerous public and proprietary databases and computational tools have been created to optimize access to this information through the web. In this review, we present the main resources available through the web for comparative analysis of prokaryotic genomes. We concentrate on the group of mycobacteria, which contains important human and animal pathogens. The birth of Bioinformatics and Computational Biology and the contributions of these disciplines to the scientific development of this field are also discussed.

  1. Evidence-based investigation of the influence of computer-aided detection of polyps on screening of colon cancer with CT colonography

    International Nuclear Information System (INIS)

    Yoshida, Hiroyuki

    2008-01-01

    Computed tomographic colonography (CTC), also known as virtual colonoscopy, is a CT examination of the colon for colorectal neoplasms. Recent large-scale clinical trials have demonstrated that CTC yields sensitivity comparable to optical colonoscopy in the detection of clinically significant polyps in a screening population, making CTC a promising technique for screening of colon cancer. For CTC to be a clinically practical means of screening, it must reliably and consistently detect polyps with high accuracy. However, high-level expertise is required to interpret the resulting CT images to find polyps, resulting in variable diagnostic accuracy among radiologists in the detection of polyps. A key technology to overcome this problem and to bring CTC to prime time for screening of colorectal cancer is computer-aided detection (CAD) of polyps. CAD automatically detects the locations of suspicious polyps in CTC images and presents them to radiologists. CAD has the potential to increase diagnostic performance in the detection of polyps as well as to reduce variability of the diagnostic accuracy among radiologists. This paper presents an evidence-based investigation of the influence of CAD on screening of colon cancer with CTC by describing the benefits of using CAD in the diagnosis of CTC, the fundamental CAD scheme for the detection of polyps in CTC, its detection performance, the effect on the improvement of detection performance, as well as the current and future challenges in CAD. (author)

  2. Children's accuracy of portion size estimation using digital food images: effects of interface design and size of image on computer screen.

    Science.gov (United States)

    Baranowski, Tom; Baranowski, Janice C; Watson, Kathleen B; Martin, Shelby; Beltran, Alicia; Islam, Noemi; Dadabhoy, Hafza; Adame, Su-heyla; Cullen, Karen; Thompson, Debbe; Buday, Richard; Subar, Amy

    2011-03-01

    To test the effect of image size and presence of size cues on the accuracy of portion size estimation by children. Children were randomly assigned to seeing images with or without food size cues (utensils and checked tablecloth) and were presented with sixteen food models (foods commonly eaten by children) in varying portion sizes, one at a time. They estimated each food model's portion size by selecting a digital food image. The same food images were presented in two ways: (i) as small, graduated portion size images all on one screen or (ii) by scrolling across large, graduated portion size images, one per sequential screen. Laboratory-based with computer and food models. Volunteer multi-ethnic sample of 120 children, equally distributed by gender and ages (8 to 13 years) in 2008-2009. Average percentage of correctly classified foods was 60·3 %. There were no differences in accuracy by any design factor or demographic characteristic. Multiple small pictures on the screen at once took half the time to estimate portion size compared with scrolling through large pictures. Larger pictures had more overestimation of size. Multiple images of successively larger portion sizes of a food on one computer screen facilitated quicker portion size responses with no decrease in accuracy. This is the method of choice for portion size estimation on a computer.

  3. Cpf1-Database: web-based genome-wide guide RNA library design for gene knockout screens using CRISPR-Cpf1.

    Science.gov (United States)

    Park, Jeongbin; Bae, Sangsu

    2018-03-15

    Following the type II CRISPR-Cas9 system, type V CRISPR-Cpf1 endonucleases have been found to be applicable for genome editing in various organisms in vivo. However, there are as yet no web-based tools capable of optimally selecting guide RNAs (gRNAs) among all possible genome-wide target sites. Here, we present Cpf1-Database, a genome-wide gRNA library design tool for LbCpf1 and AsCpf1, which have DNA recognition sequences of 5'-TTTN-3' at the 5' ends of target sites. Cpf1-Database provides a sophisticated but simple way to design gRNAs for AsCpf1 nucleases on the genome scale. One can easily access the data through a straightforward web interface, and using the powerful collections feature one can design gRNAs for thousands of genes in a short time. Free access at http://www.rgenome.net/cpf1-database/. sangsubae@hanyang.ac.kr.
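
The core of genome-wide Cpf1 target enumeration is a scan for the 5'-TTTN-3' recognition sequence followed by a protospacer. A minimal sketch (forward strand only; a full tool like Cpf1-Database also scans the reverse complement and scores off-targets, and the 20-nt guide length is an assumption here):

```python
import re

def find_cpf1_sites(seq, guide_len=20):
    """Enumerate candidate Cpf1 target sites: a 5'-TTTN-3' PAM followed by a
    protospacer of guide_len bases. Returns (PAM position, protospacer) pairs.
    Lookahead matching catches overlapping PAMs."""
    sites = []
    for m in re.finditer(r"(?=TTT[ACGT])", seq.upper()):
        start = m.start() + 4                      # protospacer begins after the 4-nt PAM
        if start + guide_len <= len(seq):
            sites.append((m.start(), seq[start:start + guide_len]))
    return sites

demo = "AATTTACGTACGTACGTACGTACGTACGAA"
print(find_cpf1_sites(demo))  # [(2, 'CGTACGTACGTACGTACGTA')]
```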

  4. Multimodal lung cancer screening using the ITALUNG biomarker panel and low dose computed tomography. Results of the ITALUNG biomarker study.

    Science.gov (United States)

    Carozzi, Francesca Maria; Bisanzi, Simonetta; Carrozzi, Laura; Falaschi, Fabio; Lopes Pegna, Andrea; Mascalchi, Mario; Picozzi, Giulia; Peluso, Marco; Sani, Cristina; Greco, Luana; Ocello, Cristina; Paci, Eugenio

    2017-07-01

    Asymptomatic high-risk subjects randomized to the intervention arm of the ITALUNG trial (1,406 screened for lung cancer) were enrolled in the ITALUNG biomarker study (n = 1,356), in which samples of blood and sputum were analyzed for plasma DNA quantification (cut-off 5 ng/ml), loss of heterozygosity and microsatellite instability. The ITALUNG biomarker panel (IBP) was considered positive if at least one of the two biomarkers included in the panel was positive. Subjects with and without a lung cancer diagnosis at the end of the LDCT screening cycle (n = 517) were evaluated. Of 18 baseline screen-detected lung cancer cases, 17 were IBP positive (94%). There were 18 repeat screen-detected lung cancer cases, of which 12 (66%) were positive at the baseline IBP test. Interval cancer cases (2-year) and biomarker tests after follow-up of a suspect non-calcific nodule were investigated. The measures of accuracy of single-test versus multimodal screening were compared in a simulation within the screened ITALUNG intervention arm, considering screen-detected and interval cancer cases. Sensitivity was 90% at baseline screening. Specificity was 71% and 61% for LDCT and IBP as baseline single tests, and improved to 89% with multimodal, combined screening. The positive predictive value was 4.3% for LDCT at baseline and 10.6% for multimodal screening. Multimodal screening could improve screening efficiency at baseline, and strategies for future implementation are discussed. If IBP were used as the primary screening test, the LDCT burden might decrease by about 60%. © 2017 UICC.
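
The positive predictive values reported above follow from sensitivity, specificity, and disease prevalence via Bayes' rule. A minimal sketch (the 1% prevalence is an assumption chosen for illustration, not a figure from the study):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from test accuracy and disease prevalence:
    PPV = TP / (TP + FP), with TP = sens*prev and FP = (1-spec)*(1-prev)."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    return tp / (tp + fp)

# Illustration: a 90%-sensitive screen at 89% (multimodal) vs 71% (single-test)
# specificity in a low-prevalence population (prevalence assumed 1%).
print(round(ppv(0.90, 0.89, 0.01), 3), round(ppv(0.90, 0.71, 0.01), 3))
```

Even with fixed sensitivity, the specificity gain from combined screening multiplies the PPV, which is the mechanism behind the abstract's 4.3% vs 10.6% figures.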

  5. Assessment of dose to patients undergoing computed radiography and film screen x-ray examinations in some Khartoum Hospitals

    International Nuclear Information System (INIS)

    Mohamed Khair, Haiffa Daffa Allah Mustafa

    2015-12-01

    Medical ionizing radiation sources give by far the largest contribution to the population dose from man-made sources, and most of that contribution comes from diagnostic x-rays. The optimization principle of radiation protection requires minimizing the radiation dose to patients while acquiring diagnostic-quality images in radiology. In radiography, the extent of patient dose reduction is limited by the characteristics of the system used and the quality (or penetrating ability) of the x-ray beam. In this study, the entrance surface air kerma (ESAK) doses to patients undergoing 7 selected x-ray examinations were estimated. The study was conducted in eight hospitals in Khartoum State, comprising nine x-ray units, and a total of 1200 patients were involved. Four of the hospitals use computed radiography (CR) technology while the other four use film-screen (FS) technology. The selected examinations were abdomen (AP), chest (PA), pelvis (AP), skull (AP/PA), skull (LAT), thoracic spine (AP) and thoracic spine (LAT). The entrance surface air kerma was calculated by two methods, using the CAL Dose X-3.5 software and a mathematical model. Average ESAK values calculated by the two methods for hospitals using CR technology were 2.99 and 2.98, 0.34 and 0.31, 2.79 and 2.58, 0.76 and 0.71, 0.94 and 0.79, 3.4 and 3.2, and 5.9 and 5.03 mGy for the above-mentioned examinations, respectively. Average ESAK values calculated by the two methods for hospitals using FS technology were 4.98 and 4.19, 0.37 and 0.34, 4.15 and 3.95, 2.2 and 2.1, 1.3 and 1.1, 3.9 and 3.9, and 9.4 and 8.3 mGy, respectively. Average ESAK values obtained by the two methods for FS were higher than those obtained with CR by 37 and 29%, 50 and 25%, 8%, 32 and 34%, 65 and 64%, 27 and 28%, and 12%, 73% and 39% for the above-mentioned examinations, respectively. This shows that the CR technique allows diagnostically

  6. How to benchmark methods for structure-based virtual screening of large compound libraries.

    Science.gov (United States)

    Christofferson, Andrew J; Huang, Niu

    2012-01-01

    Structure-based virtual screening is a useful computational technique for ligand discovery. To systematically evaluate different docking approaches, it is important to have a consistent benchmarking protocol that is both relevant and unbiased. Here, we describe the design of a benchmarking data set for docking screen assessment, a standard docking screening process, and the analysis and presentation of the enrichment of annotated ligands against a background decoy database.
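
Enrichment of annotated ligands over decoys is the central benchmarking statistic for a docking screen. A minimal sketch of one common early-enrichment metric, the enrichment factor (function and toy ranking are illustrative, not from the chapter):

```python
def enrichment_factor(ranked_is_ligand, top_fraction=0.01):
    """Early-enrichment metric for a docking screen: the fraction of true
    ligands recovered in the top X% of the score-ranked database, divided
    by the fraction expected at random. ranked_is_ligand is a score-ordered
    list of booleans (True = annotated ligand, False = decoy)."""
    n = len(ranked_is_ligand)
    n_top = max(1, int(n * top_fraction))
    ligands_total = sum(ranked_is_ligand)
    ligands_top = sum(ranked_is_ligand[:n_top])
    return (ligands_top / ligands_total) / (n_top / n)

# Toy ranking: 2 of 4 ligands land in the top 10% of a 100-compound screen,
# giving an enrichment factor of (2/4)/(10/100) = 5.
ranking = [True, False, True, False] + [False] * 92 + [True, True] + [False] * 2
print(enrichment_factor(ranking, top_fraction=0.10))
```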

  7. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  8. Tsunami early warning in the Mediterranean: role, structure and tricks of pre-computed tsunami simulation databases and matching/forecasting algorithms

    Science.gov (United States)

    Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano

    2014-05-01

    The general idea that pre-computed simulated scenario databases can play a key role in conceiving tsunami early warning systems is commonly accepted by now. But it was only in the last decade that it started to be applied to the Mediterranean region, taking particular impetus from initiatives like GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects, and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss some results we obtained regarding two major topics, namely the strategies applicable to building the tsunami scenario database, and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real time. As for the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve 1) the choice of the main tectonic tsunamigenic sources (or areas), 2) their tessellation with matrices of elementary faults, whose dimensions depend heavily on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, 3) the computation of the scenarios themselves, and 4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and in all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. In particular, we analyse the
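
The matching step described above amounts to selecting the pre-computed elementary scenarios closest to the initial earthquake estimate. A minimal sketch under stated assumptions: the plain Euclidean distance in (latitude, longitude, magnitude) and the toy database are illustrative only; operational schemes weight location and magnitude differently and then combine the selected scenarios into a forecast:

```python
def match_scenarios(quake, scenario_db, k=2):
    """Select the k pre-computed elementary scenarios closest to the initial
    earthquake estimate (lat/lon in degrees, moment magnitude)."""
    def distance(s):
        # Naive unweighted distance in parameter space (assumption).
        return ((s["lat"] - quake["lat"]) ** 2
                + (s["lon"] - quake["lon"]) ** 2
                + (s["mag"] - quake["mag"]) ** 2) ** 0.5
    return sorted(scenario_db, key=distance)[:k]

# Toy database of three elementary scenarios (hypothetical values).
db = [
    {"id": "A", "lat": 36.0, "lon": 28.0, "mag": 7.0},
    {"id": "B", "lat": 35.0, "lon": 33.0, "mag": 7.5},
    {"id": "C", "lat": 36.1, "lon": 28.2, "mag": 7.3},
]
best = match_scenarios({"lat": 36.05, "lon": 28.1, "mag": 7.1}, db, k=2)
print([s["id"] for s in best])  # ['A', 'C']
```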

  9. The Danish Fetal Medicine Database

    DEFF Research Database (Denmark)

    Ekelund, Charlotte K; Petersen, Olav B; Jørgensen, Finn S

    2015-01-01

    OBJECTIVE: To describe the establishment and organization of the Danish Fetal Medicine Database and to report national results of first-trimester combined screening for trisomy 21 in the 5-year period 2008-2012. DESIGN: National register study using prospectively collected first-trimester screening data from the Danish Fetal Medicine Database. POPULATION: Pregnant women in Denmark undergoing first-trimester screening for trisomy 21. METHODS: Data on maternal characteristics and biochemical and ultrasonic markers are continuously sent electronically from local fetal medicine databases (Astraia GmbH software) to a central national database. Data are linked to outcome data from the National Birth Register, the National Patient Register and the National Cytogenetic Register via the mother's unique personal registration number. First-trimester screening data from 2008 to 2012 were retrieved. MAIN OUTCOME...

  10. Marginal public health gain of screening for colorectal cancer: modelling study, based on WHO and national databases in the Nordic countries

    DEFF Research Database (Denmark)

    Sigurdsson, J.A.; Getz, L.; Sjonell, G.

    2013-01-01

    .78 to 0.92]. Our calculations are based on the World Health Organization and national databanks on causes of death (ICD-10) and the mid-year number of inhabitants in the target group. For Finland, Denmark, Norway and Sweden, we used data for 2009. For Iceland, due to the population's small size, we..., cardiovascular diseases and accidents, with some national variations. Conclusions and implications: Establishment of a screening programme for CRC for people aged 55-74 can be expected to affect only a minor proportion of all premature deaths in the Nordic setting. From a public health perspective, prioritizing...

  11. Quantitative pre-clinical screening of therapeutics for joint diseases using contrast enhanced micro-computed tomography.

    Science.gov (United States)

    Willett, N J; Thote, T; Hart, M; Moran, S; Guldberg, R E; Kamath, R V

    2016-09-01

    The development of effective therapies for cartilage protection has been limited by a lack of efficient quantitative cartilage imaging modalities in pre-clinical in vivo models. Our objectives were two-fold: first, to validate a new contrast-enhanced 3D imaging analysis technique, equilibrium partitioning of an ionic contrast agent-micro computed tomography (EPIC-μCT), in a rat medial meniscal transection (MMT) osteoarthritis (OA) model; and second, to quantitatively assess the sensitivity of EPIC-μCT to detect the effects of matrix metalloproteinase inhibitor (MMPi) therapy on cartilage degeneration. Rats underwent MMT surgery and tissues were harvested at 1, 2, and 3 weeks post-surgery or rats received an MMPi or vehicle treatment and tissues harvested 3 weeks post-surgery. Parameters of disease progression were evaluated using histopathology and EPIC-μCT. Correlations and power analyses were performed to compare the techniques. EPIC-μCT was shown to provide simultaneous 3D quantification of multiple parameters, including cartilage degeneration and osteophyte formation. In MMT animals treated with MMPi, OA progression was attenuated, as measured by 3D parameters such as lesion volume and osteophyte size. A post-hoc power analysis showed that 3D parameters for EPIC-μCT were more sensitive than 2D parameters requiring fewer animals to detect a therapeutic effect of MMPi. 2D parameters were comparable between EPIC-μCT and histopathology. This study demonstrated that EPIC-μCT has high sensitivity to provide 3D structural and compositional measurements of cartilage and bone in the joint. EPIC-μCT can be used in combination with histology to provide a comprehensive analysis to screen new potential therapies. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  12. Spondylolysis and spina bifida occulta in pediatric patients: prevalence study using computed tomography as a screening method.

    Science.gov (United States)

    Urrutia, Julio; Cuellar, Jorge; Zamora, Tomas

    2016-02-01

    The prevalence of spondylolysis reported in radiograph-based studies has been questioned by recent computed tomography (CT)-based studies in adults; however, no new data are available for pediatric patients. Spina bifida occulta (SBO), which has been associated with spondylolysis, may have increased in prevalence over recent decades according to studies in adults, but again without new data in pediatric patients. We aimed to determine the prevalence of spondylolysis and SBO in pediatric patients using abdomen and pelvis CT as a screening tool. We studied 228 patients 4-15 years old (107 males) who were evaluated with abdomen and pelvis CT scans for reasons not related to the spine. The entire lumbosacral spine was evaluated to detect the presence of spondylolysis and SBO. We compared the prevalence of spondylolysis in patients with and without SBO. A logistic regression analysis was performed to determine the effect of age and sex as independent predictors of spondylolysis and SBO. The prevalence of spondylolysis was 3.5% (1.1-5.9%); 2/8 patients presented with olisthesis, both with a grade I slip. The prevalence of SBO was 41.2% (34.8-59.2%) (94 patients). Spondylolysis was not more frequent in patients with SBO than in patients without SBO. Male sex and decreasing age independently predicted the presence of SBO, but not of spondylolysis. We observed a 3.5% prevalence of spondylolysis and a 41.2% prevalence of SBO. SBO was significantly more frequent in males and younger patients.

  13. Clinically Practical Approach for Screening of Low Muscularity Using Electronic Linear Measures on Computed Tomography Images in Critically Ill Patients.

    Science.gov (United States)

    Avrutin, Egor; Moisey, Lesley L; Zhang, Roselyn; Khattab, Jenna; Todd, Emma; Premji, Tahira; Kozar, Rosemary; Heyland, Daren K; Mourtzakis, Marina

    2017-12-06

    Computed tomography (CT) scans performed during routine hospital care offer the opportunity to quantify skeletal muscle and predict mortality and morbidity in intensive care unit (ICU) patients. Existing methods of muscle cross-sectional area (CSA) quantification require specialized software, training, and time commitment that may not be feasible in a clinical setting. In this article, we explore a new screening method to identify patients with low muscle mass. We analyzed 145 scans of elderly ICU patients (≥65 years old) using a combination of measures obtained with a digital ruler, commonly found on hospital radiological software. The psoas and paraspinal muscle groups at the level of the third lumbar vertebra (L3) were evaluated by using 2 linear measures each and compared with an established method of CT image analysis of total muscle CSA in the L3 region. There was a strong association between linear measures of psoas and paraspinal muscle groups and total L3 muscle CSA (R² = 0.745, P < 0.001). Linear measures, age, and sex were included as covariates in a multiple logistic regression to predict those with low muscle mass; receiver operating characteristic (ROC) area under the curve (AUC) of the combined psoas and paraspinal linear index model was 0.920. Intraclass correlation coefficients (ICCs) were used to evaluate intrarater and interrater reliability, resulting in scores of 0.979 (95% CI: 0.940-0.992) and 0.937 (95% CI: 0.828-0.978), respectively. A digital ruler can reliably predict L3 muscle CSA, and these linear measures may be used to identify critically ill patients with low muscularity who are at risk for worse clinical outcomes. © 2017 American Society for Parenteral and Enteral Nutrition.
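
The reported ROC AUC of 0.920 can be understood through the rank-sum identity: the AUC is the probability that a randomly chosen case scores above a randomly chosen control. A minimal sketch (the score values are hypothetical, not from the study):

```python
def roc_auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney identity: the probability that a
    randomly chosen positive case scores above a negative one, counting
    ties as half a win."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical model outputs for 4 low-muscularity patients and 4 controls.
print(roc_auc([0.9, 0.8, 0.7, 0.4], [0.5, 0.3, 0.2, 0.1]))  # 0.9375
```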

  14. Computational Screening for Design of Optimal Coating Materials to Suppress Gas Evolution in Li-Ion Battery Cathodes.

    Science.gov (United States)

    Min, Kyoungmin; Seo, Seung-Woo; Choi, Byungjin; Park, Kwangjin; Cho, Eunseog

    2017-05-31

    Ni-rich layered oxides are attractive materials owing to their potentially high capacity for cathode applications. However, when used as cathodes in Li-ion batteries, they contain a large amount of Li residues, which degrade the electrochemical properties because they are the source of gas generation inside the battery. Here, we propose a computational approach to designing optimal coating materials that prevent gas evolution by removing residual Li from the surface of the battery cathode. To discover promising coating materials, the reactions of 16 metal phosphates (MPs) and 45 metal oxides (MOs) with the Li residues, LiOH and Li2CO3, are examined within a thermodynamic framework. A materials database is constructed according to density functional theory using a hybrid functional, and the reaction products are obtained from the phases in thermodynamic equilibrium in the phase diagram. In addition, the gravimetric efficiency is calculated to identify coating materials that can eliminate Li residues with a minimal weight of the coating material. Overall, more MP and MO materials react with LiOH than with Li2CO3. Specifically, MPs exhibit better reactivity toward both Li residues, whereas MOs react more with LiOH. The reaction products, such as Li-containing phosphates or oxides, are also obtained to identify the phases on the surface of a cathode after coating. On the basis of the Pareto-front analysis, P2O5 could be an optimal material for the reaction with both Li residues. Finally, the reactivity of the coating materials containing 3d/4d transition metal elements is better than that of materials containing other types of elements.

  15. Integrative data mining of high-throughput in vitro screens, in vivo data, and disease information to identify Adverse Outcome Pathway (AOP) signatures: ToxCast high-throughput screening data and the Comparative Toxicogenomics Database (CTD) as a case study.

    Science.gov (United States)

    The Adverse Outcome Pathway (AOP) framework provides a systematic way to describe linkages between molecular and cellular processes and organism- or population-level effects. Current AOP assembly methods, however, are inefficient. Our goal is to generate computationally-pr...

  16. Performance and Cost-Effectiveness of Computed Tomography Lung Cancer Screening Scenarios in a Population-Based Setting: A Microsimulation Modeling Analysis in Ontario, Canada.

    Directory of Open Access Journals (Sweden)

    Kevin Ten Haaf

    2017-02-01

    Full Text Available The National Lung Screening Trial (NLST) results indicate that computed tomography (CT) lung cancer screening for current and former smokers with three annual screens can be cost-effective in a trial setting. However, the cost-effectiveness in a population-based setting with >3 screening rounds is uncertain. Therefore, the objective of this study was to estimate the cost-effectiveness of lung cancer screening in a population-based setting in Ontario, Canada, and evaluate the effects of screening eligibility criteria. This study used microsimulation modeling informed by various data sources, including the Ontario Health Insurance Plan (OHIP), the Ontario Cancer Registry, smoking behavior surveys, and the NLST. Persons born between 1940 and 1969 were examined from a third-party health care payer perspective across a lifetime horizon. Starting in 2015, 576 CT screening scenarios were examined, varying by age to start and end screening, smoking eligibility criteria, and screening interval. Among the examined outcome measures were lung cancer deaths averted, life-years gained, percentage ever screened, costs (in 2015 Canadian dollars), and overdiagnosis. The results of the base-case analysis indicated that annual screening was more cost-effective than biennial screening. Scenarios with eligibility criteria that required as few as 20 pack-years were dominated by scenarios that required higher numbers of accumulated pack-years. In general, scenarios that applied stringent smoking eligibility criteria (i.e., requiring higher levels of accumulated smoking exposure) were more cost-effective than scenarios with less stringent smoking eligibility criteria, with modest differences in life-years gained. Annual screening between ages 55-75 for persons who smoked ≥40 pack-years and who currently smoke or quit ≤10 y ago yielded an incremental cost-effectiveness ratio of $41,136 Canadian dollars ($33,825 in May 1, 2015, United States dollars) per life-year gained.
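
    The incremental cost-effectiveness ratio (ICER) reported above ($41,136 per life-year gained) is defined as the extra cost of a scenario divided by the extra life-years it delivers relative to a comparator. A minimal sketch with invented cohort totals (not the study's actual figures):

```python
def icer(cost_new, cost_base, effect_new, effect_base):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (here, life-years) of one scenario over another."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical lifetime totals for a screened vs. unscreened cohort
# (costs in CAD, effects in life-years)
print(icer(1_250_000, 1_000_000, 106.0, 100.0))
```

    A scenario is "dominated", as in the pack-year comparison above, when another scenario is both cheaper and more effective, so no ICER in its favor exists.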

  17. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  18. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases such...

  19. Retrieval program system of Chinese Evaluated (frequently useful) Nuclear Decay Database

    International Nuclear Information System (INIS)

    Huang Xiaolong; Zhou Chunmei

    1995-01-01

    The Chinese Evaluated (frequently useful) Nuclear Decay Database has been set up on a MICRO-VAX-11 computer at the Chinese Nuclear Data Center (CNDC). For users' convenience, a retrieval program system for the database has been written. Retrieval can be carried out for a single nucleus or multiple nuclei. The retrieved results can be displayed on the terminal screen or output to M3081 and laser printers in ENSDF format, table reports, or scheme diagrams.

  20. Computer-aided system of evaluation for population-based all-in-one service screening (CASE-PASS): from study design to outcome analysis with bias adjustment.

    Science.gov (United States)

    Chen, Li-Sheng; Yen, Amy Ming-Fang; Duffy, Stephen W; Tabar, Laszlo; Lin, Wen-Chou; Chen, Hsiu-Hsi

    2010-10-01

    Population-based routine service screening has gained popularity following an era of randomized controlled trials. The evaluation of these service screening programs is subject to study design, data availability, and precise data analysis for bias adjustment. We developed a computer-aided system for the evaluation of population-based service screening that unifies these aspects and guides the program assessor in performing an evaluation efficiently. This system underpins two experimental designs, the posttest-only nonequivalent design and the one-group pretest-posttest design, and demonstrates the type of data required at both the population and individual levels. Three major analyses were developed: cumulative mortality analysis, survival analysis with lead-time adjustment, and self-selection bias adjustment. We used SAS AF software to develop a graphical interface system with a pull-down menu style. We demonstrate the application of this system with data obtained from a Swedish population-based service screening program, a population-based randomized controlled trial for the screening of breast, colorectal, and prostate cancer, and one service screening program for cervical cancer with Pap smears. The system provided automated descriptive results based on the various sources of available data, and cumulative mortality curves corresponding to the study designs. The comparison of cumulative survival between clinically detected and screen-detected cases without a lead-time adjustment is also demonstrated. The intention-to-treat and noncompliance analyses with self-selection bias adjustments are also shown to assess the effectiveness of the population-based service screening program. Model validation consisted of a comparison between our adjusted self-selection bias estimates and the empirical results on effectiveness reported in the literature. We demonstrate a computer-aided system allowing the evaluation of population-based service screening
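
    The cumulative mortality analysis named above can be sketched as the fraction of the initial cohort that has died by each follow-up time; comparing the screened and unscreened curves is the basis of the mortality evaluation. A toy illustration with invented death times (not data from the Swedish program):

```python
def cumulative_mortality(death_times, cohort_size, t):
    """Cumulative mortality at follow-up time t: deaths observed up to t
    divided by the number initially at risk."""
    return sum(1 for d in death_times if d <= t) / cohort_size

# Hypothetical years-to-death for decedents in two cohorts of 1,000 each
deaths_screened = [2.5, 4.0, 6.1, 7.3]
deaths_unscreened = [1.2, 2.8, 3.5, 5.0, 6.6, 7.9]
for t in (5, 8):
    print(t, cumulative_mortality(deaths_screened, 1000, t),
          cumulative_mortality(deaths_unscreened, 1000, t))
```

    Unlike case survival, cumulative mortality is unaffected by lead-time bias, which is why the system pairs it with lead-time-adjusted survival analysis.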

  1. Computer programme for control and maintenance and object oriented database: application to the realisation of a particle accelerator, the VIVITRON; Logiciel de controle et commande et base de donnees orientee objet: application dans le cadre de la mise en oeuvre d'un accelerateur de particules, le VIVITRON

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, A

    1996-01-11

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and front-end computers based on VME standards, the whole within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands for the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the information necessary to display front-end computer parameters on the graphic screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and bypasses the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author) 38 refs.

  2. Multimedia messages in genetics: design, development, and evaluation of a computer-based instructional resource for secondary school students in a Tay Sachs disease carrier screening program.

    Science.gov (United States)

    Gason, Alexandra A; Aitken, MaryAnne; Delatycki, Martin B; Sheffield, Edith; Metcalfe, Sylvia A

    2004-01-01

    Tay Sachs disease is a recessively inherited neurodegenerative disorder, for which carrier screening programs exist worldwide. Education for those offered a screening test is essential in facilitating informed decision-making. In Melbourne, Australia, we have designed, developed, and evaluated a computer-based instructional resource for use in the Tay Sachs disease carrier screening program for secondary school students attending Jewish schools. The resource, entitled "Genetics in the Community: Tay Sachs disease", was designed on a platform of educational learning theory. The development of the resource included formative evaluation using qualitative data analysis supported by descriptive quantitative data. The final resource was evaluated within the screening program and compared with the standard oral presentation using a questionnaire. Knowledge outcomes were measured both before and after either of the educational formats. Data from the formative evaluation were used to refine the content and functionality of the final resource. The questionnaire evaluation of 302 students over two years showed the multimedia resource to be as effective as an oral educational presentation in facilitating participants' knowledge construction. The resource offers a large number of potential benefits, which are not limited to the Tay Sachs disease carrier screening program setting, such as delivery of a consistent educational message, short delivery time, and minimal financial and resource commitment. This article outlines the value of considering educational theory and describes the process of multimedia development, providing a framework that may be of value when designing genetics multimedia resources in general.

  3. Examination of Chronic Smoking Behavior and Eligibility for Low-Dose Computed Tomography for Lung Cancer Screening Among Older Chinese Male Smokers.

    Science.gov (United States)

    Li, Chien-Ching; Matthews, Alicia K; Dong, XinQi

    2017-07-01

    Low-dose computed tomography (LDCT) lung cancer screening is an effective way to decrease lung cancer mortality. Both Medicare and private insurers offer coverage of LDCT screening to beneficiaries who are at high risk of developing lung cancer. In this study, we examined rates and predictors of chronic smoking behavior and eligibility for coverage of LDCT screening among older Chinese men living in the greater Chicago area. Data were obtained from the Population Study of Chinese Elderly in Chicago, a population-based survey of community-dwelling, older Chinese adults in the Chicago metropolitan area. The Centers for Medicare and Medicaid Services (CMS) and U.S. Preventive Services Task Force (USPSTF) eligibility criteria for LDCT screening were used. Multivariate logistic regression was conducted to determine predictors of chronic smoking behavior, which was operationalized as meeting criteria for LDCT screening. A quarter of the sample were current smokers and 42.5% reported a prior history of smoking. Eighteen percent and 22% of older Chinese men met the CMS and USPSTF eligibility criteria for LDCT screening, respectively. Furthermore, education, marital status, and number of children were significantly associated with chronic smoking behavior. Older Chinese men with chronic smoking behavior are at high risk of developing lung cancer, and nearly one in five meet eligibility for LDCT screening. Increased outreach and education regarding early detection of lung cancer and smoking cessation are needed for this vulnerable and high-risk population. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
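
    The eligibility determination described above can be sketched as a simple rule. The thresholds below follow the 2013 USPSTF recommendation in force during the study period (age 55-80, ≥30 pack-years, current smoker or quit within the past 15 years); treat this as an illustration, not a clinical tool, and note that CMS used a slightly different upper age bound:

```python
def uspstf_eligible(age, pack_years, years_since_quit, current_smoker):
    """Sketch of the 2013 USPSTF low-dose CT eligibility rule:
    age 55-80, >=30 pack-years, current smoker or quit <=15 years ago."""
    smoking_recent = current_smoker or years_since_quit <= 15
    return 55 <= age <= 80 and pack_years >= 30 and smoking_recent

print(uspstf_eligible(68, 40, 0, True))    # heavy current smoker
print(uspstf_eligible(68, 20, 0, True))    # too few pack-years
print(uspstf_eligible(68, 40, 20, False))  # quit too long ago
```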

  4. Annotation of novel neuropeptide precursors in the migratory locust based on transcript screening of a public EST database and mass spectrometry

    Directory of Open Access Journals (Sweden)

    De Loof Arnold

    2006-08-01

    Full Text Available Abstract Background For holometabolous insects there has been an explosion of proteomic and peptidomic information thanks to large genome sequencing projects. Heterometabolous insects, although comprising many important species, have been far less studied. The migratory locust Locusta migratoria, a heterometabolous insect, is one of the most infamous agricultural pests. Locusts undergo a well-known and profound phase transition from the relatively harmless solitary form to a ferocious gregarious form. The underlying regulatory mechanisms of this phase transition are not fully understood, but neuropeptides are undoubtedly involved. However, neuropeptide research in locusts is hampered by the absence of genomic information. Results Recently, EST (Expressed Sequence Tag) databases from Locusta migratoria were constructed. Using bioinformatics tools, we searched these EST databases specifically for neuropeptide precursors. Based on known locust neuropeptide sequences, we confirmed the sequence of several previously identified neuropeptide precursors (i.e., pacifastin-related peptides), which consolidated our method. In addition, we found two novel neuroparsin precursors and annotated the hitherto unknown tachykinin precursor. Besides one of the known tachykinin peptides, this EST contained an additional tachykinin-like sequence. Using neuropeptide precursors from Drosophila melanogaster as a query, we succeeded in annotating the Locusta neuropeptide F, allatostatin-C, and ecdysis-triggering hormone precursors, which until now had not been identified in locusts or in any other heterometabolous insect. For the tachykinin precursor, the ecdysis-triggering hormone precursor, and the allatostatin-C precursor, translation of the predicted neuropeptides in neural tissues was confirmed with mass spectrometric techniques. Conclusion In this study we describe the annotation of 6 novel neuropeptide precursors and the neuropeptides they encode from the
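
    A common first-pass heuristic in neuropeptide precursor annotation of the kind described above is scanning the translated sequence for dibasic cleavage motifs (KR, RR, KK, RK), which flank many mature peptides. A minimal sketch; the sequence below is invented for illustration, not an actual locust precursor:

```python
import re

def dibasic_sites(precursor):
    """Locate canonical dibasic cleavage motifs in a translated precursor.
    Uses a lookahead so overlapping motifs are all reported."""
    return [(m.start(), m.group(1))
            for m in re.finditer(r'(?=(KR|RR|KK|RK))', precursor)]

# Hypothetical translated EST fragment (single-letter amino acid codes)
print(dibasic_sites("MAFLKRSDPEAGFYGKRSLIP"))
```

    Candidate peptides between such sites would then be checked against known neuropeptide families and, as in the study, confirmed by mass spectrometry.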

  5. Novel high-resolution computed tomography-based radiomic classifier for screen-identified pulmonary nodules in the National Lung Screening Trial.

    Science.gov (United States)

    Peikert, Tobias; Duan, Fenghai; Rajagopalan, Srinivasan; Karwoski, Ronald A; Clay, Ryan; Robb, Richard A; Qin, Ziling; Sicks, JoRean; Bartholmai, Brian J; Maldonado, Fabien

    2018-01-01

    Optimization of the clinical management of screen-detected lung nodules is needed to avoid unnecessary diagnostic interventions. Herein we demonstrate the potential value of a novel radiomics-based approach for the classification of screen-detected indeterminate nodules. Independent quantitative variables assessing various radiologic nodule features, such as sphericity, flatness, elongation, spiculation, lobulation, and curvature, were developed from the NLST dataset using 726 indeterminate nodules (all ≥7 mm; benign, n = 318; malignant, n = 408). Multivariate analysis was performed using the least absolute shrinkage and selection operator (LASSO) method for variable selection and regularization in order to enhance the prediction accuracy and interpretability of the multivariate model. The bootstrapping method was then applied for internal validation, and the optimism-corrected AUC was reported for the final model. Eight of the originally considered 57 quantitative radiologic features were selected by LASSO multivariate modeling. These 8 features include variables capturing Location: vertical location (Offset carina centroid z); Size: volume estimate (Minimum enclosing brick); Shape: flatness; Density: texture analysis (Score Indicative of Lesion/Lung Aggression/Abnormality (SILA) texture); and surface characteristics: surface complexity (Maximum shape index and Average shape index) and estimates of surface curvature (Average positive mean curvature and Minimum mean curvature), all statistically significant. Radiomics-based characterization of screen-detected nodules appears extremely promising; however, independent external validation is needed.
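
    LASSO's variable selection, which reduced the 57 radiologic features above to 8, rests on the soft-thresholding operator: during coordinate descent, coefficients of weak predictors are shrunk exactly to zero and the corresponding features drop out of the model. A minimal sketch of that operator (inputs are illustrative):

```python
def soft_threshold(z, lam):
    """Soft-thresholding operator at the core of LASSO coordinate descent:
    shrinks a coefficient toward zero by lam, and sets coefficients with
    magnitude below lam exactly to zero (this is the variable selection)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

# Coefficients of weak predictors are zeroed out entirely
print([soft_threshold(z, 1.0) for z in (3.0, 0.4, -0.2, -2.5)])
```

    Larger penalty values zero out more coefficients; in practice the penalty is tuned, and the study then bootstrap-corrects the resulting AUC for optimism.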

  6. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. 
    It makes it possible to fill the current gap

  7. ZAGRADA - A New Radiocarbon Database

    International Nuclear Information System (INIS)

    Portner, A.; Obelic, B.; Krajcar Bornic, I.

    2008-01-01

    In the Radiocarbon and Tritium Laboratory at the Rudjer Boskovic Institute, three different techniques for ¹⁴C dating have been used: Gas Proportional Counting (GPC), Liquid Scintillation Counting (LSC), and preparation of milligram-sized samples for AMS dating (Accelerator Mass Spectrometry). The use of several measurement techniques created the need for a new relational database, ZAGRADA (Zagreb Radiocarbon Database), since the existing software package CARBO could not satisfy the requirements of working with several techniques in parallel. Using SQL procedures and constraints defined by primary and foreign keys, ZAGRADA enforces high data integrity and provides better performance in data filtering and sorting. Additionally, the new database for ¹⁴C samples is a multi-user application that can be accessed from remote computers in the work group, thus providing greater efficiency of laboratory activities. To facilitate data handling and processing in ZAGRADA, the graphical user interface is designed to be user-friendly and to perform various actions on data, such as input, correction, searching, sorting, and printed output. All invalid actions performed in the user interface are registered, with a short textual description of the error appearing on screen in a message box. Unauthorized access is prevented by login control, and each application window implements support for tracking the last changes made by the user. The implementation of a new database for ¹⁴C samples is a significant contribution to the scientific research performed in the Radiocarbon and Tritium Laboratory and will provide better and easier communication with customers. (author)
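
    The primary/foreign-key integrity enforcement described above can be illustrated with a tiny in-memory schema; the table and column names below are hypothetical, not ZAGRADA's actual schema:

```python
import sqlite3

# In-memory sketch of referential integrity: each sample must reference
# an existing measurement technique, enforced by a foreign key.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
con.execute("CREATE TABLE technique (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
con.execute("""CREATE TABLE sample (
    id INTEGER PRIMARY KEY,
    label TEXT NOT NULL,
    technique_id INTEGER NOT NULL,
    FOREIGN KEY (technique_id) REFERENCES technique (id))""")
con.execute("INSERT INTO technique VALUES (1, 'LSC')")
con.execute("INSERT INTO sample VALUES (1, 'Z-0001', 1)")       # valid reference
try:
    con.execute("INSERT INTO sample VALUES (2, 'Z-0002', 99)")  # no such technique
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

    The orphan row is rejected at insert time, which is the kind of integrity guarantee the key constraints provide without any application-level checking.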

  8. [Patient's Autonomy and Information in Psycho-Oncology: Computer Based Distress Screening for an Interactive Treatment Planning (ePOS-react)].

    Science.gov (United States)

    Schäffeler, Norbert; Sedelmaier, Jana; Möhrer, Hannah; Ziser, Katrin; Ringwald, Johanna; Wickert, Martin; Brucker, Sara; Junne, Florian; Zipfel, Stephan; Teufel, Martin

    2017-07-01

    To identify distressed patients in oncology using screening questionnaires is quite challenging in clinical routine. Up to now there is no evidence-based recommendation as to which instrument is most suitable and how to put a screening into practice. Using computer-based screening tools offers the possibility to automatically analyse patients' data, inform psycho-oncological and medical staff about the results, and use reactive questionnaires. Studies on how to empower patients in decision making in psycho-oncology are rare. Methods: Women with breast and gynaecological cancer were consecutively included in this study (n=103) at the time of inpatient surgical treatment in a gynaecological clinic. They answered the computer-based screening questionnaire (ePOS-react) for routine distress screening at the time of admission. At the end of the tool an individual recommendation concerning psycho-oncological treatment is given: (i) psycho-oncological counselling, (ii) brief psycho-oncological contact, or (iii) no treatment suggestion. The informed patients could choose autonomously either the recommended treatment or an individually more favoured alternative. Additionally, a clinical interview (approx. 30 min) based on the "Psychoonkologische Basisdiagnostik (PO-Bado)" was carried out for a third-party assessment of patients' need for treatment. Results: 68.9% followed the treatment recommendation. 22.3% asked for a more "intense" intervention (e.g. counselling instead of the recommended brief contact) and 8.7% for a "less intense" intervention than recommended. The agreement between the third-party assessment (clinical interview "PO-Bado") and the treatment recommendation was about 72.8%. The agreement between the third-party assessment and the patient's choice (ePOS-react) was about 58.3%. The latter is smaller because 29.1% asked for a brief psycho-oncological contact for whom, from the third-party assessment's perspective, no indication for treatment existed. Discussion: A direct response of the

  9. Implications of Nine Risk Prediction Models for Selecting Ever-Smokers for Computed Tomography Lung Cancer Screening.

    Science.gov (United States)

    Katki, Hormuzd A; Kovalchik, Stephanie A; Petito, Lucia C; Cheung, Li C; Jacobs, Eric; Jemal, Ahmedin; Berg, Christine D; Chaturvedi, Anil K

    2018-05-15

    Lung cancer screening guidelines recommend using individualized risk models to refer ever-smokers for screening. However, different models select different screening populations. The performance of each model in selecting ever-smokers for screening is unknown. To compare the U.S. screening populations selected by 9 lung cancer risk models (the Bach model; the Spitz model; the Liverpool Lung Project [LLP] model; the LLP Incidence Risk Model [LLPi]; the Hoggart model; the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial Model 2012 [PLCOM2012]; the Pittsburgh Predictor; the Lung Cancer Risk Assessment Tool [LCRAT]; and the Lung Cancer Death Risk Assessment Tool [LCDRAT]) and to examine their predictive performance in 2 cohorts. Population-based prospective studies. United States. Models selected U.S. screening populations by using data from the National Health Interview Survey from 2010 to 2012. Model performance was evaluated using data from 337 388 ever-smokers in the National Institutes of Health-AARP Diet and Health Study and 72 338 ever-smokers in the CPS-II (Cancer Prevention Study II) Nutrition Survey cohort. Model calibration (ratio of model-predicted to observed cases [expected-observed ratio]) and discrimination (area under the curve [AUC]). At a 5-year risk threshold of 2.0%, the models chose U.S. screening populations ranging from 7.6 million to 26 million ever-smokers. These disagreements occurred because, in both validation cohorts, 4 models (the Bach model, PLCOM2012, LCRAT, and LCDRAT) were well-calibrated (expected-observed ratio range, 0.92 to 1.12) and had higher AUCs (range, 0.75 to 0.79) than 5 models that generally overestimated risk (expected-observed ratio range, 0.83 to 3.69) and had lower AUCs (range, 0.62 to 0.75). The 4 best-performing models also had the highest sensitivity at a fixed specificity (and vice versa) and similar discrimination at a fixed risk threshold. These models showed better agreement on size of the
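
    The calibration metric used above, the expected-observed ratio, is simply the sum of model-predicted risks over a cohort divided by the number of cases actually observed, with values near 1.0 indicating good calibration and values above 1.0 indicating overestimated risk. A minimal sketch with invented risks (not data from either validation cohort):

```python
def expected_observed_ratio(predicted_risks, observed_cases):
    """Calibration metric: sum of model-predicted case counts divided by
    observed cases. ~1.0 is well calibrated; >1.0 overestimates risk."""
    return sum(predicted_risks) / observed_cases

# Hypothetical 5-year lung-cancer risks for 100 ever-smokers, 3 observed cases
risks = [0.031, 0.022, 0.015, 0.048, 0.009] * 20
print(round(expected_observed_ratio(risks, 3), 2))
```

    Calibration and discrimination (AUC) are complementary: a model can rank smokers correctly yet still predict the wrong absolute number of cases, which is why the comparison above reports both.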

  10. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    International Nuclear Information System (INIS)

    Clark, Aurora Sue; Wall, Nathalie; Benny, Paul

    2015-01-01

    design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  11. Care-bolus tracking systems in multislice-helical computed tomography - a new method in the screening of cardiovascular failure?

    International Nuclear Information System (INIS)

    Stueckle, C.A.; Kickuth, R.; Kirchner, E.M.; Liermann, D.; Kirchner, J.

    2002-01-01

    Purpose. Recently, bolus tracking systems were developed to improve the timing of intravenous contrast media application in helical computed tomography. We investigated the benefit of this new method as a parameter of cardiac function. Material and methods. Retrospective analysis of 64 patients who incidentally underwent bolus-triggered contrast-enhanced helical CT and invasive cardiac investigation within one week. All examinations were performed on the Somatom Plus 4 Volume Zoom CT scanner (Siemens Corp., Forchheim, Germany) using the C.A.R.E. Bolus software. This performs repetitive low-dose test scans (e.g., for the abdomen: 140 kV, 20 mA, TI 0.5 s) and measures the Hounsfield attenuation (increase over baseline) in a preselected region of interest. The displayed increase of vascular density over time after peripheral contrast media injection (75 ml Iopromid (300 mg/ml), 2 ml/s) was categorized into three types: (a) rapid increase, (b) deceleration before a 100 HU threshold was reached, and (c) one or more peaks. The findings of the invasive cardiac investigation were correlated with the findings of the bolus-tracking measurements. Results. The examinations were categorized as follows: 19 type A, 34 type B, 11 type C. We found a highly significant correlation between the type of Hounsfield attenuation and systolic pressure in the left ventricle. There was no correlation between the type of Hounsfield attenuation and the diastolic pressure in the left ventricle, the pressures related to the right ventricle, or the ejection fraction. The bolus-tracking system showed a sensitivity of 53%, a specificity of 82%, an accuracy of 70%, a positive predictive value of 70%, and a negative predictive value of 70% in the detection of left heart failure. Conclusion. The bolus-tracking system C.A.R.E. Bolus often shows atypical Hounsfield attenuation in cases of cardiac failure but is not suitable as a screening method of the cardiopulmonary
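
    The diagnostic figures quoted above (sensitivity, specificity, accuracy, predictive values) all derive from a 2x2 confusion matrix. A minimal sketch; the counts below are hypothetical, chosen only so that sensitivity and specificity come out at the reported 53% and 82%, and are not the study's actual cell counts:

```python
def screening_metrics(tp, fp, tn, fn):
    """Standard screening-test metrics from a 2x2 confusion matrix:
    tp/fp = true/false positives, tn/fn = true/false negatives."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

print(screening_metrics(tp=53, fp=18, tn=82, fn=47))
```

    Note that predictive values, unlike sensitivity and specificity, shift with disease prevalence, so these toy counts do not reproduce the reported 70% PPV/NPV.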

  12. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Clark, Aurora Sue [Washington State Univ., Pullman, WA (United States); Wall, Nathalie [Washington State Univ., Pullman, WA (United States); Benny, Paul [Washington State Univ., Pullman, WA (United States)

    2015-11-16

    through the design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  13. Computer Simulation Investigation on the Effect of Channelled and Unchannelled Screens on Smoke Contamination in Atriums Upper Balconies

    Directory of Open Access Journals (Sweden)

    Nasif Mohammad Shakir

    2014-07-01

    Full Text Available This paper investigated the effect of installing channelled screens on smoke contamination in the presence of a 0.5 m deep downstand in a fire compartment. The results were then compared with the smoke contamination that occurs when the channelled screens are removed. The results showed a 96% increase in upper-balcony smoke contamination in the atrium when no channelled screens are used at the fire compartment opening. This work provides a new correlation, obtained from the numerical study, which predicts the smoke contamination height in the upper balconies of the atrium in the presence of a 0.5 m downstand and no channelled screens. The proposed correlation will be a useful design tool for building designers to design safe shopping malls (atria).

  14. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both theory and practice with step-by-step instructions and examples. It helps readers set up a cloud computing environment for teaching and learning database systems, and covers adequate conceptual content for students and IT professionals to gain the necessary knowledge and hands-on skills to set up cloud-based database systems.

  15. Towards cloud-centric distributed database evaluation

    OpenAIRE

    Seybold, Daniel

    2016-01-01

    The area of cloud computing has also pushed the evolution of distributed databases, resulting in a variety of distributed database systems, which can be classified into relational, NoSQL, and NewSQL database systems. In general, all representatives of these database system classes claim to provide elasticity and "unlimited" horizontal scalability. As these characteristics comply with the cloud, distributed databases seem to be a perfect match for Database-as-a-Service (DBaaS) systems.

  17. Diabetes, Frequency of Exercise, and Mortality Over 12 Years: Analysis of the National Health Insurance Service-Health Screening (NHIS-HEALS) Database.

    Science.gov (United States)

    Shin, Woo Young; Lee, Taehee; Jeon, Da Hye; Kim, Hyeon Chang

    2018-02-19

    The goal of this study was to analyze the relationship between exercise frequency and all-cause mortality for individuals diagnosed with and without diabetes mellitus (DM). We analyzed data for 505,677 participants (53.9% men) in the National Health Insurance Service-National Health Screening (NHIS-HEALS) cohort. The study endpoint variable was all-cause mortality. Frequency of exercise and covariates including age, sex, smoking status, household income, blood pressure, fasting glucose, body mass index, total cholesterol, and Charlson comorbidity index were determined at baseline. Cox proportional hazard regression models were developed to assess the effects of exercise frequency (0, 1-2, 3-4, 5-6, and 7 days per week) on mortality, separately in individuals with and without DM. We found a U-shaped association between exercise frequency and mortality in individuals with and without DM. However, the frequency of exercise associated with the lowest risk of all-cause mortality was 3-4 times per week (hazard ratio [HR], 0.69; 95% confidence interval [CI], 0.65-0.73) in individuals without DM, and 5-6 times per week in those with DM (HR, 0.93; 95% CI, 0.78-1.10). A moderate frequency of exercise may reduce mortality regardless of the presence or absence of DM; however, when compared to those without the condition, people with DM may need to exercise more often. © 2018 The Korean Academy of Medical Sciences.
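
As a hedged illustration of the rate comparison underlying such analyses (the study itself fitted covariate-adjusted Cox proportional hazard models, which this sketch does not reproduce), crude death rates per exercise-frequency group can be computed from event counts and person-years; all numbers below are invented:

```python
# Illustrative only: crude all-cause death rates per exercise-frequency
# group, with a rate ratio against the 0 days/week reference. The counts
# are hypothetical, and no covariate adjustment is performed.

def death_rate(deaths, person_years):
    """Deaths per 1,000 person-years."""
    return 1000.0 * deaths / person_years

groups = {           # days of exercise per week -> (deaths, person-years)
    "0":   (4200, 900000),
    "1-2": (1100, 400000),
    "3-4": ( 700, 300000),
    "5-6": ( 500, 200000),
    "7":   ( 650, 200000),
}

reference = death_rate(*groups["0"])
for label, (d, py) in groups.items():
    rate = death_rate(d, py)
    print(f"{label:>3} days/week: {rate:5.2f} per 1,000 PY, "
          f"rate ratio {rate / reference:.2f}")
```

With these invented counts the crude rates trace the U-shape the study describes, with the lowest rate in a moderate-frequency group.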

  18. Screening of 439 Pesticide Residues in Fruits and Vegetables by Gas Chromatography-Quadrupole-Time-of-Flight Mass Spectrometry Based on TOF Accurate Mass Database and Q-TOF Spectrum Library.

    Science.gov (United States)

    Li, Jian-Xun; Li, Xiao-Ying; Chang, Qiao-Ying; Li, Yan; Jin, Ling-He; Pang, Guo-Fang; Fan, Chun-Lin

    2018-05-03

    Because of its unique characteristics of accurate-mass full-spectrum acquisition, high resolution, and fast acquisition rates, GC-quadrupole-time-of-flight MS (GC-Q-TOF/MS) has become a powerful tool for pesticide residue analysis. In this study, a TOF accurate mass database and Q-TOF spectrum library of 439 pesticides were established, and the parameters of the TOF database were optimized. Pesticides are extracted from fruit and vegetable substrates with 40 mL of 1% acetic acid in acetonitrile (v/v), purified on a Carbon/NH₂ solid-phase extraction (SPE) cartridge, and finally detected by GC-Q-TOF/MS, allowing rapid analysis of the 439 pesticides in fruits and vegetables. The method validation results show that more than 70 and 91% of pesticides, spiked in fruits and vegetables at concentrations of 10 and 100 μg/kg, respectively, had recoveries conforming to the European Commission's criterion of 70-120% with RSD ≤20%. Eighty-one percent of pesticides have screening detection limits lower than 10 μg/kg, which makes this a reliable analysis technology for the monitoring of pesticide residues in fruits and vegetables. The technology was further validated for its high precision, speed, and throughput through the successful analysis of 9817 samples during 2013-2015.
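
The screening step in such methods amounts to matching each measured accurate mass against the reference database within a tolerance window. A minimal sketch, with hypothetical compound names and masses rather than entries from the paper's 439-pesticide database:

```python
# Minimal accurate-mass screening sketch: match a measured m/z against a
# reference mass database within a ppm tolerance window. The compound
# names and masses below are hypothetical placeholders.

def ppm_error(measured, reference):
    return 1e6 * (measured - reference) / reference

def screen(measured_mz, database, tol_ppm=5.0):
    """Return (name, ppm error) for every entry within tolerance."""
    return [(name, ppm_error(measured_mz, ref))
            for name, ref in database.items()
            if abs(ppm_error(measured_mz, ref)) <= tol_ppm]

db = {"pesticide_A": 348.9263, "pesticide_B": 349.9300}  # hypothetical
hits = screen(348.9270, db, tol_ppm=5.0)
for name, ppm in hits:
    print(name, round(ppm, 1))  # pesticide_A 2.0
```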

  19. An examination of intrinsic errors in electronic structure methods using the Environmental Molecular Sciences Laboratory computational results database and the Gaussian-2 set

    International Nuclear Information System (INIS)

    Feller, D.; Peterson, K.A.

    1998-01-01

    The Gaussian-2 (G2) collection of atoms and molecules has been studied with Hartree–Fock and correlated levels of theory, ranging from second-order perturbation theory to coupled cluster theory with noniterative inclusion of triple excitations. By exploiting the systematic convergence properties of the correlation consistent family of basis sets, complete basis set limits were estimated for a large number of the G2 energetic properties. Deviations with respect to experimentally derived energy differences corresponding to rigid molecules were obtained for 15 basis set/method combinations, as well as the estimated complete basis set limit. The latter values are necessary for establishing the intrinsic error for each method. In order to perform this analysis, the information generated in the present study was combined with the results of many previous benchmark studies in an electronic database, where it is available for use by other software tools. Such tools can assist users of electronic structure codes in making appropriate basis set and method choices that will increase the likelihood of achieving their accuracy goals without wasteful expenditures of computer resources. © 1998 American Institute of Physics.
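
The complete basis set limits referred to above exploit the systematic convergence of the correlation consistent basis sets. One widely used scheme, assumed here purely as an illustration (the paper may use a different fitting form), is the three-point exponential extrapolation E(n) = E_CBS + B·exp(−Cn):

```python
import math

def cbs_exponential(e2, e3, e4):
    """Three-point exponential extrapolation E(n) = E_cbs + B*exp(-C*n),
    e.g. for cc-pVDZ/TZ/QZ energies (n = 2, 3, 4). Solving the three
    equations in closed form: r = exp(-C) follows from successive energy
    differences, and E_cbs from removing the residual B*exp(-4C) term."""
    r = (e4 - e3) / (e3 - e2)            # equals exp(-C)
    return e4 + (e4 - e3) * r / (1.0 - r)

# Synthetic check: energies generated from a known limit of -100.0 hartree
E = lambda n: -100.0 + 1.0 * math.exp(-1.0 * n)
print(cbs_exponential(E(2), E(3), E(4)))  # recovers -100.0
```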

  20. Database security in the cloud

    OpenAIRE

    Sakhi, Imal

    2012-01-01

    The aim of the thesis is to get an overview of the database services available in the cloud computing environment, investigate the associated security risks, and propose possible countermeasures to minimize those risks. The thesis also analyzes two cloud database service providers, namely Amazon RDS and Xeround. These two providers were chosen because they are currently among the leading cloud database providers and both provide relational cloud databases which makes ...

  1. JICST Factual Database(2)

    Science.gov (United States)

    Araki, Keisuke

    A computer program that builds atom-bond connection tables from nomenclature has been developed. Chemical substances are input with their nomenclature and a variety of trivial names or experimental code numbers. The chemical structures in the database are stored stereospecifically and can be searched and displayed according to stereochemistry. Source data come from the laws and regulations of Japan, the RTECS of the US, and so on. The database plays a central role within the integrated fact database service of JICST and makes interrelational retrieval possible.

  2. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions to each of the database program managers. Additionally, how the agency uses the accident data was of major interest

  3. Database for propagation models

    Science.gov (United States)

    Kantak, Anil V.

    1991-07-01

    A propagation researcher or a systems engineer who intends to use the results of a propagation experiment is generally faced with various database tasks such as the selection of the computer software and hardware and the writing of the programs to pass the data through the models of interest. This task is repeated every time a new experiment is conducted or the same experiment is carried out at a different location, generating different data. Thus the users of these data have to spend a considerable portion of their time learning how to implement the computer hardware and software towards the desired end. This situation may be eased considerably if an easily accessible propagation database is created that contains all the accepted (standardized) propagation phenomena models approved by the propagation research community; the handling of data will also become easier for the user. Such a database can only stimulate the growth of propagation research if it is available to all researchers, so that the results of an experiment conducted by one researcher can be examined independently by another without different hardware and software being used. The database may be made flexible so that researchers need not be confined to its contents. The database can also help researchers in that they will not have to document the software and hardware tools used in their research, since the propagation research community will already know the database. The following sections show a possible database construction, as well as properties of the database for propagation research.
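
One way to realize such a database of accepted models is a registry that maps model names to standard implementations, so every researcher runs identical code. The registry design below is an assumption; the free-space path loss formula is its single real entry:

```python
import math

# Sketch of a shared model registry: standardized propagation models are
# registered once and looked up by name. The registry structure itself is
# an assumed design, not one proposed in the record.

def free_space_path_loss(d_km, f_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 32.44 + 20.0 * math.log10(d_km) + 20.0 * math.log10(f_mhz)

MODEL_REGISTRY = {"free_space": free_space_path_loss}

def run_model(name, **params):
    return MODEL_REGISTRY[name](**params)

print(round(run_model("free_space", d_km=1.0, f_mhz=1000.0), 2))  # 92.44 dB
```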

  4. Database of Information technology resources

    OpenAIRE

    Barzda, Erlandas

    2005-01-01

    The subject of this master's thesis is an internet database of information technology resources. The work also addresses the problems of old information systems which do not meet contemporary requirements. The aim is to create an internet information system based on object-oriented technologies and tailored to computer users' needs. The internet information database system helps computer administrators to obtain all the needed information about computer network elements and to easily register all changes int...

  5. Sulfide perovskites for solar energy conversion applications: computational screening and synthesis of the selected compound LaYS3

    DEFF Research Database (Denmark)

    Kuhar, Korina; Crovetto, Andrea; Pandey, Mohnish

    2017-01-01

    of ternary sulfides followed by synthesis and confirmation of the properties of one of the most promising materials. The screening focusses on materials with ABS3 composition taking both perovskite and non-perovskite structures into consideration, and the material selection is based on descriptors...

  6. Screening for early lung cancer with low-dose spiral computed tomography: results of annual follow-up examinations in asymptomatic smokers

    International Nuclear Information System (INIS)

    Diederich, Stefan; Thomas, Michael; Semik, Michael; Lenzen, Horst; Roos, Nikolaus; Weber, Anushe; Heindel, Walter; Wormanns, Dag

    2004-01-01

    The aim of this study was to analyze the incidence results in a prospective one-arm feasibility study of lung cancer screening with low-radiation-dose spiral computed tomography in heavy smokers. Eight hundred seventeen smokers (≥40 years, ≥20 pack-years of smoking history) underwent baseline low-dose CT. Biopsy was recommended in nodules >10 mm with CT morphology suggesting malignancy. In all other lesions follow-up with low-dose CT was recommended. Annual repeat CT was offered to all study participants. Six hundred sixty-eight (81.8%) of the 817 subjects underwent annual repeat CT with a total of 1735 follow-up years. Follow-up of non-calcified nodules present at baseline CT demonstrated growth in 11 of 792 subjects. Biopsy was performed in 8 of 11 growing nodules, 7 of which represented lung cancer. Of 174 new nodules, 3 represented lung cancer. The 10 screen-detected lung cancers were all non-small cell cancer (6 stage IA, 1 stage IB, 1 stage IIIA, 2 stage IV). Five cancers (2 small cell lung cancers: 1 limited disease, 1 extensive disease; 3 central/endobronchial non-small cell lung cancers: 2 stage IIIA, 1 stage IIIB) were diagnosed because of symptoms in the 12-month interval between two annual CT scans. Incidence of lung cancer was lower than prevalence, screen-detected cancers were smaller, and stage I was found in 70% (7 of 10) of screen-detected tumors. Only 27% (4 of 15) of invasive procedures were performed for benign lesions; however, 33% (5 of 15) of all cancers diagnosed in the population were symptom-diagnosed (3 central NSCLC, all stage III, and 2 SCLC), demonstrating the limitations of CT screening. (orig.)
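
Nodule growth between annual CT scans is commonly quantified by the volume doubling time under an exponential-growth assumption. The formula below is the standard one used in CT screening follow-up generally, not a method stated in this particular study:

```python
import math

# Volume doubling time (VDT) of a pulmonary nodule between two CT scans,
# assuming exponential growth: VDT = t * ln(2) / ln(V2 / V1), with nodule
# volume approximated from diameter as a sphere. Illustrative values only.

def sphere_volume(diameter_mm):
    return math.pi / 6.0 * diameter_mm ** 3

def volume_doubling_time(d1_mm, d2_mm, interval_days):
    v1, v2 = sphere_volume(d1_mm), sphere_volume(d2_mm)
    return interval_days * math.log(2) / math.log(v2 / v1)

# A nodule growing from 6 mm to 8 mm over a 365-day annual interval:
print(round(volume_doubling_time(6.0, 8.0, 365), 1))  # 293.1 days
```

Shorter doubling times (roughly under 400 days) are generally taken as more suspicious for malignancy.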

  7. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault tolerance, scalability, and performance. The failure of one database replica does not stop the system from working, as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even for geographically distributed clients, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and

  8. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects–helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design–without changing semantics. You’ll learn how to evolve database schemas in step with source code–and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...
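
A canonical database refactoring such as Rename Column is applied non-destructively: introduce the new column, backfill it, and keep old and new in sync during a transition period before the old column is finally dropped. A minimal sqlite3 sketch of that pattern, with a made-up schema (this is the general technique, not an example from the book):

```python
import sqlite3

# "Rename Column" refactoring in the non-destructive, transition-period
# style: add the better-named column, backfill it, and use a trigger to
# keep it synchronized with writes that still target the old column.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, fname TEXT)")
con.execute("INSERT INTO customer (fname) VALUES ('Ada'), ('Grace')")

# Step 1: add the new column and backfill existing rows.
con.execute("ALTER TABLE customer ADD COLUMN first_name TEXT")
con.execute("UPDATE customer SET first_name = fname")

# Step 2: synchronize writes that still update the old column.
con.execute("""
CREATE TRIGGER sync_first_name AFTER UPDATE OF fname ON customer
BEGIN
    UPDATE customer SET first_name = NEW.fname WHERE id = NEW.id;
END""")

con.execute("UPDATE customer SET fname = 'Adele' WHERE id = 1")
print(con.execute("SELECT first_name FROM customer ORDER BY id").fetchall())
# [('Adele',), ('Grace',)]
```

Once all application code reads and writes `first_name`, the trigger and the old column can be dropped, completing the refactoring without a breaking change.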

  9. snoSeeker: an advanced computational package for screening of guide and orphan snoRNA genes in the human genome

    OpenAIRE

    Yang, Jian-Hua; Zhang, Xiao-Chen; Huang, Zhan-Peng; Zhou, Hui; Huang, Mian-Bo; Zhang, Shu; Chen, Yue-Qin; Qu, Liang-Hu

    2006-01-01

    Small nucleolar RNAs (snoRNAs) represent an abundant group of non-coding RNAs in eukaryotes. They can be divided into guide and orphan snoRNAs according to the presence or absence of antisense sequence to rRNAs or snRNAs. Current snoRNA-searching programs, which are essentially based on sequence complementarity to rRNAs or snRNAs, exist only for the screening of guide snoRNAs. In this study, we have developed an advanced computational package, snoSeeker, which includes CDseeker and ACAseeker ...

  10. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  11. Snowstorm Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Snowstorm Database is a collection of over 500 snowstorms dating back to 1900 and updated operationally. Only storms having large areas of heavy snowfall (10-20...

  12. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May 1,...

  13. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article about a national database for nursing research established at the Danish Institute for Health and Nursing Research. The aim of the database is to gather knowledge about research and development activities within nursing.

  14. Screening of 485 Pesticide Residues in Fruits and Vegetables by Liquid Chromatography-Quadrupole-Time-of-Flight Mass Spectrometry Based on TOF Accurate Mass Database and QTOF Spectrum Library.

    Science.gov (United States)

    Pang, Guo-Fang; Fan, Chun-Lin; Chang, Qiao-Ying; Li, Jian-Xun; Kang, Jian; Lu, Mei-Ling

    2018-03-22

    This paper uses the LC-quadrupole-time-of-flight MS technique to evaluate the MS behavior of 485 pesticides under different conditions and has developed an accurate mass database and spectrum library. A high-throughput screening and confirmation method has been developed for the 485 pesticides in fruits and vegetables. Through the optimization of parameters such as accurate mass, retention time window, and ionization forms, the method has improved the accuracy of pesticide screening, thus avoiding the occurrence of false-positive and false-negative results. The method features a full scan of fragments, with 80% of pesticides having more than 10 qualitative points, which helps increase qualitative accuracy. The abundant differences among fragment categories help realize the effective separation and qualitative identification of isomeric pesticides. Four different fruits and vegetables (apples, grapes, celery, and tomatoes) were chosen to evaluate the efficiency of the method at three fortification levels of 5, 10, and 20 μg/kg, and satisfactory results were obtained. With this method, a national survey of pesticide residues was conducted between 2012 and 2015 on 12,551 samples of 146 different fruits and vegetables collected from 638 sampling points in 284 counties across 31 provincial capitals/cities directly under the central government, which provided scientific data backing for ensuring the pesticide residue safety of the fruits and vegetables consumed daily by the public. The big-data statistical analysis of the new technique further proves it to be of high speed, high throughput, high accuracy, high reliability, and high informatization.

  15. Teaching Historians with Databases.

    Science.gov (United States)

    Burton, Vernon

    1993-01-01

    Asserts that, although pressures to publish have detracted from the quality of teaching at the college level, recent innovations in educational technology have created opportunities for instructional improvement. Describes the use of computer-assisted instruction and databases in college-level history courses. (CFR)

  16. The CAPEC Database

    DEFF Research Database (Denmark)

    Nielsen, Thomas Lund; Abildskov, Jens; Harper, Peter Mathias

    2001-01-01

    The Computer-Aided Process Engineering Center (CAPEC) database of measured data was established with the aim to promote greater data exchange in the chemical engineering community. The target properties are pure component properties, mixture properties, and special drug solubility data. The database divides pure component properties into primary, secondary, and functional properties. Mixture properties are categorized in terms of the number of components in the mixture and the number of phases present. The compounds in the database have been classified on the basis of the functional groups in the compound. This classification makes the CAPEC database a very useful tool, for example, in the development of new property models, since properties of chemically similar compounds are easily obtained. A program with efficient search and retrieval functions of properties has been developed.

  17. Comparison of conventional and cadmium-zinc-telluride single-photon emission computed tomography for analysis of thallium-201 myocardial perfusion imaging: an exploratory study in normal databases for different ethnicities.

    Science.gov (United States)

    Ishihara, Masaru; Onoguchi, Masahisa; Taniguchi, Yasuyo; Shibutani, Takayuki

    2017-12-01

    The aim of this study was to clarify the differences in thallium-201-chloride (thallium-201) myocardial perfusion imaging (MPI) scans evaluated by conventional Anger-type single-photon emission computed tomography (conventional SPECT) versus cadmium-zinc-telluride SPECT (CZT SPECT) imaging in normal databases for different ethnic groups. MPI scans from 81 consecutive Japanese patients were examined using conventional SPECT and CZT SPECT and analyzed with the pre-installed quantitative perfusion SPECT (QPS) software. We compared the summed stress score (SSS), summed rest score (SRS), and summed difference score (SDS) for the two SPECT devices. As a normal MPI reference, we usually use the Japanese databases for MPI created by the Japanese Society of Nuclear Medicine, which can be used with conventional SPECT but not with CZT SPECT. In this study, we used new Japanese normal databases constructed at our institution to compare conventional and CZT SPECT. Compared with conventional SPECT, CZT SPECT showed lower SSS (p < 0.001), SRS (p = 0.001), and SDS (p = 0.189) using the pre-installed SPECT database. In contrast, CZT SPECT showed no significant difference from conventional SPECT in QPS analysis using the normal databases from our institution. Myocardial perfusion analyses by CZT SPECT should be evaluated using normal databases based on the ethnic group being evaluated.

  18. NUCDAS: a database of nuclear criticality data around the world

    International Nuclear Information System (INIS)

    Komuro, Yuichi; Sakai, Tomohiro

    1999-01-01

    The NUCDAS database, which contains a large number of nuclear criticality data and subcritical limits described in the criticality safety handbooks of Japan and other countries, has been developed at JAERI. The database was designed to allow quick searches of criticality data and subcritical limits and to draw their curves for comparison, so criticality data from different handbooks can be shown on the screen and/or printed on paper. The database runs on the Apple Macintosh and is written in 4th Dimension, a relational database application for the Macintosh that provides powerful search and sort capabilities. An appropriate graphics application (e.g., KaleidaGraph) is used to draw a graph of selected criticality data. NUCDAS will be demonstrated in the poster presentation. NUCEF'98 participants who are interested in NUCDAS will be able to operate the Macintosh with the database and are encouraged to give us comments for modifications. Though all messages on the screen are written in Japanese, don't worry. (author)

  19. Feasibility of opportunistic osteoporosis screening in routine contrast-enhanced multi detector computed tomography (MDCT) using texture analysis.

    Science.gov (United States)

    Mookiah, M R K; Rohrmeier, A; Dieckmeyer, M; Mei, K; Kopp, F K; Noel, P B; Kirschke, J S; Baum, T; Subburaj, K

    2018-04-01

    This study investigated the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. The results showed an acceptable reproducibility of texture features, and these features could discriminate between a healthy and an osteoporotic-fracture cohort with an accuracy of 83%. The aim of this study is to investigate the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. We performed texture analysis at the spine in routine MDCT exams and investigated the effect of intravenous contrast medium (IVCM) (n = 7) and slice thickness (n = 7), the long-term reproducibility (n = 9), and the ability to differentiate the healthy/osteoporotic-fracture cohort (n = 9 age- and gender-matched pairs). Eight texture features were extracted using the gray-level co-occurrence matrix (GLCM). The independent-sample t test was used to rank the features of the healthy/fracture cohort, and classification was performed using a support vector machine (SVM). The results revealed significant correlations between texture parameters derived from MDCT scans with and without IVCM (r up to 0.91), between slice thicknesses of 1 mm versus 2 and 3 mm (r up to 0.96), and between scan and rescan (r up to 0.59). The performance of the SVM classifier was evaluated using 10-fold cross-validation and revealed an average classification accuracy of 83%. Opportunistic osteoporosis screening at the spine using specific texture parameters (energy, entropy, and homogeneity) and SVM can be performed in routine contrast-enhanced MDCT exams.
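
The GLCM features highlighted in this record (energy, entropy, homogeneity) are derived from normalized co-occurrence counts of gray-level pairs. A minimal pure-Python sketch for a single horizontal offset; the tiny image and offset are illustrative choices, not the paper's configuration:

```python
import math
from collections import Counter

# Minimal GLCM sketch: count co-occurring gray levels for a (0, +1)
# horizontal neighbor offset, normalize to probabilities, then compute
# the three features the study highlights.

def glcm(image, dr=0, dc=1):
    pairs = Counter()
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                pairs[(image[r][c], image[r2][c2])] += 1
    total = sum(pairs.values())
    return {ij: n / total for ij, n in pairs.items()}

def features(p):
    energy = sum(v * v for v in p.values())
    entropy = -sum(v * math.log2(v) for v in p.values())
    homogeneity = sum(v / (1 + abs(i - j)) for (i, j), v in p.items())
    return energy, entropy, homogeneity

img = [[0, 0, 1],
       [0, 1, 1],
       [1, 1, 0]]
energy, entropy, homogeneity = features(glcm(img))
print(round(energy, 3), round(entropy, 3), round(homogeneity, 3))
# 0.278 1.918 0.75
```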

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  1. ADANS database specification

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-16

    The purpose of the Air Mobility Command (AMC) Deployment Analysis System (ADANS) Database Specification (DS) is to describe the database organization and storage allocation and to provide the detailed data model of the physical design and information necessary for the construction of the parts of the database (e.g., tables, indexes, rules, defaults). The DS includes entity relationship diagrams, table and field definitions, reports on other database objects, and a description of the ADANS data dictionary. ADANS is the automated system used by Headquarters AMC and the Tanker Airlift Control Center (TACC) for airlift planning and scheduling of peacetime and contingency operations as well as for deliberate planning. ADANS also supports planning and scheduling of Air Refueling Events by the TACC and the unit-level tanker schedulers. ADANS receives input in the form of movement requirements and air refueling requests. It provides a suite of tools for planners to manipulate these requirements/requests against mobility assets and to develop, analyze, and distribute schedules. Analysis tools are provided for assessing the products of the scheduling subsystems, and editing capabilities support the refinement of schedules. A reporting capability provides formatted screen, print, and/or file outputs of various standard reports. An interface subsystem handles message traffic to and from external systems. The database is an integral part of the functionality summarized above.

  2. Experiment Databases

    Science.gov (United States)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Next to running machine learning algorithms based on inductive queries, much can be learned by immediately querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are being executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queriable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.
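
A minimal experiment database in this spirit stores one row per algorithm run and answers meta-level queries across all stored runs; the schema and accuracy values below are invented for illustration:

```python
import sqlite3

# Toy experiment database: one row per algorithm run with its dataset and
# evaluation score, queried at the meta level across every stored run.

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE run (
    algorithm TEXT, dataset TEXT, accuracy REAL)""")
con.executemany("INSERT INTO run VALUES (?, ?, ?)", [
    ("random_forest", "iris", 0.95),
    ("svm",           "iris", 0.97),
    ("random_forest", "wine", 0.98),
    ("svm",           "wine", 0.94),
])

# Meta-level query: mean accuracy per algorithm over all stored runs.
for algo, mean_acc in con.execute(
        "SELECT algorithm, AVG(accuracy) FROM run "
        "GROUP BY algorithm ORDER BY 2 DESC"):
    print(algo, round(mean_acc, 3))
```

Because every run is a row, the same store answers many other questions (per-dataset rankings, variance across runs) without re-running any experiment.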

  3. Physical and Clinical Comparison between a Screen-Film System and a Dual-Side Reading Mammography-Dedicated Computed Radiography System

    International Nuclear Information System (INIS)

    Rivetti, S.; Canossi, B.; Battista, R.; Vetruccio, E.; Torricelli, P.; Lanconelli, N.; Danielli, C.; Borasi, G.

    2009-01-01

    Background: Digital mammography systems, thanks to physical performance better than that of conventional screen-film units, have the potential of reducing the dose to patients without decreasing diagnostic accuracy. Purpose: To achieve a physical and clinical comparison between two systems: a screen-film plate and a dual-side reading computed radiography system (CRM; FUJIFILM FCR 5000 MA). Material and Methods: A unique feature of the FCR 5000 MA system is that it has a clear support medium, allowing light emitted during the scanning process to be detected on the 'back' of the storage phosphor plate, considerably improving the system's efficiency. The system's physical performance was tested by means of a quantitative analysis, with calculation of the modulation transfer function, detective quantum efficiency, and contrast-detail analysis; subsequently, the results were compared with those achieved using a screen-film system (SFM; Eastman Kodak MinR-MinR 2000). A receiver operating characteristic (ROC) analysis was then performed on 120 paired clinical images obtained in a craniocaudal projection with the conventional SFM system under standard exposure conditions and also with the CRM system working with a dose reduced by 35% (average breast thickness: 4.3 cm; mean glandular dose: 1.45 mGy). CRM clinical images were interpreted both in hard copy and in soft copy. Results: The ROC analysis revealed that the performances of the two systems (SFM and CRM with reduced dose) were similar (P>0.05): the diagnostic accuracy of the two systems, evaluated in terms of the area under the ROC curve, was 0.74 for the SFM, 0.78 for the CRM (hard copy), and 0.79 for the CRM (soft copy). Conclusion: The outcome of our experiments shows that the dual-side reading CRM system is a very good alternative to the screen-film system.
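
The ROC analysis above rests on the area under the ROC curve, which equals the probability that a randomly chosen positive case receives a higher reader score than a randomly chosen negative one (the Mann-Whitney statistic). A sketch with invented labels and scores:

```python
# AUC computed via its equivalence to the Mann-Whitney U statistic:
# compare every positive score to every negative score, counting ties as
# 0.5. Labels and confidence scores below are invented for illustration.

def roc_auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]              # 1 = malignant, 0 = benign
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2]  # reader confidence ratings
print(round(roc_auc(labels, scores), 3))  # 0.889
```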

  4. WMC Database Evaluation. Case Study Report

    Energy Technology Data Exchange (ETDEWEB)

    Palounek, Andrea P. T [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-29

    The WMC Database is ultimately envisioned to hold a collection of experimental data, design information, and information from computational models. This project was a first attempt at using the Database to access experimental data and extract information from it. This evaluation shows that the Database concept is sound and robust, and that the Database, once fully populated, should remain eminently usable for future researchers.

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  8. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  9. Depression Screening

    Science.gov (United States)

    ... Depression Screening Substance Abuse Screening Alcohol Use Screening Depression Screening (PHQ-9) - Instructions The following questions are ... this tool, there is also text-only version . Depression Screening - Manual Instructions The following questions are a ...

  10. COMPARISON OF POPULAR BIOINFORMATICS DATABASES

    OpenAIRE

    Abdulganiyu Abdu Yusuf; Zahraddeen Sufyanu; Kabir Yusuf Mamman; Abubakar Umar Suleiman

    2016-01-01

    Bioinformatics is the application of computational tools to capture and interpret biological data. It has wide applications in drug development, crop improvement, agricultural biotechnology and forensic DNA analysis. There are various databases available to researchers in bioinformatics. These databases are customized for specific needs and range in size, scope, and purpose. The main drawbacks of bioinformatics databases include redundant information, constant change, data spread over m...

  11. Exploiting PubChem for Virtual Screening.

    Science.gov (United States)

    Xie, Xiang-Qun

    2010-12-01

    IMPORTANCE OF THE FIELD: PubChem is a public molecular information repository, a scientific showcase of the NIH Roadmap Initiative. The PubChem database holds over 27 million records of unique chemical structures of compounds (CID) derived from nearly 70 million substance depositions (SID), and contains more than 449,000 bioassay records covering thousands of established in vitro biochemical and cell-based screening bioassays, targeting more than 7000 proteins and genes and linking to over 1.8 million substances. AREAS COVERED IN THIS REVIEW: This review builds on recent PubChem-related computational chemistry research reported by other authors while providing readers with an overview of the PubChem database, focusing on its increasing role in cheminformatics, virtual screening and toxicity prediction modeling. WHAT THE READER WILL GAIN: These publicly available datasets in PubChem provide great opportunities for scientists to perform cheminformatics and virtual screening research for computer-aided drug design. However, the high volume and complexity of the datasets, in particular the bioassay-associated false positives/negatives and highly imbalanced datasets in PubChem, also create major challenges. Several approaches to modeling PubChem datasets and developing virtual screening models for bioactivity and toxicity predictions are also reviewed. TAKE HOME MESSAGE: Novel data-mining cheminformatics tools and virtual screening algorithms are being developed and used to retrieve, annotate and analyze the large-scale and highly complex PubChem biological screening data for drug design.
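    Programmatic retrieval of the kind this review describes is typically done through PubChem's PUG REST interface. As a minimal sketch, assuming the standard PUG REST URL pattern (the CID and property name below are illustrative examples, not drawn from the review):

```python
# Minimal sketch: assembling a PubChem PUG REST URL for a compound property.
# CID 2244 (aspirin) and the MolecularFormula property are illustrative only.
BASE = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

def compound_property_url(cid: int, prop: str, fmt: str = "JSON") -> str:
    """Assemble a PUG REST URL for one property of one compound record."""
    return f"{BASE}/compound/cid/{cid}/property/{prop}/{fmt}"

url = compound_property_url(2244, "MolecularFormula")
# Fetching `url` with any HTTP client returns the property in JSON form.
```

    A bulk virtual-screening workflow would substitute a list of CIDs and batch the requests so as to respect PubChem's usage limits.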

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  13. Development of an effective dose coefficient database using a computational human phantom and Monte Carlo simulations to evaluate exposure dose for the usage of NORM-added consumer products.

    Science.gov (United States)

    Yoo, Do Hyeon; Shin, Wook-Geun; Lee, Jaekook; Yeom, Yeon Soo; Kim, Chan Hyeong; Chang, Byung-Uck; Min, Chul Hee

    2017-11-01

    After the Fukushima accident in Japan, the Korean Government implemented the "Act on Protective Action Guidelines Against Radiation in the Natural Environment" to regulate unnecessary radiation exposure to the public. However, despite the law, which came into effect in July 2012, an appropriate method to evaluate the equivalent and effective doses from naturally occurring radioactive material (NORM) in consumer products is not available. The aim of the present study is to develop and validate an effective dose coefficient database enabling the simple and correct evaluation of the effective dose due to the usage of NORM-added consumer products. To construct the database, we used a skin source method with a computational human phantom and Monte Carlo (MC) simulation. For validation, the effective dose obtained from the database by interpolation was compared with that from the original MC method. Our results showed similar equivalent doses across the 26 organs and correspondingly similar effective doses between the database and the direct MC calculations, with sufficient accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
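    The table-lookup step that such a database substitutes for a full MC run can be sketched as a linear interpolation over tabulated values. This is only an illustration of the interpolation method the abstract mentions; the energies and coefficient values below are hypothetical placeholders, not data from the study:

```python
from bisect import bisect_left

# Hypothetical table: photon energy (MeV) -> effective dose coefficient
# (arbitrary illustrative units); energies must be sorted ascending.
energies = [0.1, 0.5, 1.0, 2.0]
coeffs   = [1.2e-3, 4.8e-3, 8.1e-3, 1.3e-2]

def dose_coefficient(e: float) -> float:
    """Linearly interpolate a coefficient within the table (no extrapolation)."""
    if not energies[0] <= e <= energies[-1]:
        raise ValueError("energy outside tabulated range")
    i = bisect_left(energies, e)
    if energies[i] == e:          # exact tabulated point
        return coeffs[i]
    x0, x1 = energies[i - 1], energies[i]
    y0, y1 = coeffs[i - 1], coeffs[i]
    return y0 + (y1 - y0) * (e - x0) / (x1 - x0)
```

    Validation in the spirit of the abstract would compare such interpolated values against fresh MC calculations at energies between the tabulated points.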

  14. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database in which many applications require real-time access to both temporally accurate and multimedia data...

  15. Association between TV viewing, computer use and overweight, determinants and competing activities of screen time in 4- to 13-year-old children.

    Science.gov (United States)

    de Jong, E; Visscher, T L S; HiraSing, R A; Heymans, M W; Seidell, J C; Renders, C M

    2013-01-01

    TV viewing and computer use are associated with childhood overweight, but it remains unclear how these behaviours could best be targeted. The aim of this study was to determine to what extent the association between TV viewing, computer use and overweight is explained by other determinants of overweight, to find determinants of TV viewing and computer use in the home environment and to investigate competing activities. A cross-sectional study was carried out among 4072 children aged 4-13 years in the city of Zwolle, the Netherlands. Data collection consisted of measured height, weight and waist circumference, and a parental questionnaire on socio-demographic characteristics, child's nutrition, physical activity (PA) and sedentary behaviour. Associations were studied with logistic regression analyses, separately for older and younger children and for boys and girls. The odds ratio (OR) of being overweight was 1.70 (95% confidence interval (CI): 1.07-2.72) for viewing TV >1.5 h among 4- to 8-year-old children, adjusted for all potential confounders. Computer use was not significantly associated with overweight. Determinants of TV viewing were as follows: having >2 TVs in the household (OR: 2.38; 95% CI: 1.66-3.41), a TV in the child's bedroom and not having rules on TV viewing. TV viewing and computer use were both associated with shorter sleep duration and not with less PA. The association between TV viewing and overweight is not explained by socio-demographic variables, drinking sugared drinks or eating snacks. Factors in the home environment influence children's TV viewing. Parents have a central role as they determine the number of TVs, the rules and also their children's bedtime. Therefore, interventions to reduce screen time should support parents in making home environmental changes, especially when the children are young.
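    Adjusted odds ratios like those reported above come from exponentiating a logistic-regression coefficient and the endpoints of its Wald confidence interval. A minimal sketch; the standard error below is a hypothetical value chosen so the output resembles the reported OR of 1.70, not a number from the study:

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression
    coefficient (log-odds scale) and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: beta = ln(1.70), se = 0.24 (illustrative only).
or_, lo, hi = odds_ratio_ci(math.log(1.70), 0.24)
```

    With these illustrative inputs the interval works out close to the 1.07-2.72 range quoted in the abstract, which is how such a CI would typically arise.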

  16. Interpretation Time Using a Concurrent-Read Computer-Aided Detection System for Automated Breast Ultrasound in Breast Cancer Screening of Women With Dense Breast Tissue.

    Science.gov (United States)

    Jiang, Yulei; Inciardi, Marc F; Edwards, Alexandra V; Papaioannou, John

    2018-05-24

    The purpose of this study was to compare the diagnostic accuracy and interpretation time of screening automated breast ultrasound (ABUS) for women with dense breast tissue without and with use of a computer-aided detection (CAD) system recently approved by the U.S. Food and Drug Administration for concurrent read. In a retrospective observer performance study, 18 radiologists interpreted a cancer-enriched set (i.e., cancer prevalence higher than in the original screening cohort) of 185 screening ABUS studies (52 with and 133 without breast cancer). These studies were from a large cohort of ABUS-screened patients interpreted as BI-RADS density C or D. Each reader interpreted each case twice in a counterbalanced study, once without the CAD system and once with it, separated by 4 weeks. For each case, each reader identified abnormal findings and reported a BI-RADS assessment category and level of suspicion for breast cancer. Interpretation time was recorded. Level-of-suspicion data were compared to evaluate diagnostic accuracy by means of the Dorfman-Berbaum-Metz method of jackknife with ANOVA ROC analysis. Interpretation times were compared by ANOVA. The ROC AUC was 0.848 with the CAD system, compared with 0.828 without it, for a difference of 0.020 (95% CI, -0.011 to 0.051), and was statistically noninferior to the AUC without the CAD system with respect to a margin of -0.05 (p = 0.000086). The mean interpretation time was 3 minutes 33 seconds per case without the CAD system and 2 minutes 24 seconds with it, for a difference of 1 minute 9 seconds saved (95% CI, 44-93 seconds; p = 0.000014), or a reduction in interpretation time to 67% of the time without the CAD system. Use of the concurrent-read CAD system for interpretation of screening ABUS studies of women with dense breast tissue who do not have symptoms is expected to make interpretation significantly faster and produce noninferior diagnostic accuracy compared with interpretation without the CAD system.
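    The noninferiority conclusion reported above follows a simple decision rule: the lower bound of the confidence interval for the AUC difference must lie above the prespecified margin. A sketch of that rule using the figures quoted in the abstract:

```python
def noninferior(diff_ci_lower: float, margin: float) -> bool:
    """Noninferiority holds when the lower CI bound of
    (AUC_with_CAD - AUC_without_CAD) stays above the margin."""
    return diff_ci_lower > margin

# Reported values: AUC difference 0.020, 95% CI (-0.011, 0.051), margin -0.05.
# Since -0.011 > -0.05, the CI excludes the margin and noninferiority holds.
result = noninferior(-0.011, -0.05)
```

    Had the CI lower bound fallen below -0.05, the same rule would have failed to establish noninferiority regardless of the point estimate.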

  17. Computer Aided Screening of Phytochemicals from Garcinia against the Dengue NS2B/NS3 Protease.

    Science.gov (United States)

    Qamar, Tahir Ul; Mumtaz, Arooj; Ashfaq, Usman Ali; Azhar, Samia; Fatima, Tabeer; Hassan, Muhammad; Hussain, Syed Sajid; Akram, Waheed; Idrees, Sobia

    2014-01-01

    The dengue virus NS2B/NS3 protease, because of its ability to cleave viral proteins, is considered an attractive target for screening antiviral agents. Medicinal plants contain a variety of phytochemicals that can be used as drugs against different diseases and infections. Therefore, this study was designed to uncover possible phytochemicals of different classes (aromatics, carbohydrates, lignins, saponins, steroids, tannins, terpenoids, xanthones) that could be used as inhibitors against the NS2B/NS3 protease of DENV. With the help of molecular docking, the Garcinia phytochemicals, among all tested phytochemicals, were found to bind deeply inside the active site of the DENV NS2B/NS3 protease and had interactions with the catalytic triad (His51, Asp75, Ser135). Thus, it can be concluded from the study that these Garcinia phytochemicals could serve as important inhibitors of viral replication inside the host cell. Further in vitro investigations are required to confirm their efficacy.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. The Danish Fetal Medicine Database

    Directory of Open Access Journals (Sweden)

    Ekelund CK

    2016-10-01

    Charlotte Kvist Ekelund,1 Tine Iskov Kopp,2 Ann Tabor,1 Olav Bjørn Petersen3 1Department of Obstetrics, Center of Fetal Medicine, Rigshospitalet, University of Copenhagen, Copenhagen, Denmark; 2Registry Support Centre (East) – Epidemiology and Biostatistics, Research Centre for Prevention and Health, Glostrup, Denmark; 3Fetal Medicine Unit, Aarhus University Hospital, Aarhus Nord, Denmark Aim: The aim of this study is to set up a database in order to monitor the detection rates and false-positive rates of first-trimester screening for chromosomal abnormalities and prenatal detection rates of fetal malformations in Denmark. Study population: Pregnant women with a first- or second-trimester ultrasound scan performed at any public hospital in Denmark are registered in the database. Main variables/descriptive data: Data on maternal characteristics and ultrasonic and biochemical variables are continuously sent from the fetal medicine units' Astraia databases to the central database via a web service. Information about the outcome of pregnancy (miscarriage, termination, live birth, or stillbirth) is received from the National Patient Register and National Birth Register and linked via the Danish unique personal registration number. Furthermore, the results of all pre- and postnatal chromosome analyses are sent to the database. Conclusion: It has been possible to establish a fetal medicine database, which monitors first-trimester screening for chromosomal abnormalities and second-trimester screening for major fetal malformations with input from already collected data. The database is valuable for assessing performance at a regional level and for comparing Danish performance with international results at a national level. Keywords: prenatal screening, nuchal translucency, fetal malformations, chromosomal abnormalities

  1. On the Future of Thermochemical Databases, the Development of Solution Models and the Practical Use of Computational Thermodynamics in Volcanology, Geochemistry and Petrology: Can Innovations of Modern Data Science Democratize an Oligarchy?

    Science.gov (United States)

    Ghiorso, M. S.

    2014-12-01

    Computational thermodynamics (CT) has now become an essential tool of petrologic and geochemical research. CT is the basis for the construction of phase diagrams, the application of geothermometers and geobarometers, the equilibrium speciation of solutions, the construction of pseudosections, calculations of mass transfer between minerals, melts and fluids, and it provides a means of estimating materials properties for the evaluation of constitutive relations in fluid dynamical simulations. The practical application of CT to Earth science problems requires data: data on the thermochemical properties and the equation of state of relevant materials, and data on the relative stability and partitioning of chemical elements between phases as a function of temperature and pressure. These data must be evaluated and synthesized into a self-consistent collection of theoretical models and model parameters that is colloquially known as a thermodynamic database. Quantitative outcomes derived from CT rely on the existence, maintenance and integrity of thermodynamic databases. Unfortunately, the community is reliant on too few such databases, developed by a small number of research groups, and mostly under circumstances where refinement and updates to the database lag behind or are unresponsive to need. Given the increasing level of reliance on CT calculations, what is required is a paradigm shift in the way thermodynamic databases are developed, maintained and disseminated. They must become community resources, with flexible and accessible software interfaces that permit easy modification, while at the same time maintaining theoretical integrity and fidelity to the underlying experimental observations. Advances in computational and data science give us the tools and resources to address this problem, allowing CT results to be obtained at the speed of thought, and permitting geochemical and petrological intuition to play a key role in model development and calibration.

  2. Diagnosis and screening of small hepatocellular carcinomas. Comparison of radionuclide imaging, ultrasound, computed tomography, hepatic angiography, and alpha 1-fetoprotein assay

    International Nuclear Information System (INIS)

    Takashima, T.; Matsui, O.; Suzuki, M.; Ida, M.

    1982-01-01

    Twenty-nine small (less than 5 cm) hepatocellular carcinomas in 18 patients were examined by radionuclide imaging (RN), ultrasound (US), computed tomography (CT), hepatic angiography, and serum alpha 1-fetoprotein (AFP) assay. Sensitivity was 39% with RN, 50% with US, 56% with CT, and 94% with angiography, including infusion hepatic angiography (IHA). Lesions larger than 3 cm could be detected by all of these methods; those between 2 and 3 cm were generally shown by US and CT but not RN. IHA was essential for diagnosis of lesions less than 2 cm, which were otherwise difficult or impossible to detect except with angiography. As a screening method, AFP was best, followed by US and CT. The authors recommend using AFP and US to minimize expense and radiation exposure. In questionable cases, IHA should be performed.

  3. Reflections on the Implementation of Low-Dose Computed Tomography Screening in Individuals at High Risk of Lung Cancer in Spain.

    Science.gov (United States)

    Garrido, Pilar; Sánchez, Marcelo; Belda Sanchis, José; Moreno Mata, Nicolás; Artal, Ángel; Gayete, Ángel; Matilla González, José María; Galbis Caravajal, José Marcelo; Isla, Dolores; Paz-Ares, Luis; Seijo, Luis M

    2017-10-01

    Lung cancer (LC) is a major public health issue. Despite recent advances in treatment, primary prevention and early diagnosis are key to reducing the incidence and mortality of this disease. A recent clinical trial demonstrated the efficacy of selective screening by low-dose computed tomography (LDCT) in reducing the risk of both lung cancer mortality and all-cause mortality in high-risk individuals. This article contains the reflections of an expert group on the use of LDCT for early diagnosis of LC in high-risk individuals, and on how to evaluate its implementation in Spain. The expert group was set up by the Spanish Society of Pulmonology and Thoracic Surgery (SEPAR), the Spanish Society of Thoracic Surgery (SECT), the Spanish Society of Radiology (SERAM) and the Spanish Society of Medical Oncology (SEOM). Copyright © 2017 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.

  4. An integrated methodological approach to the computer-assisted gas chromatographic screening of basic drugs in biological fluids using nitrogen selective detection.

    Science.gov (United States)

    Dugal, R; Massé, R; Sanchez, G; Bertrand, M J

    1980-01-01

    This paper presents the methodological aspects of a computerized system for the gas-chromatographic screening and primary identification of central nervous system stimulants and narcotic analgesics (including some of their respective metabolites) extracted from urine. The operating conditions of a selective nitrogen detector for optimized analytical functions are discussed, particularly the effect of carrier and fuel gas on the detector's sensitivity to nitrogen-containing molecules and its discriminating performance against biological matrix interferences. The application of simple extraction techniques, combined with rapid derivatization procedures, computer data acquisition, and reduction of chromatographic data, is presented. Results show that this system approach allows for the screening of several drugs and their metabolites in a short amount of time. The reliability and stability of the system have been tested by analyzing several thousand samples for doping control at major international sporting events and for monitoring drug intake in addicts participating in a rehabilitation program. Results indicate that these techniques can be used and adapted to many different analytical toxicology situations.

  5. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. Network-based Database Course

    DEFF Research Database (Denmark)

    Nielsen, J.N.; Knudsen, Morten; Nielsen, Jens Frederik Dalsgaard

    A course in database design and implementation has been designed, utilizing existing network facilities. The course is an elementary course for students of computer engineering. Its purpose is to give the students theoretical database knowledge as well as practical experience with design...... and implementation. A tutorial relational database and the students' self-designed databases are implemented on the UNIX system of Aalborg University, thus giving the teacher the possibility of live demonstrations in the lecture room, and the students the possibility of interactive learning in their working rooms...

  13. COMPUTER AIDED DIAGNOSIS FOR DETECTION AND STAGE IDENTIFICATION OF CERVICAL CANCER BY USING PAP SMEAR SCREENING TEST IMAGES

    Directory of Open Access Journals (Sweden)

    S. Athinarayanan

    2016-05-01

    Full Text Available A majority of the world's women are affected by cervical cancer, and its mortality rate has risen rapidly. Many researchers have therefore taken up this problem and proposed solutions for detecting the cancer using image-processing techniques, but good results have so far been achieved only with the advanced, high-cost techniques of LBC, biopsy or colposcopy test images. For this reason, the authors chose this problem and set out not only to determine whether a patient is affected by the cancer, but also to identify the severity stage of the disease. The work is based on images from the low-cost Pap smear screening test, processed with various image-processing techniques in the Interactive Data Language (IDL) image-processing software. The final reports should be very useful to pathologists for further analysis.

  14. Database Perspectives on Blockchains

    OpenAIRE

    Cohen, Sara; Zohar, Aviv

    2018-01-01

    Modern blockchain systems are a fresh look at the paradigm of distributed computing, applied under assumptions of large-scale public networks. They can be used to store and share information without a trusted central party. There has been much effort to develop blockchain systems for a myriad of uses, ranging from cryptocurrencies to identity control, supply chain management, etc. None of this work has directly studied the fundamental database issues that arise when using blockchains as the u...

  15. Building a virtual ligand screening pipeline using free software: a survey.

    Science.gov (United States)

    Glaab, Enrico

    2016-03-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
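A minimal sketch of one filtering stage that such a pipeline typically includes: screening a compound library against Lipinski's rule of five. The compound names and property values below are illustrative, not drawn from any real database; an actual pipeline would compute these descriptors with a cheminformatics toolkit such as RDKit.

```python
# Hypothetical library filtering by Lipinski's rule of five.
# Property values are invented for illustration.

def passes_rule_of_five(mol):
    """Allow at most one violation of Lipinski's four criteria."""
    violations = sum([
        mol["mol_weight"] > 500.0,   # molecular weight in Da
        mol["logp"] > 5.0,           # octanol-water partition coefficient
        mol["h_donors"] > 5,         # hydrogen-bond donors
        mol["h_acceptors"] > 10,     # hydrogen-bond acceptors
    ])
    return violations <= 1

library = [
    {"name": "cmpd-1", "mol_weight": 342.4, "logp": 2.1, "h_donors": 2, "h_acceptors": 5},
    {"name": "cmpd-2", "mol_weight": 712.9, "logp": 6.3, "h_donors": 4, "h_acceptors": 12},
]

hits = [m["name"] for m in library if passes_rule_of_five(m)]
print(hits)  # cmpd-1 passes; cmpd-2 violates three criteria and is filtered out
```

In a full pipeline this pre-filter would precede the more expensive docking or similarity-scoring steps, shrinking the library before heavy computation.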

  16. Routine Self-administered, Touch-Screen Computer Based Suicidal Ideation Assessment Linked to Automated Response Team Notification in an HIV Primary Care Setting

    Science.gov (United States)

    Lawrence, Sarah T.; Willig, James H.; Crane, Heidi M.; Ye, Jiatao; Aban, Inmaculada; Lober, William; Nevin, Christa R.; Batey, D. Scott; Mugavero, Michael J.; McCullumsmith, Cheryl; Wright, Charles; Kitahata, Mari; Raper, James L.; Saag, Micheal S.; Schumacher, Joseph E.

    2010-01-01

    Summary The implementation of routine computer-based screening for suicidal ideation and other psychosocial domains through standardized patient reported outcome instruments in two high volume urban HIV clinics is described. Factors associated with an increased risk of self-reported suicidal ideation were determined. Background HIV/AIDS continues to be associated with an under-recognized risk for suicidal ideation, attempted as well as completed suicide. Suicidal ideation represents an important predictor for subsequent attempted and completed suicide. We sought to implement routine screening of suicidal ideation and associated conditions using computerized patient reported outcome (PRO) assessments. Methods Two geographically distinct academic HIV primary care clinics enrolled patients attending scheduled visits from 12/2005 to 2/2009. Touch-screen-based, computerized PRO assessments were implemented into routine clinical care. Substance abuse (ASSIST), alcohol consumption (AUDIT-C), depression (PHQ-9) and anxiety (PHQ-A) were assessed. The PHQ-9 assesses the frequency of suicidal ideation in the preceding two weeks. A response of “nearly every day” triggered an automated page to pre-determined clinic personnel who completed more detailed self-harm assessments. Results Overall 1,216 (UAB= 740; UW= 476) patients completed initial PRO assessment during the study period. Patients were white (53%; n=646), predominantly males (79%; n=959) with a mean age of 44 (± 10). Among surveyed patients, 170 (14%) endorsed some level of suicidal ideation, while 33 (3%) admitted suicidal ideation nearly every day. In multivariable analysis, suicidal ideation risk was lower with advancing age (OR=0.74 per 10 years;95%CI=0.58-0.96) and was increased with current substance abuse (OR=1.88;95%CI=1.03-3.44) and more severe depression (OR=3.91 moderate;95%CI=2.12-7.22; OR=25.55 severe;95%CI=12.73-51.30). Discussion Suicidal ideation was associated with current substance abuse and
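The alert rule described above can be sketched in a few lines: the PHQ-9 suicidal ideation item is scored 0-3 ("not at all" through "nearly every day"), and only the highest response triggers an automated page. The function names, patient identifiers, and pager interface below are illustrative, not the study's actual implementation.

```python
# Hedged sketch of the automated-notification rule: a PHQ-9 item-9
# response of "nearly every day" (score 3) pages the response team.

PHQ9_ITEM9_NEARLY_EVERY_DAY = 3

def should_page_response_team(item9_score: int) -> bool:
    """True only when suicidal ideation is reported 'nearly every day'."""
    return item9_score >= PHQ9_ITEM9_NEARLY_EVERY_DAY

def process_assessment(patient_id: str, item9_score: int, pager):
    # pager is any callable that delivers the notification text
    if should_page_response_team(item9_score):
        pager(f"Self-harm assessment needed for patient {patient_id}")

alerts = []
process_assessment("UAB-0042", 3, alerts.append)  # triggers a page
process_assessment("UW-0107", 1, alerts.append)   # below threshold, no page
print(alerts)
```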

  17. Computational Characterization of Small Molecules Binding to the Human XPF Active Site and Virtual Screening to Identify Potential New DNA Repair Inhibitors Targeting the ERCC1-XPF Endonuclease

    Directory of Open Access Journals (Sweden)

    Francesco Gentile

    2018-04-01

    Full Text Available The DNA excision repair protein ERCC-1-DNA repair endonuclease XPF (ERCC1-XPF) is a heterodimeric endonuclease essential for the nucleotide excision repair (NER) DNA repair pathway. Although its activity is required to maintain genome integrity in healthy cells, ERCC1-XPF can counteract the effect of DNA-damaging therapies such as platinum-based chemotherapy in cancer cells. Therefore, a promising approach to enhance the effect of these therapies is to combine their use with small molecules that can inhibit the repair mechanisms in cancer cells. Currently, there are no structures available for the catalytic site of human ERCC1-XPF, which performs the metal-mediated cleavage of a damaged DNA strand on the 5′ side. We adopted a homology modeling strategy to build a structural model of the human XPF nuclease domain containing the active site, and extracted dominant conformations of the domain using molecular dynamics simulations followed by clustering of the trajectory. We investigated the binding modes of known small-molecule inhibitors targeting the active site to build a pharmacophore model. We then performed a virtual screening of the ZINC Is Not Commercial 15 (ZINC15) database to identify new ERCC1-XPF endonuclease inhibitors. Our work provides structural insights regarding the binding mode of small molecules targeting the ERCC1-XPF active site that can be used to rationally optimize such compounds. We also propose a set of new potential DNA repair inhibitors to be considered for combination cancer therapy strategies.
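The "dominant conformations by clustering" step can be sketched with a simple leader algorithm over pairwise RMSD between trajectory frames. This is a toy illustration, not the workflow of the study: real analyses superimpose structures first and use packages such as MDAnalysis or GROMACS tools, and the frames and cutoff here are invented values.

```python
import math

# Toy leader clustering of trajectory frames by pairwise RMSD.
# Frames are lists of (x, y, z) coordinates; no superposition is done.

def rmsd(a, b):
    """Root-mean-square deviation between two equal-length coordinate sets."""
    n = len(a)
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                         for (ax, ay, az), (bx, by, bz) in zip(a, b)) / n)

def leader_cluster(frames, cutoff):
    """Assign each frame to the first cluster whose leader lies within cutoff."""
    leaders, clusters = [], []
    for frame in frames:
        for i, leader in enumerate(leaders):
            if rmsd(frame, leader) < cutoff:
                clusters[i].append(frame)
                break
        else:  # no leader close enough: frame starts a new cluster
            leaders.append(frame)
            clusters.append([frame])
    return clusters

frames = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    [(0.1, 0.0, 0.0), (1.1, 0.0, 0.0)],   # near the first frame
    [(5.0, 0.0, 0.0), (6.0, 0.0, 0.0)],   # a distinct conformation
]
clusters = leader_cluster(frames, cutoff=1.0)
print(len(clusters))  # 2 dominant conformations
```

The cluster representatives (leaders) would then serve as the receptor conformations for docking or pharmacophore modeling.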

  18. THE USAGE OF ORIGINAL COMPUTER PROGRAM FOR SCREENING OF SENILE ASTHENIA IN PRE- AND POST GRADUATE MEDICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Svetlana G. Gorelik

    2014-02-01

    Full Text Available The article presents the results of introducing information technologies into the educational process of medical students and postgraduate trainees. The relevance of the material follows from problems prevailing in the higher-education system and from the need for cooperation between practical health care and theoretical knowledge. To this end, the original "Computer program for optimization of geriatric care depending on the degree of senile asthenia" was proposed. The program not only qualitatively changes the information environment of the educational system but also increases the effectiveness of knowledge acquisition, raising the quality of medical and social care for the population.

  19. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  20. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  2. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  6. Interpenetrating metal-organic and inorganic 3D networks: a computer-aided systematic investigation. Part II [1]. Analysis of the Inorganic Crystal Structure Database (ICSD)

    International Nuclear Information System (INIS)

    Baburin, I.A.; Blatov, V.A.; Carlucci, L.; Ciani, G.; Proserpio, D.M.

    2005-01-01

    Interpenetration in metal-organic and inorganic networks has been investigated by a systematic analysis of the crystallographic structural databases. We have used a version of TOPOS (a package for multipurpose crystallochemical analysis) adapted for searching for interpenetration and based on the concept of Voronoi-Dirichlet polyhedra and on the representation of a crystal structure as a reduced finite graph. In this paper, we report comprehensive lists of interpenetrating inorganic 3D structures from the Inorganic Crystal Structure Database (ICSD), inclusive of 144 Collection Codes for equivalent interpenetrating nets, analyzed on the basis of their topologies. Distinct Classes, corresponding to the different modes in which individual identical motifs can interpenetrate, have been attributed to the entangled structures. Interpenetrating nets of different nature as well as interpenetrating H-bonded nets were also examined

  7. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    International Nuclear Information System (INIS)

    Tyupikova, T.V.; Samoilov, V.N.

    2003-01-01

    Modern information technologies push the natural sciences toward further development. This, however, requires the evaluation of infrastructures in order to create favourable conditions for the development of science and of the financial base needed to justify and legally protect new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization and control of shared databases, using as an example the electronic document handling system that functions in several departments of the Joint Institute for Nuclear Research

  8. Colorectal Cancer Screening

    OpenAIRE

    Quintero, Enrique; Saito, Yutaka; Hassan, Cessare; Senore, Carlo

    2012-01-01

    Colorectal cancer, which is the leading cancer in Singapore, can be prevented by increased use of screening and polypectomy. A range of screening strategies such as stool-based tests, flexible sigmoidoscopy, colonoscopy and computed tomography colonography are available, each with different strengths and limitations. Primary care physicians should discuss appropriate screening modalities with their patients, tailored to their individual needs. Physicians, patients and the government should wo...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  10. The utility of computed tomography as a screening tool for the evaluation of pediatric blunt chest trauma.

    Science.gov (United States)

    Markel, Troy A; Kumar, Rajiv; Koontz, Nicholas A; Scherer, L R; Applegate, Kimberly E

    2009-07-01

    There is a growing concern that computed tomography (CT) is being unnecessarily overused for the evaluation of pediatric patients. The purpose of this study was to analyze the trends and utility of chest CT use compared with chest X-ray (CXR) for the evaluation of children with blunt chest trauma. A 4-year retrospective review was performed for pediatric patients who underwent chest CT within 24 hours of sustaining blunt trauma at a Level-I trauma center. Trends in the use of CT and CXR were documented, and results of radiology reports were analyzed and compared with clinical outcomes. Three hundred thirty-three children, mean age 11 years, had chest CTs, increasing from 5.5% in 2001-2002 to 10.5% in 2004-2005 (p tool to analyze which patients may require CT evaluation. A multidisciplinary approach is warranted to develop guidelines that standardize the use of CT and thereby decreases unnecessary radiation exposure to pediatric patients.

  11. Efficacy for preoperative screening of liver metastasis using computed tomography and ultrasonography in patients with colon cancer

    International Nuclear Information System (INIS)

    Kotanagi, Hitoshi; Sato, Emi; Murakoshi, Satoshi

    2000-01-01

    To establish an effective examination system for preoperative evaluation of liver metastasis by colon cancer, we analyzed the sensitivity, cost, and efficacy of computed tomography (CT) and ultrasonography (US) in 354 patients (including 63 patients with liver metastasis). The presence or absence of liver metastasis was ultimately diagnosed 5 years after the operation. The sensitivity, specificity, and accuracy for detecting liver metastasis were 65%, 94%, and 89%, respectively for CT, 57%, 97%, and 91% for US, and 65%, 93%, and 88% for CT plus US, and there were no significant differences among them. Neither CT nor US could fully detect intrahepatic cancer spread. The cost of detection of one patient with liver metastasis was 6,298 points for plain CT, 20,169 points for enhanced CT, and 5,773 points for US. It was concluded that CT plus US should not be employed for preoperative assessment of liver metastasis, because the detection rate of the two modalities is not significantly different, and these modalities do not compensate for each other's defects. From the standpoint of the cost-benefit relationship, US should be selected for preoperative evaluation of liver metastasis. (author)
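The reported accuracies follow directly from sensitivity, specificity and prevalence, since accuracy is the prevalence-weighted average of the two rates. As a consistency check (a sketch, not part of the paper), the CT figures above (65% sensitivity, 94% specificity, 63 of 354 patients with metastasis) reproduce the stated 89% accuracy:

```python
# accuracy = sensitivity * P(disease) + specificity * P(no disease),
# here expressed with patient counts from the abstract.

def accuracy(sensitivity, specificity, n_positive, n_total):
    n_negative = n_total - n_positive
    true_pos = sensitivity * n_positive   # expected correctly detected metastases
    true_neg = specificity * n_negative   # expected correctly cleared patients
    return (true_pos + true_neg) / n_total

ct_accuracy = accuracy(0.65, 0.94, 63, 354)
print(round(ct_accuracy, 2))  # 0.89, matching the reported CT accuracy
```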

  12. Computation-based virtual screening for designing novel antimalarial drugs by targeting falcipain-III: a structure-based drug designing approach.

    Science.gov (United States)

    Kesharwani, Rajesh Kumar; Singh, Durg Vijay; Misra, Krishna

    2013-01-01

    better binding affinity compared to the PDB-bound inhibitor of falcipain-III. Docking simulations of falcipain-III with the designed leupeptin analogues using Glide were compared against AutoDock, with 80% agreement in ranking the analogues as better binders than leupeptin. These results further highlight the new leupeptin analogues as promising future inhibitors for the chemotherapeutic prevention of malaria. The Glide results for falcipain-III show very little difference from the AutoDock results in the rank order of binding affinity. Although the analogues form no extra hydrogen bonds, they make an equal number of hydrogen bonds of variable strength compared to leupeptin, and their enhanced hydrophobic and electrostatic interactions support the conclusion that the receptor holds the ligand molecules strongly. A comparative e-pharmacophore study likewise supports the predicted minimum features a ligand molecule requires to act as a falcipain-III inhibitor, and is also helpful for screening large databases for future antimalarial inhibitors.

  13. Design and implementation of the CEBAF element database

    International Nuclear Information System (INIS)

    Larrieu, T.; Joyce, M.; Slominski, C.

    2012-01-01

    With the inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting control computers to building controls screens. A requirement influencing the CED design is that it provide access to not only present, but also future and past configurations of the accelerator. To accomplish this, an introspective database schema was designed that allows new elements, types, and properties to be defined on-the-fly with no changes to table structure. Used in conjunction with Oracle Workspace Manager, it allows users to query data from any time in the database history with the same tools used to query the present configuration. Users can also check out workspaces to use as staging areas for upcoming machine configurations. All access to the CED is through a well-documented Application Programming Interface (API) that is translated automatically from the original C++ source code into native libraries for scripting languages such as Perl, PHP, and Tcl, making access to the CED easy and ubiquitous. (authors)
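A schema that admits new element types and properties without table changes typically resembles an entity-attribute-value layout. The sketch below illustrates that general idea with SQLite; it is an assumption about the style of design described, not the actual CED schema, and the element name "MQA1S01" and property "Length" are invented examples.

```python
import sqlite3

# Generic "introspective" layout: element types and property definitions
# are rows, not columns, so new kinds of elements need no ALTER TABLE.

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE element_type (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE element      (id INTEGER PRIMARY KEY, name TEXT UNIQUE,
                               type_id INTEGER REFERENCES element_type(id));
    CREATE TABLE property_def (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE property_val (element_id  INTEGER REFERENCES element(id),
                               property_id INTEGER REFERENCES property_def(id),
                               value TEXT);
""")

# Define a new type, element, and property entirely at runtime.
db.execute("INSERT INTO element_type (name) VALUES ('Quadrupole')")
db.execute("INSERT INTO element (name, type_id) VALUES ('MQA1S01', 1)")
db.execute("INSERT INTO property_def (name) VALUES ('Length')")
db.execute("INSERT INTO property_val VALUES (1, 1, '0.3')")

row = db.execute("""
    SELECT e.name, p.name, v.value
    FROM property_val v
    JOIN element e      ON e.id = v.element_id
    JOIN property_def p ON p.id = v.property_id
""").fetchone()
print(row)  # ('MQA1S01', 'Length', '0.3')
```

Historical and future configurations (the Workspace Manager aspect) would layer versioning on top of such tables rather than change this basic shape.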

  14. Virtual materials design using databases of calculated materials properties

    International Nuclear Information System (INIS)

    Munter, T R; Landis, D D; Abild-Pedersen, F; Jones, G; Wang, S; Bligaard, T

    2009-01-01

    Materials design is most commonly carried out by experimental trial and error techniques. Current trends indicate that the increased complexity of newly developed materials, the exponential growth of the available computational power, and the constantly improving algorithms for solving the electronic structure problem, will continue to increase the relative importance of computational methods in the design of new materials. One possibility for utilizing electronic structure theory in the design of new materials is to create large databases of materials properties, and subsequently screen these for new potential candidates satisfying given design criteria. We utilize a database of more than 81 000 electronic structure calculations. This alloy database is combined with other published materials properties to form the foundation of a virtual materials design framework (VMDF). The VMDF offers a flexible collection of materials databases, filters, analysis tools and visualization methods, which are particularly useful in the design of new functional materials and surface structures. The applicability of the VMDF is illustrated by two examples. One is the determination of the Pareto-optimal set of binary alloy methanation catalysts with respect to catalytic activity and alloy stability; the other is the search for new alloy mercury absorbers.
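The Pareto-optimal screening step mentioned above can be sketched directly: given candidates scored on two objectives, keep those not dominated by any other candidate. The alloy names and scores below are invented for illustration (and both objectives are treated as "higher is better" for simplicity), not values from the 81 000-calculation database.

```python
# Toy Pareto-front filter over (activity, stability) pairs.

def pareto_front(candidates):
    """Return names of candidates no other candidate dominates in both objectives."""
    front = []
    for name, act, stab in candidates:
        dominated = any(a >= act and s >= stab and (a, s) != (act, stab)
                        for _, a, s in candidates)
        if not dominated:
            front.append(name)
    return front

alloys = [
    ("NiFe", 0.9, 0.4),   # best activity
    ("CoNi", 0.6, 0.8),   # best stability
    ("CuZn", 0.5, 0.3),   # dominated by both of the above
]
print(pareto_front(alloys))  # ['NiFe', 'CoNi']
```

Over a real database the same filter runs on computed descriptors (e.g. adsorption energies and formation energies), and the surviving set is the trade-off curve a designer chooses from.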

  15. Stackfile Database

    Science.gov (United States)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage retrieval and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored. There is efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data [e.g., Gravity Recovery And Climate Experiment -- GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  16. Performance of machine learning methods for ligand-based virtual screening.

    Science.gov (United States)

    Plewczynski, Dariusz; Spieser, Stéphane A H; Koch, Uwe

    2009-05-01

    Computational screening of compound databases has become increasingly popular in pharmaceutical research. This review focuses on the evaluation of ligand-based virtual screening using active compounds as templates in the context of drug discovery. Ligand-based screening techniques are based on comparative molecular similarity analysis of compounds with known and unknown activity. We provide an overview of publications that have evaluated different machine learning methods, such as support vector machines, decision trees, ensemble methods such as boosting, bagging and random forests, clustering methods, neural networks, naïve Bayesian classifiers, data fusion methods and others.
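The "comparative molecular similarity analysis" underlying these methods is commonly a fingerprint comparison: compounds are encoded as bit sets and ranked against an active template by the Tanimoto coefficient. The fingerprints below are tiny invented bit sets; real ones (e.g. ECFP) span thousands of bits and are generated by a cheminformatics toolkit.

```python
# Minimal ligand-based similarity search with the Tanimoto coefficient.
# Fingerprints are represented as sets of "on" bit positions.

def tanimoto(fp_a, fp_b):
    """|A ∩ B| / |A ∪ B| for fingerprints given as sets of on-bits."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

template = {1, 4, 7, 9}              # known active compound (invented)
database = {
    "cmpd-A": {1, 4, 7, 8},          # shares 3 of 5 combined bits -> 0.6
    "cmpd-B": {2, 3, 5},             # nothing in common -> 0.0
}
ranked = sorted(database, key=lambda c: tanimoto(template, database[c]),
                reverse=True)
print(ranked)  # ['cmpd-A', 'cmpd-B']
```

The machine learning methods surveyed in the review essentially learn richer decision boundaries over the same kind of fingerprint (or descriptor) space.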

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  18. Computer tomography colonography participation and yield in patients under surveillance for 6-9 mm polyps in a population-based screening trial

    NARCIS (Netherlands)

    Tutein Nolthenius, Charlotte J.; Boellaard, Thierry N.; de Haan, Margriet C.; Nio, C. Yung; Thomeer, Maarten G. J.; Bipat, Shandra; Montauban van Swijndregt, Alexander D.; van de Vijver, Marc J.; Biermann, Katharina; Kuipers, Ernst J.; Dekker, Evelien; Stoker, Jaap

    2016-01-01

    Surveillance CT colonography (CTC) is a viable option for 6-9 mm polyps at CTC screening for colorectal cancer. We established participation and diagnostic yield of surveillance and determined overall yield of CTC screening. In an invitational CTC screening trial 82 of 982 participants harboured 6-9

  19. Automatic detection of anomalies in screening mammograms

    Science.gov (United States)

    2013-01-01

    Background Diagnostic performance in breast screening programs may be influenced by the prior probability of disease. Since breast cancer incidence is roughly half a percent in the general population, there is a large probability that the screening exam will be normal. That factor may contribute to false negatives. Screening programs typically exhibit about 83% sensitivity and 91% specificity. This investigation was undertaken to determine if a system could be developed to pre-sort screening images into normal and suspicious bins based on their likelihood to contain disease. Wavelets were investigated as a method to parse the image data, potentially removing confounding information. The development of a classification system based on features extracted from wavelet-transformed mammograms is reported. Methods In the multi-step procedure, images were processed using 2D discrete wavelet transforms to create a set of maps at different size scales. Next, statistical features were computed from each map, and a subset of these features was the input for a concerted-effort set of naïve Bayesian classifiers. The classifier network was constructed to calculate the probability that the parent mammography image contained an abnormality. The abnormalities were not identified, nor were they regionalized. The algorithm was tested on two publicly available databases: the Digital Database for Screening Mammography (DDSM) and the Mammographic Image Analysis Society's database (MIAS). These databases contain radiologist-verified images and feature common abnormalities including spiculations, masses, geometric deformations and fibroid tissues. Results The classifier-network designs tested achieved sensitivities and specificities sufficient to be potentially useful in a clinical setting. This first series of tests identified networks with 100% sensitivity and up to 79% specificity for abnormalities. This performance significantly exceeds the mean sensitivity reported in literature
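
    The pipeline the abstract describes (wavelet decomposition, per-subband statistical features, naïve Bayesian classification) can be sketched end to end. Everything below, from the one-level Haar transform to the synthetic 16×16 "mammograms" with an injected bright region, is an illustrative reconstruction under stated assumptions, not the authors' implementation.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar decomposition into LL, LH, HL, HH subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # vertical average
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # vertical difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def features(img):
    """Statistical features per subband: mean magnitude and standard deviation."""
    return np.array([stat for band in haar2d(img)
                     for stat in (np.abs(band).mean(), band.std())])

class GaussianNB:
    """Naive Bayes with independent Gaussian feature likelihoods."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        return self

    def predict(self, X):
        # Sum of per-feature Gaussian log-likelihoods (uniform class prior).
        ll = -0.5 * (np.log(2 * np.pi * self.var[:, None, :])
                     + (X[None, :, :] - self.mu[:, None, :]) ** 2
                     / self.var[:, None, :]).sum(axis=2)
        return self.classes[np.argmax(ll, axis=0)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def make_image(abnormal):
        img = rng.normal(0.0, 1.0, (16, 16))   # background texture
        if abnormal:
            img[4:8, 4:8] += 5.0               # synthetic "suspicious" bright region
        return img

    X = np.array([features(make_image(i >= 40)) for i in range(80)])
    y = np.array([0] * 40 + [1] * 40)          # 0 = normal, 1 = suspicious
    clf = GaussianNB().fit(X, y)
    Xt = np.array([features(make_image(i >= 10)) for i in range(20)])
    yt = np.array([0] * 10 + [1] * 10)
    print("test accuracy:", (clf.predict(Xt) == yt).mean())
```

    A production system would use a deeper multi-scale decomposition (e.g. via PyWavelets) and many more features, but the bin-sorting idea — transform, summarize each map statistically, and score the whole image with a naïve Bayes probability — is the same.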

  20. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  1. Multiplex-PCR-Based Screening and Computational Modeling of Virulence Factors and T-Cell Mediated Immunity in Helicobacter pylori Infections for Accurate Clinical Diagnosis.

    Directory of Open Access Journals (Sweden)

    Sinem Oktem-Okullu

    Full Text Available The outcome of H. pylori infection is closely related to the bacterium's virulence factors and the host immune response. The association between T cells and H. pylori infection has been identified, but the effects of the nine major H. pylori-specific virulence factors (cagA, vacA, oipA, babA, hpaA, napA, dupA, ureA, ureB) on the T cell response in H. pylori-infected patients have not been fully elucidated. We developed a multiplex-PCR assay to detect nine H. pylori virulence genes within three PCR reactions. The expression levels of Th1, Th17 and Treg cell specific cytokines and transcription factors were also detected using qRT-PCR assays. Furthermore, a novel expert-derived model was developed to identify a set of factors and rules that can distinguish ulcer patients from gastritis patients. Among all the virulence factors that we tested, we identified for the first time a correlation between the presence of the napA virulence gene and ulcer disease. Additionally, a positive correlation between the H. pylori dupA virulence factor and IFN-γ, and between the H. pylori babA virulence factor and IL-17, was detected in gastritis and ulcer patients, respectively. By using computer-based models, the clinical outcome of a patient infected with H. pylori can be predicted by screening the patient's H. pylori vacA m1/m2, ureA and cagA status and IFN-γ (Th1), IL-17 (Th17), and FOXP3 (Treg) expression levels. Herein, we report, for the first time, the relationship between H. pylori virulence factors and host immune responses for the diagnostic prediction of gastric diseases using computer-based models.

  2. Multiplex-PCR-Based Screening and Computational Modeling of Virulence Factors and T-Cell Mediated Immunity in Helicobacter pylori Infections for Accurate Clinical Diagnosis.

    Science.gov (United States)

    Oktem-Okullu, Sinem; Tiftikci, Arzu; Saruc, Murat; Cicek, Bahattin; Vardareli, Eser; Tozun, Nurdan; Kocagoz, Tanil; Sezerman, Ugur; Yavuz, Ahmet Sinan; Sayi-Yazgan, Ayca

    2015-01-01

    The outcome of H. pylori infection is closely related to the bacterium's virulence factors and the host immune response. The association between T cells and H. pylori infection has been identified, but the effects of the nine major H. pylori-specific virulence factors (cagA, vacA, oipA, babA, hpaA, napA, dupA, ureA, ureB) on the T cell response in H. pylori-infected patients have not been fully elucidated. We developed a multiplex-PCR assay to detect nine H. pylori virulence genes within three PCR reactions. The expression levels of Th1, Th17 and Treg cell specific cytokines and transcription factors were also detected using qRT-PCR assays. Furthermore, a novel expert-derived model was developed to identify a set of factors and rules that can distinguish ulcer patients from gastritis patients. Among all the virulence factors that we tested, we identified for the first time a correlation between the presence of the napA virulence gene and ulcer disease. Additionally, a positive correlation between the H. pylori dupA virulence factor and IFN-γ, and between the H. pylori babA virulence factor and IL-17, was detected in gastritis and ulcer patients, respectively. By using computer-based models, the clinical outcome of a patient infected with H. pylori can be predicted by screening the patient's H. pylori vacA m1/m2, ureA and cagA status and IFN-γ (Th1), IL-17 (Th17), and FOXP3 (Treg) expression levels. Herein, we report, for the first time, the relationship between H. pylori virulence factors and host immune responses for the diagnostic prediction of gastric diseases using computer-based models.
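
    An expert-derived rule set of the kind the study describes might look like the sketch below. The genes and markers are those named in the abstract, but the scoring rules and thresholds are invented for illustration and are not the published model.

```python
def predict_outcome(genes: dict, expression: dict) -> str:
    """Toy rule-based prediction of ulcer vs. gastritis.

    genes: presence flags, e.g. {'vacA_m1': True, 'ureA': False, 'cagA': True}
    expression: relative levels, e.g. {'IFN_gamma': 2.0, 'IL_17': 0.5, 'FOXP3': 0.4}
    All weights and cutoffs below are hypothetical.
    """
    score = 0
    score += 2 if genes.get("vacA_m1") else 0
    score += 1 if genes.get("cagA") else 0
    score += 1 if genes.get("ureA") else 0
    # Th1/Th17-skewed responses counted toward ulcer, Treg toward gastritis.
    score += 1 if expression.get("IFN_gamma", 0) > 1.0 else 0
    score += 1 if expression.get("IL_17", 0) > 1.0 else 0
    score -= 1 if expression.get("FOXP3", 0) > 1.0 else 0
    return "ulcer" if score >= 3 else "gastritis"

print(predict_outcome({"vacA_m1": True, "cagA": True},
                      {"IFN_gamma": 2.0, "FOXP3": 0.4}))  # → ulcer
```

    The published model was derived from patient data; the point of the sketch is only the structure — boolean gene status plus thresholded expression levels feeding a small interpretable rule set.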

  3. Investigation of CPD and HMDS Sample Preparation Techniques for Cervical Cells in Developing Computer-Aided Screening System Based on FE-SEM/EDX

    Science.gov (United States)

    Ng, Siew Cheok; Abu Osman, Noor Azuan

    2014-01-01

    This paper investigated the effects of critical-point drying (CPD) and hexamethyldisilazane (HMDS) sample preparation techniques for cervical cells on field emission scanning electron microscopy and energy dispersive X-ray (FE-SEM/EDX). We investigated the visualization of the cervical cell image and the elemental distribution on the cervical cell for the two sample preparation techniques. Using FE-SEM/EDX, the cervical cell images were captured and the cell element compositions extracted for both sample preparation techniques. Cervical cell image quality, elemental composition, and processing time were considered for comparison of performance. Qualitatively, the FE-SEM image based on the HMDS preparation technique had better image quality than the CPD technique in terms of the degree of cell spread on the specimen and morphologic signs of cell deterioration (i.e., existence of plate and pellet drying artifacts and membrane blebs). Quantitatively, with mapping and line-scanning EDX analysis, carbon and oxygen element compositions in the HMDS technique were higher than in the CPD technique in terms of weight percentages. The HMDS technique also had a shorter processing time than the CPD technique. The results indicate that FE-SEM imaging, elemental composition, and processing time for sample preparation with the HMDS technique were better than with the CPD technique for cervical cell preparation in developing a computer-aided screening system. PMID:25610902

  4. Development of an imaging-planning program for screen/film and computed radiography mammography for breasts with short chest wall to nipple distance.

    Science.gov (United States)

    Dong, S L; Su, J L; Yeh, Y H; Chu, T C; Lin, Y C; Chuang, K S

    2011-04-01

    Imaging breasts with a short chest wall to nipple distance (CWND) using a traditional mammographic X-ray unit is a technical challenge for mammographers. The purpose of this study is the development of an imaging-planning program to assist in determination of imaging parameters of screen/film (SF) and computed radiography (CR) mammography for short CWND breasts. A traditional mammographic X-ray unit (Mammomat 3000, Siemens, Munich, Germany) was employed. The imaging-planning program was developed by combining the compressed breast thickness correction, the equivalent polymethylmethacrylate thickness assessment for breasts and the tube loading (mAs) measurement. Both phantom exposures and a total of 597 patient exposures were used to examine the imaging-planning program. Results of the phantom study show that the tube loading rapidly decreased with the CWND when the automatic exposure control (AEC) detector was not fully covered by the phantom. For patient exposures with the AEC fully covered by breast tissue, the average fractional tube loadings, defined as the ratio of the mAs predicted by the imaging-planning program to the mAs of the mammogram, were 1.10 and 1.07 for SF and CR mammograms, respectively. The predicted mAs values were comparable to the mAs values determined by the AEC. By applying the imaging-planning program in clinical practice, the mammographer's reliance on experience for determination of the imaging parameters for short CWND breasts is minimised.
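
    The adequacy metric reported above, the fractional tube loading, is a simple ratio of predicted to delivered exposure; a minimal sketch (the function name is assumed, not from the paper):

```python
def fractional_tube_loading(predicted_mAs: float, aec_mAs: float) -> float:
    """Ratio of the program-predicted mAs to the mAs delivered by the AEC.

    A value near 1.0 means the imaging-planning program reproduces the
    AEC-determined exposure; the study reports averages of 1.10 (SF)
    and 1.07 (CR).
    """
    return predicted_mAs / aec_mAs

# Hypothetical exposure pair: a prediction 10% above the AEC value.
print(round(fractional_tube_loading(66.0, 60.0), 2))  # → 1.1
```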

  5. IgG4-associated multifocal systemic fibrosis detected by cancer screening with 18F-FDG positron emission tomography/computed tomography

    International Nuclear Information System (INIS)

    Soga, Shigeyoshi; Kita, Tamotsu; Hiratsuka, Miyuki; Sakaguchi, Chiharu; Shinmoto, Hiroshi; Kosuda, Shigeru; Sakata, Ikuko; Miura, Soichiro

    2010-01-01

    Serial fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) studies were performed with an interval of one year in a 62-year-old man with IgG4-associated multifocal systemic fibrosis (IMSF). He first underwent 18F-FDG PET/CT cancer screening, which revealed multiple 18F-FDG-avid uptakes in the pancreas, prostate, and lymph nodes in the upper mediastinum, pulmonary hila, porta hepatis, and the left iliac and inguinal regions. He was not symptomatic at this initial examination. The follow-up 18F-FDG PET/CT study showed disappearance of 18F-FDG-avid uptake foci in the pancreas despite no treatment having been administered, but demonstrated new lesions in the abdominal para-aortic region and more intense FDG uptake in the porta hepatis lesion. Serial 18F-FDG PET/CT studies might be useful in monitoring patients with IMSF, as well as evaluating the state of systemic involvement. Findings of 18F-FDG PET/CT may provide information useful for determining the optimal initiation of IMSF treatment. (author)

  6. Enforcing Privacy in Cloud Databases

    OpenAIRE

    Moghadam, Somayeh Sobati; Darmont, Jérôme; Gavin, Gérald

    2017-01-01

    International audience; Outsourcing databases, i.e., resorting to Database-as-a-Service (DBaaS), is nowadays a popular choice due to the elasticity, availability, scalability and pay-as-you-go features of cloud computing. However, most data are sensitive to some extent, and data privacy remains one of the top concerns to DBaaS users, for obvious legal and competitive reasons. In this paper, we survey the mechanisms that aim at making databases secure in a cloud environment, and discuss current...

  7. Database Systems - Present and Future

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Database systems nowadays have an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet tends to develop worldwide. In the current informatics context, the development of applications with databases is the work of specialists. Using databases, accessing a database from various applications, and related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area regarding the fundamental database systems issues, which are necessary in order to train specialists in economic informatics higher education. Database systems integrate and interfere with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a set of minimum, mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, data structures. The article also presents the actual trends in the evolution of database systems, in the context of economic informatics.

  8. ISSUES IN MOBILE DISTRIBUTED REAL TIME DATABASES: PERFORMANCE AND REVIEW

    OpenAIRE

    VISHNU SWAROOP,; Gyanendra Kumar Gupta,; UDAI SHANKER

    2011-01-01

    The increase in small, handy electronic devices in computing fields has made computing more popular and useful in business. Tremendous advances in wireless networks and portable computing devices have led to the development of mobile computing. Support of a real-time database system depends upon the timing constraints and the availability of data in a distributed database, and ubiquitous computing pulls in the mobile database concept, which emerges in a new form of technology as mobile distributed ...

  9. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  10. Does computer-aided detection have a role in the arbitration of discordant double-reading opinions in a breast-screening programme?

    International Nuclear Information System (INIS)

    James, J.J.; Cornford, E.J.

    2009-01-01

    Aims: To investigate whether a computer-aided detection (CAD) system could act as an arbitrator of discordant double-reading opinions, replacing the need for an independent third film reader. Methods: The mammograms of the 240 women that underwent arbitration by an independent third reader were identified from the 16,629 women attending our screening centre between July 2003 and April 2004. Mammograms of the arbitration cases were digitized and analysed by a CAD system. To assess the ability of CAD to act as the arbitrator, the site of the CAD prompts was retrospectively compared to the site of any abnormality noted by the original film readers. If a CAD prompt was placed on a region marked by one of the film readers, then the decision of CAD as the arbitrator was that the woman should be recalled for further assessment. If no mark was placed, then the region was considered low risk and the decision was not to recall. The decision of CAD as the arbitrator was retrospectively compared with the original recall decision of the independent third reader. Results: There were 21 cancer cases in the group of women undergoing arbitration, diagnosed both at the original screening episode and subsequently. The independent third reader recalled 15/18 (83%) of the cancers that corresponded with the arbitrated lesion. CAD as the arbitrator would have recalled 16/18 (89%) of the cancers that corresponded to the arbitrated lesion. CAD acting as the arbitrator would have resulted in a significant increase in normal women being recalled to assessment in the arbitration group (P < 0.001). The extra 50 recalls would have potentially increased the overall recall rate to assessment from 3.1 to 3.4%; a relative increase of 10%. Conclusions: The main effect of CAD acting as an arbitrator of discordant double-reading opinions is to increase the recall rate, significantly above what is found when arbitration is performed by an independent third reader. Using CAD as an arbitrator may be an
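
    The arbitration rule described in the Methods reduces to a point-in-region test: recall if any CAD prompt falls inside a region marked by one of the original film readers. The sketch below assumes hypothetical prompt coordinates and rectangular reader-marked regions; the actual study compared prompt and mark sites on digitized films.

```python
def cad_arbitrate(cad_prompts, reader_regions):
    """Decide recall for a discordant double-reading case.

    cad_prompts: iterable of (x, y) prompt locations from the CAD system.
    reader_regions: iterable of (x0, y0, x1, y1) boxes marked by a film reader.
    Returns True (recall) if any prompt lies inside any marked region.
    """
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for (x, y) in cad_prompts
               for (x0, y0, x1, y1) in reader_regions)

print(cad_arbitrate([(120, 80)], [(100, 60, 150, 100)]))  # → True: recall
print(cad_arbitrate([(10, 10)], [(100, 60, 150, 100)]))   # → False: no recall
```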

  11. Computational Screening of Energy Materials

    DEFF Research Database (Denmark)

    Pandey, Mohnish

    , it is the need of the hour to search for environmentally benign renewable energy resources. The biggest source of the renewable energy is our sun and the immense energy it provides can be used to power the whole planet. However, an efficient way to harvest the solar energy to meet all the energy demand has...... not been realized yet. A promising way to utilize the solar energy is the photon assisted water splitting. The process involves the absorption of sunlight with a semiconducting material (or a photoabsorber) and the generated electron-hole pair can be used to produce hydrogen by splitting the water. However...... an accurate description of the energies with the first-principle calculations. Therefore, along this line the accuracy and predictability of the Meta-Generalized Gradient Approximation functional with Bayesian error estimation is also assessed....

  12. Computer Screen or Real Life?

    DEFF Research Database (Denmark)

    Suurmets, Seidi; Clement, Jesper

    the allocation of visual attention actually depends on the study setting has not been investigated. Methods: In this study we used a withinsubject design where identical stimuli were presented to 60 female participants in two settings: 1) mobile, and 2) stationary. This was done with an interval of one month....... In mobile settings visual salience was less predictive of eye movement selections and the dwell times were longer. The stationary presentation of stimuli resulted in considerable central fixation bias and the locations of dwells were more spread out. The free-viewing condition resulted in the highest...... variability between the two settings, but decreased when tasks and time pressure were introduced. Conclusions: These findings have ramifications for the deployment of eyetracking and help to transfer and generalise future findings acquired in lab settings to natural environments. In the context of marketing...

  13. Screen-detected versus interval cancers: Effect of imaging modality and breast density in the Flemish Breast Cancer Screening Programme

    Energy Technology Data Exchange (ETDEWEB)

    Timmermans, Lore; Bacher, Klaus; Thierens, Hubert [Ghent University, Department of Basic Medical Sciences, QCC-Gent, Ghent (Belgium); Bleyen, Luc; Herck, Koen van [Ghent University, Centrum voor Preventie en Vroegtijdige Opsporing van Kanker, Ghent (Belgium); Lemmens, Kim; Ongeval, Chantal van; Steen, Andre van [University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Martens, Patrick [Centrum voor Kankeropsporing, Bruges (Belgium); Brabander, Isabel de [Belgian Cancer Registry, Brussels (Belgium); Goossens, Mathieu [UZ Brussel, Dienst Kankerpreventie, Brussels (Belgium)

    2017-09-15

    To investigate if direct radiography (DR) performs better than screen-film mammography (SF) and computed radiography (CR) in dense breasts in a decentralized organised Breast Cancer Screening Programme. To this end, screen-detected versus interval cancers were studied in different BI-RADS density classes for these imaging modalities. The study cohort consisted of 351,532 women who participated in the Flemish Breast Cancer Screening Programme in 2009 and 2010. Information on screen-detected and interval cancers, breast density scores of radiologist second readers, and imaging modality was obtained by linkage of the databases of the Centre of Cancer Detection and the Belgian Cancer Registry. Overall, 67% of occurring breast cancers are screen detected and 33% are interval cancers, with DR performing better than SF and CR. The interval cancer rate increases gradually with breast density, regardless of modality. In the high-density class, the interval cancer rate exceeds the cancer detection rate for SF and CR, but not for DR. DR is superior to SF and CR with respect to cancer detection rates for high-density breasts. To reduce the high interval cancer rate in dense breasts, use of an additional imaging technique in screening can be taken into consideration. (orig.)

  14. Screen-detected versus interval cancers: Effect of imaging modality and breast density in the Flemish Breast Cancer Screening Programme

    International Nuclear Information System (INIS)

    Timmermans, Lore; Bacher, Klaus; Thierens, Hubert; Bleyen, Luc; Herck, Koen van; Lemmens, Kim; Ongeval, Chantal van; Steen, Andre van; Martens, Patrick; Brabander, Isabel de; Goossens, Mathieu

    2017-01-01

    To investigate if direct radiography (DR) performs better than screen-film mammography (SF) and computed radiography (CR) in dense breasts in a decentralized organised Breast Cancer Screening Programme. To this end, screen-detected versus interval cancers were studied in different BI-RADS density classes for these imaging modalities. The study cohort consisted of 351,532 women who participated in the Flemish Breast Cancer Screening Programme in 2009 and 2010. Information on screen-detected and interval cancers, breast density scores of radiologist second readers, and imaging modality was obtained by linkage of the databases of the Centre of Cancer Detection and the Belgian Cancer Registry. Overall, 67% of occurring breast cancers are screen detected and 33% are interval cancers, with DR performing better than SF and CR. The interval cancer rate increases gradually with breast density, regardless of modality. In the high-density class, the interval cancer rate exceeds the cancer detection rate for SF and CR, but not for DR. DR is superior to SF and CR with respect to cancer detection rates for high-density breasts. To reduce the high interval cancer rate in dense breasts, use of an additional imaging technique in screening can be taken into consideration. (orig.)

  15. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

    Introduction to Database Systems: Functions of a Database; Database Management System; Database Components; Database Development Process. Conceptual Design and Data Modeling: Introduction to Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with Entity-Relationship Model. Table Structure and Normalization: Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Transforming Data Models to Relational Databases; Enforcing Constraints; Creating Database for Business Process. Physical Design and Database

  16. Secure Distributed Databases Using Cryptography

    OpenAIRE

    Ion IVAN; Cristian TOMA

    2006-01-01

    Computational encryption is used intensively by different database management systems for ensuring the privacy and integrity of information that is physically stored in files. Also, the information is sent over the network and is replicated on different distributed systems. It is proved that a satisfying level of security is achieved if the rows and columns of tables are encrypted independently of the table or computer that sustains the data. Also, it is very important that the SQL - Structured Que...

  17. Can breast MRI computer-aided detection (CAD) improve radiologist accuracy for lesions detected at MRI screening and recommended for biopsy in a high-risk population?

    International Nuclear Information System (INIS)

    Arazi-Kleinman, T.; Causer, P.A.; Jong, R.A.; Hill, K.; Warner, E.

    2009-01-01

    Aim: To evaluate the sensitivity and specificity of magnetic resonance imaging (MRI) computer-aided detection (CAD) for breast MRI screen-detected lesions recommended for biopsy in a high-risk population. Material and methods: Fifty-six consecutive Breast Imaging Reporting and Data System (BI-RADS) 3-5 lesions with histopathological correlation [nine invasive cancers, 13 ductal carcinoma in situ (DCIS) and 34 benign] were retrospectively evaluated using a breast MRI CAD prototype (CAD-Gaea). CAD evaluation was performed separately and in consensus by two radiologists specializing in breast imaging, blinded to the histopathology. Thresholds of 50, 80, and 100% and delayed enhancement were independently assessed with CAD. Lesions were rated as malignant or benign according to threshold and delayed enhancement only and in combination. Sensitivities, specificities, and negative predictive values (NPV) were determined for CAD assessments versus pathology. Initial MRI BI-RADS interpretation without CAD versus CAD assessments were compared using paired binary diagnostic tests. Results: Threshold levels for lesion enhancement were: 50% to include all malignant (and all benign) lesions; and 100% for all invasive cancer and high-grade DCIS. Combined use of threshold and enhancement patterns for CAD assessment was best (73% sensitivity, 56% specificity and 76% NPV for all cancer). Sensitivities and NPV were better for invasive cancer (100%/100%) than for all malignancies (54%/76%). Radiologists' MRI interpretation was more sensitive than CAD (p = 0.05), but less specific (p = 0.001) for cancer detection. Conclusion: The breast MRI CAD system used could not improve the radiologists' accuracy for distinguishing all malignant from benign lesions, due to the poor sensitivity for DCIS detection.

  18. Screening mammography-detected cancers: the sensitivity of the computer-aided detection system as applied to full-field digital mammography

    International Nuclear Information System (INIS)

    Yang, Sang Kyu; Cho, Nariya; Ko, Eun Sook; Kim, Do Yeon; Moon, Woo Kyung

    2006-01-01

    We wanted to evaluate the sensitivity of the computer-aided detection (CAD) system for performing full-field digital mammography (FFDM) on the breast cancers that were originally detected by screening mammography. The CAD system (Image Checker v3.1, R2 Technology, Los Altos, Calif.) together with a full-field digital mammography system (Senographe 2000D, GE Medical Systems, Buc, France) was prospectively applied to the mammograms of 70 mammographically detected breast cancer patients (age range, 37-69; median age, 51 years) who had negative findings on their clinical examinations. The sensitivity of the CAD system, according to histopathologic findings and radiologic primary features (i.e., mass, microcalcifications or mass with microcalcifications), and also the false-positive marking rate were then determined. The CAD system correctly depicted 67 of 70 breast cancer lesions (97.5%). The CAD system marked 29 of 30 breast cancers that presented with microcalcifications only (sensitivity 96.7%) and all 18 breast cancers that presented with a mass together with microcalcifications (sensitivity 100%). Twenty of the 22 lesions that appeared as a mass only were marked correctly by the CAD system (sensitivity 90.9%). The CAD system correctly depicted all 22 lesions of ductal carcinoma in situ (sensitivity: 100%), all 13 lesions of invasive ductal carcinoma with ductal carcinoma in situ (sensitivity: 100%) and the 1 lesion of invasive lobular carcinoma (sensitivity: 100%). Thirty-one of the 34 lesions of invasive ductal carcinoma were marked correctly by the CAD system (sensitivity: 91.8%). The rate of false-positive marks was 0.21 mass marks per image and 0.16 microcalcification marks per image. The overall rate of false-positive marks was 0.37 per image. The CAD system using FFDM is useful for the detection of asymptomatic breast cancers, and it has a high overall tumor detection rate. The false negative cases were found in relatively small invasive ductal carcinoma

  19. Screening mammography-detected cancers: the sensitivity of the computer-aided detection system as applied to full-field digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Sang Kyu; Cho, Nariya; Ko, Eun Sook; Kim, Do Yeon; Moon, Woo Kyung [College of Medicine Seoul National University and The Insititute of Radiation Medicine, Seoul National University Research Center, Seoul (Korea, Republic of)

    2006-04-15

    We wanted to evaluate the sensitivity of the computer-aided detection (CAD) system for performing full-field digital mammography (FFDM) on the breast cancers that were originally detected by screening mammography. The CAD system (Image Checker v3.1, R2 Technology, Los Altos, Calif.) together with a full-field digital mammography system (Senographe 2000D, GE Medical Systems, Buc, France) was prospectively applied to the mammograms of 70 mammographically detected breast cancer patients (age range, 37-69; median age, 51 years) who had negative findings on their clinical examinations. The sensitivity of the CAD system, according to histopathologic findings and radiologic primary features (i.e., mass, microcalcifications or mass with microcalcifications), and also the false-positive marking rate were then determined. The CAD system correctly depicted 67 of 70 breast cancer lesions (97.5%). The CAD system marked 29 of 30 breast cancers that presented with microcalcifications only (sensitivity 96.7%) and all 18 breast cancers that presented with a mass together with microcalcifications (sensitivity 100%). Twenty of the 22 lesions that appeared as a mass only were marked correctly by the CAD system (sensitivity 90.9%). The CAD system correctly depicted all 22 lesions of ductal carcinoma in situ (sensitivity: 100%), all 13 lesions of invasive ductal carcinoma with ductal carcinoma in situ (sensitivity: 100%) and the 1 lesion of invasive lobular carcinoma (sensitivity: 100%). Thirty-one of the 34 lesions of invasive ductal carcinoma were marked correctly by the CAD system (sensitivity: 91.8%). The rate of false-positive marks was 0.21 mass marks per image and 0.16 microcalcification marks per image. The overall rate of false-positive marks was 0.37 per image. The CAD system using FFDM is useful for the detection of asymptomatic breast cancers, and it has a high overall tumor detection rate. The false negative cases were found in relatively small invasive ductal carcinoma.
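
    The per-feature sensitivities and false-positive mark rates quoted in the two records above reduce to simple ratios; a minimal sketch (function names assumed), using the abstract's microcalcification-only figure as input:

```python
def sensitivity(detected: int, total: int) -> float:
    """Fraction of cancers of a given type that the CAD system marked."""
    return detected / total

def fp_marks_per_image(total_false_marks: int, n_images: int) -> float:
    """Average number of false-positive CAD marks per image."""
    return total_false_marks / n_images

# 29 of 30 microcalcification-only cancers were marked by CAD:
print(round(100 * sensitivity(29, 30), 1))  # → 96.7
```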

  20. Some Considerations about Modern Database Machines

    Directory of Open Access Journals (Sweden)

    Manole VELICANU

    2010-01-01

    Full Text Available Optimizing the two computing resources of any computing system - time and space - has always been one of the priority objectives of any database. A current and effective solution in this respect is the database machine. Optimizing computer applications by means of database machines has been a steady preoccupation of researchers since the late seventies. Several information technologies have revolutionized the present information framework. Out of these, those which have brought a major contribution to the optimization of databases are: efficient handling of large volumes of data (Data Warehouse, Data Mining, OLAP – On-Line Analytical Processing), the improvement of DBMS – Database Management System – facilities through the integration of new technologies, and the dramatic increase in computing power together with its efficient use (computer networks, massively parallel computing, Grid Computing and so on). All these information technologies, and others, have favored the resumption of research on database machines and the obtaining, in the last few years, of some very good practical results as far as the optimization of computing resources is concerned.

  1. JT-60 database system, 2

    International Nuclear Information System (INIS)

    Itoh, Yasuhiro; Kurihara, Kenichi; Kimura, Toyoaki.

    1987-07-01

    The JT-60 central control system, ''ZENKEI'', collects the control and instrumentation data relevant to discharges, as well as device status data for plant monitoring. The engineering data amount to about 3 Mbytes per shot of discharge. The ''ZENKEI'' control system, which consists of seven minicomputers for on-line real-time control, has little capacity for handling such a large amount of data for physical and engineering analysis. To solve this problem, it was planned to establish the experimental database on the front-end processor (FEP) of the general-purpose large computer in the JAERI Computer Center. A database management system (DBMS), therefore, has been developed for creating the database during the shot interval. The engineering data are transferred from ''ZENKEI'' to the FEP through the dedicated communication line after each shot. A hierarchical data model has been adopted in this database, which consists of data files with a tree structure over three keys: system, discharge type, and shot number. The JT-60 DBMS provides data-handling packages of subroutines for interfacing the database with users' application programs. Subroutine packages supporting graphic processing, and an access-control function for database security, are also provided in this DBMS. (author)
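The hierarchical model described above — data files organized as a tree keyed by system, discharge type, and shot number — can be sketched as a three-level lookup. This is an illustration only; the key names and payload values below are hypothetical, not taken from the JT-60 DBMS:

```python
# Sketch of a three-key hierarchical lookup:
# system -> discharge type -> shot number -> record.
# All names and values here are hypothetical.
database = {
    "poloidal_field": {
        "ohmic": {
            1234: {"plasma_current_kA": 2100},
            1235: {"plasma_current_kA": 2050},
        },
    },
}

def fetch(db, system, discharge_type, shot):
    """Walk the tree one key at a time; raises KeyError if any level is missing."""
    return db[system][discharge_type][shot]

record = fetch(database, "poloidal_field", "ohmic", 1234)
print(record["plasma_current_kA"])  # -> 2100
```

Each level of the tree narrows the search, which is why such a model suits retrieval during a fixed shot interval: a lookup touches only one path from root to leaf.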

  2. Secure Distributed Databases Using Cryptography

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2006-01-01

    Full Text Available Computational encryption is used intensively by database management systems to ensure the privacy and integrity of information that is physically stored in files. The information is also sent over networks and replicated on different distributed systems. It has been shown that a satisfactory level of security is achieved if the rows and columns of tables are encrypted independently of the table or computer that holds the data. It is also very important that SQL (Structured Query Language) query requests and responses be encrypted over the network connection between the client and the database server. All these techniques and methods must be implemented by database administrators, designers, and developers within a consistent security policy.
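One standard-library sketch of the integrity side of such a scheme: tagging each row with an HMAC so that tampering with the stored file (or with data in transit) is detectable on read. This illustrates per-row integrity checking in general, not the specific encryption scheme the article describes, and the key handling is deliberately simplified:

```python
import hashlib
import hmac

SECRET_KEY = b"demo-key-replace-in-production"  # hypothetical key, for illustration

def tag_row(row_fields):
    """Compute an HMAC-SHA256 tag over the row's fields, joined canonically."""
    message = "\x1f".join(row_fields).encode("utf-8")  # unit separator avoids ambiguity
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify_row(row_fields, tag):
    """Constant-time comparison of the stored tag against a freshly computed one."""
    return hmac.compare_digest(tag_row(row_fields), tag)

row = ["42", "alice", "2006-01-01"]
tag = tag_row(row)
print(verify_row(row, tag))                               # True
print(verify_row(["42", "mallory", "2006-01-01"], tag))   # False
```

In a real deployment the key would live outside the database (and per-row confidentiality would additionally require encryption of the field values themselves), but the verify-on-read pattern is the same.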

  3. The CEBAF Element Database and Related Operational Software

    Energy Technology Data Exchange (ETDEWEB)

    Larrieu, Theodore [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Slominski, Christopher [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Keesee, Marie [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Turner, Dennison [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States); Joyce, Michele [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

    2015-09-01

    The newly commissioned 12 GeV CEBAF accelerator relies on a flexible, scalable, and comprehensive database to define the accelerator. This database delivers the configuration for CEBAF operational tools, including hardware checkout, the downloadable optics model, control screens, and much more. The presentation will describe the flexible design of the CEBAF Element Database (CED), its features, and assorted use-case examples.

  4. Protein-Protein Interaction Databases

    DEFF Research Database (Denmark)

    Szklarczyk, Damian; Jensen, Lars Juhl

    2015-01-01

    Years of meticulous curation of the scientific literature and increasingly reliable computational predictions have resulted in the creation of vast databases of protein interaction data. Over the years, these repositories have become a basic framework in which experiments are analyzed and new directions...

  5. Database design using entity-relationship diagrams

    CERN Document Server

    Bagui, Sikha

    2011-01-01

    Data, Databases, and the Software Engineering Process; Data; Building a Database; What is the Software Engineering Process?; Entity Relationship Diagrams and the Software Engineering Life Cycle; Phase 1: Get the Requirements for the Database; Phase 2: Specify the Database; Phase 3: Design the Database; Data and Data Models; Files, Records, and Data Items; Moving from 3 × 5 Cards to Computers; Database Models; The Hierarchical Model; The Network Model; The Relational Model; The Relational Model and Functional Dependencies; Fundamental Relational Database; Relational Database and Sets; Functional

  6. Mathematics for Databases

    NARCIS (Netherlands)

    ir. Sander van Laar

    2007-01-01

    A formal description of a database consists of the description of the relations (tables) of the database together with the constraints that must hold on the database. Furthermore the contents of a database can be retrieved using queries. These constraints and queries for databases can very well be

  7. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The

  8. DOT Online Database

    Science.gov (United States)


  9. Physics in Screening Environments

    Science.gov (United States)

    Certik, Ondrej

    In the current study, we investigated atoms in screening environments such as plasmas. It is common practice to extract physical data, such as temperature and electron densities, from plasma experiments. We present results that address inherent computational difficulties that arise when the screening approach is extended to include the interaction between the atomic electrons. We show that an ambiguity may arise in the interpretation of physical properties, such as temperature and charge density, from experimental data, due to the opposing effects of electron-nucleus screening and electron-electron screening. The focus of the work, however, is on the resolution of inherent computational challenges that appear in the computation of two-particle matrix elements; these enter already at the Hartree-Fock level. Furthermore, as examples of post-Hartree-Fock calculations, we show second-order Green's function results and second-order many-body perturbation theory results. A self-contained derivation of all necessary equations has been included. The accuracy of the implementation of the method is established by comparing standard unscreened results for various atoms and molecules against the literature, for Hartree-Fock as well as for Green's function and many-body perturbation theory. The main results of the thesis are presented in the chapter called Screened Results, where the behavior of several atomic systems under electron-electron and electron-nucleus Debye screening was studied. The computer code that we have developed has been made available for anybody to use. Finally, we present and discuss results obtained for screened interactions, and we examine thoroughly the computational details of the calculations and particular implementations of the method.
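For intuition, electron-nucleus Debye screening replaces the bare Coulomb attraction with a Yukawa-type potential, V(r) = -Z exp(-r/λ_D)/r in atomic units, which recovers the bare -Z/r as the Debye length λ_D grows. The snippet below is a minimal numerical sketch of this standard form (the parameter values are hypothetical, not taken from the thesis):

```python
import math

def screened_coulomb(r, Z=1.0, debye_length=10.0):
    """Debye-screened (Yukawa) electron-nucleus potential in atomic units:
    V(r) = -Z * exp(-r / lambda_D) / r.
    Reduces to the bare Coulomb potential -Z/r as lambda_D -> infinity."""
    return -Z * math.exp(-r / debye_length) / r

r = 1.0
print(screened_coulomb(r, Z=1.0, debye_length=10.0))  # weaker (less negative) than bare Coulomb
print(-1.0 / r)                                       # bare Coulomb, for comparison
```

Shrinking the Debye length weakens the effective attraction at a given radius, which is one ingredient in the interpretation ambiguity the abstract describes; the electron-electron term, the actual focus of the work, enters through the two-particle matrix elements and is not modeled here.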

  10. Computed radiography in NDT application

    International Nuclear Information System (INIS)

    Deprins, Eric

    2004-01-01

    Computed Radiography (CR), or digital radiography using reusable storage phosphor screens, offers a convenient and reliable way to replace film. In addition to the reduced cost of consumables, the return on investment of CR systems is strongly determined by savings in exposure, processing, and archival times. Intangible costs such as plant shutdown, environmental safety, and the longer usability of isotopes are also increasingly important when considering replacing film with storage phosphor systems. But more than in traditional radiography, the use of digital images is a trade-off between speed and the required quality: better image quality is obtained with longer exposure times, slower phosphor screens, and higher scan resolutions. Therefore, different kinds of storage phosphor screens are needed to cover every application. Most operations have the data associated with the tests to be performed stored centrally in a database. Using a digital radiography system gives not only the advantages of manipulating digital images, but also of the digital data associated with them. Smart methods of associating cassettes and storage screens with exposed images improve the workflow of NDT processes and avoid human error. Automated measurement tools increase throughput in different kinds of operations. This paper gives an overview of how certain operations have decided to replace film with Computed Radiography, and what the major benefits for them have been.

  11. Screening for skin cancer.

    Science.gov (United States)

    Helfand, M; Mahon, S M; Eden, K B; Frame, P S; Orleans, C T

    2001-04-01

    Malignant melanoma is often lethal, and its incidence in the United States has increased rapidly over the past 2 decades. Nonmelanoma skin cancer is seldom lethal, but, if advanced, can cause severe disfigurement and morbidity. Early detection and treatment of melanoma might reduce mortality, while early detection and treatment of nonmelanoma skin cancer might prevent major disfigurement and to a lesser extent prevent mortality. Current recommendations from professional societies regarding screening for skin cancer vary. To examine published data on the effectiveness of routine screening for skin cancer by a primary care provider, as part of an assessment for the U.S. Preventive Services Task Force. We searched the MEDLINE database for papers published between 1994 and June 1999, using search terms for screening, physical examination, morbidity, and skin neoplasms. For information on accuracy of screening tests, we used the search terms sensitivity and specificity. We identified the most important studies from before 1994 from the Guide to Clinical Preventive Services, second edition, and from high-quality reviews. We used reference lists and expert recommendations to locate additional articles. Two reviewers independently reviewed a subset of 500 abstracts. Once consistency was established, the remainder were reviewed by one reviewer. We included studies if they contained data on yield of screening, screening tests, risk factors, risk assessment, effectiveness of early detection, or cost effectiveness. We abstracted the following descriptive information from full-text published studies of screening and recorded it in an electronic database: type of screening study, study design, setting, population, patient recruitment, screening test description, examiner, advertising targeted at high-risk groups or not targeted, reported risk factors of participants, and procedure for referrals. 
We also abstracted the yield of screening data including probabilities and numbers

  12. Dietary Supplement Ingredient Database

    Science.gov (United States)

    ... and US Department of Agriculture Dietary Supplement Ingredient Database ... values can be saved to build a small database or add to an existing database for national, ...

  13. Energy Consumption Database

    Science.gov (United States)

    The California Energy Commission has created this on-line database for informal reporting ... classifications. The database also provides easy downloading of energy consumption data into Microsoft Excel (XLSX

  14. Development of a relational database for nuclear material (NM) accounting in RC and I Group

    International Nuclear Information System (INIS)

    Yadav, M.B.; Ramakumar, K.L.; Venugopal, V.

    2011-01-01

    A relational database for nuclear material accounting in the RC and I Group has been developed, with MySQL for the back end and Java for the front end. The back end has been designed to avoid data redundancy, to provide random access to the data, and to make it easy to retrieve the required information from the database. Java Applet and Java Swing components have been used in the front-end development. The front end provides data security and data integrity, generates an inventory status report at the end of each accounting period, and also offers a quick look at required information on the computer screen. The database has been tested on data from three quarters of the year 2009. It has been in use since 1st January, 2010 for the accounting of nuclear material in the RC and I Group. (author)
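The core of such an accounting system — a normalized inventory table plus an end-of-period status report — can be sketched with a few lines of SQL. This is illustrative only (shown here with Python's sqlite3 rather than MySQL); the table name, columns, and figures are hypothetical and not taken from the RC and I Group schema:

```python
import sqlite3

# In-memory database standing in for the accounting back end.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE nm_inventory (
        item_id    INTEGER PRIMARY KEY,   -- avoids duplicate entries (no redundancy)
        material   TEXT NOT NULL,
        quantity_g REAL NOT NULL,
        quarter    TEXT NOT NULL
    )
""")
rows = [
    (1, "uranium", 12.5, "2009-Q2"),
    (2, "uranium",  7.5, "2009-Q3"),
    (3, "thorium",  3.0, "2009-Q3"),
]
conn.executemany("INSERT INTO nm_inventory VALUES (?, ?, ?, ?)", rows)

# Inventory status report: total quantity per material at the end of the period.
report = list(conn.execute(
    "SELECT material, SUM(quantity_g) FROM nm_inventory "
    "GROUP BY material ORDER BY material"))
for material, total in report:
    print(material, total)  # thorium 3.0, then uranium 20.0
```

The same GROUP BY aggregation, filtered by quarter, would produce the per-period reports the abstract mentions; in the real system MySQL would hold the table and the Java front end would issue the query.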

  15. Development of a relational database for nuclear material (NM) accounting in RC and I Group

    Energy Technology Data Exchange (ETDEWEB)

    Yadav, M B; Ramakumar, K L; Venugopal, V [Radioanalytical Chemistry Division, Radiochemistry and Isotope Group, Bhabha Atomic Research Centre, Mumbai (India)

    2011-07-01

    A relational database for nuclear material accounting in the RC and I Group has been developed, with MySQL for the back end and Java for the front end. The back end has been designed to avoid data redundancy, to provide random access to the data, and to make it easy to retrieve the required information from the database. Java Applet and Java Swing components have been used in the front-end development. The front end provides data security and data integrity, generates an inventory status report at the end of each accounting period, and also offers a quick look at required information on the computer screen. The database has been tested on data from three quarters of the year 2009. It has been in use since 1st January, 2010 for the accounting of nuclear material in the RC and I Group. (author)