WorldWideScience

Sample records for automating spreadsheet discovery

  1. Spreadsheet

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    The spreadsheet shown in the tables is intended to show how environmental costs can be calculated, displayed, and modified. It is not intended to show the environmental costs of any real resource or its effects, although it could show such costs if actual data were used. It is based on a hypothetical coal plant emitting various quantities of pollutants to which people are exposed. The environmental cost of the plant consists of the economic value of the ensuing health risks. The values used in the table are intended to be illustrative only, although they are based on modified versions of actual data from a study for the Bonneville Power Administration. The formulas used to calculate the values are also displayed. Although only one environmental effect (health risks) is calculated and valued in this spreadsheet, the same or a similar procedure could be used for a variety of other environmental effects. This spreadsheet is intended to be a model; a complete accounting for all environmental costs associated with a coal plant is beyond the scope of this project.
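
    The accounting pattern described above is simple enough to restate outside a spreadsheet. The Python sketch below mirrors the same chain (emissions → exposure → health risk → monetized cost); every pollutant, factor, and dollar value in it is a hypothetical placeholder, not data from the study or the Bonneville report.

```python
# Hypothetical environmental-cost calculation mirroring the spreadsheet's logic:
# emissions -> population exposure -> health risk -> monetized cost.
# All numbers below are illustrative placeholders, not data from the study.

POLLUTANTS = {
    # tonnes/year emitted by the hypothetical coal plant
    "SO2": 1200.0,
    "NOx": 900.0,
    "PM10": 300.0,
}

# Hypothetical unit factors (assumed for illustration only).
EXPOSURE_PER_TONNE = {"SO2": 4.0e-4, "NOx": 2.5e-4, "PM10": 8.0e-4}  # person-dose per tonne
RISK_PER_DOSE = {"SO2": 1.0e-5, "NOx": 6.0e-6, "PM10": 3.0e-5}       # incidents per dose
VALUE_PER_INCIDENT = 50_000.0                                        # dollars per incident

def environmental_cost(pollutant, tonnes):
    """Cost = emissions x exposure factor x risk factor x $ per incident."""
    dose = tonnes * EXPOSURE_PER_TONNE[pollutant]
    incidents = dose * RISK_PER_DOSE[pollutant]
    return incidents * VALUE_PER_INCIDENT

total = 0.0
for pollutant, tonnes in POLLUTANTS.items():
    cost = environmental_cost(pollutant, tonnes)
    total += cost
    print(f"{pollutant:5s} {tonnes:8.1f} t/yr  ->  ${cost:,.2f}/yr")
print(f"Total health-risk cost: ${total:,.2f}/yr")
```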

  2. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSN with a partial library. Since data are taken every cloudless night, we must deal with varying atmospheric conditions and high background illumination from the Moon. Software is configured to identify a PSN and reshoot it for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or brighter, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.
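
    The record does not include SNARE's internals, but the core step of any such search is differencing tonight's image against the reference library. A toy numpy sketch of that step, on a synthetic frame with one injected transient, follows; the frame size, noise level, and 5-sigma threshold are all assumptions.

```python
# Toy version of the image-differencing step behind an automated supernova
# search: subtract a reference frame from tonight's frame and flag residuals
# well above the combined noise. Both frames are synthetic.

import numpy as np

rng = np.random.default_rng(1)
sky = rng.normal(loc=100.0, scale=5.0, size=(64, 64))       # reference frame
tonight = rng.normal(loc=100.0, scale=5.0, size=(64, 64))   # tonight's frame
tonight[40, 21] += 80.0                                     # injected transient

diff = tonight - sky
noise = np.sqrt(2) * 5.0            # per-pixel noise of the difference image
candidates = np.argwhere(diff > 5 * noise)
for y, x in candidates:
    print(f"candidate at ({y}, {x}): {diff[y, x] / noise:.1f} sigma")
```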

  3. Automated discovery systems and the inductivist controversy

    Science.gov (United States)

    Giza, Piotr

    2017-09-01

    The paper explores possible influences that developments in two branches of AI, called automated discovery and machine learning systems, might have upon some aspects of the old debate between Francis Bacon's inductivism and Karl Popper's falsificationism. Donald Gillies facetiously calls this controversy 'the duel of two English knights', and claims, after some analysis of historical cases of discovery, that Baconian induction had been used in science very rarely, or not at all, although he argues that the situation has changed with the advent of machine learning systems. (Some clarification of the terms machine learning and automated discovery is required here. The key idea of machine learning is that, given data with associated outcomes, software can be trained to make those associations in future cases, which typically amounts to inducing rules from individual cases classified by experts. Automated discovery (also called machine discovery) deals with uncovering new knowledge that is valuable for human beings, and its key idea is that discovery is like other intellectual tasks and that the general idea of heuristic search in problem spaces applies to discovery tasks as well. However, since machine learning systems discover (very low-level) regularities in data, throughout this paper I use the generic term automated discovery for both kinds of systems. I will elaborate on this later on.) Gillies's line of argument can be generalised: thanks to automated discovery systems, philosophers of science have at their disposal a new tool for empirically testing their philosophical hypotheses. Accordingly, in the paper, I will address the question of which of the two philosophical conceptions of scientific method is better vindicated in view of the successes and failures of systems developed within three major research programmes in the field: machine learning systems in the Turing tradition, the normative theory of scientific discovery formulated by Herbert Simon

  4. An automated data handling process integrating spreadsheets and word processors with analytical programs

    International Nuclear Information System (INIS)

    Fisher, G.F.; Bennett, L.G.I.

    1994-01-01

    A data handling process utilizing software programs that are commercially available for use on MS-DOS microcomputers was developed to reduce the time, energy and labour required to tabulate the final results of trace analyses. The elimination of hand computations reduced the possibility of transcription errors since, once the γ-ray spectrum analysis results are obtained and saved to the hard disk of a microcomputer, they can be manipulated very easily with little possibility of distortion. The 8-step process permitted the selection of the best concentration value for each element of interest, based upon its associated peak area. Calculated concentration values were automatically compared against the sample's determination limit. Unsatisfactory values were flagged for later review and adjustment by the user. In the final step, a file was created which identified the samples with their appropriate particulars (i.e. source, sample, date, etc.), and the trace element concentrations were displayed. This final file contained a fully formatted summary table that listed all of the samples' results and particulars such that it could be printed or imported into a word processor for inclusion in a report. In the illustrated application of analyzing wear debris in oil-lubricated systems, over 13,000 individual numbers were processed to arrive at final concentration estimates of 19 trace elements in 80 samples. The system works very well for the elements that were analyzed in this investigation. The usefulness of commercially available spreadsheets and word processors for this task was demonstrated. (author) 5 refs.; 2 figs.; 5 tabs
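
    As a rough illustration of the selection and flagging steps described (best concentration per element by peak area, comparison against a determination limit, flags for later review), here is a minimal Python sketch; the records, element names, and limits are invented, and the original 8-step MS-DOS workflow is not reproduced.

```python
# Sketch of the selection and flagging steps: for each element in each
# sample, keep the concentration backed by the largest peak area, then flag
# values below the element's determination limit for later review.
# All records and limits below are invented illustrations.

results = [
    # (sample, element, concentration in ppm, peak area in counts)
    ("OIL-01", "Fe", 12.4, 5400),
    ("OIL-01", "Fe", 11.9, 7200),   # larger peak area -> preferred value
    ("OIL-01", "Cu", 0.8, 300),
    ("OIL-02", "Fe", 3.1, 4100),
]
determination_limit = {"Fe": 1.0, "Cu": 1.5}   # ppm, assumed

best = {}   # (sample, element) -> (concentration, peak area)
for sample, element, conc, area in results:
    key = (sample, element)
    if key not in best or area > best[key][1]:
        best[key] = (conc, area)

for (sample, element), (conc, _area) in sorted(best.items()):
    flag = "" if conc >= determination_limit[element] else "  <-- flagged for review"
    print(f"{sample}  {element:2s}  {conc:6.2f} ppm{flag}")
```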

  5. Spreadsheet Patents

    DEFF Research Database (Denmark)

    Borum, Holger Stadel; Kirkbro, Malthe Ettrup; Sestoft, Peter

    2018-01-01

    This technical report gives a list of US patents and patent applications related to spreadsheet implementation technology. It is intended as a companion to the monograph Spreadsheet Implementation Technology (Peter Sestoft, MIT Press 2014), and substantially extends and updates an appendix from...

  6. Perspectives on bioanalytical mass spectrometry and automation in drug discovery.

    Science.gov (United States)

    Janiszewski, John S; Liston, Theodore E; Cole, Mark J

    2008-11-01

    The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.

  7. Automated Discovery of Speech Act Categories in Educational Games

    Science.gov (United States)

    Rus, Vasile; Moldovan, Cristian; Niraula, Nobal; Graesser, Arthur C.

    2012-01-01

    In this paper we address the important task of automated discovery of speech act categories in dialogue-based, multi-party educational games. Speech acts are important in dialogue-based educational systems because they help infer the student speaker's intentions (the task of speech act classification), which in turn is crucial to providing adequate…

  8. Automated cell type discovery and classification through knowledge transfer

    Science.gov (United States)

    Lee, Hao-Chih; Kosoy, Roman; Becker, Christine E.

    2017-01-01

    Motivation: Recent advances in mass cytometry allow simultaneous measurements of up to 50 markers at single-cell resolution. However, the high dimensionality of mass cytometry data introduces computational challenges for automated data analysis and hinders translation of new biological understanding into clinical applications. Previous studies have applied machine learning to facilitate processing of mass cytometry data. However, manual inspection is still inevitable and is becoming a barrier to reliable large-scale analysis. Results: We present a new algorithm called Automated Cell-type Discovery and Classification (ACDC) that fully automates the classification of canonical cell populations and highlights novel cell types in mass cytometry data. Evaluations on real-world data show ACDC provides accurate and reliable estimations compared to manual gating results. Additionally, ACDC automatically classifies previously ambiguous cell types to facilitate discovery. Our findings suggest that ACDC substantially improves both the reliability and the interpretability of results obtained from high-dimensional mass cytometry profiling data. Availability and Implementation: A Python package (Python 3) and analysis scripts for reproducing the results are available at https://bitbucket.org/dudleylab/acdc. Contact: brian.kidd@mssm.edu or joel.dudley@mssm.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158442

  9. Automated vocabulary discovery for geo-parsing online epidemic intelligence.

    Science.gov (United States)

    Keller, Mikaela; Freifeld, Clark C; Brownstein, John S

    2009-11-24

    Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need for human input, but which focuses on learning the context in which geographic references appear. We show in a set of experiments that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.
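
    The key idea, seeding from a small gazetteer and learning the context in which place names occur, can be caricatured in a few lines. The sketch below is not the HealthMap implementation; the alerts, the gazetteer, and the one-word context features are all illustrative stand-ins.

```python
# Toy gazetteer-seeded geo-parsing: words matching a small gazetteer provide
# training examples, and the *context* around them (one word to each side)
# is what gets learned, so unseen locations can still be found.
# Gazetteer, alerts, and features are illustrative stand-ins.

from collections import Counter

GAZETTEER = {"nairobi", "jakarta", "lima"}
alerts = [
    "cholera outbreak reported in nairobi this week",
    "officials in jakarta confirm new dengue cases",
    "measles cases rising in lima and surrounding provinces",
    "avian influenza detected in bamako yesterday",   # bamako is NOT in the lexicon
]

# Count the context words observed around known place names.
context = Counter()
for alert in alerts:
    words = alert.split()
    for i, w in enumerate(words):
        if w in GAZETTEER:
            if i > 0:
                context[("L", words[i - 1])] += 1
            if i + 1 < len(words):
                context[("R", words[i + 1])] += 1

def location_score(words, i):
    """Score a token by how location-like its immediate context is."""
    score = 0
    if i > 0:
        score += context[("L", words[i - 1])]
    if i + 1 < len(words):
        score += context[("R", words[i + 1])]
    return score

# "in bamako" matches the learned "in <place>" context, despite no lexicon hit.
words = alerts[3].split()
for i, w in enumerate(words):
    if w not in GAZETTEER and location_score(words, i) > 0:
        print("candidate location:", w)
```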

  10. Automated vocabulary discovery for geo-parsing online epidemic intelligence

    Directory of Open Access Journals (Sweden)

    Freifeld Clark C

    2009-11-01

    Background Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Results Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need for human input, but which focuses on learning the context in which geographic references appear. We show in a set of experiments that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. Conclusion The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.

  11. GWATCH: a web platform for automated gene association discovery analysis

    Science.gov (United States)

    2014-01-01

    Background As genome-wide sequence analyses for complex human disease determinants are expanding, it is increasingly necessary to develop strategies to promote discovery and validation of potential disease-gene associations. Findings Here we present a dynamic web-based platform – GWATCH – that automates and facilitates four steps in genetic epidemiological discovery: 1) Rapid gene association search and discovery analysis of large genome-wide datasets; 2) Expanded visual display of gene associations for genome-wide variants (SNPs, indels, CNVs), including Manhattan plots, 2D and 3D snapshots of any gene region, and a dynamic genome browser illustrating gene association chromosomal regions; 3) Real-time validation/replication of candidate or putative genes suggested from other sources, limiting Bonferroni genome-wide association study (GWAS) penalties; 4) Open data release and sharing by eliminating privacy constraints (The National Human Genome Research Institute (NHGRI) Institutional Review Board (IRB), informed consent, The Health Insurance Portability and Accountability Act (HIPAA) of 1996 etc.) on unabridged results, which allows for open access comparative and meta-analysis. Conclusions GWATCH is suitable for both GWAS and whole genome sequence association datasets. We illustrate the utility of GWATCH with three large genome-wide association studies for HIV-AIDS resistance genes screened in large multicenter cohorts; however, association datasets from any study can be uploaded and analyzed by GWATCH. PMID:25374661

  12. Measuring Spreadsheet Formula Understandability

    NARCIS (Netherlands)

    Hermans, F.F.J.; Pinzger, M.; Van Deursen, A.

    2012-01-01

    Spreadsheets are widely used in industry because they are flexible and easy to use. Often they are used for business-critical applications. It is, however, difficult for spreadsheet users to correctly assess the quality of spreadsheets, especially with respect to their understandability.

  13. Optimization modeling with spreadsheets

    CERN Document Server

    Baker, Kenneth R

    2015-01-01

    An accessible introduction to optimization analysis using spreadsheets Updated and revised, Optimization Modeling with Spreadsheets, Third Edition emphasizes model building skills in optimization analysis. By emphasizing both spreadsheet modeling and optimization tools in the freely available Microsoft® Office Excel® Solver, the book illustrates how to find solutions to real-world optimization problems without needing additional specialized software. The Third Edition includes many practical applications of optimization models as well as a systematic framework that il
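
    The Solver-based modeling such a book teaches maps directly onto any linear-programming library. Below is a hedged sketch of a classic spreadsheet product-mix model in Python, with invented profits and capacities; scipy.optimize.linprog stands in for Excel Solver.

```python
# A spreadsheet-style product-mix model, solved the way Excel Solver would:
# maximize profit subject to resource constraints. Profits and capacities
# below are invented; scipy.optimize.linprog stands in for Solver.

from scipy.optimize import linprog

# Decision variables: units of product P1 and P2 to make.
profit = [40.0, 30.0]          # objective coefficients ($/unit)

# Resource usage per unit (rows: machine hours, labor hours).
A_ub = [[2.0, 1.0],            # machine hours per unit
        [1.0, 3.0]]            # labor hours per unit
b_ub = [100.0, 90.0]           # available hours

# linprog minimizes, so negate the profit to maximize it.
res = linprog(c=[-p for p in profit], A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)

print("make", res.x.round(2), "units; profit =", round(-res.fun, 2))
# Optimum: 42 units of P1 and 16 of P2, for a profit of 2160.
```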

  14. Bell automation system on STM32F4 Discovery board

    OpenAIRE

    Božović, Denis

    2017-01-01

    A bell automation system is a device whose aim is to automate bell ringing as far as possible and thus relieve the person in charge of that duty. The modern way of life and forms of employment generally make it difficult for human bell-ringers to carry out the task as they did for centuries. This thesis explains what can be expected of a bell automation system in the regions of Slovenia, and why it is desirable that it support certain functionalities. Using as an exampl...

  15. Spreadsheets and Bulgarian Goats

    Science.gov (United States)

    Sugden, Steve

    2012-01-01

    We consider a problem appearing in an Australian Mathematics Challenge in 2003. This article considers whether a spreadsheet might be used to model this problem, thus allowing students to explore its structure within the spreadsheet environment. It then goes on to reflect on some general principles of problem decomposition when the final goal is a…

  16. Novel automated biomarker discovery work flow for urinary peptidomics

    DEFF Research Database (Denmark)

    Balog, Crina I.; Hensbergen, Paul J.; Derks, Rico

    2009-01-01

    …samples from Schistosoma haematobium-infected individuals to evaluate clinical applicability. RESULTS: The automated RP-SCX sample cleanup and fractionation system exhibits a high qualitative and quantitative reproducibility, with both BSA standards and urine samples. Because of the relatively high...

  17. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses, whether automated or manually performed, independently or not, and in whatever organizational unit, results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on a new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  18. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  19. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  20. Semi-Automated Discovery of Application Session Structure

    Energy Technology Data Exchange (ETDEWEB)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
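
    To make the Poisson idea concrete: if unrelated connections of some type arrive with rate λ, the probability that one lands within Δt of another connection purely by chance is 1 − e^(−λΔt), so a very small value suggests the two belong to one session. The sketch below applies that test to a fabricated connection log; the rate estimate and significance threshold are assumptions, not the paper's tuned values.

```python
# Sketch of the Poisson-based linkage idea: if connections of type B arrive
# as a Poisson process with rate lam, the chance that one lands within
# delta seconds of a type-A connection "by accident" is 1 - exp(-lam*delta).
# A small probability suggests the B connection belongs to A's session.
# The log below is fabricated for illustration.

import math

log = [  # (time in seconds, source host, service)
    (0.0, "h1", "ftp-ctl"), (0.4, "h1", "ftp-data"),
    (35.0, "h2", "ftp-ctl"), (35.2, "h2", "ftp-data"),
    (900.0, "h1", "ftp-data"),  # isolated, likely unrelated
]

# Background rate of ftp-data arrivals per host (assumed estimate).
lam = 1.0 / 300.0  # one connection every 5 minutes on average

def chance_probability(delta):
    """P(a Poisson arrival within delta seconds by coincidence)."""
    return 1.0 - math.exp(-lam * delta)

SIGNIFICANCE = 0.01
for t_a, host_a, svc_a in log:
    if svc_a != "ftp-ctl":
        continue
    for t_b, host_b, svc_b in log:
        if svc_b == "ftp-data" and host_b == host_a and t_b > t_a:
            p = chance_probability(t_b - t_a)
            verdict = "same session" if p < SIGNIFICANCE else "unrelated"
            print(f"{svc_a}@{t_a:.1f}s -> {svc_b}@{t_b:.1f}s  p={p:.4f}  {verdict}")
```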

  1. Semi-automated knowledge discovery: identifying and profiling human trafficking

    Science.gov (United States)

    Poelmans, Jonas; Elzinga, Paul; Ignatov, Dmitry I.; Kuznetsov, Sergei O.

    2012-11-01

    We propose an iterative and human-centred knowledge discovery methodology based on formal concept analysis. The proposed approach recognizes the important role of the domain expert in mining real-world enterprise applications and makes use of specific domain knowledge, including human intelligence and domain-specific constraints. Our approach was empirically validated at the Amsterdam-Amstelland police to identify suspects and victims of human trafficking in 266,157 suspicious activity reports. Based on guidelines of the Attorneys General of the Netherlands, we first defined multiple early warning indicators that were used to index the police reports. Using concept lattices, we revealed numerous unknown human trafficking and loverboy suspects. In-depth investigation by the police confirmed their involvement in illegal activities, resulting in actual arrests. Our human-centred approach was embedded into operational policing practice and is now successfully used on a daily basis to cope with the rapidly growing amount of unstructured information.

  2. Implementing function spreadsheets

    DEFF Research Database (Denmark)

    Sestoft, Peter

    2008-01-01

    …that of turning an expression into a named function. Hence they proposed a way to define a function in terms of a worksheet with designated input and output cells; we shall call it a function sheet. The goal of our work is to develop implementations of function sheets and study their application to realistic… examples. Therefore, we are also developing a simple yet comprehensive spreadsheet core implementation for experimentation with this technology. Here we report briefly on our experiments with function sheets as well as other uses of our spreadsheet core implementation.
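
    A minimal sketch of the function-sheet idea follows: a worksheet with designated input and output cells is wrapped as an ordinary callable. The toy evaluator supports only numeric constants and Python-syntax formulas over named cells, far less than the authors' spreadsheet core; all cell names and formulas are examples.

```python
# Minimal "function sheet" sketch: a sheet maps cell names to constants or
# formulas; designated input cells are overridden per call and a designated
# output cell is evaluated on demand. Cell names and formulas are examples.

from typing import Callable

# A sheet computing a hypotenuse: A1, A2 are inputs, B1 is the output.
sheet = {
    "A1": 3.0,
    "A2": 4.0,
    "B1": "=(A1**2 + A2**2) ** 0.5",
}

def make_function(sheet, inputs, output) -> Callable:
    """Turn a worksheet into a function of its designated input cells."""
    def evaluate(cells, name):
        value = cells[name]
        if isinstance(value, str) and value.startswith("="):
            # Evaluate the formula with all other cells visible as variables.
            env = {k: evaluate(cells, k) for k in cells if k != name}
            return eval(value[1:], {}, env)
        return value
    def func(*args):
        cells = dict(sheet)
        cells.update(zip(inputs, args))
        return evaluate(cells, output)
    return func

hypotenuse = make_function(sheet, inputs=["A1", "A2"], output="B1")
print(hypotenuse(3.0, 4.0))   # 5.0
print(hypotenuse(5.0, 12.0))  # 13.0
```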

  3. The deductive spreadsheet

    CERN Document Server

    Cervesato, Iliano

    2013-01-01

    This book describes recent multidisciplinary research at the confluence of the fields of logic programming, database theory and human-computer interaction. The goal of this effort was to develop the basis of a deductive spreadsheet, a user productivity application that allows users without formal training in computer science to make decisions about generic data in the same simple way they currently use spreadsheets to make decisions about numerical data. The result is an elegant design supported by the most recent developments in the above disciplines.The first half of the book focuses on the

  4. Automated discovery of functional generality of human gene expression programs.

    Directory of Open Access Journals (Sweden)

    Georg K Gerber

    2007-08-01

    An important research problem in computational biology is the identification of expression programs, sets of co-expressed genes orchestrating normal or pathological processes, and the characterization of the functional breadth of these programs. The use of human expression data compendia for discovery of such programs presents several challenges including cellular inhomogeneity within samples, genetic and environmental variation across samples, uncertainty in the numbers of programs and sample populations, and temporal behavior. We developed GeneProgram, a new unsupervised computational framework based on Hierarchical Dirichlet Processes that addresses each of the above challenges. GeneProgram uses expression data to simultaneously organize tissues into groups and genes into overlapping programs with consistent temporal behavior, to produce maps of expression programs, which are sorted by generality scores that exploit the automatically learned groupings. Using synthetic and real gene expression data, we showed that GeneProgram outperformed several popular expression analysis methods. We applied GeneProgram to a compendium of 62 short time-series gene expression datasets exploring the responses of human cells to infectious agents and immune-modulating molecules. GeneProgram produced a map of 104 expression programs, a substantial number of which were significantly enriched for genes involved in key signaling pathways and/or bound by NF-kappaB transcription factors in genome-wide experiments. Further, GeneProgram discovered expression programs that appear to implicate surprising signaling pathways or receptor types in the response to infection, including Wnt signaling and neurotransmitter receptors. We believe the discovered map of expression programs involved in the response to infection will be useful for guiding future biological experiments; genes from programs with low generality scores might serve as new drug targets that exhibit minimal

  5. Spreadsheet analysis of gamma spectra for nuclear material measurements

    International Nuclear Information System (INIS)

    Mosby, W.R.; Pace, D.M.

    1990-01-01

    A widely available commercial spreadsheet package for personal computers is used to calculate gamma spectra peak areas using both region of interest and peak fitting methods. The gamma peak areas obtained are used for uranium enrichment assays and for isotopic analyses of mixtures of transuranics. The use of spreadsheet software with an internal processing language allows automation of routine analysis procedures increasing ease of use and reducing processing errors while providing great flexibility in addressing unusual measurement problems. 4 refs., 9 figs
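
    The region-of-interest computation such a spreadsheet performs is compact enough to restate directly: gross counts in the peak window minus a straight-line background estimated from flanking channels. The Python sketch below uses a synthetic spectrum; channel counts and window widths are illustrative.

```python
# Region-of-interest net peak area, as a spreadsheet would compute it:
# gross counts in the ROI minus a linear (trapezoidal) background estimated
# from a few channels on either side of the peak. Spectrum is synthetic.

spectrum = [50, 52, 48, 51, 49,                  # low-side background channels
            60, 120, 400, 610, 420, 130, 65,     # peak region of interest
            50, 47, 53, 49, 51]                  # high-side background channels

n_bg = 5                       # background channels used on each side
low = spectrum[:n_bg]
roi = spectrum[n_bg:-n_bg]
high = spectrum[-n_bg:]

gross = sum(roi)
# Average background level per channel, from both flanking regions,
# multiplied by the ROI width (a straight-line background assumption).
bg_per_channel = (sum(low) + sum(high)) / (2 * n_bg)
background = bg_per_channel * len(roi)
net = gross - background

print(f"gross={gross}  background={background:.1f}  net area={net:.1f}")
```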

  6. Accelerating the discovery of materials for clean energy in the era of smart automation

    Science.gov (United States)

    Tabor, Daniel P.; Roch, Loïc M.; Saikin, Semion K.; Kreisbeck, Christoph; Sheberla, Dennis; Montoya, Joseph H.; Dwaraknath, Shyam; Aykol, Muratahan; Ortiz, Carlos; Tribukait, Hermann; Amador-Bedolla, Carlos; Brabec, Christoph J.; Maruyama, Benji; Persson, Kristin A.; Aspuru-Guzik, Alán

    2018-05-01

    The discovery and development of novel materials in the field of energy are essential to accelerate the transition to a low-carbon economy. Bringing recent technological innovations in automation, robotics and computer science together with current approaches in chemistry, materials synthesis and characterization will act as a catalyst for revolutionizing traditional research and development in both industry and academia. This Perspective provides a vision for an integrated artificial intelligence approach towards autonomous materials discovery, which, in our opinion, will emerge within the next 5 to 10 years. The approach we discuss requires the integration of the following tools, which have already seen substantial development to date: high-throughput virtual screening, automated synthesis planning, automated laboratories and machine learning algorithms. In addition to reducing the time to deployment of new materials by an order of magnitude, this integrated approach is expected to lower the cost associated with the initial discovery. Thus, the price of the final products (for example, solar panels, batteries and electric vehicles) will also decrease. This in turn will enable industries and governments to meet more ambitious targets in terms of reducing greenhouse gas emissions at a faster pace.

  7. Automated in vivo platform for the discovery of functional food treatments of hypercholesterolemia.

    Directory of Open Access Journals (Sweden)

    Robert M Littleton

    The zebrafish is becoming an increasingly popular model system for both automated drug discovery and investigating hypercholesterolemia. Here we combine these aspects and for the first time develop an automated high-content confocal assay for treatments of hypercholesterolemia. We also create two algorithms for automated analysis of cardiodynamic data acquired by high-speed confocal microscopy. The first algorithm computes cardiac parameters solely from the frequency-domain representation of cardiodynamic data while the second uses both frequency- and time-domain data. The combined approach resulted in smaller differences relative to manual measurements. The methods are implemented to test the ability of a methanolic extract of the hawthorn plant (Crataegus laevigata) to treat hypercholesterolemia and its peripheral cardiovascular effects. Results demonstrate the utility of these methods and suggest the extract has both antihypercholesterolemic and positively inotropic properties.
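
    The first algorithm described, computing cardiac parameters purely from the frequency domain, reduces in essence to reading the dominant peak of a power spectrum. Here is a toy numpy version on a synthetic fluorescence trace; the frame rate, recording length, and noise level are assumptions, and this is not the authors' code.

```python
# Frequency-domain heart-rate estimate, the core of the first algorithm
# described above: take the dominant peak of the power spectrum of a
# periodic cardiac signal. The trace below is synthetic.

import numpy as np

fs = 100.0                       # frames per second (assumed imaging rate)
t = np.arange(0, 10, 1 / fs)     # 10 s recording
true_hr_hz = 2.5                 # 150 beats per minute, plausible for zebrafish larvae
signal = np.sin(2 * np.pi * true_hr_hz * t) + 0.3 * np.random.randn(t.size)

# Power spectrum; ignore the DC bin, then read off the dominant frequency.
spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
dominant = freqs[1:][np.argmax(spectrum[1:])]

print(f"estimated heart rate: {dominant:.2f} Hz = {60 * dominant:.0f} bpm")
```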

  8. Early identification of hERG liability in drug discovery programs by automated patch clamp

    Directory of Open Access Journals (Sweden)

    Timm eDanker

    2014-09-01

    Blockade of the cardiac ion channel encoded by hERG can lead to cardiac arrhythmia, which has become a major concern in drug discovery and development. Automated electrophysiological patch clamp allows assessment of hERG channel effects early in drug development to aid medicinal chemistry programs and has become routine in pharmaceutical companies. However, a number of potential sources of errors in setting up hERG channel assays by automated patch clamp can lead to misinterpretation of data or false effects being reported. This article describes protocols for automated electrophysiology screening of compound effects on the hERG channel current. Protocol details and the translation of criteria known from manual patch clamp experiments to automated patch clamp experiments to achieve good quality data are emphasized. Typical pitfalls and artifacts that may lead to misinterpretation of data are discussed. While this article focuses on hERG channel recordings using the QPatch technology (Sophion A/S, Copenhagen, Denmark), many of the assay and protocol details given in this article can be transferred for setting up different ion channel assays by automated patch clamp and are similar on other planar patch clamp platforms.
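
    The basic readout of such a screen is percent block of the hERG tail current at each concentration, usually followed by a Hill-equation fit for an IC50. A hedged Python sketch with fabricated current amplitudes follows; it is a generic concentration-response calculation, not a QPatch protocol.

```python
# Sketch of the basic hERG screening readout: percent block of the tail
# current at each compound concentration, followed by a Hill-equation fit
# for IC50. The current amplitudes below are fabricated for illustration.

import numpy as np
from scipy.optimize import curve_fit

conc_um = np.array([0.1, 0.3, 1.0, 3.0, 10.0])        # test concentrations, uM
tail_control = 1.00                                   # normalized pre-compound current
tail_compound = np.array([0.95, 0.85, 0.60, 0.30, 0.10])

inhibition = 100.0 * (1.0 - tail_compound / tail_control)

def hill(c, ic50, n):
    """Percent inhibition as a function of concentration (Hill equation)."""
    return 100.0 * c**n / (ic50**n + c**n)

(ic50, n), _ = curve_fit(hill, conc_um, inhibition, p0=[1.0, 1.0])
print(f"IC50 = {ic50:.2f} uM, Hill coefficient = {n:.2f}")
```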

  9. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  10. The Spiral Discovery Network as an Automated General-Purpose Optimization Tool

    Directory of Open Access Journals (Sweden)

    Adam B. Csapo

    2018-01-01

    The Spiral Discovery Method (SDM) was originally proposed as a cognitive artifact for dealing with black-box models that are dependent on multiple inputs with nonlinear and/or multiplicative interaction effects. Besides directly helping to identify functional patterns in such systems, SDM also simplifies their control through its characteristic spiral structure. In this paper, a neural network-based formulation of SDM is proposed together with a set of automatic update rules that makes it suitable for both semiautomated and automated forms of optimization. The behavior of the generalized SDM model, referred to as the Spiral Discovery Network (SDN), and its applicability to nondifferentiable nonconvex optimization problems are elucidated through simulation. Based on the simulation, the case is made that its applicability would be worth investigating in all areas where the default approach of gradient-based backpropagation is used today.

  11. Some Spreadsheet Poka-Yoke

    OpenAIRE

    Bekenn, Bill; Hooper, Ray

    2009-01-01

    Whilst not all spreadsheet defects are structural in nature, poor layout choices can compromise spreadsheet quality. These defects may be avoided at the development stage by some simple mistake prevention and detection devices. Poka-Yoke (Japanese for Mistake Proofing), which owes its genesis to the Toyota Production System (the standard for manufacturing excellence throughout the world) offers some principles that may be applied to reducing spreadsheet defects. In this paper we examine sprea...

  12. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    Science.gov (United States)

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets. © 2014 Society for Laboratory Automation and Screening.

  13. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia; Lagani, Vincenzo; Heinze-Deml, Christina; Schmidt, Angelika; Tegner, Jesper; Tsamardinos, Ioannis

    2017-01-01

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.
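
    The abstract does not name the methods used; the sketch below shows only the elementary building block of constraint-based causal discovery, a conditional-independence test via partial correlation, on synthetic data standing in for signaling-protein measurements.

```python
# The building block of constraint-based causal discovery: test whether
# X and Y are independent given Z using partial correlation. On the chain
# X -> Z -> Y below, X and Y correlate marginally but not given Z.
# Data are synthetic stand-ins for signaling-protein measurements.

import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)            # e.g. an upstream kinase
z = 0.8 * x + rng.normal(size=n)  # intermediate protein
y = 0.8 * z + rng.normal(size=n)  # downstream readout

def partial_corr(a, b, c):
    """Correlation of a and b after regressing both on c."""
    ra = a - np.polyval(np.polyfit(c, a, 1), c)
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return np.corrcoef(ra, rb)[0, 1]

print(f"corr(X, Y)     = {np.corrcoef(x, y)[0, 1]:+.3f}")   # clearly nonzero
print(f"corr(X, Y | Z) = {partial_corr(x, y, z):+.3f}")     # near zero
# A PC-style algorithm removes the X-Y edge because Z screens it off,
# leaving the skeleton X - Z - Y to be oriented with further tests.
```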

  14. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia

    2017-03-31

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  15. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    Science.gov (United States)

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those from the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
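
    The modular idea translates naturally into code: each script module becomes a reusable step and an assay is a sequence of steps. The Python sketch below only prints what a liquid handler would do; the module names follow the abstract, but all volumes, dilution factors, and plate names are hypothetical.

```python
# Sketch of the modular idea: each of the five script modules described
# above becomes a reusable step, and an assay is just a sequence of steps.
# The functions only print what a liquid handler would do; volumes,
# dilution factors, and plate names are hypothetical.

def sample_dilution(plate, factor):
    print(f"dilute samples on {plate} by 1:{factor}")

def sample_mrd(plate, mrd):
    print(f"apply sample minimum required dilution 1:{mrd} on {plate}")

def standard_qc_mrd(plate, mrd):
    print(f"apply standard/QC minimum required dilution 1:{mrd} on {plate}")

def standard_qc_sample_addition(src, dst, volume_ul):
    print(f"transfer {volume_ul} uL standards/QCs/samples {src} -> {dst}")

def reagent_addition(reagent, dst, volume_ul):
    print(f"add {volume_ul} uL {reagent} to {dst}")

# Assembling one hypothetical ligand-binding assay from the modules:
assay = [
    lambda: sample_dilution("dilution-plate", factor=100),
    lambda: sample_mrd("dilution-plate", mrd=10),
    lambda: standard_qc_mrd("std-plate", mrd=10),
    lambda: standard_qc_sample_addition("dilution-plate", "assay-plate", 50),
    lambda: reagent_addition("detection antibody", "assay-plate", 25),
]
for step in assay:
    step()
```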

  16. Automated discovery of safety and efficacy concerns for joint & muscle pain relief treatments from online reviews.

    Science.gov (United States)

    Adams, David Z; Gruss, Richard; Abrahams, Alan S

    2017-04-01

    Product issues can cost companies millions in lawsuits and have devastating effects on a firm's sales, image and goodwill, especially in the era of social media. The ability of a system to detect the presence of safety and efficacy (S&E) concerns early on could not only protect consumers from injuries due to safety hazards, but could also mitigate financial damage to the manufacturer. Prior studies in the field of automated defect discovery have found industry-specific techniques appropriate to the automotive, consumer electronics, home appliance, and toy industries, but have not investigated pain relief medicines and medical devices. In this study, we focus specifically on automated discovery of S&E concerns in over-the-counter (OTC) joint and muscle pain relief remedies and devices. We select a dataset of over 32,000 records for three categories of Joint & Muscle Pain Relief treatments from Amazon's online product reviews, and train "smoke word" dictionaries which we use to score holdout reviews for the presence of safety and efficacy issues. We also score using conventional sentiment analysis techniques. Compared to traditional sentiment analysis techniques, we found that smoke term dictionaries were better suited to detect product concerns from online consumer reviews, and significantly outperformed the sentiment analysis techniques in uncovering both efficacy and safety concerns, across all product subcategories. Our research can be applied to the healthcare and pharmaceutical industry in order to detect safety and efficacy concerns, reducing the risks that consumers face using these products. These findings can be highly beneficial to improving quality assurance and management in joint and muscle pain relief. Copyright © 2017 Elsevier B.V. All rights reserved.
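
    A toy version of smoke-term training and scoring is easy to state: terms over-represented in reviews known to describe problems get high weights, and holdout reviews are scored by summing those weights. The six reviews and the smoothed frequency-ratio weighting below are fabricated illustrations, not the paper's dictionaries.

```python
# Toy "smoke word" training and scoring: terms over-represented in reviews
# known to describe safety/efficacy problems get high weights, and holdout
# reviews are scored by summing the weights of the smoke terms they contain.
# The six reviews below are fabricated examples.

from collections import Counter

flagged = [            # reviews with known safety/efficacy issues
    "caused a painful rash and swelling",
    "swelling got worse and the pain returned",
    "rash and burning skin after two days",
]
normal = [             # unremarkable reviews
    "works great for my knee pain",
    "fast shipping and fair price",
    "good value works as described",
]

def doc_freq(docs):
    c = Counter()
    for d in docs:
        c.update(set(d.split()))   # document frequency, not term frequency
    return c

f, n = doc_freq(flagged), doc_freq(normal)
# Smoke weight: smoothed ratio of document frequencies in the two corpora.
smoke = {w: (f[w] + 1) / (n[w] + 1) for w in f}
print("top smoke terms:", sorted(smoke, key=smoke.get, reverse=True)[:5])

def score(review):
    return sum(smoke.get(w, 0.0) for w in set(review.split()))

print(score("terrible rash and swelling"))   # high: likely a safety concern
print(score("arrived quickly works fine"))   # low
```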

  17. Early detection of pharmacovigilance signals with automated methods based on false discovery rates: a comparative study.

    Science.gov (United States)

    Ahmed, Ismaïl; Thiessard, Frantz; Miremont-Salamé, Ghada; Haramburu, Françoise; Kreft-Jais, Carmen; Bégaud, Bernard; Tubert-Bitter, Pascale

    2012-06-01

    Improving the detection of drug safety signals has led several pharmacovigilance regulatory agencies to incorporate automated quantitative methods into their spontaneous reporting management systems. The three largest worldwide pharmacovigilance databases are routinely screened by the lower bound of the 95% confidence interval of proportional reporting ratio (PRR₀₂.₅), the 2.5% quantile of the Information Component (IC₀₂.₅) or the 5% quantile of the Gamma Poisson Shrinker (GPS₀₅). More recently, Bayesian and non-Bayesian False Discovery Rate (FDR)-based methods were proposed that address the arbitrariness of thresholds and allow for a built-in estimate of the FDR. These methods were also shown through simulation studies to be interesting alternatives to the currently used methods. The objective of this work was twofold. Based on an extensive retrospective study, we compared PRR₀₂.₅, GPS₀₅ and IC₀₂.₅ with two FDR-based methods derived from the Fisher's exact test and the GPS model (GPS(pH0) [posterior probability of the null hypothesis H₀ calculated from the Gamma Poisson Shrinker model]). Secondly, restricting the analysis to GPS(pH0), we aimed to evaluate the added value of using automated signal detection tools compared with 'traditional' methods, i.e. non-automated surveillance operated by pharmacovigilance experts. The analysis was performed sequentially, i.e. every month, and retrospectively on the whole French pharmacovigilance database over the period 1 January 1996-1 July 2002. Evaluation was based on a list of 243 reference signals (RSs) corresponding to investigations launched by the French Pharmacovigilance Technical Committee (PhVTC) during the same period. The comparison of detection methods was made on the basis of the number of RSs detected as well as the time to detection. Results comparing the five automated quantitative methods were in favour of GPS(pH0) in terms of both number of detections of true signals and
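
    As a generic illustration of FDR-based screening (not the paper's exact estimators), the sketch below runs a Fisher exact test per drug-event pair on its 2×2 contingency table and then applies a Benjamini-Hochberg cutoff; the report counts are invented.

```python
# Sketch of FDR-based signal screening on a spontaneous-reporting table:
# a Fisher exact test per (drug, event) pair on its 2x2 contingency table,
# then a Benjamini-Hochberg cutoff to control the false discovery rate.
# The report counts are invented for illustration.

from scipy.stats import fisher_exact

# (drug, event): [a, b, c, d] where a = reports with drug AND event,
# b = drug without event, c = event without drug, d = neither.
tables = {
    ("drugA", "rash"):     [30, 970, 200, 98800],
    ("drugA", "headache"): [12, 988, 1500, 97500],
    ("drugB", "nausea"):   [5, 495, 900, 98600],
}

pairs, pvals = [], []
for pair, (a, b, c, d) in tables.items():
    _, p = fisher_exact([[a, b], [c, d]], alternative="greater")
    pairs.append(pair)
    pvals.append(p)

# Benjamini-Hochberg: the largest rank k with p_(k) <= (k/m) * q is the cutoff.
q = 0.05
order = sorted(range(len(pvals)), key=lambda i: pvals[i])
m = len(pvals)
cutoff = max((k for k in range(1, m + 1)
              if pvals[order[k - 1]] <= k / m * q), default=0)
for rank, i in enumerate(order, start=1):
    verdict = "SIGNAL" if rank <= cutoff else "-"
    print(pairs[i], f"p={pvals[i]:.2e}", verdict)
```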

  18. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines.

    Science.gov (United States)

    Leung, Wai Yi; Marschall, Tobias; Paudel, Yogesh; Falquet, Laurent; Mei, Hailiang; Schönhuth, Alexander; Maoz Moss, Tiffanie Yael

    2015-03-25

    Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans also for other species. Objectives of this work included: a) Creating an automated, standardized pipeline for SV prediction. b) Identifying the best tool(s) for SV prediction through benchmarking. c) Providing a statistically sound method for merging SV calls. The SV-AUTOPILOT meta-tool platform is an automated pipeline for standardization of SV prediction and SV tool development in paired-end next-generation sequencing (NGS) analysis. SV-AUTOPILOT comes in the form of a virtual machine, which includes all datasets, tools and algorithms presented here. The virtual machine easily allows one to add, replace and update genomes, SV callers and post-processing routines and therefore provides an easy, out-of-the-box environment for complex SV discovery tasks. SV-AUTOPILOT was used to make a direct comparison between 7 popular SV tools on the Arabidopsis thaliana genome using the Landsberg (Ler) ecotype as a standardized dataset. Recall and precision measurements suggest that Pindel and Clever were the most adaptable to this dataset across all size ranges while Delly performed well for SVs larger than 250 nucleotides. A novel, statistically-sound merging process, which can control the false discovery rate, reduced the false positive rate on the Arabidopsis benchmark dataset used here by >60%. SV-AUTOPILOT provides a meta-tool platform for future SV tool development and the benchmarking of tools on other genomes using a standardized pipeline. It optimizes detection of SVs in non-human genomes using statistically robust merging. The benchmarking in this study has demonstrated the power of 7 different SV tools for analyzing different size classes and types of structural variants. The optional merge

  19. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Directory of Open Access Journals (Sweden)

    Wilkinson Mark D

    2011-10-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services

  20. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    Science.gov (United States)

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner

  1. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation.

    Science.gov (United States)

    Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke

    2011-10-24

    The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in
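
    The service pattern described in the three SADI records (consume instances of an input OWL class and return the same nodes decorated with declared output properties) can be caricatured with rdflib. The namespace, classes, and the "analysis" below are hypothetical, and rdflib stands in for a real SADI framework.

```python
# Minimal sketch of the SADI service pattern: the service reads an RDF graph
# whose subjects are typed with the service's declared input OWL class, and
# returns the same subjects decorated with the declared output property.
# Namespace, classes, and the "analysis" are hypothetical illustrations.

from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/sadi-demo#")

def service(input_graph):
    """Consume instances of EX.ProteinRecord, produce EX.hasLength values."""
    out = Graph()
    for subject in input_graph.subjects(RDF.type, EX.ProteinRecord):
        sequence = input_graph.value(subject, EX.hasSequence)
        out.add((subject, RDF.type, EX.AnnotatedProteinRecord))
        out.add((subject, EX.hasLength, Literal(len(str(sequence)))))
    return out

# A client builds the input graph, invokes the service, and reads the output.
request = Graph()
request.add((EX.p1, RDF.type, EX.ProteinRecord))
request.add((EX.p1, EX.hasSequence, Literal("MKTAYIAKQR")))

response = service(request)
print(response.serialize(format="turtle"))
```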

  2. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Vilém Guryča

    2014-03-01

    Full Text Available The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the requisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility.

  3. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia; Lagani, Vincenzo; Heinze-Deml, Christina; Schmidt, Angelika; Tegner, Jesper; Tsamardinos, Ioannis

    2017-01-01

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway database. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become part of routine data analysis in systems biology.

  4. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia

    2017-09-29

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagreed with the KEGG pathway database. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become part of routine data analysis in systems biology.

  5. Upscaling and automation of electrophysiology: toward high throughput screening in ion channel drug discovery

    DEFF Research Database (Denmark)

    Asmild, Margit; Oswald, Nicholas; Krzywkowski, Karen M

    2003-01-01

    ...by developing two lines of automated patch clamp products: a traditional pipette-based system called Apatchi-1, and a silicon chip-based system, QPatch. The degree of automation spans from semi-automation (Apatchi-1), where a trained technician interacts with the system in a limited way, to complete automation (QPatch 96), where the system works continuously and unattended until screening of a full compound library is completed. The performance of the systems ranges from medium to high throughput.

  6. Electronic spreadsheet vs. manual payroll.

    Science.gov (United States)

    Kiley, M M

    1991-01-01

    Medical groups with direct employees must employ someone or contract with a company to compute payroll, writes Michael Kiley, Ph.D., M.P.H. However, many medical groups, including small ones, own a personal or minicomputer to handle accounts receivable. Kiley explains, in detail, how this same computer and a spreadsheet program also can be used to perform payroll functions.

  7. Simple Functions Spreadsheet tool presentation

    International Nuclear Information System (INIS)

    Grive, Mireia; Domenech, Cristina; Montoya, Vanessa; Garcia, David; Duro, Lara

    2010-09-01

    This document is a guide for users of the Simple Functions Spreadsheet tool. The Simple Functions Spreadsheet tool has been developed by Amphos 21 to determine the solubility limits of some radionuclides, and it has been especially designed for Performance Assessment exercises. The development of this tool was prompted by the need expressed by SKB for a reliable, easy-to-handle tool to calculate solubility limits in an agile and relatively fast manner. Its development started in 2005 and it has been improved since then, up to the current version. This document describes the careful preliminary study, based on expert criteria, that was used to select the simplified aqueous speciation and solid-phase system included in the tool. The report also gives basic instructions for using the tool and interpreting its results. Finally, it reports the different validation tests and sensitivity analyses performed during the verification process

  8. 3D Graphics with Spreadsheets

    Directory of Open Access Journals (Sweden)

    Jan Benacka

    2009-06-01

    Full Text Available In the article, the formulas for orthographic parallel projection of 3D bodies on the computer screen are derived using secondary school vector algebra. The spreadsheet implementation is demonstrated in six applications that project bodies of increasing intricacy: a convex body (cube) with unsolved visibility, convex bodies (cube, chapel) with solved visibility, a coloured convex body (chapel) with solved visibility, and a coloured non-convex body (church) with solved visibility. The projections are revolvable in the horizontal and vertical planes, and they are changeable in size. The examples show an unusual way of using spreadsheets as a 3D computer graphics tool. The applications can serve as a simple introduction to the general principles of computer graphics, to graphics with spreadsheets, and as a tool for exercising stereoscopic vision. The presented approach is usable for visualising 3D scenes within some topics of secondary school curricula, such as solid geometry (angles and distances of lines and planes within simple bodies) or analytic geometry in space (angles and distances of lines and planes in E3), and even at university level, within calculus, for visualising graphs of z = f(x,y) functions. Examples are pictured.
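
    To make the projection idea concrete, here is a minimal Python sketch (not the article's actual spreadsheet formulas): rotate each vertex about the vertical and horizontal axes, then drop the depth coordinate, which is all an orthographic parallel projection does. The angles stand in for the revolving controls; the cube data are illustrative.

```python
# Orthographic parallel projection sketch: rotate, then discard depth.
import math

def project(points, yaw_deg, pitch_deg, scale=1.0):
    """Rotate 3D points about the vertical and horizontal axes,
    then project orthographically onto the screen (x, y) plane."""
    a = math.radians(yaw_deg)
    b = math.radians(pitch_deg)
    out = []
    for x, y, z in points:
        # rotation about the vertical (y) axis
        x1 = x * math.cos(a) + z * math.sin(a)
        z1 = -x * math.sin(a) + z * math.cos(a)
        # rotation about the horizontal (x) axis
        y2 = y * math.cos(b) - z1 * math.sin(b)
        # orthographic projection: simply drop the depth coordinate
        out.append((scale * x1, scale * y2))
    return out

# unit cube vertices, as in the simplest of the six applications
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
for p in project(cube, yaw_deg=30, pitch_deg=20):
    print(f"({p[0]:6.3f}, {p[1]:6.3f})")
```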

  9. GENPLAT: an automated platform for biomass enzyme discovery and cocktail optimization.

    Science.gov (United States)

    Walton, Jonathan; Banerjee, Goutami; Car, Suzana

    2011-10-24

    The high cost of enzymes for biomass deconstruction is a major impediment to the economic conversion of lignocellulosic feedstocks to liquid transportation fuels such as ethanol. We have developed an integrated high throughput platform, called GENPLAT, for the discovery and development of novel enzymes and enzyme cocktails for the release of sugars from diverse pretreatment/biomass combinations. GENPLAT comprises four elements: individual pure enzymes, statistical design of experiments, robotic pipetting of biomass slurries and enzymes, and automated colorimetric determination of released Glc and Xyl. Individual enzymes are produced by expression in Pichia pastoris or Trichoderma reesei, or by chromatographic purification from commercial cocktails or from extracts of novel microorganisms. Simplex lattice (fractional factorial) mixture models are designed using commercial Design of Experiment statistical software. Enzyme mixtures of high complexity are constructed using robotic pipetting into a 96-well format. The measurement of released Glc and Xyl is automated using enzyme-linked colorimetric assays. Optimized enzyme mixtures containing as many as 16 components have been tested on a variety of feedstock and pretreatment combinations. GENPLAT is adaptable to mixtures of pure enzymes, mixtures of commercial products (e.g., Accellerase 1000 and Novozyme 188), extracts of novel microbes, or combinations thereof. To make and test mixtures of ~10 pure enzymes requires less than 100 μg of each protein and fewer than 100 total reactions, when operated at a final total loading of 15 mg protein/g glucan. We use enzymes from several sources. Enzymes can be purified from natural sources such as fungal cultures (e.g., Aspergillus niger, Cochliobolus carbonum, and Galerina marginata), or they can be made by expression of the encoding genes (obtained from the increasing number of microbial genome sequences) in hosts such as E. coli, Pichia pastoris, or a filamentous fungus such
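
    The simplex-lattice mixture design mentioned above is easy to sketch: a {q, m} lattice enumerates every blend whose component proportions are multiples of 1/m and sum to 1. The sketch below is generic, with hypothetical enzyme names, and is not GENPLAT's code.

```python
# Enumerate a {q, m} simplex-lattice mixture design.
from itertools import product

def simplex_lattice(components, m):
    """Yield every mixture whose proportions are k/m and sum to 1."""
    for parts in product(range(m + 1), repeat=len(components)):
        if sum(parts) == m:
            yield {c: p / m for c, p in zip(components, parts)}

# hypothetical three-enzyme cocktail, proportions in steps of 1/4
for blend in simplex_lattice(["CBH1", "EG1", "BG"], m=4):
    print({k: round(v, 2) for k, v in blend.items()})
```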

  10. A Literature Review of Spreadsheet Technology

    DEFF Research Database (Denmark)

    Bock, Alexander

    2016-01-01

    It was estimated that there would be over 55 million end-user programmers in 2012, in many different fields such as engineering, insurance and banking, and the numbers are not expected to have dwindled since. Consequently, the technological advancement of spreadsheets is of great interest to a wide range of people from different backgrounds. This literature review presents an overview of research on spreadsheet technology, its challenges and its solutions. We also attempt to identify why software developers generally frown upon spreadsheets and how spreadsheet research can help alter this view.

  11. Spreadsheet algorithm for stagewise solvent extraction

    International Nuclear Information System (INIS)

    Leonard, R.A.; Regalbuto, M.C.

    1994-01-01

    The material balance and equilibrium equations for solvent extraction processes have been combined with computer spreadsheets in a new way, so that models for very complex multicomponent, multistage operations can be set up and used easily. Part of the novelty is the way in which the problem is organized in the spreadsheet. In addition, to facilitate spreadsheet setup, a new calculational procedure has been developed. The resulting Spreadsheet Algorithm for Stagewise Solvent Extraction (SASSE) can be used with either IBM or Macintosh personal computers as a simple yet powerful tool for analyzing solvent extraction flowsheets. 22 refs., 4 figs., 2 tabs
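
    As a rough illustration of what such a spreadsheet computes, the sketch below iterates the stage balances of a counter-current cascade for a single solute with a constant distribution coefficient D. The real SASSE handles many components, stage efficiencies and flowsheet layouts, so this is only a toy model.

```python
# Counter-current stagewise extraction: successive substitution over
# the stage balances, much like a spreadsheet recalculation loop.
def countercurrent(n_stages, D, A, O, x_feed, y_feed=0.0, iters=200):
    """Aqueous phase (flow A) moves stage 1 -> n, organic (flow O)
    moves stage n -> 1.  Equilibrium on each stage: y = D * x.
    Returns aqueous (x) and organic (y) concentration profiles."""
    x = [0.0] * n_stages
    for _ in range(iters):
        for n in range(n_stages):
            x_in = x_feed if n == 0 else x[n - 1]
            y_in = y_feed if n == n_stages - 1 else D * x[n + 1]
            # stage balance: A*x_in + O*y_in = A*x_n + O*D*x_n
            x[n] = (A * x_in + O * y_in) / (A + O * D)
    return x, [D * xi for xi in x]

x, y = countercurrent(n_stages=8, D=2.0, A=1.0, O=1.0, x_feed=1.0)
print(f"raffinate x_N = {x[-1]:.4g}, extract y_1 = {y[0]:.4g}")
```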

  12. The Adam and Eve Robot Scientists for the Automated Discovery of Scientific Knowledge

    Science.gov (United States)

    King, Ross

    A Robot Scientist is a physically implemented robotic system that applies techniques from artificial intelligence to execute cycles of automated scientific experimentation. A Robot Scientist can automatically execute cycles of hypothesis formation, selection of efficient experiments to discriminate between hypotheses, execution of experiments using laboratory automation equipment, and analysis of results. The motivation for developing Robot Scientists is to better understand science, and to make scientific research more efficient. The Robot Scientist `Adam' was the first machine to autonomously discover scientific knowledge: to both form and experimentally confirm novel hypotheses. Adam worked in the domain of yeast functional genomics. The Robot Scientist `Eve' was originally developed to automate early-stage drug development, with specific application to neglected tropical diseases such as malaria, African sleeping sickness, etc. We are now adapting Eve to work on cancer. We are also teaching Eve to autonomously extract information from the scientific literature.

  13. Supporting professional spreadsheet users by generating leveled dataflow diagrams

    NARCIS (Netherlands)

    Hermans, F.; Pinzger, M.; Van Deursen, A.

    2010-01-01

    Thanks to their flexibility and intuitive programming model, spreadsheets are widely used in industry, often for businesscritical applications. Similar to software developers, professional spreadsheet users demand support for maintaining and transferring their spreadsheets. In this paper, we first

  14. DataSpread: Unifying Databases and Spreadsheets.

    Science.gov (United States)

    Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin Chen-Chuan; Parameswaran, Aditya

    2015-08-01

    Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases.

  15. Spreadsheet Design: An Optimal Checklist for Accountants

    Science.gov (United States)

    Barnes, Jeffrey N.; Tufte, David; Christensen, David

    2009-01-01

    Just as good grammar, punctuation, style, and content organization are important to well-written documents, basic fundamentals of spreadsheet design are essential to clear communication. In fact, the very principles of good writing should be integrated into spreadsheet workpaper design and organization. The unique contributions of this paper are…

  16. Spreadsheet Modeling of Electron Distributions in Solids

    Science.gov (United States)

    Glassy, Wingfield V.

    2006-01-01

    A series of spreadsheet modeling exercises constructed as part of a new upper-level elective course on solid state materials and surface chemistry is described. The spreadsheet exercises are developed to provide students with the opportunity to interact with the conceptual framework where the role of the density of states and the Fermi-Dirac…

  17. Lens Ray Diagrams with a Spreadsheet

    Science.gov (United States)

    González, Manuel I.

    2018-01-01

    Physicists create spreadsheets customarily to carry out numerical calculations and to display their results in a meaningful, nice-looking way. Spreadsheets can also be used to display a vivid geometrical model of a physical system. This statement is illustrated with an example taken from geometrical optics: images formed by a thin lens. A careful…

  18. A High Throughput, 384-Well, Semi-Automated, Hepatocyte Intrinsic Clearance Assay for Screening New Molecular Entities in Drug Discovery.

    Science.gov (United States)

    Heinle, Lance; Peterkin, Vincent; de Morais, Sonia M; Jenkins, Gary J; Badagnani, Ilaria

    2015-01-01

    A high throughput, semi-automated clearance screening assay in hepatocytes was developed, allowing a scientist to generate data for 96 compounds in one week. The 384-well format assay utilizes a Thermo Multidrop Combi and an optimized LC-MS/MS method. The previously reported LC-MS/MS method reduced the analytical run time 3-fold, down to 1.2 min injection-to-injection. The Multidrop was able to deliver hepatocytes to 384-well plates with minimal viability loss. Comparison of results from the new 384-well and historical 24-well assays yielded a correlation of 0.95. In addition, results obtained for 25 marketed drugs with various metabolism pathways had a correlation of 0.75 when compared with literature values. Precision was maintained in the new format, as 8 compounds tested in ≥39 independent experiments had coefficients of variation ≤21%. The ability to predict in vivo clearances using the new stability assay format was also investigated using 22 marketed drugs and 26 AbbVie compounds. Correction of intrinsic clearance values for binding to hepatocytes (in vitro data) and plasma (in vivo data) resulted in a higher in vitro to in vivo correlation for both the 22 marketed compounds in human (0.80 vs 0.35) and the 26 AbbVie Discovery compounds in rat (0.56 vs 0.17), demonstrating the importance of correcting for binding in clearance studies. This newly developed high throughput, semi-automated clearance assay allows rapid screening of Discovery compounds to enable Structure-Activity Relationship (SAR) analysis based on hepatocyte stability data of sufficient quantity and quality to drive the next round of compound synthesis.
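
    The binding correction alluded to above can be sketched with the standard well-stirred liver model; the parameter values in the sketch below are illustrative, not taken from the paper.

```python
# Well-stirred liver model with fraction-unbound corrections.
def predicted_hepatic_cl(cl_int, fu_plasma, fu_inc, q_h=20.7):
    """cl_int: scaled intrinsic clearance (mL/min/kg),
    fu_inc: fraction unbound in the hepatocyte incubation,
    fu_plasma: fraction unbound in plasma,
    q_h: hepatic blood flow (mL/min/kg; a typical human value)."""
    cl_int_u = cl_int / fu_inc            # correct for incubation binding
    return q_h * fu_plasma * cl_int_u / (q_h + fu_plasma * cl_int_u)

# illustrative compound: moderate intrinsic clearance, highly bound
print(f"CL_h = {predicted_hepatic_cl(15.0, fu_plasma=0.1, fu_inc=0.5):.2f} mL/min/kg")
```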

  19. Spreadsheets in the Cloud – Not Ready Yet

    Directory of Open Access Journals (Sweden)

    Bruce D. McCullough

    2013-01-01

    Full Text Available Cloud computing is a relatively new technology that facilitates collaborative creation and modification of documents over the internet in real time. Here we provide an introductory assessment of the available statistical functions in three leading cloud spreadsheets, namely Google Spreadsheet, Microsoft Excel Web App, and Zoho Sheet. Our results show that the developers of cloud-based spreadsheets are not performing basic quality control, resulting in statistical computations that are misleading and erroneous. Moreover, the developers do not provide sufficient information regarding the software and the hardware, which can change at any time without notice. Indeed, when we reran the tests after several months, we obtained different and sometimes worse results.

  20. Toolkits for nuclear science. Data and spreadsheets

    International Nuclear Information System (INIS)

    Lindstrom, R.M.

    2006-01-01

    In the past decade, the combination of readily accessible, reliable data in electronic form with well-tested spreadsheet programs has changed the approach to experiment planning and computation of results. This has led to a flowering of software applications based on spreadsheets, mostly written by scientists, not by professional programmers trained in numerical methods. Formal quality systems increasingly call for verified computational methods and reference data as part of the analytical process, a demand that is difficult to meet with most spreadsheets. Examples are given of utilities used in our laboratory, with suggestions for verification and quality maintenance. (author)

  1. Sign use and cognition in automated scientific discovery: are computers only special kinds of signs?

    Science.gov (United States)

    Giza, Piotr

    2018-04-01

    James Fetzer criticizes the computational paradigm, prevailing in cognitive science, by questioning what he takes to be its most elementary ingredient: that cognition is computation across representations. He argues that if cognition is taken to be a purposive, meaningful, algorithmic problem solving activity, then computers are incapable of cognition. Instead, they appear to be signs of a special kind that can facilitate computation. He proposes the conception of minds as semiotic systems as an alternative paradigm for understanding mental phenomena, one that seems to overcome the difficulties of computationalism. Now, I argue that with computer systems dealing with scientific discovery, the matter is not so simple. The alleged superiority of humans, who use signs to stand for something else, over computers, which are merely "physical symbol systems" or "automatic formal systems", is easy to establish in everyday life, but becomes far from obvious when scientific discovery is at stake. In science, as opposed to everyday life, the meaning of symbols is, apart from very low-level experimental investigations, defined implicitly by the way the symbols are used in explanatory theories or experimental laws relevant to the field, and in consequence human and machine discoverers are much more on a par. Moreover, the great practical success of the genetic programming method, and recent attempts to apply it to the automatic generation of cognitive theories, seem to show that computer systems are capable of very efficient problem solving activity in science which is neither purposive nor meaningful, nor algorithmic. This, I think, undermines Fetzer's argument that computer systems are incapable of cognition because computation across representations is bound to be a purposive, meaningful, algorithmic problem solving activity.

  2. Declarative Parallel Programming in Spreadsheet End-User Development

    DEFF Research Database (Denmark)

    Biermann, Florian

    2016-01-01

    Spreadsheets are first-order functional languages and are widely used in research and industry as a tool to conveniently perform all kinds of computations. Because cells on a spreadsheet are immutable, there are possibilities for implicit parallelization of spreadsheet computations. In this literature study, we provide an overview of the publications on spreadsheet end-user programming and declarative array programming to inform further research on parallel programming in spreadsheets. Our results show that there is a clear overlap between spreadsheet programming and array programming, and we can directly apply results from functional array programming to a spreadsheet model of computations.
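
    The link between immutable cells and implicit parallelism can be sketched in a few lines: cells at the same dependency depth never read one another, so each level can be evaluated concurrently. The tiny sheet model below is hypothetical, not taken from the study.

```python
# Level-by-level parallel evaluation of an immutable "sheet".
from concurrent.futures import ThreadPoolExecutor

# each cell: (cells it reads, pure function of those cells' values)
sheet = {
    "A1": ([], lambda: 2.0),
    "A2": ([], lambda: 3.0),
    "B1": (["A1", "A2"], lambda a1, a2: a1 + a2),
    "B2": (["A1"], lambda a1: a1 * 10),
    "C1": (["B1", "B2"], lambda b1, b2: b1 * b2),
}

def levels(sheet):
    """Group cell names into dependency levels (level 0 = constants)."""
    depth = {}
    def depth_of(cell):
        if cell not in depth:
            deps = sheet[cell][0]
            depth[cell] = 1 + max((depth_of(d) for d in deps), default=-1)
        return depth[cell]
    by_level = {}
    for cell in sheet:
        by_level.setdefault(depth_of(cell), []).append(cell)
    return [by_level[k] for k in sorted(by_level)]

values = {}
def evaluate(cell):
    deps, fn = sheet[cell]
    return fn(*(values[d] for d in deps))   # deps are from earlier levels

with ThreadPoolExecutor() as pool:
    for level in levels(sheet):
        # cells within one level are independent: evaluate in parallel
        for cell, value in zip(level, pool.map(evaluate, level)):
            values[cell] = value

print(values["C1"])   # (2+3) * (2*10) = 100.0
```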

  3. Towards automating the discovery of certain innovative design principles through a clustering-based optimization technique

    Science.gov (United States)

    Bandaru, Sunith; Deb, Kalyanmoy

    2011-09-01

    In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.
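
    One ingredient of such automated innovization can be sketched simply: testing whether two quantities along the Pareto front obey a power law x2 ≈ c·x1^b (a common form for such design principles) by least squares in log-log space. The data below are synthetic, for illustration only.

```python
# Fit a candidate design principle x2 = c * x1^b to Pareto data.
import math

# synthetic Pareto-like data following x2 ~ 4 * x1^-0.5 with mild noise
pareto = [(x, 4.0 * x ** -0.5 * (1 + 0.01 * ((i * 7) % 5 - 2)))
          for i, x in enumerate(v / 10 for v in range(1, 51))]

# least squares in log-log space: log x2 = log c + b * log x1
lx = [math.log(p[0]) for p in pareto]
ly = [math.log(p[1]) for p in pareto]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
     / sum((u - mx) ** 2 for u in lx))
c = math.exp(my - b * mx)
print(f"candidate design principle: x2 = {c:.3f} * x1^{b:.3f}")
```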

  4. Whole animal automated platform for drug discovery against multi-drug resistant Staphylococcus aureus.

    Directory of Open Access Journals (Sweden)

    Rajmohan Rajamuthiah

    Full Text Available Staphylococcus aureus, the leading cause of hospital-acquired infections in the United States, is also pathogenic to the model nematode Caenorhabditis elegans. The C. elegans-S. aureus infection model was previously carried out on solid agar plates where the bacteriovorous C. elegans feeds on a lawn of S. aureus. However, agar-based assays are not amenable to large scale screens for antibacterial compounds. We have developed a high throughput liquid screening assay that uses robotic instrumentation to dispense a precise amount of methicillin resistant S. aureus (MRSA) and worms in 384-well assay plates, followed by automated microscopy and image analysis. In validation of the liquid assay, an MRSA cell wall defective mutant, MW2ΔtarO, which is attenuated for killing in the agar-based assay, was found to be less virulent in the liquid assay. This robust assay with a Z'-factor consistently greater than 0.5 was utilized to screen the Biomol 4 compound library consisting of 640 small molecules with well characterized bioactivities. As proof of principle, 27 of the 30 clinically used antibiotics present in the library conferred increased C. elegans survival and were identified as hits in the screen. Surprisingly, the antihelminthic drug closantel was also identified as a hit in the screen. In further studies, we confirmed the anti-staphylococcal activity of closantel against vancomycin-resistant S. aureus isolates and other Gram-positive bacteria. The liquid C. elegans-S. aureus assay described here allows screening for anti-staphylococcal compounds that are not toxic to the host.

  5. Whole animal automated platform for drug discovery against multi-drug resistant Staphylococcus aureus.

    Science.gov (United States)

    Rajamuthiah, Rajmohan; Fuchs, Beth Burgwyn; Jayamani, Elamparithi; Kim, Younghoon; Larkins-Ford, Jonah; Conery, Annie; Ausubel, Frederick M; Mylonakis, Eleftherios

    2014-01-01

    Staphylococcus aureus, the leading cause of hospital-acquired infections in the United States, is also pathogenic to the model nematode Caenorhabditis elegans. The C. elegans-S. aureus infection model was previously carried out on solid agar plates where the bacteriovorous C. elegans feeds on a lawn of S. aureus. However, agar-based assays are not amenable to large scale screens for antibacterial compounds. We have developed a high throughput liquid screening assay that uses robotic instrumentation to dispense a precise amount of methicillin resistant S. aureus (MRSA) and worms in 384-well assay plates, followed by automated microscopy and image analysis. In validation of the liquid assay, an MRSA cell wall defective mutant, MW2ΔtarO, which is attenuated for killing in the agar-based assay, was found to be less virulent in the liquid assay. This robust assay with a Z'-factor consistently greater than 0.5 was utilized to screen the Biomol 4 compound library consisting of 640 small molecules with well characterized bioactivities. As proof of principle, 27 of the 30 clinically used antibiotics present in the library conferred increased C. elegans survival and were identified as hits in the screen. Surprisingly, the antihelminthic drug closantel was also identified as a hit in the screen. In further studies, we confirmed the anti-staphylococcal activity of closantel against vancomycin-resistant S. aureus isolates and other Gram-positive bacteria. The liquid C. elegans-S. aureus assay described here allows screening for anti-staphylococcal compounds that are not toxic to the host.
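
    The quality metric used above is the standard Z'-factor of Zhang et al.; a sketch of its computation from control wells follows, with illustrative numbers.

```python
# Z'-factor: assay robustness from positive and negative controls.
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|;
    values above 0.5 indicate a robust screening assay."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

pos = [92, 95, 90, 97, 93]   # e.g. % worm survival with antibiotic
neg = [12, 8, 15, 10, 11]    # survival with vehicle only (illustrative)
print(f"Z' = {z_prime(pos, neg):.2f}")
```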

  6. Biomarker Discovery Using New Metabolomics Software for Automated Processing of High Resolution LC-MS Data

    Science.gov (United States)

    Hnatyshyn, S.; Reily, M.; Shipkova, P.; McClure, T.; Sanders, M.; Peake, D.

    2011-01-01

    Robust biomarkers of target engagement and efficacy are required at different stages of drug discovery. Liquid chromatography coupled to high resolution mass spectrometry provides the sensitivity, accuracy and wide dynamic range required for identification of endogenous metabolites in biological matrices, and LC-MS is a widely used tool for biomarker identification and validation. Typical high resolution LC-MS profiles from biological samples may contain more than a million mass spectral peaks corresponding to several thousand endogenous metabolites. Reduction of the total number of peaks, component identification and statistical comparison across sample groups remains a difficult and time-consuming challenge. Blood samples from four groups of rats (male vs. female, fully satiated and food deprived) were analyzed using high resolution accurate mass (HRAM) LC-MS. All samples were separated using a 15 minute reversed-phase C18 LC gradient and analyzed in both positive and negative ion modes. Data were acquired using 15K resolution and 5 ppm mass measurement accuracy. The entire data set was analyzed using software developed in a collaboration between Bristol-Myers Squibb and Thermo Fisher Scientific to determine the metabolic effects of food deprivation on rats. Metabolomic LC-MS data files are extraordinarily complex, and appropriate reduction of the number of spectral peaks via identification of related peaks and background removal is essential. A single component such as hippuric acid generates more than 20 related peaks, including isotopic clusters, adducts and dimers. Plasma and urine may contain 500-1500 unique quantifiable metabolites. Noise filtering approaches, including blank subtraction, were used to reduce the number of irrelevant peaks. By grouping related signals such as isotopic peaks and alkali adducts, data processing was greatly simplified, reducing the total number of components by 10-fold. The software processes 48 samples in under 60 minutes. Principle
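
    The peak-grouping step can be sketched as follows: co-eluting peaks whose m/z values differ by known isotope or adduct offsets collapse into one component. The offsets are standard values; the greedy grouping logic is illustrative, not the actual software's algorithm.

```python
# Collapse isotope and adduct peaks into components.
C13 = 1.00336      # 13C - 12C isotope spacing (Da)
NA_H = 21.98194    # mass of [M+Na]+ minus [M+H]+ (Da)

def group_peaks(peaks, rt_tol=0.05, mz_tol=0.005):
    """peaks: iterable of (mz, rt); returns a list of component groups."""
    groups = []
    for mz, rt in sorted(peaks):
        for g in groups:
            m0, r0 = g[0]                      # the group's base peak
            dm = mz - m0
            related = (any(abs(dm - k * C13) <= mz_tol for k in (1, 2, 3))
                       or abs(dm - NA_H) <= mz_tol)
            if abs(rt - r0) <= rt_tol and related:
                g.append((mz, rt))
                break
        else:
            groups.append([(mz, rt)])
    return groups

peaks = [(180.0655, 3.21), (181.0689, 3.21), (182.0722, 3.22),
         (202.0474, 3.21), (300.1230, 5.10)]
print(f"{len(peaks)} peaks -> {len(group_peaks(peaks))} components")
```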

  7. Spreadsheet tool for estimating noise reduction costs

    International Nuclear Information System (INIS)

    Frank, L.; Senden, V.; Leszczynski, Y.

    2009-01-01

    The Northeast Capital Industrial Association (NCIA) represents industry in Alberta's industrial heartland. The organization is in the process of developing a regional noise management plan (RNMP) for its member companies. The RNMP includes the development of a noise reduction cost spreadsheet tool for reviewing the practical noise control treatments available for individual plant equipment, including the ranges of noise attenuation achievable, and for producing a budgetary prediction of the installed cost of those treatments. This paper discussed the noise reduction cost spreadsheet tool, with particular reference to noise control best practices and to the tool's development: prerequisites, assembly of the required data, approach, and the unit pricing database. Use and optimization of the noise reduction cost spreadsheet tool were also discussed. It was concluded that the noise reduction cost spreadsheet tool is an easy, interactive tool for estimating the implementation costs of different strategies and options for noise control mitigation measures, and that it was very helpful in gaining insight for noise control planning purposes. 2 tabs.

  8. On the Numerical Accuracy of Spreadsheets

    Directory of Open Access Journals (Sweden)

    Alejandro C. Frery

    2010-10-01

    Full Text Available This paper discusses the numerical precision of five spreadsheets (Calc, Excel, Gnumeric, NeoOffice and Oleo running on two hardware platforms (i386 and amd64 and on three operating systems (Windows Vista, Ubuntu Intrepid and Mac OS Leopard. The methodology consists of checking the number of correct significant digits returned by each spreadsheet when computing the sample mean, standard deviation, first-order autocorrelation, F statistic in ANOVA tests, linear and nonlinear regression and distribution functions. A discussion about the algorithms for pseudorandom number generation provided by these platforms is also conducted. We conclude that there is no safe choice among the spreadsheets here assessed: they all fail in nonlinear regression and they are not suited for Monte Carlo experiments.
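
    A sketch of the kind of check behind such assessments: the one-pass "calculator" formula for the sample variance loses digits to catastrophic cancellation, while the textbook two-pass formula does not. The log relative error (LRE) counts correct significant digits, roughly the methodology such studies use.

```python
# Demonstrate digit loss in the one-pass variance formula.
import math

data = [1e9 + x for x in (1.0, 2.0, 3.0, 4.0, 5.0)]   # true variance 2.5
truth = 2.5

n = len(data)
mean = sum(data) / n
two_pass = sum((x - mean) ** 2 for x in data) / (n - 1)
one_pass = (sum(x * x for x in data) - n * mean ** 2) / (n - 1)

def lre(value):
    """Log relative error: roughly the number of correct digits."""
    if value == truth:
        return 16.0
    return max(0.0, -math.log10(abs(value - truth) / abs(truth)))

print(f"two-pass: {two_pass!r}  ~{lre(two_pass):.1f} digits")
print(f"one-pass: {one_pass!r}  ~{lre(one_pass):.1f} digits")
```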

  9. LICSS - a chemical spreadsheet in microsoft excel.

    Science.gov (United States)

    Lawson, Kevin R; Lawson, Jonty

    2012-02-02

    Representations of chemical datasets in spreadsheet format are important for ready data assimilation and manipulation. In addition to the normal spreadsheet facilities, chemical spreadsheets need to have visualisable chemical structures and data searchable by chemical as well as textual queries. Many such chemical spreadsheet tools are available, some operating in the familiar Microsoft Excel environment. However, within this group, the performance of Excel is often compromised, particularly in terms of the number of compounds which can usefully be stored on a sheet. LICSS is a lightweight chemical spreadsheet within Microsoft Excel for Windows. LICSS stores structures solely as Smiles strings. Chemical operations are carried out by calling Java code modules which use the CDK, JChemPaint and OPSIN libraries to provide cheminformatics functionality. Compounds in sheets or charts may be visualised (individually or en masse), and sheets may be searched by substructure or similarity. All the molecular descriptors available in CDK may be calculated for compounds (in batch or on-the-fly), and various cheminformatic operations such as fingerprint calculation, Sammon mapping, clustering and R group table creation may be carried out. We detail here the features of LICSS and how they are implemented. We also explain the design criteria, particularly in terms of potential corporate use, which led to this particular implementation. LICSS is an Excel-based chemical spreadsheet with a difference:
    • It can usefully be used on sheets containing hundreds of thousands of compounds; it doesn't compromise the normal performance of Microsoft Excel.
    • It is designed to be installed and run in environments in which users do not have admin privileges; installation involves merely file copying, and sharing of LICSS sheets invokes automatic installation.
    • It is free and extensible.
    LICSS is open source software and we hope sufficient detail is provided here to enable developers to add their
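
    LICSS itself calls Java/CDK modules from Excel; as a rough stand-in, the same pair of operations (substructure and similarity search over Smiles strings) is sketched below with the open source RDKit Python library instead.

```python
# Substructure and similarity search over a "sheet" of Smiles strings.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

sheet = ["CCO", "CC(=O)Oc1ccccc1C(=O)O", "c1ccccc1", "CCN(CC)CC"]

# substructure search: all rows containing a benzene ring
ring = Chem.MolFromSmarts("c1ccccc1")
hits = [s for s in sheet if Chem.MolFromSmiles(s).HasSubstructMatch(ring)]
print("substructure hits:", hits)

# similarity search against a query, scored by fingerprint Tanimoto
query = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles("CCO"), 2, nBits=1024)
for s in sheet:
    fp = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=1024)
    print(f"{s:24s} {DataStructs.TanimotoSimilarity(query, fp):.2f}")
```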

  10. Spreadsheet as a tool of engineering analysis

    International Nuclear Information System (INIS)

    Becker, M.

    1985-01-01

    In engineering analysis, problems tend to be categorized into those that can be done by hand and those that require the computer for solution. The advent of personal computers, and in particular, the advent of spreadsheet software, blurs this distinction, creating an intermediate category of problems appropriate for use with interactive personal computing

  11. Process mining : spreadsheet-like technology for processes

    NARCIS (Netherlands)

    Van Der Aalst, W.M.P.

    2016-01-01

    Spreadsheets can be viewed as a success story. Since the late seventies spreadsheet programs have been installed on the majority of computers and play a role comparable to text editors and databases management systems. Spreadsheets can be used to do anything with numbers, but are unable to handle

  12. Enron versus EUSES : A comparison of two spreadsheet corpora

    NARCIS (Netherlands)

    Jansen, B.

    2015-01-01

    Spreadsheets are widely used within companies and often form the basis for business decisions. Numerous cases are known where incorrect information in spreadsheets has led to incorrect decisions. Such cases underline the relevance of research on the professional use of spreadsheets. Recently a new

  13. Computerized cost estimation spreadsheet and cost data base for fusion devices

    International Nuclear Information System (INIS)

    Hamilton, W.R.; Rothe, K.E.

    1985-01-01

    An automated approach to performing and cataloging cost estimates has been developed at the Fusion Engineering Design Center (FEDC), wherein the cost estimate record is stored in the LOTUS 1-2-3 spreadsheet on an IBM personal computer. The cost estimation spreadsheet is based on the cost coefficient/cost algorithm approach and incorporates a detailed generic code of cost accounts for both tokamak and tandem mirror devices. Component design parameters (weight, surface area, etc.) and cost factors are input, and direct and indirect costs are calculated. The cost data base file derived from actual cost experience within the fusion community and refined to be compatible with the spreadsheet costing approach is a catalog of cost coefficients, algorithms, and component costs arranged into data modules corresponding to specific components and/or subsystems. Each data module contains engineering, equipment, and installation labor cost data for different configurations and types of the specific component or subsystem. This paper describes the assumptions, definitions, methodology, and architecture incorporated in the development of the cost estimation spreadsheet and cost data base, along with the type of input required and the output format
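
    The cost coefficient/cost algorithm approach is straightforward to sketch: each account's direct cost is a coefficient times a design parameter, and indirect costs are applied as factors on the direct total. Accounts and numbers below are invented for illustration, not FEDC data.

```python
# Cost coefficient model: direct costs per account, plus indirects.
accounts = [
    # (account, design parameter value, unit, cost coefficient $/unit)
    ("TF magnets",    5.2e6, "kg", 55.0),
    ("Vacuum vessel", 1.1e6, "kg", 80.0),
    ("Shield",        2.4e6, "kg", 30.0),
]
indirect_factor = 0.35   # engineering + installation overheads (assumed)

direct = {name: qty * coeff for name, qty, _, coeff in accounts}
total_direct = sum(direct.values())
total = total_direct * (1 + indirect_factor)

for name, cost in direct.items():
    print(f"{name:14s} ${cost/1e6:8.1f} M")
print(f"{'TOTAL':14s} ${total/1e6:8.1f} M (incl. indirects)")
```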

  14. Forming conjectures within a spreadsheet environment

    Science.gov (United States)

    Calder, Nigel; Brown, Tony; Hanley, Una; Darby, Susan

    2006-12-01

    This paper is concerned with the use of spreadsheets within mathematical investigational tasks. Considering the learning of both children and pre-service teaching students, it examines how mathematical phenomena can be seen as a function of the pedagogical media through which they are encountered. In particular, it shows how pedagogical apparatus influence patterns of social interaction, and how this interaction shapes the mathematical ideas that are engaged with. Notions of conjecture, along with the particular facilities of the spreadsheet setting, are considered with regard to the facilitation of mathematical thinking. Employing an interpretive perspective, a key focus is on how alternative pedagogical media and associated discursive networks influence the way that students form and test informal conjectures.

  15. Mitigating Spreadsheet Model Risk with Python Open Source Infrastructure

    OpenAIRE

    Beavers, Oliver

    2018-01-01

    Across an aggregation of EuSpRIG presentation papers, two maxims hold true: spreadsheet models are akin to software, yet spreadsheet developers are not software engineers. As such, the lack of traditional software engineering tools and protocols invites a higher rate of error in the end result. This paper lays the groundwork for spreadsheet modelling professionals to develop reproducible audit tools using freely available, open source packages built with the Python programming language, enabling...
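
    As a taste of such groundwork, the sketch below uses the open source openpyxl package to pull every formula out of a workbook for review or diffing; the file name is hypothetical.

```python
# Extract all formulas from an Excel workbook for auditing.
from openpyxl import load_workbook

def list_formulas(path):
    wb = load_workbook(path, data_only=False)  # keep formulas, not cached values
    for ws in wb.worksheets:
        for row in ws.iter_rows():
            for cell in row:
                if cell.data_type == "f":      # 'f' marks a formula cell
                    yield ws.title, cell.coordinate, cell.value

# "model.xlsx" is a placeholder workbook name
for sheet, ref, formula in list_formulas("model.xlsx"):
    print(f"{sheet}!{ref}: {formula}")
```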

  16. Integrated Spreadsheets as Learning Environments for Young Children

    Directory of Open Access Journals (Sweden)

    Sergei Abramovich

    2014-01-01

    Full Text Available This classroom note shares experience of using spreadsheets with a group of 2nd grade students. The main feature of the learning environments that made the integration of technology and grade-appropriate mathematics effective is the use of images of modern tools such as the Nintendo DS, the PlayStation Portable, and the iPhone. The idea is illustrated by presenting a number of worksheets from such modified spreadsheets, called integrated spreadsheets. The authors suggest that using spreadsheets in this way offers an attractive interface for young students and significantly enhances their on-task behavior.

  17. NET PRESENT VALUE SIMULATING WITH A SPREADSHEET

    Directory of Open Access Journals (Sweden)

    Maria CONSTANTINESCU

    2010-01-01

    Full Text Available Decision making has always been a difficult process, based on various combinations of objectivity (when scientific tools are used) and subjectivity (considering that decisions are finally made by people, with their strengths and weaknesses). The IT revolution has also reached the areas of management and decision making, helping managers make better and more informed decisions by providing them with a variety of tools, from personal computers to specialized software. Most simulations are performed in a spreadsheet, because the number of calculations required soon overwhelms human capability.
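
    A minimal sketch of the kind of simulation described, with NPV = -I0 + Σ CF_t/(1+r)^t and the cash flows drawn at random over many trials; all figures are illustrative.

```python
# Monte Carlo simulation of net present value.
import random

def npv(rate, cashflows):
    """cashflows[0] is the initial outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

random.seed(1)
trials = []
for _ in range(10_000):
    # 5 years of uncertain cash flows around a 300 mean (assumed)
    flows = [-1000.0] + [random.gauss(300.0, 75.0) for _ in range(5)]
    trials.append(npv(0.10, flows))

trials.sort()
print(f"mean NPV      : {sum(trials)/len(trials):8.1f}")
print(f"P(NPV < 0)    : {sum(t < 0 for t in trials)/len(trials):8.2%}")
print(f"5th percentile: {trials[len(trials)//20]:8.1f}")
```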

  18. Calculating germination measurements and organizing spreadsheets

    OpenAIRE

    Ranal, Marli A.; Santana, Denise Garcia de; Ferreira, Wanessa Resende; Mendes-Rodrigues, Clesnan

    2009-01-01

    With the objective to minimize difficulties for beginners we are proposing the use of a conventional spreadsheet for the calculations of the main germination (or emergence) measurements, the organization of the final data for the statistical analysis and some electronic commands involved in these steps.

  19. Simple Spreadsheet Thermal Models for Cryogenic Applications

    Science.gov (United States)

    Nash, Alfred

    1995-01-01

    Self-consistent circuit analog thermal models that can be run in commercial spreadsheet programs on personal computers have been created to calculate the cooldown and steady state performance of cryogen cooled Dewars. The models include temperature dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the SIRTF Telescope Test Facility (STTF). The facility has been brought on line for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm, as well as a comparison between the models' predictions and the actual performance of this facility, will be presented.
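
    The circuit-analog idea is easy to sketch for a single thermal mass cooling through one conductive and one radiative link, stepped explicitly in time exactly as a row-per-timestep spreadsheet model would be; all parameter values below are illustrative, not STTF data.

```python
# Lumped thermal network cooldown, explicit Euler integration.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

T, T_sink = 300.0, 80.0   # K: stage and cryogen temperatures
C = 5.0e3                 # J/K, lumped heat capacity (held constant here)
G = 0.5                   # W/K, conductive conductance to the sink
eps_a = 0.05 * 0.1        # emissivity * area (m^2) for radiation to sink

dt, t = 10.0, 0.0         # s
while T - T_sink > 1.0:
    q = G * (T - T_sink) + eps_a * SIGMA * (T**4 - T_sink**4)
    T -= q * dt / C       # one explicit Euler step = one spreadsheet row
    t += dt
print(f"cooldown to {T:.1f} K in {t/3600:.2f} h")
```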

  20. Enron’s Spreadsheets and Related Emails : A Dataset and Analysis

    NARCIS (Netherlands)

    Hermans, F.; Murphy-Hill, E.

    2014-01-01

    Spreadsheets are used extensively in business processes around the world and as such, a topic of research interest. Over the past few years, many spreadsheet studies have been performed on the EUSES spreadsheet corpus. While this corpus has served the spreadsheet community well, the spreadsheets it

  1. Closing the Loop: Automated Data-Driven Cognitive Model Discoveries Lead to Improved Instruction and Learning Gains

    Science.gov (United States)

    Liu, Ran; Koedinger, Kenneth R.

    2017-01-01

    As the use of educational technology becomes more ubiquitous, an enormous amount of learning process data is being produced. Educational data mining seeks to analyze and model these data, with the ultimate goal of improving learning outcomes. The most firmly grounded and rigorous evaluation of an educational data mining discovery is whether it…

  2. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  3. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  4. Spreadsheets as tools for statistical computing and statistics education

    OpenAIRE

    Neuwirth, Erich

    2000-01-01

    Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.

  5. Designing Spreadsheet-Based Tasks for Purposeful Algebra

    Science.gov (United States)

    Ainley, Janet; Bills, Liz; Wilson, Kirsty

    2005-01-01

    We describe the design of a sequence of spreadsheet-based pedagogic tasks for the introduction of algebra in the early years of secondary schooling within the Purposeful Algebraic Activity project. This design combines two relatively novel features to bring a different perspective to research in the use of spreadsheets for the learning and…

  6. Excel Spreadsheets for Algebra: Improving Mental Modeling for Problem Solving

    Science.gov (United States)

    Engerman, Jason; Rusek, Matthew; Clariana, Roy

    2014-01-01

    This experiment investigates the effectiveness of Excel spreadsheets in a high school algebra class. Students in the experiment group convincingly outperformed the control group on a post-lesson assessment. The student responses and teacher observations involving the Excel spreadsheet revealed that it operated as a mindtool, which formed the users'…

  7. Using Spreadsheets to Produce Acid-Base Titration Curves.

    Science.gov (United States)

    Cawley, Martin James; Parkinson, John

    1995-01-01

    Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)
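
    For the simpler of the two cases, a strong acid titrated with a strong base, the curve follows from the mole balance alone; the sketch below computes one row per titrant volume, with illustrative concentrations.

```python
# Strong acid / strong base titration curve, one row per volume.
import math

c_acid, v_acid = 0.100, 25.0      # mol/L HCl, mL
c_base = 0.100                    # mol/L NaOH titrant
Kw = 1.0e-14

for v_base in range(0, 51, 5):    # mL of titrant added
    n_h = c_acid * v_acid - c_base * v_base     # mmol of excess H+
    v_tot = v_acid + v_base                     # mL
    if n_h > 0:
        h = n_h / v_tot                         # mmol/mL = mol/L
    elif n_h < 0:
        h = Kw / (-n_h / v_tot)                 # excess OH-
    else:
        h = math.sqrt(Kw)                       # equivalence point
    print(f"{v_base:4d} mL  pH = {-math.log10(h):5.2f}")
```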

  8. Spreadsheet eases heat balance, payback calculations

    International Nuclear Information System (INIS)

    Conner, K.P.

    1992-01-01

    This paper reports that a generalized Lotus-type spreadsheet program has been developed to perform the heat balance and simple payback calculations for various turbine-generator (TG) inlet steam pressures. It can be used for potential plant expansions or new cogeneration installations. The program performs the basic heat balance calculations associated with turbine-generators, feedwater heating, process steam requirements and desuperheating. The printout shows the basic data and formulation used in the calculations. The turbine efficiency data used are applicable to automatic extraction turbine-generators in the 30-80 MW range. The simple payback calculations are for chemical recovery boilers and power boilers used in the pulp and paper industry. However, the program will also accommodate boilers common to other industries
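
    The simple-payback arithmetic such a program automates reduces to capital cost divided by net annual savings; the figures in the sketch below are illustrative, not from the paper.

```python
# Simple payback calculation (illustrative figures).
capital_cost = 4.2e6        # $ installed cost of the recovery boiler
steam_savings = 9.5e5       # $/yr fuel displaced by cogenerated steam
power_revenue = 3.1e5       # $/yr electricity from the TG set
extra_om = 1.5e5            # $/yr additional operating and maintenance

payback_years = capital_cost / (steam_savings + power_revenue - extra_om)
print(f"simple payback: {payback_years:.1f} years")
```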

  9. Understanding Gauss’s law using spreadsheets

    Science.gov (United States)

    Baird, William H.

    2013-09-01

    Some of the results from the electrostatics portion of introductory physics are particularly difficult for students to understand and/or believe. For students who have yet to take vector calculus, Gauss's law is far from obvious and may seem more difficult than Coulomb's. When these same students are told that the minimum potential energy for charges added to a conductor is realized when all charges are on the surface, they may have a hard time believing that the energy would not be lowered if just one of those charges were moved from the surface to the interior of the conductor. Investigating these ideas using Coulomb's law and/or the formula for the potential energy of a system of discrete charges might be tempting, but as the number of charges climbs past a few, the calculations become tedious. A spreadsheet enables students to perform these calculations for a hundred or more charges and confirm the familiar results.
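
    The numerical experiment the paper suggests can be sketched directly: distribute N equal charges quasi-uniformly on a sphere (here a Fibonacci lattice, an assumed stand-in for the paper's arrangement) and check that moving one of them to the centre raises the total electrostatic energy U = k Σ_{i<j} q_i q_j / r_ij.

```python
# Energy of discrete charges on a sphere vs. one charge moved inside.
import math

k, q, R, N = 8.99e9, 1e-9, 1.0, 200

def fib_sphere(n, r):
    """Quasi-uniform points on a sphere via the Fibonacci lattice."""
    golden = math.pi * (3 - math.sqrt(5))
    pts = []
    for i in range(n):
        z = r * (1 - 2 * (i + 0.5) / n)
        rho = math.sqrt(r * r - z * z)
        pts.append((rho * math.cos(golden * i), rho * math.sin(golden * i), z))
    return pts

def energy(points):
    return sum(k * q * q / math.dist(points[i], points[j])
               for i in range(len(points)) for j in range(i + 1, len(points)))

surface = fib_sphere(N, R)
moved = surface[:-1] + [(0.0, 0.0, 0.0)]   # one charge moved to the centre

print(f"all on surface : U = {energy(surface):.6e} J")
print(f"one at centre  : U = {energy(moved):.6e} J  (higher)")
```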

  10. The parameter spreadsheets and their applications

    International Nuclear Information System (INIS)

    Schwitters, R.; Chao, A.; Chou, W.; Peterson, J.

    1993-01-01

    This paper announces that a set of parameter spreadsheets, using the Microsoft EXCEL software, has been developed for the SSC (and also for the LHC). In this program, the input (or control) parameters and the derived parameters are linked by equations that express the accelerator physics involved. A subgroup of parameters that are considered critical, or possible bottlenecks, has been highlighted under the category of "Flags". Given certain performance goals, one can use this program to "tune" the input parameters in such a way that the flagged parameters do not exceed their acceptable range. During the past years, this program has been employed for the following purposes: (a) to guide the machine designs for various operation scenarios, (b) to generate a parameter list that is self-consistent and, (c) to study the impact of some proposed parameter changes (e.g., different choices of the rf frequency and bunch spacing)
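
    The linked-parameter idea can be sketched with two generic relations (revolution frequency and stored beam energy) and a flag check on the critical one; the numbers below are illustrative, not actual SSC design values.

```python
# Inputs, derived parameters linked by physics, and flag checks.
C_LIGHT = 2.998e8   # m/s

inputs = {
    "circumference_km": 87.12,
    "protons_per_bunch": 0.8e10,
    "n_bunches": 17424,
    "beam_energy_TeV": 20.0,
}

def derive(p):
    d = {}
    d["rev_frequency_kHz"] = C_LIGHT / (p["circumference_km"] * 1e3) / 1e3
    n_protons = p["protons_per_bunch"] * p["n_bunches"]
    # stored energy per beam: protons * energy(eV) * e, in MJ
    d["stored_energy_MJ"] = n_protons * p["beam_energy_TeV"] * 1e12 * 1.602e-19 / 1e6
    return d

flags = {"stored_energy_MJ": lambda v: v < 500.0}   # assumed acceptable range

for name, value in derive(inputs).items():
    ok = flags.get(name, lambda v: True)(value)
    print(f"{name:22s} {value:12.3f}{'' if ok else '  << FLAG'}")
```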

  11. Excel spreadsheet in teaching numerical methods

    Science.gov (United States)

    Djamila, Harimi

    2017-09-01

    One of the important objectives in teaching numerical methods to undergraduate students is to bring them to a comprehension of numerical-method algorithms. Although manual calculation is important for understanding a procedure, it is time consuming and prone to error. This is specifically the case for the iteration procedures used in many numerical methods. Currently, many commercial programs are useful in teaching numerical methods, such as Matlab, Maple, and Mathematica, but these are usually not user-friendly for the uninitiated. An Excel spreadsheet offers an initial level of programming, which can be used either on or off campus, and the students will not be distracted by writing code. It must be emphasized that general commercial software should still be introduced later for more elaborate questions. This article aims to report on a strategy for teaching numerical methods in undergraduate engineering programs. It is directed to students, lecturers and researchers in the engineering field.
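
    The spreadsheet style of teaching an iteration translates directly into a table of rows; the sketch below lays out Newton's method on f(x) = x^2 - 2 exactly as students would fill the columns of a sheet.

```python
# Newton's method displayed as a spreadsheet-like iteration table.
f = lambda x: x * x - 2.0        # the function whose root we seek
df = lambda x: 2.0 * x           # its derivative

x = 1.0                          # initial guess in the first row
print(f"{'n':>2} {'x_n':>18} {'f(x_n)':>12}")
for n in range(6):               # each pass prints one spreadsheet row
    print(f"{n:2d} {x:18.12f} {f(x):12.2e}")
    x = x - f(x) / df(x)         # the 'fill-down' formula
```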

  12. Affordances of Spreadsheets In Mathematical Investigation: Potentialities For Learning

    Directory of Open Access Journals (Sweden)

    Nigel Calder

    2009-10-01

    Full Text Available This article is concerned with the ways learning is shaped when mathematics problems are investigated in spreadsheet environments. It considers how the opportunities and constraints the digital medium affords influenced the decisions the students made, and the direction of their enquiry pathway. How might the learning trajectory unfold, and the learning process and mathematical understanding emerge? Will the spreadsheet, as the pedagogical medium, evoke learning in a distinctive manner? The article reports on an aspect of an ongoing study involving students as they engage with mathematical investigative tasks through digital media, the spreadsheet in particular. It considers the affordances of this learning environment for primary-aged students.

  13. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  14. Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.

    Science.gov (United States)

    Snow, Donald R.

    1989-01-01

    Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)

  15. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...

  16. Strontium-90 Error Discovered in Subcontract Laboratory Spreadsheet. Topical Report

    International Nuclear Information System (INIS)

    Brown, D.D.; Nagel, A.S.

    1999-07-01

    West Valley Demonstration Project health physicists and environmental scientists discovered a series of errors in a subcontractor's spreadsheet being used to reduce data as part of their strontium-90 analytical process.

  17. Spreadsheet-Enhanced Problem Solving in Context as Modeling

    Directory of Open Access Journals (Sweden)

    Sergei Abramovich

    2003-07-01

    development through situated mathematical problem solving. Modeling activities described in this paper support the epistemological position regarding the interplay that exists between the development of mathematical concepts and available methods of calculation. The spreadsheet used is Microsoft Excel 2001.

  18. Applying the CobiT Control Framework to Spreadsheet Developments

    OpenAIRE

    Butler, Raymond J.

    2008-01-01

    One of the problems reported by researchers and auditors in the field of spreadsheet risks is that of getting and keeping management's attention to the problem. Since 1996, the Information Systems Audit & Control Foundation and the IT Governance Institute have published CobiT, which brings mainstream IT control issues into the corporate governance arena. This paper illustrates how spreadsheet risk and control issues can be mapped onto the CobiT framework and thus brought to managers' attention i...

  19. Using spreadsheet modelling to teach about feedback in physics

    Science.gov (United States)

    Lingard, Michael

    2003-09-01

    This article looks generally at spreadsheet modelling of feedback situations. It has several benefits as a teaching tool. Additionally, a consideration of the limitations of calculating at many discrete points can lead, at A-level, to an appreciation of the need for the calculus. Feedback situations can be used to introduce the idea of differential equations. Microsoft ExcelTM is the spreadsheet used.
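
    The discrete-step calculation the article describes can be mirrored directly in code. The sketch below steps a feedback relation, dQ/dt = -Q/(RC), forward exactly as a spreadsheet would, one loop pass per row; the component values and step size are invented for illustration, and the comparison against the exact exponential shows the discretization error that motivates the move to calculus.

        import math

        # Euler stepping of dQ/dt = -Q/(RC): each loop pass is one
        # spreadsheet row, the previous charge plus dt times the rate.
        R, C = 10e3, 100e-6        # assumed component values; tau = 1 s
        tau = R * C
        Q, dt, t = 1.0, 0.05, 0.0  # initial charge, step size, time
        for _ in range(20):
            Q += dt * (-Q / tau)   # the cell formula copied down
            t += dt
        exact = math.exp(-t / tau) # closed-form result from the calculus
        print(f"Euler: {Q:.6f}  exact: {exact:.6f}  (error shrinks with dt)")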

  20. Phylogenetic Conflict in Bears Identified by Automated Discovery of Transposable Element Insertions in Low-Coverage Genomes

    Science.gov (United States)

    Gallus, Susanne; Janke, Axel

    2017-01-01

    Phylogenetic reconstruction from transposable elements (TEs) offers an additional perspective to study evolutionary processes. However, detecting phylogenetically informative TE insertions requires tedious experimental work, limiting the power of phylogenetic inference. Here, we analyzed the genomes of seven bear species using high-throughput sequencing data to detect thousands of TE insertions. The newly developed pipeline for TE detection called TeddyPi (TE detection and discovery for Phylogenetic Inference) identified 150,513 high-quality TE insertions in the genomes of ursine and tremarctine bears. By integrating different TE insertion callers and using a stringent filtering approach, the TeddyPi pipeline produced highly reliable TE insertion calls, which were confirmed by extensive in vitro validation experiments. Analysis of single nucleotide substitutions in the flanking regions of the TEs shows that these substitutions correlate with the phylogenetic signal from the TE insertions. Our phylogenomic analyses show that TEs are a major driver of genomic variation in bears and enabled phylogenetic reconstruction of a well-resolved species tree, despite strong signals for incomplete lineage sorting and introgression. The analyses show that the Asiatic black, sun, and sloth bear form a monophyletic clade, in which phylogenetic incongruence originates from incomplete lineage sorting. TeddyPi is open source and can be adapted to various TE and structural variation callers. The pipeline makes it possible to confidently extract thousands of TE insertions even from low-coverage genomes (∼10×) of nonmodel organisms. This opens new possibilities for biologists to study phylogenies and evolutionary processes as well as rates and patterns of (retro-)transposition and structural variation. PMID:28985298

  1. Phylogenetic Conflict in Bears Identified by Automated Discovery of Transposable Element Insertions in Low-Coverage Genomes.

    Science.gov (United States)

    Lammers, Fritjof; Gallus, Susanne; Janke, Axel; Nilsson, Maria A

    2017-10-01

    Phylogenetic reconstruction from transposable elements (TEs) offers an additional perspective to study evolutionary processes. However, detecting phylogenetically informative TE insertions requires tedious experimental work, limiting the power of phylogenetic inference. Here, we analyzed the genomes of seven bear species using high-throughput sequencing data to detect thousands of TE insertions. The newly developed pipeline for TE detection called TeddyPi (TE detection and discovery for Phylogenetic Inference) identified 150,513 high-quality TE insertions in the genomes of ursine and tremarctine bears. By integrating different TE insertion callers and using a stringent filtering approach, the TeddyPi pipeline produced highly reliable TE insertion calls, which were confirmed by extensive in vitro validation experiments. Analysis of single nucleotide substitutions in the flanking regions of the TEs shows that these substitutions correlate with the phylogenetic signal from the TE insertions. Our phylogenomic analyses show that TEs are a major driver of genomic variation in bears and enabled phylogenetic reconstruction of a well-resolved species tree, despite strong signals for incomplete lineage sorting and introgression. The analyses show that the Asiatic black, sun, and sloth bear form a monophyletic clade, in which phylogenetic incongruence originates from incomplete lineage sorting. TeddyPi is open source and can be adapted to various TE and structural variation callers. The pipeline makes it possible to confidently extract thousands of TE insertions even from low-coverage genomes (∼10×) of nonmodel organisms. This opens new possibilities for biologists to study phylogenies and evolutionary processes as well as rates and patterns of (retro-)transposition and structural variation. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  2. Service Discovery At Home

    NARCIS (Netherlands)

    Sundramoorthy, V.; Scholten, Johan; Jansen, P.G.; Hartel, Pieter H.

    Service discovery is a fairly new field that kicked off with the advent of ubiquitous computing and has been found essential in the making of intelligent networks by implementing automated discovery and remote control between devices. This paper provides an overview and comparison of several prominent

  3. Service discovery at home

    NARCIS (Netherlands)

    Sundramoorthy, V.; Scholten, Johan; Jansen, P.G.; Hartel, Pieter H.

    2003-01-01

    Service discovery is a fairly new field that kicked off since the advent of ubiquitous computing and has been found essential in the making of intelligent networks by implementing automated discovery and remote control between devices. This paper provides an overview and comparison of several

  4. Automated Inadvertent Intruder Application

    International Nuclear Information System (INIS)

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and

  5. 18 excel spreadsheets by species and year giving reproduction and growth data. One excel spreadsheet of herbicide treatment chemistry.

    Data.gov (United States)

    U.S. Environmental Protection Agency — Excel spreadsheets by species (4 letter code is abbreviation for genus and species used in study, year 2010 or 2011 is year data collected, SH indicates data for...

  6. Spread-sheet application to classify radioactive material for shipment

    International Nuclear Information System (INIS)

    Brown, A.N.

    1998-01-01

    A spread-sheet application has been developed at the Idaho National Engineering and Environmental Laboratory to aid the shipper when classifying nuclide mixtures of normal form, radioactive materials. The results generated by this spread-sheet are used to confirm the proper US DOT classification when offering radioactive material packages for transport. The user must input to the spread-sheet the mass of the material being classified, the physical form (liquid or not), and the activity of each regulated nuclide. The spread-sheet uses these inputs to calculate two general values: (1) the specific activity of the material, and (2) a summation calculation of the nuclide content. The specific activity is used to determine if the material exceeds the DOT minimal threshold for a radioactive material. If the material is calculated to be radioactive, the specific activity is also used to determine if the material meets the activity requirement for one of the three low specific activity designations (LSA-I, LSA-II, LSA-III, or not LSA). Again, if the material is calculated to be radioactive, the summation calculation is then used to determine which activity category the material meets (Limited Quantity, Type A, Type B, or Highway Route Controlled Quantity). This spread-sheet has proven to be an invaluable aid for shippers of radioactive materials at the Idaho National Engineering and Environmental Laboratory. (authors)
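
    The two-step decision logic is simple enough to sketch in a few lines of Python. The numeric cutoff and limit below are placeholders, not actual DOT values, which are nuclide-specific and set by regulation; only the structure (specific activity test, then summation test) follows the description above.

        # Placeholder classification sketch; thresholds are NOT real DOT limits.
        def classify(mass_g, activities_bq, limits_bq):
            """activities_bq and limits_bq map nuclide name -> value."""
            specific_activity = sum(activities_bq.values()) / mass_g
            if specific_activity < 70.0:  # placeholder exemption cutoff, Bq/g
                return "not regulated as radioactive material"
            # Summation rule: sum of each nuclide's activity over its limit.
            fraction = sum(a / limits_bq[n] for n, a in activities_bq.items())
            return "Type A" if fraction <= 1.0 else "Type B"

        print(classify(500.0, {"Cs-137": 4.0e7}, {"Cs-137": 6.0e11}))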

  7. Electronic spreadsheet to acquire the reflectance from the TM and ETM+ Landsat images

    Directory of Open Access Journals (Sweden)

    Antonio R. Formaggio

    2005-08-01

    The reflectance of agricultural cultures and other terrestrial surface "targets" is an intrinsic parameter of these targets, so in many situations it must be used instead of the "gray level" values found in satellite images. In order to get reflectance values, it is necessary to eliminate the atmospheric interference and to make a set of calculations that uses sensor parameters and information regarding the original image. Automating this procedure has the advantage of speeding up the process and reducing the possibility of errors during the calculations. The objective of this paper is to present an electronic spreadsheet that simplifies and automates the transformation of the digital numbers of TM/Landsat-5 and ETM+/Landsat-7 images into reflectance. The method employed for atmospheric correction was the dark object subtraction (DOS). The electronic spreadsheet described here is freely available to users and can be downloaded at the following website: http://www.dsr.inpe.br/Calculo_Reflectancia.xls.
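
    A minimal sketch of the underlying arithmetic, assuming the standard DOS formulation: digital numbers are scaled to radiance, the radiance of a dark object is subtracted as an estimate of the atmospheric path contribution, and the result is normalized to reflectance. The calibration constants below are illustrative, not the TM or ETM+ values published with the imagery.

        import math

        def reflectance(dn, gain, bias, dn_dark, esun, d_au, sun_elev_deg):
            radiance = gain * dn + bias                # W m^-2 sr^-1 um^-1
            path = gain * dn_dark + bias               # dark-object (haze) estimate
            theta = math.radians(90.0 - sun_elev_deg)  # solar zenith angle
            return (math.pi * (radiance - path) * d_au ** 2
                    / (esun * math.cos(theta)))

        # Illustrative band-3-like numbers:
        print(reflectance(dn=87, gain=0.8, bias=-1.2, dn_dark=9,
                          esun=1554.0, d_au=1.0, sun_elev_deg=55.0))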

  8. AUTOMATED INADVERTENT INTRUDER APPLICATION

    International Nuclear Information System (INIS)

    Koffman, L.D.; Lee, P.L.; Cook, J.R.; Wilhite, E.L.

    2007-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and

  9. Spreadsheets, Graphing Calculators and the Line of Best Fit

    Directory of Open Access Journals (Sweden)

    Bernie O'Sullivan

    2003-07-01

    One technique that can now be done, almost mindlessly, is the line of best fit. Both the graphing calculator and the Excel spreadsheet produce models for collected data that appear to be very good fits, but upon closer scrutiny, are revealed to be quite poor. This article will examine one such case. I will couch the paper within the framework of a very good classroom investigation that will help generate students’ understanding of the basic principles of curve fitting and will enable them to produce a very accurate model of collected data by combining the technology of the graphing calculator and the spreadsheet.
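
    One classic instance of the phenomenon (the article's own data set may differ) is the exponential trendline: tools that fit y = a·e^(bx) by regressing ln y on x minimize error in log space, not the least-squares error on the raw data. A short Python comparison, with invented data:

        import numpy as np
        from scipy.optimize import curve_fit

        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
        y = np.array([2.1, 5.6, 15.0, 38.0, 110.0])  # invented data

        # Log-transform fit, as many trendline tools do:
        b_log, log_a = np.polyfit(x, np.log(y), 1)
        a_log = np.exp(log_a)

        # Direct least-squares fit on the raw data:
        (a_dir, b_dir), _ = curve_fit(lambda t, a, b: a * np.exp(b * t),
                                      x, y, p0=(a_log, b_log))

        for name, a, b in (("log-transform", a_log, b_log),
                           ("direct", a_dir, b_dir)):
            sse = float(np.sum((y - a * np.exp(b * x)) ** 2))
            print(f"{name:13s} a={a:.3f} b={b:.3f} SSE={sse:.2f}")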

  10. Using the Talbot_Lau_interferometer_parameters Spreadsheet

    Energy Technology Data Exchange (ETDEWEB)

    Kallman, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-06-04

    Talbot-Lau interferometers allow incoherent X-ray sources to be used for phase contrast imaging. A spreadsheet for exploring the parameter space of Talbot and Talbot-Lau interferometers has been assembled. This spreadsheet allows the user to examine the consequences of choosing phase grating pitch, source energy, and source location on the overall geometry of a Talbot or Talbot-Lau X-ray interferometer. For the X-ray energies required to penetrate scanned luggage the spacing between gratings is large enough that the mechanical tolerances for amplitude grating positioning are unlikely to be met.
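
    The scale of the problem can be seen from the classical Talbot self-imaging distance, z_T = 2p^2/lambda for an amplitude grating (phase gratings repeat at fractions of this). A quick Python check with an assumed 4 micron pitch at a luggage-penetrating 60 keV reproduces the metre-scale grating separations noted above:

        H_C_KEV_NM = 1.23984  # h*c in keV*nm

        def talbot_length_m(pitch_m, energy_kev):
            wavelength_m = (H_C_KEV_NM / energy_kev) * 1e-9
            return 2.0 * pitch_m ** 2 / wavelength_m

        print(f"{talbot_length_m(4e-6, 60.0):.2f} m")  # ~1.5 m between gratings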

  11. A Comparative Study of Spreadsheet Applications on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Veera V. S. M. Chintapalli

    2016-01-01

    Advances in mobile screen sizes and feature enhancement for mobile applications have increased the number of users accessing spreadsheets on mobile devices. This paper reports a comparative usability study on four popular mobile spreadsheet applications: OfficeSuite Viewer 6, Documents To Go, ThinkFree Online, and Google Drive. We compare them against three categories of usability criteria: visibility; navigation, scrolling, and feedback; and interaction, satisfaction, simplicity, and convenience. Measures for each criterion were derived in a survey. Questionnaires were designed to address the measures based on the comparative criteria provided in the analysis.

  12. Automated Discovery of Mimicry Attacks

    National Research Council Canada - National Science Library

    Giffin, Jonathon T; Jha, Somesh; Miller, Barton P

    2006-01-01

    .... These systems are useful only if they detect actual attacks. Previous research developed manually-constructed mimicry and evasion attacks that avoided detection by hiding a malicious series of system calls within a valid sequence allowed by the model...

  13. Information Spreadsheet for Engines and Vehicles Compliance Information System (EV-CIS) User Registration

    Science.gov (United States)

    In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for EV-CIS, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and EV-CIS Submitters.

  14. A user friendly spreadsheet program for calibration using weighted regression. User's Guide

    NARCIS (Netherlands)

    Gort SM; Hoogerbrugge R; LOC

    1995-01-01

    A user-friendly computer spreadsheet for calibration purposes is described. This spreadsheet (developed using Microsoft Excel) enables non-statisticians, such as analytical chemists, to apply weighted linear regression. Several calibration functions and
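
    A minimal sketch of the weighted regression such a spreadsheet wraps for the non-statistician, with invented calibration data; weights of 1/s_i^2 downweight the noisier standards (NumPy's polyfit takes 1/s as its weight argument):

        import numpy as np

        x = np.array([0.0, 1.0, 2.0, 5.0, 10.0])      # standard concentrations
        y = np.array([0.02, 1.05, 1.98, 5.10, 9.85])  # instrument response
        s = np.array([0.02, 0.03, 0.04, 0.08, 0.15])  # per-level std deviations

        slope, intercept = np.polyfit(x, y, deg=1, w=1.0 / s)
        print(f"calibration line: y = {slope:.4f} x + {intercept:.4f}")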

  15. A Spreadsheet-based GIS tool for planning aerial photography

    Science.gov (United States)

    The U.S.EPA's Pacific Coastal Ecology Branch has developed a tool which facilitates planning aerial photography missions. This tool is an Excel spreadsheet which accepts various input parameters such as desired photo-scale and boundary coordinates of the study area and compiles ...

  16. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    International Nuclear Information System (INIS)

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.; Wyatt, Elizabeth E.; Quinn, Tanya B.; Seifert, Robert W.; Bonczek, Richard R.

    2013-01-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)

  17. Using a Spreadsheet Scroll Bar to Solve Equilibrium Concentrations

    Science.gov (United States)

    Raviolo, Andres

    2012-01-01

    A simple, conceptual method is described for using the spreadsheet scroll bar to find the composition of a system at chemical equilibrium. Simulation of any kind of chemical equilibrium can be carried out using this method, and the effects of different disturbances can be predicted. This simulation, which can be used in general chemistry…
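
    The scroll bar maps naturally onto scanning the extent of reaction and watching the reaction quotient Q approach K. A Python sketch of that scan, for an illustrative system N2O4 <-> 2 NO2 with an invented K and initial concentration:

        K = 0.36   # assumed equilibrium constant
        c0 = 1.0   # assumed initial [N2O4], mol/L

        def Q(x):  # reaction quotient at extent of reaction x
            return (2 * x) ** 2 / (c0 - x)

        # 1000 "scroll bar" positions between no reaction and completion:
        x_eq = min((i / 1000 * c0 for i in range(1000)),
                   key=lambda x: abs(Q(x) - K))
        print(f"x = {x_eq:.3f}: [N2O4] = {c0 - x_eq:.3f}, [NO2] = {2 * x_eq:.3f}")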

  18. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    Science.gov (United States)

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  19. Solving L-L Extraction Problems with Excel Spreadsheet

    Science.gov (United States)

    Teppaitoon, Wittaya

    2016-01-01

    This work aims to demonstrate the use of Excel spreadsheets for solving L-L extraction problems. The key to solving the problems successfully is to be able to determine a tie line on the ternary diagram where the calculation must be carried out. This enables the reader to analyze the extraction process starting with a simple operation, the…

  20. A methodology for constructing the calculation model of scientific spreadsheets

    NARCIS (Netherlands)

    Vos, de M.; Wielemaker, J.; Schreiber, G.; Wielinga, B.; Top, J.L.

    2015-01-01

    Spreadsheet models are frequently used by scientists to analyze research data. These models are typically described in a paper or a report, which serves as the single source of information on the underlying research project. As the calculation workflow in these models is not made explicit, readers are

1. Development of an Excel spreadsheet for mean glandular dose in mammography

    International Nuclear Information System (INIS)

    Nagoshi, Kazuyo; Fujisaki, Tatsuya

    2008-01-01

    The purpose of this study was to develop an Excel spreadsheet to calculate mean glandular dose (D_g) in mammography using clinical exposure data. D_g can be calculated as the product of incident air kerma (K_a) and the normalized glandular dose coefficient D_gN (i.e., D_g = K_a × D_gN). According to the method of Klein et al (Phys Med Biol 1997; 42: 651-671), K_a was measured at the entrance surface with an ionization dosimeter. Normalized glandular dose (D_gN) coefficients, taking into account breast glandularity, were computed using Boone's method (Med Phys 2002; 29: 869-875). D_gN coefficients can be calculated for any arbitrary X-ray spectrum. These calculation procedures were input into a Microsoft Excel spreadsheet. The resulting Excel spreadsheet is easy to use and is always applicable in the field of mammography. The exposure conditions concerning D_g in clinical practice were also investigated in 22 women. Four exposure conditions (target/filter combination and tube voltage) were automatically selected in this study. This investigation found that the average D_g for each exposure was 1.9 mGy. Because it is recommended that quality control of radiation dose management in mammography be done using an American College of Radiology (ACR) phantom, information about patient dose is not obtained in many facilities. The present Excel spreadsheet was accordingly considered useful for optimization of exposure conditions and explanation of mammography to patients. (author)
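
    The worksheet's core step is just the product D_g = K_a × D_gN; the difficulty lives in the tabulated coefficient. A one-line sketch with made-up but plausible magnitudes (in practice D_gN depends on the spectrum, breast thickness, and glandularity):

        def mean_glandular_dose(k_a_mgy, d_gn):
            # D_g = K_a * D_gN; d_gn is the normalized coefficient (mGy/mGy)
            return k_a_mgy * d_gn

        print(mean_glandular_dose(k_a_mgy=8.0, d_gn=0.22))  # ~1.8 mGy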

  2. Spreadsheet application to classify radioactive material for shipment

    International Nuclear Information System (INIS)

    Brown, A.N.

    1997-12-01

    A spreadsheet application has been developed at the Idaho National Engineering and Environmental Laboratory to aid the shipper when classifying nuclide mixtures of normal form, radioactive materials. The results generated by this spreadsheet are used to confirm the proper US Department of Transportation (DOT) classification when offering radioactive material packages for transport. The user must input to the spreadsheet the mass of the material being classified, the physical form (liquid or not), and the activity of each regulated nuclide. The spreadsheet uses these inputs to calculate two general values: (1) the specific activity of the material, and (2) a summation calculation of the nuclide content. The specific activity is used to determine if the material exceeds the DOT minimal threshold for a radioactive material (Yes or No). If the material is calculated to be radioactive, the specific activity is also used to determine if the material meets the activity requirement for one of the three Low Specific Activity designations (LSA-I, LSA-II, LSA-III, or Not LSA). Again, if the material is calculated to be radioactive, the summation calculation is then used to determine which activity category the material will meet (Limited Quantity, Type A, Type B, or Highway Route Controlled Quantity)

  3. A Spreadsheet-Based, Matrix Formulation Linear Programming Lesson

    DEFF Research Database (Denmark)

    Harrod, Steven

    2009-01-01

    The article focuses on the spreadsheet-based, matrix formulation linear programming lesson. According to the article, it makes a higher level of theoretical mathematics approachable by a wide spectrum of students wherein many may not be decision sciences or quantitative methods majors. Moreover...

  4. Introducing Artificial Neural Networks through a Spreadsheet Model

    Science.gov (United States)

    Rienzo, Thomas F.; Athappilly, Kuriakose K.

    2012-01-01

    Business students taking data mining classes are often introduced to artificial neural networks (ANN) through point and click navigation exercises in application software. Even if correct outcomes are obtained, students frequently do not obtain a thorough understanding of ANN processes. This spreadsheet model was created to illuminate the roles of…

  5. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F. [Geosyntec Consultants, Inc., 1255 Roberts Boulevard NW, Suite 200, Kennesaw, GA 30144 (United States); Wyatt, Elizabeth E. [LATA Environmental Services of Kentucky, LLC, 761 Veterans Ave, Kevil, KY 42053 (United States); Quinn, Tanya B. [Geosyntec Consultants, Inc., 2002 Summit Boulevard NE, Suite 885, Atlanta, GA 30319 (United States); Seifert, Robert W. [Portsmouth/Paducah Project Office, United States Department of Energy, 5600 Hobbs Rd, Kevil, KY 42053 (United States); Bonczek, Richard R. [Portsmouth/Paducah Project Office, United States Department of Energy, 1017 Majestic Drive, Lexington, KY 40513 (United States)

    2013-07-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)

  6. Automated cost modeling for coal combustion systems

    International Nuclear Information System (INIS)

    Rowe, R.M.; Anast, K.R.

    1991-01-01

    This paper reports on cost information developed at the AMAX R and D Center for coal-water slurry production, implemented in an automated spreadsheet (Lotus 1-2-3) for personal computer use. The spreadsheet format allows the user to evaluate the impacts of various process options, coal feedstock characteristics, fuel characteristics, plant location sites, and plant sizes on fuel cost. Model flexibility reduces the time and labor required to determine fuel costs and provides a basis to compare fuels manufactured by different processes. The model input includes coal characteristics, plant flowsheet definition, plant size, and market location. Based on these inputs, selected unit operations are chosen for coal processing.

  7. Rewriting High-Level Spreadsheet Structures into Higher-Order Functional Programs

    DEFF Research Database (Denmark)

    Biermann, Florian; Dou, Wensheng; Sestoft, Peter

    2017-01-01

    Spreadsheets are used heavily in industry and academia. Often, spreadsheet models are developed for years and their complexity grows vastly beyond what the paradigm was originally conceived for. Such complexity often comes at the cost of recalculation performance. However, spreadsheet models...

  8. Introduction to supercritical fluids a spreadsheet-based approach

    CERN Document Server

    Smith, Richard; Peters, Cor

    2013-01-01

    This text provides an introduction to supercritical fluids with easy-to-use Excel spreadsheets suitable for both specialized-discipline (chemistry or chemical engineering student) and mixed-discipline (engineering/economic student) classes. Each chapter contains worked examples, tip boxes and end-of-the-chapter problems and projects. Part I covers web-based chemical information resources, applications and simplified theory presented in a way that allows students of all disciplines to delve into the properties of supercritical fluids and to design energy, extraction and materials formation systems for real-world processes that use supercritical water or supercritical carbon dioxide. Part II takes a practical approach and addresses the thermodynamic framework, equations of state, fluid phase equilibria, heat and mass transfer, chemical equilibria and reaction kinetics of supercritical fluids. Spreadsheets are arranged as Visual Basic for Applications (VBA) functions and macros that are completely (source code) ...

  9. Spreadsheet based analysis of Mössbauer spectra

    Energy Technology Data Exchange (ETDEWEB)

    Gunnlaugsson, H. P., E-mail: haraldur.p.gunnlaugsson@cern.ch [CERN, PH Div (Switzerland)

    2016-12-15

    Using spreadsheet programs to analyse spectral data opens up new possibilities in data analysis. The spreadsheet program contains all the functionality needed for graphical support, fitting, and post-processing of the results. Unconventional restrictions between fitting parameters can be set up freely, and simultaneous analysis, i.e. analysis of many spectra simultaneously in terms of model parameters, is straightforward. The free program package Vinda, used for analysing Mössbauer spectra, is described. The package contains support for reading data, calibration, and common functions of particular importance for Mössbauer spectroscopy (f-factors, second-order Doppler shift, etc.). Methods to create spectral series and support for error analysis are included. Different types of fitting models are available, ranging from simple Lorentzian models to complex distribution models.

  10. The Architecture of a Complex GIS & Spreadsheet Based DSS

    Directory of Open Access Journals (Sweden)

    Dinu Airinei

    2010-01-01

    The decision support applications available on today's market typically combine the decision analysis of historical data, based on On-Line Analytical Processing (OLAP) products or spreadsheet pivot tables, with newer reporting facilities such as alerts or key performance indicators available in portal dashboards or in complex spreadsheet-like reports, both corresponding to a new approach to the field called Business Intelligence. Moreover, the geographical features of GIS added to DSS applications are becoming more and more required by many kinds of businesses; in fact, they are more useful this way than as distinct parts. The paper presents a DSS architecture based on the association between such approaches and technologies. The particular examples are meant to support the theoretical arguments and to complete the understanding of the interaction schemas available.

  11. A spreadsheet-coupled SOLGAS: A computerized thermodynamic equilibrium calculation tool. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Trowbridge, L.D.; Leitnaker, J.M. [Oak Ridge K-25 Site, TN (United States). Technical Analysis and Operations Div.

    1995-07-01

    SOLGAS, an early computer program for calculating equilibrium in a chemical system, has been made more user-friendly, and several "bells and whistles" have been added. The necessity to include elemental species has been eliminated. The input of large numbers of starting conditions has been automated. A revised spreadsheet-based format for entering data, including non-ideal binary and ternary mixtures, simplifies and reduces chances for error. Calculational errors by SOLGAS are flagged, and several programming errors are corrected. Auxiliary programs are available to assemble and partially automate plotting of large amounts of data. Thermodynamic input data can be changed on line. The program can be operated with or without a co-processor. Copies of the program, suitable for the IBM-PC or compatibles with at least 384K bytes of low RAM, are available from the authors. This user manual contains appendices with examples of the use of SOLGAS. These range from elementary examples, such as the relationships among water, ice, and water vapor, to more complex systems: phase diagram calculation of the UF4 and UF6 system; burning UF4 in fluorine; thermodynamic calculation of the Cl-F-O-H system; equilibria calculations in the CCl4-CH3OH system; and limitations applicable to aqueous solutions. An appendix also contains the source code.

  12. Development of a spreadsheet for SNPs typing using Microsoft EXCEL.

    Science.gov (United States)

    Hashiyada, Masaki; Itakura, Yukio; Takahashi, Shirushi; Sakai, Jun; Funayama, Masato

    2009-04-01

    Single-nucleotide polymorphisms (SNPs) have some characteristics that make them very appropriate for forensic studies and applications. In our institute, SNP typing was performed with TaqMan SNP Genotyping Assays on the ABI PRISM 7500 FAST Real-Time PCR System (Applied Biosystems) and Sequence Detection Software ver. 1.4 (Applied Biosystems). The TaqMan method requires two positive controls (allele 1 and allele 2) and one negative control for each SNP locus analyzed, so at most 24 loci per person can be analyzed on a 96-well plate at one time. If SNP analysis is to be applied to biometric authentication, 48 or more loci are required to identify a person. In this study, we designed a spreadsheet package using Microsoft EXCEL, with population data taken from our 120-SNP population studies. On the spreadsheet, we defined SNP types using 'template files' instead of positive and negative controls. The 'template files' consisted of the results of 94 unknown samples and two negative controls for each of the 120 SNP loci we had previously studied. By the use of these files, the spreadsheet can analyze 96 SNPs on a 96-well plate simultaneously.

  13. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  14. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  15. A review of simple multiple criteria decision making analytic procedures which are implementable on spreadsheet packages

    Directory of Open Access Journals (Sweden)

    T.J. Stewart

    2003-12-01

    A number of modern multi-criteria decision making aids for the discrete choice problem are reviewed, with particular emphasis on those which can be implemented on standard commercial spreadsheet packages. Three broad classes of procedures are discussed, namely the analytic hierarchy process, reference point methods, and outranking methods. The broad principles are summarised in a consistent framework, and on a spreadsheet. LOTUS spreadsheets implementing these are available from the author.

  16. Use of Wingz spreadsheet as an interface to total-system performance assessment

    International Nuclear Information System (INIS)

    Chambers, W.F.; Treadway, A.H.

    1992-01-01

    A commercial spreadsheet has been used as an interface to a set of simple models to simulate possible nominal flow and failure scenarios at the potential high-level nuclear waste repository at Yucca Mountain, Nevada. Individual models coded in FORTRAN are linked to the spreadsheet. Complementary cumulative probability distribution functions resulting from the models are plotted through scripts associated with the spreadsheet. All codes are maintained under a source code control system for quality assurance. The spreadsheet and the simple models can be run on workstations, PCs, and Macintoshes. The software system is designed so that the FORTRAN codes can be run on several machines if a network environment is available

  17. Use of spreadsheets for interactive control of MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Goldner, A.L.; Kobayashi, A.

    1986-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory has a variety of highly individualized plasma diagnostic instruments attached to the experiment. These instruments are controlled through graphics workstations networked to a central computer system. A distributed spreadsheet-like program runs in both the graphics workstations and in the central computer system. An interface very similar to a commercial spreadsheet program is presented to the user at a workstation. In a commercial spreadsheet program, the user may attach mathematical calculation functions to spreadsheet cells. At MFTF-B, hardware control functions, hardware monitoring functions, and communications functions, as well as mathematical functions, may be attached to cells. Both the user and feedback from instrument hardware may make entries in spreadsheet cells; any entry in a spreadsheet cell may cause reevaluation of the cell's associated functions. The spreadsheet approach makes the addition of a new instrument a matter of designing one or more spreadsheet tables with associated meta-language-defined control and communication function strings. This paper describes the details of the spreadsheets and the implementation experience

  18. Use of spreadsheets for interactive control of MFTF-B plasma diagnostic instruments

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Goldner, A.; Kobayashi, A.

    1985-01-01

    The Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory has a variety of highly individualized plasma diagnostic instruments attached to the experiment. These instruments are controlled through graphics workstations networked to a central computer system. A distributed spreadsheet-like program runs in both the graphics workstations and in the central computer system. An interface very similar to a commercial spreadsheet program is presented to the user at a workstation. In a commercial spreadsheet program, the user may attach mathematical calculation functions to spreadsheet cells. At MFTF-B, hardware control functions, hardware monitoring functions, and communications functions, as well as mathematical functions, may be attached to cells. Both the user and feedback from instrument hardware may make entries in spreadsheet cells; any entry in a spreadsheet cell may cause reevaluation of the cell's associated functions. The spreadsheet approach makes the addition of a new instrument a matter of designing one or more spreadsheet tables with associated meta-language-defined control and communication function strings. We report here details of our spreadsheets and our implementation experience

  19. Use of a commercial spreadsheet for quality control in radiotherapy

    International Nuclear Information System (INIS)

    Sales, D.A.G.; Batista, D.V.S.

    2001-01-01

    This work presents the results obtained from the development of a spreadsheet for quality control of the physical and clinical dosimetry of a radiotherapy service. It was developed using the resources of a commercial software package so as to provide an independent verification of the manual calculations and treatment planning system calculations used in the routine procedures of the radiotherapy service of the Instituto Nacional de Cancer. It was validated against the reference manual calculations currently proposed in the literature and against treatment planning system results for test cases. (author)

  20. Head First Excel A learner's guide to spreadsheets

    CERN Document Server

    Milton, Michael

    2010-01-01

    Do you use Excel for simple lists, but get confused and frustrated when it comes to actually doing something useful with all that data? Stop tearing your hair out: Head First Excel helps you painlessly move from spreadsheet dabbler to savvy user. Whether you're completely new to Excel or an experienced user looking to make the program work better for you, this book will help you incorporate Excel into every aspect of your workflow, from a scratch pad for data-based brainstorming to exploratory analysis with PivotTables, optimizing outcomes with Goal Seek, and presenting your conclusions wit

  1. Making the business case for telemedicine: an interactive spreadsheet.

    Science.gov (United States)

    McCue, Michael J; Palsbo, Susan E

    2006-04-01

    The objective of this study was to demonstrate the business case for telemedicine in nonrural areas. We developed an interactive spreadsheet to conduct multiple financial analyses under different capital investment, revenue, and expense scenarios. We applied the spreadsheet to the specific case of poststroke rehabilitation in urban settings. The setting involved outpatient clinics associated with a freestanding rehabilitation hospital in Oklahoma. Our baseline scenario used historical financial data from face-to-face encounters as the baseline for payer and volume mix. We assumed a cost of capital of 10% to finance the project. The outcome measures were financial breakeven points and internal rate of return. A total of 340 telemedicine visits will generate a positive net cash flow each year. The project is expected to recoup the initial investment by the fourth year, produce a positive present value dollar return of more than $2,000, and earn rate of return of 20%, which exceeds the hospital's cost of capital. The business case is demonstrated for this scenario. Urban telemedicine programs can be financially self-sustaining without accounting for reductions in travel time by providers or patients. Urban telemedicine programs can be a sound business investment and not depend on grants or subsidies for start-up funding. There are several key decision points that affect breakeven points and return on investment. The best business strategy is to approach the decision as whether or not to build a new clinic.
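
    The spreadsheet's arithmetic is standard discounted cash flow. A sketch with invented dollar figures; only the structure (a 10% cost of capital, a cumulative payback test, a net present value) follows the study:

        rate = 0.10                                         # cost of capital
        cash_flows = [-7000, 2000, 2200, 2400, 2600, 2800]  # year 0..5, invented

        npv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        cumulative = 0.0
        for year, cf in enumerate(cash_flows):
            cumulative += cf
            if cumulative >= 0:
                print(f"investment recouped in year {year}")
                break
        print(f"NPV at {rate:.0%}: ${npv:,.2f}")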

  2. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  3. A Spreadsheet Tool for Learning the Multiple Regression F-Test, T-Tests, and Multicollinearity

    Science.gov (United States)

    Martin, David

    2008-01-01

    This note presents a spreadsheet tool that allows teachers the opportunity to guide students towards answering on their own questions related to the multiple regression F-test, the t-tests, and multicollinearity. The note demonstrates approaches for using the spreadsheet that might be appropriate for three different levels of statistics classes,…

  4. Integrated Spreadsheets as a Paradigm of Type II Technology Applications in Mathematics Teacher Education

    Science.gov (United States)

    Abramovich, Sergei

    2016-01-01

    The paper presents the use of spreadsheets integrated with digital tools capable of symbolic computations and graphic constructions in a master's level capstone course for secondary mathematics teachers. Such use of spreadsheets is congruent with the Type II technology applications framework aimed at the development of conceptual knowledge in the…

  5. SRTC Spreadsheet to Determine Relative Percent Difference (RPD) for Duplicate Waste Assay Results and to Perform the RPD Acceptance Test

    International Nuclear Information System (INIS)

    Casella, V.R.

    2002-01-01

    This report documents the calculations and logic used for the Microsoft® Excel spreadsheet that is used at the 773-A Solid Waste Assay Facility for evaluating duplicate analyses, and validates that the spreadsheet performs these functions correctly.
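
    For reference, the relative percent difference of duplicate results a and b is RPD = |a - b| / ((a + b)/2) × 100. A short sketch; the 20% acceptance limit is an assumed placeholder, not necessarily the facility's criterion:

        def rpd(a, b):
            """Relative percent difference of duplicate results."""
            return abs(a - b) / ((a + b) / 2.0) * 100.0

        def rpd_acceptance(a, b, limit_pct=20.0):  # assumed limit
            return "pass" if rpd(a, b) <= limit_pct else "fail"

        print(f"{rpd(10.2, 9.5):.1f} %")   # ~7.1 %
        print(rpd_acceptance(10.2, 9.5))   # pass under the assumed limit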

  6. Application of an automated natural language processing (NLP) workflow to enable federated search of external biomedical content in drug discovery and development.

    Science.gov (United States)

    McEntire, Robin; Szalkowski, Debbie; Butler, James; Kuo, Michelle S; Chang, Meiping; Chang, Man; Freeman, Darren; McQuay, Sarah; Patel, Jagruti; McGlashen, Michael; Cornell, Wendy D; Xu, Jinghai James

    2016-05-01

    External content sources such as MEDLINE®, National Institutes of Health (NIH) grants and conference websites provide access to the latest breaking biomedical information, which can inform pharmaceutical and biotechnology company pipeline decisions. The value of the sites for industry, however, is limited by the use of the public internet, the limited synonyms, the rarity of batch searching capability and the disconnected nature of the sites. Fortunately, many sites now offer their content for download and we have developed an automated internal workflow that uses text mining and tailored ontologies for programmatic search and knowledge extraction. We believe such an efficient and secure approach provides a competitive advantage to companies needing access to the latest information for a range of use cases and complements manually curated commercial sources. Copyright © 2016. Published by Elsevier Ltd.

  7. Cutting solid figures by plane - analytical solution and spreadsheet implementation

    Science.gov (United States)

    Benacka, Jan

    2012-07-01

    In some secondary mathematics curricula, there is a topic called Stereometry that deals with investigating the position and finding the intersection, angle, and distance of lines and planes defined within a prism or pyramid. A coordinate system is not used. The metric tasks are solved using Pythagoras' theorem, trigonometric functions, and the sine and cosine rules. The basic problem is to find the section of the figure by a plane that is defined by three points related to the figure. In this article, a formula is derived that gives the positions of the intersection points of such a plane and the figure edges, that is, the vertices of the section polygon. Spreadsheet implementations of the formula for cuboids and right rectangular pyramids are presented. The user can check his/her graphical solution, or proceed if he/she is not able to complete the section.
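
    The core of such a formula is the intersection of an edge, taken as a segment AB, with the plane through three points P0, P1, P2: solving n·(A + t(B - A) - P0) = 0 for the parameter t and keeping 0 <= t <= 1. A coordinate-based Python sketch (the article itself works without coordinates):

        import numpy as np

        def edge_plane_intersection(a, b, p0, p1, p2):
            """Vertex of the section polygon on edge a-b, or None."""
            a, b, p0, p1, p2 = map(np.asarray, (a, b, p0, p1, p2))
            n = np.cross(p1 - p0, p2 - p0)   # plane normal
            denom = n.dot(b - a)
            if abs(denom) < 1e-12:           # edge parallel to the plane
                return None
            t = n.dot(p0 - a) / denom
            return a + t * (b - a) if 0.0 <= t <= 1.0 else None

        # Unit-cube edge (0,0,0)-(0,0,1) against the plane through
        # (0.5,0,0), (0,0.5,0), (0,0,0.5):
        print(edge_plane_intersection((0, 0, 0), (0, 0, 1),
                                      (0.5, 0, 0), (0, 0.5, 0), (0, 0, 0.5)))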

  8. (abstract) Simple Spreadsheet Thermal Models for Cryogenic Applications

    Science.gov (United States)

    Nash, A. E.

    1994-01-01

    Self consistent circuit analog thermal models, that can be run in commercial spreadsheet programs on personal computers, have been created to calculate the cooldown and steady state performance of cryogen cooled Dewars. The models include temperature dependent conduction and radiation effects. The outputs of the models provide temperature distribution and Dewar performance information. These models have been used to analyze the Cryogenic Telescope Test Facility (CTTF). The facility will be on line in early 1995 for its first user, the Infrared Telescope Technology Testbed (ITTT), for the Space Infrared Telescope Facility (SIRTF) at JPL. The model algorithm as well as a comparison of the model predictions and actual performance of this facility will be presented.

  9. Spreadsheet-based program for alignment of overlapping DNA sequences.

    Science.gov (United States)

    Anbazhagan, R; Gabrielson, E

    1999-06-01

    Molecular biology laboratories frequently face the challenge of aligning small overlapping DNA sequences derived from a long DNA segment. Here, we present a short program that can be used to adapt Excel spreadsheets as a tool for aligning DNA sequences, regardless of their orientation. The program runs on any Windows or Macintosh operating system computer with Excel 97 or Excel 98. The program is available for use as an Excel file, which can be downloaded from the BioTechniques Web site. Upon execution, the program opens a specially designed customized workbook and is capable of identifying overlapping regions between two sequence fragments and displaying the sequence alignment. It also performs a number of specialized functions such as recognition of restriction enzyme cutting sites and CpG island mapping without costly specialized software.

  10. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation...... is applied to nearly all types of measurements today....

  11. The FAO/IAEA interactive spreadsheet for design and operation of insect mass rearing facilities

    International Nuclear Information System (INIS)

    Caceres, Carlos; Rendon, Pedro

    2006-01-01

    An electronic spreadsheet is described which helps users to design, equip and operate facilities for the mass rearing of insects for use in insect pest control programmes integrating the sterile insect technique. The spreadsheet was designed based on experience accumulated in the mass rearing of the Mediterranean fruit fly, Ceratitis capitata (Wiedemann), using genetic sexing strains based on a temperature sensitive lethal (tsl) mutation. The spreadsheet takes into account the biological, production, and quality control parameters of the species to be mass reared, as well as the diets and equipment required. All this information is incorporated into the spreadsheet for user-friendly calculation of the main components involved in facility design and operation. Outputs of the spreadsheet include size of the different rearing areas, rearing equipment, volumes of diet ingredients, other consumables, as well as personnel requirements. By adding cost factors to these components, the spreadsheet can estimate the costs of facility construction, equipment, and operation. All the output parameters can be easily generated by simply entering the target number of sterile insects required per week. For other insect species, the biological and production characteristics need to be defined and inputted accordingly to obtain outputs relevant to these species. This spreadsheet, available under http://www-naweb.iaea.org/nafa/ipc/index.html, is a powerful tool for project and facility managers as it can be used to estimate facility cost, production cost, and production projections under different rearing efficiency scenarios. (author)

  12. The FAO/IAEA interactive spreadsheet for design and operation of insect mass rearing facilities

    Energy Technology Data Exchange (ETDEWEB)

    Caceres, Carlos, E-mail: carlos.e.caceres@aphis.usda.co [International Atomic Energy Agency (IAEA), Seibersdorf (Austria). Agency's Labs. Programme of Nuclear Techniques in Food and Agriculture; Rendon, Pedro [U.S. Department of Agriculture (USDA/APHIS/CPHST), Guatemala City (Guatemala). Animal and Plant Health Inspection. Center for Plant Health Science and Technology

    2006-07-01

    An electronic spreadsheet is described which helps users to design, equip and operate facilities for the mass rearing of insects for use in insect pest control programmes integrating the sterile insect technique. The spreadsheet was designed based on experience accumulated in the mass rearing of the Mediterranean fruit fly, Ceratitis capitata (Wiedemann), using genetic sexing strains based on a temperature sensitive lethal (tsl) mutation. The spreadsheet takes into account the biological, production, and quality control parameters of the species to be mass reared, as well as the diets and equipment required. All this information is incorporated into the spreadsheet for user-friendly calculation of the main components involved in facility design and operation. Outputs of the spreadsheet include size of the different rearing areas, rearing equipment, volumes of diet ingredients, other consumables, as well as personnel requirements. By adding cost factors to these components, the spreadsheet can estimate the costs of facility construction, equipment, and operation. All the output parameters can be easily generated by simply entering the target number of sterile insects required per week. For other insect species, the biological and production characteristics need to be defined and inputted accordingly to obtain outputs relevant to these species. This spreadsheet, available under http://www-naweb.iaea.org/nafa/ipc/index.html, is a powerful tool for project and facility managers as it can be used to estimate facility cost, production cost, and production projections under different rearing efficiency scenarios. (author)

  13. The governance of risk arising from the use of spreadsheets in organisations

    Directory of Open Access Journals (Sweden)

    Tessa Minter

    2014-06-01

    The key to maximising the effectiveness of spreadsheet models for critical decision making is appropriate risk governance. Those responsible for governance need, at a macro level, to identify the specific spreadsheet risks, determine the reasons for such exposures and establish where and when risk exposures occur, from point of initiation to usage and storage. It is essential to identify which parties could create the exposure, taking cognisance of the entire supply chain of the organisation. If management's risk strategy is to control the risks, then the question becomes how these risks can be prevented and/or detected and corrected. This paper attempts to address each of these critical issues and to offer guidance on the governance of spreadsheet risk. The paper identifies the risk exposures and sets out the responsibilities of directors in relation to spreadsheets and the spreadsheet cycle. Spreadsheet risk exposure can be managed by setting the control environment, undertaking risk assessment, providing the requisite information and communicating with internal and external parties, as well as implementing spreadsheet lifecycle application controls and monitoring activities.

  14. Volatility Discovery

    DEFF Research Database (Denmark)

    Dias, Gustavo Fruet; Scherrer, Cristina; Papailias, Fotis

    The price discovery literature investigates how homogeneous securities traded on different markets incorporate information into prices. We take this literature one step further and investigate how these markets contribute to stochastic volatility (volatility discovery). We formally show...... that the realized measures from homogeneous securities share a fractional stochastic trend, which is a combination of the price and volatility discovery measures. Furthermore, we show that volatility discovery is associated with the way that market participants process information arrival (market sensitivity......). Finally, we compute volatility discovery for 30 actively traded stocks in the U.S. and report that NYSE and Arca dominate Nasdaq....

  15. A system for automated quantification of cutaneous electrogastrograms

    DEFF Research Database (Denmark)

    Paskaranandavadivel, Niranchan; Bull, Simon Henry; Parsell, Doug

    2015-01-01

    Clinical evaluation of cutaneous electrogastrograms (EGG) is important for understanding the role of slow waves in functional motility disorders and may be a useful diagnostic aid. An automated software package has been developed which computes metrics of interest from EGG and from slow wave... and amplitude were compared to automated estimates. The methods were packaged into a software executable which processes the data and presents the results in intuitive graphical and spreadsheet formats. Automated EGG analysis allows for clinical translation of bio-electrical analysis for potential...

  16. Integrating Computer Spreadsheet Modeling into a Microeconomics Curriculum: Principles to Managerial.

    Science.gov (United States)

    Clark, Joy L.; Hegji, Charles E.

    1997-01-01

    Notes that using spreadsheets to teach microeconomics principles enables learning by doing in the exploration of basic concepts. Introduction of increasingly complex topics leads to exploration of theory and managerial decision making. (SK)

  17. Spreadsheet Error Detection: an Empirical Examination in the Context of Greece

    Directory of Open Access Journals (Sweden)

    Dimitrios Maditinos

    2012-06-01

    The personal computer era made advanced programming tasks available to end users. Spreadsheet models are one of the most widely used applications that can produce valuable results with minimal training and effort. However, errors contained in most spreadsheets may be catastrophic and difficult to detect. This study attempts to investigate the influence of experience and spreadsheet presentation on users' error-finding performance. To reach the target of the study, 216 business and finance students participated in a task of finding errors in a simple free cash flow model. The findings of the study reveal that the presentation of the spreadsheet is of major importance as far as error-finding performance is concerned, while experience does not seem to affect students' performance. Further research proposals and limitations of the study are also discussed.

  18. Verification of monitor unit calculations for the Eclipse Treatment Planning System by an in-house developed spreadsheet

    Directory of Open Access Journals (Sweden)

    Hemalatha Athiyaman

    2018-04-01

    Conclusion: The spreadsheet was tested for most of the routine treatment sites and geometries. It has good agreement with the Eclipse TPS version 13.8 for homogeneous treatment sites such as head and neck and carcinoma cervix.

  19. A simple model of hysteresis behavior using spreadsheet analysis

    International Nuclear Information System (INIS)

    Ehrmann, A; Blachowicz, T

    2015-01-01

    Hysteresis loops occur in many scientific and technical problems, especially as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves of materials measured by tensile tests including thermal effects, liquid-solid phase transitions, in cell biology or economics. While several mathematical models exist which aim to calculate hysteresis energies and other parameters, here we offer a simple model for a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, which is based on basic spreadsheet analysis plus an easy macro code, can be used by students to understand how these systems work and how the parameters influence the reactions of the system to an external field. Importantly, in the step-by-step mode, each change of the system state, compared to the last step, becomes visible. The simple program can be developed further by several changes and additions, enabling the building of a tool which is capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.
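
    As a hedged illustration of such a step-by-step hysteretic system (the paper's own update rule is not reproduced here), the sketch below iterates the mean-field relation m = tanh(h + J·m); for J > 1, the state followed from the previous step differs between the up- and down-sweep, tracing a loop:

    ```python
    import numpy as np

    def hysteresis_loop(h_max=2.0, J=2.0, steps=400, relax=20):
        """Toy mean-field model m = tanh(h + J*m): relaxing m from its
        previous value follows the nearest stable fixed point, so the up-
        and down-sweeps trace different branches (a hysteresis loop)."""
        sweep = np.concatenate([np.linspace(-h_max, h_max, steps),
                                np.linspace(h_max, -h_max, steps)])
        m, branch = -1.0, []
        for h in sweep:
            for _ in range(relax):        # step-by-step relaxation
                m = np.tanh(h + J * m)
            branch.append(m)
        return sweep, np.array(branch)

    h, m = hysteresis_loop()
    # Coercive field: where the up-sweep branch first jumps above zero.
    print("coercive field ~", round(float(h[np.argmax(m[:400] > 0)]), 3))
    ```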

  20. Numeric calculation of celestial bodies with spreadsheet analysis

    Science.gov (United States)

    Koch, Alexander

    2016-04-01

    The motion of the planets and moons in our solar system can easily be calculated for any time using Kepler's laws of planetary motion. The Kepler laws are a special case of Newton's law of gravitation, especially if you consider more than two celestial bodies. It is therefore more fundamental to calculate the motion using the law of gravitation. The problem is that, with the law of gravitation, the state of motion cannot be obtained in a single step of calculation: the motion has to be calculated numerically over many time intervals. For this reason, spreadsheet analysis is helpful for students. Skills in programs like Excel, Calc or Gnumeric are important in professional life and can easily be learnt by students. These programs can help to calculate the complex motions over many intervals; the more intervals are used, the more exact the calculated orbits are. The students first get a quick course in Excel. After that, they calculate, with instructions, the 2-D coordinates of the orbits of the Moon and Mars. Step by step, the students code the formulae for calculating physical parameters like coordinates, force, acceleration and velocity. The project is limited to 4 weeks or 8 lessons, so the calculation only includes the motion of one body around the central mass, like the Earth or Sun. The three-body problem can only be discussed briefly at the end of the project.
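
    A minimal Python transcription of the spreadsheet scheme (one loop iteration per spreadsheet row; the position update uses the freshly updated velocity, the Euler-Cromer variant students often arrive at in practice):

    ```python
    import numpy as np

    GM = 4 * np.pi**2          # Sun's gravitational parameter, AU^3 / yr^2

    def euler_orbit(r0, v0, dt=0.0005, n_steps=2000):
        """One step per row: a = -GM r / |r|^3, then v += a dt, r += v dt."""
        r, v = np.array(r0, float), np.array(v0, float)
        path = [r.copy()]
        for _ in range(n_steps):
            a = -GM * r / np.linalg.norm(r) ** 3
            v = v + a * dt
            r = r + v * dt
            path.append(r.copy())
        return np.array(path)

    # Earth-like orbit: 1 AU, circular speed 2*pi AU/yr; 2000 * 0.0005 = 1 yr.
    path = euler_orbit([1.0, 0.0], [0.0, 2 * np.pi])
    print("radius drift over one year:", np.linalg.norm(path[-1]) - 1.0)
    ```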

  1. A simple model of hysteresis behavior using spreadsheet analysis

    Science.gov (United States)

    Ehrmann, A.; Blachowicz, T.

    2015-01-01

    Hysteresis loops occur in many scientific and technical problems, especially as the field-dependent magnetization of ferromagnetic materials, but also as stress-strain curves of materials measured by tensile tests including thermal effects, liquid-solid phase transitions, in cell biology or economics. While several mathematical models exist which aim to calculate hysteresis energies and other parameters, here we offer a simple model for a general hysteretic system, showing different hysteresis loops depending on the defined parameters. The calculation, which is based on basic spreadsheet analysis plus an easy macro code, can be used by students to understand how these systems work and how the parameters influence the reactions of the system to an external field. Importantly, in the step-by-step mode, each change of the system state, compared to the last step, becomes visible. The simple program can be developed further by several changes and additions, enabling the building of a tool which is capable of answering real physical questions in the broad field of magnetism as well as in other scientific areas in which similar hysteresis loops occur.

  2. Illustrating Probability through Roulette: A Spreadsheet Simulation Model

    Directory of Open Access Journals (Sweden)

    Kala Chand Seal

    2005-11-01

    Teaching probability can be challenging because the mathematical formulas often are too abstract and complex for students to fully grasp the underlying meaning and effect of the concepts. Games can provide a way to address this issue. For example, the game of roulette can be an exciting application for teaching probability concepts. In this paper, we implement a model of roulette in a spreadsheet that can simulate the outcomes of various betting strategies. The simulations can be analyzed to gain better insights into the corresponding probability structures. We use the model to simulate a particular betting strategy known as the bet-doubling, or Martingale, strategy. This strategy is quite popular and is often erroneously perceived as a winning strategy even though probability analysis shows that such a perception is incorrect. The simulation allows us to present the true implications of such a strategy for a player with a limited betting budget and relate the results to the underlying theoretical probability structure. The overall validation of the model, its use for teaching, and its application to the analysis of other types of betting strategies are discussed.
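
    A compact simulation of the bet-doubling strategy, analogous to the spreadsheet model (the budget, base bet and American-wheel odds below are illustrative parameters):

    ```python
    import random

    def martingale(budget=100, base_bet=1, max_rounds=1000, p_win=18/38):
        """Bet-doubling on an even-money roulette bet (American wheel:
        18 winning pockets out of 38). Returns the final bankroll."""
        bankroll, bet = budget, base_bet
        for _ in range(max_rounds):
            if bet > bankroll:            # cannot cover the doubled stake
                break
            if random.random() < p_win:
                bankroll += bet
                bet = base_bet            # win: back to the base bet
            else:
                bankroll -= bet
                bet *= 2                  # loss: double the stake
        return bankroll

    random.seed(1)
    trials = [martingale() for _ in range(10_000)]
    print("mean final bankroll:", sum(trials) / len(trials))
    print("fraction ending below the starting budget:",
          sum(t < 100 for t in trials) / len(trials))
    ```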

  3. FURTHER CONSIDERATIONS ON SPREADSHEET-BASED AUTOMATIC TREND LINES

    Directory of Open Access Journals (Sweden)

    DANIEL HOMOCIANU

    2015-12-01

    Most of today's business applications that work with data sets allow exports to the spreadsheet format. This reflects both the familiarity of common business users with such products and the possibility of coupling exported data with the many models, functions and data-processing and representation facilities of a spreadsheet, yielding something dynamic and far more useful than a simple static report. The purpose of Business Intelligence is to identify clusters, profiles, association rules, decision trees and many other patterns or even behaviours, but also to generate alerts for exceptions, determine trends and make predictions about the future based on historical data. In this context, the paper shows some practical results obtained after testing both the automatic creation of scatter charts and trend lines corresponding to the user's preferences and the automatic suggestion of the most appropriate trend for the tested data, based mostly on a statistical measure of how close the data are to the regression function.
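
    The trend-suggestion idea can be sketched as follows: fit the usual spreadsheet trend families by least squares on suitably transformed data and rank them by R² on the original scale (a simplification; Excel reports R² per trend line on the transformed fit):

    ```python
    import numpy as np

    def r_squared(y, y_fit):
        ss_res = np.sum((y - y_fit) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1 - ss_res / ss_tot

    def best_trend(x, y):
        """Fit linear, exponential, power and logarithmic trends and
        rank them by R^2 on the original data (requires x, y > 0)."""
        fits = {}
        b1, b0 = np.polyfit(x, y, 1)
        fits["linear"] = b0 + b1 * x
        b1, b0 = np.polyfit(x, np.log(y), 1)           # y = A * exp(b x)
        fits["exponential"] = np.exp(b0) * np.exp(b1 * x)
        b1, b0 = np.polyfit(np.log(x), np.log(y), 1)   # y = A * x^b
        fits["power"] = np.exp(b0) * x ** b1
        b1, b0 = np.polyfit(np.log(x), y, 1)           # y = b0 + b1 ln x
        fits["logarithmic"] = b0 + b1 * np.log(x)
        return sorted(((r_squared(y, f), name) for name, f in fits.items()),
                      reverse=True)

    x = np.arange(1, 21, dtype=float)
    y = 3.0 * x ** 1.5 * np.random.default_rng(0).lognormal(0, 0.05, 20)
    for r2, name in best_trend(x, y):
        print(f"{name:12s} R^2 = {r2:.4f}")   # power should rank first
    ```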

  4. Description of the Material Balance Model and Spreadsheet for Salt Dissolution

    International Nuclear Information System (INIS)

    Wiersma, B.J.

    1994-01-01

    The model employed to estimate the amount of inhibitors necessary for bearing water and dissolution water during the salt dissolution process is described. The model was implemented in a spreadsheet, which allowed many different case studies to be performed. This memo describes the assumptions and equations used in the model and documents the input and output cells of the spreadsheet. Two case studies are shown as examples of how the model may be employed.

  5. A Novel Approach to Formulae Production and Overconfidence Measurement to Reduce Risk in Spreadsheet Modelling

    OpenAIRE

    Thorne, Simon; Ball, David; Lawson, Zoe

    2008-01-01

    Research on formulae production in spreadsheets has established the practice as high risk yet unrecognised as such by industry. There are numerous software applications that are designed to audit formulae and find errors. However these are all post creation, designed to catch errors before the spreadsheet is deployed. As a general conclusion from EuSpRIG 2003 conference it was decided that the time has come to attempt novel solutions based on an understanding of human factors. Hence in this p...

  6. A Typical Model Audit Approach: Spreadsheet Audit Methodologies in the City of London

    OpenAIRE

    Croll, Grenville J.

    2007-01-01

    Spreadsheet audit and review procedures are an essential part of almost all City of London financial transactions. Structured processes are used to discover errors in large financial spreadsheets underpinning major transactions of all types. Serious errors are routinely found and are fed back to model development teams generally under conditions of extreme time urgency. Corrected models form the essence of the completed transaction and firms undertaking model audit and review expose themselve...

  7. Spreadsheet software to assess locomotor disability to quantify permanent physical impairment

    Directory of Open Access Journals (Sweden)

    Sunderraj Ellur

    2012-01-01

    Context: Assessment of physical disability is an important duty of a plastic surgeon, especially for those of us in institutional practice. Aim: The Gazette of India notification gives guidelines regarding the assessment of disability. However, the calculations as per the guidelines are time-consuming. In this article, a spreadsheet program based on the notification is presented. The aim of this article is to design a spreadsheet program which is simple, reproducible, user-friendly, less time-consuming and accurate. Materials and Methods: The spreadsheet program was designed using Microsoft Excel, on the basis of the guidelines in the Gazette of India Notification regarding the assessment of locomotor disability to quantify permanent physical impairment. Two representative examples are presented to help understand the application of this program. Results: Two spreadsheet programs, one for the upper limb and another for the lower limb, are presented. The representative examples show that the program's results match those of the traditional method of calculation. Conclusion: A simple spreadsheet program can be designed to assess disability as per the Gazette of India Notification. This program is easy to use and accurate.

  8. Beyond Discovery

    DEFF Research Database (Denmark)

    Korsgaard, Steffen; Sassmannshausen, Sean Patrick

    2017-01-01

    In this chapter we explore four alternatives to the dominant discovery view of entrepreneurship: the development view, the construction view, the evolutionary view, and the Neo-Austrian view. We outline the main critique points of the discovery view presented in these four alternatives, as well...

  9. Chemical Discovery

    Science.gov (United States)

    Brown, Herbert C.

    1974-01-01

    The role of discovery in the advance of the science of chemistry and the factors that are currently operating to handicap that function are considered. Examples are drawn from the author's work with boranes. The thesis that exploratory research and discovery should be encouraged is stressed. (DT)

  10. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    Science.gov (United States)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. The study follows a research and development design. The main steps are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing their feasibility. Technical feasibility covers the ability of hardware and operating systems to respond to the accounting application, and the application's simplicity and ease of use. Operational feasibility covers the ability of users to use the accounting application, the ability of the application to produce information, and the control features of the application. The instrument used to assess technical and operational feasibility is an expert-perception questionnaire using a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed as percentages, comparing the number of answers for each item with the ideal number of answers for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical aspect (87.50%) and the operational aspect (84.17%).

  11. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    Science.gov (United States)

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not

  12. Higgs Discovery

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2013-01-01

    I discuss the impact of the discovery of a Higgs-like state on composite dynamics, starting by critically examining the reasons in favour of either an elementary or composite nature of this state. Accepting the standard model interpretation, I re-address the standard model vacuum stability within... has been challenged by the discovery of a not-so-heavy Higgs-like state. I will therefore review the recent discovery \cite{Foadi:2012bb} that the standard model top-induced radiative corrections naturally reduce the intrinsic non-perturbative mass of the composite Higgs state towards the desired... via first-principle lattice simulations with encouraging results. The new findings show that the recent naive claims made about new strong dynamics at the electroweak scale being disfavoured by the discovery of a not-so-heavy composite Higgs are unwarranted. I will then introduce the more speculative...

  13. Comparison of two spreadsheets for calculation of radiation exposure following hyperthyroidism treatment with iodine-131

    Energy Technology Data Exchange (ETDEWEB)

    Vrigneaud, J.M. [CHU Bichat, nuclear medicine department, 75 - Paris (France); Carlier, T. [CHU Hotel Dieu, nuclear medicine department, 44 - Nantes (France)

    2006-07-01

    Comparison of the two spreadsheets did not show any significant differences provided that proper biological models were used to follow iodine-131 clearance. This means that even simple assumptions can be used to give reasonable radiation safety recommendations. Nevertheless, a complete understanding of the formalism is required to use these spreadsheets correctly. Initial parameters must be chosen carefully and validation of the computed results must be done. Published guidelines are found to be in accordance with those issued from these spreadsheets. Furthermore, both programs make it possible to collect biological data from each patient and use it as input to calculate individually tailored radiation safety advice. Also, the measured exposure rate may be entered into the spreadsheets to calculate patient-specific close-contact delays required to reduce the dose to specified limits. These spreadsheets may be used to compute restriction times for any given radiopharmaceutical, provided that input parameters are chosen correctly. They can be of great help to physicians in providing patients with guidance on how to maintain doses to other individuals as low as reasonably achievable. (authors)

  14. Comparison of two spreadsheets for calculation of radiation exposure following hyperthyroidism treatment with iodine-131

    International Nuclear Information System (INIS)

    Vrigneaud, J.M.; Carlier, T.

    2006-01-01

    Comparison of the two spreadsheets did not show any significant differences provided that proper biological models were used to follow iodine-131 clearance. This means that even simple assumptions can be used to give reasonable radiation safety recommendations. Nevertheless, a complete understanding of the formalism is required to use these spreadsheets correctly. Initial parameters must be chosen carefully and validation of the computed results must be done. Published guidelines are found to be in accordance with those issued from these spreadsheets. Furthermore, both programs make it possible to collect biological data from each patient and use it as input to calculate individually tailored radiation safety advice. Also, the measured exposure rate may be entered into the spreadsheets to calculate patient-specific close-contact delays required to reduce the dose to specified limits. These spreadsheets may be used to compute restriction times for any given radiopharmaceutical, provided that input parameters are chosen correctly. They can be of great help to physicians in providing patients with guidance on how to maintain doses to other individuals as low as reasonably achievable. (authors)

  15. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedu...

  16. Automated Discovery of Simulation Between Programs

    Science.gov (United States)

    2014-10-18

    These relations enable the refinement step of SimAbs. We implemented SimAbs and AE-VAL on top of the UFO framework [1, 15] and the Z3 SMT solver [8], respectively, and evaluated SimAbs on the Software Verification Competition (SVCOMP'14) benchmarks...

  17. The automated discovery of hybrid processes

    NARCIS (Netherlands)

    Maggi, F.M.; Slaats, T.; Reijers, H.A.

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedural

  18. Spreadsheet design and validation for characteristic limits determination in gross alpha and beta measurement

    International Nuclear Information System (INIS)

    Prado, Rodrigo G.P. do; Dalmazio, Ilza

    2013-01-01

    The identification and detection of ionizing radiation are essential requisites of radiation protection. Gross alpha and beta measurements are widely applied as a screening method in radiological characterization, environmental monitoring and industrial applications. As in any other analytical technique, test performance depends on the quality of instrumental measurements and the reliability of calculations. Characteristic limits refer to three specific statistics, namely the decision threshold, the detection limit and the limits of the confidence interval, which are fundamental to ensuring the quality of determinations. This work describes a way to calculate characteristic limits for measurements of gross alpha and beta activity using spreadsheets. The approach used to determine the decision threshold, the detection limit and the limits of the confidence interval, as well as the mathematical expressions for the measurands and their uncertainty, followed standard guidelines. A succinct overview of this approach and examples are presented, and the spreadsheets were validated using specific software. Furthermore, these spreadsheets could be used as a tool to instruct beginner users of methods for ionizing radiation measurements. (author)
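
    For a single gross measurement against a separately counted background, the characteristic limits reduce to well-known closed forms (an ISO 11929-style simplification with equal error probabilities; a sketch, not the authors' spreadsheet):

    ```python
    from math import sqrt

    def characteristic_limits(n0, t0, tg, k=1.645):
        """Decision threshold and detection limit (count-rate domain) for a
        gross count of duration tg against a background of n0 counts in t0,
        with k(1-alpha) = k(1-beta) = k. The detection limit follows from
        solving y# = y* + k*u(y#) for Poisson counting statistics."""
        r0 = n0 / t0                            # background count rate
        u0 = sqrt(r0 * (1 / tg + 1 / t0))       # std. dev. of net rate at y = 0
        decision_threshold = k * u0
        detection_limit = 2 * decision_threshold + k**2 / tg
        return decision_threshold, detection_limit

    # Example: 1200 background counts in 3600 s; 3600 s sample measurement.
    a_star, a_hash = characteristic_limits(1200, 3600, 3600)
    print(f"decision threshold = {a_star:.4f} cps, "
          f"detection limit = {a_hash:.4f} cps")
    ```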

  19. [Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].

    Science.gov (United States)

    Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta

    2014-01-01

    Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a single comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow combining estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons, which can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a user-friendly Excel spreadsheet for indirect and mixed comparisons, aimed at clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology, extending the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
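
    Bucher's adjusted indirect comparison itself is a two-line calculation (subtract log effects through the common comparator, add variances), which is why it suits a spreadsheet; a sketch with illustrative numbers (the mixed estimate would then inverse-variance-pool this with any direct estimate):

    ```python
    from math import exp, log, sqrt

    def bucher_indirect(log_ac, se_ac, log_bc, se_bc):
        """A vs B through common comparator C:
        log(AB) = log(AC) - log(BC), var(AB) = var(AC) + var(BC)."""
        log_ab = log_ac - log_bc
        se_ab = sqrt(se_ac**2 + se_bc**2)
        ci = (exp(log_ab - 1.96 * se_ab), exp(log_ab + 1.96 * se_ab))
        return exp(log_ab), ci

    # Hypothetical: OR A vs C = 0.70 (SE of log OR 0.12); B vs C = 0.85 (SE 0.15)
    or_ab, (lo, hi) = bucher_indirect(log(0.70), 0.12, log(0.85), 0.15)
    print(f"A vs B: OR = {or_ab:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```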

  20. A spreadsheet-based microcomputer application for determining cost-effectiveness of commercial lighting retrofit opportunities

    International Nuclear Information System (INIS)

    Spain, T.K.

    1992-01-01

    Lighting accounts for 20-25% of electricity use in the United States. With energy engineers estimating potential reductions of 50-70%, lighting is a promising area for cost-effective energy conservation projects in commercial buildings. With an extensive array of alternatives available to replace or modify existing lighting systems, simple but effective calculation tools are needed to help energy auditors evaluate lighting retrofits. This paper describes a spreadsheet-based microcomputer application for determining the cost-effectiveness of commercial lighting retrofits. Developed to support walk-through energy audits conducted by the Industrial Energy Advisory Service (IdEAS), the spreadsheet provides essential comparative data for evaluating the payback of alternatives. The impact of alternatives on environmental emissions is calculated to help communicate external costs and sell the project, if appropriate. The methodology and calculations are fully documented to allow the user to duplicate the spreadsheet and modify it as needed.
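
    The core cost-effectiveness arithmetic such a spreadsheet performs is a simple payback calculation; the sketch below uses illustrative figures and an assumed emission factor, not values from the paper:

    ```python
    def lighting_retrofit_payback(install_cost, old_kw, new_kw, hours_per_year,
                                  rate_per_kwh, co2_kg_per_kwh=0.7):
        """Simple payback and avoided emissions for a lighting retrofit;
        the CO2 emission factor is an illustrative placeholder."""
        kwh_saved = (old_kw - new_kw) * hours_per_year
        annual_savings = kwh_saved * rate_per_kwh
        return {"kWh saved/yr": kwh_saved,
                "$ saved/yr": annual_savings,
                "simple payback (yr)": install_cost / annual_savings,
                "CO2 avoided (kg/yr)": kwh_saved * co2_kg_per_kwh}

    # Example: replace 10 kW of old fixtures with 6 kW, 3000 h/yr, $0.08/kWh.
    print(lighting_retrofit_payback(8000, 10.0, 6.0, 3000, 0.08))
    ```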

  1. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus the deviation of subsurface temperatures from an expected conduction-dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater-surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux-LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., streambed) and deeper thermal regimes.
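
    The layered Shan and Bodvarsson solution is involved; for flavour, the sketch below uses the classical single-layer steady solution it generalizes (Bredehoeft and Papadopulos, 1965) and inverts it for the Darcy flux by grid search. Parameter values and function names are illustrative, not Flux-LM's.

    ```python
    import numpy as np

    def bp_profile(z, L, T_top, T_bot, q, K=2.0, rho_c_w=4.18e6):
        """Steady 1-D conduction-advection profile in one homogeneous layer.
        q [m/s] positive downward, K [W/m/K] bulk thermal conductivity,
        rho_c_w [J/m^3/K] volumetric heat capacity of water."""
        pe = rho_c_w * q * L / K                  # thermal Peclet number
        if abs(pe) < 1e-12:                       # pure conduction: linear
            return T_top + (T_bot - T_top) * z / L
        return T_top + (T_bot - T_top) * np.expm1(pe * z / L) / np.expm1(pe)

    def estimate_flux(z, T, L, T_top, T_bot,
                      q_grid=np.linspace(-1e-6, 1e-6, 20001)):
        """Grid-search inverse: pick the q whose profile best fits the data."""
        misfits = [np.sum((bp_profile(z, L, T_top, T_bot, q) - T) ** 2)
                   for q in q_grid]
        return q_grid[int(np.argmin(misfits))]

    z = np.linspace(0.1, 0.9, 9)                  # depths in a 1 m layer
    T_obs = (bp_profile(z, 1.0, 12.0, 9.0, 3e-7)
             + 0.02 * np.random.default_rng(1).standard_normal(9))
    print("recovered q [m/s]:", estimate_flux(z, T_obs, 1.0, 12.0, 9.0))
    ```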

  2. Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025

    Science.gov (United States)

    Banegas, J. M.; Orué, M. W.

    2016-07-01

    Several documents deal with software validation. Nevertheless, most are too complex to be applied to validating spreadsheets - surely the most widely used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to validate spreadsheets. It includes a systematic way to document requirements, operational aspects regarding validation, and a simple method to keep records of validation results and modification history. This method is currently being used in an accredited calibration laboratory, where it has proved to be practical and efficient.

  3. Towards tool support for spreadsheet-based domain-specific languages

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Schultz, Ulrik Pagh

    2015-01-01

    Spreadsheets are commonly used by non-programmers to store data in a structured form; this data can in some cases be considered to be a program in a domain-specific language (DSL). Unlike ordinary text-based domain-specific languages, there is however currently no formalism for expressing the syntax of such spreadsheet-based DSLs (SDSLs), and there is no tool support for automatically generating language infrastructure such as parsers and IDE support. In this paper we define a simple notion of two-dimensional grammars for SDSLs, and show how such grammars can be used for automatically...

  4. Spinning the Big Wheel on “The Price is Right”: A Spreadsheet Simulation Exercise

    Directory of Open Access Journals (Sweden)

    Keith A Willoughby

    2010-04-01

    A popular game played in each broadcast of the United States television game show “The Price is Right” has contestants spinning a large wheel comprising twenty different monetary values (in 5-cent increments from $0.05 to $1.00). A player wins by scoring closest to, without exceeding, $1.00. Players may accomplish this in one or a total of two spins. We develop a spreadsheet modeling exercise, useful in an introductory undergraduate Spreadsheet Analytics course, to simulate the spinning of the wheel and to determine optimal spinning strategies.
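
    A sketch of the simulation, simplified to a single player maximizing expected score without busting (the televised game also involves beating the later spinners); working in cents avoids floating-point issues:

    ```python
    import random

    WHEEL = list(range(5, 105, 5))      # cents: 5, 10, ..., 100

    def player_score(stop_threshold):
        """Spin once; spin again only below the threshold.
        Busting (> 100 cents) scores zero."""
        first = random.choice(WHEEL)
        if first >= stop_threshold:
            return first
        total = first + random.choice(WHEEL)
        return total if total <= 100 else 0

    random.seed(42)
    for threshold in (50, 65, 70, 75):
        mean = sum(player_score(threshold) for _ in range(100_000)) / 100_000
        print(f"stop at >= {threshold} cents: mean score {mean:.1f} cents")
    ```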

  5. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present the design and implementation of a real-time, software- and hardware-oriented house automation research project, capable of automating a house's electrical appliances and providing a security system that detects the presence of unexpected behaviour.

  6. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  7. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
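
    The spreadsheet construction, one RAND() cell per trial compared against p and then summed, translates directly into a minimal sketch:

    ```python
    import random

    def binomial_sample(n, p):
        """One binomial draw built the spreadsheet way: n random numbers
        compared against p, then summed."""
        return sum(random.random() < p for _ in range(n))

    random.seed(0)
    draws = [binomial_sample(20, 0.3) for _ in range(10_000)]
    print("simulated mean:", sum(draws) / len(draws), "(exact n*p = 6.0)")
    ```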

  8. Random Numbers Demonstrate the Frequency of Type I Errors: Three Spreadsheets for Class Instruction

    Science.gov (United States)

    Duffy, Sean

    2010-01-01

    This paper describes three spreadsheet exercises demonstrating the nature and frequency of type I errors using random number generation. The exercises are designed specifically to address issues related to testing multiple relations using correlation (Demonstration I), t tests varying in sample size (Demonstration II) and multiple comparisons…

  9. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    International Nuclear Information System (INIS)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  10. A Computer Simulation Using Spreadsheets for Learning Concept of Steady-State Equilibrium

    Science.gov (United States)

    Sharda, Vandana; Sastri, O. S. K. S.; Bhardwaj, Jyoti; Jha, Arbind K.

    2016-01-01

    In this paper, we present a simple spreadsheet-based simulation activity that can be performed by students at the undergraduate level. This simulation is implemented in the free open source software (FOSS) LibreOffice Calc, which is available for both the Windows and Linux platforms. This activity aims at building the probability distribution for the…

  11. Numerical Modelling with Spreadsheets as a Means to Promote STEM to High School Students

    Science.gov (United States)

    Benacka, Jan

    2016-01-01

    The article gives an account of an experiment in which sixty-eight high school students aged 16-19 developed spreadsheet applications that simulated fall and projectile motion in the air. The students applied the Euler method to solve the governing differential equations. The aim was to promote STEM to the students and motivate them to study…

  12. How Helpful Are Error Management and Counterfactual Thinking Instructions to Inexperienced Spreadsheet Users' Training Task Performance?

    Science.gov (United States)

    Caputi, Peter; Chan, Amy; Jayasuriya, Rohan

    2011-01-01

    This paper examined the impact of training strategies on the types of errors that novice users make when learning a commonly used spreadsheet application. Fifty participants were assigned to a counterfactual thinking training (CFT) strategy, an error management training strategy, or a combination of both strategies, and completed an easy task…

  13. Teaching Graphical Simulations of Fourier Series Expansion of Some Periodic Waves Using Spreadsheets

    Science.gov (United States)

    Singh, Iqbal; Kaur, Bikramjeet

    2018-01-01

    The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in school/colleges without the knowledge of any typical programming language. By using this, a student learns to approximate partial sum of the n terms of Fourier series for some periodic signals such as square wave, saw tooth wave,…

  14. Potential errors when fitting experience curves by means of spreadsheet software

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.|info:eu-repo/dai/nl/074628526; Alsema, E.A.|info:eu-repo/dai/nl/073416258

    2010-01-01

    Progress ratios (PRs) are widely used in forecasting development of many technologies; they are derived from historical data represented in experience curves. Fitting the double logarithmic graphs is easily done with spreadsheet software like Microsoft Excel, by adding a trend line to the graph.
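
    The underlying fit is ordinary least squares on the log-transformed data, which is also what Excel's power trend line computes; a sketch of deriving a progress ratio this way (data values are synthetic):

    ```python
    import numpy as np

    def progress_ratio(cum_production, unit_cost):
        """Fit log2(cost) = a + b*log2(cumulative production); the progress
        ratio is 2^b, the cost multiplier per doubling of production."""
        b, a = np.polyfit(np.log2(cum_production), np.log2(unit_cost), 1)
        return 2.0 ** b

    x = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)   # cumulative units
    cost = 100.0 * x ** np.log2(0.8)                      # exact PR = 0.8
    print("fitted progress ratio:", round(progress_ratio(x, cost), 3))
    ```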

  15. Spreadsheets as a Transparent Resource for Learning the Mathematics of Annuities

    Science.gov (United States)

    Pournara, Craig

    2009-01-01

    The ability of mathematics teachers to decompress mathematics and to move between representations are two key features of mathematical knowledge that is usable for teaching. This article reports on four pre-service secondary mathematics teachers learning the mathematics of annuities. In working with spreadsheets students began to make sense of…
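
    The central formulas behind such spreadsheet work are the closed forms for the present and future value of an ordinary annuity; a minimal sketch (the Excel counterparts are noted in the comments):

    ```python
    def annuity_pv(payment, i, n):
        """Present value of n end-of-period payments at periodic rate i;
        the spreadsheet equivalent is =PV(i, n, -payment)."""
        return payment * (1 - (1 + i) ** -n) / i

    def annuity_fv(payment, i, n):
        """Future value at the time of the last payment; =FV(i, n, -payment)."""
        return payment * ((1 + i) ** n - 1) / i

    # 24 monthly payments of 500 at 1% per month (12% nominal annual).
    print(round(annuity_pv(500, 0.01, 24), 2))   # ~10621.69
    print(round(annuity_fv(500, 0.01, 24), 2))   # ~13486.73
    ```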

  16. Fuels planning: science synthesis and integration; environmental consequences fact sheet 11: Smoke Impact Spreadsheet (SIS) model

    Science.gov (United States)

    Trent Wickman; Ann Acheson

    2005-01-01

    The Smoke Impact Spreadsheet (SIS) is a simple-to-use planning model for calculating particulate matter (PM) emissions and concentrations downwind of wildland fires. This fact sheet identifies the intended users and uses, required inputs, what the model does and does not do, and tells the user how to obtain the model.

  17. Application of magnetic sensors in automation control

    Energy Technology Data Exchange (ETDEWEB)

    Hou Chunhong [AMETEK Inc., Paoli, PA 19301 (United States); Qian Zhenghong, E-mail: zqian@hdu.edu.cn [Center For Integrated Spintronic Devices (CISD), Hangzhou Dianzi University, Hangzhou, ZJ 310018 (China)

    2011-01-01

    Controls in automation need speed and position feedback. The feedback device is often referred to as an encoder. Feedback technologies include mechanical, optical, magnetic, etc., and all of them advance with new inventions and discoveries. Magnetic sensing as a feedback technology offers certain advantages over other technologies such as optical ones. With new discoveries like GMR (Giant Magneto-Resistance) and TMR (Tunneling Magneto-Resistance) becoming feasible for commercialization, more and more applications will be using advanced magnetic sensors in automation. This paper offers a general review of encoders and of applications of magnetic sensors in automation control.

  18. The meaning of diagnostic test results: A spreadsheet for swift data analysis

    International Nuclear Information System (INIS)

    MacEneaney, Peter M.; Malone, Dermot E.

    2000-01-01

    AIMS: To design a spreadsheet program to: (a) rapidly analyse diagnostic test result data produced in local research or reported in the literature; (b) correct reported predictive values for disease prevalence in any population; (c) estimate the post-test probability of disease in individual patients. MATERIALS AND METHODS: Microsoft Excel™ was used. Section A: a contingency (2 x 2) table was incorporated into the spreadsheet. Formulae for standard calculations [sample size, disease prevalence, sensitivity and specificity with 95% confidence intervals, predictive values and likelihood ratios (LRs)] were linked to this table. The results change automatically when the data in the true or false negative and positive cells are changed. Section B: this estimates predictive values in any population, compensating for altered disease prevalence. Sections C-F: Bayes' theorem was incorporated to generate individual post-test probabilities. The spreadsheet generates 95% confidence intervals, LRs and a table and graph of conditional probabilities once the sensitivity and specificity of the test are entered. The latter shows the expected post-test probability of disease for any pre-test probability when a test of known sensitivity and specificity is positive or negative. RESULTS: This spreadsheet can be used on desktop and palmtop computers. The MS Excel™ version can be downloaded via the Internet from the URL ftp://radiography.com/pub/Rad-data99.xls CONCLUSION: A spreadsheet is useful for contingency table data analysis and assessment of the clinical meaning of diagnostic test results. MacEneaney, P.M., Malone, D.E. (2000)
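
    The post-test probability machinery described for Sections C-F follows directly from Bayes' theorem in odds-likelihood form; a sketch:

    ```python
    def post_test_probability(pre_test_p, sensitivity, specificity, positive=True):
        """Post-test odds = pre-test odds * LR, with
        LR+ = sens/(1-spec) and LR- = (1-sens)/spec."""
        lr = (sensitivity / (1 - specificity) if positive
              else (1 - sensitivity) / specificity)
        pre_odds = pre_test_p / (1 - pre_test_p)
        post_odds = pre_odds * lr
        return post_odds / (1 + post_odds)

    # Test with 90% sensitivity, 85% specificity; 30% pre-test probability.
    print(f"positive result: {post_test_probability(0.30, 0.90, 0.85):.2f}")   # ~0.72
    print(f"negative result: "
          f"{post_test_probability(0.30, 0.90, 0.85, positive=False):.2f}")    # ~0.05
    ```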

  19. Sparse Mbplsr for Metabolomics Data and Biomarker Discovery

    DEFF Research Database (Denmark)

    Karaman, İbrahim

    2014-01-01

    the link between high throughput metabolomics data generated on different analytical platforms, discover important metabolites deriving from the digestion processes in the gut, and automate metabolic pathway discovery from mass spectrometry. PLS (partial least squares) based chemometric methods were...

  20. A procedure to compute equilibrium concentrations in multicomponent systems by Gibbs energy minimization on spreadsheets

    International Nuclear Information System (INIS)

    Lima da Silva, Aline; Heck, Nestor Cesar

    2003-01-01

    Equilibrium concentrations are traditionally calculated with the help of equilibrium constant equations from selected reactions. This procedure, however, is only useful for simpler problems. Analysis of the equilibrium state in a multicomponent and multiphase system necessarily involves the solution of several simultaneous equations, and, as the number of system components grows, the required computation becomes more complex and tedious. A more direct and general method for solving the problem is the direct minimization of the Gibbs energy function. The solution of the nonlinear problem consists in minimizing the objective function (the Gibbs energy of the system) subject to the constraints of the elemental mass balance. To solve it, usually a computer code is developed, which requires considerable testing and debugging effort. In this work, a simple method to predict equilibrium composition in multicomponent systems is presented, which makes use of an electronic spreadsheet. The ability to carry out these calculations within a spreadsheet environment has several advantages. First, spreadsheets are available 'universally' on nearly all personal computers. Second, the input and output capabilities of spreadsheets can be effectively used to monitor calculated results. Third, no additional systems or programs need to be learned. In this way, spreadsheets are as suitable for computing equilibrium concentrations as they are for use as teaching and learning aids. This work describes, therefore, the use of the Solver tool, contained in the Microsoft Excel spreadsheet package, for computing equilibrium concentrations in a multicomponent system by the method of direct Gibbs energy minimization. The four-phase Fe-Cr-O-C-Ni system is used as an example to illustrate the proposed method. The pure stoichiometric phases considered in the equilibrium calculations are Cr2O3(s) and FeO·Cr2O3(s). The atmosphere consists of the O2, CO and CO2 constituents. The liquid iron...
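
    A sketch of the same idea using a numerical optimizer in place of Excel's Solver: minimize the (dimensionless) Gibbs energy of an ideal-gas mixture subject to elemental mass balance. The species set and the µ°/RT values below are illustrative placeholders, not the paper's Fe-Cr-O-C-Ni system.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Species: CO, CO2, O2 -- illustrative dimensionless standard chemical
    # potentials mu0/RT at the temperature of interest.
    g0 = np.array([-48.0, -95.0, -27.0])
    # Element-composition matrix (rows: C, O).
    A = np.array([[1, 1, 0],        # carbon atoms per molecule
                  [1, 2, 2]])       # oxygen atoms per molecule
    b = A @ np.array([1.0, 0.0, 0.5])   # elements in the feed: 1 CO + 0.5 O2

    def gibbs(n):
        n = np.maximum(n, 1e-12)    # keep the logarithms defined
        return np.sum(n * (g0 + np.log(n / n.sum())))

    res = minimize(gibbs, x0=np.array([0.4, 0.4, 0.2]),
                   constraints={"type": "eq", "fun": lambda n: A @ n - b},
                   bounds=[(1e-10, None)] * 3, method="SLSQP")
    print("equilibrium moles (CO, CO2, O2):", np.round(res.x, 4))
    ```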

  1. AXAOTHER XL -- A spreadsheet for determining doses for incidents caused by tornadoes or high-velocity straight winds

    International Nuclear Information System (INIS)

    Simpkins, A.A.

    1996-09-01

    AXAOTHER XL is an Excel spreadsheet used to determine the dose to the maximally exposed offsite individual during high-velocity straight winds or tornado conditions. Both individual and population doses may be considered. Potential exposure pathways are inhalation and plume shine. For high-velocity straight winds the spreadsheet has the capability to determine the downwind relative air concentration; for tornado conditions, the user must enter the relative air concentration. Theoretical models are discussed and hand calculations are performed to ensure proper application of methodologies. A section containing user instructions for the spreadsheet has also been included.

  2. Discovery Mondays

    CERN Multimedia

    2003-01-01

    Many people don't realise quite how much is going on at CERN. Would you like to gain first-hand knowledge of CERN's scientific and technological activities and their many applications? Try out some experiments for yourself, or pick the brains of the people in charge? If so, then the «Lundis Découverte» or Discovery Mondays, will be right up your street. Starting on May 5th, on every first Monday of the month you will be introduced to a different facet of the Laboratory. CERN staff, non-scientists, and members of the general public, everyone is welcome. So tell your friends and neighbours and make sure you don't miss this opportunity to satisfy your curiosity and enjoy yourself at the same time. You won't have to listen to a lecture, as the idea is to have open exchange with the expert in question and for each subject to be illustrated with experiments and demonstrations. There's no need to book, as Microcosm, CERN's interactive museum, will be open non-stop from 7.30 p.m. to 9 p.m. On the first Discovery M...

  3. CROSS-CORRELATION MODELLING OF SURFACE WATER – GROUNDWATER INTERACTION USING THE EXCEL SPREADSHEET APPLICATION

    Directory of Open Access Journals (Sweden)

    Kristijan Posavec

    2017-01-01

    Modelling the response of groundwater levels in aquifer systems to changes in boundary conditions, such as river or stream stages, is commonly studied using statistical methods, namely correlation, cross-correlation and regression. Although correlation and regression analysis tools are readily available in Microsoft Excel, a widely applied spreadsheet industry standard, a cross-correlation analysis tool is missing. As part of research on groundwater pressure propagation into the alluvial aquifer systems of the Sava and Drava/Danube River catchments following rises in river stage, focused on estimating groundwater pressure travel times in aquifers, an Excel spreadsheet data analysis application for cross-correlation modelling has been designed and used in modelling surface water – groundwater interaction. Examples of field data from the Zagreb aquifer system and the Kopački rit Nature Park aquifer system are used to illustrate the usefulness of the cross-correlation application.
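
    The missing piece such an application supplies, correlation as a function of lag, is straightforward to express; a sketch on synthetic river-stage and well data, with a hypothetical ~7-step travel time planted and recovered:

    ```python
    import numpy as np

    def cross_correlation_lags(x, y, max_lag):
        """Pearson correlation between x(t) and y(t + lag) for each lag;
        the lag with the highest r estimates the pressure travel time."""
        r = []
        for lag in range(max_lag + 1):
            a = x[: len(x) - lag]
            b = y[lag:]
            r.append(np.corrcoef(a, b)[0, 1])
        return np.array(r)

    rng = np.random.default_rng(0)
    river = np.cumsum(rng.standard_normal(500))               # river stage
    well = np.roll(river, 7) + 0.3 * rng.standard_normal(500)  # ~7-step delay
    r = cross_correlation_lags(river, well, max_lag=20)
    print("best lag:", int(np.argmax(r)), "r =", round(float(r.max()), 3))
    ```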

  4. Integrating numerical computation into the undergraduate education physics curriculum using spreadsheet excel

    Science.gov (United States)

    Fauzi, Ahmad

    2017-11-01

    Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps learning through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges, chief among them a dense curriculum that makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 fourth-semester students of the physics education department. We concluded that numerical computation can be integrated into the curriculum using Excel spreadsheets combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.

  5. Computerized cost estimation spreadsheet and cost data base for fusion devices

    International Nuclear Information System (INIS)

    Hamilton, W.R.; Rothe, K.E.

    1985-01-01

    Component design parameters (weight, surface area, etc.) and cost factors are input, and direct and indirect costs are calculated. The cost data base file, derived from actual cost experience within the fusion community and refined to be compatible with the spreadsheet costing approach, is a catalog of cost coefficients, algorithms, and component costs arranged into data modules corresponding to specific components and/or subsystems. Each data module contains engineering, equipment, and installation labor cost data for different configurations and types of the specific component or subsystem. This paper describes the assumptions, definitions, methodology, and architecture incorporated in the development of the cost estimation spreadsheet and cost data base, along with the type of input required and the output format.

  6. Simple Functions Spreadsheet tool presentation; for determination of solubility limits of some radionuclides

    Energy Technology Data Exchange (ETDEWEB)

    Grive, Mireia; Domenech, Cristina; Montoya, Vanessa; Garcia, David; Duro, Lara (Amphos 21, Barcelona (Spain))

    2010-09-15

    This document is a guide for users of the Simple Functions Spreadsheet tool. The Simple Functions Spreadsheet tool has been developed by Amphos 21 to determine the solubility limits of some radionuclides, and it has been especially designed for Performance Assessment exercises. The development of this tool was prompted by the need expressed by SKB for a reliable and easy-to-handle tool to calculate solubility limits in an agile and relatively fast manner. Its development started in 2005, and it has since been improved up to the current version. This document describes the careful preliminary study, following expert criteria, that was used to select the simplified aqueous speciation and solid phase system included in the tool. This report also gives basic instructions for using the tool and interpreting its results. Finally, this document reports the different validation tests and sensitivity analyses performed during the verification process.

  7. Teaching graphical simulations of Fourier series expansion of some periodic waves using spreadsheets

    Science.gov (United States)

    Singh, Iqbal; Kaur, Bikramjeet

    2018-05-01

    The present article demonstrates a way of programming using an Excel spreadsheet to teach Fourier series expansion in schools/colleges without requiring knowledge of any typical programming language. By using it, a student learns to approximate the partial sum of the first n terms of the Fourier series for periodic signals such as square wave, sawtooth wave, half-wave rectified and full-wave rectified signals.
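
    The spreadsheet computation described in the record is easy to mirror outside Excel. A minimal Python sketch of the n-term partial sum for a unit-amplitude square wave of period 2*pi (an assumption; the other waveforms follow the same pattern):

      import numpy as np

      def square_wave_partial_sum(t, n_terms):
          """Partial sum of the Fourier series of a unit square wave,
          which contains odd harmonics only:
          f(t) ~ (4/pi) * sum_k sin((2k-1)t) / (2k-1)."""
          s = np.zeros_like(t)
          for k in range(1, n_terms + 1):
              m = 2 * k - 1                  # odd harmonic index
              s += np.sin(m * t) / m
          return 4.0 / np.pi * s

      t = np.linspace(0, 4 * np.pi, 1000)
      approx = square_wave_partial_sum(t, 10)  # 10-term approximation

    Increasing n_terms makes the convergence, and the Gibbs overshoot near the discontinuities, visible in exactly the way the spreadsheet plots do.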

  8. Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, Tim [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Bell, Evaleigh [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Dixon, Kenneth [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-07-24

    MAXINE is an EXCEL© spreadsheet used to estimate dose to individuals from routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model; rather, doses are estimated using air and ground concentrations as input. Minimal input is required to run the program, and site-specific parameters are used when possible. A complete code description, verification of the models, and a user’s manual have been included.
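
    The record gives no equations, but screening-level dose codes of this kind generally form dose as concentration times exposure rate times a dose coefficient. A hedged Python sketch of one inhalation pathway, with purely illustrative parameter values that are not MAXINE's:

      def inhalation_dose(air_conc_bq_m3, breathing_rate_m3_y, dcf_sv_per_bq):
          """Annual inhalation dose (Sv) from an air concentration input;
          the pattern is generic, the numbers below are illustrative."""
          return air_conc_bq_m3 * breathing_rate_m3_y * dcf_sv_per_bq

      print(inhalation_dose(1.0e-3, 8000.0, 2.5e-9))  # about 2e-8 Sv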

  9. Number Theory, Dialogue, and the Use of Spreadsheets in Teacher Education

    Directory of Open Access Journals (Sweden)

    Sergei Abramovich

    2011-04-01

    Full Text Available This paper demonstrates the use of a spreadsheet in teaching topics in elementary number theory. It emphasizes both the power and deficiency of inductive reasoning using a number of historically significant examples. The notion of computational experiment as a modern approach to the teaching of mathematics is discussed. The paper, grounded in a teacher-student dialogue as an instructional method, is a reflection on the author’s work over the years with prospective teachers of secondary mathematics.
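
    The record does not list the historically significant examples used, but a classic one that suits this kind of computational experiment is Fermat's conjecture that every F_n = 2**(2**n) + 1 is prime, which induction from n = 0..4 supports and a direct check refutes at n = 5. A short Python sketch of that check (using sympy):

      from sympy import isprime

      # Fermat numbers: prime for n = 0..4, composite from n = 5 onward
      # (641 divides F_5), illustrating the deficiency of induction.
      for n in range(7):
          F = 2 ** (2 ** n) + 1
          print(n, F, isprime(F))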

  10. Discovery of new natural products by application of X-hitting, a novel algorithm for automated comparison of full UV-spectra, combined with structural determination by NMR spectroscopy

    DEFF Research Database (Denmark)

    Larsen, Thomas Ostenfeld; Petersen, Bent O.; Duus, Jens Øllgaard

    2005-01-01

    X-hitting, a newly developed algorithm for automated comparison of UV data, has been used for the tracking of two novel spiro-quinazoline metabolites, lapatins A (1) and B (2), in a screening study targeting quinazolines. The structures of 1 and 2 were elucidated by analysis of spectroscopic data...

  11. A spreadsheet to determine the volume ratio for target and breast in partial breast irradiation

    International Nuclear Information System (INIS)

    Kron, T.; Willis, D.; Miller, J.; Hubbard, P.; Oliver, M.; Chua, B.

    2009-01-01

    Full text: The technical feasibility of Partial Breast Irradiation (PBI) using external beam radiotherapy depends on the ratio between the evaluation planning target volume (PTVeval) and the whole breast volume (PBI volume ratio = PVR). We aimed to develop a simple method to determine the PVR using measurements performed at the time of the planning CT scan. A PVR calculation tool was developed using a Microsoft Excel spreadsheet to determine the PTV from three orthogonal dimensions of the seroma cavity and a given margin on the CT scans. The breast volume is estimated from the separation and breast height in five equally spaced CT slices. The PTVeval and whole breast volume were determined for 29 patients from two centres using the spreadsheet calculation tool and compared to volumes delineated on computerised treatment planning systems. Both the PTVeval and whole breast volumes were underestimated by approximately 25% using the spreadsheet. The resulting PVRs were 1.05 +/- 0.35 (mean +/- 1 SD) times larger than those determined from planning. Estimation of the PVR using the calculation tool was achievable in around 5 minutes at the time of CT scanning and allows a prompt decision on the suitability of patients for PBI.
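
    The abstract does not state the geometric formulas used, so the sketch below assumes an ellipsoidal PTV built from the three orthogonal seroma dimensions expanded by the margin on each side; all numbers are illustrative.

      import math

      def ptv_volume(d1, d2, d3, margin):
          """Estimate PTV volume (cm^3) from three orthogonal seroma
          dimensions (cm) plus a uniform margin (cm), assuming an
          ellipsoid: V = (pi/6) * a * b * c."""
          a, b, c = (d + 2 * margin for d in (d1, d2, d3))
          return math.pi / 6 * a * b * c

      def pvr(ptv, breast_volume):
          """PBI volume ratio: target volume over whole breast volume."""
          return ptv / breast_volume

      print(pvr(ptv_volume(3.0, 2.5, 2.0, 1.0), 900.0))  # illustrative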

  12. Using spreadsheets to develop applied skills in a business math course: Student feedback and perceived learning

    Directory of Open Access Journals (Sweden)

    Thomas Mays

    2015-10-01

    Full Text Available This paper describes the redesign of a business math course and its delivery in both face-to-face and online formats. Central to the redesigned course was the addition of applied spreadsheet exercises that served as both learning and summative assessment tools. Several other learning activities and assignments were integrated in the course to address diverse student learning styles and levels of math anxiety. Students were invited to complete a survey that asked them to rank course activities and assignments based on how well they helped the student learn course material. Open-ended items were also included in the survey. In the online course sections, students reported higher perceived learning from the use of the spreadsheet-based application assignments, while face-to-face students preferred demonstrations. Qualitative remarks from the online students included numerous comments about the positive learning impact of the business application spreadsheet-based assignments, as well as the link between these assignments and what students considered the “real world.”

  13. INVARIANT PRACTICAL TASKS FOR WORK WITH ELECTRONIC SPREADSHEETS AT THE SECONDARY SCHOOL

    Directory of Open Access Journals (Sweden)

    Л И Карташова

    2016-12-01

    Full Text Available The article gives examples of practical tasks on creating and editing spreadsheets for pupils of secondary school. To consolidate their knowledge and skills in formatting cells, pupils are asked, for example, to create a table in the spreadsheet processor and format it according to a sample shown on the computer monitor, printed on a colour printer, or shared over the local area network as an image. While learning data types, pupils are given tasks to determine and explain the data types of different entries. To master the rules for writing formulas, pupils are asked to express various mathematical expressions in a form suitable for use in spreadsheets. The tasks reflect a fundamental, invariant approach to working with spreadsheets, as they do not depend on specific versions of computer programs; the tasks provided can be used when studying any spreadsheet processor. Training based on such invariant tasks leads to mastery of generalized methods of working with numerical information, which makes it possible to form a systemic view of information technologies and to apply them consciously to problem solving.

  14. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  15. Station Program Note Pull Automation

    Science.gov (United States)

    Delgado, Ivan

    2016-01-01

    Upon commencement of my internship, I was in charge of maintaining the CoFR (Certificate of Flight Readiness) Tool. The tool acquires data from existing Excel workbooks on NASA's and Boeing's databases to create a new spreadsheet listing all the potential safety concerns for upcoming flights and software transitions. Since the application was written in Visual Basic, I had to learn a new programming language and prepare to handle any malfunctions within the program. Shortly afterwards, I was given the assignment to automate the Station Program Note (SPN) Pull process. I developed an application, in Python, that generated a GUI (Graphical User Interface) to be used by the International Space Station Safety & Mission Assurance team here at Johnson Space Center. The application allows its users to download online files with the click of a button, import SPNs based on three different pulls, instantly manipulate and filter spreadsheets, and compare the three sources to determine which active SPNs (Station Program Notes) must be reviewed for any upcoming flights, missions, and/or software transitions. Initially, to perform the NASA SPN pull (one of three), I had created the program to allow the user to log in to a secure webpage that stores data, input specific parameters, and retrieve the desired SPNs based on their inputs. However, to avoid any conflicts with sustainment, I altered it so that the user may log in and download the NASA file independently. After the user has downloaded the file with the click of a button, the program checks for any outdated or pre-existing files and for successful downloads, acquires the spreadsheet, converts it from a text file to a comma-separated file and finally into an Excel spreadsheet to be filtered and later scrutinized for specific SPN numbers. Once this file has been automatically manipulated to provide only the SPN numbers that are desired, they are stored in a global variable, shown on the GUI, and
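
    As a hedged sketch of the text-to-spreadsheet conversion and three-way comparison described above (the column name "SPN" and the tab-delimited layout are assumptions, not details from the record):

      import pandas as pd

      def load_spn_table(text_path):
          """Convert a downloaded delimited text export into a DataFrame."""
          return pd.read_csv(text_path, sep="\t", dtype=str)

      def active_spns(nasa, boeing, safety):
          """Intersect three pulls to find SPN numbers needing review."""
          common = set(nasa["SPN"]) & set(boeing["SPN"]) & set(safety["SPN"])
          return sorted(common)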

  16. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements

  17. Validation and configuration management plan for the KE basins KE-PU spreadsheet code

    International Nuclear Information System (INIS)

    Harris, R.A.

    1996-01-01

    This report provides documentation of the spreadsheet KE-PU software that is used to verify compliance with the Operational Safety Requirement and Process Standard limit on the amount of plutonium in the KE-Basin sandfilter backwash pit. Included are: a summary of the verification of the method and technique used in KE-PU, which were documented elsewhere; the requirements, plans, and results of validation tests that confirm the proper functioning of the software; the procedures and approvals required to make changes to the software; and the method used to maintain configuration control over the software

  18. Potential errors when fitting experience curves by means of spreadsheet software

    International Nuclear Information System (INIS)

    Sark, W.G.J.H.M. van; Alsema, E.A.

    2010-01-01

    Progress ratios (PRs) are widely used in forecasting development of many technologies; they are derived from historical data represented in experience curves. Fitting the double logarithmic graphs is easily done with spreadsheet software like Microsoft Excel, by adding a trend line to the graph. However, it is unknown to many that these data are transformed to linear data before a fit is performed. This leads to erroneous results or a transformation bias in the PR, as we demonstrate using the experience curve for photovoltaic technology: logarithmic transformation leads to overestimates of progress ratios and underestimates of goodness of fit. Therefore, other graphing and analysis software is recommended.
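
    The transformation bias is easy to reproduce: fitting a trend line to log-transformed data (the spreadsheet approach) and fitting the power law directly to the untransformed data generally yield different progress ratios. A short Python sketch with synthetic data (the true exponent -0.3 and the noise level are arbitrary choices):

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(0)
      x = np.logspace(0, 3, 40)                      # cumulative production
      y = 100 * x ** -0.3 * (1 + 0.1 * rng.standard_normal(x.size))

      # Trend-line approach: linear fit on log-transformed data
      b_log, _ = np.polyfit(np.log(x), np.log(y), 1)

      # Direct nonlinear fit on the untransformed data
      (_, b), _ = curve_fit(lambda x, a, b: a * x ** b, x, y, p0=(100, -0.3))

      print(2 ** b_log, 2 ** b)   # the two progress ratios differ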

  19. Automated extraction of radiation dose information from CT dose report images.

    Science.gov (United States)

    Li, Xinhua; Zhang, Da; Liu, Bob

    2011-06-01

    The purpose of this article is to describe the development of an automated tool for retrieving texts from CT dose report images. Optical character recognition was adopted to perform text recognitions of CT dose report images. The developed tool is able to automate the process of analyzing multiple CT examinations, including text recognition, parsing, error correction, and exporting data to spreadsheets. The results were precise for total dose-length product (DLP) and were about 95% accurate for CT dose index and DLP of scanned series.
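
    A hedged sketch of the OCR-and-parse step using the commonly available pytesseract library; the regular expression reflects an assumed report layout, not the authors' actual parser:

      import re
      import pytesseract
      from PIL import Image

      def extract_total_dlp(image_path):
          """OCR a CT dose report image and parse total DLP values
          (mGy-cm) from lines assumed to read 'Total DLP ...'."""
          text = pytesseract.image_to_string(Image.open(image_path))
          return [float(m) for m in re.findall(r"Total\s+DLP\D*([\d.]+)", text)]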

  20. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by a computer sample workflow control (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain), to this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples working as an intermediate station. The only difference with TLA is the replacement of transport belts by personnel of the laboratory. The implementation of this virtual automation system has allowed us the achievement of the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  1. USE OF ELECTRONIC EDUCATIONAL RESOURCES WHEN TRAINING IN WORK WITH SPREADSHEETS

    Directory of Open Access Journals (Sweden)

    Х А Гербеков

    2017-12-01

    Full Text Available Today, tools for supporting training courses based on information and communication technologies are well developed; electronic textbooks and self-study manuals have been created for practically all areas of training and all subjects. Nevertheless, the industry of computer-based educational and methodical materials continues to develop actively and to expand into new areas. This makes the development of electronic educational resources that meet modern educational requirements increasingly urgent, and the creation and organization of training courses using such resources, particularly those based on Internet technologies, remains a difficult methodical task. The article considers questions connected with the development of electronic educational resources for studying the content line “Information technologies” of the school informatics course, in particular for studying spreadsheets. It also analyses the content of the school course and of the unified state examination from the point of view of how tasks corresponding to this content line, on mastering information processing in spreadsheets and data visualization by means of charts and graphs, are represented in them.

  2. Improving Learning Outcomes in Operating Spreadsheet Types and Functions with Accounting Statistics Formulas through Demonstration and Presentation

    Directory of Open Access Journals (Sweden)

    NURBAITI SALPIDA GINAYANTI

    2016-08-01

    Full Text Available The research aimed to find out whether demonstration and presentation models can improve students' learning outcomes in operating spreadsheet types and functions with statistical formulas in class X Accounting 1 at SMKN 48 in the 2014/2015 academic year. The research was conducted from August to November 2014. The method was classroom action research (PTK), conducted in two cycles, with demonstration and presentation used as the learning model in each cycle. A cycle consisted of three meetings, with a post-test given in the third meeting. The research indicators were achieved: in the second cycle, 97.45% of students reached the target score and the average score was 80.79. In conclusion, demonstration and presentation can improve students' learning outcomes in operating spreadsheet types and functions with statistics when implemented appropriately in class X Accounting 1 at SMKN 48 Jakarta.

  3. Ladtap XL Version 2017: A Spreadsheet For Estimating Dose Resulting From Aqueous Releases

    Energy Technology Data Exchange (ETDEWEB)

    Minter, K. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jannik, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-15

    LADTAP XL© is an EXCEL© spreadsheet used to estimate dose to offsite individuals and populations resulting from routine and accidental releases of radioactive materials to the Savannah River. LADTAP XL© contains two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including external exposure resulting from recreational activities on the Savannah River and internal exposure resulting from ingestion of water, fish, and invertebrates originating from the Savannah River. IRRIDOSE estimates offsite dose to individuals and populations from irrigation of foodstuffs with contaminated water from the Savannah River. In 2004, a complete description of the LADTAP XL© code and an associated user’s manual was documented in LADTAP XL©: A Spreadsheet for Estimating Dose Resulting from Aqueous Release (WSRC-TR-2004-00059), and revised input parameters, dose coefficients, and radionuclide decay constants were incorporated into LADTAP XL© Version 2013 (SRNL-STI-2011-00238). LADTAP XL© Version 2017 is a slight modification of Version 2013, with minor changes for more user-friendly parameter inputs and organization, updates to the time conversion factors used within the dose calculations, and a fix for an issue with the expected time build-up parameter referenced within the population shoreline dose calculations. This manual has been produced to update the code description, verify the models, and provide an updated user’s manual. LADTAP XL© Version 2017 has been verified by Minter (2017) and is ready for use at the Savannah River Site (SRS).

  4. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    The aim was to reduce animal usage in discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period, with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated data comparable to a previous study that used liquid samples from single mice per time point, while reducing animal and compound requirements by 14-fold. This reduction in animals and drug material is enabled by the use of automated serial DBS microsampling in discovery-stage mouse studies of protein therapeutics.

  5. A novel approach to formulae production and overconfidence measurement to reduce risk in spreadsheet modelling

    OpenAIRE

    Thorne, Simon; Ball, David; Lawson, Zoe Frances

    2004-01-01

    Research on formulae production in spreadsheets has established the practice as high risk yet unrecognised as such by industry. There are numerous software applications that are designed to audit formulae and find errors. However, these are all post-creation, designed to catch errors before the spreadsheet is deployed. As a general conclusion from the EuSpRIG 2003 conference, it was decided that the time has come to attempt novel solutions based on an understanding of human factors. ...

  6. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  7. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  8. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions… for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions… of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of “surface-level” monodic voices, longer…

  9. On the use of a standard spreadsheet to model physical systems in school teaching

    Science.gov (United States)

    Quale, Andreas

    2012-05-01

    In the teaching of physics at upper secondary school level (K10-K12), the students are generally taught to solve problems analytically, i.e. using the dynamics describing a system (typically in the form of differential equations) to compute its evolution in time, e.g. the motion of a body along a straight line or in a plane. This reduces the scope of problems, i.e. the kind of problems that are within students' capabilities. To make the tasks mathematically solvable, one is restricted to very idealized situations; more realistic problems are too difficult (or even impossible) to handle analytically with the mathematical abilities that may be expected from students at this level. For instance, ordinary ballistic trajectories under the action of gravity, when air resistance is included, have been 'out of reach'; in school textbooks such trajectories are generally assumed to take place in a vacuum. Another example is that according to Newton's law of universal gravitation satellites will in general move around a large central body in elliptical orbits, but the students can only deal with the special case where the orbit is circular, thus precluding (for example) a verification and discussion of Kepler's laws. It is shown that standard spreadsheet software offers a tool that can handle many such realistic situations in a uniform way, and display the results both numerically and graphically on a computer screen, quite independently of whether the formal description of the physical system itself is 'mathematically tractable'. The method employed, which is readily accessible to high school students, is to perform a numerical integration of the equations of motion, exploiting the spreadsheet's capability of successive iterations. The software is used to model and study motion of bodies in external force fields; specifically, ballistic trajectories in a homogeneous gravity field with air resistance and satellite motion in a centrally symmetric gravitational field.
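
    The row-by-row spreadsheet iteration the author describes is equivalent to forward-Euler integration of the equations of motion. A minimal Python sketch for the ballistic trajectory with quadratic air drag (the drag constant k per unit mass is an assumed illustrative value):

      import numpy as np

      def trajectory(v0, angle_deg, k=0.02, g=9.81, dt=0.01):
          """Euler integration of a projectile with quadratic air drag,
          mirroring a spreadsheet's successive-row iteration."""
          vx = v0 * np.cos(np.radians(angle_deg))
          vy = v0 * np.sin(np.radians(angle_deg))
          x = y = 0.0
          xs, ys = [x], [y]
          while y >= 0.0:
              v = np.hypot(vx, vy)
              vx -= k * v * vx * dt              # drag opposes velocity
              vy -= (g + k * v * vy) * dt        # gravity plus drag
              x += vx * dt
              y += vy * dt
              xs.append(x)
              ys.append(y)
          return xs, ys

      xs, ys = trajectory(50.0, 45.0)  # range falls well short of vacuum case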

  10. Usability of Discovery Portals

    OpenAIRE

    Bulens, J.D.; Vullings, L.A.E.; Houtkamp, J.M.; Vanmeulebrouk, B.

    2013-01-01

    As INSPIRE progresses to be implemented in the EU, many new discovery portals are built to facilitate finding spatial data. Currently the structure of the discovery portals is determined by the way spatial data experts like to work. However, we argue that the main target group for discovery portals are not spatial data experts but professionals with limited spatial knowledge, and a focus outside the spatial domain. An exploratory usability experiment was carried out in which three discovery p...

  11. Teaching Students to Model Neural Circuits and Neural Networks Using an Electronic Spreadsheet Simulator. Microcomputing Working Paper Series.

    Science.gov (United States)

    Hewett, Thomas T.

    There are a number of areas in psychology where an electronic spreadsheet simulator can be used to study and explore functional relationships among a number of parameters. For example, when dealing with sensation, perception, and pattern recognition, it is sometimes desirable for students to understand both the basic neurophysiology and the…

  12. Spreadsheets for business process management : Using process mining to deal with “events” rather than “numbers”?

    NARCIS (Netherlands)

    van der Aalst, Wil

    2018-01-01

    Purpose: Process mining provides a generic collection of techniques to turn event data into valuable insights, improvement ideas, predictions, and recommendations. This paper uses spreadsheets as a metaphor to introduce process mining as an essential tool for data scientists and business analysts.

  13. A Novel Real-Time Data Acquisition Using an Excel Spreadsheet in Pendulum Experiment Tool with Light-Based Timer

    Science.gov (United States)

    Adhitama, Egy; Fauzi, Ahmad

    2018-01-01

    In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data was automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, dynamically changes as the pendulum swings. The changed intensity varies…

  14. Pre-Service Teachers' TPACK Competencies for Spreadsheet Integration: Insights from a Mathematics-Specific Instructional Technology Course

    Science.gov (United States)

    Agyei, Douglas D.; Voogt, Joke M.

    2015-01-01

    This article explored the impact of strategies applied in a mathematics instructional technology course for developing technology integration competencies, in particular in the use of spreadsheets, in pre-service teachers. In this respect, 104 pre-service mathematics teachers from a teacher training programme in Ghana enrolled in the mathematics…

  15. A Spreadsheet-Based Visualized Mindtool for Improving Students' Learning Performance in Identifying Relationships between Numerical Variables

    Science.gov (United States)

    Lai, Chiu-Lin; Hwang, Gwo-Jen

    2015-01-01

    In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…

  16. Ideas Tried, Lessons Learned, and Improvements to Make: A Journey in Moving a Spreadsheet-Intensive Course Online

    Science.gov (United States)

    Berardi, Victor L.

    2012-01-01

    Using information systems to solve business problems is increasingly required of everyone in an organization, not just technical specialists. In the operations management class, spreadsheet usage has intensified with the focus on building decision models to solve operations management concerns such as forecasting, process capability, and inventory…

  17. The Euler’s Graphical User Interface Spreadsheet Calculator for Solving Ordinary Differential Equations by Visual Basic for Application Programming

    Science.gov (United States)

    Gaik Tay, Kim; Cheong, Tau Han; Foong Lee, Ming; Kek, Sie Long; Abdul-Kahar, Rosmila

    2017-08-01

    In previous work on an Euler’s spreadsheet calculator for solving an ordinary differential equation, Visual Basic for Applications (VBA) programming was used; however, no graphical user interface was developed to capture user input. This weakness may confuse users about the input and output, since both are displayed in the same worksheet. Besides, the existing Euler’s spreadsheet calculator is not interactive, as there is no prompt message if there is a mistake in inputting the parameters. On top of that, there are no user instructions to guide users in inputting the derivative function. Hence, in this paper, we address these limitations by developing a user-friendly and interactive graphical user interface. This improvement aims to capture users’ input with instructions and interactive error-prompt messages, implemented using VBA programming. This Euler’s graphical user interface spreadsheet calculator does not act as a black box, as users can click on any cell in the worksheet to see the formula used to implement the numerical scheme. In this way, it can enhance self-directed and lifelong learning in implementing the numerical scheme in a spreadsheet and later in any programming language.
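
    Independently of the VBA implementation, the numerical scheme the calculator exposes cell by cell is plain Euler stepping; a minimal Python equivalent for y' = f(t, y):

      def euler(f, t0, y0, h, n):
          """Euler's method: each returned (t, y) pair corresponds to one
          spreadsheet row of the calculator."""
          t, y = t0, y0
          rows = [(t, y)]
          for _ in range(n):
              y += h * f(t, y)
              t += h
              rows.append((t, y))
          return rows

      # Example: y' = -2y, y(0) = 1, step 0.1 on [0, 1]
      table = euler(lambda t, y: -2 * y, 0.0, 1.0, 0.1, 10)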

  18. A Simple Spreadsheet Program to Simulate and Analyze the Far-UV Circular Dichroism Spectra of Proteins

    Science.gov (United States)

    Abriata, Luciano A.

    2011-01-01

    A simple algorithm was implemented in a spreadsheet program to simulate the circular dichroism spectra of proteins from their secondary structure content and to fit [alpha]-helix, [beta]-sheet, and random coil contents from experimental far-UV circular dichroism spectra. The physical basis of the method is briefly reviewed within the context of…
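
    The record does not spell out the fitting procedure; a common approach, sketched here in Python, is a least-squares fit of the measured spectrum against reference basis spectra for the pure secondary-structure types, followed by normalisation of the fractions (the basis matrix is a placeholder the user must supply):

      import numpy as np

      def fit_secondary_structure(spectrum, basis):
          """Fit fractional helix/sheet/coil content. `basis` is an
          (n_wavelengths x 3) matrix of reference spectra for pure
          alpha-helix, beta-sheet and random coil."""
          f, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
          f = np.clip(f, 0, None)       # forbid negative contents
          return f / f.sum()            # normalise to fractions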

  19. Teaching Simulation and Computer-Aided Separation Optimization in Liquid Chromatography by Means of Illustrative Microsoft Excel Spreadsheets

    Science.gov (United States)

    Fasoula, S.; Nikitas, P.; Pappa-Louisi, A.

    2017-01-01

    A series of Microsoft Excel spreadsheets were developed to simulate the process of separation optimization under isocratic and simple gradient conditions. The optimization procedure is performed in a stepwise fashion using simple macros for an automatic application of this approach. The proposed optimization approach involves modeling of the peak…

  20. Pre-service teachers’ TPACK competencies for spreadsheet integration: insights from a mathematics-specific instructional technology course

    NARCIS (Netherlands)

    Agyei, D.D.; Voogt, J.M.

    2015-01-01

    This article explored the impact of strategies applied in a mathematics instructional technology course for developing technology integration competencies, in particular in the use of spreadsheets, in pre-service teachers. In this respect, 104 pre-service mathematics teachers from a teacher training

  1. Usability of Discovery Portals

    NARCIS (Netherlands)

    Bulens, J.D.; Vullings, L.A.E.; Houtkamp, J.M.; Vanmeulebrouk, B.

    2013-01-01

    As INSPIRE progresses to be implemented in the EU, many new discovery portals are built to facilitate finding spatial data. Currently the structure of the discovery portals is determined by the way spatial data experts like to work. However, we argue that the main target group for discovery portals

  2. Discovery and the atom

    International Nuclear Information System (INIS)

    1989-01-01

    “Discovery and the Atom” tells the story of the founding of nuclear physics. This programme looks at nuclear physics up to the discovery of the neutron in 1932. Animation explains the science of the classic experiments, such as the scattering of alpha particles by Rutherford and the discovery of the nucleus. Archive film shows the people: Lord Rutherford, James Chadwick, Marie Curie. (author)

  3. Analysis of chromium-51 release assay data using personal computer spreadsheet software

    International Nuclear Information System (INIS)

    Lefor, A.T.; Steinberg, S.M.; Wiebke, E.A.

    1988-01-01

    The Chromium-51 release assay is a widely used technique to assess the lysis of labeled target cells in vitro. We have developed a simple technique to analyze data from Chromium-51 release assays using the widely available LOTUS 1-2-3 spreadsheet software. This package calculates percentage specific cytotoxicity and lytic units by linear regression. It uses all data points to compute the linear regression and can determine if there is a statistically significant difference between two lysis curves. The system is simple to use and easily modified, since its implementation requires neither knowledge of computer programming nor custom designed software. This package can help save considerable time when analyzing data from Chromium-51 release assays
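
    The record does not reproduce the formula, but the standard chromium-51 release calculation that such spreadsheets implement is percent specific lysis = 100 * (E - S) / (M - S), with E, S and M the experimental, spontaneous and maximum release counts:

      def percent_specific_lysis(experimental, spontaneous, maximum):
          """Standard Cr-51 release calculation from cpm values."""
          return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

      print(percent_specific_lysis(1500.0, 400.0, 3200.0))  # about 39.3%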

  4. Methodology for the National Water Savings Model and Spreadsheet Tool Commercial/Institutional

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Long, Tim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Alison [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Melody, Moya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-01-01

    Lawrence Berkeley National Laboratory (LBNL) has developed a mathematical model to quantify the water and monetary savings attributable to the United States Environmental Protection Agency’s (EPA’s) WaterSense labeling program for commercial and institutional products. The National Water Savings–Commercial/Institutional (NWS-CI) model is a spreadsheet tool with which the EPA can evaluate the success of its program for encouraging buyers in the commercial and institutional (CI) sectors to purchase more water-efficient products. WaterSense has begun by focusing on three water-using products commonly used in the CI sectors: flushometer valve toilets, urinals, and pre-rinse spray valves. To estimate the savings attributable to WaterSense for each of the three products, LBNL applies an accounting method to national product shipments and lifetimes to estimate the shipments of each product.

  5. Modelling accidental releases of tritium in the environment: application as an excel spreadsheet

    International Nuclear Information System (INIS)

    Le Dizes, S.; Tamponnet, C.

    2004-01-01

    An application as an Excel spreadsheet of the simplified modelling approach to tritium transfer in the environment developed by Tamponnet (2002) is presented. Based on the use of growth models of biological systems (plants, animals, etc.), the two-pool model (organic tritium and tritiated water) estimates the concentration of tritium within the different compartments of the food chain and, ultimately, the dose to man by ingestion in the case of a chronic or accidental release of tritium into a river or the atmosphere. Data and knowledge have been implemented in Excel using the object-oriented programming language Visual Basic (Microsoft Visual Basic 6.0). The structure of the conceptual model and the Excel sheet are first briefly described. A numerical application of the model under a scenario of an accidental release of tritium into the atmosphere is then presented. Simulation results and perspectives are discussed. (author)

  6. Development of a VBA macro-based spreadsheet application for RELAP5 data post-processing

    International Nuclear Information System (INIS)

    Belchior Junior, Antonio; Andrade, Delvonei A.; Sabundjian, Gaiane; Macedo, Luiz A.; Angelo, Gabriel; Torres, Walmir M.; Umbehaun, Pedro E.; Conti, Thadeu N.; Bruel, Renata N.

    2011-01-01

    When using thermal-hydraulic codes such as RELAP5, large amounts of data have to be managed in order to prepare the input data and analyze the produced results. This work presents a helpful tool developed to make it easier to handle the RELAP5 output data file. The XTRIP application is an electronic spreadsheet containing programmed macros for post-processing the RELAP5 output file. It can directly read the RELAP5 restart-plot binary output file and, through a user-friendly interface, transient results can be chosen and exported directly into an electronic worksheet. The XTRIP program can also perform data unit conversions as well as export these data to other programs such as Wingraf, Grapher and COBRA. The main features of the developed Excel Visual Basic for Applications macro, as well as an example of use, are presented and discussed. (author)

  7. Modelling accidental releases of carbon 14 in the environment: application as an excel spreadsheet

    International Nuclear Information System (INIS)

    Le Dizes, S.; Tamponnet, C.

    2004-01-01

    An application as an Excel spreadsheet of the simplified modelling approach to carbon-14 transfer in the environment developed by Tamponnet (2002) is presented. Based on the use of growth models of biological systems (plants, animals, etc.), the one-pool model (organic carbon) estimates the concentration of carbon-14 within the different compartments of the food chain and, ultimately, the dose to man by ingestion in the case of a chronic or accidental release of carbon-14 into a river or the atmosphere. Data and knowledge have been implemented in Excel using the object-oriented programming language Visual Basic (Microsoft Visual Basic 6.0). The structure of the conceptual model and the Excel sheet are first briefly described. A numerical application of the model under a scenario of an accidental release of carbon-14 into the atmosphere is then presented. Simulation results and perspectives are discussed. (author)

  8. ISA-TAB-Nano: A Specification for Sharing Nanomaterial Research Data in Spreadsheet-based Format

    Science.gov (United States)

    2013-01-01

    Background and motivation The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. Results We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. Conclusion The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption.

  9. Simulation of axonal excitability using a Spreadsheet template created in Microsoft Excel.

    Science.gov (United States)

    Brown, A M

    2000-08-01

    The objective of this present study was to implement an established simulation protocol (A.M. Brown, A methodology for simulating biological systems using Microsoft Excel, Comp. Methods Prog. Biomed. 58 (1999) 181-90) to model axonal excitability. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet and does not require any programming skills or use of the macro language. Once the initial spreadsheet template has been set up the simulations described in this paper can be executed with a few simple keystrokes. The model axon contained voltage-gated ion channels that were modeled using Hodgkin Huxley style kinetics. The basic properties of axonal excitability modeled were: (1) threshold of action potential firing, demonstrating that not only are the stimulus amplitude and duration critical in the generation of an action potential, but also the resting membrane potential; (2) refractoriness, the phenomenon of reduced excitability immediately following an action potential. The difference between the absolute refractory period, when no amount of stimulus will elicit an action potential, and relative refractory period, when an action potential may be generated by applying increased stimulus, was demonstrated with regard to the underlying state of the Na(+) and K(+) channels; (3) temporal summation, a process by which two sub-threshold stimuli can unite to elicit an action potential was shown to be due to conductance changes outlasting the first stimulus and summing with the second stimulus-induced conductance changes to drive the membrane potential past threshold; (4) anode break excitation, where membrane hyperpolarization was shown to produce an action potential by removing Na(+) channel inactivation that is present at resting membrane potential. The simulations described in this paper provide insights into mechanisms of axonal excitation that can be carried out by following an easily understood protocol.
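
    For comparison with the in-cell formulas, here is a compact Python version of the same forward-Euler integration of Hodgkin-Huxley-style kinetics (classic squid-axon parameters, membrane potential in mV relative to rest); the parameter values are the textbook ones and are assumed, not taken from the spreadsheet template itself:

      import numpy as np

      C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3     # uF/cm^2, mS/cm^2
      ENa, EK, EL = 115.0, -12.0, 10.6           # mV relative to rest

      def a_n(V): return 0.01 * (10 - V) / (np.exp((10 - V) / 10) - 1)
      def b_n(V): return 0.125 * np.exp(-V / 80)
      def a_m(V): return 0.1 * (25 - V) / (np.exp((25 - V) / 10) - 1)
      def b_m(V): return 4.0 * np.exp(-V / 18)
      def a_h(V): return 0.07 * np.exp(-V / 20)
      def b_h(V): return 1.0 / (np.exp((30 - V) / 10) + 1)

      dt, T = 0.01, 50.0                         # ms
      V, n, m, h = 0.0, 0.3177, 0.0529, 0.5961   # resting steady state
      for step in range(int(T / dt)):
          t = step * dt
          I = 10.0 if 5.0 <= t < 6.0 else 0.0    # 1 ms stimulus (uA/cm^2)
          INa = gNa * m ** 3 * h * (V - ENa)
          IK = gK * n ** 4 * (V - EK)
          IL = gL * (V - EL)
          V += dt * (I - INa - IK - IL) / C      # one spreadsheet row per step
          n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
          m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
          h += dt * (a_h(V) * (1 - h) - b_h(V) * h)

    Varying the stimulus amplitude, duration and timing reproduces the threshold, refractoriness and summation behaviours the article explores in the spreadsheet.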

  10. PLAN: a web platform for automating high-throughput BLAST searches and for managing and mining results.

    Science.gov (United States)

    He, Ji; Dai, Xinbin; Zhao, Xuechun

    2007-02-09

    BLAST searches are widely used for sequence alignment. The search results are commonly adopted for various functional and comparative genomics tasks such as annotating unknown sequences, investigating gene models and comparing two sequence sets. Advances in sequencing technologies pose challenges for high-throughput analysis of large-scale sequence data. A number of programs and hardware solutions exist for efficient BLAST searching, but there is a lack of generic software solutions for mining and personalized management of the results. Systematically reviewing the results and identifying information of interest remains tedious and time-consuming. Personal BLAST Navigator (PLAN) is a versatile web platform that helps users to carry out various personalized pre- and post-BLAST tasks, including: (1) query and target sequence database management, (2) automated high-throughput BLAST searching, (3) indexing and searching of results, (4) filtering results online, (5) managing results of personal interest in favorite categories, (6) automated sequence annotation (such as NCBI NR and ontology-based annotation). PLAN integrates, by default, the Decypher hardware-based BLAST solution provided by Active Motif Inc. with a greatly improved efficiency over conventional BLAST software. BLAST results are visualized by spreadsheets and graphs and are full-text searchable. BLAST results and sequence annotations can be exported, in part or in full, in various formats including Microsoft Excel and FASTA. Sequences and BLAST results are organized in projects, the data publication levels of which are controlled by the registered project owners. In addition, all analytical functions are provided to public users without registration. PLAN has proved a valuable addition to the community for automated high-throughput BLAST searches, and, more importantly, for knowledge discovery, management and sharing based on sequence alignment results. The PLAN web interface is platform

  11. PLAN: a web platform for automating high-throughput BLAST searches and for managing and mining results

    Directory of Open Access Journals (Sweden)

    Zhao Xuechun

    2007-02-01

    Full Text Available Abstract Background BLAST searches are widely used for sequence alignment. The search results are commonly adopted for various functional and comparative genomics tasks such as annotating unknown sequences, investigating gene models and comparing two sequence sets. Advances in sequencing technologies pose challenges for high-throughput analysis of large-scale sequence data. A number of programs and hardware solutions exist for efficient BLAST searching, but there is a lack of generic software solutions for mining and personalized management of the results. Systematically reviewing the results and identifying information of interest remains tedious and time-consuming. Results Personal BLAST Navigator (PLAN) is a versatile web platform that helps users to carry out various personalized pre- and post-BLAST tasks, including: (1) query and target sequence database management, (2) automated high-throughput BLAST searching, (3) indexing and searching of results, (4) filtering results online, (5) managing results of personal interest in favorite categories, (6) automated sequence annotation (such as NCBI NR and ontology-based annotation). PLAN integrates, by default, the Decypher hardware-based BLAST solution provided by Active Motif Inc. with a greatly improved efficiency over conventional BLAST software. BLAST results are visualized by spreadsheets and graphs and are full-text searchable. BLAST results and sequence annotations can be exported, in part or in full, in various formats including Microsoft Excel and FASTA. Sequences and BLAST results are organized in projects, the data publication levels of which are controlled by the registered project owners. In addition, all analytical functions are provided to public users without registration. Conclusion PLAN has proved a valuable addition to the community for automated high-throughput BLAST searches, and, more importantly, for knowledge discovery, management and sharing based on sequence alignment results

  12. A Cognitive Adopted Framework for IoT Big-Data Management and Knowledge Discovery Prospective

    OpenAIRE

    Mishra, Nilamadhab; Lin, Chung-Chih; Chang, Hsien-Tsung

    2015-01-01

    In future IoT big-data management and knowledge discovery for large-scale industrial automation applications, the importance of the industrial internet is increasing day by day. Several diverse technologies, such as the IoT (Internet of Things), computational intelligence, machine-type communication, big data, and sensor technology, can be incorporated together to improve the data management and knowledge discovery efficiency of large-scale automation applications. So in this work, we need to propos...

  13. Topology Discovery Using Cisco Discovery Protocol

    OpenAIRE

    Rodriguez, Sergio R.

    2009-01-01

    In this paper we address the problem of discovering network topology in proprietary networks. Namely, we investigate topology discovery in Cisco-based networks. Cisco devices run Cisco Discovery Protocol (CDP) which holds information about these devices. We first compare properties of topologies that can be obtained from networks deploying CDP versus Spanning Tree Protocol (STP) and Management Information Base (MIB) Forwarding Database (FDB). Then we describe a method of discovering topology ...

  14. Use of combinatorial chemistry to speed drug discovery.

    Science.gov (United States)

    Rádl, S

    1998-10-01

    IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.

  15. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  16. Polar Domain Discovery with Sparkler

    Science.gov (United States)

    Duerr, R.; Khalsa, S. J. S.; Mattmann, C. A.; Ottilingam, N. K.; Singh, K.; Lopez, L. A.

    2017-12-01

    The scientific web is vast and ever growing. It encompasses millions of textual, scientific and multimedia documents describing research in a multitude of scientific streams. Most of these documents are hidden behind forms which require user action to retrieve and thus can't be directly accessed by content crawlers. These documents are hosted on web servers across the world, most often on outdated hardware and network infrastructure. Hence it is difficult and time-consuming to aggregate documents from the scientific web, especially those relevant to a specific domain, and generating meaningful domain-specific insights is currently difficult. We present an automated discovery system (Figure 1) using Sparkler, an open-source, extensible, horizontally scalable crawler which facilitates high-throughput, focused crawling of documents pertinent to a particular domain, such as information about polar regions. With this set of highly domain-relevant documents, we show that it is possible to answer analytical questions about that domain. Our domain discovery algorithm leverages prior domain knowledge to reach out to commercial/scientific search engines to generate seed URLs. Subject matter experts then annotate these seed URLs manually on a scale from highly relevant to irrelevant. We leverage this annotated dataset to train a machine learning model which predicts the 'domain relevance' of a given document. We extend Sparkler with this model to focus crawling on documents relevant to that domain. Sparkler avoids disruption of service by (1) partitioning URLs by hostname such that every node gets a different host to crawl and (2) inserting delays between subsequent requests. Using Wrangler, an NSF-funded supercomputer, we scaled our domain discovery pipeline to crawl about 200k polar-specific documents from the scientific web within a day.
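
    The abstract describes the relevance model only as a classifier trained on expert-annotated seed URLs. A hedged sketch of that step with a TF-IDF plus logistic-regression pipeline (toy data; Sparkler's actual model and features are not specified in the record):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      docs = ["sea ice extent in the arctic", "celebrity gossip roundup"]
      labels = [1, 0]                    # annotated: relevant / irrelevant

      model = make_pipeline(TfidfVectorizer(), LogisticRegression())
      model.fit(docs, labels)
      # Probability that a newly crawled page is domain relevant:
      print(model.predict_proba(["permafrost thaw observations"])[:, 1])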

  17. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of an automated production line for ceramic fuel pellets. (M.G.B.)

  18. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  19. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes how to build an automation plan and design automation facilities. It covers the automation of machining (chip-producing) processes, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling, tapping, boring, milling and slide units; hydraulics, including its characteristics and basic hydraulic circuits; pneumatics; and the kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  20. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  1. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  2. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation and the demand for customized products and services on one side, and the need for constructive dialogue with customers, immediate and flexible response, and measurable investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  3. An Excel Spreadsheet Model for States and Districts to Assess the Cost-Benefit of School Nursing Services.

    Science.gov (United States)

    Wang, Li Yan; O'Brien, Mary Jane; Maughan, Erin D

    2016-11-01

    This paper describes a user-friendly, Excel spreadsheet model and two data collection instruments constructed by the authors to help states and districts perform cost-benefit analyses of school nursing services delivered by full-time school nurses. Prior to applying the model, states or districts need to collect data using two forms: "Daily Nurse Data Collection Form" and the "Teacher Survey." The former is used to record daily nursing activities, including number of student health encounters, number of medications administered, number of student early dismissals, and number of medical procedures performed. The latter is used to obtain estimates for the time teachers spend addressing student health issues. Once inputs are entered in the model, outputs are automatically calculated, including program costs, total benefits, net benefits, and benefit-cost ratio. The spreadsheet model, data collection tools, and instructions are available at the NASN website ( http://www.nasn.org/The/CostBenefitAnalysis ).
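
    The arithmetic the spreadsheet automates is straightforward once the two forms have been tallied; a minimal sketch (the field names and dollar figures below are invented for illustration, not the NASN model's actual inputs):

        def cost_benefit(program_cost, benefits):
            """Summarize a school nursing cost-benefit analysis.
            `benefits` maps monetized benefit categories to dollar values."""
            total_benefits = sum(benefits.values())
            net_benefits = total_benefits - program_cost
            return total_benefits, net_benefits, total_benefits / program_cost

        # Hypothetical district-level inputs (US$ per school year):
        total, net, ratio = cost_benefit(
            program_cost=134_000,
            benefits={"teacher_time_saved": 42_000,
                      "parent_productivity": 28_000,
                      "procedures_and_visits": 96_000})
        print(f"net benefit = ${net:,}, benefit-cost ratio = {ratio:.2f}")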

  4. Simulation of 2D Waves in Circular Membrane Using Excel Spreadsheet with Visual Basic for Teaching Activity

    Science.gov (United States)

    Eso, R.; Safiuddin, L. O.; Agusu, L.; Arfa, L. M. R. F.

    2018-04-01

    We propose a teaching instrument that demonstrates circular membrane waves using interactive Excel spreadsheets with Visual Basic for Applications (VBA) programming. It is based on the analytic solution for circular membrane waves, which involves Bessel functions. The vibration modes and frequencies are determined using a Bessel approximation and the initial conditions. The 3D perspective built from spreadsheet functions and facilities is used to show 3D objects in translational or rotational motion. This instrument is useful both in teaching and in learning wave physics. The visualization, which clearly shows the (m, n) vibration modes of the membrane at a given frequency, has been compared with and matched to experimental results obtained by the resonance method. The peak deflection varies in time when a nonzero initial condition is applied, and shows the same pattern as a MATLAB simulation with zero initial velocity.
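
    The analytic machinery behind the spreadsheet is compact. A minimal sketch (NumPy/SciPy rather than VBA; the standard clamped-edge solution, with parameter values chosen only for illustration) evaluates one (m, n) mode:

        import numpy as np
        from scipy.special import jv, jn_zeros

        def membrane_mode(m, n, a=1.0, c=1.0, t=0.0, n_grid=100):
            """Deflection of the (m, n) standing-wave mode of a circular
            membrane of radius a, clamped at the edge, with wave speed c."""
            k = jn_zeros(m, n)[-1] / a   # n-th zero of J_m sets the wavenumber
            r = np.linspace(0.0, a, n_grid)
            th = np.linspace(0.0, 2.0 * np.pi, n_grid)
            R, TH = np.meshgrid(r, th)
            u = jv(m, k * R) * np.cos(m * TH) * np.cos(c * k * t)
            return R * np.cos(TH), R * np.sin(TH), u   # x, y, deflection

        X, Y, U = membrane_mode(m=2, n=1)   # e.g. plot with plot_surface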

  5. Mass spectrometry for protein quantification in biomarker discovery.

    Science.gov (United States)

    Wang, Mu; You, Jinsam

    2012-01-01

    Major technological advances have made proteomics an extremely active field for biomarker discovery in recent years due primarily to the development of newer mass spectrometric technologies and the explosion in genomic and protein bioinformatics. This leads to an increased emphasis on larger scale, faster, and more efficient methods for detecting protein biomarkers in human tissues, cells, and biofluids. Most current proteomic methodologies for biomarker discovery, however, are not highly automated and are generally labor-intensive and expensive. More automation and improved software programs capable of handling a large amount of data are essential to reduce the cost of discovery and to increase throughput. In this chapter, we discuss and describe mass spectrometry-based proteomic methods for quantitative protein analysis.

  6. Calculation of economic and financing of NPP and conventional power plant using spreadsheet innovation

    International Nuclear Information System (INIS)

    Moch Djoko Birmano; Imam Bastori

    2008-01-01

    A study of the economics and financing of a nuclear power plant (NPP) and a conventional power plant, using a spreadsheet innovation, has been carried out. As a case study, the NPP is a 1050 MWe-class PWR represented by OPR-1000 (Optimized Power Reactor, 1000 MWe), and the conventional plant is a 600 MWe-class coal-fired power plant (Coal PP). The purpose of the study is to assess the economic and financial feasibility of OPR-1000 and the Coal PP. The study concludes that, economically, OPR-1000 is more feasible than the Coal PP because its generation cost is lower. Financially, OPR-1000 is more beneficial than the Coal PP because of its higher net present value (NPV) at the end of the economic lifetime and its higher benefit-cost (B/C) ratio. For both plants, a higher discount rate is unfavourable. The NPP is more sensitive to changes in the discount rate than the Coal PP, whereas the Coal PP is more sensitive to changes in the power purchase price. (author)
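
    The two headline metrics are simple discounted sums, so their sensitivity to the discount rate is easy to reproduce; a minimal sketch with invented cash flows (not the study's data):

        def npv(rate, cash_flows):
            """Net present value of year-indexed cash flows (year 0 first)."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        # Hypothetical 40-year plant: heavy up-front cost, steady net revenue.
        costs = [3000] + [80] * 40           # million US$
        benefits = [0] + [260] * 40
        flows = [b - c for b, c in zip(benefits, costs)]
        for r in (0.05, 0.08, 0.10):
            bc = npv(r, benefits) / npv(r, costs)
            print(f"r={r:.0%}: NPV={npv(r, flows):7.0f}  B/C={bc:.2f}")

    Raising the discount rate shrinks both NPV and the B/C ratio, the behaviour the study reports for both plant types.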

  7. Can automation in radiotherapy reduce costs?

    Science.gov (United States)

    Massaccesi, Mariangela; Corti, Michele; Azario, Luigi; Balducci, Mario; Ferro, Milena; Mantini, Giovanna; Mattiucci, Gian Carlo; Valentini, Vincenzo

    2015-01-01

    Computerized automation is likely to play an increasingly important role in radiotherapy. The objective of this study was to report the results of the first part of a program to implement a model for economic evaluation based on the micro-costing method. To test the efficacy of the model, the financial impact of the introduction of an automation tool was estimated. A single- and multi-center validation of the model by a prospective collection of data is planned as the second step of the program. The model was implemented using an interactive spreadsheet (Microsoft Excel, 2010). The variables to be included were identified across three components: productivity, staff, and equipment. To calculate staff requirements, the workflow of the Gemelli ART center was mapped out and relevant workload measures were defined. Profit and loss, productivity and staffing were identified as significant outcomes. Results were presented in terms of earnings before interest and taxes (EBIT). Three different scenarios were hypothesized: the baseline situation at Gemelli ART (scenario 1); a reduction by 2 minutes in the average duration of treatment fractions (scenario 2); and an increased incidence of advanced treatment modalities (scenario 3). Using the model, predicted EBIT values for each scenario were calculated across a period of eight years (2015 to 2022). For both scenarios 2 and 3, costs are expected to increase slightly compared to the baseline, mainly due to a small increase in clinical personnel costs. However, in both cases EBIT values are more favorable than in the baseline situation (EBIT values: scenario 1, 27%; scenario 2, 30%; scenario 3, 28% of revenues). A model based on a micro-costing method was able to estimate the financial consequences of the introduction of an automation tool in our radiotherapy department. A prospective collection of data at Gemelli ART and in a consortium of centers is currently under way to prospectively validate the model.
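
    In skeleton form the model's headline output is a revenue and cost projection per scenario. The sketch below is purely illustrative; the figures are invented, chosen only so that the EBIT margins match the percentages quoted above:

        def ebit_margin(revenues, costs):
            """EBIT expressed as a percentage of revenues."""
            return 100.0 * (revenues - costs) / revenues

        # Hypothetical annual projections (thousand EUR) for the three scenarios.
        scenarios = {
            "1: baseline":                 (9_500, 6_935),
            "2: -2 min per fraction":      (10_000, 7_000),
            "3: more advanced modalities": (9_800, 7_056),
        }
        for name, (rev, cost) in scenarios.items():
            print(f"scenario {name}: EBIT = {ebit_margin(rev, cost):.0f}% of revenues")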

  8. Academic Drug Discovery Centres

    DEFF Research Database (Denmark)

    Kirkegaard, Henriette Schultz; Valentin, Finn

    2014-01-01

    Academic drug discovery centres (ADDCs) are seen as one of the solutions to fill the innovation gap in early drug discovery, which has proven challenging for previous organisational models. Prior studies of ADDCs have identified the need to analyse them from the angle of their economic...

  9. Decades of Discovery

    Science.gov (United States)

    2011-06-01

    For the past two-and-a-half decades, the Office of Science at the U.S. Department of Energy has been at the forefront of scientific discovery. Over 100 important discoveries supported by the Office of Science are represented in this document.

  10. "Eureka, Eureka!" Discoveries in Science

    Science.gov (United States)

    Agarwal, Pankaj

    2011-01-01

    Accidental discoveries have been of significant value in the progress of science. Although accidental discoveries are more common in pharmacology and chemistry, other branches of science have also benefited from such discoveries. While most discoveries are the result of persistent research, famous accidental discoveries provide a fascinating…

  11. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  12. Automation of P-3 Simulations to Improve Operator Workload

    Science.gov (United States)

    2012-09-01

    …this thesis and because they each have a unique approach to solving the problem of entity behavior automation. A. DISCOVERY MACHINE: The United States… from the operators and can be automated in JSAF using the mental simulation approach. Two trips were conducted to visit the Naval Warfare…

  13. The Greatest Mathematical Discovery?

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2010-05-12

    What mathematical discovery more than 1500 years ago: (1) Is one of the greatest, if not the greatest, single discovery in the field of mathematics? (2) Involved three subtle ideas that eluded the greatest minds of antiquity, even geniuses such as Archimedes? (3) Was fiercely resisted in Europe for hundreds of years after its discovery? (4) Even today, in historical treatments of mathematics, is often dismissed with scant mention, or else is ascribed to the wrong source? Answer: Our modern system of positional decimal notation with zero, together with the basic arithmetic computational schemes, which were discovered in India about 500 CE.

  14. Automated Service Discovery using Autonomous Control Technologies, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — With the advent of mobile commerce technologies, the realization of pervasive computing and the formation of ad-hoc networks can be leveraged to the benefit of the...

  15. Automated Discovery of Internet Censorship by Web Crawling

    OpenAIRE

    Darer, Alexander; Farnan, Oliver; Wright, Joss

    2018-01-01

    Censorship of the Internet is widespread around the world. As access to the web becomes increasingly ubiquitous, filtering of this resource becomes more pervasive. Transparency about the specific content that citizens are denied access to is atypical. To counter this, numerous techniques for maintaining URL filter lists have been proposed by various individuals and organisations that aim to provide empirical data on censorship for the benefit of the public and the wider censorship research community. We present ...

  16. Automated Atmospheric Composition Dataset Level Metadata Discovery. Difficulties and Surprises

    Science.gov (United States)

    Strub, R. F.; Falke, S. R.; Kempler, S.; Fialkowski, E.; Goussev, O.; Lynnes, C.

    2015-12-01

    The Atmospheric Composition Portal (ACP) is an aggregator and curator of information related to remotely sensed atmospheric composition data and analysis. It uses existing tools and technologies and, where needed, enhances those capabilities to provide interoperable access, tools, and contextual guidance for scientists and value-adding organizations using remotely sensed atmospheric composition data. The initial focus is on Essential Climate Variables identified by the Global Climate Observing System: CH4, CO, CO2, NO2, O3, SO2 and aerosols. This poster addresses our efforts in building the ACP Data Table, an interface to help discover and understand remotely sensed data related to atmospheric composition science and applications. We harvested the GCMD, CWIC and GEOSS metadata catalogs using machine-to-machine technologies (OpenSearch, Web Services), and manually investigated the plethora of CEOS data-provider portals and other catalogs where such data might be aggregated. This poster reports our experience of the excellence, variety, and challenges we encountered. Conclusions:
    1. The significant benefits the major catalogs provide are their machine-to-machine tools, such as OpenSearch and Web Services, rather than any GUI usability improvements, given the large amount of data in their catalogs.
    2. There is a trend at the large catalogs towards simulating small data-provider portals through advanced services.
    3. Populating metadata catalogs using ISO 19115 is too complex for users to do consistently; the records are difficult to parse visually or with XML libraries, and too complex for Java XML binders like CASTOR.
    4. Searching for IDs first and then for data (GCMD and ECHO) works better for machine-to-machine operations than the timeouts experienced when returning the entire metadata entry at once.
    5. Metadata harvest and export activities between the major catalogs have led to a significant amount of duplication (this is currently being addressed).
    6. Most (if not all) Earth science atmospheric composition data providers store a reference to their data at GCMD.
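
    Machine-to-machine harvesting of the kind described above typically means paging through an OpenSearch endpoint that returns an Atom feed. A minimal sketch (the endpoint URL and query parameter names are placeholders, not GCMD's or CWIC's actual interfaces):

        import requests
        import xml.etree.ElementTree as ET

        ATOM = "{http://www.w3.org/2005/Atom}"

        def harvest(endpoint, keyword, page_size=50, max_pages=10):
            """Page through an OpenSearch endpoint, yielding entry titles/ids."""
            for page in range(1, max_pages + 1):
                resp = requests.get(endpoint, params={
                    "q": keyword, "count": page_size, "startPage": page})
                resp.raise_for_status()
                entries = ET.fromstring(resp.content).findall(f"{ATOM}entry")
                if not entries:
                    break    # past the last page of results
                for e in entries:
                    yield e.findtext(f"{ATOM}title"), e.findtext(f"{ATOM}id")

        # for title, ident in harvest("https://example.org/opensearch", "SO2"):
        #     print(title, ident)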

  17. Context-sensitive service discovery experimental prototype and evaluation

    DEFF Research Database (Denmark)

    Balken, Robin; Haukrogh, Jesper; L. Jensen, Jens

    2007-01-01

    The number of different networks and services available to users today is increasing. This introduces the need for a way to locate relevant services, and to sort out irrelevant ones, in the process of discovering the services available to a user. This paper describes and evaluates a prototype of an automated discovery and selection system, which locates services relevant to a user based on his/her context and the context of the available services. The prototype includes a multi-level, hierarchical system approach and introduces entities called User-nodes, Super-nodes and Root-nodes. These entities separate the network into domains that handle the complex distributed service discovery, which is based on dynamically changing context information. In the prototype, a method for performing context-sensitive service discovery has been realised. The service discovery part utilizes UPnP, which has been expanded in order...

  18. Multidimensional process discovery

    NARCIS (Netherlands)

    Ribeiro, J.T.S.

    2013-01-01

    Typically represented in event logs, business process data describe the execution of process events over time. Business process intelligence (BPI) techniques such as process mining can be applied to get strategic insight into business processes. Process discovery, conformance checking and

  19. Fateful discovery almost forgotten

    CERN Multimedia

    1989-01-01

    "The discovery of the fission of uranium exactly half a century ago is at risk of passing unremarked because of the general ambivalence towards the consequences of this development. Can that be wise?" (4 pages)

  20. Toxins and drug discovery.

    Science.gov (United States)

    Harvey, Alan L

    2014-12-15

    Components from venoms have stimulated many drug discovery projects, with some notable successes. These are briefly reviewed, from captopril to ziconotide. However, there have been many more disappointments on the road from toxin discovery to approval of a new medicine. Drug discovery and development is an inherently risky business, and the main causes of failure during development programmes are outlined in order to highlight steps that might be taken to increase the chances of success with toxin-based drug discovery. These include having a clear focus on unmet therapeutic needs, concentrating on targets that are well-validated in terms of their relevance to the disease in question, making use of phenotypic screening rather than molecular-based assays, and working with development partners with the resources required for the long and expensive development process. Copyright © 2014 The Author. Published by Elsevier Ltd. All rights reserved.

  1. Defining Creativity with Discovery

    OpenAIRE

    Wilson, Nicholas Charles; Martin, Lee

    2017-01-01

    The standard definition of creativity has enabled significant empirical and theoretical advances, yet contains philosophical conundrums concerning the nature of novelty and the role of recognition and values. In this work we offer an act of conceptual valeting that addresses these issues and in doing so, argue that creativity definitions can be extended through the use of discovery. Drawing on dispositional realist philosophy we outline why adding the discovery and bringing into being of new ...

  2. On the antiproton discovery

    International Nuclear Information System (INIS)

    Piccioni, O.

    1989-01-01

    The author of this article describes his own role in the discovery of the antiproton. Although Segre and Chamberlain received the Nobel Prize in 1959 for its discovery, the author claims that their experimental method was his idea which he communicated to them informally in December 1954. He describes how his application for citizenship (he was Italian), and other scientists' manipulation, prevented him from being at Berkeley to work on the experiment himself. (UK)

  3. Discovery Driven Growth

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj

    2009-01-01

    Review of Discovery-Driven Growth: A breakthrough process to reduce risk and seize opportunity, by Rita G. McGrath & Ian C. MacMillan, Boston: Harvard Business Press. Publication date: 14 August.

  4. The π discovery

    International Nuclear Information System (INIS)

    Fowler, P.H.

    1988-01-01

    The paper traces the discovery of the π meson. The discovery was made by exposing nuclear emulsions to cosmic radiation at high altitudes, with subsequent scanning of the emulsions for meson tracks. The disintegration of nuclei by a negative meson, and the decay of a π meson, were both observed. Further measurements revealed the mass of the meson. The studies carried out on the origin of the π mesons, and on their mode of decay, are both described. (U.K.)

  5. Shotgun Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    W. Hayes McDonald

    2002-01-01

    Full Text Available Coupling large-scale sequencing projects with the amino acid sequence information that can be gleaned from tandem mass spectrometry (MS/MS has made it much easier to analyze complex mixtures of proteins. The limits of this “shotgun” approach, in which the protein mixture is proteolytically digested before separation, can be further expanded by separating the resulting mixture of peptides prior to MS/MS analysis. Both single dimensional high pressure liquid chromatography (LC and multidimensional LC (LC/LC can be directly interfaced with the mass spectrometer to allow for automated collection of tremendous quantities of data. While there is no single technique that addresses all proteomic challenges, the shotgun approaches, especially LC/LC-MS/MS-based techniques such as MudPIT (multidimensional protein identification technology, show advantages over gel-based techniques in speed, sensitivity, scope of analysis, and dynamic range. Advances in the ability to quantitate differences between samples and to detect for an array of post-translational modifications allow for the discovery of classes of protein biomarkers that were previously unassailable.

  6. Discovery informatics in biological and biomedical sciences: research challenges and opportunities.

    Science.gov (United States)

    Honavar, Vasant

    2015-01-01

    New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate and analyze data, and to construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of the management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery, or assisting humans in discovery, through advances in (i) understanding, formalizing, and developing information-processing accounts of the entire scientific process; (ii) the design, development, and evaluation of computational artifacts (representations, processes) that embody such understanding; and (iii) the application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).

  7. Simulations of the cardiac action potential based on the Hodgkin-Huxley kinetics with the use of Microsoft Excel spreadsheets.

    Science.gov (United States)

    Wu, Sheng-Nan

    2004-03-31

    The purpose of this study was to develop a method to simulate the cardiac action potential using a Microsoft Excel spreadsheet. The mathematical model contained voltage-gated ionic currents that were modeled using either Beeler-Reuter (B-R) or Luo-Rudy (L-R) phase 1 kinetics. The simulation protocol involves the use of in-cell formulas directly typed into a spreadsheet. The capability of spreadsheet iteration was used in these simulations. It does not require any prior knowledge of computer programming, although the use of the macro language can speed up the calculation. The normal configuration of the cardiac ventricular action potential can be well simulated in the B-R model that is defined by four individual ionic currents, each representing the diffusion of ions through channels in the membrane. The contribution of Na+ inward current to the rate of depolarization is reproduced in this model. After removal of Na+ current from the model, a constant current stimulus elicits an oscillatory change in membrane potential. In the L-R phase 1 model where six types of ionic currents were defined, the effect of extracellular K+ concentration on changes both in the time course of repolarization and in the time-independent K+ current can be demonstrated, when the solutions are implemented in Excel. Using the simulation protocols described here, the users can readily study and graphically display the underlying properties of ionic currents to see how changes in these properties determine the behavior of the heart cell. The method employed in these simulation protocols may also be extended or modified to other biological simulation programs.
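
    The "capability of spreadsheet iteration" used here is explicit (forward Euler) time stepping: each pass computes V(t+dt) from V(t). A minimal sketch of the same update rule, with the two-variable FitzHugh-Nagumo model standing in for the full B-R or L-R ionic kinetics (so the parameters are dimensionless and illustrative only):

        def fitzhugh_nagumo(dt=0.01, t_end=100.0, a=0.7, b=0.8, tau=12.5):
            """Forward-Euler integration, the same cell-by-cell update rule
            a spreadsheet iterates; FHN stands in for the ionic models."""
            v, w, trace = -1.2, -0.6, []
            for step in range(int(t_end / dt)):
                t = step * dt
                i_stim = 1.0 if 5.0 <= t < 7.0 else 0.0   # brief stimulus
                dv = v - v ** 3 / 3.0 - w + i_stim        # fast membrane variable
                dw = (v + a - b * w) / tau                # slow recovery variable
                v += dt * dv
                w += dt * dw
                trace.append((t, v))
            return trace   # superthreshold stimulus gives a spike-like excursion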

  8. A novel real-time data acquisition using an Excel spreadsheet in pendulum experiment tool with light-based timer

    Science.gov (United States)

    Adhitama, Egy; Fauzi, Ahmad

    2018-05-01

    In this study, a pendulum experimental tool with a light-based timer was developed to measure the period of a simple pendulum, with the data recorded automatically in an Excel spreadsheet. The intensity of monochromatic light sensed by a 3DU5C phototransistor changes dynamically as the pendulum swings. The changing intensity varies the resistance value, which is processed by an ATMega328 microcontroller to obtain the signal period as a function of time and brightness as the pendulum crosses the light beam. From the averaged periods measured in the experiment, the gravitational acceleration was determined accurately and precisely.
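
    Recovering g from the recorded periods is a single formula, T = 2π√(L/g); a minimal sketch with invented sample readings:

        import math

        def g_from_periods(length_m, periods_s):
            """Small-angle pendulum: T = 2*pi*sqrt(L/g) => g = 4*pi^2*L/T^2."""
            t_mean = sum(periods_s) / len(periods_s)
            return 4.0 * math.pi ** 2 * length_m / t_mean ** 2

        # Hypothetical readings from the light-based timer (seconds):
        print(g_from_periods(0.50, [1.418, 1.422, 1.420, 1.419]))  # ~9.8 m/s^2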

  9. Main requirement for Zr-Zirconium alloys characteristics data base using the spreadsheet EXCEL

    International Nuclear Information System (INIS)

    Cesari, F.; Chiarini, A.; Izzo, N.

    1995-01-01

    The work presented here is the result of research that the authors have carried out in recent years using different kinds of application software. Its justification rests on the observation that a designer of nuclear plant components rarely finds acceptable answers to queries put to a large database of structural materials. Such databases generally contain information that is not easily usable, and often incomplete or not specific to the project the user is developing. In daily work a designer needs not only to retrieve data, but also to select and fit the data according to evaluation criteria that can almost never be reduced to general rules. It is therefore useful to arrange this kind of information in an open system that allows, on one hand, the creation of a database containing data from the current technical literature, together with models for their phenomenological representation (constitutive equations), and, on the other, the aggregation of experimental data not openly available to which the designer nevertheless has access. Moreover, the system must let the designer apply particular mathematical models to subsets of selected data, with criteria that originate from his experience and his original theoretical-experimental knowledge. In such a context a graphical representation of data and results, with simple manipulations provided by effective graphic tools, also becomes necessary. The choice of the application software is therefore a very critical operation. Since the number of data needed to characterize a material in such a database is generally limited, a PC of the latest generation can safely serve as the physical platform of the system, running commercially available software. A spreadsheet of recent type, Microsoft EXCEL, seemed most appropriate. In fact, it allows the protection of

  10. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
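
    The two modeling choices can be made concrete in a few lines. A minimal sketch (the recurrence mean, standard deviation and elapsed time are illustrative, not the paleoseismic estimates) compares the 50-year probability under a time-independent Poisson model with the conditional probability from a Gaussian renewal model:

        from math import exp
        from statistics import NormalDist

        def poisson_prob(mean_recurrence, window):
            """Time-independent model: P(event in window) = 1 - exp(-w/mu)."""
            return 1.0 - exp(-window / mean_recurrence)

        def renewal_prob(mean, sigma, elapsed, window):
            """Gaussian renewal model: probability the next event falls in
            (elapsed, elapsed + window] given it has not yet occurred."""
            F = NormalDist(mean, sigma).cdf
            return (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))

        # Illustrative numbers: 500-yr mean recurrence, 320 yr since the last event.
        print(poisson_prob(500, 50))              # ~0.10, independent of history
        print(renewal_prob(500, 200, 320, 50))    # grows as the quiet period lengthens

    Changing the assumed recurrence parameters, or switching between the two models, moves the quoted probability substantially, which is exactly the sensitivity the abstract asks students to explore.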

  11. Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Yuting [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunham, Camilla [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Hannah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds off that described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as net present value (NPV) of that savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers that may result in the bulk of national savings. Modeled

  12. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  13. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  14. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  15. Natural Products for Drug Discovery in the 21st Century: Innovations for Novel Drug Discovery

    Directory of Open Access Journals (Sweden)

    Nicholas Ekow Thomford

    2018-05-01

    Full Text Available The therapeutic properties of plants have been recognised since time immemorial. Many pathological conditions have been treated using plant-derived medicines. These medicines are used as concoctions or concentrated plant extracts without isolation of active compounds. Modern medicine, however, requires the isolation and purification of one or two active compounds. There are, however, many global health challenges with diseases such as cancer, degenerative diseases, HIV/AIDS and diabetes, for which modern medicine is struggling to provide cures. In many cases the isolation of the "active compound" has made the compound ineffective. Drug discovery is a multidimensional problem requiring several parameters of both natural and synthetic compounds, such as safety, pharmacokinetics and efficacy, to be evaluated during drug candidate selection. The advent of the latest technologies that enhance drug design hypotheses, such as Artificial Intelligence, the use of 'organ-on-chip' and microfluidics technologies, means that automation has become part of drug discovery. This has resulted in increased speed in drug discovery and in the evaluation of the safety, pharmacokinetics and efficacy of candidate compounds, whilst allowing novel ways of drug design and synthesis based on natural compounds. Recent advances in analytical and computational techniques have opened new avenues to process complex natural products and to use their structures to derive new and innovative drugs. Indeed, we are in the era of computational molecular design, as applied to natural products. Predictive computational software has contributed to the discovery of molecular targets of natural products and their derivatives. In future, the use of quantum computing, computational software and databases in modelling molecular interactions and predicting features and parameters needed for drug development, such as pharmacokinetics and pharmacodynamics, will result in fewer false positive leads in drug

  16. Natural Products for Drug Discovery in the 21st Century: Innovations for Novel Drug Discovery.

    Science.gov (United States)

    Thomford, Nicholas Ekow; Senthebane, Dimakatso Alice; Rowe, Arielle; Munro, Daniella; Seele, Palesa; Maroyi, Alfred; Dzobo, Kevin

    2018-05-25

    The therapeutic properties of plants have been recognised since time immemorial. Many pathological conditions have been treated using plant-derived medicines. These medicines are used as concoctions or concentrated plant extracts without isolation of active compounds. Modern medicine, however, requires the isolation and purification of one or two active compounds. There are, however, many global health challenges with diseases such as cancer, degenerative diseases, HIV/AIDS and diabetes, for which modern medicine is struggling to provide cures. In many cases the isolation of the "active compound" has made the compound ineffective. Drug discovery is a multidimensional problem requiring several parameters of both natural and synthetic compounds, such as safety, pharmacokinetics and efficacy, to be evaluated during drug candidate selection. The advent of the latest technologies that enhance drug design hypotheses, such as Artificial Intelligence, the use of 'organ-on-chip' and microfluidics technologies, means that automation has become part of drug discovery. This has resulted in increased speed in drug discovery and in the evaluation of the safety, pharmacokinetics and efficacy of candidate compounds, whilst allowing novel ways of drug design and synthesis based on natural compounds. Recent advances in analytical and computational techniques have opened new avenues to process complex natural products and to use their structures to derive new and innovative drugs. Indeed, we are in the era of computational molecular design, as applied to natural products. Predictive computational software has contributed to the discovery of molecular targets of natural products and their derivatives. In future, the use of quantum computing, computational software and databases in modelling molecular interactions and predicting features and parameters needed for drug development, such as pharmacokinetics and pharmacodynamics, will result in fewer false positive leads in drug development. This review

  17. Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies

    Science.gov (United States)

    Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.

    2016-02-01

    Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
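
    One of the per-channel metrics tracked, OSNR, illustrates why automation helps: it must be re-evaluated at every amplified span. A minimal sketch using the standard per-span EDFA approximation (the span powers and noise figures below are assumptions):

        import math

        def span_osnr_db(p_in_dbm, nf_db):
            """Per-span OSNR in a 0.1 nm reference bandwidth, using the
            common EDFA approximation OSNR = Pin - NF + 58 dB."""
            return p_in_dbm - nf_db + 58.0

        def cascade_osnr_db(per_span_db):
            """End-to-end OSNR of a cascade: reciprocals add in linear units,
            1/OSNR_total = sum_i 1/OSNR_i."""
            inv = sum(10.0 ** (-o / 10.0) for o in per_span_db)
            return -10.0 * math.log10(inv)

        # Hypothetical link: 8 identical spans, -2 dBm per channel, NF = 5 dB.
        spans = [span_osnr_db(-2.0, 5.0)] * 8
        print(f"per span: {spans[0]:.1f} dB, end to end: {cascade_osnr_db(spans):.1f} dB")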

  18. Discovery of charm

    International Nuclear Information System (INIS)

    Goldhaber, G.

    1984-11-01

    In my talk I will cover the period 1973 to 1976, which saw the discoveries of the J/ψ and ψ′ resonances and most of the psion spectroscopy, the τ lepton, and the D⁰, D⁺ charmed-meson doublet. Occasionally I will refer briefly to more recent results. Since this conference is on the history of the weak interactions, I will deal primarily with the properties of naked charm and in particular the weakly decaying doublet of charmed mesons. Most of the discoveries I will mention were made with the SLAC-LBL Magnetic Detector, or MARK I, which we operated at SPEAR from 1973 to 1976. 27 references

  19. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  20. The use of kragten spreadsheets for uncertainty evaluation of uranium potentiometric analysis by the Brazilian Safeguards Laboratory

    International Nuclear Information System (INIS)

    Silva, Jose Wanderley S. da; Barros, Pedro Dionisio de; Araujo, Radier Mario S. de

    2009-01-01

    In safeguards, independent analysis of the uranium content and enrichment of nuclear materials, to verify operators' declarations, is an important tool for evaluating the accountability system applied by nuclear installations. This determination may be performed by nondestructive (NDA) methods, generally done in the field using portable radiation detection systems, or by destructive (DA) methods using chemical analysis when more accurate and precise results are necessary. Samples for DA analysis are collected by inspectors during safeguards inspections and sent to the Safeguards Laboratory (LASAL) of the Brazilian Nuclear Energy Commission (CNEN), where the analyses take place. The method used by LASAL for the determination of uranium in different physical and chemical forms is the Davies and Gray/NBL method, using an automatic potentiometric titrator which performs the titration of uranium(IV) by a standard solution of K2Cr2O7. Uncertainty budgets have been determined based on the concepts of the ISO 'Guide to the Expression of Uncertainty in Measurement' (GUM). In order to simplify the calculation of the uncertainty, a computational tool named the Kragten spreadsheet was used. Such a spreadsheet uses the concepts established by the GUM and provides results that numerically approximate those obtained by propagation of uncertainty with analytically determined sensitivity coefficients. The main parameters (input quantities) affecting the uncertainty were studied. In order to evaluate their contribution to the final uncertainty, the uncertainties of all steps of the analytical method were estimated and compiled. (author)
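
    Kragten's scheme replaces analytic sensitivity coefficients with finite differences: each input is shifted by its standard uncertainty in turn, and the resulting output changes are combined in quadrature. A generic sketch (the titration function below is a placeholder, not LASAL's actual measurement model):

        def kragten_uncertainty(f, x, u):
            """Combined standard uncertainty of y = f(*x) by Kragten's method:
            u(y)^2 = sum_i [f(..., x_i + u_i, ...) - f(*x)]^2."""
            y0 = f(*x)
            contributions = []
            for i in range(len(x)):
                shifted = list(x)
                shifted[i] += u[i]
                contributions.append(f(*shifted) - y0)
            return y0, sum(c * c for c in contributions) ** 0.5

        def titration(v, c, m):
            """Placeholder model: uranium mass fraction from titrant volume
            v (mL), titrant concentration c (mol/L), sample mass m (g)."""
            return v * c * 238.03 / (1000.0 * m)

        y, u_y = kragten_uncertainty(titration, x=[10.05, 0.0250, 0.5012],
                                     u=[0.02, 0.0001, 0.0002])
        print(f"w(U) = {y:.5f} +/- {u_y:.5f} g/g")

    A useful by-product of the method is that the individual difference terms show at a glance which input dominates the uncertainty budget.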

  1. DEVELOPMENT OF A SPREADSHEET BASED VENDOR MANAGED INVENTORY MODEL FOR A SINGLE ECHELON SUPPLY CHAIN: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Karanam Prahlada Rao

    2010-11-01

    Full Text Available Vendor managed inventory (VMI) is a supply chain initiative where the supplier assumes the responsibility for managing inventories using advanced communication means such as online messaging and data retrieval systems. A well-collaborated vendor managed inventory system can improve supply chain performance by decreasing the inventory level and increasing the fill rate. This paper investigates the implementation of vendor managed inventory systems in a consumer goods industry. We consider an (r, Q) policy for replenishing inventory. The objective of the work is to minimize the inventory across the supply chain and maximize the service level. The major contributions of this work are to develop a spreadsheet model for a VMI system; to evaluate total inventory cost using both the spreadsheet-based method and an analytical method; to quantify the inventory reduction; to estimate the service efficiency level; and to validate the VMI spreadsheet model with randomly generated demand. In the application, VMI as an inventory control system is able to reduce the inventory cost without sacrificing the service level. The results furthermore show that the inventory reduction obtained from the analytical method is close to that of the spreadsheet-based approach, which confirms the VMI model's success. The success of VMI is, however, affected by the quality of buyer-supplier relationships, the quality of the IT system and the intensity of information sharing, but not by the quality of the information shared.
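
    The (r, Q) policy is simple to state: whenever the inventory position falls to the reorder point r, order a fixed quantity Q. A minimal simulation sketch (the demand distribution, lead time and holding-cost rate are assumptions, not the case-study data):

        import random

        def simulate_rQ(r, Q, days=1000, lead_time=3, h=0.1, seed=1):
            """Simulate an (r, Q) policy under random daily demand; returns
            average on-hand inventory, fill rate and total holding cost."""
            random.seed(seed)
            on_hand, pipeline = r + Q, []      # pipeline: (arrival_day, qty)
            served = demanded = hold = 0
            for day in range(days):
                on_hand += sum(q for d, q in pipeline if d == day)
                pipeline = [(d, q) for d, q in pipeline if d > day]
                demand = random.randint(0, 20)
                served += min(demand, on_hand)
                demanded += demand
                on_hand = max(0, on_hand - demand)
                hold += on_hand
                position = on_hand + sum(q for _, q in pipeline)
                if position <= r:              # reorder point reached
                    pipeline.append((day + lead_time, Q))
            return hold / days, served / demanded, h * hold

        avg_inv, fill, cost = simulate_rQ(r=60, Q=150)
        print(f"avg inventory={avg_inv:.0f}, fill rate={fill:.1%}, cost={cost:.0f}")

    Sweeping r and Q over a grid and recording cost against fill rate reproduces, in miniature, the trade-off the spreadsheet model quantifies.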

  2. SEMANTIC WEB SERVICES – DISCOVERY, SELECTION AND COMPOSITION TECHNIQUES

    OpenAIRE

    Sowmya Kamath S; Ananthanarayana V.S

    2013-01-01

    Web services are already one of the most important resources on the Internet. As an integrated solution for realizing the vision of the Next Generation Web, semantic web services combine semantic web technology with web service technology, envisioning automated life-cycle management of web services. This paper discusses the significance and importance of service discovery and selection for business logic, and surveys current research on the various phases of the semantic web...

  3. Discovery: Pile Patterns

    Science.gov (United States)

    de Mestre, Neville

    2017-01-01

    Earlier "Discovery" articles (de Mestre, 1999, 2003, 2006, 2010, 2011) considered patterns from many mathematical situations. This article presents a group of patterns used in 19th century mathematical textbooks. In the days of earlier warfare, cannon balls were stacked in various arrangements depending on the shape of the pile base…

  4. Discovery and Innovation

    African Journals Online (AJOL)

    Discovery and Innovation is a journal of the African Academy of Sciences (AAS) ... World (TWAS) meant to focus attention on science and technology in Africa and the ... of Non-wood Forest Products: Potential Impacts and Challenges in Africa ...

  5. Discovery of TUG-770

    DEFF Research Database (Denmark)

    Christiansen, Elisabeth; Hansen, Steffen V F; Urban, Christian

    2013-01-01

    Free fatty acid receptor 1 (FFA1 or GPR40) enhances glucose-stimulated insulin secretion from pancreatic β-cells and currently attracts high interest as a new target for the treatment of type 2 diabetes. We here report the discovery of a highly potent FFA1 agonist with favorable physicochemical...

  6. The discovery of fission

    International Nuclear Information System (INIS)

    McKay, H.A.C.

    1978-01-01

    In this article by the retired head of the Separation Processes Group of the Chemistry Division, Atomic Energy Research Establishment, Harwell, U.K., the author recalls what he terms 'an exciting drama, the unravelling of the nature of the atomic nucleus' in the years before the Second World War, including the discovery of fission. 12 references. (author)

  7. The Discovery of America

    Science.gov (United States)

    Martin, Paul S.

    1973-01-01

    Discusses a model for explaining the spread of human population explosion on North American continent since its discovery 12,000 years ago. The model may help to map the spread of Homo sapiens throughout the New World by using the extinction chronology of the Pleistocene megafauna. (Author/PS)

  8. Using a Spreadsheet to Solve the Schrödinger Equations for the Energies of the Ground Electronic State and the Two Lowest Excited States of H[subscript 2]

    Science.gov (United States)

    Ge, Yingbin; Rittenhouse, Robert C.; Buchanan, Jacob C.; Livingston, Benjamin

    2014-01-01

    We have designed an exercise suitable for a lab or project in an undergraduate physical chemistry course that creates a Microsoft Excel spreadsheet to calculate the energy of the S[subscript 0] ground electronic state and the S[subscript 1] and T[subscript 1] excited states of H[subscript 2]. The spreadsheet calculations circumvent the…

  9. Use of a Spreadsheet to Calculate the Net Charge of Peptides and Proteins as a Function of pH: An Alternative to Using "Canned" Programs to Estimate the Isoelectric Point of These Important Biomolecules

    Science.gov (United States)

    Sims, Paul A.

    2010-01-01

    An approach is presented that utilizes a spreadsheet to allow students to explore different means of calculating and visualizing how the charge on peptides and proteins varies as a function of pH. In particular, the concept of isoelectric point is developed to allow students to compare the results of their spreadsheet calculations with those of…
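
    The underlying calculation applies the Henderson-Hasselbalch relation to every ionizable group. A minimal sketch (using one common textbook pKa set; published tables differ slightly) computes the net charge at a given pH and locates the isoelectric point by bisection:

        # Representative pKa values; other published sets vary by a few tenths.
        PKA_POS = {"Nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}
        PKA_NEG = {"Cterm": 2.0, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}

        def net_charge(seq, ph):
            """Net charge of a peptide at pH (Henderson-Hasselbalch)."""
            groups = ["Nterm", "Cterm"] + list(seq)
            pos = sum(1 / (1 + 10 ** (ph - PKA_POS[g]))
                      for g in groups if g in PKA_POS)
            neg = sum(1 / (1 + 10 ** (PKA_NEG[g] - ph))
                      for g in groups if g in PKA_NEG)
            return pos - neg

        def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
            """pH at which the net charge crosses zero, by bisection."""
            while hi - lo > tol:
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if net_charge(seq, mid) > 0 else (lo, mid)
            return (lo + hi) / 2

        print(isoelectric_point("ACDKR"))   # pI of a short test peptide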

  10. Bead-based screening in chemical biology and drug discovery

    DEFF Research Database (Denmark)

    Komnatnyy, Vitaly V.; Nielsen, Thomas Eiland; Qvortrup, Katrine

    2018-01-01

    High-throughput screening is an important component of the drug discovery process. The screening of libraries containing hundreds of thousands of compounds requires assays amenable to miniaturisation and automation. Combinatorial chemistry holds a unique promise to deliver structurally diverse libraries for early drug discovery. Among the various library forms, the one-bead-one-compound (OBOC) library, where each bead carries many copies of a single compound, holds the greatest potential for the rapid identification of novel hits against emerging drug targets. However, this potential has not yet been fully realized due to a number of technical obstacles. In this feature article, we review the progress that has been made towards bead-based library screening and its applications to the discovery of bioactive compounds. We identify the key challenges of this approach and highlight the key steps needed...

  11. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  12. Current status and future prospects for enabling chemistry technology in the drug discovery process.

    Science.gov (United States)

    Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  13. Current status and future prospects for enabling chemistry technology in the drug discovery process

    Science.gov (United States)

    Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094

  14. The neutron discovery

    International Nuclear Information System (INIS)

    Six, J.

    1987-01-01

    The neutron: who first had the idea, who discovered it, who established its main properties? To these apparently simple questions, multiple answers exist. The progressive discovery of the neutron is a marvellous illustration of certain characteristics of scientific research, where the unforeseen may be combined with the expected. The discovery is placed in the context of the scientific effervescence of the 1930s that followed the revolutionary introduction of quantum mechanics. This book describes the work of Bothe, the Joliot-Curies and Chadwick, which led to the neutron in an unexpected way. A historical analysis allows a new interpretation of the hypothesis suggested by the Joliot-Curies. Texts from the period help the reader relive this fascinating story. [fr]

  15. Atlas of Astronomical Discoveries

    CERN Document Server

    Schilling, Govert

    2011-01-01

    Four hundred years ago in Middelburg, in the Netherlands, the telescope was invented. The invention unleashed a revolution in the exploration of the universe. Galileo Galilei discovered mountains on the Moon, spots on the Sun, and moons around Jupiter. Christiaan Huygens saw details on Mars and rings around Saturn. William Herschel discovered a new planet and mapped binary stars and nebulae. Other astronomers determined the distances to stars, unraveled the structure of the Milky Way, and discovered the expansion of the universe. And, as telescopes became bigger and more powerful, astronomers delved deeper into the mysteries of the cosmos. In his Atlas of Astronomical Discoveries, astronomy journalist Govert Schilling tells the story of 400 years of telescopic astronomy. He looks at the 100 most important discoveries since the invention of the telescope. In his direct and accessible style, the author takes his readers on an exciting journey encompassing the highlights of four centuries of astronomy. Spectacul...

  16. Viral pathogen discovery

    Science.gov (United States)

    Chiu, Charles Y

    2015-01-01

    Viral pathogen discovery is of critical importance to clinical microbiology, infectious diseases, and public health. Genomic approaches for pathogen discovery, including consensus polymerase chain reaction (PCR), microarrays, and unbiased next-generation sequencing (NGS), have the capacity to comprehensively identify novel microbes present in clinical samples. Although numerous challenges remain to be addressed, including the bioinformatics analysis and interpretation of large datasets, these technologies have been successful in rapidly identifying emerging outbreak threats, screening vaccines and other biological products for microbial contamination, and discovering novel viruses associated with both acute and chronic illnesses. Downstream studies such as genome assembly, epidemiologic screening, and a culture system or animal model of infection are necessary to establish an association of a candidate pathogen with disease. PMID:23725672

  17. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems can be useful in the second of the four basic steps in the course of RIA, i.e., preparation of the sample for reaction. Two types of instrumentation were described: a semi-automatic pipette, and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  18. Fateful discovery almost forgotten

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    The paper reviews the discovery of the fission of uranium, which took place fifty years ago. A description is given of the work of Meitner and Frisch in interpreting the Fermi data on the bombardment of uranium nuclei with neutrons, i.e. proposing fission. The historical events associated with the development and exploitation of uranium fission are described, including the Manhattan Project, Hiroshima and Nagasaki, Shippingport, and Chernobyl. (U.K.)

  19. Discovery as a process

    Energy Technology Data Exchange (ETDEWEB)

    Loehle, C.

    1994-05-01

    The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery, or deriving a hypothesis, is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me: "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. They also inhibit the study of science as a process. Finally, they inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  20. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers, extended and revised, from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  1. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management...

  2. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed for procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports samples among the stations on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  4. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  5. The Impacts of Mathematical Representations Developed through Webquest and Spreadsheet Activities on the Motivation of Pre-Service Elementary School Teachers

    Science.gov (United States)

    Halat, Erdogan; Peker, Murat

    2011-01-01

    The purpose of this study was to compare the influence of instruction using WebQuest activities with the influence of an instruction using spreadsheet activities on the motivation of pre-service elementary school teachers in mathematics teaching course. There were a total of 70 pre-service elementary school teachers involved in this study. Thirty…

  6. Developing Students' Understanding of Co-Opetition and Multilevel Inventory Management Strategies in Supply Chains: An In-Class Spreadsheet Simulation Exercise

    Science.gov (United States)

    Fetter, Gary; Shockley, Jeff

    2014-01-01

    Instructors look for ways to explain to students how supply chains can be constructed so that competing suppliers can work together to improve inventory management performance (i.e., a phenomenon known as co-opetition). An Excel spreadsheet-driven simulation is presented that models a complete multilevel supply chain system--customer, retailer,…

  7. Exploring Customization in Higher Education: An Experiment in Leveraging Computer Spreadsheet Technology to Deliver Highly Individualized Online Instruction to Undergraduate Business Students

    Science.gov (United States)

    Kunzler, Jayson S.

    2012-01-01

    This dissertation describes a research study designed to explore whether customization of online instruction results in improved learning in a college business statistics course. The study involved utilizing computer spreadsheet technology to develop an intelligent tutoring system (ITS) designed to: a) collect and monitor individual real-time…

  8. SimpleTreat: a spreadsheet-based box model to predict the fate of xenobiotics in a municipal waste water treatment plant

    NARCIS (Netherlands)

    Struijs J; van de Meent D; Stoltenkamp J

    1991-01-01

    A non-equilibrium steady state box model is reported, that predicts the fate of new chemicals in a conventional sewage treatment plant from a minimal input data set. The model, written in an electronic spreadsheet (Lotus TM 123), requires a minimum input: some basic properties of the chemical, its
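
    The record above is truncated, but the box-model idea it names is simple enough to sketch: each treatment stage is a well-mixed box, and a steady-state mass balance distributes the chemical over effluent, sludge, air and degradation. The minimal single-box sketch below uses invented flows and first-order rate constants, not the published SimpleTreat parameterization.

```python
# Minimal steady-state "box model" sketch in the spirit of SimpleTreat.
# All flows and rate constants are illustrative assumptions.
Q = 1000.0      # wastewater flow (m3/d), assumed
V = 500.0       # aeration tank volume (m3), assumed
k_bio = 2.0     # first-order biodegradation rate (1/d), assumed
k_volat = 0.1   # first-order volatilization rate (1/d), assumed
f_sorbed = 0.2  # fraction removed with settled sludge, assumed
C_in = 10.0     # influent concentration (mg/m3), assumed

# Steady-state mass balance on one well-mixed box:
#   Q*C_in*(1 - f_sorbed) = Q*C_out + (k_bio + k_volat)*V*C_out
C_out = Q * C_in * (1.0 - f_sorbed) / (Q + (k_bio + k_volat) * V)

fractions = {
    "to sludge": f_sorbed,
    "degraded": k_bio * V * C_out / (Q * C_in),
    "volatilized": k_volat * V * C_out / (Q * C_in),
    "in effluent": Q * C_out / (Q * C_in),
}
for route, frac in fractions.items():
    print(f"{route:>12}: {frac:6.1%}")
```

    The four fractions sum to one by construction, which is a convenient sanity check for any box model of this kind.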

  9. Designing Optical Spreadsheets-Technological Pedagogical Content Knowledge Simulation (S-TPACK): A Case Study of Pre-Service Teachers Course

    Science.gov (United States)

    Thohir, M. Anas

    2018-01-01

    In the 21st century, the competence of instructional technological design is important for pre-service physics teachers. This case study described the pre-service physics teachers' design of optical spreadsheet simulation and evaluated teaching and learning the task in the classroom. The case study chose three of thirty pre-service teacher's…

  10. How To Use the Spreadsheet as a Tool in the Secondary School Mathematics Classroom. Second Edition (for Windows and Macintosh Operating Systems).

    Science.gov (United States)

    Masalski, William J.

    This book seeks to develop, enhance, and expand students' understanding of mathematics by using technology. Topics covered include the advantages of spreadsheets along with the opportunity to explore the 'what if?' type of questions encountered in the problem-solving process, enhancing the user's insight into the development and use of algorithms,…

  11. Students' meaning making in science: solving energy resource problems in virtual worlds combined with spreadsheets to develop graphs

    Science.gov (United States)

    Krange, Ingeborg; Arnseth, Hans Christian

    2012-09-01

    The aim of this study is to scrutinize the characteristics of conceptual meaning making when students engage with virtual worlds in combination with a spreadsheet with the aim to develop graphs. We study how these tools and the representations they contain or enable students to construct serve to influence their understanding of energy resource consumption. The data were gathered in 1st grade upper-secondary science classes and they constitute the basis for the interaction analysis of students' meaning making with representations. Our analyses demonstrate the difficulties involved in developing students' orientation toward more conceptual orientations to representations of the knowledge domain. Virtual worlds do not in themselves represent a solution to this problem.

  12. When social actions get translated into spreadsheets: economics and social work with children and youth in Denmark

    DEFF Research Database (Denmark)

    Schrøder, Ida Marie

    2013-01-01

    As a means of reducing public spending, social workers in Danish municipalities are expected to take into account public sector economy when deciding on how to solve social problems. Researchers have previously investigated the impact of social work on the public sector economy, the cost... interventions to help children and young people. Inspired by the sociologist John Law, my preliminary study suggests that taking into account economy often becomes a question of translating social interventions into spreadsheets, rather than making economically-based decisions. I classify three kinds... in order to strengthen collaborative knowledge of how to take into account public sector economy, and to reflect on how technologies can interfere with decision processes in social work.

  13. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
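
    IFDOTMETER itself is a Java application; the sketch below only illustrates the batch pattern it automates -- threshold each image, count dot-like objects, and write one row per image to a spreadsheet-readable file. It uses Python with scikit-image; the folder layout, output columns and minimum-area cutoff are assumptions.

```python
# Batch image analysis sketch: one CSV row per image (not IFDOTMETER code).
import csv
import glob

from skimage import filters, io, measure

with open("results.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["image", "puncta_count"])
    for path in sorted(glob.glob("images/*.tif")):
        img = io.imread(path, as_gray=True)
        mask = img > filters.threshold_otsu(img)   # global threshold
        labels = measure.label(mask)               # connected components
        # Keep objects above a minimal area to suppress noise (assumed: 5 px).
        count = sum(1 for r in measure.regionprops(labels) if r.area >= 5)
        writer.writerow([path, count])
```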

  14. Methods for Automated and Continuous Commissioning of Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Larry Luskay; Michael Brambley; Srinivas Katipamula

    2003-04-30

    Avoidance of poorly installed HVAC systems is best accomplished at the close of construction by having a building and its systems put "through their paces" with a well conducted commissioning process. This research project focused on developing key components to enable the development of tools that will automatically detect and correct equipment operating problems, thus providing continuous and automatic commissioning of the HVAC systems throughout the life of a facility. A study of pervasive operating problems revealed the following would most benefit from an automated and continuous commissioning process: (1) faulty economizer operation; (2) malfunctioning sensors; (3) malfunctioning valves and dampers; and (4) access to project design data. Methodologies for detecting system operation faults in these areas were developed and validated in "bare-bones" forms within standard software such as spreadsheets, databases, and statistical or mathematical packages. Demonstrations included flow diagrams and simplified mock-up applications. Techniques to manage data were demonstrated by illustrating how test forms could be populated with original design information and the recommended sequence of operation for equipment systems. Proposed tools would use measured data, design data, and equipment operating parameters to diagnose system problems. Steps for future research are suggested to help move toward practical application of automated commissioning and its high potential to improve equipment availability, increase occupant comfort, and extend the life of system equipment.

  15. A simple method to estimate the optimum iodine concentration of contrast material through microcatheters: hydrodynamic calculation with spreadsheet software

    International Nuclear Information System (INIS)

    Yamauchi, Teiyu; Hayashi, Toshihiko; Yamada, Takeshi; Futami, Choichiro; Tsukiyama, Yumiko; Harada, Motoko; Furui, Shigeru; Suzuki, Shigeru; Mimura, Kohshiro

    2008-01-01

    It is important to increase the iodine delivery rate (I), that is, the iodine concentration of the contrast material (C) x the flow rate of the contrast material (Q), through microcatheters to obtain arteriograms of the highest contrast. It is known that C is an important factor that influences I. The purpose of this study is to establish a method of hydrodynamic calculation of the optimum iodine concentration (i.e., the iodine concentration at which I becomes maximum) of the contrast material and its flow rate through commercially available microcatheters. Iopamidol, ioversol and iohexol of ten iodine concentrations were used. Iodine delivery rates (I_meas) of each contrast material through ten microcatheters were measured. The calculated iodine delivery rate (I_cal) and calculated optimum iodine concentration (calculated C_opt) were obtained with spreadsheet software. The agreement between I_cal and I_meas was studied by correlation and logarithmic Bland-Altman analyses. The value of the calculated C_opt was within the optimum range of iodine concentrations (i.e., the range of iodine concentrations at which I_meas becomes 90% or more of the maximum) in all cases. A good correlation between I_cal and I_meas (I_cal = 1.08 I_meas, r = 0.99) was observed. Logarithmic Bland-Altman analysis showed that the 95% confidence interval of I_cal/I_meas was between 0.82 and 1.29. In conclusion, hydrodynamic calculation with spreadsheet software is an accurate, generally applicable and cost-saving method to estimate the value of the optimum iodine concentration and its flow rate through microcatheters
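
    To make the hydrodynamic calculation concrete: under Hagen-Poiseuille flow, Q falls as viscosity rises with iodine concentration, so I = C x Q peaks at an intermediate C. The sketch below reproduces that logic with an assumed exponential viscosity-concentration model and invented catheter constants; it is not the authors' spreadsheet.

```python
# Find the concentration that maximizes I = C*Q for a microcatheter.
import math

delta_p = 5.6e5          # driving pressure across the catheter (Pa), assumed
r = 0.25e-3              # catheter lumen radius (m), assumed
L = 1.3                  # catheter length (m), assumed
eta0, a = 1.0e-3, 0.003  # viscosity model eta(C) = eta0*exp(a*C), assumed
                         # (C in mgI/mL)

def delivery_rate(c):
    """Iodine delivery rate I = C*Q with Hagen-Poiseuille flow."""
    eta = eta0 * math.exp(a * c)                    # Pa*s
    q = math.pi * delta_p * r**4 / (8.0 * eta * L)  # m3/s
    return c * q * 1e6                              # mgI/s (m3 -> mL)

# Brute-force search over clinically plausible concentrations.
c_opt = max(range(150, 421), key=delivery_rate)
print(f"optimum ~{c_opt} mgI/mL, I = {delivery_rate(c_opt):.1f} mgI/s")
```

    With this toy viscosity model the maximum lands at C = 1/a; the point of the spreadsheet approach is that the same search works for measured viscosity data and any catheter geometry.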

  16. Applying 'Evidence-Based Medicine' Theory to Interventional Radiology.Part 2: A Spreadsheet for Swift Assessment of Procedural Benefit and Harm

    International Nuclear Information System (INIS)

    MacEneaney, Peter M.; Malone, Dermot E.

    2000-01-01

    AIM: To design a spreadsheet program for the rapid analysis of interventional radiology (IR) data produced in local research or reported in the literature, using 'evidence-based medicine' (EBM) parameters of treatment benefit and harm. MATERIALS AND METHODS: Microsoft Excel TM was used. The spreadsheet consists of three worksheets. The first shows the 'Levels of Evidence and Grades of Recommendations' that can be assigned to therapeutic studies as defined by the Oxford Centre for EBM. The second and third worksheets facilitate the EBM assessment of therapeutic benefit and harm. Validity criteria are described. These include the assessment of the adequacy of sample size in the detection of possible procedural complications. A contingency (2 x 2) table for raw data on comparative outcomes in treated patients and controls has been incorporated. Formulae for EBM calculations are related to these numerators and denominators in the spreadsheet. The parameters calculated are: benefit -- relative risk reduction, absolute risk reduction, number needed to treat (NNT); harm -- relative risk, relative odds, number needed to harm (NNH). Ninety-five per cent confidence intervals are calculated for all these indices. The results change automatically when the data in the therapeutic outcome cells are changed. A final section allows the user to correct the NNT or NNH in their application to individual patients. RESULTS: This spreadsheet can be used on desktop and palmtop computers. The MS Excel TM version can be downloaded via the Internet from the URL ftp://radiography.com/pub/TxHarm00.xls. CONCLUSION: A spreadsheet is useful for the rapid analysis of the clinical benefit and harm from IR procedures.
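
    The core worksheet formulas are standard EBM definitions and are easy to re-derive from the 2 x 2 table. A minimal sketch of the benefit side, using the usual normal-approximation confidence interval for the absolute risk reduction (the authors' exact cell logic is not given in the record):

```python
# EBM benefit parameters from a 2x2 contingency table (textbook formulas).
import math

def ebm_benefit(a, b, c, d):
    """a/b = events/non-events among treated; c/d = among controls.
    'Event' is the adverse outcome the treatment aims to prevent."""
    eer, cer = a / (a + b), c / (c + d)   # experimental/control event rates
    arr = cer - eer                       # absolute risk reduction
    rrr = arr / cer                       # relative risk reduction
    nnt = 1.0 / arr                       # number needed to treat
    # 95% CI for ARR (normal approximation); a CI for NNT follows by
    # taking reciprocals of the limits.
    se = math.sqrt(eer * (1 - eer) / (a + b) + cer * (1 - cer) / (c + d))
    ci = (arr - 1.96 * se, arr + 1.96 * se)
    return {"ARR": arr, "RRR": rrr, "NNT": nnt, "ARR 95% CI": ci}

# Invented example: 8/100 treated vs 20/100 controls had the outcome.
print(ebm_benefit(8, 92, 20, 80))   # ARR 0.12, RRR 0.60, NNT ~8.3
```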

  17. Web-Scale Discovery Services Retrieve Relevant Results in Health Sciences Topics Including MEDLINE Content

    Directory of Open Access Journals (Sweden)

    Elizabeth Margaret Stovold

    2017-06-01

    coverage of MEDLINE, they recorded the first 50 results from each of the 6 PubMed searches in a spreadsheet. During data collection at the WSD sites, they searched for these references to discover if the WSD tool at each site indexed these known items. Authors adopted measures to control for any customisation of the product setup at each data collection site. In particular, they excluded local holdings from the results by limiting the searches to scholarly, peer-reviewed articles. Main results – Authors reported results for 5 of the 6 sites. All of the WSD tools retrieved between 50-60% relevant results. EDS retrieved the highest number of relevant records (195/360 and 216/360), while Primo retrieved the lowest (167/328 and 169/325). There was good observer agreement (k=0.725) for the relevance assessment. The duplicate detection rate was similar in EDS and Summon (between 96-97% unique articles), while the Primo searches returned 82.9-84.9% unique articles. All three tools retrieved relevant results that were not indexed in MEDLINE, and retrieved relevant material indexed in MEDLINE that was not retrieved in the PubMed searches. EDS and Summon retrieved more non-MEDLINE material than Primo. EDS performed best in the known-item searches, with 300/300 and 299/300 items retrieved, while Primo performed worst with 230/300 and 267/300 items retrieved. The Summon platform features an "automated query expansion" search function, where user-entered keywords are matched to related search terms and these are automatically searched along with the original keyword. The authors observed that this function resulted in a wholly relevant first page of results for one of the search questions tested in Summon. Conclusion – While EDS performed slightly better overall, the difference was not great enough in this small sample of test sites to recommend EDS over the other tools being tested. The automated query expansion found in Summon is a useful function that is worthy of further
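
    The reported observer agreement (k=0.725) is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch for two raters making binary relevant/not-relevant judgements, with invented counts:

```python
# Cohen's kappa for two raters and a binary judgement.
def cohens_kappa(both_yes, rater1_only, rater2_only, both_no):
    n = both_yes + rater1_only + rater2_only + both_no
    p_observed = (both_yes + both_no) / n
    # Chance agreement from each rater's marginal yes/no frequencies.
    p_yes = ((both_yes + rater1_only) / n) * ((both_yes + rater2_only) / n)
    p_no = ((rater2_only + both_no) / n) * ((rater1_only + both_no) / n)
    p_chance = p_yes + p_no
    return (p_observed - p_chance) / (1 - p_chance)

print(cohens_kappa(150, 20, 25, 105))  # ~0.69, i.e. "good" agreement
```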

  18. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  19. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  20. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  1. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  2. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  3. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  4. 14 CFR 406.143 - Discovery.

    Science.gov (United States)

    2010-01-01

    14 CFR Aeronautics and Space, Commercial Space Transportation Adjudications, § 406.143 Discovery. (a) Initiation of discovery. Any party may initiate discovery... after a complaint has been filed. (b) Methods of discovery. The following methods of discovery are...

  5. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  6. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results, is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever increasing workload. This article discusses the various issues involved in the process.

  7. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  8. Recent advances in inkjet dispensing technologies: applications in drug discovery.

    Science.gov (United States)

    Zhu, Xiangcheng; Zheng, Qiang; Yang, Hu; Cai, Jin; Huang, Lei; Duan, Yanwen; Xu, Zhinan; Cen, Peilin

    2012-09-01

    Inkjet dispensing technology is a promising fabrication methodology widely applied in drug discovery. Its automated programmable characteristics and high-throughput efficiency make this approach potentially very useful in miniaturizing the design patterns for assays and drug screening. Various custom-made inkjet dispensing systems as well as specialized bio-ink and substrates have been developed and applied to fulfill the increasing demands of basic drug discovery studies. The incorporation of other modern technologies has further exploited the potential of inkjet dispensing technology in drug discovery and development. This paper reviews and discusses the recent developments and practical applications of inkjet dispensing technology in several areas of drug discovery and development including fundamental assays of cells and proteins, microarrays, biosensors, tissue engineering, and basic biological and pharmaceutical studies. Progress in a number of areas of research including biomaterials, inkjet mechanical systems and modern analytical techniques, as well as the exploration and accumulation of profound biological knowledge, has enabled different inkjet dispensing technologies to be developed and adapted for high-throughput pattern fabrication and miniaturization. This in turn presents a great opportunity to propel inkjet dispensing technology into drug discovery.

  9. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S [University of Nebraska Medical Center, Omaha, NE (United States)

    2016-06-15

    Purpose: Linac commissioning is a time consuming and labor intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the Excel data and to monitor and check data quality. Results: For each photon energy, output factors were measured for five configurations, including open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes with the manual approach. The automation avoided the redundant Linac status checks needed between fields in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% compared with the conventional manual approach. This work laid the ground for further improvement of the automation of Linac commissioning.
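
    The record does not describe the log file layout, but the off-line step can be sketched: split the continuous 0.5-second electrometer samples into beam-on segments, integrate the charge per segment, and normalize each field to a reference field. The column name and beam-on threshold below are assumptions:

```python
# Turn a continuous electrometer log into per-field output factors.
import csv

def field_readings(log_path, beam_on_threshold=0.01):
    """Integrated charge per beam-on segment of a continuous log."""
    rates = []
    with open(log_path) as fh:
        for row in csv.DictReader(fh):
            # Assumed column: charge accumulated in each 0.5 s sample (nC).
            rates.append(float(row["charge_per_sample_nC"]))
    readings, current, in_beam = [], 0.0, False
    for r in rates:
        if r > beam_on_threshold:      # beam on: accumulate charge
            current += r
            in_beam = True
        elif in_beam:                  # beam just switched off
            readings.append(current)
            current, in_beam = 0.0, False
    if in_beam:                        # log ended while the beam was on
        readings.append(current)
    return readings

charges = field_readings("electrometer_log.csv")
ref = charges[0]                       # assume the first field is the reference
print([q / ref for q in charges])      # output factors
```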

  10. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    International Nuclear Information System (INIS)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S

    2016-01-01

    Purpose: Linac commissioning is a time consuming and labor intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the Excel data and to monitor and check data quality. Results: For each photon energy, output factors were measured for five configurations, including open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes with the manual approach. The automation avoided the redundant Linac status checks needed between fields in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% compared with the conventional manual approach. This work laid the ground for further improvement of the automation of Linac commissioning.

  11. Causality discovery technology

    Science.gov (United States)

    Chen, M.; Ertl, T.; Jirotka, M.; Trefethen, A.; Schmidt, A.; Coecke, B.; Bañares-Alcántara, R.

    2012-11-01

    Causality is the fabric of our dynamic world. We all make frequent attempts to reason about the causation relationships of everyday events (e.g., what was the cause of my headache, or what has upset Alice?). We attempt to manage causality all the time through planning and scheduling. The greatest scientific discoveries are usually about causality (e.g., Newton found the cause for an apple to fall, and Darwin discovered natural selection). Meanwhile, we continue to seek a comprehensive understanding about the causes of numerous complex phenomena, such as social divisions, economic crisis, global warming, home-grown terrorism, etc. Humans analyse and reason causality based on observation, experimentation and acquired a priori knowledge. Today's technologies enable us to make observations and carry out experiments on an unprecedented scale that has created data mountains everywhere. Whereas there are exciting opportunities to discover new causation relationships, there are also unparalleled challenges to benefit from such data mountains. In this article, we present a case for developing a new piece of ICT, called Causality Discovery Technology. We reason about the necessity, feasibility and potential impact of such a technology.

  12. A Framework for Automatic Web Service Discovery Based on Semantics and NLP Techniques

    Directory of Open Access Journals (Sweden)

    Asma Adala

    2011-01-01

    Full Text Available As a greater number of Web Services are made available today, automatic discovery is recognized as an important task. To promote the automation of service discovery, different semantic languages have been created that allow describing the functionality of services in a machine interpretable form using Semantic Web technologies. The problem is that users do not have intimate knowledge about semantic Web service languages and related toolkits. In this paper, we propose a discovery framework that enables semantic Web service discovery based on keywords written in natural language. We describe a novel approach for automatic discovery of semantic Web services which employs Natural Language Processing techniques to match a user request, expressed in natural language, with a semantic Web service description. Additionally, we present an efficient semantic matching technique to compute the semantic distance between ontological concepts.
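
    As a stand-in for the framework's NLP pipeline, the sketch below ranks toy service descriptions against a natural-language request with a plain bag-of-words cosine score; the published approach additionally exploits ontological concept distances, which this sketch deliberately omits.

```python
# Keyword request vs. service descriptions: bag-of-words cosine ranking.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_services(request, services):
    q = Counter(request.lower().split())
    scored = [(cosine(q, Counter(desc.lower().split())), name)
              for name, desc in services.items()]
    return sorted(scored, reverse=True)

services = {  # toy descriptions, invented for illustration
    "WeatherSvc": "returns the weather forecast for a given city and date",
    "GeoSvc": "resolves a city name to geographic coordinates",
}
print(rank_services("forecast the weather in a city", services))
```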

  13. Proteomic biomarker discovery in 1000 human plasma samples with mass spectrometry

    DEFF Research Database (Denmark)

    Cominetti, Ornella; Núñez Galindo, Antonio; Corthésy, John

    2016-01-01

    automated proteomic biomarker discovery workflow. Herein, we have applied this approach to analyze 1000 plasma samples from the multicentered human dietary intervention study "DiOGenes". Study design, sample randomization, tracking, and logistics were the foundations of our large-scale study. We checked...

  14. Mapping the Stacks: Sustainability and User Experience of Animated Maps in Library Discovery Interfaces

    Science.gov (United States)

    McMillin, Bill; Gibson, Sally; MacDonald, Jean

    2016-01-01

    Animated maps of the library stacks were integrated into the catalog interface at Pratt Institute and into the EBSCO Discovery Service interface at Illinois State University. The mapping feature was developed for optimal automation of the update process to enable a range of library personnel to update maps and call-number ranges. The development…

  15. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  16. Computational methods in drug discovery

    OpenAIRE

    Sumudu P. Leelananda; Steffen Lindert

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery project...

  17. Representation Discovery using Harmonic Analysis

    CERN Document Server

    Mahadevan, Sridhar

    2008-01-01

    Representations are at the heart of artificial intelligence (AI). This book is devoted to the problem of representation discovery: how can an intelligent system construct representations from its experience? Representation discovery re-parameterizes the state space - prior to the application of information retrieval, machine learning, or optimization techniques - facilitating later inference processes by constructing new task-specific bases adapted to the state space geometry. This book presents a general approach to representation discovery using the framework of harmonic analysis, in particu

  18. Automating Groundwater Sampling At Hanford, The Next Step

    International Nuclear Information System (INIS)

    Connell, C.W.; Conley, S.F.; Hildebrand, R.D.; Cunningham, D.E.

    2010-01-01

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very 'people intensive.' Approximately 1500 wells are sampled each year by field personnel or 'samplers.' These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from 2 official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process where the data is transferred to the database and the electronic form is filed in managed records - thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  19. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate models in a test-bench environment because it requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time to verify the functionality of registers.
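
    The proposed flow hinges on translating a spreadsheet-style register template into IP-XACT. A toy translator from a CSV export to an IP-XACT-like XML fragment is sketched below; the element set is simplified, and a real flow must emit the full IEEE 1685 (IP-XACT) schema that commercial register-model generators expect.

```python
# Spreadsheet register list (CSV) -> simplified IP-XACT-like XML.
import csv
import xml.etree.ElementTree as ET

def registers_to_xml(csv_path):
    root = ET.Element("addressBlock")
    with open(csv_path) as fh:
        for row in csv.DictReader(fh):  # columns: name,offset,width,access
            reg = ET.SubElement(root, "register")
            ET.SubElement(reg, "name").text = row["name"]
            ET.SubElement(reg, "addressOffset").text = row["offset"]
            ET.SubElement(reg, "size").text = row["width"]
            ET.SubElement(reg, "access").text = row["access"]
    return ET.tostring(root, encoding="unicode")

# For a CSV with rows such as: CTRL,0x00,32,read-write
print(registers_to_xml("registers.csv"))
```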

  20. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  1. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australia antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum was dispensed by an automated sampler into test tubes and then incubated under controlled time and temperature; the first counting was omitted; labelled antibody was dispensed to the serum after washing; samples were incubated and then centrifuged; radioactivities in the precipitate were counted by an auto-well counter; measurements were tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  2. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  3. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  4. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  5. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system as per the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention on the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  6. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  7. Googling your hand hygiene data: Using Google Forms, Google Sheets, and R to collect and automate analysis of hand hygiene compliance monitoring.

    Science.gov (United States)

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Haas, Janet; Ramirez, Julio A; Carrico, Ruth M

    2018-06-01

    Hand hygiene is one of the most important interventions in the quest to eliminate healthcare-associated infections, and rates in healthcare facilities are markedly low. Since hand hygiene observation and feedback are critical to improve adherence, we created an easy-to-use, platform-independent hand hygiene data collection process and an automated, on-demand reporting engine. A 3-step approach was used for this project: 1) creation of a data collection form using Google Forms, 2) transfer of data from the form to a spreadsheet using Google Spreadsheets, and 3) creation of an automated, cloud-based analytics platform for report generation using R and RStudio Shiny software. A video tutorial of all steps in the creation and use of this free tool can be found on our YouTube channel: https://www.youtube.com/watch?v=uFatMR1rXqU&t. The on-demand reporting tool can be accessed at: https://crsp.louisville.edu/shiny/handhygiene. This data collection and automated analytics engine provides an easy-to-use environment for evaluating hand hygiene data; it also provides rapid feedback to healthcare workers. By reducing some of the data management workload required of the infection preventionist, more focused interventions may be instituted to increase global hand hygiene rates and reduce infection. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
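
    The reporting engine described in the article is written in R with Shiny; purely to illustrate the calculation it automates, here is a sketch that reads an exported observation sheet (CSV) and computes a compliance rate per unit. The column names are assumptions about the form layout.

```python
# Hand hygiene compliance per unit from an exported observations CSV.
import csv
from collections import defaultdict

def compliance_by_unit(csv_path):
    seen, compliant = defaultdict(int), defaultdict(int)
    with open(csv_path) as fh:
        for row in csv.DictReader(fh):
            unit = row["Unit"]                       # assumed column
            seen[unit] += 1
            if row["Hand hygiene performed"].strip().lower() == "yes":
                compliant[unit] += 1
    return {u: compliant[u] / seen[u] for u in seen}

for unit, rate in sorted(compliance_by_unit("observations.csv").items()):
    print(f"{unit}: {rate:.0%}")
```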

  8. Hippocampus discovery First steps

    Directory of Open Access Journals (Sweden)

    Eliasz Engelhardt

    Full Text Available The first steps of the discovery, and the main discoverers, of the hippocampus are outlined. Arantius was the first to describe a structure he named "hippocampus" or "white silkworm". Despite numerous controversies and alternate designations, the term hippocampus has prevailed until this day as the most widely used term. Duvernoy provided an illustration of the hippocampus and surrounding structures, considered the first by most authors, which appeared more than one and a half centuries after Arantius' description. Some authors have identified other drawings and texts which they claim predate Duvernoy's depiction, in studies by Vesalius, Varolio, Willis, and Eustachio, albeit unconvincingly. Considering the definition of the hippocampal formation as comprising the hippocampus proper, dentate gyrus and subiculum, Arantius and Duvernoy apparently described the gross anatomy of this complex. The pioneering studies of Arantius and Duvernoy revealed a relatively small hidden formation that would become one of the most valued brain structures.

  9. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    Directory of Open Access Journals (Sweden)

    Ronan Bureau

    2013-02-01

    Full Text Available This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for the definition of its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.

  10. AUTOMATED DETECTION OF STRUCTURAL ALERTS (CHEMICAL FRAGMENTS) IN (ECO)TOXICOLOGY

    Directory of Open Access Journals (Sweden)

    Alban Lepailleur

    2013-02-01

    Full Text Available This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts correspond to one of the most interesting approaches of in silico toxicology due to their direct link with specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data miners have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for the definition of its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data.
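
    The frequency-based core shared by these approaches -- compare a fragment's support in the toxic and non-toxic sets and keep "emerging" fragments whose growth rate crosses a threshold -- can be sketched directly. Fragment enumeration itself (e.g., with a cheminformatics toolkit) is outside the sketch, and the smoothing constant and counts are invented:

```python
# Emerging-pattern style screen: fragments enriched in the toxic class.
def emerging_fragments(frags_toxic, frags_safe, n_toxic, n_safe, min_growth=5.0):
    """frags_*: dict fragment -> number of molecules containing it."""
    alerts = {}
    for frag, count in frags_toxic.items():
        support_toxic = count / n_toxic
        # 0.5 is an assumed smoothing term for fragments absent in the safe set.
        support_safe = frags_safe.get(frag, 0.5) / n_safe
        growth = support_toxic / support_safe
        if growth >= min_growth:
            alerts[frag] = growth
    return alerts

toxic = {"nitroaromatic": 40, "phenol": 12}   # invented counts
safe = {"phenol": 30}
print(emerging_fragments(toxic, safe, n_toxic=100, n_safe=300))
```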

  11. Automated applications of sandwich-cultured hepatocytes in the evaluation of hepatic drug transport.

    Science.gov (United States)

    Perry, Cassandra H; Smith, William R; St Claire, Robert L; Brouwer, Kenneth R

    2011-04-01

    Predictions of the absorption, distribution, metabolism, excretion, and toxicity of compounds in pharmaceutical development are essential aspects of the drug discovery process. B-CLEAR is an in vitro system that uses sandwich-cultured hepatocytes to evaluate and predict in vivo hepatobiliary disposition (hepatic uptake, biliary excretion, and biliary clearance), transporter-based hepatic drug-drug interactions, and potential drug-induced hepatotoxicity. Automation of predictive technologies is an advantageous and preferred format in drug discovery. In this study, manual and automated studies are investigated and equivalence is demonstrated. In addition, automated applications using model probe substrates and inhibitors to assess the cholestatic potential of drugs and evaluate hepatic drug transport are examined. The successful automation of this technology provides a more reproducible and less labor-intensive approach, reducing potential operator error in complex studies and facilitating technology transfer.
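
    The derived quantities such a system reports can be sketched from its two accumulation measurements (cells plus bile versus cells only). The formulas below follow the commonly published definitions of the biliary excretion index and in vitro biliary clearance; the numbers are invented.

```python
# Biliary excretion index (BEI) and in vitro biliary clearance.
def biliary_disposition(acc_cells_bile, acc_cells, t_min, conc_medium):
    """Accumulations in pmol/mg protein; conc_medium in uM; time in minutes."""
    excreted = acc_cells_bile - acc_cells          # amount in bile canaliculi
    bei = 100.0 * excreted / acc_cells_bile        # biliary excretion index, %
    cl_biliary = excreted / (t_min * conc_medium)  # uL/min/mg protein
    return bei, cl_biliary

bei, clb = biliary_disposition(120.0, 80.0, t_min=10.0, conc_medium=1.0)
print(f"BEI = {bei:.0f}%, Cl_biliary = {clb:.1f} uL/min/mg")
```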

  12. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  13. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  14. Automated External Defibrillator

    Science.gov (United States)

    ... leads to a 10 percent reduction in survival. Training To Use an Automated External Defibrillator: Learning how to use an AED and taking a CPR (cardiopulmonary resuscitation) course are helpful. However, if trained ...

  15. Planning for Office Automation.

    Science.gov (United States)

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  16. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

    This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  17. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  18. Automation Interface Design Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Our research makes its contributions at two levels. At one level, we addressed the problems of interaction between humans and computers/automation in a particular...

  19. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  20. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry drove the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered

  1. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  2. Quantitative DMS mapping for automated RNA secondary structure inference

    OpenAIRE

    Cordero, Pablo; Kladwang, Wipapat; VanLang, Christopher C.; Das, Rhiju

    2012-01-01

    For decades, dimethyl sulfate (DMS) mapping has informed manual modeling of RNA structure in vitro and in vivo. Here, we incorporate DMS data into automated secondary structure inference using a pseudo-energy framework developed for 2'-OH acylation (SHAPE) mapping. On six non-coding RNAs with crystallographic models, DMS-guided modeling achieves overall false negative and false discovery rates of 9.5% and 11.6%, comparable or better than SHAPE-guided modeling; and non-parametric bootstrappin...

  3. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  4. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    Full Text Available The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  5. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
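
    Both records above describe the same idea: a workflow manager that tracks dependencies between tasks and dispatches each task once its prerequisites are done. The Python sketch below is a generic illustration of dependency-ordered dispatch; it is not Taxi's actual API, and the task names are invented.

        # Generic sketch of the workflow-manager idea; NOT Taxi's actual API.
        from collections import deque

        def run_workflow(tasks, deps):
            """tasks: name -> callable; deps: name -> list of prerequisites."""
            indegree = {t: len(deps.get(t, [])) for t in tasks}
            dependents = {t: [] for t in tasks}
            for t, reqs in deps.items():
                for r in reqs:
                    dependents[r].append(t)
            ready = deque(t for t, d in indegree.items() if d == 0)
            while ready:
                t = ready.popleft()
                tasks[t]()                    # e.g. generate a configuration
                for nxt in dependents[t]:
                    indegree[nxt] -= 1
                    if indegree[nxt] == 0:
                        ready.append(nxt)

        run_workflow(
            tasks={"generate": lambda: print("generate ensemble"),
                   "measure":  lambda: print("measure observables")},
            deps={"measure": ["generate"]},
        )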

  6. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades have been a time of major change in marketing. Digitalization has become a permanent part of marketing and has at the same time enabled the efficient collection of data. Personalization and customization of content play a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation, more information about the customers is gathered ...

  7. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUI testing. Some Python programming experience is assumed.

  8. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  9. Shielded cells transfer automation

    International Nuclear Information System (INIS)

    Fisher, J.J.

    1984-01-01

    Nuclear waste from shielded cells is removed, packaged, and transferred manually in many nuclear facilities. Radiation exposure is absorbed by operators during these operations and limited only through procedural controls. Technological advances in automation using robotics have allowed a production waste removal operation to be automated to reduce radiation exposure. The robotic system bags waste containers out of a glove box and transfers them to a shielded container. Operators control the system from outside the work area via television cameras. 9 figures

  10. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  11. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application
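
    As a purely illustrative aside, a screening step of this kind can be pictured as a pass/fail test per radionuclide. The Python sketch below uses a generic sum-of-fractions style with invented inventories and screening factors; these are not the NCRP values, nor the daughter-ingrowth extension, used by the actual tool.

        # Illustrative only: generic screening pass/fail test. All numbers
        # are invented placeholders, not NCRP screening factors.
        inventory = {"Ge-68": 2.0, "Cs-137": 0.5}          # relative inventories
        screening_factor = {"Ge-68": 0.8, "Cs-137": 4.0}   # allowable levels

        def screens_out(nuclide):
            """A nuclide screens out when its fraction of the limit is < 1."""
            return inventory[nuclide] / screening_factor[nuclide] < 1.0

        for n in inventory:
            print(n, "passes" if screens_out(n) else "fails -> further analysis")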

  12. Materials Discovery | Materials Science | NREL

    Science.gov (United States)

    NREL's research in materials discovery includes characterization of samples by an incoming beam and measurement of outgoing particles, with the resulting data stored and analyzed. Staff Scientist Dr. Zakutayev specializes in the design of novel semiconductor materials for energy

  13. Service discovery using Bloom filters

    NARCIS (Netherlands)

    Goering, P.T.H.; Heijenk, Geert; Lelieveldt, B.P.F.; Haverkort, Boudewijn R.H.M.; de Laat, C.T.A.M.; Heijnsdijk, J.W.J.

    A protocol to perform service discovery in ad hoc networks is introduced in this paper. Attenuated Bloom filters are used to distribute services to nodes in the neighborhood and thus enable local service discovery. The protocol has been implemented in a discrete event simulator to investigate the
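
    For illustration, an attenuated Bloom filter is essentially a stack of Bloom filters, one per hop distance, so a node can advertise nearby services with decreasing precision as distance grows. The Python sketch below is an assumption-laden toy (filter size, hash count, depth, and service names are all invented), not the protocol from the paper.

        # Toy attenuated Bloom filter: one bit-array layer per hop distance.
        import hashlib

        M, K, DEPTH = 64, 3, 2      # bits per filter, hash functions, hop depth

        def positions(item):
            """K pseudo-independent bit positions derived from SHA-256."""
            return [int(hashlib.sha256(f"{i}:{item}".encode()).hexdigest(), 16) % M
                    for i in range(K)]

        def add(filt, item):
            for p in positions(item):
                filt[p] = 1

        def maybe_contains(filt, item):
            return all(filt[p] for p in positions(item))

        # Layer 0 holds our own services; layer 1 is the OR of neighbours' layer 0.
        layers = [[0] * M for _ in range(DEPTH)]
        add(layers[0], "printing-service")
        neighbour_layer0 = [0] * M
        add(neighbour_layer0, "camera-service")
        layers[1] = [a | b for a, b in zip(layers[1], neighbour_layer0)]

        print(maybe_contains(layers[0], "printing-service"))  # True
        print(maybe_contains(layers[1], "camera-service"))    # True (false positives possible)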

  14. On the pulse of discovery

    Science.gov (United States)

    2017-12-01

    What started 50 years ago as a `smudge' on paper has flourished into a fundamental field of astrophysics replete with unexpected applications and exciting discoveries. To celebrate the discovery of pulsars, we look at the past, present and future of pulsar astrophysics.

  15. Computer modeling in free spreadsheets OpenOffice.Calc as one of the modern methods of teaching physics and mathematics cycle subjects in primary and secondary schools

    Directory of Open Access Journals (Sweden)

    Markushevich M.V.

    2016-10-01

    Full Text Available The article details the use of computer simulation, a modern teaching method, for modelling various kinds of mechanical motion of a material point in the free spreadsheet OpenOffice.org Calc when designing physics and computer science lessons in primary and secondary schools. Particular attention is paid to integrating computer modelling with other modern teaching methods.

  16. Review of Literature for Inputs to the National Water Savings Model and Spreadsheet Tool-Commercial/Institutional

    Energy Technology Data Exchange (ETDEWEB)

    Whitehead, Camilla Dunham; Melody, Moya; Lutz, James

    2009-05-29

    Lawrence Berkeley National Laboratory (LBNL) is developing a computer model and spreadsheet tool for the United States Environmental Protection Agency (EPA) to help estimate the water savings attributable to their WaterSense program. WaterSense has developed a labeling program for three types of plumbing fixtures commonly used in commercial and institutional settings: flushometer valve toilets, urinals, and pre-rinse spray valves. This National Water Savings-Commercial/Institutional (NWS-CI) model is patterned after the National Water Savings-Residential model, which was completed in 2008. Calculating the quantity of water and money saved through the WaterSense labeling program requires three primary inputs: (1) the quantity of a given product in use; (2) the frequency with which units of the product are replaced or are installed in new construction; and (3) the number of times or the duration the product is used in various settings. To obtain the information required for developing the NWS-CI model, LBNL reviewed various resources pertaining to the three WaterSense-labeled commercial/institutional products. The data gathered ranged from the number of commercial buildings in the United States to numbers of employees in various sectors of the economy and plumbing codes for commercial buildings. This document summarizes information obtained about the three products' attributes, quantities, and use in commercial and institutional settings that is needed to estimate how much water EPA's WaterSense program saves.
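
    For a sense of how the three primary inputs combine, a back-of-the-envelope Python sketch follows; every number in it is an invented placeholder, not a WaterSense or LBNL figure.

        # Back-of-the-envelope sketch of the three-input structure described
        # above. All numbers are invented placeholders.
        stock = 100_000          # (1) units of a fixture type in service
        uses_per_day = 50        # (3) average uses per unit per day
        gal_baseline = 1.6       # gallons per use, conventional fixture
        gal_labeled = 1.28       # gallons per use, WaterSense-labeled fixture

        annual_savings_gal = stock * uses_per_day * 365 * (gal_baseline - gal_labeled)
        print(f"{annual_savings_gal:,.0f} gallons/year")   # 584,000,000 gallons/year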

  17. COPATH - a spreadsheet model for the estimation of carbon flows associated with the use of forest resources

    International Nuclear Information System (INIS)

    Makundi, W.; Sathaye, J.; Ketoff, A.

    1995-01-01

    The forest sector plays a key role in the global climate change process. A significant amount of net greenhouse gas emissions emanates from land use changes, and the sector offers a unique opportunity to sequester carbon in vegetation, detritus, soils and forest products. However, estimates of carbon flows associated with the use of forest resources have been quite imprecise. This paper describes a methodological framework, COPATH, which is a spreadsheet model for estimating carbon emissions and sequestration from deforestation and harvesting of forests. The model has two parts: the first estimates carbon stocks, emissions and uptake in the base year, while the second forecasts future emissions and uptake under various scenarios. The forecast module is structured after the main modes of forest conversion, i.e. agriculture, pasture, forest harvesting and other land uses. The model can be used by countries which may not possess an abundance of pertinent data, and allows for the use of forest inventory data to estimate carbon stocks. The choice of the most likely scenario provides the country with a carbon flux profile necessary to formulate GHG mitigation strategies. (Author)

  18. A step-by-step guide to non-linear regression analysis of experimental data using a Microsoft Excel spreadsheet.

    Science.gov (United States)

    Brown, A M

    2001-06-01

    The objective of this present study was to introduce a simple, easily understood method for carrying out non-linear regression analysis based on user input functions. While it is relatively straightforward to fit data with simple functions such as linear or logarithmic functions, fitting data with more complicated non-linear functions is more difficult. Commercial specialist programmes are available that will carry out this analysis, but these programmes are expensive and are not intuitive to learn. An alternative method described here is to use the SOLVER function of the ubiquitous spreadsheet programme Microsoft Excel, which employs an iterative least squares fitting routine to produce the optimal goodness of fit between data and function. The intent of this paper is to lead the reader through an easily understood step-by-step guide to implementing this method, which can be applied to any function in the form y=f(x), and is well suited to fast, reliable analysis of data in all fields of biology.
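
    The same iterative least-squares idea can be sketched outside Excel. The short Python example below uses SciPy's curve_fit in place of SOLVER; the exponential model and the data points are invented for illustration and are not from the paper.

        # Iterative least-squares fit of a user-defined y = f(x), the role
        # SOLVER plays in the method above. Model and data are invented.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, a, k):                 # any user-defined function y = f(x)
            return a * np.exp(-k * x)

        x = np.array([0, 1, 2, 3, 4], dtype=float)
        y = np.array([2.0, 1.2, 0.75, 0.45, 0.28])

        params, _ = curve_fit(model, x, y, p0=[1.0, 1.0])   # p0: initial guess
        print("a = %.3f, k = %.3f" % tuple(params))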

  19. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaborating with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integration of their collection of provided BSs and their corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research for supporting CNs. It contributes to the generation of formal machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of service behaviour and the automated matchmaking of user-specified desired behaviour against existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to the service discovery and service integration aspects.

  20. 29 CFR 2700.56 - Discovery; general.

    Science.gov (United States)

    2010-07-01

    ...(c) or 111 of the Act has been filed. 30 U.S.C. 815(c) and 821. (e) Completion of discovery... 29 Labor 9 2010-07-01 2010-07-01 false Discovery; general. 2700.56 Section 2700.56 Labor... Hearings § 2700.56 Discovery; general. (a) Discovery methods. Parties may obtain discovery by one or more...

  1. 19 CFR 207.109 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Discovery. 207.109 Section 207.109 Customs Duties... and Committee Proceedings § 207.109 Discovery. (a) Discovery methods. All parties may obtain discovery under such terms and limitations as the administrative law judge may order. Discovery may be by one or...

  2. 30 CFR 44.24 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Discovery. 44.24 Section 44.24 Mineral... Discovery. Parties shall be governed in their conduct of discovery by appropriate provisions of the Federal... discovery. Alternative periods of time for discovery may be prescribed by the presiding administrative law...

  3. 19 CFR 356.20 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Discovery. 356.20 Section 356.20 Customs Duties... § 356.20 Discovery. (a) Voluntary discovery. All parties are encouraged to engage in voluntary discovery... sanctions proceeding. (b) Limitations on discovery. The administrative law judge shall place such limits...

  4. 24 CFR 180.500 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Discovery. 180.500 Section 180.500... OPPORTUNITY CONSOLIDATED HUD HEARING PROCEDURES FOR CIVIL RIGHTS MATTERS Discovery § 180.500 Discovery. (a) In general. This subpart governs discovery in aid of administrative proceedings under this part. Discovery in...

  5. 15 CFR 25.21 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Discovery. 25.21 Section 25.21... Discovery. (a) The following types of discovery are authorized: (1) Requests for production of documents for..., discovery is available only as ordered by the ALJ. The ALJ shall regulate the timing of discovery. (d...

  6. 39 CFR 963.14 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Discovery. 963.14 Section 963.14 Postal Service... PANDERING ADVERTISEMENTS STATUTE, 39 U.S.C. 3008 § 963.14 Discovery. Discovery is to be conducted on a... such discovery as he or she deems reasonable and necessary. Discovery may include one or more of the...

  7. 22 CFR 224.21 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Discovery. 224.21 Section 224.21 Foreign....21 Discovery. (a) The following types of discovery are authorized: (1) Requests for production of... parties, discovery is available only as ordered by the ALJ. The ALJ shall regulate the timing of discovery...

  8. Perovskite classification: An Excel spreadsheet to determine and depict end-member proportions for the perovskite- and vapnikite-subgroups of the perovskite supergroup

    Science.gov (United States)

    Locock, Andrew J.; Mitchell, Roger H.

    2018-04-01

    Perovskite mineral oxides commonly exhibit extensive solid-solution, and are therefore classified on the basis of the proportions of their ideal end-members. A uniform sequence of calculation of the end-members is required if comparisons are to be made between different sets of analytical data. A Microsoft Excel spreadsheet has been programmed to assist with the classification and depiction of the minerals of the perovskite- and vapnikite-subgroups following the 2017 nomenclature of the perovskite supergroup recommended by the International Mineralogical Association (IMA). Compositional data for up to 36 elements are input into the spreadsheet as oxides in weight percent. For each analysis, the output includes the formula, the normalized proportions of 15 end-members, and the percentage of cations which cannot be assigned to those end-members. The data are automatically plotted onto the ternary and quaternary diagrams recommended by the IMA for depiction of perovskite compositions. Up to 200 analyses can be entered into the spreadsheet, which is accompanied by data calculated for 140 perovskite compositions compiled from the literature.
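
    As an illustration of the first step such a spreadsheet performs, the Python sketch below converts oxide weight percent to normalized molar cation proportions. The analysis values are invented, and the real classification additionally involves site assignment and normalization over the 15 end-members, which this sketch does not attempt.

        # Oxide weight percent -> normalized molar cation proportions.
        # Molar masses are standard; the analysis values are invented.
        OXIDES = {"CaO": (56.08, 1), "TiO2": (79.87, 1), "Na2O": (61.98, 2)}
        analysis_wt = {"CaO": 38.0, "TiO2": 57.0, "Na2O": 1.5}   # weight percent

        moles_cation = {ox: wt / OXIDES[ox][0] * OXIDES[ox][1]   # (mass/M) * cations
                        for ox, wt in analysis_wt.items()}
        total = sum(moles_cation.values())
        for ox, m in moles_cation.items():
            print(f"{ox}: {m / total:.3f}")   # normalized cation proportions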

  9. Discovery Mondays: Surveyors' Tools

    CERN Multimedia

    2003-01-01

    Surveyors of all ages, have your rulers and compasses at the ready! This sixth edition of Discovery Monday is your chance to learn about the surveyor's tools - the state of the art in measuring instruments - and see for yourself how they work. With their usual daunting precision, the members of CERN's Surveying Group have prepared some demonstrations and exercises for you to try. Find out the techniques for ensuring accelerator alignment and learn about high-tech metrology systems such as deviation indicators, tracking lasers and total stations. The surveyors will show you how they precisely measure magnet positioning, with accuracy of a few thousandths of a millimetre. You can try your hand at precision measurement using different types of sensor and a modern-day version of the Romans' bubble level, accurate to within a thousandth of a millimetre. You will learn that photogrammetry techniques can transform even a simple digital camera into a remarkable measuring instrument. Finally, you will have a chance t...

  10. Current status and future prospects for enabling chemistry technology in the drug discovery process [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Stevan W. Djuric

    2016-09-01

    Full Text Available This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  11. Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking

    Science.gov (United States)

    Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward

    2011-01-01

    Objective: To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools, and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP) using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate the associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to use the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate the ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures. This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk

  12. Automation of High-Throughput Crystal Screening and Data Collection at SSRL

    International Nuclear Information System (INIS)

    Miller, Mitchell D.; Brinen, Linda S.; Deacon, Ashley M.; Bedem, Henry van den; Wolf, Guenter; Xu Qingping; Zhang Zepu; Cohen, Aina; Ellis, Paul; McPhillips, Scott E.; McPhillips, Timothy M.; Phizackerley, R. Paul; Soltis, S. Michael

    2004-01-01

    A robotic system for auto-mounting crystals from liquid nitrogen is now operational on SSRL beamlines (Cohen et al., J. Appl. Cryst. (2002). 35, 720-726). The system uses a small industrial 4-axis robot with a custom built actuator. Once mounted, automated alignment of the sample loop to the X-ray beam readies the crystal for data collection. After data collection, samples are returned to the cassette. The beamline Dewar accommodates three compact sample cassettes (holding up to 96 samples each). During the past 4 months, the system on beamline 11-1 has been used to screen over 1000 crystals. The system has reduced both screening time and manpower. Integration of the hardware components is accomplished in the Distributed Control System architecture developed at SSRL (McPhillips et al., J. Synchrotron Rad. (2002) 9, 401-406). A crystal-screening interface has been implemented in Blu-Ice. Sample details can be uploaded from an Excel spreadsheet. The JCSG generates these spreadsheets automatically from their tracking database using standard database tools (http://www.jcsg.org). New diffraction image analysis tools are being employed to aid in extracting results. Automation also permits tele-presence. For example, samples have been changed during the night without leaving home and scientists have screened crystals 1600 miles from the beamline. The system developed on beamline 11-1 has been replicated onto 1-5, 9-1, 9-2, and 11-3 and is used by both general users and the JCSG

  13. Automation Hooks Architecture for Flexible Test Orchestration - Concept Development and Validation

    Science.gov (United States)

    Lansdowne, C. A.; Maclean, John R.; Winton, Chris; McCartney, Pat

    2011-01-01

    The Automation Hooks Architecture Trade Study for Flexible Test Orchestration sought a standardized data-driven alternative to conventional automated test programming interfaces. The study recommended composing the interface using multicast DNS (mDNS/SD) service discovery, Representational State Transfer (RESTful) Web Services, and Automatic Test Markup Language (ATML). We describe additional efforts to rapidly mature the Automation Hooks Architecture candidate interface definition by validating it in a broad spectrum of applications. These activities have allowed us to further refine our concepts and provide observations directed toward the objectives of economy, scalability, versatility, performance, severability, maintainability, scriptability and others.

  14. 'The Lusiads', poem of discovery

    Directory of Open Access Journals (Sweden)

    Natasha Furlan Felizi

    2016-07-01

    Full Text Available The article proposes reading Os Lusíadas as a discovery journey. Discovery here is read as aletheia or “revelation”, as proposed by Sophia de Mello Breyner Andresen in 1980. Using Martin Heidegger's notion of aletheia in the book Parmenides along with Jorge de Sena's and Sophia de Mello Breyner Andresen's reflections on Camões, I'll seek to point out alternative readings of Os Lusíadas as a “discovery journey”.

  15. Discovery of natural resources

    Science.gov (United States)

    Guild, P.W.

    1976-01-01

    Mankind will continue to need ores of more or less the types and grades used today to supply its needs for new mineral raw materials, at least until fusion or some other relatively cheap, inexhaustible energy source is developed. Most deposits being mined today were exposed at the surface or found by relatively simple geophysical or other prospecting techniques, but many of these will be depleted in the foreseeable future. The discovery of deeper or less obvious deposits to replace them will require the conjunction of science and technology to deduce the laws that governed the concentration of elements into ores and to detect and evaluate the evidence of their whereabouts. Great theoretical advances are being made to explain the origins of ore deposits and understand the general reasons for their localization. These advances have unquestionable value for exploration. Even a large deposit is, however, very small, and, with few exceptions, it was formed under conditions that have long since ceased to exist. The explorationist must suppress a great deal of "noise" to read and interpret correctly the "signals" that can define targets and guide the drilling required to find it. Is enough being done to ensure the long-term availability of mineral raw materials? The answer is probably no, in view of the expanding consumption and the difficulty of finding new deposits, but ingenuity, persistence, and continued development of new methods and tools to add to those already at hand should put off the day of "doing without" for many years. The possibility of resource exhaustion, especially in view of the long and increasing lead time needed to carry out basic field and laboratory studies in geology, geophysics, and geochemistry and to synthesize and analyze the information gained from them counsels against any letting down of our guard, however (17). Research and exploration by government, academia, and industry must be supported and encouraged; we cannot wait until an eleventh

  16. Supernovae Discovery Efficiency

    Science.gov (United States)

    John, Colin

    2018-01-01

    Abstract: We present supernova (SN) search efficiency measurements for recent Hubble Space Telescope (HST) surveys. Efficiency is a key component of any search and an important parameter as a correction factor for SN rates. To achieve an accurate value for efficiency, many supernovae need to be discoverable in surveys. This cannot be achieved from real SN alone, due to their scarcity, so fake SN are planted. These fake supernovae—with a goal of realism in mind—yield an understanding of efficiency based on position relative to other celestial objects, and on brightness. To improve realism, we built a more accurate model of supernovae using a point-spread function. The next improvement to realism is planting these objects close to galaxies and with various parameters of brightness, magnitude, local galactic brightness and redshift. Once these are planted, a very accurate SN is visible and discoverable by the searcher. It is very important to find the factors that affect this discovery efficiency. Exploring the factors that affect detection yields a more accurate correction factor. Further inquiries into efficiency give us a better understanding of image processing, searching techniques and survey strategies, and result in an overall higher likelihood of finding these events in future surveys with the Hubble, James Webb, and WFIRST telescopes. After efficiency is discovered and refined with many unique surveys, it factors into measurements of SN rates versus redshift. By comparing SN rates versus redshift against the star formation rate, we can test models to determine how long star systems take from the point of inception to explosion (the delay time distribution). This delay time distribution is compared to SN progenitor models to get an accurate idea of what these stars were like before their deaths.
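
    The arithmetic connecting planted fakes to a rate correction is simple; the Python sketch below uses invented counts purely for illustration.

        # Efficiency from planted fakes, then a debiased count for rates.
        # All counts are invented placeholders.
        planted = 1000                # fake SN inserted into survey images
        recovered = 620               # fakes found by the search pipeline
        efficiency = recovered / planted          # 0.62

        observed_real = 31            # real SN detected in the survey
        corrected = observed_real / efficiency    # debiased count for rate estimates
        print(f"efficiency = {efficiency:.2f}, corrected count = {corrected:.1f}")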

  17. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    Full Text Available The article focuses on the possibility of automating taxiing, the part of a flight which, under adverse weather conditions, greatly reduces the operational usability of an airport, and which is the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and its automatic transfer to the controls. Currently available technologies such as computer vision, Light Detection and Ranging and Global Navigation Satellite Systems, which are useful for navigation, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems along with their installation into the airplane's systems so that automated taxiing becomes possible.

  18. Control and automation systems

    International Nuclear Information System (INIS)

    Schmidt, R.; Zillich, H.

    1986-01-01

    A survey is given of the development of control and automation systems for energy applications. General remarks about control and automation schemes are followed by a description of modern process control systems and of process control as such. After discussing the particular process control requirements of nuclear power plants, the paper deals with the reliability and availability of process control systems and refers to computerized simulation processes. The subsequent paragraphs are dedicated to descriptions of the operating floor, ergonomic conditions, existing systems, flue gas desulfurization systems, the electromagnetic influence on digital circuits, and the use of light waves. (HAG) [de

  19. Automated nuclear materials accounting

    International Nuclear Information System (INIS)

    Pacak, P.; Moravec, J.

    1982-01-01

    An automated state system of accounting for nuclear materials data was established in Czechoslovakia in 1979. A file was compiled of 12 programs in the PL/1 language. The file is divided into four groups according to logical associations, namely programs for data input and checking, programs for handling the basic data file, programs for report outputs in the form of worksheets and magnetic tape records, and programs for book inventory listing, document inventory handling and materials balance listing. A similar automated system of nuclear fuel inventory for a light water reactor was introduced for internal purposes in the Institute of Nuclear Research (UJV). (H.S.)

  20. Automating the CMS DAQ

    International Nuclear Information System (INIS)

    Bauer, G; Darlea, G-L; Gomez-Ceballos, G; Bawej, T; Chaze, O; Coarasa, J A; Deldicque, C; Dobson, M; Dupont, A; Gigi, D; Glege, F; Gomez-Reino, R; Hartl, C; Hegeman, J; Masetti, L; Behrens, U; Branson, J; Cittolin, S; Holzner, A; Erhan, S

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  1. RAS - Screens & Assays - Drug Discovery

    Science.gov (United States)

    The RAS Drug Discovery group aims to develop assays that will reveal aspects of RAS biology upon which cancer cells depend. Successful assay formats are made available for high-throughput screening programs to yield potentially effective drug compounds.

  2. Antibody informatics for drug discovery

    DEFF Research Database (Denmark)

    Shirai, Hiroki; Prades, Catherine; Vita, Randi

    2014-01-01

    to the antibody science in every project in antibody drug discovery. Recent experimental technologies allow for the rapid generation of large-scale data on antibody sequences, affinity, potency, structures, and biological functions; this should accelerate drug discovery research. Therefore, a robust bioinformatic infrastructure for these large data sets has become necessary. In this article, we first identify and discuss the typical obstacles faced during the antibody drug discovery process. We then summarize the current status of three sub-fields of antibody informatics as follows: (i) recent progress in technologies for antibody rational design using computational approaches to affinity and stability improvement, as well as ab-initio and homology-based antibody modeling; (ii) resources for antibody sequences, structures, and immune epitopes and open drug discovery resources for development of antibody drugs; and (iii...

  3. Discovery of the iron isotopes

    International Nuclear Information System (INIS)

    Schuh, A.; Fritsch, A.; Heim, M.; Shore, A.; Thoennessen, M.

    2010-01-01

    Twenty-eight iron isotopes have been observed so far and the discovery of these isotopes is discussed here. For each isotope a brief summary of the first refereed publication, including the production and identification method, is presented.

  4. Discovery of the silver isotopes

    International Nuclear Information System (INIS)

    Schuh, A.; Fritsch, A.; Ginepro, J.Q.; Heim, M.; Shore, A.; Thoennessen, M.

    2010-01-01

    Thirty-eight silver isotopes have been observed so far and the discovery of these isotopes is discussed here. For each isotope a brief summary of the first refereed publication, including the production and identification method, is presented.

  5. Synthetic biology of antimicrobial discovery

    Science.gov (United States)

    Zakeri, Bijan; Lu, Timothy K.

    2012-01-01

    Antibiotic discovery has a storied history. From the discovery of penicillin by Sir Alexander Fleming to the relentless quest for antibiotics by Selman Waksman, the stories have become like folklore, used to inspire future generations of scientists. However, recent discovery pipelines have run dry at a time when multidrug resistant pathogens are on the rise. Nature has proven to be a valuable reservoir of antimicrobial agents, which are primarily produced by modularized biochemical pathways. Such modularization is well suited to remodeling by an interdisciplinary approach that spans science and engineering. Herein, we discuss the biological engineering of small molecules, peptides, and non-traditional antimicrobials and provide an overview of the growing applicability of synthetic biology to antimicrobials discovery. PMID:23654251

  6. Discovery of the cadmium isotopes

    International Nuclear Information System (INIS)

    Amos, S.; Thoennessen, M.

    2010-01-01

    Thirty-seven cadmium isotopes have been observed so far and the discovery of these isotopes is discussed here. For each isotope a brief summary of the first refereed publication, including the production and identification method, is presented.

  7. Discoveries of isotopes by fission

    Indian Academy of Sciences (India)

    country of discovery as well as the production mechanism used to produce the isotopes. ... the disintegration products of bombarded uranium, as a consequence of a ... advanced accelerator and newly developed separation and detection ...

  8. Synthetic biology of antimicrobial discovery.

    Science.gov (United States)

    Zakeri, Bijan; Lu, Timothy K

    2013-07-19

    Antibiotic discovery has a storied history. From the discovery of penicillin by Sir Alexander Fleming to the relentless quest for antibiotics by Selman Waksman, the stories have become like folklore used to inspire future generations of scientists. However, recent discovery pipelines have run dry at a time when multidrug-resistant pathogens are on the rise. Nature has proven to be a valuable reservoir of antimicrobial agents, which are primarily produced by modularized biochemical pathways. Such modularization is well suited to remodeling by an interdisciplinary approach that spans science and engineering. Herein, we discuss the biological engineering of small molecules, peptides, and non-traditional antimicrobials and provide an overview of the growing applicability of synthetic biology to antimicrobials discovery.

  9. The discovery of 'heavy light'

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    The history of the discoveries of fundamental quanta is described starting from Maxwell's theory of electromagnetism up to the development of a theory of weak interaction and the detection of the W and Z bosons. (HSI).

  10. Discovery – Development of Rituximab

    Science.gov (United States)

    NCI funded the development of rituximab, one of the first monoclonal antibody cancer treatments. With the discovery of rituximab, more than 70 percent of patients diagnosed with non-Hodgkin lymphoma now live five years past their initial diagnosis.

  11. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints imbedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased; suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. 'tipping point') is constant or can be manipulated. The results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased; suggesting it is possible to alter automation acceptance.

  12. Radioactivity. Centenary of radioactivity discovery

    International Nuclear Information System (INIS)

    Charpak, G.; Tubiana, M.; Bimbot, R.

    1997-01-01

    This small booklet was produced for the exhibitions celebrating the centenary of the discovery of radioactivity, which took place at various locations in France from 1996 to 1998. It recalls some basic knowledge concerning radioactivity and its applications: the history of the discovery, atoms and isotopes, radiations, measurement of ionizing radiations, natural and artificial radioactivity, isotope dating and labelling, radiotherapy, nuclear power and reactors, fission and fusion, nuclear wastes, dosimetry, effects and radioprotection. (J.S.)

  13. Computational methods in drug discovery

    Directory of Open Access Journals (Sweden)

    Sumudu P. Leelananda

    2016-12-01

    Full Text Available The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  14. LIBRARY AUTOMATION IN NIGERIAN UNIVERSITIES

    African Journals Online (AJOL)

    facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  15. Get Involved in Planetary Discoveries through New Worlds, New Discoveries

    Science.gov (United States)

    Shupla, Christine; Shipp, S. S.; Halligan, E.; Dalton, H.; Boonstra, D.; Buxner, S.; SMD Planetary Forum, NASA

    2013-01-01

    "New Worlds, New Discoveries" is a synthesis of NASA’s 50-year exploration history which provides an integrated picture of our new understanding of our solar system. As NASA spacecraft head to and arrive at key locations in our solar system, "New Worlds, New Discoveries" provides an integrated picture of our new understanding of the solar system to educators and the general public! The site combines the amazing discoveries of past NASA planetary missions with the most recent findings of ongoing missions, and connects them to the related planetary science topics. "New Worlds, New Discoveries," which includes the "Year of the Solar System" and the ongoing celebration of the "50 Years of Exploration," includes 20 topics that share thematic solar system educational resources and activities, tied to the national science standards. This online site and ongoing event offers numerous opportunities for the science community - including researchers and education and public outreach professionals - to raise awareness, build excitement, and make connections with educators, students, and the public about planetary science. Visitors to the site will find valuable hands-on science activities, resources and educational materials, as well as the latest news, to engage audiences in planetary science topics and their related mission discoveries. The topics are tied to the big questions of planetary science: how did the Sun’s family of planets and bodies originate and how have they evolved? How did life begin and evolve on Earth, and has it evolved elsewhere in our solar system? Scientists and educators are encouraged to get involved either directly or by sharing "New Worlds, New Discoveries" and its resources with educators, by conducting presentations and events, sharing their resources and events to add to the site, and adding their own public events to the site’s event calendar! Visit to find quality resources and ideas. Connect with educators, students and the public to

  16. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  17. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  18. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings to correctly segment 67% of the total possible diatom valves and fragments from broad fields of view (183 light microscope images containing 255 diatom particles were examined; of these, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software). Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, thus highlighting that the software has an approximately five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has the potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.
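
    The five-fold figure can be checked directly from the abstract's own numbers:

        # Checking the five-fold efficiency figure from the quoted counts.
        manual_rate = 255 / 400      # ~0.64 particles/s analyzed by hand
        auto_rate = 216 / 68         # ~3.18 particles/s with the software
        print(f"speedup = {auto_rate / manual_rate:.1f}x")   # ~5.0x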

  19. Computational discovery of extremal microstructure families

    Science.gov (United States)

    Chen, Desai; Skouras, Mélina; Zhu, Bo; Matusik, Wojciech

    2018-01-01

    Modern fabrication techniques, such as additive manufacturing, can be used to create materials with complex custom internal structures. These engineered materials exhibit a much broader range of bulk properties than their base materials and are typically referred to as metamaterials or microstructures. Although metamaterials with extraordinary properties have many applications, designing them is very difficult and is generally done by hand. We propose a computational approach to discover families of microstructures with extremal macroscale properties automatically. Using efficient simulation and sampling techniques, we compute the space of mechanical properties covered by physically realizable microstructures. Our system then clusters microstructures with common topologies into families. Parameterized templates are eventually extracted from families to generate new microstructure designs. We demonstrate these capabilities on the computational design of mechanical metamaterials and present five auxetic microstructure families with extremal elastic material properties. Our study opens the way for the completely automated discovery of extremal microstructures across multiple domains of physics, including applications reliant on thermal, electrical, and magnetic properties. PMID:29376124

  20. Knowledge Discovery in Data in Construction Projects

    Directory of Open Access Journals (Sweden)

    Szelka J.

    2016-06-01

    Full Text Available Decision-making processes, including those related to ill-structured problems, are of considerable significance in the area of construction projects. Computer-aided inference under such conditions requires the employment of specific (non-algorithmic) methods and tools, the best recognized and most successfully used in practice being expert systems. The knowledge indispensable for such systems to perform inference is most frequently acquired directly from experts (through a dialogue between a domain expert and a knowledge engineer) and from various source documents. Little is known, however, about the possibility of automating knowledge acquisition in this area, and as a result it is scarcely ever used in practice. It has to be noted that in numerous areas of management, more and more attention is paid to the issue of acquiring knowledge from available data, and different methods and tools for doing so are known and successfully employed to aid decision-making. The paper attempts to select methods for knowledge discovery in data and presents possible ways of representing the acquired knowledge, as well as sample tools (including programming ones) allowing for the use of this knowledge in the area under consideration.

  1. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...

  2. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  3. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system detects any human head that appears at the side mirrors. The detected head is then tracked and recorded for further action.

  4. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. It also investigates the very important issue of co-designing plant-structures and dynamic controllers in automated design of Mechatronic...

  5. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  6. Automated conflict resolution issues

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  7. Automated gamma counters

    International Nuclear Information System (INIS)

    Regener, M.

    1977-01-01

    This is a report on the most recent developments in the full automation of gamma counting in RIA, in particular by Messrs. Kontron. The development targets were flexibility in sample capacity and shape of test tubes, the possibility of using different radioisotopes for labelling due to an optimisation of the detector system, and the use of microprocessors to substitute software for hardware. (ORU) [de

  8. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Full Text Available Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought to occur to a knowledgeable reader would probably be: why this old topic again? What is new to discuss? Yet everyone agrees that automation testing today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks that facilitate the testing of applications developed on various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it. The change in people's perspective on, and knowledge of, automation has altered the terrain. This article reflects the author's views and experience regarding how the original myths have transformed into new versions, and how those versions arise; it also provides his thoughts on the new generation of myths.

  10. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  11. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations are given on the selection of activation analysis techniques, especially techniques using short-lived isotopes. The possibilities for increasing data-channel throughput by using modern computers to automate the analysis and data-processing procedure are shown.

  12. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    computer that can switch between predefined settings. Sometimes the computer can be controlled remotely over the internet, so that the home's status can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classics within home automation, additional functionality has emerged...

  13. Automation of radioimmunoassays

    International Nuclear Information System (INIS)

    Goldie, D.J.; West, P.M.; Ismail, A.A.A.

    1979-01-01

    A short account is given of recent developments in automation of the RIA technique. Difficulties encountered in the incubation, separation and quantitation steps are summarized. Published references are given to a number of systems, both discrete and continuous flow, and details are given of a system developed by the present authors. (U.K.)

  14. Microcontroller for automation application

    Science.gov (United States)

    Cooper, H. W.

    1975-01-01

    A description is given of a microcontroller currently being developed for automation applications. It is basically an 8-bit microcomputer with 40K bytes of random access memory/read-only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  15. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  16. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is a mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides a means for programming and interoperability. This architecture permits a point-and-click program to act as both automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user interface, it can be tested independently. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial-port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
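
    The client/server automation pattern described above can be sketched from the scripting side. The example below, assuming Windows and the pywin32 package, shows a script driving Excel through COM and pushing spectral data into a worksheet; "SpectroDAQ.Application" and its Scan method are hypothetical stand-ins for a vendor DAQ server, not FTG's actual interface.

```python
# A sketch of the automation-client pattern on Windows via pywin32. The Excel
# COM calls are standard; "SpectroDAQ.Application" and its Scan method are
# hypothetical stand-ins for a vendor DAQ server.
import win32com.client

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = True
sheet = excel.Workbooks.Add().Worksheets(1)

daq = win32com.client.Dispatch("SpectroDAQ.Application")   # hypothetical ProgID
spectrum = daq.Scan(380, 780)                              # hypothetical (wavelength, value) pairs

for row, (wavelength, value) in enumerate(spectrum, start=1):
    sheet.Cells(row, 1).Value = wavelength
    sheet.Cells(row, 2).Value = value
```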

  17. 42 CFR 426.432 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Discovery. 426.432 Section 426.432 Public Health... § 426.432 Discovery. (a) General rule. If the ALJ orders discovery, the ALJ must establish a reasonable timeframe for discovery. (b) Protective order—(1) Request for a protective order. Any party receiving a...

  18. 40 CFR 27.21 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Discovery. 27.21 Section 27.21... Discovery. (a) The following types of discovery are authorized: (1) Requests for production of documents for..., discovery is available only as ordered by the presiding officer. The presiding officer shall regulate the...

  19. 13 CFR 134.213 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Discovery. 134.213 Section 134.213... OFFICE OF HEARINGS AND APPEALS Rules of Practice for Most Cases § 134.213 Discovery. (a) Motion. A party may obtain discovery only upon motion, and for good cause shown. (b) Forms. The forms of discovery...

  20. 37 CFR 41.150 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Discovery. 41.150 Section 41... COMMERCE PRACTICE BEFORE THE BOARD OF PATENT APPEALS AND INTERFERENCES Contested Cases § 41.150 Discovery. (a) Limited discovery. A party is not entitled to discovery except as authorized in this subpart. The...

  1. 19 CFR 354.10 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Discovery. 354.10 Section 354.10 Customs Duties... ANTIDUMPING OR COUNTERVAILING DUTY ADMINISTRATIVE PROTECTIVE ORDER § 354.10 Discovery. (a) Voluntary discovery. All parties are encouraged to engage in voluntary discovery procedures regarding any matter, not...

  2. 14 CFR 13.220 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Discovery. 13.220 Section 13.220... INVESTIGATIVE AND ENFORCEMENT PROCEDURES Rules of Practice in FAA Civil Penalty Actions § 13.220 Discovery. (a) Initiation of discovery. Any party may initiate discovery described in this section, without the consent or...

  3. 49 CFR 604.38 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Discovery. 604.38 Section 604.38 Transportation... TRANSPORTATION CHARTER SERVICE Hearings. § 604.38 Discovery. (a) Permissible forms of discovery shall be within the discretion of the PO. (b) The PO shall limit the frequency and extent of discovery permitted by...

  4. 15 CFR 719.10 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Discovery. 719.10 Section 719.10... Discovery. (a) General. The parties are encouraged to engage in voluntary discovery regarding any matter... the Federal Rules of Civil Procedure relating to discovery apply to the extent consistent with this...

  5. 14 CFR 16.213 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Discovery. 16.213 Section 16.213... PRACTICE FOR FEDERALLY-ASSISTED AIRPORT ENFORCEMENT PROCEEDINGS Hearings § 16.213 Discovery. (a) Discovery... discovery permitted by this section if a party shows that— (1) The information requested is cumulative or...

  6. 28 CFR 76.21 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Discovery. 76.21 Section 76.21 Judicial... POSSESSION OF CERTAIN CONTROLLED SUBSTANCES § 76.21 Discovery. (a) Scope. Discovery under this part covers... as a general guide for discovery practices in proceedings before the Judge. However, unless otherwise...

  7. 36 CFR 1150.63 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Discovery. 1150.63 Section... PRACTICE AND PROCEDURES FOR COMPLIANCE HEARINGS Prehearing Conferences and Discovery § 1150.63 Discovery. (a) Parties are encouraged to engage in voluntary discovery procedures. For good cause shown under...

  8. 10 CFR 13.21 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Discovery. 13.21 Section 13.21 Energy NUCLEAR REGULATORY COMMISSION PROGRAM FRAUD CIVIL REMEDIES § 13.21 Discovery. (a) The following types of discovery are...) Unless mutually agreed to by the parties, discovery is available only as ordered by the ALJ. The ALJ...

  9. 49 CFR 1121.2 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Discovery. 1121.2 Section 1121.2 Transportation... TRANSPORTATION RULES OF PRACTICE RAIL EXEMPTION PROCEDURES § 1121.2 Discovery. Discovery shall follow the procedures set forth at 49 CFR part 1114, subpart B. Discovery may begin upon the filing of the petition for...

  10. 24 CFR 26.18 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Discovery. 26.18 Section 26.18... PROCEDURES Hearings Before Hearing Officers Discovery § 26.18 Discovery. (a) General. The parties are encouraged to engage in voluntary discovery procedures, which may commence at any time after an answer has...

  11. 38 CFR 42.21 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Discovery. 42.21 Section... IMPLEMENTING THE PROGRAM FRAUD CIVIL REMEDIES ACT § 42.21 Discovery. (a) The following types of discovery are... creation of a document. (c) Unless mutually agreed to by the parties, discovery is available only as...

  12. 22 CFR 521.21 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Discovery. 521.21 Section 521.21 Foreign... Discovery. (a) The following types of discovery are authorized: (1) Requests for production of documents for... interpreted to require the creation of a document. (c) Unless mutually agreed to by the parties, discovery is...

  13. 31 CFR 10.71 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Discovery. 10.71 Section 10.71 Money... SERVICE Rules Applicable to Disciplinary Proceedings § 10.71 Discovery. (a) In general. Discovery may be... relevance, materiality and reasonableness of the requested discovery and subject to the requirements of § 10...

  14. 42 CFR 426.532 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Discovery. 426.532 Section 426.532 Public Health... § 426.532 Discovery. (a) General rule. If the Board orders discovery, the Board must establish a reasonable timeframe for discovery. (b) Protective order—(1) Request for a protective order. Any party...

  15. 39 CFR 955.15 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Discovery. 955.15 Section 955.15 Postal Service... APPEALS § 955.15 Discovery. (a) The parties are encouraged to engage in voluntary discovery procedures. In connection with any deposition or other discovery procedure, the Board may issue any order which justice...

  16. 49 CFR 1503.633 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Discovery. 1503.633 Section 1503.633... Rules of Practice in TSA Civil Penalty Actions § 1503.633 Discovery. (a) Initiation of discovery. Any party may initiate discovery described in this section, without the consent or approval of the ALJ, at...

  17. 43 CFR 35.21 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Discovery. 35.21 Section 35.21 Public... AND STATEMENTS § 35.21 Discovery. (a) The following types of discovery are authorized: (1) Requests...) Unless mutually agreed to by the parties, discovery is available only as ordered by the ALJ. The ALJ...

  18. 14 CFR 1264.120 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Discovery. 1264.120 Section 1264.120... PENALTIES ACT OF 1986 § 1264.120 Discovery. (a) The following types of discovery are authorized: (1..., discovery is available only as ordered by the presiding officer. The presiding officer shall regulate the...

  19. 22 CFR 128.6 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Discovery. 128.6 Section 128.6 Foreign... Discovery. (a) Discovery by the respondent. The respondent, through the Administrative Law Judge, may... discovery if the interests of national security or foreign policy so require, or if necessary to comply with...

  20. 37 CFR 11.52 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Discovery. 11.52 Section 11... Disciplinary Proceedings; Jurisdiction, Sanctions, Investigations, and Proceedings § 11.52 Discovery. Discovery... establishes that discovery is reasonable and relevant, the hearing officer, under such conditions as he or she...

  1. 24 CFR 26.42 - Discovery.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Discovery. 26.42 Section 26.42... PROCEDURES Hearings Pursuant to the Administrative Procedure Act Discovery § 26.42 Discovery. (a) General. The parties are encouraged to engage in voluntary discovery procedures, which may commence at any time...

  2. 49 CFR 386.37 - Discovery.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Discovery. 386.37 Section 386.37 Transportation... and Hearings § 386.37 Discovery. (a) Parties may obtain discovery by one or more of the following...; and requests for admission. (b) Discovery may not commence until the matter is pending before the...

  3. 29 CFR 1955.32 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Discovery. 1955.32 Section 1955.32 Labor Regulations...) PROCEDURES FOR WITHDRAWAL OF APPROVAL OF STATE PLANS Preliminary Conference and Discovery § 1955.32 Discovery... allow discovery by any other appropriate procedure, such as by interrogatories upon a party or request...

  4. 31 CFR 16.21 - Discovery.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Discovery. 16.21 Section 16.21 Money... FRAUD CIVIL REMEDIES ACT OF 1986 § 16.21 Discovery. (a) The following types of discovery are authorized... to require the creation of a document. (c) Unless mutually agreed to by the parties, discovery is...

  5. 15 CFR 766.9 - Discovery.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Discovery. 766.9 Section 766.9... PROCEEDINGS § 766.9 Discovery. (a) General. The parties are encouraged to engage in voluntary discovery... provisions of the Federal Rules of Civil Procedure relating to discovery apply to the extent consistent with...

  6. 43 CFR 4.1130 - Discovery methods.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Discovery methods. 4.1130 Section 4.1130... Special Rules Applicable to Surface Coal Mining Hearings and Appeals Discovery § 4.1130 Discovery methods. Parties may obtain discovery by one or more of the following methods— (a) Depositions upon oral...

  7. New Generation Discovery: A Systematic View for Its Development, Issues and Future

    KAUST Repository

    Yu, Yi

    2012-11-01

    Collecting, storing, discovering, and locating are integral parts of the composition of the library. To fully utilize the library and achieve its ultimate value, the construction and production of discovery have always been central to the library's practice and identity. That is why new generation discovery (also called next-generation discovery) has had such a striking effect since it entered the library automation arena. However, when we talk about new generation discovery in the library domain, we should see it in the entirety of the library, as one of its organic parts, and consider its progress along with the evolution of the whole library world. We should have a deeper understanding of its relationship and interaction with the internet, the rapidly changing digital environment, and the elements and chain of library services. To address these issues, this paper reviews different versions of the definition of new generation discovery, combined with our own understanding, and gives our own description of its properties and characteristics. The paper points out the challenges faced by discovery applications, which extend beyond the technology domain to commercial interests and business strategy, and how libraries and library professionals can deal with those challenges. Finally, the paper elaborates on the promise of the new discovery development and what the next explorations might be for its future.

  8. Coupling Visualization and Data Analysis for Knowledge Discovery from Multi-dimensional Scientific Data

    International Nuclear Information System (INIS)

    Rubel, Oliver; Ahern, Sean; Bethel, E. Wes; Biggin, Mark D.; Childs, Hank; Cormier-Michel, Estelle; DePace, Angela; Eisen, Michael B.; Fowlkes, Charless C.; Geddes, Cameron G.R.; Hagen, Hans; Hamann, Bernd; Huang, Min-Yu; Keranen, Soile V.E.; Knowles, David W.; Hendriks, Chris L. Luengo; Malik, Jitendra; Meredith, Jeremy; Messmer, Peter; Prabhat; Ushizima, Daniela; Weber, Gunther H.; Wu, Kesheng

    2010-01-01

    Knowledge discovery from large and complex scientific data is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for effective data analysis and data exploration methods and tools. The combination and close integration of methods from scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management, supports knowledge discovery from multi-dimensional scientific data. This paper surveys two distinct applications in developmental biology and accelerator physics, illustrating the effectiveness of the described approach.

  9. Space Discovery: Teaching with Space. Evaluation: Summer, Fall 1998 Programs

    Science.gov (United States)

    Ewell, Bob

    1998-01-01

    This is the final report of the 1998 NASA-sponsored evaluation of the effectiveness of the United States Space Foundation's five-day Space Discovery Standard Graduate Course (Living and Working in Space), the five-day Space Discovery Advanced Graduate Course (Advanced Technology and Biomedical Research), and the five-day introductory course Aviation and Space Basics, all conducted during the summer of 1998, and of the Teaching with Space two-day Inservice program. The purpose of the program is to motivate and equip K-12 teachers to use proven student-attracting space and technology concepts to support standard curriculum. These programs support the America 2000 National Educational Goals, encouraging more students to stay in school, increase in competence, and have a better opportunity to be attracted to math and science. The 1998 research program continues the comprehensive evaluation begun in 1992, this year studying five summer five-day sessions and five Inservice programs offered during the Fall of 1998 in California, Colorado, New York, and Virginia. A comprehensive research design by Dr. Robert Ewell of Creative Solutions and Dr. Darwyn Linder of Arizona State University evaluated the effectiveness of various areas of the program and its applicability to diverse groups. The preliminary research methodology was a set of survey instruments administered after the courses, with another to be sent in April, four to five months after the last Inservice program in this study. This year, we have departed from this evaluation design in two ways. First, the five-day programs used NASA's new EDCATS on-line system and its associated survey rather than the Linder/Ewell instruments. The Inservice programs were evaluated using the previously developed survey adapted for Inservice programs. Second, we did not do a follow-on survey of the teachers after they had been in the field, as we have done in the past. Therefore, this evaluation captures only the reactions of the teachers to the programs.

  10. The Europa Ocean Discovery mission

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, B.C. [Los Alamos National Lab., NM (United States)]; Chyba, C.F. [Univ. of Arizona, Tucson, AZ (United States)]; Abshire, J.B. [National Aeronautics and Space Administration, Greenbelt, MD (United States). Goddard Space Flight Center] [and others]

    1997-06-01

    Since it was first proposed that tidal heating of Europa by Jupiter might lead to liquid water oceans below Europa's ice cover, there has been speculation over the possible exobiological implications of such an ocean. Liquid water is the essential ingredient for life as it is known, and the existence of a second water ocean in the Solar System would be of paramount importance for seeking the origin and existence of life beyond Earth. The authors present here a Discovery-class mission concept (Europa Ocean Discovery) to determine the existence of a liquid water ocean on Europa and to characterize Europa's surface structure. The technical goal of the Europa Ocean Discovery mission is to study Europa with an orbiting spacecraft. This goal is challenging but entirely feasible within the Discovery envelope. There are four key challenges: entering Europan orbit, generating power, surviving long enough in the radiation environment to return valuable science, and completing the mission within the Discovery program's launch vehicle and budget constraints. The authors present here a viable mission that meets these challenges.

  11. An experimental evaluation of the generalizing capabilities of process discovery techniques and black-box sequence models

    NARCIS (Netherlands)

    Tax, N.; van Zelst, S.J.; Teinemaa, I.; Gulden, Jens; Reinhartz-Berger, Iris; Schmidt, Rainer; Guerreiro, Sérgio; Guédria, Wided; Bera, Palash

    2018-01-01

    A plethora of automated process discovery techniques have been developed which aim to discover a process model based on event data originating from the execution of business processes. The aim of the discovered process models is to describe the control-flow of the underlying business process. At the
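
    Many discovery techniques start from the directly-follows relation over an event log. A minimal sketch of that counting step, on a made-up three-trace log, might look like this:

```python
# Counting the directly-follows relation over a made-up event log, the raw
# material for many discovery algorithms (one list of activities per case).
from collections import Counter

event_log = [
    ["register", "check", "approve", "archive"],
    ["register", "check", "reject", "archive"],
    ["register", "check", "approve", "archive"],
]

directly_follows = Counter(
    (a, b) for trace in event_log for a, b in zip(trace, trace[1:])
)
for (a, b), count in directly_follows.most_common():
    print(f"{a} -> {b}: {count}")
```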

  12. Estimating Policy-Driven Greenhouse Gas Emissions Trajectories in California: The California Greenhouse Gas Inventory Spreadsheet (GHGIS) Model

    Energy Technology Data Exchange (ETDEWEB)

    Greenblatt, Jeffery B.

    2013-10-10

    A California Greenhouse Gas Inventory Spreadsheet (GHGIS) model was developed to explore the impact of combinations of state policies on state greenhouse gas (GHG) and regional criteria pollutant emissions. The model included representations of all GHG-emitting sectors of the California economy (including those outside the energy sector, such as high global warming potential gases, waste treatment, agriculture and forestry) in varying degrees of detail, and was carefully calibrated using available data and projections from multiple state agencies and other sources. Starting from basic drivers such as population, numbers of households, gross state product, numbers of vehicles, etc., the model calculated energy demands by type (various types of liquid and gaseous hydrocarbon fuels, electricity and hydrogen), and finally calculated emissions of GHGs and three criteria pollutants: reactive organic gases (ROG), nitrogen oxides (NOx), and fine (2.5 µm) particulate matter (PM2.5). Calculations were generally statewide, but in some sectors, criteria pollutants were also calculated for two regional air basins: the South Coast Air Basin (SCAB) and the San Joaquin Valley (SJV). Three scenarios were developed that attempt to model: (1) all committed policies, (2) additional, uncommitted policy targets and (3) potential technology and market futures. Each scenario received extensive input from state energy planning agencies, in particular the California Air Resources Board. Results indicate that all three scenarios are able to meet the 2020 statewide GHG targets, and by 2030, statewide GHG emissions range between 208 and 396 MtCO2/yr. However, none of the scenarios are able to meet the 2050 GHG target of 85 MtCO2/yr, with emissions ranging from 188 to 444 MtCO2/yr, so additional policies will need to be developed for California to meet this stringent future target. A full sensitivity study of major scenario assumptions was also performed. In terms of criteria pollutants
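
    The driver-based accounting such a model uses can be illustrated in a few lines. The sketch below follows one chain (population to vehicle fuel demand to CO2) with invented placeholder numbers; it is not calibrated to California data and omits the model's sectoral detail.

```python
# One driver -> demand -> emissions chain, with invented placeholder numbers
# (not calibrated to California); the real model repeats this across sectors.
population = 40e6            # people
vehicles_per_capita = 0.75   # light-duty vehicles per person
km_per_vehicle = 18_000      # annual travel per vehicle (km)
fuel_intensity = 0.08        # liters of gasoline per km
emission_factor = 2.3e-3     # tCO2 per liter of gasoline

fuel_demand = population * vehicles_per_capita * km_per_vehicle * fuel_intensity
emissions_mt = fuel_demand * emission_factor / 1e6   # MtCO2/yr
print(f"light-duty vehicle emissions: {emissions_mt:.0f} MtCO2/yr")
```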

  13. Optimization Modeling with Spreadsheets

    CERN Document Server

    Baker, Kenneth R

    2011-01-01

    This introductory book on optimization (mathematical programming) includes coverage of linear programming, nonlinear programming, integer programming, and heuristic programming, as well as an emphasis on model building using Excel and Solver. The emphasis on model building (rather than algorithms) is one of the features that makes this book distinctive. Most books devote more space to algorithmic details than to formulation principles. These days, however, it is not necessary to know a great deal about algorithms in order to apply optimization tools, especially when relying on the sp
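
    To make the formulation-first emphasis concrete, here is the kind of small product-mix linear program such a book would build in a spreadsheet with Solver, expressed instead with SciPy; the coefficients are invented.

```python
# An invented product-mix model: maximize profit 40*x1 + 30*x2 subject to
# labor and material limits; linprog minimizes, so the objective is negated.
from scipy.optimize import linprog

result = linprog(
    c=[-40, -30],
    A_ub=[[1, 2],    # labor hours per unit of each product
          [3, 1]],   # material units per unit of each product
    b_ub=[40, 60],   # available labor hours, material units
    bounds=[(0, None), (0, None)],
)
print("production plan:", result.x, "profit:", -result.fun)
```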

  14. Financiering met Excel spreadsheets

    NARCIS (Netherlands)

    van der Goot, T.

    2010-01-01

    This book gives a summary of the most important topics in corporate finance. For example, you can read about return and risk, the present value of cash flows, leverage, dividend policy, and options.
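
    As a small illustration of one of those topics, the sketch below computes the present value of a bond-like cash-flow stream; the cash flows and discount rate are invented examples, not taken from the book.

```python
# Present value of a bond-like cash-flow stream; amounts and rate are invented.
cash_flows = [100, 100, 100, 1100]   # cash flow received at end of years 1..4
rate = 0.05                          # annual discount rate

present_value = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
print(f"present value: {present_value:.2f}")
```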

  15. Robotic liquid handling and automation in epigenetics.

    Science.gov (United States)

    Gaisford, Wendy

    2012-10-01

    Automated liquid-handling robots and high-throughput screening (HTS) are widely used in the pharmaceutical industry for screening large compound libraries of small molecules for activity against disease-relevant target pathways or proteins. HTS robots capable of low-volume dispensing reduce assay setup times and provide highly accurate and reproducible dispensing, minimizing variation between sample replicates and eliminating the potential for manual error. Low-volume automated nanoliter dispensers ensure accuracy of pipetting within volume ranges that are difficult to achieve manually. In addition, they can potentially expand the range of screening conditions obtainable from often limited amounts of valuable sample, as well as reduce the usage of expensive reagents. The ability to dispense lower volumes accurately provides the potential to obtain more information than could otherwise be achieved using manual dispensing technology. With the emergence of the field of epigenetics, an increasing number of drug discovery companies are beginning to screen compound libraries against a range of epigenetic targets. This review discusses the potential for using low-volume liquid-handling robots for molecular biology applications such as quantitative PCR and epigenetics.

  16. Automated docking screens: a feasibility study.

    Science.gov (United States)

    Irwin, John J; Shoichet, Brian K; Mysinger, Michael M; Huang, Niu; Colizzi, Francesco; Wassam, Pascal; Cao, Yiqun

    2009-09-24

    Molecular docking is the most practical approach to leverage protein structure for ligand discovery, but the technique retains important liabilities that make it challenging to deploy on a large scale. We have therefore created an expert system, DOCK Blaster, to investigate the feasibility of full automation. The method requires a PDB code, sometimes with a ligand structure, and from that alone can launch a full screen of large libraries. A critical feature is self-assessment, which estimates the anticipated reliability of the automated screening results using pose fidelity and enrichment. Against common benchmarks, DOCK Blaster recapitulates the crystal ligand pose within 2 Å RMSD 50-60% of the time; inferior to an expert, but respectable. Half the time the ligand also ranked among the top 5% of 100 physically matched decoys chosen on the fly. Further tests were undertaken, culminating in a study of 7755 eligible PDB structures. In 1398 cases, the redocked ligand ranked in the top 5% of 100 property-matched decoys while also posing within 2 Å RMSD, suggesting that unsupervised prospective docking is viable. DOCK Blaster is available at http://blaster.docking.org.
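
    The pose-fidelity criterion quoted above (within 2 Å RMSD of the crystal pose) reduces to a simple computation once docked and crystal atoms are matched. A minimal sketch with placeholder coordinates:

```python
# Heavy-atom RMSD between a docked pose and the crystal pose, assuming a
# one-to-one atom correspondence; the coordinates are placeholders.
import numpy as np

crystal = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
docked = np.array([[0.2, 0.1, 0.0], [1.6, 0.2, 0.1], [1.4, 1.7, 0.2]])

rmsd = np.sqrt(np.mean(np.sum((docked - crystal) ** 2, axis=1)))
print(f"RMSD = {rmsd:.2f} Angstrom")   # within 2 Angstrom counts as recapitulated
```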

  17. Deep Learning in Drug Discovery.

    Science.gov (United States)

    Gawehn, Erik; Hiss, Jan A; Schneider, Gisbert

    2016-01-01

    Artificial neural networks had their first heyday in molecular informatics and drug discovery approximately two decades ago. Currently, we are witnessing renewed interest in adapting advanced neural network architectures for pharmaceutical research by borrowing from the field of "deep learning". Compared with some of the other life sciences, their application in drug discovery is still limited. Here, we provide an overview of this emerging field of molecular informatics, present the basic concepts of prominent deep learning methods and offer motivation to explore these techniques for their usefulness in computer-assisted drug discovery and design. We specifically emphasize deep neural networks, restricted Boltzmann machine networks and convolutional networks. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Discovery of the Higgs boson

    CERN Document Server

    Sharma, Vivek

    2016-01-01

    The recent observation of the Higgs boson has been hailed as the scientific discovery of the century and led to the 2013 Nobel Prize in physics. This book describes the detailed science behind the decades-long search for this elusive particle at the Large Electron Positron Collider at CERN and at the Tevatron at Fermilab and its subsequent discovery and characterization at the Large Hadron Collider at CERN. Written by physicists who played leading roles in this epic search and discovery, this book is an authoritative and pedagogical exposition of the portrait of the Higgs boson that has emerged from a large number of experimental measurements. As the first of its kind, this book should be of interest to graduate students and researchers in particle physics.

  19. Bioinformatics in translational drug discovery.

    Science.gov (United States)

    Wooller, Sarah K; Benstead-Hume, Graeme; Chen, Xiangrong; Ali, Yusuf; Pearl, Frances M G

    2017-08-31

    Bioinformatics approaches are becoming ever more essential in translational drug discovery both in academia and within the pharmaceutical industry. Computational exploitation of the increasing volumes of data generated during all phases of drug discovery is enabling key challenges of the process to be addressed. Here, we highlight some of the areas in which bioinformatics resources and methods are being developed to support the drug discovery pipeline. These include the creation of large data warehouses, bioinformatics algorithms to analyse 'big data' that identify novel drug targets and/or biomarkers, programs to assess the tractability of targets, and prediction of repositioning opportunities that use licensed drugs to treat additional indications. © 2017 The Author(s).

  20. Using the iPlant collaborative discovery environment.

    Science.gov (United States)

    Oliver, Shannon L; Lenards, Andrew J; Barthelson, Roger A; Merchant, Nirav; McKay, Sheldon J

    2013-06-01

    The iPlant Collaborative is an academic consortium whose mission is to develop an informatics and social infrastructure to address the "grand challenges" in plant biology. Its cyberinfrastructure supports the computational needs of the research community and facilitates solving major challenges in plant science. The Discovery Environment provides a powerful and rich graphical interface to the iPlant Collaborative cyberinfrastructure by creating an accessible virtual workbench that enables all levels of expertise, ranging from students to traditional biology researchers and computational experts, to explore, analyze, and share their data. By providing access to iPlant's robust data-management system and high-performance computing resources, the Discovery Environment also creates a unified space in which researchers can access scalable tools. Researchers can use available Applications (Apps) to execute analyses on their data, as well as customize or integrate their own tools to better meet the specific needs of their research. These Apps can also be used in workflows that automate more complicated analyses. This module describes how to use the main features of the Discovery Environment, using bioinformatics workflows for high-throughput sequence data as examples. © 2013 by John Wiley & Sons, Inc.