WorldWideScience

Sample records for testing computational toxicology

  1. Downloadable Computational Toxicology Data

    Science.gov (United States)

    EPA’s computational toxicology research generates data on the potential harm (hazard) of chemicals, the degree of exposure to chemicals, and their unique chemical characteristics. These data are publicly available here.

  2. Progress in computational toxicology.

    Science.gov (United States)

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across the pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is reviewed here. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications that use machine learning methods have been highlighted. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that are increasingly used for prediction. It is shown that across many different models Bayesian and SVM methods perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology over the past decade, in both model development and the availability of larger scale or 'big data' models. Future efforts in toxicology data generation will likely provide hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
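
    The Bayesian-versus-SVM comparison described above can be illustrated in a few lines. The sketch below is a minimal illustration, not the paper's setup: it assumes scikit-learn, uses synthetic fingerprint-like binary features rather than any published toxicology dataset, and the model choices (BernoulliNB, RBF-kernel SVC) are assumptions.

        # Minimal sketch: compare a Bayesian classifier and an SVM on the
        # same data with 5-fold cross-validation (synthetic placeholder data).
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import BernoulliNB
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=500, n_features=128, random_state=0)
        X = (X > 0).astype(int)  # binarize to mimic structural fingerprints

        for name, model in [("Bayesian (BernoulliNB)", BernoulliNB()),
                            ("SVM (RBF kernel)", SVC(kernel="rbf"))]:
            scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
            print(f"{name}: mean 5-fold ROC AUC = {scores.mean():.3f}")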

  3. ACToR-AGGREGATED COMPUTATIONAL TOXICOLOGY ...

    Science.gov (United States)

    One goal of the field of computational toxicology is to predict chemical toxicity by combining computer models with biological and toxicological data.

  4. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  5. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

    U.S. Environmental Protection Agency — Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  6. COMPUTATIONAL TOXICOLOGY-WHERE IS THE DATA? ...

    Science.gov (United States)

    This talk will briefly describe the state of the data world for computational toxicology and one approach to improve the situation, called ACToR (Aggregated Computational Toxicology Resource).

  7. Testing of Binders Toxicological Effects

    Science.gov (United States)

    Strokova, V.; Nelyubova, V.; Rykunova, M.

    2017-11-01

    The article presents the results of a study of the toxicological effect of binders of different compositions on the vital activity of plant and animal test objects. The effect on plant cultures was analyzed on the basis of phytotesting data. The effect of binders on objects of animal origin was studied using a short-term testing method. Based on the data obtained, the binders are ranked in order of increasing toxic effect: gypsum → Portland cement → slag Portland cement. Regardless of the test-object type, the influence of the binders is due to the release of various elements (calcium ions or heavy metals) into the solution. For plant cultures, saturation of the solution with these elements has a positive effect (there is no inhibitory effect), whereas for animal specimens it increases the toxic effect.

  8. Comparative BioInformatics and Computational Toxicology

    Science.gov (United States)

    Reflecting the numerous changes in the field since the publication of the previous edition, this third edition of Developmental Toxicology focuses on the mechanisms of developmental toxicity and incorporates current technologies for testing in the risk assessment process.

  9. Applicability of Computational Systems Biology in Toxicology

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Hadrup, Niels; Audouze, Karine Marie Laure

    2014-01-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. Such analyses can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method...

  10. ACToR - Aggregated Computational Toxicology Resource

    International Nuclear Information System (INIS)

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-01-01

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  11. Evolution of Computational Toxicology-from Primitive ...

    Science.gov (United States)

    Presentation at the Health Canada seminar in Ottawa, ON, Canada on Nov. 15, 2016, on the Evolution of Computational Toxicology from Primitive Beginnings to Sophisticated Application.

  12. In Silico Toxicology – Non-Testing Methods

    Science.gov (United States)

    Raunio, Hannu

    2011-01-01

    In silico toxicology in its broadest sense means “anything that we can do with a computer in toxicology.” Many different types of in silico methods have been developed to characterize and predict toxic outcomes in humans and the environment. The term non-testing methods denotes grouping approaches, structure–activity relationships, and expert systems. These methods are already used for regulatory purposes, and it is anticipated that their role will be much more prominent in the near future. This Perspective will delineate the basic principles of non-testing methods and evaluate their role in current and future risk assessment of chemical compounds. PMID:21772821
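
    To make the read-across idea concrete, a minimal sketch follows, assuming RDKit; the SMILES strings, outcome labels, and nearest-neighbour rule are illustrative placeholders, not a validated grouping workflow.

        # Toy read-across: the untested query inherits the outcome of its most
        # structurally similar data-rich analogue (Morgan fingerprint Tanimoto).
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        knowns = {  # hypothetical source chemicals with known outcomes
            "CCO": "non-toxic",                    # ethanol
            "c1ccccc1O": "toxic",                  # phenol
            "CC(=O)Oc1ccccc1C(=O)O": "non-toxic",  # aspirin
        }
        query = "Cc1ccccc1O"  # o-cresol, the untested target chemical

        def fp(smiles):
            return AllChem.GetMorganFingerprintAsBitVect(
                Chem.MolFromSmiles(smiles), 2, nBits=2048)

        qfp = fp(query)
        best = max(knowns, key=lambda s: DataStructs.TanimotoSimilarity(qfp, fp(s)))
        sim = DataStructs.TanimotoSimilarity(qfp, fp(best))
        print(f"nearest analogue: {best} (Tanimoto {sim:.2f}) "
              f"-> read-across label: {knowns[best]}")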

  13. Computational Toxicology as Implemented by the US EPA ...

    Science.gov (United States)

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a).

  14. Molecular dynamics simulations and applications in computational toxicology and nanotoxicology.

    Science.gov (United States)

    Selvaraj, Chandrabose; Sakkiah, Sugunadevi; Tong, Weida; Hong, Huixiao

    2018-02-01

    Nanotoxicology studies the toxicity of nanomaterials and has been widely applied in biomedical research to explore the toxicity of various biological systems. Investigating biological systems through in vivo and in vitro methods is expensive and time-consuming. Therefore, computational toxicology, a multi-discipline field that utilizes computational power and algorithms to examine the toxicology of biological systems, has gained attention from scientists. Molecular dynamics (MD) simulations of biomolecules such as proteins and DNA are popular for understanding interactions between biological systems and chemicals in computational toxicology. In this paper, we review MD simulation methods, protocols for running MD simulations, and their applications in studies of toxicity and nanotechnology. We also briefly summarize some popular software tools for execution of MD simulations. Published by Elsevier Ltd.
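
    For readers unfamiliar with what an MD engine actually iterates, the sketch below shows the velocity-Verlet update at the core of most MD codes, applied to a single harmonic "bond" in plain Python with arbitrary units. This is a didactic toy, not toxicology-grade MD, which would use a package such as GROMACS, AMBER, NAMD, or OpenMM with a full biomolecular force field.

        # Velocity-Verlet integration of one harmonic degree of freedom.
        k, m, dt = 100.0, 1.0, 0.001   # force constant, mass, time step (arb. units)
        x, v = 1.2, 0.0                # initial displacement and velocity

        def force(pos):
            return -k * pos            # harmonic restoring force, F = -k x

        f = force(x)
        for step in range(1000):
            x += v * dt + 0.5 * (f / m) * dt ** 2   # position update
            f_new = force(x)
            v += 0.5 * (f + f_new) / m * dt         # velocity update (mean force)
            f = f_new

        print(f"displacement after 1000 steps: {x:.4f}")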

  15. Computational toxicology: Its essential role in reducing drug attrition.

    Science.gov (United States)

    Naven, R T; Louise-May, S

    2015-12-01

    Predictive toxicology plays a critical role in reducing the failure rate of new drugs in pharmaceutical research and development. Despite recent gains in our understanding of drug-induced toxicity, however, it is urgent that the utility and limitations of our current predictive tools be determined in order to identify gaps in our understanding of mechanistic and chemical toxicology. Using recently published computational regression analyses of in vitro and in vivo toxicology data, it will be demonstrated that significant gaps remain in early safety screening paradigms. More strategic analyses of these data sets will allow for a better understanding of their domain of applicability and help identify those compounds that cause significant in vivo toxicity but which are currently mis-predicted by in silico and in vitro models. These 'outliers' and falsely predicted compounds are metaphorical lighthouses that shine light on existing toxicological knowledge gaps, and it is essential that these compounds are investigated if attrition is to be reduced significantly in the future. As such, the modern computational toxicologist is more productively engaged in understanding these gaps and driving investigative toxicology towards addressing them. © The Author(s) 2015.

  16. Measuring Impact of EPA's Computational Toxicology Research (BOSC)

    Science.gov (United States)

    Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...

  17. Mind the Gap! A Journey towards Computational Toxicology.

    Science.gov (United States)

    Mangiatordi, Giuseppe Felice; Alberga, Domenico; Altomare, Cosimo Damiano; Carotti, Angelo; Catto, Marco; Cellamare, Saverio; Gadaleta, Domenico; Lattanzi, Gianluca; Leonetti, Francesco; Pisani, Leonardo; Stefanachi, Angela; Trisciuzzi, Daniela; Nicolotti, Orazio

    2016-09-01

    Computational methods have advanced toxicology towards the development of target-specific models based on a clear cause-effect rationale. However, the predictive potential of these models presents strengths and weaknesses. On the good side, in silico models are valuable, cheap alternatives to in vitro and in vivo experiments. On the other hand, the unconscious use of in silico methods can mislead end-users with elusive results. The focus of this review is on the basic scientific and regulatory recommendations in the derivation and application of computational models. Attention is paid to examining the interplay between computational toxicology and drug discovery and development. Avoiding the easy temptation of an overoptimistic future, we report our view on what can, or cannot, realistically be done. Indeed, studies of safety/toxicity represent a key element of chemical prioritization programs carried out by chemical industries, and primarily by pharmaceutical companies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. In silico toxicology: computational methods for the prediction of chemical toxicity

    KAUST Repository

    Raies, Arwa B.; Bajic, Vladimir B.

    2016-01-01

    Determining the toxicity of chemicals is necessary to identify their harmful effects on humans, animals, plants, or the environment. It is also one of the main steps in drug design. Animal models have been used for a long time for toxicity testing. However, in vivo animal tests are constrained by time, ethical considerations, and financial burden. Therefore, computational methods for estimating the toxicity of chemicals are considered useful. In silico toxicology is one type of toxicity assessment that uses computational methods to analyze, simulate, visualize, or predict the toxicity of chemicals. In silico toxicology aims to complement existing toxicity tests to predict toxicity, prioritize chemicals, guide toxicity tests, and minimize late-stage failures in drug design. There are various methods for generating models to predict toxicity endpoints. We provide a comprehensive overview, explain, and compare the strengths and weaknesses of the existing modeling methods and algorithms for toxicity prediction with a particular (but not exclusive) emphasis on computational tools that can implement these methods and refer to expert systems that deploy the prediction models. Finally, we briefly review a number of new research directions in in silico toxicology and provide recommendations for designing in silico models.

  19. In silico toxicology: computational methods for the prediction of chemical toxicity

    KAUST Repository

    Raies, Arwa B.

    2016-01-06

    Determining the toxicity of chemicals is necessary to identify their harmful effects on humans, animals, plants, or the environment. It is also one of the main steps in drug design. Animal models have been used for a long time for toxicity testing. However, in vivo animal tests are constrained by time, ethical considerations, and financial burden. Therefore, computational methods for estimating the toxicity of chemicals are considered useful. In silico toxicology is one type of toxicity assessment that uses computational methods to analyze, simulate, visualize, or predict the toxicity of chemicals. In silico toxicology aims to complement existing toxicity tests to predict toxicity, prioritize chemicals, guide toxicity tests, and minimize late-stage failures in drug design. There are various methods for generating models to predict toxicity endpoints. We provide a comprehensive overview, explain, and compare the strengths and weaknesses of the existing modeling methods and algorithms for toxicity prediction with a particular (but not exclusive) emphasis on computational tools that can implement these methods and refer to expert systems that deploy the prediction models. Finally, we briefly review a number of new research directions in in silico toxicology and provide recommendations for designing in silico models.
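
    One of the simplest families of predictive tools surveyed in such reviews, the expert-system structural alert, can be sketched in a few lines assuming RDKit; the alert list below is a tiny invented subset for illustration, not a validated rulebase.

        # Toy structural-alert screen: flag substructures associated with toxicity.
        from rdkit import Chem

        alerts = {  # hypothetical alert set (SMARTS patterns)
            "aromatic nitro": "[c][N+](=O)[O-]",
            "epoxide": "C1OC1",
            "aromatic amine": "[c][NX3;H2]",
        }
        mol = Chem.MolFromSmiles("O=[N+]([O-])c1ccccc1")  # nitrobenzene

        for name, smarts in alerts.items():
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts)):
                print(f"alert fired: {name}")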

  20. Animal models of toxicology testing: the role of pigs.

    Science.gov (United States)

    Helke, Kristi L; Swindle, Marvin Michael

    2013-02-01

    In regulatory toxicological testing, both a rodent and a non-rodent species are required. Historically, dogs and non-human primates (NHP) have been the species of choice for the non-rodent portion of testing. The pig is an appropriate option for these tests based on the metabolic pathways utilized in xenobiotic biotransformation. This review focuses on the Phase I and Phase II biotransformation pathways in humans and pigs and highlights the similarities and differences of these models. This is a growing field and references are sparse. Numerous breeds of pigs are discussed, along with known breed-specific differences in these enzymes. While much of the available data is presented, it is grossly incomplete and sometimes contradictory depending on the methods used. There is no ideal species to use in toxicology. The use of dogs and NHP in xenobiotic testing continues to be the norm. Pigs present a viable and perhaps more reliable model for non-rodent testing.

  1. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes of action in humans, to group them according to their modes of action, and to hypothesize on their potential effects on human health. We extracted human proteins associated with prochloraz, tebuconazole, epoxiconazole, procymidone, and mancozeb and enriched each protein set by using a high-confidence human ..., and procymidone exerted their effects mainly via interference with steroidogenesis and nuclear receptors. Prochloraz was associated with a large number of human diseases, and together with tebuconazole showed several significant associations with Testicular Dysgenesis Syndrome. Mancozeb showed a differential mode...

  2. Functional toxicology: tools to advance the future of toxicity testing

    Science.gov (United States)

    Gaytán, Brandon D.; Vulpe, Chris D.

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds—information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  3. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

    Science.gov (United States)

    Taboureau, Olivier; Audouze, Karine

    2017-01-01

    During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure have rarely been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degree of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors for human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing, supporting the 3R concept.
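
    The core construction can be sketched with networkx: build a bipartite chemical-disease graph and project it onto diseases, weighting each disease-disease edge by the number of shared chemicals. The toxicant-disease pairs below are illustrative placeholders, not TDDB records, and the plain weighted projection omits the paper's evidence-level scoring.

        # Toy environmental disease network via bipartite projection.
        import networkx as nx
        from networkx.algorithms import bipartite

        pairs = [  # (chemical, disease) associations, invented for illustration
            ("bisphenol A", "behavioral disorders"),
            ("bisphenol A", "type 2 diabetes"),
            ("arsenic", "type 2 diabetes"),
            ("arsenic", "skin lesions"),
        ]

        chemicals = {c for c, _ in pairs}
        diseases = {d for _, d in pairs}

        B = nx.Graph()
        B.add_nodes_from(chemicals, bipartite=0)
        B.add_nodes_from(diseases, bipartite=1)
        B.add_edges_from(pairs)

        # Disease-disease edge weight = number of shared associated chemicals.
        edn = bipartite.weighted_projected_graph(B, diseases)
        for u, v, w in edn.edges(data="weight"):
            print(f"{u} -- {v} (shared chemicals: {w})")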

  4. Toxicology screen

    Science.gov (United States)

    A toxicology screen refers to various tests that determine the ...

  5. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  6. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  7. Mechanized toxicological serum tests in screening hospitalized patients.

    Science.gov (United States)

    Hallbach, J; Guder, W G

    1991-09-01

    A spectrum of quantitative and qualitative methods was adapted to the RA-1000/RA-XT selective analyser for the purpose of excluding or detecting common types of intoxication in the emergency laboratory of our primary care community hospital. Ethanol and salicylates (measured photometrically) and acetaminophen (measured immunologically by EMIT tox) were quantitatively analysed in serum. Immunological group tests (EMIT tox) for barbiturates, benzodiazepines, tricyclic antidepressants and related compounds were used for qualitative analysis. Well-established clinical chemical methods (aspartate aminotransferase, alanine aminotransferase, creatine kinase, pseudocholinesterase, glucose and lactate) were applied to the serum samples using the same selective analyser. Within- and between-run precision, accuracy, recovery and detection ranges (linearity) fulfilled the recommendations of forefield toxicological analysis for all methods. Ethanol (g/l), measured photometrically with the RA-1000 analyser, agreed with the reference method (headspace gas chromatography) with a correlation coefficient greater than 0.99 (y = 0.06 + 0.98x). Acetaminophen and salicylates showed correlation coefficients greater than 0.94 and greater than 0.99, when compared with manual colorimetric procedures (acetaminophen (mg/l): y = -3.22 + 0.896x; salicylates (mg/l): y = -2.1 + 1x). Qualitative group tests for barbiturates, benzodiazepines and tricyclic antidepressants measured with the RA-1000 analyser were in good agreement with the EMIT single test procedure. The ranges of the quantitative methods allowed quantification of analytes from therapeutic (non-toxic) to very high levels in undiluted samples (ethanol 0.05 up to 4 g/l; salicylates 32 up to 1200 mg/l and acetaminophen 1.9 up to 200 mg/l). The low detection limits of the qualitative tests allowed the recognition of compounds in plasma that were present in low concentrations and/or displayed only minor reactivity with the antibodies.
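
    The agreement statistics quoted above (a correlation coefficient and a regression line y = a + bx against the reference method) can be reproduced for any paired measurements; a minimal SciPy sketch with made-up values follows.

        # Method comparison: regression of analyser results on reference values.
        from scipy.stats import linregress

        reference = [0.2, 0.5, 1.0, 1.8, 2.5, 3.4]       # e.g. headspace GC, g/l
        analyser = [0.25, 0.55, 1.02, 1.83, 2.46, 3.40]  # e.g. RA-1000, g/l

        fit = linregress(reference, analyser)
        print(f"y = {fit.intercept:.2f} + {fit.slope:.2f}x, r = {fit.rvalue:.3f}")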

  8. Computational Embryology and Predictive Toxicology of Cleft Palate

    Science.gov (United States)

    Capacity to model and simulate key events in developmental toxicity using computational systems biology and biological knowledge steps closer to hazard identification across the vast landscape of untested environmental chemicals. In this context, we chose cleft palate as a model ...

  9. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology.

    Science.gov (United States)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice

    2017-02-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  10. Animals and the 3Rs in toxicology research and testing: The way forward.

    Science.gov (United States)

    Stokes, W S

    2015-12-01

    Despite efforts to eliminate the use of animals in testing and the availability of many accepted alternative methods, animals are still widely used for toxicological research and testing. While research using in vitro and computational models has dramatically increased in recent years, such efforts have not yet measurably impacted animal use for regulatory testing and are not likely to do so for many years or even decades. Until regulatory authorities have accepted test methods that can totally replace animals and these are fully implemented, large numbers of animals will continue to be used and many will continue to experience significant pain and distress. In order to positively impact the welfare of these animals, accepted alternatives must be implemented, and efforts must be directed at eliminating pain and distress and reducing animal numbers. Animal pain and distress can be reduced by earlier predictive humane endpoints, pain-relieving medications, and supportive clinical care, while sequential testing and routine use of integrated testing and decision strategies can reduce animal numbers. Applying advances in science and technology to the development of scientifically sound alternative testing models and strategies can improve animal welfare and further reduce and replace animal use. © The Author(s) 2015.

  11. DIGGING DEEPER INTO DEEP DATA: MOLECULAR DOCKING AS A HYPOTHESIS-DRIVEN BIOPHYSICAL INTERROGATION SYSTEM IN COMPUTATIONAL TOXICOLOGY.

    Science.gov (United States)

    Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US EPA's computational toxicology program.

  12. Current Knowledge on the Use of Computational Toxicology in Hazard Assessment of Metallic Engineered Nanomaterials

    Directory of Open Access Journals (Sweden)

    Guangchao Chen

    2017-07-01

    Full Text Available As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) include the integration and evaluation of toxicity data, categorization and labeling of ENMs, and derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure–activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. Thereupon, the species sensitivity distribution (SSD) approach is able to serve the establishment of ENM hazard thresholds sufficiently protecting the ecosystem. This article critically reviews the current knowledge on the development of in silico models in predicting and classifying the hazard of metallic ENMs, and the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.

  13. Current Knowledge on the Use of Computational Toxicology in Hazard Assessment of Metallic Engineered Nanomaterials.

    Science.gov (United States)

    Chen, Guangchao; Peijnenburg, Willie; Xiao, Yinlong; Vijver, Martina G

    2017-07-12

    As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) include the integration and evaluation of toxicity data, categorization and labeling of ENMs, and derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure-activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. Thereupon, the species sensitivity distribution (SSD) approach is able to serve the establishment of ENM hazard thresholds sufficiently protecting the ecosystem. This article critically reviews the current knowledge on the development of in silico models in predicting and classifying the hazard of metallic ENMs, and the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.
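
    The SSD step lends itself to a compact sketch: fit a log-normal distribution to per-species toxicity values and read off the HC5, the concentration expected to affect no more than 5% of species. The EC50 values below are invented placeholders, and the sketch (assuming SciPy) omits the goodness-of-fit checks and confidence bounds a real derivation would include.

        # Toy species sensitivity distribution: log-normal fit and HC5.
        import numpy as np
        from scipy import stats

        ec50_mg_per_l = np.array([0.8, 1.5, 2.3, 4.0, 6.5, 9.1, 15.0, 22.0])

        log_vals = np.log(ec50_mg_per_l)
        mu, sigma = log_vals.mean(), log_vals.std(ddof=1)

        # 5th percentile of the fitted distribution = hazardous concentration HC5.
        hc5 = float(np.exp(stats.norm.ppf(0.05, loc=mu, scale=sigma)))
        print(f"HC5 = {hc5:.2f} mg/l")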

  14. Integrated Testing Strategy (ITS) - Opportunities to better use existing data and guide future testing in toxicology.

    Science.gov (United States)

    Jaworska, Joanna; Hoffmann, Sebastian

    2010-01-01

    The topic of Integrated Testing Strategies (ITS) has attracted considerable attention, and not only because it is supposed to be a central element of REACH, the ambitious European chemical regulation effort. Although what ITSs are supposed to do seems unambiguous, i.e. speeding up hazard and risk assessment while reducing testing costs, not much has been said, except basic conceptual proposals, about the methodologies that would allow execution of these concepts. Although a pressing concern, the topic of ITS has drawn mostly general reviews, broad concepts, and the expression of a clear need for more research on ITS. Published research in the field remains scarce. Solutions for ITS design emerge slowly, most likely due to the methodological challenges of the task, and perhaps also to its complexity and the need for multidisciplinary collaboration. Along with the challenge, ITS offer a unique opportunity to contribute to the Toxicology of the 21st century by providing frameworks and tools to actually implement 21st century toxicology data in the chemical management and decision making processes. Further, ITS have the potential to significantly contribute to a modernization of the science of risk assessment. Therefore, to advance ITS research we propose a methodical approach to their design and will discuss currently available approaches as well as challenges to overcome. To this end, we define a framework for ITS that will inform toxicological decisions in a systematic, transparent, and consistent way. We review conceptual requirements for ITS developed earlier and present a roadmap to an operational framework that should be probabilistic, hypothesis-driven, and adaptive. Furthermore, we define properties an ITS should have in order to meet the identified requirements and differentiate them from evidence synthesis. Making use of an ITS for skin sensitization, we demonstrate how the proposed ITS concepts can be implemented.
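
    The probabilistic, hypothesis-driven, adaptive character the authors call for can be caricatured as sequential Bayesian updating: each assay result revises the probability of the hazard hypothesis until a decision threshold is crossed. In the sketch below the prior, the assay sensitivities and specificities, and the thresholds are all invented for illustration; they are not values proposed by the authors.

        # Toy adaptive ITS: update P(hazard) after each assay via Bayes' rule.
        def bayes_update(prior, positive, sensitivity, specificity):
            if positive:
                evidence = sensitivity * prior + (1 - specificity) * (1 - prior)
                return sensitivity * prior / evidence
            evidence = (1 - sensitivity) * prior + specificity * (1 - prior)
            return (1 - sensitivity) * prior / evidence

        assays = [  # (name, sensitivity, specificity, observed result)
            ("in silico alert", 0.70, 0.75, True),
            ("in chemico assay", 0.80, 0.85, True),
            ("in vitro assay", 0.85, 0.80, False),
        ]

        p = 0.30  # prior probability that the chemical is a sensitizer
        for name, sens, spec, result in assays:
            p = bayes_update(p, result, sens, spec)
            print(f"after {name}: P(sensitizer) = {p:.2f}")
            if p > 0.95 or p < 0.05:
                print("decision threshold reached; stop testing")
                break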

  15. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    Science.gov (United States)

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly

  16. Translating Computational Toxicology Data Through Stakeholder Outreach & Engagement (SOT)

    Science.gov (United States)

    US EPA has been using in vitro testing methods in an effort to accelerate the pace of chemical evaluations and address the significant lack of health and environmental data on the thousands of chemicals found in commonly used products. Since 2005, EPA’s researchers have generated...

  17. The Computerized Laboratory Notebook concept for genetic toxicology experimentation and testing.

    Science.gov (United States)

    Strauss, G H; Stanford, W L; Berkowitz, S J

    1989-03-01

    We describe a microcomputer system utilizing the Computerized Laboratory Notebook (CLN) concept developed in our laboratory for the purpose of automating the Battery of Leukocyte Tests (BLT). The BLT was designed to evaluate blood specimens for toxic, immunotoxic, and genotoxic effects after in vivo exposure to putative mutagens. A system was developed with the advantages of low cost, limited spatial requirements, ease of use for personnel inexperienced with computers, and applicability to specific testing yet flexibility for experimentation. This system eliminates cumbersome record keeping and repetitive analysis inherent in genetic toxicology bioassays. Statistical analysis of the vast quantity of data produced by the BLT would not be feasible without a central database. Our central database is maintained by an integrated package which we have adapted to develop the CLN. The clonal assay of lymphocyte mutagenesis (CALM) section of the CLN is demonstrated. PC-Slaves expand the microcomputer to multiple workstations so that our computerized notebook can be used next to a hood while other work is done in an office and instrument room simultaneously. Communication with peripheral instruments is an indispensable part of many laboratory operations, and we present a representative program, written to acquire and analyze CALM data, for communicating with both a liquid scintillation counter and an ELISA plate reader. In conclusion we discuss how our computer system could easily be adapted to the needs of other laboratories.

  18. Diagnostic yield of hair and urine toxicology testing in potential child abuse cases.

    Science.gov (United States)

    Stauffer, Stephanie L; Wood, Stephanie M; Krasowski, Matthew D

    2015-07-01

    Detection of drugs in a child may be the first objective finding that can be reported in cases of suspected child abuse. Hair and urine toxicology testing, when performed as part of the initial clinical evaluation for suspected child abuse or maltreatment, may serve to facilitate the identification of at-risk children. Furthermore, significant environmental exposure to a drug (considered by law to constitute child abuse in some states) may be identified by toxicology testing of unwashed hair specimens. In order to determine the clinical utility of hair and urine toxicology testing in this population we performed a retrospective chart review on all children for whom hair toxicology testing was ordered at our academic medical center between January 2004 and April 2014. The medical records of 616 children aged 0-17.5 years were reviewed for injury history, previous medication and illicit drug use by caregiver(s), urine drug screen result (if performed), hair toxicology result, medication list, and outcome of any child abuse evaluation. Hair toxicology testing was positive for at least one compound in 106 cases (17.2%), with unexplained drugs in 82 cases (13.3%). Of these, there were 48 cases in which multiple compounds (including combination of parent drugs and/or metabolites within the same drug class) were identified in the sample of one patient. The compounds most frequently identified in the hair of our study population included cocaine, benzoylecgonine, native (unmetabolized) tetrahydrocannabinol, and methamphetamine. There were 68 instances in which a parent drug was identified in the hair without any of its potential metabolites, suggesting environmental exposure. Among the 82 cases in which hair toxicology testing was positive for unexplained drugs, a change in clinical outcome was noted in 71 cases (86.5%). Urine drug screens (UDS) were performed in 457 of the 616 reviewed cases. Of these, over 95% of positive UDS results could be explained by iatrogenic drug

  19. Forensic Toxicology: An Introduction.

    Science.gov (United States)

    Smith, Michael P; Bluth, Martin H

    2016-12-01

    This article presents an overview of forensic toxicology. The authors describe the three components that make up forensic toxicology: workplace drug testing, postmortem toxicology, and human performance toxicology. Also discussed are the specimens that are tested, the methods used, and how the results are interpreted in this particular discipline. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Development and Application of Computational/In Vitro Toxicological Methods for Chemical Hazard Risk Reduction of New Materials for Advanced Weapon Systems

    Science.gov (United States)

    Frazier, John M.; Mattie, D. R.; Hussain, Saber; Pachter, Ruth; Boatz, Jerry; Hawkins, T. W.

    2000-01-01

    The development of quantitative structure-activity relationships (QSARs) is essential for reducing the chemical hazards of new weapon systems. The current collaboration between HEST (toxicology research and testing), MLPJ (computational chemistry) and PRS (computational chemistry, new propellant synthesis) is focusing R&D efforts on basic research goals that will rapidly transition to useful products for propellant development. Computational methods are being investigated that will assist in forecasting cellular toxicological end-points. Models developed from these chemical structure-toxicity relationships are useful for the prediction of the toxicological endpoints of new related compounds. Research is focusing on the evaluation tools to be used for the discovery of such relationships and the development of models of the mechanisms of action. Combinations of computational chemistry techniques, in vitro toxicity methods, and statistical correlations will be employed to develop and explore potential predictive relationships; results for series of molecular systems that demonstrate the viability of this approach are reported. A number of hydrazine salts have been synthesized for evaluation. Computational chemistry methods are being used to elucidate the mechanism of action of these salts. Toxicity endpoints such as viability (LDH) and changes in enzyme activity (glutathione peroxidase and catalase) are being experimentally measured as indicators of cellular damage. Extrapolation from computational/in vitro studies to human toxicity is the ultimate goal. The product of this program will be a predictive tool to assist in the development of new, less toxic propellants.

  1. Green toxicology.

    Science.gov (United States)

    Maertens, Alexandra; Anastas, Nicholas; Spencer, Pamela J; Stephens, Martin; Goldberg, Alan; Hartung, Thomas

    2014-01-01

    Historically, early identification and characterization of adverse effects of industrial chemicals was difficult because conventional toxicological test methods did not meet R&D needs for rapid, relatively inexpensive methods amenable to small amounts of test material. The pharmaceutical industry now front-loads toxicity testing, using in silico, in vitro, and less demanding animal tests at earlier stages of product development to identify and anticipate undesirable toxicological effects and optimize product development. The Green Chemistry movement embraces similar ideas for development of less toxic products, safer processes, and less waste and exposure. Further, the concept of benign design suggests ways to consider possible toxicities before the actual synthesis and to apply some structure/activity rules (SAR) and in silico methods. This requires not only scientific development but also a change in corporate culture in which synthetic chemists work with toxicologists. An emerging discipline called Green Toxicology (Anastas, 2012) provides a framework for integrating the principles of toxicology into the enterprise of designing safer chemicals, thereby minimizing potential toxicity as early in production as possible. Green Toxicology's novel utility lies in driving innovation by moving safety considerations to the earliest stage in a chemical's lifecycle, i.e., to molecular design. In principle, this field is no different than other subdisciplines of toxicology that endeavor to focus on a specific area - for example, clinical, environmental or forensic toxicology. We use the same principles and tools to evaluate an existing substance or to design a new one. The unique emphasis is in using 21st century toxicology tools as a preventative strategy to "design out" undesired human health and environmental effects, thereby increasing the likelihood of launching a successful, sustainable product. Starting with the formation of a steering group and a series of workshops

  2. Delivering The Benefits of Chemical-Biological Integration in Computational Toxicology at the EPA (ACS Fall meeting)

    Science.gov (United States)

    Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intent...

  3. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  4. [Non-animal toxicology in the safety testing of chemicals].

    Science.gov (United States)

    Heinonen, Tuula; Tähti, Hanna

    2013-01-01

    There is an urgent need to develop predictive test methods better than animal experiments for assessing the safety of chemical substances to man. According to today's vision, this is achieved by using human cell based tissue and organ models. In the new testing strategy, toxic effects are assessed by the changes in the critical parameters of cellular biochemical routes (the adverse outcome pathway (AOP) principle) in the target tissues. In vitro tests are rapid and effective, and automation can be applied to them. The change in the testing paradigm is supported by all stakeholders: scientists, regulators and people concerned about animal welfare.

  5. Toxicology Testing in the 21st Century (Tox21)

    Science.gov (United States)

    Tox21 researchers aim to develop better toxicity assessment methods to quickly and efficiently test whether certain chemical compounds have the potential to disrupt processes in the human body that may lead to negative health effects.

  6. Forensic toxicology.

    Science.gov (United States)

    Davis, Gregory G

    2012-01-01

    Toxicologic analysis is an integral part of death investigation, and the use or abuse of an unsuspected substance belongs in the differential diagnosis of patients who have a sudden, unexpected change in their condition. History and physical findings may raise suspicion that intoxication played a role in a patient's decline or death, but suspicions cannot be confirmed unless toxicologic analysis is performed, and no toxicologic analysis is possible unless someone collects the proper specimens necessary for analysis. In a hospital autopsy the only specimens that can rightfully be collected are those within the restrictions stated in the autopsy permit. Autopsies performed by the medical examiner do not have these restrictions. Sometimes the importance of toxicologic testing in a case is not evident until days or weeks after the change in the patient's status, thus retaining the appropriate specimens until investigation of that case has ended is important. Proper interpretation of toxicologic findings requires integrating the clinical setting and findings with the toxicologic results in a way that makes medical sense. If called upon to testify concerning findings, answer the questions truthfully, politely, and in a way that is understandable to someone who has no special training in toxicology.

  7. Chemical products toxicological tests performed on lake and river fish

    International Nuclear Information System (INIS)

    Teulon, F.; Simeon, C.

    1966-01-01

    The volume and toxicity of industrial and urban effluents are increasing, and the hazard of acute or chronic pollution is proportionally increased. Hence it is necessary to determine the minimum lethal dose of the effluent components for fish (one hour or six hours, according to applicable standards). The following tests are described in this report: toxicity of some chemical products, tested individually (sodium sulphate, sodium chloride, sodium fluoride, etc.); toxicity of some metal ions (Al3+, Fe2+, Fe3+, Pb2+, etc.); toxicity of certain mixed compounds for various fish species (sun perch, tench, goldfish, roach, gudgeon, bleak). The test results obtained represent local values and may be used for reference and as a general basis for further investigation and for calculation of effluent data at the time of release. (author) [fr]

  8. Emerging approaches in predictive toxicology.

    Science.gov (United States)

    Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan

    2014-12-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.

  9. Handbook of safety assessment of nanomaterials from toxicological testing to personalized medicine

    CERN Document Server

    Fadeel, Bengt

    2014-01-01

    "The Handbook of Safety Assessment of Nanomaterials: From Toxicological Testing to Personalized Medicine provides a comprehensive overview of the state of the art of nanotoxicology and is a unique resource that fills up many knowledge gaps in the toxicity issue of nanomaterials in medical applications. The book is distinguished by up-to-date insights into creating a science-based framework for safety assessment of nanomedicines." -Prof. Yuliang Zhao, National Center for Nanosciences and Technology, China.

  10. [Adverse Effect Predictions Based on Computational Toxicology Techniques and Large-scale Databases].

    Science.gov (United States)

    Uesawa, Yoshihiro

    2018-01-01

    Understanding the features of chemical structures related to the adverse effects of drugs is useful for identifying potential adverse effects of new drugs. This can be based on the limited information available from post-marketing surveillance, assessment of the potential toxicities of metabolites and illegal drugs with unclear characteristics, screening of lead compounds at the drug discovery stage, and identification of leads for the discovery of new pharmacological mechanisms. The present paper describes techniques used in computational toxicology to investigate the content of large-scale spontaneous report databases of adverse effects, illustrated with examples. Furthermore, volcano plotting, a new visualization method for clarifying the relationships between drugs and adverse effects via comprehensive analyses, is introduced. These analyses may produce a great amount of data that can be applied to drug repositioning.
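
    The volcano plot rests on standard disproportionality statistics: for each drug-adverse event pair in a spontaneous-report database, compute a reporting odds ratio (ROR) and a p-value, then plot log2(ROR) against -log10(p). A minimal sketch with invented report counts follows, assuming SciPy; the paper's exact statistics may differ.

        # One point of a volcano plot from a 2x2 report-count table:
        #               event   no event
        # drug            a        b
        # all others      c        d
        import math
        from scipy.stats import fisher_exact

        a, b, c, d = 40, 960, 200, 48800  # invented counts

        ror = (a / b) / (c / d)           # reporting odds ratio
        _, p = fisher_exact([[a, b], [c, d]])
        print(f"ROR = {ror:.2f}, volcano coordinates: "
              f"({math.log2(ror):.2f}, {-math.log10(p):.2f})")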

  11. Micro-computed tomography imaging and analysis in developmental biology and toxicology.

    Science.gov (United States)

    Wise, L David; Winkelmann, Christopher T; Dogdas, Belma; Bagchi, Ansuman

    2013-06-01

    Micro-computed tomography (micro-CT) is a high resolution imaging technique that has expanded and strengthened in use since it was last reviewed in this journal in 2004. The technology has expanded to include more detailed analysis of bone, as well as soft tissues, by use of various contrast agents. It is increasingly applied to questions in developmental biology and developmental toxicology. Relatively high-throughput protocols now provide a powerful and efficient means to evaluate embryos and fetuses subjected to genetic manipulations or chemical exposures. This review provides an overview of the technology, including scanning, reconstruction, visualization, segmentation, and analysis of micro-CT generated images. This is followed by a review of more recent applications of the technology in some common laboratory species that highlight the diverse issues that can be addressed. Copyright © 2013 Wiley Periodicals, Inc.

  12. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  13. Hair testing of GHB: an everlasting issue in forensic toxicology.

    Science.gov (United States)

    Busardò, Francesco Paolo; Pichini, Simona; Zaami, Simona; Pacifici, Roberta; Kintz, Pascal

    2018-01-26

    In this paper, the authors present a critical review of different studies regarding hair testing of endogenous γ-hydroxybutyrate (GHB), concentrations in chronic users, and values measured after a single GHB exposure in drug-facilitated sexual assault (DFSA) cases, together with the role of a recently identified GHB metabolite, GHB-glucuronide. The following databases (up to March 2017) were used: PubMed, Scopus and Web of Science, searching the following key words: γ-hydroxybutyrate, GHB, GHB glucuronide, hair. The main key words "GHB" and "γ-hydroxybutyrate" were searched singly and then associated individually with each of the other keywords. Of the 2304 sources found, only 20 were considered appropriate for the purpose of this paper. Summing up all the studies investigating endogenous GHB concentration in hair, a very broad concentration range from 0 to 12 ng/mg was found. To detect a single GHB dose in hair, it is commonly necessary to wait one month before collecting the hair; segmental analysis of 3 or 5 mm fragments, with calculation of a ratio between the targeted segment and the others, represents a reliable method to detect a single GHB intake, the presently proposed ratios varying from 3 to 10. The only two studies performed so far investigating GHB-glucuronide in hair show that the latter does not seem to provide any diagnostic information regarding GHB exposure. A practical operative protocol is proposed for application in all suspected cases of GHB-facilitated sexual assault (GHB-FSA).
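
    A minimal sketch of the segmental-ratio criterion described in this record, comparing the targeted hair segment against the remaining segments. The concentrations and the helper function are invented for illustration; the cut-off (between 3 and 10 in the proposals cited above) is a choice left to the analyst.

    ```python
    # Ratio of the targeted segment to the mean of the remaining segments,
    # as used to flag a single GHB intake in segmental hair analysis.
    def segment_ratio(concentrations, target_index):
        others = [c for i, c in enumerate(concentrations) if i != target_index]
        return concentrations[target_index] / (sum(others) / len(others))

    # GHB in ng/mg for five 3-mm segments (proximal to distal), hypothetical:
    segments = [1.2, 1.1, 9.8, 1.3, 1.0]
    ratio = segment_ratio(segments, target_index=2)
    print(f"segment ratio = {ratio:.1f}")  # > 3 would support a single intake
    ```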

  14. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2016-12-19

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.

  15. The FAA's postmortem forensic toxicology self-evaluated proficiency test program: the second seven years.

    Science.gov (United States)

    Chaturvedi, Arvind K; Craft, Kristi J; Cardona, Patrick S; Rogers, Paul B; Canfield, Dennis V

    2009-05-01

    During toxicological evaluations of samples from fatally injured pilots involved in civil aviation accidents, a high degree of quality control/quality assurance (QC/QA) is maintained. Under this philosophy, the Federal Aviation Administration (FAA) started a forensic toxicology proficiency-testing (PT) program in July 1991. In continuation of the first seven years of the PT findings reported earlier, PT findings of the next seven years are summarized herein. Twenty-eight survey samples (12 urine, 9 blood, and 7 tissue homogenate) with/without alcohols/volatiles, drugs, and/or putrefactive amine(s) were submitted to an average of 31 laboratories, of which an average of 25 participants returned their results. Analytes in survey samples were correctly identified and quantitated by a large number of participants, but some false positives of concern were reported. It is anticipated that the FAA's PT program will continue to serve the forensic toxicology community through this important part of the QC/QA for laboratory accreditations.

  16. Integrative Systems Biology Applied to Toxicology

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning

    associated with combined exposure to multiple chemicals. Testing all possible combinations of the tens of thousands of environmental chemicals is impractical. This PhD project was launched to apply existing computational systems biology methods to toxicological research. In this thesis, I present in three...... of a system, thereby suggesting new ways of thinking about specific toxicological endpoints. Furthermore, computational methods can serve as valuable input for the hypothesis-generating phase of the preparation of a research project....

  18. Use of animals for toxicology testing is necessary to ensure patient safety in pharmaceutical development.

    Science.gov (United States)

    Mangipudy, Raja; Burkhardt, John; Kadambi, Vivek J

    2014-11-01

    There is an active debate in toxicology literature about the utility of animal testing vis-a-vis alternative in vitro paradigms. To provide a balanced perspective and add to this discourse it is important to review the current paradigms, explore pros and cons of alternatives, and provide a vision for the future. The fundamental goal of toxicity testing is to ensure safety in humans. In this article, IQ Consortium DruSafe, while submitting the view that nonclinical testing in animals is an important and critical component of the risk assessment paradigm in developing new drugs, also discusses its views on alternative approaches including a roadmap for what would be required to enhance the utilization of alternative approaches in the safety assessment process. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Evaluation of sampling methods for toxicological testing of indoor air particulate matter.

    Science.gov (United States)

    Tirkkonen, Jenni; Täubel, Martin; Hirvonen, Maija-Riitta; Leppänen, Hanna; Lindsley, William G; Chen, Bean T; Hyvärinen, Anne; Huttunen, Kati

    2016-09-01

    There is a need for toxicity tests capable of recognizing indoor environments with compromised air quality, especially in the context of moisture damage. One of the key issues is sampling, which should both provide meaningful material for analyses and fulfill requirements imposed by practitioners using toxicity tests for health risk assessment. We aimed to evaluate different existing methods of sampling indoor particulate matter (PM) to develop a suitable sampling strategy for a toxicological assay. During three sampling campaigns in moisture-damaged and non-damaged school buildings, we evaluated one passive and three active sampling methods: the Settled Dust Box (SDB), the Button Aerosol Sampler, the Harvard Impactor and the National Institute for Occupational Safety and Health (NIOSH) Bioaerosol Cyclone Sampler. Mouse RAW264.7 macrophages were exposed to particle suspensions and cell metabolic activity (CMA), production of nitric oxide (NO) and tumor necrosis factor (TNFα) were determined after 24 h of exposure. The repeatability of the toxicological analyses was very good for all tested sampler types. Variability within the schools was found to be high especially between different classrooms in the moisture-damaged school. Passively collected settled dust and PM collected actively with the NIOSH Sampler (Stage 1) caused a clear response in exposed cells. The results suggested the higher relative immunotoxicological activity of dust from the moisture-damaged school. The NIOSH Sampler is a promising candidate for the collection of size-fractionated PM to be used in toxicity testing. The applicability of such sampling strategy in grading moisture damage severity in buildings needs to be developed further in a larger cohort of buildings.

  20. Terrestrial Eco-Toxicological Tests as Screening Tool to Assess Soil Contamination in Krompachy Area

    Science.gov (United States)

    Ol'ga, Šestinová; Findoráková, Lenka; Hančuľák, Jozef; Fedorová, Erika; Tomislav, Špaldon

    2016-10-01

    In this study, we present a screening tool for assessing heavy metal inputs to agricultural and permanent grassland soils in Krompachy. The study applies ecotoxicity tests, a terrestrial plant test (a modification of OECD 208, the Phytotoxkit microbiotest on Sinapis alba) and chronic earthworm tests (Dendrobaena veneta, a modification of OECD Guideline for the Testing of Chemicals 317, Bioaccumulation in Terrestrial Oligochaetes), as practical and sensitive screening methods for assessing the effects of heavy metals in Krompachy soils. Total Cu, Zn, As, Pb and Hg concentrations and eco-toxicological tests were determined for soils from 4 sampling sites in the Krompachy area in 2015. An influence of the distance of the sampling sites from the copper smeltery on the absolute concentrations of copper, lead, zinc, arsenic and mercury was recorded; the highest concentrations of these metals were detected at sampling sites up to 3 km from the smeltery. The soil samples were used to assess phytotoxic effects, and total earthworm mortality was determined using a chronic toxicity test after 7 days of exposure. No earthworm mortality was observed in any of the study soils, whereas the phytotoxicity testing revealed phytotoxic effects of the metal-contaminated soils from samples 3KR (7-9) on S. alba seeds.

  1. Fluorescence in situ hybridization in combination with the comet assay and micronucleus test in genetic toxicology

    Directory of Open Access Journals (Sweden)

    Hovhannisyan Galina G

    2010-09-01

    The comet assay and the micronucleus (MN) test are widely applied in genotoxicity testing and biomonitoring. While the comet assay permits direct measurement of the DNA-strand-breaking capacity of a tested agent, the MN test allows estimation of the induced amount of chromosome and/or genome mutations. The potential of these two methods can be enhanced by combination with fluorescence in situ hybridization (FISH) techniques. FISH combined with the comet assay allows direct recognition of the targets of DNA damage and repair. FISH combined with the MN test can characterize the occurrence of different chromosomes in MN and identify potential chromosomal targets of mutagenic substances. Thus, combining FISH with the comet assay or MN test has proved to be a promising approach for evaluating the distribution of DNA and chromosome damage across the entire genome of individual cells. The FISH technique also permits the study of comet and MN formation, which is necessary for correct application of these methods. This paper reviews the relevant literature on the advantages and limitations of applying Comet-FISH and MN-FISH assays in genetic toxicology.

  2. Towards improved behavioural testing in aquatic toxicology: Acclimation and observation times are important factors when designing behavioural tests with fish.

    Science.gov (United States)

    Melvin, Steven D; Petit, Marie A; Duvignacq, Marion C; Sumpter, John P

    2017-08-01

    The quality and reproducibility of science has recently come under scrutiny, with criticisms spanning disciplines. In aquatic toxicology, behavioural tests are currently an area of controversy since inconsistent findings have been highlighted and attributed to poor quality science. The problem likely relates to limitations to our understanding of basic behavioural patterns, which can influence our ability to design statistically robust experiments yielding ecologically relevant data. The present study takes a first step towards understanding baseline behaviours in fish, including how basic choices in experimental design might influence behavioural outcomes and interpretations in aquatic toxicology. Specifically, we explored how fish acclimate to behavioural arenas and how different lengths of observation time impact estimates of basic swimming parameters (i.e., average, maximum and angular velocity). We performed a semi-quantitative literature review to place our findings in the context of the published literature describing behavioural tests with fish. Our results demonstrate that fish fundamentally change their swimming behaviour over time, and that acclimation and observational timeframes may therefore have implications for influencing both the ecological relevance and statistical robustness of behavioural toxicity tests. Our review identified 165 studies describing behavioural responses in fish exposed to various stressors, and revealed that the majority of publications documenting fish behavioural responses report extremely brief acclimation times and observational durations, which helps explain inconsistencies identified across studies. We recommend that researchers applying behavioural tests with fish, and other species, apply a similar framework to better understand baseline behaviours and the implications of design choices for influencing study outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
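
    As a rough illustration of the basic swimming parameters named in this record (average, maximum and angular velocity), the sketch below computes them from a tracked x/y trajectory. The column layout, sampling rate and synthetic trajectory are assumptions for the sketch, not the authors' tracking setup.

    ```python
    import numpy as np

    def swimming_parameters(xy, fps=25.0):
        """xy: (n, 2) array of positions in cm; returns per-second parameters."""
        d = np.diff(xy, axis=0)                      # frame-to-frame displacement
        speed = np.linalg.norm(d, axis=1) * fps      # cm/s per frame interval
        heading = np.arctan2(d[:, 1], d[:, 0])       # direction of motion, rad
        turn = np.diff(heading)
        turn = (turn + np.pi) % (2 * np.pi) - np.pi  # wrap turns to [-pi, pi]
        angular_velocity = np.abs(turn) * fps        # rad/s
        return speed.mean(), speed.max(), angular_velocity.mean()

    # Synthetic random-walk trajectory standing in for tracked fish positions.
    xy = np.cumsum(np.random.default_rng(0).normal(0, 0.1, size=(500, 2)), axis=0)
    avg_v, max_v, ang_v = swimming_parameters(xy)
    print(f"avg {avg_v:.2f} cm/s, max {max_v:.2f} cm/s, angular {ang_v:.2f} rad/s")
    ```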

  3. Self-Testing Computer Memory

    Science.gov (United States)

    Chau, Savio, N.; Rennels, David A.

    1988-01-01

    Memory system for computer repeatedly tests itself during brief, regular interruptions of normal processing of data. Detects and corrects transient faults such as single-event upsets (changes in bits due to ionizing radiation) within milliseconds after they occur. Self-testing concept surpasses conventional designs by actively flushing latent defects out of memory and attempting to correct them before they accumulate beyond the capacity for self-correction or detection. Cost of the improvement is a modest increase in complexity of circuitry and operating time.
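
    The record does not give the memory design itself, so the sketch below illustrates only the general scrubbing idea: during brief, regular interruptions, every location is re-checked and single upsets are corrected before they can accumulate. Triple redundancy with majority voting stands in here for the ECC hardware a real system would use.

    ```python
    # Toy self-testing memory: three redundant copies, periodically scrubbed.
    class SelfTestingMemory:
        def __init__(self, size):
            self.copies = [[0] * size for _ in range(3)]

        def write(self, addr, word):
            for copy in self.copies:
                copy[addr] = word

        def scrub(self):
            """Periodic self-test: majority-vote each address and rewrite it."""
            corrected = 0
            for addr in range(len(self.copies[0])):
                votes = [copy[addr] for copy in self.copies]
                majority = max(set(votes), key=votes.count)
                for copy in self.copies:
                    if copy[addr] != majority:
                        copy[addr] = majority  # flush the latent defect
                        corrected += 1
            return corrected

    mem = SelfTestingMemory(size=8)
    mem.write(3, 0xAB)
    mem.copies[1][3] ^= 0x04       # inject a single-event upset in one copy
    print(mem.scrub())             # -> 1 fault found and corrected
    ```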

  4. Computer controlled testing of batteries

    NARCIS (Netherlands)

    Kuiper, A.C.J.; Einerhand, R.E.F.; Visscher, W.

    1989-01-01

    A computerized testing device for batteries consists of a power supply, a multiplexer circuit connected to the batteries, a protection circuit, and an IBM Data Acquisition and Control Adapter card, connected to a personal computer. The software is written in Turbo-Pascal and can be easily adapted to

  5. Has Toxicity Testing Moved into the 21st Century? A Survey and Analysis of Perceptions in the Field of Toxicology.

    Science.gov (United States)

    Zaunbrecher, Virginia; Beryt, Elizabeth; Parodi, Daniela; Telesca, Donatello; Doherty, Joseph; Malloy, Timothy; Allard, Patrick

    2017-08-30

    Ten years ago, leaders in the field of toxicology called for a transformation of the discipline and a shift from primarily relying on traditional animal testing to incorporating advances in biotechnology and predictive methodologies into alternative testing strategies (ATS). Governmental agencies and academic and industry partners initiated programs to support such a transformation, but a decade later, the outcomes of these efforts are not well understood. We aimed to assess the use of ATS and the perceived barriers and drivers to their adoption by toxicologists and by others working in, or closely linked with, the field of toxicology. We surveyed 1,381 toxicologists and experts in associated fields regarding the viability and use of ATS and the perceived barriers and drivers of ATS for a range of applications. We performed ranking, hierarchical clustering, and correlation analyses of the survey data. Many respondents indicated that they were already using ATS, or believed that ATS were already viable approaches, for toxicological assessment of one or more end points in their primary area of interest or concern (26-86%, depending on the specific ATS/application pair). However, the proportions of respondents reporting use of ATS in the previous 12 mo were smaller (4.5-41%). Concern about regulatory acceptance was the most commonly cited factor inhibiting the adoption of ATS, and a variety of technical concerns were also cited as significant barriers to ATS viability. The factors most often cited as playing a significant role (currently or in the future) in driving the adoption of ATS were the need for expedited toxicology information, the need for reduced toxicity testing costs, demand by regulatory agencies, and ethical or moral concerns. Our findings indicate that the transformation of the field of toxicology is partly implemented, but significant barriers to acceptance and adoption remain. https://doi.org/10.1289/EHP1435.

  6. Thresholds of Toxicological Concern - Setting a threshold for testing below which there is little concern.

    Science.gov (United States)

    Hartung, Thomas

    2017-01-01

    Low dose, low risk; very low dose, no real risk. Setting a pragmatic threshold below which concerns become negligible is the purpose of thresholds of toxicological concern (TTC). The idea is that such threshold values do not need to be established for each and every chemical based on experimental data, but that by analyzing the distribution of lowest or no-effect doses of many chemicals, a TTC can be defined - typically using the 5th percentile of this distribution and lowering it by an uncertainty factor of, e.g., 100. In doing so, TTC aims to compare exposure information (dose) with a threshold below which any hazard manifestation is very unlikely to occur. The history and current developments of this concept are reviewed and the application of TTC for different regulated products and their hazards is discussed. TTC lends itself as a pragmatic filter to deprioritize testing needs whenever real-life exposures are much lower than levels where hazard manifestation would be expected, a situation that is called "negligible exposure" in the REACH legislation, though the TTC concept has not been fully incorporated in its implementation (yet). Other areas and regulations - especially in the food sector and for pharmaceutical impurities - are more proactive. Large, curated databases on toxic effects of chemicals provide us with the opportunity to set TTC for many hazards and substance classes and thus offer a precautionary second tier for risk assessments if hazard cannot be excluded. This allows focusing testing efforts better on relevant exposures to chemicals.
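
    The derivation described in this record reduces to a short calculation: take the 5th percentile of a distribution of no- or lowest-effect doses and divide it by an uncertainty factor such as 100. The sketch below uses synthetic NOEL values for illustration only.

    ```python
    import numpy as np

    # Synthetic distribution of no-observed-effect levels (NOELs) standing in
    # for a curated toxicity database; values are invented.
    rng = np.random.default_rng(42)
    noel_mg_per_kg = rng.lognormal(mean=1.0, sigma=1.5, size=500)

    fifth_percentile = np.percentile(noel_mg_per_kg, 5)
    uncertainty_factor = 100
    ttc = fifth_percentile / uncertainty_factor

    print(f"5th percentile NOEL: {fifth_percentile:.3f} mg/kg bw/day")
    print(f"TTC (UF={uncertainty_factor}): {ttc:.5f} mg/kg bw/day")
    ```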

  7. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  8. DarT: The embryo test with the Zebrafish Danio rerio--a general model in ecotoxicology and toxicology.

    Science.gov (United States)

    Nagel, Roland

    2002-01-01

    The acute fish test is an animal test whose ecotoxicological relevance is worthy of discussion. The primary aim of protection in ecotoxicology is the population, not the individual. Furthermore, the concentration of pollutants in the environment is normally not in the lethal range, so the acute fish test covers solely the situation after chemical spills. Nevertheless, acute fish toxicity data still belong to the base set used for the assessment of chemicals. The embryo test with the zebrafish Danio rerio (DarT) is recommended as a substitute for the acute fish test. For validation, an international laboratory comparison test was carried out; a summary of the results is presented in this paper. Based on the promising results of testing chemicals and waste water, the test design was validated by the DIN working group "7.6 Fischei-Test", and a standardized test guideline for testing waste water with fish is available. The test duration is short (48 h), and different toxicological endpoints can be examined within the test. Endpoints from the embryo test are suitable for QSAR studies. Besides its use in ecotoxicology, its introduction as a toxicological model was investigated: disturbance of pigmentation and effects on heart-beat frequency were examined. A further important application is the testing of teratogenic chemicals. Based on the results, DarT could be a screening test within preclinical studies.

  9. Choosing the right laboratory: a review of clinical and forensic toxicology services for urine drug testing in pain management.

    Science.gov (United States)

    Reisfield, Gary M; Goldberger, Bruce A; Bertholf, Roger L

    2015-01-01

    Urine drug testing (UDT) services are provided by a variety of clinical, forensic, and reference/specialty laboratories. These UDT services differ based on the principal activity of the laboratory. Clinical laboratories provide testing primarily focused on medical care (eg, emergency care, inpatients, and outpatient clinics), whereas forensic laboratories perform toxicology tests related to postmortem and criminal investigations, and drug-free workplace programs. Some laboratories now provide UDT specifically designed for monitoring patients on chronic opioid therapy. Accreditation programs for clinical laboratories have existed for nearly half a century, and a federal certification program for drug-testing laboratories was established in the 1980s. Standards of practice for forensic toxicology services other than workplace drug testing have been established in recent years. However, no accreditation program currently exists for UDT in pain management, and this review considers several aspects of laboratory accreditation and certification relevant to toxicology services, with the intention to provide guidance to clinicians in their selection of the appropriate laboratory for UDT surveillance of their patients on opioid therapy.

  10. Toxicology Study No. S.0024589d-15, Human Cell Line Activation Test of the Novel Energetic 3,4-Dinitropyrazole (DNP)

    Science.gov (United States)

    2016-04-01

    [Extraction residue from report form fields and results tables; recoverable content: a human cell line activation test of the novel energetic 3,4-dinitropyrazole (DNP), measuring CD54 and CD86 expression in THP-1 cells across three independent assays (Toxicology Study No. S.0024589d-15, Technical Report, March-April 2016).]

  11. Fossil fuel toxicology

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    A program is described for the investigation of the toxicology of coal-derived effluents that will utilize a battery of cellular and mammalian test systems and end points to evaluate the toxicological effects of acute, sub-acute, and long-term, low-level exposure to gaseous and particulate effluents from combustion of coal, with special emphasis on fluidized bed combustion

  12. Evidence-Based Toxicology.

    Science.gov (United States)

    Hoffmann, Sebastian; Hartung, Thomas; Stephens, Martin

    Evidence-based toxicology (EBT) was introduced independently by two groups in 2005, in the context of toxicological risk assessment and causation as well as based on parallels between the evaluation of test methods in toxicology and evidence-based assessment of diagnostic tests in medicine. The role model of evidence-based medicine (EBM) motivated both proposals and guided the evolution of EBT, with systematic reviews and evidence quality assessment in particular attracting considerable attention in toxicology. Regarding test assessment, in the search for solutions to various problems related to validation, such as the imperfectness of the reference standard or the challenge of comprehensively evaluating tests, the field of Diagnostic Test Assessment (DTA) was identified as a potential resource. DTA being an EBM discipline, test method assessment/validation therefore became one of the main drivers spurring the development of EBT. In the context of pathway-based toxicology, EBT approaches, given their objectivity, transparency and consistency, have been proposed for carrying out a (retrospective) mechanistic validation. In summary, implementation of more evidence-based approaches may provide the tools necessary to adapt the assessment/validation of toxicological test methods and testing strategies to face the challenges of toxicology in the twenty-first century.

  13. Toxicology ontology perspectives.

    Science.gov (United States)

    Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae

    2012-01-01

    The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.

  14. Toxicology elements

    International Nuclear Information System (INIS)

    Viala, A.

    1998-01-01

    This work studies the different aspects of modern toxicology: toxicokinetic, biological, medicolegal, food, occupational, pharmaceutical, environmental, social and regulatory. It is divided into three parts that consider the principal problems of general toxicology and analytical toxicology. (N.C.)

  16. Systems Toxicology: Real World Applications and Opportunities.

    Science.gov (United States)

    Hartung, Thomas; FitzGerald, Rex E; Jennings, Paul; Mirams, Gary R; Peitsch, Manuel C; Rostami-Hodjegan, Amin; Shah, Imran; Wilks, Martin F; Sturla, Shana J

    2017-04-17

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams ("big data"), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity.

  17. Implementation of the 3Rs (refinement, reduction, and replacement): validation and regulatory acceptance considerations for alternative toxicological test methods.

    Science.gov (United States)

    Schechtman, Leonard M

    2002-01-01

    Toxicological testing in the current regulatory environment is steeped in a history of using animals to answer questions about the safety of products to which humans are exposed. That history forms the basis for the testing strategies that have evolved to satisfy the needs of the regulatory bodies that render decisions that affect, for the most part, virtually all phases of premarket product development and evaluation and, to a lesser extent, postmarketing surveillance. Only relatively recently have the levels of awareness of, and responsiveness to, animal welfare issues reached current proportions. That paradigm shift, although sluggish, has nevertheless been progressive. New and alternative toxicological methods for hazard evaluation and risk assessment have now been adopted and are being viewed as a means to address those issues in a manner that considers humane treatment of animals yet maintains scientific credibility and preserves the goal of ensuring human safety. To facilitate this transition, regulatory agencies and regulated industry must work together toward improved approaches. They will need assurance that the methods will be reliable and the results comparable with, or better than, those derived from the current classical methods. That confidence will be a function of the scientific validation and resultant acceptance of any given method. In the United States, to fulfill this need, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its operational center, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), have been constituted as prescribed in federal law. Under this mandate, ICCVAM has developed a process and established criteria for the scientific validation and regulatory acceptance of new and alternative methods. The role of ICCVAM in the validation and acceptance process and the criteria instituted toward that end are described. Also

  18. The Salmonella Mutagenicity Assay: The Stethoscope of Genetic Toxicology for the 21st Century

    Science.gov (United States)

    OBJECTIVES: According to the 2007 National Research Council report Toxicology for the Twenty-first Century, modern methods ("omics," in vitro assays, high-throughput testing, computational methods, etc.) will lead to the emergence of a new approach to toxicology. The Salmonella ma...

  19. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    Science.gov (United States)

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
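
    The two estimation approaches this record describes for alcohol quantitations can be sketched directly; all concentration values below are invented for illustration.

    ```python
    import numpy as np

    # (i) Breath alcohol: few PT rounds, so the interlaboratory standard
    # deviation is taken directly as the measure of uncertainty.
    round_results = np.array([0.081, 0.079, 0.083, 0.080, 0.078])  # g/210 L
    u_breath = round_results.std(ddof=1)

    # (ii) Blood alcohol: many PT rounds; the uncertainty is estimated from
    # the lab's differences from the participant means, pooled as an RMS.
    lab_values = np.array([0.102, 0.151, 0.079, 0.198, 0.120])        # g/dL
    participant_means = np.array([0.100, 0.149, 0.082, 0.201, 0.118])
    u_blood = np.sqrt(np.mean((lab_values - participant_means) ** 2))

    print(f"breath-alcohol uncertainty: {u_breath:.4f}")
    print(f"blood-alcohol uncertainty (RMS difference): {u_blood:.4f}")
    ```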

  20. Insights into the molecular mechanisms of Polygonum multiflorum Thunb-induced liver injury: a computational systems toxicology approach.

    Science.gov (United States)

    Wang, Yin-Yin; Li, Jie; Wu, Zeng-Rui; Zhang, Bo; Yang, Hong-Bin; Wang, Qin; Cai, Ying-Chun; Liu, Gui-Xia; Li, Wei-Hua; Tang, Yun

    2017-05-01

    An increasing number of cases of herb-induced liver injury (HILI) have been reported, presenting new clinical challenges. In this study, taking Polygonum multiflorum Thunb (PmT) as an example, we proposed a computational systems toxicology approach to explore the molecular mechanisms of HILI. First, the chemical components of PmT were extracted from 3 main TCM databases as well as the literature related to natural products. Then, the known targets were collected through data integration, and the potential compound-target interactions (CTIs) were predicted using our substructure-drug-target network-based inference (SDTNBI) method. After screening for hepatotoxicity-related genes by assessing the symptoms of HILI, a compound-target interaction network was constructed. A scoring function, namely, Ascore, was developed to estimate the toxicity of chemicals in the liver. We conducted network analysis to determine the possible mechanisms of the biphasic effects using the analysis tools, including BiNGO, pathway enrichment, organ distribution analysis and predictions of interactions with CYP450 enzymes. Among the chemical components of PmT, 54 components with good intestinal absorption were used for analysis, and 2939 CTIs were obtained. After analyzing the mRNA expression data in the BioGPS database, 1599 CTIs and 125 targets related to liver diseases were identified. In the top 15 compounds, seven with Ascore values >3000 (emodin, quercetin, apigenin, resveratrol, gallic acid, kaempferol and luteolin) were obviously associated with hepatotoxicity. The results from the pathway enrichment analysis suggest that multiple interactions between apoptosis and metabolism may underlie PmT-induced liver injury. Many of the pathways have been verified in specific compounds, such as glutathione metabolism, cytochrome P450 metabolism, and the p53 pathway, among others. Hepatitis symptoms, the perturbation of nine bile acids and yellow or tawny urine also had corresponding pathways
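
    The record does not give the Ascore formula, so the sketch below is a generic, hypothetical stand-in for this kind of network-based scoring: each compound's predicted compound-target interactions are summed, weighted by the liver-disease relevance of each target. All compound names, targets, weights and scores are invented.

    ```python
    # Hypothetical CTI predictions: compound -> {target: predicted score}.
    predicted_ctis = {
        "emodin":      {"TP53": 0.9, "CYP1A2": 0.7, "ESR1": 0.4},
        "quercetin":   {"TP53": 0.6, "CYP3A4": 0.8},
        "gallic acid": {"GSTP1": 0.5},
    }
    # Hypothetical liver-disease relevance weights per target.
    liver_disease_targets = {"TP53": 1.0, "CYP1A2": 0.8, "CYP3A4": 0.8, "GSTP1": 0.6}

    def toxicity_score(targets):
        """Sum of CTI scores weighted by the target's liver-disease relevance."""
        return sum(s * liver_disease_targets.get(t, 0.0) for t, s in targets.items())

    ranked = sorted(predicted_ctis, key=lambda c: toxicity_score(predicted_ctis[c]),
                    reverse=True)
    for c in ranked:
        print(f"{c}: {toxicity_score(predicted_ctis[c]):.2f}")
    ```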

  1. Agenda of behavioral toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, B

    1978-01-01

    The author describes behavioral toxicology as a new discipline and contrasts it to the fields of physics and pharmacology. Several questions are raised and discussed concerning the field of behavioral toxicology. Some of these questions are: (1) how is an adverse behavioral effect recognized; (2) how can the non-specific be specified; (3) are standardized test batteries feasible. The problem of chronic intake is discussed as well as drawing information from other related disciplines such as neurochemistry, neuropathology and neurophysiology. The author concludes with several statements concerning new directions in the discipline of behavioral toxicology.

  2. CAT -- computer aided testing for resonant inspection

    International Nuclear Information System (INIS)

    Foley, David K.

    1998-01-01

    The application of computer technology to inspection and quality control is discussed. Computer aided testing (CAT) can be used to analyze various NDT technologies, such as eddy current, ultrasonics, and resonant inspection.

  3. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  4. The Center for Alternatives to Animal Testing - Europe (CAAT-EU): a transatlantic bridge for the paradigm shift in toxicology.

    Science.gov (United States)

    Daneshian, Mardas; Leist, Marcel; Hartung, Thomas

    2010-01-01

    The Center for Alternatives to Animal Testing - Europe (CAAT-EU) was founded based on a collaboration between the Johns Hopkins Bloomberg School of Public Health and the University of Konstanz. CAAT-EU, housed at the University of Konstanz, will coordinate transatlantic activities to promote humane science in research and education, and participate, as partner or coordinator, in publicly and privately funded European projects. Thomas Hartung will serve as program liaison representing Johns Hopkins University and Marcel Leist as the University of Konstanz liaison. CAAT-EU aims to: 1) Set up transatlantic consortia for international research projects on alternative methods. 2) Establish a CAAT Europe faculty and advisory board composed of sponsor representatives and prominent academics from Europe. 3) Participate in the Transatlantic Think Tank for Toxicology (t4) devoted to conceptual work for the paradigm shift in toxicology. 4) Coordinate a series of information days in Europe on relevant developments in the US, similar to the 2009 series CAAT held in the US on EU issues (one on the 7th Amendment to the EU Cosmetics Directive and one on EU and US chemical regulation). 5) Support ALTEX as the official journal of CAAT and CAAT-EU. 6) Develop strategic projects with sponsors to promote humane science and new toxicology, especially with CAAT faculty members. 7) Develop a joint education program between Johns Hopkins and the University of Konstanz, such as e-courses and the existing Humane Science Certificate program developed by CAAT, a student exchange program, and collaboration with the International Graduate School "Cell-based Characterization of De- and Regeneration" in Konstanz.

  5. Predictive Toxicology: Current Status and Future Outlook (EBI ...

    Science.gov (United States)

    Slide presentation at the EBI-EMBL Industry Programme Workshop on Predictive Toxicology, covering the current status of Computational Toxicology activities at the US EPA.

  6. In silico toxicology protocols.

    Science.gov (United States)

    Myatt, Glenn J; Ahlberg, Ernst; Akahori, Yumi; Allen, David; Amberg, Alexander; Anger, Lennart T; Aptula, Aynur; Auerbach, Scott; Beilke, Lisa; Bellion, Phillip; Benigni, Romualdo; Bercu, Joel; Booth, Ewan D; Bower, Dave; Brigo, Alessandro; Burden, Natalie; Cammerer, Zoryana; Cronin, Mark T D; Cross, Kevin P; Custer, Laura; Dettwiler, Magdalena; Dobo, Krista; Ford, Kevin A; Fortin, Marie C; Gad-McDonald, Samantha E; Gellatly, Nichola; Gervais, Véronique; Glover, Kyle P; Glowienke, Susanne; Van Gompel, Jacky; Gutsell, Steve; Hardy, Barry; Harvey, James S; Hillegass, Jedd; Honma, Masamitsu; Hsieh, Jui-Hua; Hsu, Chia-Wen; Hughes, Kathy; Johnson, Candice; Jolly, Robert; Jones, David; Kemper, Ray; Kenyon, Michelle O; Kim, Marlene T; Kruhlak, Naomi L; Kulkarni, Sunil A; Kümmerer, Klaus; Leavitt, Penny; Majer, Bernhard; Masten, Scott; Miller, Scott; Moser, Janet; Mumtaz, Moiz; Muster, Wolfgang; Neilson, Louise; Oprea, Tudor I; Patlewicz, Grace; Paulino, Alexandre; Lo Piparo, Elena; Powley, Mark; Quigley, Donald P; Reddy, M Vijayaraj; Richarz, Andrea-Nicole; Ruiz, Patricia; Schilter, Benoit; Serafimova, Rositsa; Simpson, Wendy; Stavitskaya, Lidiya; Stidl, Reinhard; Suarez-Rodriguez, Diana; Szabo, David T; Teasdale, Andrew; Trejo-Martin, Alejandra; Valentin, Jean-Pierre; Vuorinen, Anna; Wall, Brian A; Watts, Pete; White, Angela T; Wichard, Joerg; Witt, Kristine L; Woolley, Adam; Woolley, David; Zwickl, Craig; Hasselgren, Catrin

    2018-04-17

    The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies to support wider uptake and acceptance of the approaches. The development of IST protocols is an initiative developed through a collaboration among an international consortium to reflect the state-of-the-art in in silico toxicology for hazard identification and characterization. A general outline for describing the development of such protocols is included and it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information. Copyright © 2018. Published by Elsevier Inc.

  7. 21 CFR 862.3200 - Clinical toxicology calibrator.

    Science.gov (United States)

    2010-04-01

    21 CFR 862.3200 (2010-04-01) - Clinical toxicology calibrator. Food and Drugs; Medical Devices; Clinical Chemistry and Clinical Toxicology Devices; Clinical Toxicology Test Systems. (a) Identification. A clinical toxicology calibrator is...

  8. Identifying Key Events in AOPs for Embryonic Disruption using Computational Toxicology (European Teratology Society - AOP symp.)

    Science.gov (United States)

    Addressing safety aspects of drugs and environmental chemicals relies extensively on animal testing; however, the quantity of chemicals needing assessment and challenges of species extrapolation require alternative approaches to traditional animal studies. Newer in vitro and in s...

  9. Development of methodology for alternative testing strategies for the assessment of the toxicological profile of nanoparticles used in medical diagnostics. NanoTEST - EC FP7 project

    International Nuclear Information System (INIS)

    Dusinska, Maria; Fjellsbo, Lise Maria; Heimstad, Eldbjorg; Harju, Mikael; Bartonova, Alena; Tran, Lang; Juillerat-Jeanneret, Lucienne; Halamoda, Blanka; Marano, Francelyne; Boland, Sonja; Saunders, Margaret; Cartwright, Laura; Carreira, Sara; Thawley, Susan; Whelan, Maurice; Klein, Christoph; Housiadas, Christos; Volkovova, Katarina; Tulinska, Jana; Beno, Milan

    2009-01-01

    Nanoparticles (NPs) have unique, potentially beneficial properties, but their possible impact on human health is still not known. The area of nanomedicine brings humans into direct contact with NPs and it is essential for both public confidence and the nanotech companies that appropriate risk assessments are undertaken in relation to health and safety. There is a pressing need to understand how engineered NPs can interact with the human body following exposure. The FP7 project NanoTEST (www.nanotest-fp7.eu) addresses these requirements in relation to the toxicological profile of NPs used in medical diagnostics.

  10. 42 CFR 493.937 - Toxicology.

    Science.gov (United States)

    2010-10-01

    42 CFR 493.937 (2010-10-01) - Toxicology. Public Health; Proficiency Testing Programs by Specialty and Subspecialty. (a) Program content and frequency of challenge. To be approved for proficiency testing for toxicology, the annual program must...

  11. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific; it should identify the mechanism or mode of action (with dose-dependence); it should identify responses relevant to the species of interest; responses should ideally be translated into tissue-, organ-, and organism-level effects; and it must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  12. SEURAT: Safety Evaluation Ultimately Replacing Animal Testing--recommendations for future research in the field of predictive toxicology.

    Science.gov (United States)

    Daston, George; Knight, Derek J; Schwarz, Michael; Gocht, Tilman; Thomas, Russell S; Mahony, Catherine; Whelan, Maurice

    2015-01-01

    The development of non-animal methodology to evaluate the potential for a chemical to cause systemic toxicity is one of the grand challenges of modern science. The European research programme SEURAT is active in this field and will conclude its first phase, SEURAT-1, in December 2015. Drawing on the experience gained in SEURAT-1 and appreciating international advancement in both basic and regulatory science, we reflect here on how SEURAT should evolve and propose that further research and development should be directed along two complementary and interconnecting work streams. The first work stream would focus on developing new 'paradigm' approaches for regulatory science. The goal here is the identification of 'critical biological targets' relevant for toxicity and to test their suitability to be used as anchors for predicting toxicity. The second work stream would focus on integration and application of new approach methods for hazard (and risk) assessment within the current regulatory 'paradigm', aiming for acceptance of animal-free testing strategies by regulatory authorities (i.e. translating scientific achievements into regulation). Components for both work streams are discussed and may provide a structure for a future research programme in the field of predictive toxicology.

  13. QUASI-RANDOM TESTING OF COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. V. Yarmolik

    2013-01-01

    Various modified random testing approaches have been proposed for computer system testing in the black box environment. Their effectiveness has been evaluated on typical failure patterns by employing three measures, namely, P-measure, E-measure and F-measure. Quasi-random testing, a modified version of random testing, has been proposed and analyzed. The quasi-random Sobol sequences and modified Sobol sequences are used as the test patterns. Some new methods for Sobol sequence generation have been proposed and analyzed.
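
    A minimal sketch of the comparison this record describes, using the Sobol generator from scipy.stats.qmc rather than the authors' own generation methods: quasi-random points spread more evenly over the input domain, which tends to raise the P-measure (the probability that a test set detects at least one failure) for block-shaped failure regions. The failure region below is invented.

    ```python
    import numpy as np
    from scipy.stats import qmc

    def hits_failure_region(points):
        """Invented 2-D block failure region: [0.2, 0.3] x [0.6, 0.7]."""
        x, y = points[:, 0], points[:, 1]
        return bool(np.any((x >= 0.2) & (x <= 0.3) & (y >= 0.6) & (y <= 0.7)))

    n_tests, trials = 64, 500
    rng = np.random.default_rng(1)

    # Estimate the P-measure for pseudo-random versus scrambled Sobol test sets.
    random_p = np.mean([hits_failure_region(rng.random((n_tests, 2)))
                        for _ in range(trials)])
    sobol_p = np.mean([hits_failure_region(
        qmc.Sobol(d=2, scramble=True, seed=s).random(n_tests))
        for s in range(trials)])

    print(f"P-measure, pseudo-random: {random_p:.2f}")
    print(f"P-measure, Sobol:         {sobol_p:.2f}")
    ```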

  14. An investigation of the genetic toxicology of irradiated foodstuffs using short-term test systems

    International Nuclear Information System (INIS)

    Phillips, B.J.; Kranz, E.; Elias, P.S.

    1980-01-01

    As part of a programme of short-term tests used to detect possible genetic toxicity in irradiated foodstuffs, cultured Chinese hamster ovary cells were exposed to extracts and digests of irradiated and unirradiated dates, fish and chicken and subjected to tests for cytotoxicity, sister chromatid exchange induction and mutation to thioguanine resistance. The results showed no evidence of genetic toxicity induced in food by irradiation. The general applicability of cell culture tests to the detection of mutagens in food is discussed. (author)

  15. Medical Devices; Clinical Chemistry and Clinical Toxicology Devices; Classification of the Organophosphate Test System. Final order.

    Science.gov (United States)

    2017-10-18

    The Food and Drug Administration (FDA or we) is classifying the organophosphate test system into class II (special controls). The special controls that apply to the device type are identified in this order and will be part of the codified language for the organophosphate test system's classification. We are taking this action because we have determined that classifying the device into class II (special controls) will provide a reasonable assurance of safety and effectiveness of the device. We believe this action will also enhance patients' access to beneficial innovative devices, in part by reducing regulatory burdens.

  16. Immunotoxicity testing: Implementation of mechanistic understanding, key pathways of toxicological concern and components of these pathways.

    Science.gov (United States)

    At present, several animal-based assays are used to assess immunotoxic effects such as immunosuppression and sensitization. Growing societal and ethical concerns, European legislation and current research demands by industry are driving animal-based toxicity testing towards new a...

  17. Computer board for radioactive ray test

    International Nuclear Information System (INIS)

    Zuo Mingfu

    1996-05-01

    The present status of radioactive-ray test systems for industrial applications is described, along with a newly designed computer board for overcoming the shortcomings of the current system. The functions, measurement principles and features of the board, as well as the test results for the board, are discussed. The board brings together many functions of the radioactive-ray test system, such as energy calibration and MCS. It also provides many other subordinate practical functions, such as motor control and ADC. The board combines two sets of test parts into one and therefore constitutes a powerful unit for the system. Not only can it replace all units in a normal test system for signal analysis, signal processing, data management, and motor control, but it can also be used in more complex test systems, such as those for double source/double energy/double channel testing, multichannel testing, position testing and core positioning. The board makes it easier for the test system to achieve miniaturization and computerization goals, and therefore improves the quality of the test and reduces the cost of the system. (10 refs., 8 figs.)

  18. Mercury exposure on potential plant Ludwigia octovalvis L. - Preliminary toxicological testing

    Science.gov (United States)

    Alrawiq, Huda S. M.; Mushrifah, I.

    2013-11-01

    The preliminary test in phytoremediation is necessary to determine the ability of a plant to survive in media with different concentrations of contaminant. It was conducted to determine the maximum concentration of the contaminant that is harmful to the plant and suppresses plant growth. This study showed the ability of Ludwigia octovalvis to resist mercury (Hg) contamination in sand containing different concentrations of Hg (0, 0.5, 1, 2, 4, 6 and 8 mg/L). The experimental work was performed under greenhouse conditions for an observation period of 4 weeks. Throughout the 4-week duration, the results showed that 66.66% of the plants withered on exposure to a Hg concentration of 4 mg/L and 100% withered at the higher concentrations of 6 and 8 mg/L. The results of this study may serve as a basis for research that aims to study uptake and accumulation of Hg using potential phytoremediation plants.

  19. Tutoring system for nondestructive testing using computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Koo; Koh, Sung Nam [Joong Ang Inspection Co.,Ltd., Seoul (Korea, Republic of); Shim, Yun Ju; Kim, Min Koo [Dept. of Computer Engineering, Aju University, Suwon (Korea, Republic of)

    1997-10-15

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, is very difficult for NDT inspectors to understand at a technical level without wide experience, and considerable repeated education and training are necessary for maintaining their knowledge. A tutoring system that can simulate NDT work is suggested to solve the above problem under reasonable conditions. The tutoring system presents basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book style and simulation practices provide an effective and individual environment for learning nondestructive testing.

  20. Tutoring system for nondestructive testing using computer

    International Nuclear Information System (INIS)

    Kim, Jin Koo; Koh, Sung Nam; Shim, Yun Ju; Kim, Min Koo

    1997-01-01

This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, is very difficult for NDT inspectors to understand on a technical basis without wide experience, and considerable repeated education and training are necessary to maintain their knowledge. A tutoring system that can simulate NDT work is suggested to solve this problem under reasonable conditions. The tutoring system presents the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book-style lessons and simulation practice provide an effective, individualized environment for learning nondestructive testing.

  1. Toxcast and the Use of Human Relevant In Vitro Exposures: Incorporating High-Throughput Exposure and Toxicity Testing Data for 21st Century Risk Assessments (British Toxicological Society Annual Congress)

    Science.gov (United States)

    The path for incorporating new approach methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. These challenges include sufficient coverage of toxicological mechanisms to meaningfully interpret negative test results, dev...

  2. An investigation of normal urine with a creatinine concentration under the cutoff of 20 mg/dL for specimen validity testing in a toxicology laboratory.

    Science.gov (United States)

    Holden, Brad; Guice, Erica A

    2014-05-01

In clinical and forensic toxicology laboratories, one commonly used method for urine specimen validity testing is the creatinine concentration. In this study, workplace guidelines are examined to determine their relevance to forensic and clinical toxicology samples. Specifically, it investigates the occurrence of urine creatinine concentrations under 20 mg/dL and notes potential issues with factors influencing creatinine concentration, using a simple, novel method consisting of cation-pairing high-pressure liquid chromatography with ultraviolet detection to determine the creatinine concentration in 3019 donors. Of the 4227-sample population in this study, 209 (4.94%) were below the cutoff value of 20 mg/dL for dilute urine. Because many factors can influence the urinary creatinine concentration, samples with creatinine under the 20 mg/dL cutoff do not always implicate sample adulteration. © 2014 American Academy of Forensic Sciences.
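
A trivial sketch of the dilute-urine flag implied by the abstract's 20 mg/dL cutoff; the function name is illustrative:

```python
DILUTE_CUTOFF_MG_DL = 20.0  # dilute-urine cutoff cited in the abstract

def is_dilute(creatinine_mg_dl):
    """Flag a urine specimen whose creatinine falls under the cutoff."""
    return creatinine_mg_dl < DILUTE_CUTOFF_MG_DL

# Reproduce the reported proportion: 209 of 4227 samples below the cutoff
print(f"{209 / 4227:.2%}")  # -> 4.94%
```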

  3. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  4. TOXNET: Toxicology Data Network

    Science.gov (United States)

TOXNET is a resource for searching databases on toxicology, hazardous chemicals, environmental health, and toxic releases; it covers over 3,000 chemicals (1991-1998), plus additional resources on environmental health and toxicology.

  5. Behavioral assays in environmental toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, B.

    1979-01-01

Environmental toxicology is too permeated by questions about how the whole organism functions to abandon intact animals as test systems. Behavior is not a single entity or discipline; it ranges across the total spectrum of functional toxicity, from tenuous subjective complaints to subtle sensory and motor disturbances demanding advanced instrumentation for their evaluation. Three facets of behavioral toxicology that illustrate its breadth of interests and potential contributions are discussed.

  6. Adaptation of the ToxRTool to Assess the Reliability of Toxicology Studies Conducted with Genetically Modified Crops and Implications for Future Safety Testing.

    Science.gov (United States)

    Koch, Michael S; DeSesso, John M; Williams, Amy Lavin; Michalek, Suzanne; Hammond, Bruce

    2016-01-01

    To determine the reliability of food safety studies carried out in rodents with genetically modified (GM) crops, a Food Safety Study Reliability Tool (FSSRTool) was adapted from the European Centre for the Validation of Alternative Methods' (ECVAM) ToxRTool. Reliability was defined as the inherent quality of the study with regard to use of standardized testing methodology, full documentation of experimental procedures and results, and the plausibility of the findings. Codex guidelines for GM crop safety evaluations indicate toxicology studies are not needed when comparability of the GM crop to its conventional counterpart has been demonstrated. This guidance notwithstanding, animal feeding studies have routinely been conducted with GM crops, but their conclusions on safety are not always consistent. To accurately evaluate potential risks from GM crops, risk assessors need clearly interpretable results from reliable studies. The development of the FSSRTool, which provides the user with a means of assessing the reliability of a toxicology study to inform risk assessment, is discussed. Its application to the body of literature on GM crop food safety studies demonstrates that reliable studies report no toxicologically relevant differences between rodents fed GM crops or their non-GM comparators.

  7. Assuring safety without animal testing concept (ASAT). Integration of human disease data with in vitro data to improve toxicology testing

    NARCIS (Netherlands)

    Stierum, Rob; Aarts, Jac; Boorsma, Andre; Bosgra, Sieto; Caiment, Florian; Ezendam, Janine; Greupink, Rick; Hendriksen, Peter; Soeteman-Hernandez, Lya G.; Jennen, Danyel; Kleinjans, Jos; Kroese, Dinant; Kuper, Frieke; van Loveren, Henk; Monshouwer, Mario; Russel, Frans; van Someren, Eugene; Tsamou, Maria; Groothuis, Geny

    2014-01-01

According to the Assuring Safety Without Animal Testing (ASAT) principle, risk assessment may ultimately become possible without the use of animals (Fentem et al. (2004). Altern. Lab. Anim. 32, 617-623). The ASAT concept takes human disease mechanisms as a starting point and tries to define if

  8. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

Utilizing a microcomputer-based test bank offers a training department many advantages and can have a positive impact on training procedures and examination standards. Prior to data entry, training department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format, since all questions require an enabling-objective numbering scheme. Each question is entered under the enabling objective on which it is based and is then selected via that enabling objective. This eliminates instructor bias, because a random number generator chooses the test question; however, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time normally required for an instructor to formulate an examination. The need for clerical support is reduced by the elimination of typing examinations and by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhances its security
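
A minimal sketch of the selection scheme described above — questions filed under enabling objectives and drawn by a random number generator; the bank layout and identifiers are hypothetical:

```python
import random

# Hypothetical bank: enabling objective number -> question IDs filed under it
bank = {
    "EO-1.2": ["Q101", "Q102", "Q103"],
    "EO-1.3": ["Q201", "Q202"],
    "EO-2.1": ["Q301", "Q302", "Q303", "Q304"],
}

def draw_exam(objectives, rng=None):
    """Pick one question per enabling objective; the RNG removes instructor bias."""
    rng = rng or random.Random()
    return [rng.choice(bank[eo]) for eo in objectives]

print(draw_exam(["EO-1.2", "EO-1.3", "EO-2.1"], random.Random(7)))
```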

  9. Space Toxicology

    Science.gov (United States)

    James, John T.

    2011-01-01

    Safe breathing air for space faring crews is essential whether they are inside an Extravehicular Mobility Suit (EMU), a small capsule such as Soyuz, or the expansive International Space Station (ISS). Sources of air pollution can include entry of propellants, excess offgassing from polymeric materials, leakage of systems compounds, escape of payload compounds, over-use of utility compounds, microbial metabolism, and human metabolism. The toxicological risk posed by a compound is comprised of the probability of escaping to cause air pollution and the magnitude of adverse effects on human health if escape occurs. The risk from highly toxic compounds is controlled by requiring multiple levels of containment to greatly reduce the probability of escape; whereas compounds that are virtually non-toxic may require little or no containment. The potential for toxicity is determined by the inherent toxicity of the compound and the amount that could potentially escape into the breathing air.

  10. Animal toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Amdur, M.

    1996-12-31

The chapter evaluates results of toxicological studies on experimental animals investigating the health effects of air pollutants and examines how well the animal data have predicted responses in human subjects. Data are presented on the comparative toxicity of sulfur dioxide and sulfuric acid. The animal data, obtained by measurement of airway resistance in guinea pigs and of bronchial clearance of particles in donkeys, predicted clearly that sulfuric acid was more irritant than sulfur dioxide. Data obtained on human subjects confirmed this prediction. These acute studies also correctly predicted the comparative toxicity of the two compounds in two-year studies of monkeys. Such chronic studies are not possible in human subjects, but it is reasonable to assume that sulfuric acid would be more toxic than sulfur dioxide. Current findings in epidemiological studies certainly support this assumption.

  11. Veterinary Forensic Toxicology.

    Science.gov (United States)

    Gwaltney-Brant, S M

    2016-09-01

    Veterinary pathologists working in diagnostic laboratories are sometimes presented with cases involving animal poisonings that become the object of criminal or civil litigation. Forensic veterinary toxicology cases can include cases involving animal cruelty (malicious poisoning), regulatory issues (eg, contamination of the food supply), insurance litigation, or poisoning of wildlife. An understanding of the appropriate approach to these types of cases, including proper sample collection, handling, and transport, is essential so that chain of custody rules are followed and proper samples are obtained for toxicological analysis. Consultation with veterinary toxicologists at the diagnostic laboratory that will be processing the samples before, during, and after the forensic necropsy can help to ensure that the analytical tests performed are appropriate for the circumstances and findings surrounding the individual case. © The Author(s) 2016.

  12. 76 FR 23323 - Meeting of the Scientific Advisory Committee on Alternative Toxicological Methods (SACATM)

    Science.gov (United States)

    2011-04-26

DEPARTMENT OF HEALTH AND HUMAN SERVICES. Meeting of the Scientific Advisory Committee on Alternative Toxicological Methods (SACATM). AGENCY: National Toxicology Program (NTP), National Institute of... ...the scientific validation and regulatory acceptance of toxicological and safety testing methods that...

  13. Computers in Language Testing: Present Research and Some Future Directions.

    Science.gov (United States)

    Brown, James Dean

    1997-01-01

    Explores recent developments in the use of computers in language testing in four areas: (1) item banking; (2) computer-assisted language testing; (3) computerized-adaptive language testing; and (4) research on the effectiveness of computers in language testing. Examines educational measurement literature in an attempt to forecast the directions…

  14. Dynamic leaching test of personal computer components.

    Science.gov (United States)

    Li, Yadong; Richardson, Jay B; Niu, Xiaojun; Jackson, Ollie J; Laster, Jeremy D; Walker, Aaron K

    2009-11-15

A dynamic leaching test (DLT) was developed and used to evaluate the leaching of toxic substances from electronic waste in the environment. The major components in personal computers (PCs), including motherboards, hard disc drives, floppy disc drives, and compact disc drives, were tested. The tests lasted 2 years for motherboards and 1.5 years for the disc drives. The extraction fluids for the standard toxicity characteristic leaching procedure (TCLP) and synthetic precipitation leaching procedure (SPLP) were used as the DLT leaching solutions. A total of 18 elements, including Ag, Al, As, Au, Ba, Be, Cd, Cr, Cu, Fe, Ga, Ni, Pd, Pb, Sb, Se, Sn, and Zn, were analyzed in the DLT leachates. Only Al, Cu, Fe, Ni, Pb, and Zn were commonly found in the DLT leachates of the PC components. Their leaching levels were much higher in TCLP extraction fluid than in SPLP extraction fluid. The toxic heavy metal Pb was found to leach continuously out of the components over the entire test periods. The cumulative amount of Pb leached from the motherboards in TCLP extraction fluid reached 2.0 g per motherboard over the 2-year test period, and that in SPLP extraction fluid was 75-90% less. The leaching rates and levels of Pb were largely affected by the content of galvanized steel in the PC components: the higher the steel content, the lower the Pb leaching rate. The findings suggest that obsolete PCs disposed of in landfills or discarded in the environment continuously release Pb for years when subjected to landfill leachate or rain.
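
Restating the arithmetic in the abstract: with 2.0 g of Pb leached per motherboard under TCLP and SPLP 75-90% lower, the SPLP cumulative range works out as follows (a worked example using only the figures quoted above):

```python
tclp_pb_g = 2.0                       # cumulative Pb per motherboard, TCLP, 2 years
splp_low  = tclp_pb_g * (1 - 0.90)    # "90% less"
splp_high = tclp_pb_g * (1 - 0.75)    # "75% less"
print(f"SPLP cumulative Pb: {splp_low:.1f}-{splp_high:.1f} g")  # 0.2-0.5 g
```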

  15. A practice analysis of toxicology.

    Science.gov (United States)

    Wood, Carol S; Weis, Christopher P; Caro, Carla M; Roe, Amy

    2016-12-01

    In 2015, the American Board of Toxicology (ABT), with collaboration from the Society of Toxicology (SOT), in consultation with Professional Examination Service, performed a practice analysis study of the knowledge required for general toxicology. The purpose of this study is to help assure that the examination and requirements for attainment of Diplomate status are relevant to modern toxicology and based upon an empirical foundation of knowledge. A profile of the domains and tasks used in toxicology practice was developed by subject-matter experts representing a broad range of experiences and perspectives. An on-line survey of toxicologists, including Diplomates of the ABT and SOT members, confirmed the delineation. Results of the study can be used to improve understanding of toxicology practice, to better serve all toxicologists, and to present the role of toxicologists to those outside the profession. Survey results may also be used by the ABT Board of Directors to develop test specifications for the certifying examination and will be useful for evaluating and updating the content of professional preparation, development, and continuing education programs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Multiscale Toxicology - Building the Next Generation Tools for Toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Thrall, Brian D.; Minard, Kevin R.; Teeguarden, Justin G.; Waters, Katrina M.

    2012-09-01

    A Cooperative Research and Development Agreement (CRADA) was sponsored by Battelle Memorial Institute (Battelle, Columbus), to initiate a collaborative research program across multiple Department of Energy (DOE) National Laboratories aimed at developing a suite of new capabilities for predictive toxicology. Predicting the potential toxicity of emerging classes of engineered nanomaterials was chosen as one of two focusing problems for this program. PNNL’s focus toward this broader goal was to refine and apply experimental and computational tools needed to provide quantitative understanding of nanoparticle dosimetry for in vitro cell culture systems, which is necessary for comparative risk estimates for different nanomaterials or biological systems. Research conducted using lung epithelial and macrophage cell models successfully adapted magnetic particle detection and fluorescent microscopy technologies to quantify uptake of various forms of engineered nanoparticles, and provided experimental constraints and test datasets for benchmark comparison against results obtained using an in vitro computational dosimetry model, termed the ISSD model. The experimental and computational approaches developed were used to demonstrate how cell dosimetry is applied to aid in interpretation of genomic studies of nanoparticle-mediated biological responses in model cell culture systems. The combined experimental and theoretical approach provides a highly quantitative framework for evaluating relationships between biocompatibility of nanoparticles and their physical form in a controlled manner.

  17. Cornerstones of Toxicology.

    Science.gov (United States)

    Hayes, A Wallace; Dixon, Darlene

    2017-01-01

The 35th Annual Society of Toxicologic Pathology Symposium, held in June 2016 in San Diego, California, focused on "The Basis and Relevance of Variation in Toxicologic Responses." In order to review the basic tenets of toxicology, a "broad brush" interactive talk giving an overview of the Cornerstones of Toxicology was presented. The presentation focused on the historical milestones and perspectives of toxicology and, through many scientific graphs, data, and real-life examples, covered the three basic principles of toxicology, which can be summarized as: dose matters (as does timing), people differ, and things change (related to metabolism and biotransformation).

  18. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no-observed-adverse-effects-level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest-observed-adverse-effects-level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
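
A minimal sketch of the tier-1 screening rule described above — a chemical is retained as a COPC when its media concentration exceeds the NOAEL-based benchmark; the chemicals and numbers are hypothetical:

```python
def screen_copcs(measured_mg_l, benchmarks_mg_l):
    """Tier-1 screen: keep chemicals whose concentration exceeds the benchmark."""
    return [chem for chem, conc in measured_mg_l.items()
            if conc > benchmarks_mg_l.get(chem, float("inf"))]

measured   = {"Cd": 0.8, "Pb": 0.02, "Zn": 5.0}   # water concentrations (hypothetical)
benchmarks = {"Cd": 0.1, "Pb": 0.5,  "Zn": 10.0}  # NOAEL-based benchmarks (hypothetical)
print(screen_copcs(measured, benchmarks))  # -> ['Cd'] retained for tier 2
```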

  19. 42 CFR 493.845 - Standard; Toxicology.

    Science.gov (United States)

    2010-10-01

42 CFR 493.845 (Title 42, Public Health; Centers for Medicare & Medicaid Services, Department of Health and Human Services), Standard; Toxicology: (a) Failure to attain a score of at least 80 percent of...

  20. National Toxicology Program

    Science.gov (United States)

NTP develops and applies tools of modern toxicology and molecular biology to identify substances in the... ...depend on for decisions that matter. The National Toxicology Program provides the scientific basis for programs, activities...

  1. Toxicology Education Foundation

    Science.gov (United States)

...bodies and our world. Welcome to the Toxicology Education Foundation! Our mission is to enhance public understanding...

  2. Environmental Toxicology Research Facility

    Data.gov (United States)

Federal Laboratory Consortium — Fully-equipped facilities for environmental toxicology research. The Environmental Toxicology Research Facility (ETRF), located in Vicksburg, MS, provides over 8,200 ft...

  3. Handbook of systems toxicology

    National Research Council Canada - National Science Library

    Casciano, Daniel A; Sahu, Saura C

    2011-01-01

    "In the first handbook to comprehensively cover the emerging area of systems toxicology, the Handbook of Systems Toxicology provides an authoritative compilation of up-to-date developments presented...

  4. Green Toxicology – Application of predictive toxicology

    DEFF Research Database (Denmark)

    Vinggaard, Anne Marie; Wedebye, Eva Bay; Taxvig, Camilla

    2014-01-01

...safer chemicals and to identify problematic compounds already in use, such as industrial compounds, drugs, pesticides and cosmetics, is required. Green toxicology is the application of predictive toxicology to the production of chemicals with the specific intent of improving their design for hazard...

  5. Innovative non-animal testing strategies for reproductive toxicology: the contribution of Italian partners within the EU project ReProTect

    Directory of Open Access Journals (Sweden)

    Stefano Lorenzetti

    2011-12-01

Reproductive toxicity, with its many targets and mechanisms, is a complex area of toxicology; thus, the screening and identification of reproductive toxicants is a main scientific challenge for the safety assessment of chemicals, including under the European Regulation on Chemicals (REACH). Regulatory agencies recommend the implementation of the 3Rs principle (refinement, reduction, replacement) as well as intelligent testing strategies, through the development of in vitro methods and the use of mechanistic information in the hazard identification and characterization steps of the risk assessment process. The EU Integrated Project ReProTect (6th Framework Programme) implemented an array of in vitro tests to study different building blocks of the mammalian reproductive cycle: methodological developments and results on male and female germ cells, prostate and placenta are presented.

  6. Grid computing faces IT industry test

    CERN Multimedia

    Magno, L

    2003-01-01

Software company Oracle Corp. unveiled its Oracle 10g grid computing platform at the annual OracleWorld user convention in San Francisco. It gave concrete examples of how grid computing can be a viable option outside the scientific community where the concept was born (1 page).

  7. 40 CFR 161.340 - Toxicology data requirements.

    Science.gov (United States)

    2010-07-01

40 CFR 161.340 (Title 40, Protection of Environment), Toxicology data requirements: (a) Table. Sections 161.100 through 161.102 describe how to use this table to determine the toxicology data requirements and the substance to be tested. Kind of data required (b) Notes...

  8. Toxicological aspects of energy production

    International Nuclear Information System (INIS)

    Sanders, C.L.

    1986-01-01

    Part I reviews the principles of toxicology, describes the biological fate of chemicals in the body, discusses basic pathobiology, and reviews short-term toxicity tests. Part II describes the toxicology and pathology of pollutants in several important organ systems. The greatest emphasis is placed on the respiratory tract because of its high probability as a route of exposure to pollutants from energy technologies and its high sensitivity to pollutant related tissue damage. Part III describes the toxicological aspects of specific chemical classes associated with fossil fuels; these include polycyclic hydrocarbons, gases and metals. Part IV describes the biomedical effects associated with each energy technology, including coal and oil, fossil fuel and biomass conversions, solar and geothermal and radiological health aspects associated with uranium mining, nuclear fission and fusion, and with nonionising radiations and electromagnetic fields

  9. Methodological testing: Are fast quantum computers illusions?

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Steven [Tachyon Design Automation, San Francisco, CA (United States)

    2013-07-01

Popularity of the idea of computers constructed on the principles of QM started with Feynman's 'Lectures On Computation', but he called the idea crazy and dependent on statistical mechanics. In 1987, Feynman published a paper in 'Quantum Implications - Essays in Honor of David Bohm' on negative probabilities, which he said gave him cultural shock. The problem with imagined fast quantum computers (QC) is that speed requires both statistical behavior and truth of the mathematical formalism. The Swedish Royal Academy 2012 Nobel Prize in physics press release touted the discovery of methods to control 'individual quantum systems', to 'measure and control very fragile quantum states', which enables 'first steps towards building a new type of super fast computer based on quantum physics.' A number of examples where widely accepted mathematical descriptions have turned out to be problematic are examined: problems with the use of oracles in P=NP computational complexity, Paul Finsler's proof of the continuum hypothesis, and Turing's Enigma code breaking versus William Tutte's Colossus. I view QC research as faith in computational oracles with wished-for properties. Arthur Fine's interpretation, in 'The Shaky Game', of Einstein's skepticism toward QM is discussed. If Einstein's reality as space-time curvature is correct, then space-time computers will be the next type of super fast computer.

  10. Simple and Effective Algorithms: Computer-Adaptive Testing.

    Science.gov (United States)

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…
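
A minimal sketch of the kind of simple classroom CAT loop the abstract alludes to, assuming a Rasch response model: administer the unused item closest to the current ability estimate, then step the estimate up or down by a shrinking amount. The step rule and stopping criterion are assumptions for illustration, not Linacre's exact algorithm:

```python
import math
import random

def simple_cat(difficulties, answer, start=0.0, max_items=20):
    """difficulties: item difficulties in logits; answer(d) -> True if correct."""
    theta, used = start, set()
    for step in range(1, max_items + 1):
        # administer the unused item whose difficulty is nearest the estimate
        i = min((j for j in range(len(difficulties)) if j not in used),
                key=lambda j: abs(difficulties[j] - theta))
        used.add(i)
        theta += (1.0 if answer(difficulties[i]) else -1.0) / step
    return theta

# Simulate an examinee of true ability +1 logit answering under a Rasch model
random.seed(1)
rasch = lambda d: random.random() < 1.0 / (1.0 + math.exp(d - 1.0))
print(round(simple_cat([i / 8 - 2 for i in range(33)], rasch), 2))
```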

  11. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  12. Toxicology and metabolism of nickel compounds. Progress report, December 1, 1975--November 30, 1976. [Tests made with rats and hamsters

    Energy Technology Data Exchange (ETDEWEB)

    Sunderman, F.W. Jr.

    1976-08-15

The toxicology and metabolism of nickel compounds (NiCl2, Ni3S2, NiS, Ni powder, and Ni(CO)4) were investigated in rats and hamsters. Triethylenetetramine (TETA) and d-penicillamine are more effective than other chelating agents (Na-diethyldithiocarbamate, CaNa2-versenate, diglycylhistidine-N-methylamide and α-lipoic acid) as antidotes for acute Ni(II) toxicity in rats. The antidotal efficacy of TETA in acute Ni(II) toxicity is mediated by rapid reduction of the plasma concentration of Ni(II), consistent with renal clearance of the TETA-Ni complex at a rate more than twenty times greater than the renal clearance of non-chelated Ni(II). Fischer rats are more susceptible than other rat strains (Wistar-Lewis, Long-Evans and NIH-Black) to induction of erythrocytosis after an intrarenal injection of Ni3S2, and the serial pathologic changes that occur in rats after an intrarenal injection of Ni3S2 were elucidated. When amorphous nickel monosulfide (NiS) and nickel subsulfide (Ni3S2) were administered by im injection to randomly selected Fischer rats in equivalent amounts under identical conditions, NiS did not induce any tumors whereas Ni3S2 induced sarcomas in almost all of the rats.

  13. Multiscale Toxicology- Building the Next Generation Tools for Toxicology

    Energy Technology Data Exchange (ETDEWEB)

Retterer, S. T. [ORNL]; Holsapple, M. P. [Battelle Memorial Institute]

    2013-10-31

A Cooperative Research and Development Agreement (CRADA) was established between Battelle Memorial Institute (BMI), Pacific Northwest National Laboratory (PNNL), Oak Ridge National Laboratory (ORNL), Brookhaven National Laboratory (BNL), and Lawrence Livermore National Laboratory (LLNL) with the goal of combining the analytical and synthetic strengths of the National Laboratories with BMI's expertise in basic and translational medical research to develop a collaborative pipeline and suite of high-throughput and imaging technologies that could be used to provide a more comprehensive understanding of material and drug toxicology in humans. The Multi-Scale Toxicity Initiative (MSTI), consisting of the team members above, was established to coordinate cellular-scale, high-throughput in vitro testing, computational modeling, and whole-animal in vivo toxicology studies between MSTI team members. Development of a common, well-characterized set of materials for testing was identified as a crucial need for the initiative. Two research tracks were established by BMI during the course of the CRADA. The first research track focused on the development of tools and techniques for understanding the toxicity of nanomaterials, specifically inorganic nanoparticles (NPs). ORNL's work focused primarily on the synthesis, functionalization and characterization of a common set of NPs for dissemination to the participating laboratories. These particles were synthesized to retain the same surface characteristics and size, but to allow visualization using the variety of imaging technologies present across the team. Characterization included the quantitative analysis of physical and chemical properties of the materials as well as the preliminary assessment of NP toxicity using commercially available toxicity screens and emerging optical imaging strategies. Additional efforts examined the development of high-throughput microfluidic and imaging assays for measuring NP uptake, localization, and

  14. AN INTELLIGENT REPRODUCTIVE AND DEVELOPMENTAL TESTING PARADIGM FOR THE 21ST CENTURY

    Science.gov (United States)

    Addressing the chemical evaluation bottleneck that currently exists can only be achieved through progressive changes to the current testing paradigm. The primary resources for addressing these issues lie in computational toxicology, a field enriched by recent advances in computer...

  15. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with the efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  16. Toxicology: a discipline in need of academic anchoring--the point of view of the German Society of Toxicology.

    Science.gov (United States)

    Gundert-Remy, U; Barth, H; Bürkle, A; Degen, G H; Landsiedel, R

    2015-10-01

    The paper describes the importance of toxicology as a discipline, its past achievements, current scientific challenges, and future development. Toxicological expertise is instrumental in the reduction of human health risks arising from chemicals and drugs. Toxicological assessment is needed to evaluate evidence and arguments, whether or not there is a scientific base for concern. The immense success already achieved by toxicological work is exemplified by reduced pollution of air, soil, water, and safer working places. Predominantly predictive toxicological testing is derived from the findings to assess risks to humans and the environment. Assessment of the adversity of molecular effects (including epigenetic effects), the effects of mixtures, and integration of exposure and biokinetics into in vitro testing are emerging challenges for toxicology. Toxicology is a translational science with its base in fundamental science. Academic institutions play an essential part by providing scientific innovation and education of young scientists.

  17. Reproductive and developmental toxicology

    National Research Council Canada - National Science Library

    Gupta, Ramesh C

    2011-01-01

    .... Reproductive and Developmental Toxicology is a comprehensive and authoritative resource providing the latest literature enriched with relevant references describing every aspect of this area of science...

  18. Computerized adaptive testing in computer assisted learning?

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick

    2011-01-01

    A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) efficient measurement of the proficiency of students is the main focus. There seems to be a common interest to integrate computerized adaptive item selection in learning systems and

  19. Predictive Modeling and Computational Toxicology

    Science.gov (United States)

    Embryonic development is orchestrated via a complex series of cellular interactions controlling behaviors such as mitosis, migration, differentiation, adhesion, contractility, apoptosis, and extracellular matrix remodeling. Any chemical exposure that perturbs these cellular proce...

  20. Principles and procedures in forensic toxicology.

    Science.gov (United States)

    Wyman, John F

    2012-09-01

    The principles and procedures employed in a modern forensic toxicology lab are detailed in this review. Aspects of Behavioral and Postmortem toxicology, including certification of analysts and accreditation of labs, chain of custody requirements, typical testing services provided, rationale for specimen selection, and principles of quality assurance are discussed. Interpretation of toxicology results in postmortem specimens requires the toxicologist and pathologist to be cognizant of drug-drug interactions, drug polymorphisms and pharmacogenomics, the gross signs of toxic pathology, postmortem redistribution, confirmation of systemic toxicity in suspected overdoses, the possibility of developed tolerance, and the effects of decomposition on drug concentration.

  1. Genetic toxicology in the 21st century: Reflections and future ...

    Science.gov (United States)

    A symposium at the 40th anniversary of the Environmental Mutagen Society, held from October 24–28, 2009 in St. Louis, MO, surveyed the current status and future directions of genetic toxicology. This article summarizes the presentations and provides a perspective on the future. An abbreviated history is presented, highlighting the current standard battery of genotoxicity assays and persistent challenges. Application of computational toxicology to safety testing within a regulatory setting is discussed as a means for reducing the need for animal testing and human clinical trials, and current approaches and applications of in silico genotoxicity screening approaches across the pharmaceutical industry were surveyed and are reported here. The expanded use of toxicogenomics to illuminate mechanisms and bridge genotoxicity and carcinogenicity, and new public efforts to use high-throughput screening technologies to address lack of toxicity evaluation for the backlog of thousands of industrial chemicals in the environment are detailed. The Tox21 project involves coordinated efforts of four U.S. Government regulatory/research entities to use new and innovative assays to characterize key steps in toxicity pathways, including genotoxic and nongenotoxic mechanisms for carcinogenesis. Progress to date, highlighting preliminary test results from the National Toxicology Program is summarized. Finally, an overview is presented of ToxCast™, a related research program of the

  2. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  3. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

This paper copes with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of

  4. Computer Aided Education System SuperTest. Present and Prospective

    Directory of Open Access Journals (Sweden)

    2007-01-01

This paper analyzes the testing and self-testing process for the Computer Aided Education System (CAES) SuperTest, used at the Academy of Economic Studies of Chisinau, Moldova, and recently implemented at the University of Bacau, Romania. We discuss here the future of this software from the Information Society and Knowledge Society point of view.

  5. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751. High Performance Computing Modernization Program Kerberos Throughput Test Report. Daniel G. Gdula* and...

  6. Advancing Risk Assessment through the Application of Systems Toxicology

    Science.gov (United States)

    Sauer, John Michael; Kleensang, André; Peitsch, Manuel C.; Hayes, A. Wallace

    2016-01-01

    Risk assessment is the process of quantifying the probability of a harmful effect to individuals or populations from human activities. Mechanistic approaches to risk assessment have been generally referred to as systems toxicology. Systems toxicology makes use of advanced analytical and computational tools to integrate classical toxicology and quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Three presentations including two case studies involving both in vitro and in vivo approaches described the current state of systems toxicology and the potential for its future application in chemical risk assessment. PMID:26977253

  7. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

This paper copes with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project

  8. Toxicological assessment of enzyme-treated asparagus extract in rat acute and subchronic oral toxicity studies and genotoxicity tests.

    Science.gov (United States)

    Ito, Tomohiro; Ono, Tomoko; Sato, Atsuya; Goto, Kazunori; Miura, Takehito; Wakame, Koji; Nishioka, Hiroshi; Maeda, Takahiro

    2014-03-01

The safety of enzyme-treated asparagus extract (ETAS), developed as a novel anti-stress functional material, was assessed in acute and subchronic studies and genotoxicity assays. In the acute oral dose toxicity study, all rats survived during the test period, and ETAS did not influence clinical appearance, body weight gain or necropsy findings at a dosage of 2000 mg/kg body weight. Thus, the 50% lethal dose (LD50) of ETAS was determined to be greater than 2000 mg/kg. The 90-day subchronic study (500, 1000 and 2000 mg/kg body weight, delivered by gavage) in rats reported no significant adverse effects on food consumption, body weight, mortality, urinalysis, hematology, biochemistry, necropsy, organ weight or histopathology. In the micronucleus test in mice, the incidence of micronuclei in ETAS-administered groups (500, 1000 and 2000 mg/kg/day, injected twice) was equivalent to that of the negative control group, while the positive control group receiving mitomycin C showed a high incidence. The potential of ETAS to induce gene mutation was tested using four Salmonella typhimurium strains and Escherichia coli WP2uvrA; the test sample was not mutagenic to the test strains. These results support the safety of ETAS as a food and dietary supplement. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. An investigation of the genetic toxicology of irradiated food-stuffs using short-term test systems

    International Nuclear Information System (INIS)

    Renner, H.W.; Altmann, H.; Asquith, J.C.; Elias, P.S.

    1982-01-01

    Six in vivo genetic toxicity tests were carried out on irradiated or unirradiated cooked chicken, dried dates and cooked fish. The tests were as follows: sex-linked recessive lethal mutations in Drosophila melanogaster (dried dates only), chromosome aberrations in bone marrow of Chinese hamsters, micronucleus test in rats, mice and Chinese hamsters, sister-chromatid exchange in bone marrow of mice and Chinese hamsters and in spermatogonia of mice, and DNA metabolism in spleen cells of Chinese hamsters. None of the tests provided any evidence of genetic toxicity induced by irradiation. However, dried dates, whether irradiated or not, showed evidence of some genetic toxicity in their effect on DNA metabolism in spleen cells and SCE induction in bone marrow. Feeding irradiated fish affected DNA metabolism in the spleen cells of Chinese hamsters. This effect could be interpreted as an induction of an immunoactive compound, although it could also be explained by the persistence of an immunoactive compound due to the removal by irradiation of spoilage organisms that would normally degrade it. (author)

  10. A Computational Tool for Testing Dose-related Trend Using an Age-adjusted Bootstrap-based Poly-k Test

    Directory of Open Access Journals (Sweden)

    Hojin Moon

    2006-08-01

A computational tool for testing for a dose-related trend and/or a pairwise difference in the incidence of an occult tumor via an age-adjusted bootstrap-based poly-k test and the original poly-k test is presented in this paper. The poly-k test (Bailer and Portier 1988) is a survival-adjusted Cochran-Armitage test, which achieves robustness to the effects of differential mortality across dose groups. The original poly-k test is asymptotically standard normal under the null hypothesis; however, the asymptotic normality is not valid if there is a deviation from the tumor onset distribution assumed in the test. Our age-adjusted bootstrap-based poly-k test assesses the significance of the assumed asymptotic normal tests and investigates the empirical distribution of the original poly-k test statistic using an age-adjusted bootstrap method. A tumor of interest is an occult tumor, for which the time to onset is not directly observable. Since most animal carcinogenicity studies are designed with a single terminal sacrifice, the present tool is applicable to rodent tumorigenicity assays with a single terminal sacrifice. The tool takes input information from a user screen and reports testing results back to the screen through a user interface. The computational tool is implemented in C/C++ and is applied to analyze a real data set as an example. Our tool enables the FDA and the pharmaceutical industry to implement a statistical analysis of tumorigenicity data from animal bioassays via our age-adjusted bootstrap-based poly-k test and the original poly-k test, which has been adopted by the National Toxicology Program as its standard statistical test.
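
A minimal sketch of the survival adjustment at the heart of the poly-k test: an animal dying tumor-free at time t contributes (t/T)^k to the adjusted group size, a tumor-bearing animal contributes 1, and a Cochran-Armitage-style trend statistic is formed from the adjusted group sizes. The simple binomial variance below is an assumption for illustration, not the exact published statistic (and no bootstrap is shown):

```python
import math

def poly_k_trend(groups, doses, k=3, T=104.0):
    """groups: per dose group, a list of (death_week, has_tumor) pairs.
    T is the scheduled study length in weeks; returns an approx. N(0,1) statistic."""
    xs, ns = [], []
    for animals in groups:
        # survival-adjusted group size: tumor-free deaths get fractional weight
        ns.append(sum(1.0 if tumor else (t / T) ** k for t, tumor in animals))
        xs.append(sum(1 for _, tumor in animals if tumor))
    N, p = sum(ns), sum(xs) / sum(ns)
    num = sum(x * d for x, d in zip(xs, doses)) - p * sum(n * d for n, d in zip(ns, doses))
    var = p * (1 - p) * (sum(n * d * d for n, d in zip(ns, doses))
                         - sum(n * d for n, d in zip(ns, doses)) ** 2 / N)
    return num / math.sqrt(var)

# Hypothetical 2-year bioassay: control, low and high dose groups
g0 = [(104, False)] * 45 + [(80, True)] * 5
g1 = [(104, False)] * 40 + [(75, True)] * 10
g2 = [(90, False)] * 30 + [(70, True)] * 20
print(round(poly_k_trend([g0, g1, g2], doses=[0, 1, 2]), 2))
```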

  11. Repetitive Domain-Referenced Testing Using Computers: the TITA System.

    Science.gov (United States)

    Olympia, P. L., Jr.

    The TITA (Totally Interactive Testing and Analysis) System algorithm for the repetitive construction of domain-referenced tests utilizes a compact data bank, is highly portable, is useful in any discipline, requires modest computer hardware, and does not present a security problem. Clusters of related keyphrases, statement phrases, and distractors…

  12. Computer-Adaptive Testing in Second Language Contexts.

    Science.gov (United States)

    Chalhoub-Deville, Micheline; Deville, Craig

    1999-01-01

Provides a broad overview of computerized testing issues with an emphasis on computer-adaptive testing (CAT). A survey of the potential benefits and drawbacks of CAT is given, the process of CAT development is described, and some L2 instruments developed to assess various language skills are summarized. (Author/VWL)

  13. A computer-controlled automated test system for fatigue and fracture testing

    International Nuclear Information System (INIS)

    Nanstad, R.K.; Alexander, D.J.; Swain, R.L.; Hutton, J.T.; Thomas, D.L.

    1989-01-01

    A computer-controlled system consisting of a servohydraulic test machine, an in-house designed test controller, and a desktop computer has been developed for performing automated fracture toughness and fatigue crack growth testing both in the laboratory and in hot cells for remote testing of irradiated specimens. Both unloading compliance and dc-potential drop can be used to monitor crack growth. The test controller includes a dc-current supply programmer, a function generator for driving the servohydraulic test machine to required test outputs, five measurement channels (each consisting of low-pass filter, track/hold amplifier, and 16-bit analog-to-digital converter), and digital logic for various control and data multiplexing functions. The test controller connects to the computer via a 16-bit wide photo-isolated bidirectional bus. The computer, a Hewlett-Packard series 200/300, inputs specimen and test parameters from the operator, configures the test controller, stores test data from the test controller in memory, does preliminary analysis during the test, and records sensor calibrations, specimen and test parameters, and test data on flexible diskette for later recall and analysis with measured initial and final crack length information. During the test, the operator can change test parameters as necessary. 24 refs., 6 figs

  14. Metabonomics and toxicology.

    Science.gov (United States)

    Zhao, Liang; Hartung, Thomas

    2015-01-01

As an emerging field of "omics" research, metabonomics has been increasingly used in toxicological studies, mostly because it can provide more detailed information to elucidate mechanisms of toxicity. As an interdisciplinary field of science, metabonomics combines analytical chemistry, bioinformatics, statistics, and biochemistry. When applied to toxicology, metabonomics also includes aspects of patho-biochemistry, systems biology, and molecular diagnostics. During a toxicological study, the metabolic changes over time and dose after chemical treatment can be monitored. Therefore, the most important use of this emerging technology is the identification of signatures of toxicity: patterns of metabolic changes predictive of a hazard manifestation. This chapter summarizes the current state of metabonomics technology and its applications in various areas of toxicological study.

  15. Occupational medicine and toxicology

    Directory of Open Access Journals (Sweden)

    Fischer Axel

    2006-02-01

This editorial announces the Journal of Occupational Medicine and Toxicology, a new Open Access, peer-reviewed, online journal published by BioMed Central. Occupational medicine and toxicology are among the most wide-ranging disciplines of all medical specialties. The field is devoted to the diagnosis, prevention, management and scientific analysis of diseases from the fields of occupational and environmental medicine and toxicology. It also covers the promotion of occupational and environmental health. The complexity of modern industrial processes has changed dramatically over the past years, and today's areas include the effects of atmospheric pollution, carcinogenesis, biological monitoring, ergonomics, epidemiology, product safety and health promotion. We hope that the launch of the Journal of Occupational Medicine and Toxicology will aid the advance of these important areas of research by bringing together multi-disciplinary research findings.

  16. Radiographic test phantom for computed tomographic lung nodule analysis

    International Nuclear Information System (INIS)

    Zerhouni, E.A.

    1987-01-01

This patent describes a method for evaluating a computed tomographic scan of a nodule in the lung of a human or non-human animal. The method comprises generating a computed tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computed tomograph of a test phantom: a device that simulates the transverse section of the animal. The tissue-simulating portions of the device are constructed of materials whose radiographic densities are substantially identical to those of the corresponding tissues in the simulated transverse section, and contain voids that simulate, in size and shape, the lung cavities in the transverse section. The voids contain a test reference nodule, constructed of a material of predetermined radiographic density, which simulates in size, shape and position within a lung-cavity void the nodule in the transverse section of the animal. The respective tomographs are then compared.

  17. A semiautomated computer-interactive dynamic impact testing system

    International Nuclear Information System (INIS)

    Alexander, D.J.; Nanstad, R.K.; Corwin, W.R.; Hutton, J.T.

    1989-01-01

    A computer-assisted semiautomated system has been developed for testing a variety of specimen types under dynamic impact conditions. The primary use of this system is for the testing of Charpy specimens. Full-, half-, and third-size specimens have been tested, both in the lab and remotely in a hot cell for irradiated specimens. Specimens are loaded into a transfer device which moves the specimen into a chamber, where a hot air gun is used to heat the specimen, or cold nitrogen gas is used for cooling, as required. The specimen is then quickly transferred from the furnace to the anvils and then broken. This system incorporates an instrumented tup to determine the change in voltage during the fracture process. These data are analyzed by the computer system after the test is complete. The voltage-time trace is recorded with a digital oscilloscope, transferred to the computer, and analyzed. The analysis program incorporates several unique features. It interacts with the operator and identifies the maximum voltage during the test, the amount of rapid fracture during the test (if any), and the end of the fracture process. The program then calculates the area to maximum voltage and the total area under the voltage-time curve. The data acquisition and analysis part of the system can also be used to conduct other dynamic testing. Dynamic tear and precracked specimens can be tested with an instrumented tup and analyzed in a similar manner. 3 refs., 7 figs
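
A minimal sketch of the trace-analysis step described above — find the maximum of the voltage-time record, then integrate to that point and over the whole event (trapezoidal rule); the synthetic trace and names are illustrative:

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integration over sample points (plain NumPy, version-proof)."""
    y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

def analyze_trace(t_s, v_volts):
    """Return (area to maximum voltage, total area) of an instrumented-tup trace."""
    t, v = np.asarray(t_s), np.asarray(v_volts)
    i_max = int(np.argmax(v))
    return trapz(v[: i_max + 1], t[: i_max + 1]), trapz(v, t)

# Synthetic trace: ramp to a peak load, then rapid fracture and decay
t = np.linspace(0.0, 2e-3, 500)
v = np.where(t < 1e-3, t / 1e-3, np.exp(-(t - 1e-3) / 2e-4))
print(analyze_trace(t, v))
```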

  18. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

Admission selection at Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint admission selection test for state polytechnics (SB-UMPN) and the independent test (UM-Polbeng) was conducted using paper-based tests (PBT). The paper-based test model has some weaknesses: it wastes too much paper, questions can leak to the public, and test-result data can be manipulated. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, attention was paid to protecting the test questions before they are shown, through encryption and decryption; the RSA cryptography algorithm was used for this purpose. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle method. The network architecture used in the computer-based test application was a client-server model over a Local Area Network (LAN). The result of the design was a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
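
A minimal sketch of the Fisher-Yates shuffle named above, applied to a hypothetical question bank (the RSA handling of the question papers is omitted):

```python
import random

def fisher_yates(items, rng=None):
    """Fisher-Yates shuffle: uniform over all permutations, O(n)."""
    rng = rng or random.Random()
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = rng.randrange(i + 1)   # 0 <= j <= i
        a[i], a[j] = a[j], a[i]
    return a

questions = [f"Q{n:03d}" for n in range(1, 11)]   # hypothetical bank
print(fisher_yates(questions, random.Random(42)))
```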

  19. Test bank to accompany Computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1980-01-01

Test Bank to Accompany Computers and Data Processing provides a variety of questions from which instructors can easily custom-tailor exams appropriate for their particular courses. This book contains over 4000 short-answer questions that span the full range of topics for an introductory computing course. The book is organized into five parts encompassing 19 chapters. It provides a very large number of questions so that instructors can produce different exams testing essentially the same topics in succeeding semesters. Three types of questions are included in this book, including multiple ch

  20. 77 FR 40358 - Meeting of the Scientific Advisory Committee on Alternative Toxicological Methods (SACATM)

    Science.gov (United States)

    2012-07-09

DEPARTMENT OF HEALTH AND HUMAN SERVICES. Meeting of the Scientific Advisory Committee on Alternative Toxicological Methods (SACATM). AGENCY: Division of the National Toxicology Program (DNTP... ...revised, and alternative safety testing methods with regulatory applicability and promotes the scientific...

  1. Comparative analysis of perturbed molecular pathways identified in in vitro and in vivo toxicology studies

    NARCIS (Netherlands)

    Wiesinger, Martin; Mayer, Bernd; Jennings, Paul; Lukas, Arno

The development of in vitro toxicological testing strategies is hampered by the difficulty of extrapolation to the intact organism. The academic toxicological literature contains a wealth of mechanistically rich information, especially arising from omic studies, which could potentially be utilized to

  2. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  3. Ethanol Forensic Toxicology.

    Science.gov (United States)

    Perry, Paul J; Doroudgar, Shadi; Van Dyke, Priscilla

    2017-12-01

    Ethanol abuse can lead to negative consequences that oftentimes result in criminal charges and civil lawsuits. When an individual is suspected of driving under the influence, law enforcement agents can determine the extent of intoxication by measuring the blood alcohol concentration (BAC) and performing a standardized field sobriety test. The BAC is dependent on rates of absorption, distribution, and elimination, which are influenced mostly by the dose of ethanol ingested and rate of consumption. Other factors contributing to BAC are gender, body mass and composition, food effects, type of alcohol, and chronic alcohol exposure. Because of individual variability in ethanol pharmacology and toxicology, careful extrapolation and interpretation of the BAC is needed, to justify an arrest and assignment of criminal liability. This review provides a summary of the pharmacokinetic properties of ethanol and the clinical effects of acute intoxication as they relate to common forensic questions. Concerns regarding the extrapolation of BAC and the implications of impaired memory caused by alcohol-induced blackouts are discussed. © 2017 American Academy of Psychiatry and the Law.
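
    For a concrete feel for the BAC extrapolation discussed above, a minimal sketch of the classic Widmark relation follows. The function name, parameter values and example figures are illustrative only; the model ignores absorption kinetics and the individual variability the review emphasizes.

        def widmark_bac(alcohol_g, body_weight_kg, r, hours_since_drinking, beta=0.15):
            """Estimate blood alcohol concentration with the classic Widmark formula.

            Returns BAC in g/kg (per mille).  r is the Widmark distribution factor
            (roughly 0.68 for men, 0.55 for women) and beta the hourly elimination
            rate (about 0.10-0.20 per mille per hour).
            """
            c0 = alcohol_g / (r * body_weight_kg)        # peak concentration, g/kg
            return max(0.0, c0 - beta * hours_since_drinking)

        # Example: 40 g of ethanol (~3 standard drinks), 80 kg male, 2 h after drinking
        print(round(widmark_bac(40, 80, 0.68, 2), 2))    # ~0.44 per mille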

  4. Comparison of tests of accommodation for computer users.

    Science.gov (United States)

    Kolker, David; Hutchinson, Robert; Nilsen, Erik

    2002-04-01

    With the increased use of computers in the workplace and at home, optometrists are finding more patients presenting with symptoms of Computer Vision Syndrome. Among these symptomatic individuals, research indicates that accommodative disorders are the most common vision finding. A prepresbyopic group (N = 30) and a presbyopic group (N = 30) were selected from a private practice. Assignment to a group was determined by age, accommodative amplitude, and near visual acuity with their distance prescription. Each subject was given a thorough vision and ocular health examination, then administered several nearpoint tests of accommodation at a computer working distance. All the tests produced similar results in the presbyopic group. For the prepresbyopic group, the tests yielded very different results. To effectively treat symptomatic VDT users, optometrists must assess the accommodative system along with the binocular and refractive status. For presbyopic patients, all nearpoint tests studied will yield virtually the same result. However, the method of testing accommodation, as well as the test stimulus presented, will yield significantly different responses for prepresbyopic patients. Previous research indicates that a majority of patients prefer the higher plus prescription yielded by the Gaussian image test.

  5. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first-year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated that many students accepted CBT without unpleasantness and considered it a positive factor that improved their motivation to study. CBT also decreased faculty workload in marking tests and handling data.

  6. Specifying colours for colour vision testing using computer graphics.

    Science.gov (United States)

    Toufeeq, A

    2004-10-01

    This paper describes a novel test of colour vision using a standard personal computer, which is simple and reliable to perform. Twenty healthy individuals with normal colour vision and 10 healthy individuals with a red/green colour defect were tested binocularly at 13 selected points in the CIE (Commission Internationale de l'Eclairage, 1931) chromaticity triangle, representing the gamut of a computer monitor, where the x, y coordinates of the primary colour phosphors were known. The mean results from individuals with normal colour vision were compared to those with defective colour vision. Of the 13 points tested, five demonstrated consistently high sensitivity in detecting colour defects. The test may provide a convenient method for classifying colour vision abnormalities.
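
    A minimal sketch of the colour specification underlying such a test is shown below; it maps a CIE 1931 chromaticity (x, y) at a given luminance to displayable RGB values. Note the assumption: the standard sRGB primaries and matrix are used here, whereas the study used the measured phosphor coordinates of its own monitor.

        import numpy as np

        # Linear-sRGB matrix from CIE XYZ (IEC 61966-2-1, D65 white point)
        XYZ_TO_RGB = np.array([[ 3.2406, -1.5372, -0.4986],
                               [-0.9689,  1.8758,  0.0415],
                               [ 0.0557, -0.2040,  1.0570]])

        def xyY_to_rgb(x, y, Y=0.5):
            """Convert a CIE 1931 chromaticity (x, y) at luminance Y to sRGB in [0, 1]."""
            X = x * Y / y
            Z = (1.0 - x - y) * Y / y
            rgb = XYZ_TO_RGB @ np.array([X, Y, Z])
            rgb = np.clip(rgb, 0.0, 1.0)           # out-of-gamut values are clipped
            # gamma-encode for display
            return np.where(rgb <= 0.0031308, 12.92 * rgb,
                            1.055 * rgb ** (1 / 2.4) - 0.055)

        print(xyY_to_rgb(0.3127, 0.3290))          # D65 white -> roughly equal R, G, B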

  7. Good cell culture practices & in vitro toxicology.

    Science.gov (United States)

    Eskes, Chantra; Boström, Ann-Charlotte; Bowe, Gerhard; Coecke, Sandra; Hartung, Thomas; Hendriks, Giel; Pamies, David; Piton, Alain; Rovida, Costanza

    2017-12-01

    Good Cell Culture Practices (GCCP) is of high relevance to in vitro toxicology. The European Society of Toxicology In Vitro (ESTIV), the Center for Alternatives for Animal Testing (CAAT) and the In Vitro Toxicology Industrial Platform (IVTIP) joined forces to address the different aspects and applications of GCCP in an ESTIV 2016 pre-congress session. The aspects covered comprised the current status of the OECD guidance document on Good In Vitro Method Practices, the importance of quality assurance for new technological advances in in vitro toxicology including stem cells, and the optimized implementation of Good Manufacturing Practices and Good Laboratory Practices for regulatory testing purposes. General discussions raised the duality between the difficulties of implementing GCCP in an academic innovative research framework on the one hand, and, on the other, the need for such GCCP principles to ensure the reproducibility and robustness of in vitro test methods for toxicity testing. Indeed, while good cell culture principles are critical for all uses of in vitro test methods for toxicity testing, the level at which such principles are applied may depend on the stage of development of the test method as well as on its application, i.e., academic innovative research vs. regulatory standardized test methods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Digital computed radiography in industrial X-ray testing

    International Nuclear Information System (INIS)

    Osterloh, K.; Onel, Y.; Zscherpel, U.; Ewert, U.

    2001-01-01

    Computed radiography is used for X-ray testing in many industrial applications. There are different systems depending on the application, e.g. fast systems for the detection of material inhomogeneities and slower systems with higher spatial resolution for the detection of cracks and fine details, e.g. in highly stressed areas or in welded seams. The method has a higher dynamic range than film methods, and digital image processing is possible during testing

  9. Sample test cases using the environmental computer code NECTAR

    International Nuclear Information System (INIS)

    Ponting, A.C.

    1984-06-01

    This note demonstrates a few of the many different ways in which the environmental computer code NECTAR may be used. Four sample test cases are presented and described to show how NECTAR input data are structured. Edited output is also presented to illustrate the format of the results. Two test cases demonstrate how NECTAR may be used to study radio-isotopes not explicitly included in the code. (U.K.)

  10. Security Considerations and Recommendations in Computer-Based Testing

    Directory of Open Access Journals (Sweden)

    Saleh M. Al-Saleem

    2014-01-01

    Full Text Available Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of the examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  11. Security considerations and recommendations in computer-based testing.

    Science.gov (United States)

    Al-Saleem, Saleh M; Ullah, Hanif

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of the examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee.
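
    The username/password factor described above can be sketched with standard password hashing. The illustrative Python below uses PBKDF2 with a per-user salt and a constant-time comparison; it stands in for, rather than reproduces, the authors' proposed system, and the palm-based biometric factor is outside its scope.

        import hashlib, hmac, os

        def hash_password(password, salt=None):
            """Derive a storable hash for the password factor (PBKDF2-HMAC-SHA256)."""
            salt = salt or os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
            return salt, digest

        def verify_password(password, salt, stored):
            """Constant-time comparison of a candidate password against the stored hash."""
            _, candidate = hash_password(password, salt)
            return hmac.compare_digest(candidate, stored)

        salt, stored = hash_password("examinee-secret")
        print(verify_password("examinee-secret", salt, stored))   # True
        print(verify_password("wrong-guess", salt, stored))       # False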

  12. A Look at Computer-Assisted Testing Operations. The Illinois Series on Educational Application of Computers, No. 12e.

    Science.gov (United States)

    Muiznieks, Viktors; Dennis, J. Richard

    In computer assisted test construction (CATC) systems, the computer is used to perform the mechanical aspects of testing while the teacher retains control over question content. Advantages of CATC systems include question banks, decreased importance of test item security, computer analysis and response to student test answers, item analysis…

  13. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive...
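
    A minimal sketch of adaptive item selection follows. As a simplifying assumption it uses a unidimensional two-parameter logistic (2PL) model and maximum Fisher information, whereas the article scores the NEO PI-R facets with a multidimensional IRT model; the item parameters are invented.

        import math

        def information_2pl(theta, a, b):
            """Fisher information of a two-parameter logistic item at ability theta."""
            p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
            return a * a * p * (1.0 - p)

        def next_item(theta, items, administered):
            """Pick the unused item with maximum information at the current estimate."""
            candidates = [i for i in range(len(items)) if i not in administered]
            return max(candidates, key=lambda i: information_2pl(theta, *items[i]))

        items = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.3), (1.0, 1.2)]   # invented (a, b)
        print(next_item(0.25, items, administered={2}))             # most informative item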

  14. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  15. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  16. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  17. COMPUTATION FORMAT computer codes X4TOC4 and PLOTC4. Implementing and Testing on a Personal Computer

    International Nuclear Information System (INIS)

    McLaughlin, P.K.

    1987-05-01

    This document describes the contents of the diskette containing the COMPUTATION FORMAT codes X4TOC4 and PLOTC4 by D.E. Cullen, and example data for use in implementing and testing these codes on a Personal Computer of the type IBM-PC/AT. Upon request the codes are available from the IAEA Nuclear Data Section, free of charge, on a single diskette. (author)

  18. Communicative Language Testing: Implications for Computer Based Language Testing in French for Specific Purposes

    Science.gov (United States)

    García Laborda, Jesús; López Santiago, Mercedes; Otero de Juan, Nuria; Álvarez Álvarez, Alfredo

    2014-01-01

    Current evolutions of language testing have led to integrating computers in FSP assessments both in oral and written communicative tasks. This paper deals with two main issues: learners' expectations about the types of questions in FSP computer based assessments and the relation with their own experience. This paper describes the experience of 23…

  19. Aerospace Toxicology and Microbiology

    Science.gov (United States)

    James, John T.; Parmet, A. J.; Pierson, Duane L.

    2007-01-01

    Toxicology dates to the very earliest history of humanity, with various poisons and venoms being recognized as a method of hunting or waging war, with the earliest documentation in the Ebers papyrus (circa 1500 BCE). The Greeks identified specific poisons such as hemlock, a method of state execution, and the Greek word toxos (arrow) became the root of our modern science. The first scientific approach to the understanding of poisons and toxicology was the work during the late middle ages of Paracelsus. He formulated what were then revolutionary views, that a specific toxic agent or "toxicon" caused specific dose-related effects. His principles have established the basis of modern pharmacology and toxicology. In 1700, Bernardo Ramazzini published the book De Morbis Artificum Diatriba (The Diseases of Workers), describing specific illnesses associated with certain labor, particularly metal workers exposed to mercury, lead, arsenic, and rock dust. Modern toxicology dates from the development of modern industrial chemical processes, the earliest involving an analytical method for arsenic by Marsh in 1836. Industrial organic chemicals were synthesized in the late 1800s, along with anesthetics and disinfectants. In 1908, Hamilton began the long study of occupational toxicology issues, and by WW I the scientific use of toxicants saw Haber creating war gases and defining time-dosage relationships that are used even today.

  20. Recent developments in analytical toxicology : for better or for worse

    NARCIS (Netherlands)

    de Zeeuw, RA

    1998-01-01

    When considering the state of the art in toxicology from an analytical perspective, the key developments relate to three major areas. (1) Forensic horizon: Today forensic analysis has broadened its scope dramatically, to include workplace toxicology, drug abuse testing, drugs and driving, doping,

  1. Computer Based Test Untuk Seleksi Masuk Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Full Text Available The selection of new student candidates can be carried out with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, a design model, implementation and testing. This research produced a CBT application in which the questions presented from the question bank are randomized with the Fisher-Yates Shuffle method, so that the same question does not appear twice. To secure the question data while connected to the network, a message-encoding technique is needed so that the questions pass through an encryption and decryption process before being displayed; the RSA cryptographic algorithm is used for this. The software design method uses the waterfall model, the database design uses entity relationship diagrams, and the interface design uses Hypertext Markup Language (HTML), Cascading Style Sheets (CSS) and jQuery; the application is implemented on the web using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server network model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network

  2. Computational analysis in support of the SSTO flowpath test

    Science.gov (United States)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-10-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.

  3. Animal-free toxicology

    DEFF Research Database (Denmark)

    Knudsen, Lisbeth E

    2013-01-01

    Human data on exposure and adverse effects are the most appropriate for human risk assessment, and modern toxicology focuses on human pathway analysis and the development of human biomarkers. Human biomonitoring and human placental transport studies provide necessary information for human risk assessment, in accordance with the legislation on chemical, medicine and food safety. Toxicology studies based on human mechanistic and exposure information can replace animal studies. These animal-free approaches can be further supplemented by new in silico methods and chemical structure-activity relationships. The inclusion of replacement expertise in the international Three Rs centres, the ongoing exploration of alternatives to animal research, and the improvement of conditions for research animals, all imply the beginning of a paradigm shift in toxicology research toward the use of human data.

  4. Non-precautionary aspects of toxicology

    International Nuclear Information System (INIS)

    Grandjean, Philippe

    2005-01-01

    Empirical studies in toxicology aim at deciphering complex causal relationships, especially in regard to human disease etiologies. Several scientific traditions limit the usefulness of documentation from current toxicological research, in regard to decision-making based on the precautionary principle. Among non-precautionary aspects of toxicology are the focus on simplified model systems and the effects of single hazards, one by one. Thus, less attention is paid to sources of variability and uncertainty, including individual susceptibility, impacts of mixed and variable exposures, susceptible life-stages, and vulnerable communities. In emphasizing the need for confirmatory evidence, toxicology tends to penalize false positives more than false negatives. An important source of uncertainty is measurement error that results in misclassification, especially in regard to exposure assessment. Standard statistical analysis assumes that the exposure is measured without error, and imprecisions will usually result in an underestimation of the dose-effect relationship. In testing whether an effect could be considered a possible result of natural variability, a 5% limit for 'statistical significance' is usually applied, even though it may rule out many findings of causal associations, simply because the study was too small (and thus lacked statistical power) or because some imprecision or limited sensitivity of the parameters precluded a more definitive observation. These limitations may be aggravated when toxicology is influenced by vested interests. Because current toxicology overlooks the important goal of achieving a better characterization of uncertainties and their implications, research approaches should be revised and strengthened to counteract the innate ideological biases, thereby supporting our confidence in using toxicology as a main source of documentation and in using the precautionary principle as a decision procedure in the public policy arena
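
    The attenuation effect of exposure measurement error noted above can be demonstrated in a few lines. The simulation below, with invented variances and a true slope of 0.5, shows the estimated dose-effect slope shrinking as the error grows.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000
        true_exposure = rng.normal(0.0, 1.0, n)            # true dose, unit variance
        effect = 0.5 * true_exposure + rng.normal(0.0, 1.0, n)

        for noise_sd in (0.0, 0.5, 1.0):                   # increasing measurement error
            measured = true_exposure + rng.normal(0.0, noise_sd, n)
            slope = np.polyfit(measured, effect, 1)[0]
            # classical attenuation: slope shrinks by var(x) / (var(x) + var(error))
            print(f"error sd = {noise_sd:.1f}  estimated slope = {slope:.3f}")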

  5. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    Full Text Available The purpose of many tests in educational and psychological measurement is to estimate test takers' latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper and pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for low measurement accuracy and long test length. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field, and researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. The same is not true for the latter: since ca-MST is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview for researchers, with the aim of drawing their attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
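
    A minimal sketch of one routing method follows; it uses simple number-correct cutoffs to send an examinee to an easier or harder next-stage module. The cutoff values and the panel shape are invented for illustration, and ca-MST programs may instead route on IRT ability estimates.

        def route(number_correct, cutoffs=(3, 7)):
            """Route an examinee after a stage from the number-correct score.

            Below the lower cutoff -> easy module, at or above the upper -> hard,
            otherwise the medium module.  Cutoffs here are made-up values.
            """
            low, high = cutoffs
            if number_correct < low:
                return "easy"
            if number_correct >= high:
                return "hard"
            return "medium"

        # A 1-3-3 panel: one routing module, then three modules in each later stage
        for score in (2, 5, 9):
            print(score, "->", route(score))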

  6. Green Toxicology-Know Early About and Avoid Toxic Product Liabilities.

    Science.gov (United States)

    Maertens, Alexandra; Hartung, Thomas

    2018-02-01

    Toxicology, uniquely among the life sciences, relies largely on methods that are more than 40 years old. Over the last three decades, some additions to, and a few replacements in, this toolbox have taken place with varying success, mainly as alternatives to animal testing. The acceptance of such new approaches faces the need for formal validation and a conservative attitude toward change in safety assessments. Only recently has there been growing awareness that the same alternative methods, especially in silico and in vitro tools, can also inform decision-making much earlier in the product life cycle, before validation. As similar thoughts developed in the context of Green Chemistry, the term Green Toxicology was coined to describe this change in approach. Here, the current developments in the alternatives field, especially computational tools and more organo-typic cell cultures, are reviewed, as they lend themselves to front-loaded chemical safety assessments. The initiatives of the Center for Alternatives to Animal Testing Green Toxicology Collaboration are presented. They aim first of all to form a community to promote this concept, and then to bring about a cultural change in companies with the necessary training of chemists, product stewards and, later, regulators. © The Author 2017. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Shuttle Lesson Learned - Toxicology

    Science.gov (United States)

    James, John T.

    2010-01-01

    This is a script for a video about toxicology and the space shuttle. The first segment deals with dust in the space vehicle. The next segment will be about archival samples. Then we'll look at real-time on-board analyzers that give us a lot of capability in terms of monitoring for combustion products and the ability to monitor volatile organics on the station. Finally we will look at other issues that are about setting limits and dealing with ground-based lessons that pertain to toxicology.

  8. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets ( http://compms.org/RefData ) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  9. The EPA Comptox Chemistry Dashboard: A Web-Based Data Integration Hub for Toxicology Data (SOT)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  10. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo

    2010-06-01

    Full Text Available The growth in the computer forensic field has created a demand for new software (or increased functionality of existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work, a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  11. Toxicología Vegetal

    OpenAIRE

    García Fernández, Antonio Juan

    2010-01-01

    Class presentations for the Plant Toxicology topics of the veterinary degree at the Universidad de Murcia, academic year 2011/12. Plant Toxicology presentations from the Toxicology course of the veterinary degree, 2011/12.

  12. Predictive Toxicology: Modeling Chemical Induced Toxicological Response Combining Circular Fingerprints with Random Forest and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Alexios eKoutsoukas

    2016-03-01

    Full Text Available Modern drug discovery and toxicological research are under pressure, as the cost of developing and testing new chemicals for potential toxicological risk is rising. Extensive evaluation of chemical products for potential adverse effects is a challenging task, due to the large number of chemicals and the possible hazardous effects on human health. Safety regulatory agencies around the world are dealing with two major challenges: first, the growth in the number of chemicals introduced every year in household products and medicines that need to be tested, and second, the need to protect public welfare. Hence, alternative and more efficient toxicological risk assessment methods are in high demand. The Toxicology in the 21st Century (Tox21) consortium, a collaborative effort, was formed to develop and investigate alternative assessment methods. A collection of 10,000 compounds composed of environmental chemicals and approved drugs were screened for interference in biochemical pathways and released for crowdsourced data analysis. The physicochemical space covered by the Tox21 library was explored, measured by Molecular Weight (MW) and the octanol/water partition coefficient (cLogP). It was found that on average chemical structures had a MW of 272.6 Daltons; for cLogP the average value was 2.476. Next, relationships between assays were examined based on compounds' activity profiles across the assays, using the Pearson correlation coefficient r. A cluster was observed between the Androgen and Estrogen Receptors and their ligand binding domains, indicating the presence of cross-talk among the receptors. The highest correlations observed were between NR.AR and NR.AR_LBD, where r = 0.66, and between NR.ER and NR.ER_LBD, where r = 0.5. Our approach to modeling the Tox21 data consisted of utilizing circular molecular fingerprints combined with Random Forest and Support Vector Machine, modeling each assay independently. In all of the 12 sub-challenges our modeling
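
    The modeling pipeline described above can be sketched compactly. The fragment below computes Morgan (circular) fingerprints with RDKit and fits a Random Forest, assuming RDKit and scikit-learn are available; the SMILES strings and labels are toy stand-ins, not Tox21 data.

        import numpy as np
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem
        from sklearn.ensemble import RandomForestClassifier

        def circular_fingerprint(smiles, n_bits=2048, radius=2):
            """ECFP-like Morgan (circular) fingerprint as a numpy array."""
            mol = Chem.MolFromSmiles(smiles)
            fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)
            arr = np.zeros((n_bits,), dtype=np.int8)
            DataStructs.ConvertToNumpyArray(fp, arr)
            return arr

        # Toy stand-ins for one assay: SMILES with 0/1 activity labels
        smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
        labels = [0, 1, 1, 0]

        X = np.vstack([circular_fingerprint(s) for s in smiles])
        model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, labels)
        print(model.predict_proba(X)[:, 1])        # in-sample activity probabilities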

  13. Scheduling and recording of reactor maintenance and testing by computer

    International Nuclear Information System (INIS)

    Gray, P.L.

    1975-01-01

    The use of a computer program, Maintenance Information and Control (MIAC), at the Savannah River Laboratory (SRL) assists a small operating staff in maintaining three research reactors and a subcritical facility. The program schedules and defines preventive maintenance, schedules required periodic tests, logs repair and cost information, specifies custodial and service responsibilities, and provides equipment maintenance history, all with a minimum of record-keeping

  14. Computer tomography of flows external to test models

    Science.gov (United States)

    Prikryl, I.; Vest, C. M.

    1982-01-01

    Computer tomographic techniques for the reconstruction of three-dimensional aerodynamic density fields from interferograms recorded from several different viewing directions were studied. Emphasis is on the case in which an opaque object, such as a test model in a wind tunnel, obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and the analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  15. Computer-aided system for interactive psychomotor testing

    Science.gov (United States)

    Selivanova, Karina G.; Ignashchuk, Olena V.; Koval, Leonid G.; Kilivnik, Volodymyr S.; Zlepko, Alexandra S.; Sawicki, Daniel; Kalizhanova, Aliya; Zhanpeisova, Aizhan; Smailova, Saule

    2017-08-01

    Nowadays, research on psychomotor actions has taken a special place in education, sports, medicine, psychology, etc. The development of a computer system for psychomotor testing could help solve many operational problems in psychoneurology and psychophysiology and also determine the individual characteristics of fine motor skills. This is a particularly relevant issue when it comes to children, students and athletes, for the definition of personal and professional features. The article presents the dynamics of developing psychomotor skills and their application in the training process. The results of testing indicated their significant impact on the development of psychomotor skills.

  16. Toxicological aspects of water

    International Nuclear Information System (INIS)

    Garcia Puertas, P.

    1991-01-01

    Different toxicological aspects of water have been studied, with attention to the activity of various chemical substances in the organism. These substances are divided into: trace metals (Sb, As, Cd, Zn, Cu, Cr, Fe, Mn, Hg, Ni, Pb, Se), other contaminants (CN-, polycyclic aromatic hydrocarbons, phenols, pesticides, detergents) and radioactivity. Finally, some considerations on this subject are made

  17. National toxicology program chemical nomination and selection process

    Energy Technology Data Exchange (ETDEWEB)

    Selkirk, J.K. [National Institute of Environmental Health Sciences, Research Triangle Park, NC (United States)

    1990-12-31

    The National Toxicology Program (NTP) was organized to support national public health programs by initiating research designed to understand the physiological, metabolic, and genetic basis of chemical toxicity. The primary mandated responsibilities of the NTP were in vivo and in vitro toxicity testing of potentially hazardous chemicals; broadening the spectrum of toxicological information on known hazardous chemicals; validating current toxicological assay systems as well as developing new and innovative toxicity testing technology; and rapidly communicating test results to government agencies with regulatory responsibilities and to the medical and scientific communities. 2 figs.

  18. Phytochemical Screening, Antibacterial and Toxicological Activities ...

    African Journals Online (AJOL)

    The phytochemical screening, antibacterial and toxicological activities of extracts of the stem bark of Acacia senegal were investigated. The phytochemical analyses according to standard screening tests using conventional protocols revealed the presence of tannins, saponins and sterols in the stem bark of the plant.

  19. Students Perception on the Use of Computer Based Test

    Science.gov (United States)

    Nugroho, R. A.; Kusumawati, N. S.; Ambarwati, O. C.

    2018-02-01

    Teaching nowadays may use technology to disseminate science and knowledge. As part of teaching, the way study progress and results are evaluated has also benefited from this rapid progress in IT. The computer-based test (CBT) has been introduced to replace the more conventional Paper and Pencil Test (PPT). CBT is considered more advantageous than PPT: it is more efficient, more transparent, and better able to minimise fraud in cognitive evaluation. Current studies reflect an ongoing debate over CBT versus PPT usage. Most of the current research compares the two methods without exploring students' perceptions of the test. This study fills that gap in the literature by providing students' perceptions of the two test methods. A survey approach was used to obtain the data. The sample was collected in two identical classes on a similar subject in a public university in Indonesia. The Mann-Whitney U test was used to analyse the data. The result indicates that there is a significant difference between the two groups of students regarding CBT usage: students preferred the test method other than the one they were given. Further discussion and research implications are presented in the paper.
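
    A minimal sketch of the analysis named above, assuming SciPy is available and using invented rating data in place of the survey responses:

        from scipy.stats import mannwhitneyu

        # Hypothetical 1-5 preference ratings from a CBT group and a PPT group
        cbt_group = [4, 5, 3, 4, 4, 5, 2, 4]
        ppt_group = [3, 2, 4, 2, 3, 3, 2, 1]

        u_stat, p_value = mannwhitneyu(cbt_group, ppt_group, alternative="two-sided")
        print(f"U = {u_stat}, p = {p_value:.4f}")  # p < 0.05 -> groups differ significantly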

  20. Interlaboratory computational comparisons of critical fast test reactor pin lattices

    International Nuclear Information System (INIS)

    Mincey, J.F.; Kerr, H.T.; Durst, B.M.

    1979-01-01

    An objective of the Consolidated Fuel Reprocessing Program's (CFRP) nuclear engineering group at Oak Ridge National Laboratory (ORNL) is to ensure that chemical equipment components designed for the reprocessing of spent LMFBR fuel (among other fuel types) are safe from a criticality standpoint. As existing data are inadequate for the general validation of computational models describing mixed plutonium-uranium oxide systems with isotopic compositions typical of LMFBR fuel, a program of critical experiments has been initiated at the Battelle Pacific Northwest Laboratories (PNL). The first series of benchmark experiments consisted of five square-pitched lattices of unirradiated Fast Test Reactor (FTR) fuel moderated and reflected by light water. Calculations of these five experiments have been conducted by both ORNL/CFRP and PNL personnel with the purpose of exploring how accurately various computational models will predict k-eff values for such neutronic systems, and whether differences between k-eff values obtained with these different models are significant

  1. Toxicological perspectives of inhaled therapeutics and nanoparticles.

    Science.gov (United States)

    Hayes, Amanda J; Bakand, Shahnaz

    2014-07-01

    The human respiratory system is an important route for the entry of inhaled therapeutics into the body to treat diseases. Inhaled materials may consist of gases, vapours, aerosols and particulates. In all cases, assessing the toxicological effect of inhaled therapeutics has many challenges. This article provides an overview of in vivo and in vitro models for testing the toxicity of inhaled therapeutics and of the nanoparticles implemented in drug delivery. Traditionally, inhalation toxicity testing has been performed on animals to identify the median lethal concentration of airborne materials. Later, the maximum tolerable concentration, denoted LC0, was introduced as a more ethically acceptable endpoint. More recently, in vitro methods have been developed that allow the direct exposure of cultured human target cells to airborne material on permeable porous membranes at the air-liquid interface. Modifications of current inhalation therapies, new pulmonary medications for respiratory diseases and use of the respiratory tract for systemic drug delivery are providing new challenges for conducting well-designed inhalation toxicology studies. In particular, the area of nanoparticles and nanocarriers is of critical toxicological concern. There is a need to develop toxicological test models which characterise the toxic response and cellular interaction between inhaled particles and the respiratory system.

  2. Describing of elements IO field in a testing computer program

    Directory of Open Access Journals (Sweden)

    Igor V. Loshkov

    2017-01-01

    Full Text Available A standard for describing the process of displaying interactive windows on a computer monitor, through which the output of questions and the input of answers are implemented during computer testing, is presented in the article [11]. According to the proposed standard, the description of the process mentioned above is performed with a format line containing element names, their parameters, and grouping and auxiliary symbols. Program objects are described using elements of the standard; the majority of these objects create input and output windows on a computer monitor. The aim of our research was to develop the smallest possible set of elements of the standard needed for testing in mathematics and computer science. The selection of elements of the standard was carried out in parallel with the development and testing of the program that uses them. This approach made it possible to choose a sufficiently complete set of elements for testing in the fields of study mentioned above. For the proposed elements, names were selected so that, firstly, they indicate their function and, secondly, they coincide with the names of functionally similar elements in other programming languages. Parameters, their names, their assignments and accepted values are proposed for the elements. The principle of name selection for the parameters was the same as for the elements of the standard: the names should correspond to their assignments or coincide with the names of similar parameters in other programming languages. The parameters define the properties of objects. In particular, while the elements of the standard create windows, the parameters define object properties (location, size, appearance) and the sequence in which windows are created. All elements of the standard proposed in this article are composed in a table, the columns of which give the names and functions of these elements. Inside the table, the elements of the standard are grouped row by row into four sets: input elements, output elements, input

  3. Programs for Testing Processor-in-Memory Computing Systems

    Science.gov (United States)

    Katz, Daniel S.

    2006-01-01

    The Multithreaded Microbenchmarks for Processor-In-Memory (PIM) Compilers, Simulators, and Hardware are computer programs arranged in a series for use in testing the performances of PIM computing systems, including compilers, simulators, and hardware. The programs at the beginning of the series test basic functionality; the programs at subsequent positions in the series test increasingly complex functionality. The programs are intended to be used while designing a PIM system, and can be used to verify that compilers, simulators, and hardware work correctly. The programs can also be used to enable designers of these system components to examine tradeoffs in implementation. Finally, these programs can be run on non-PIM hardware (either single-threaded or multithreaded) using the POSIX pthreads standard to verify that the benchmarks themselves operate correctly. [POSIX (Portable Operating System Interface for UNIX) is a set of standards that define how programs and operating systems interact with each other. pthreads is a library of pre-emptive thread routines that comply with one of the POSIX standards.

  4. Impact of online toxicology training on health professionals: the Global Educational Toxicology Uniting Project (GETUP).

    Science.gov (United States)

    Wong, Anselm; Vohra, Rais; Dawson, Andrew H; Stolbach, Andrew

    2017-11-01

    The Global Educational Toxicology Uniting Project (GETUP), supported by the American College of Medical Toxicology, links countries with and without toxicology services via distance education with the aim to improve education. Due to the lack of toxicology services in some countries there is a knowledge gap in the management of poisonings. We describe our experience with the worldwide delivery of an online introductory toxicology curriculum to emergency doctors and other health professionals treating poisoned patients. We delivered a 15-module introductory Internet-based toxicology curriculum to emergency doctors and health professionals, conducted from August to December 2016. This Internet-based curriculum was adapted from one used to teach emergency residents toxicology in the United States. Modules covered themes such as pharmaceutical (n = 8), toxidromes (n = 2) and agrochemicals (n = 5) poisoning. Participants completed pre-test and post-test multiple choice questions (MCQs) before and after completing the online module, respectively, throughout the course. We collected information on participant demographics, education and training, and perception of relevance of the curriculum. Participants gave feedback on the course and how it affected their practice. One hundred and thirty-six health professionals from 33 countries participated in the course: 98 emergency doctors/medical officers, 25 physicians, eight pharmacists/poisons information specialists, two toxicologists, two medical students and one nurse. Median age of participants was 34 years. Median number of years postgraduate was seven. Ninety (65%) had access to either a poisons information centre over the phone or toxicologist and 48 (35%) did not. All participants expected the course to help improve their knowledge. Overall median pre-module MCQ scores were 56% (95%CI: 38, 75%) compared to post-module MCQ scores median 89% (95% CI: 67, 100%) (p education to health professionals treating

  5. Testing an extrapolation chamber in computed tomography standard beams

    Science.gov (United States)

    Castro, M. C.; Silva, N. F.; Caldas, L. V. E.

    2018-03-01

    Computed tomography (CT) is responsible for the highest dose values delivered to patients. Therefore, the radiation doses in this procedure must be accurately determined. However, there is no primary standard system for this kind of radiation beam yet. In the search for a CT primary standard, an extrapolation ionization chamber built at the Calibration Laboratory (LCI) of the Instituto de Pesquisas Energéticas e Nucleares (IPEN) was tested in this work. The results were within the internationally recommended limits.
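
    The measurement principle of an extrapolation chamber lends itself to a short sketch: the ionization current is recorded at several electrode separations and a straight line is fitted so the slope can be extrapolated toward zero gap. The current and gap values below are invented for illustration.

        import numpy as np

        # Hypothetical ionization current (pA) measured at several electrode gaps (mm)
        gap_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
        current_pA = np.array([10.2, 20.1, 30.5, 40.3, 50.6])

        # The dose rate is proportional to the slope dI/dd in the zero-gap limit,
        # so a straight line is fitted and its slope taken.
        slope, intercept = np.polyfit(gap_mm, current_pA, 1)
        print(f"dI/dd = {slope:.2f} pA/mm (intercept {intercept:+.2f} pA)")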

  6. Token test and computed tomogram in cerebral apoplexy

    International Nuclear Information System (INIS)

    Hanazono, Toshihide; Watanabe, Shunzo; Tasaki, Hiroichi; Hojo, Kei; Sato, Tokijiro; Hirano, Takashi; Metoki, Hirofumi.

    1985-01-01

    One hundred and eighteen patients (103 with cerebrovascular disorder and 15 with head injury or cerebral tumor) who developed aphasia were examined using computed tomography (CT). Token test (TT) scores and the presence or absence of lesions on CT were entered into a microcomputer. The affected area was drawn by hand using a standardized matrix and a digitizer. There was a linear correlation between measured TT scores and the TT scores expected from CT. There was no evidence of a relationship between TT scores and the lateral lobe which has been considered responsible for speech function. CT seemed to predict TT scores to some extent. (Namekawa, K.)

  7. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    Science.gov (United States)

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…
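
    As a generic illustration of the randomized rounding idea in the title (not the authors' algorithm), the sketch below rounds a fractional item-selection vector from a hypothetical LP relaxation into a 0/1 test sheet of roughly the target length.

        import random

        def randomized_rounding(fractional, target, tol=1):
            """Round a fractional item-selection vector to a 0/1 test sheet.

            Each item is kept with probability equal to its fractional value, and
            the draw is repeated until the sheet length is within tol of target.
            """
            while True:
                sheet = [i for i, x in enumerate(fractional) if random.random() < x]
                if abs(len(sheet) - target) <= tol:
                    return sheet

        # Hypothetical LP-relaxation solution over 8 candidate items, 4 wanted
        x = [0.9, 0.1, 0.7, 0.5, 0.8, 0.2, 0.6, 0.2]
        print(randomized_rounding(x, target=4))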

  8. Toxicology of inorganic tin

    International Nuclear Information System (INIS)

    Burba, J.V.

    1982-01-01

    Tin(II) or stannous ion as a reducing agent is important in nuclear medicine because it is an essential component and common denominator for many in vivo radiodiagnostic agents, commonly called kits for the preparation of radiopharmaceuticals. This report is intended to alert the nuclear medicine community to the wide range of biological effects that the stannous ion is capable of producing, and reviews a large number of selected publications on the toxicological potential of tin(II)

  9. Operational Toxicology Research

    Science.gov (United States)

    2006-08-01

    techniques for perchlorate in water, groundwater, soil and biological matrices such as blood, urine, milk, thyroid and other tissues required for...toxicity when they are inhaled or ingested and they are irritating to the skin and mucous membranes (Committee on Toxicology, 1996). When compared to...the data collected. Develop analytical techniques for perchlorate in water, groundwater, soil, and biological matrices such as blood, urine, milk

  10. Computer simulation of ultrasonic testing for aerospace vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Yamawaki, H [National Institute for Materials Science, 1-2-1, Sengen, 305-0047 Tsukuba (Japan); Moriya, S; Masuoka, T [Japan Aerospace Exploration Agency, 1 Koganesawa, Kimigawa, 981-1525 Kakuda (Japan); Takatsubo, J, E-mail: yamawaki.hisashi@nims.go.jp [Advanced Industrial Science and Technology, AIST Tsukuba Central 2, 1-1-1 Umezono, 305-8568 Tsukuba (Japan)

    2011-01-01

    Non-destructive testing techniques are developed to ensure the reliability of reusable aerospace vehicles. Cracks caused by thermal stress in the combustion chamber walls of liquid-fuel rockets are examined by an ultrasonic wave visualization technique developed at AIST. The technique is composed of non-contact ultrasonic generation by pulsed-laser scanning, a piezoelectric transducer for ultrasonic detection, and image reconstruction processing. It enables the detection of defects by visualization of the ultrasonic waves scattered by the defects. At NIMS, the detection conditions for the visualization are investigated using a computer simulation of ultrasonic propagation capable of fast 3-D calculation. The simulation technique is based on the finite-difference method and two-step elastic wave equations. The computational investigation is reported, demonstrating the applicability of the simulation to the ultrasonic testing technique for wall cracks.
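
    A toy stand-in for such a simulation is sketched below: a 1-D scalar wave propagated by explicit finite differences with a CFL-stable time step. The NIMS code is a fast 3-D two-step elastic scheme, so this only shows the finite-difference idea; the grid and material values are invented.

        import numpy as np

        # 1-D scalar wave by explicit finite differences: u_tt = c^2 u_xx
        nx, c, dx = 400, 5900.0, 1e-4        # grid points, wave speed in steel (m/s), m
        dt = 0.9 * dx / c                    # time step satisfying the CFL condition
        u_prev = np.zeros(nx)
        u = np.zeros(nx)
        u[nx // 2] = 1.0                     # initial pulse at the centre of the grid

        for _ in range(150):                 # leapfrog time stepping, fixed boundaries
            lap = np.zeros(nx)
            lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
            u_prev, u = u, 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap

        print("peak amplitude", round(float(u.max()), 3), "at grid index", int(u.argmax()))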

  11. Computer simulation of the Charpy V-notch toughness test

    International Nuclear Information System (INIS)

    Norris, D.M. Jr.

    1977-01-01

    The dynamic Charpy V-notch test was simulated on a computer. The calculational models (for A-533 Grade B class 1 steel) used both a rounded and a flat-tipped striker. The notch stress/strain state was found to be independent of the three-point loading type and was most strongly correlated with notch-opening displacement. The dynamic stress/strain state at the time of fracture initiation was obtained by comparing the calculated deformed shape with that obtained in interrupted Charpy V-notch tests where cracking had started. The calculation was also compared with stress/strain states calculated in other geometries at failure. The distribution and partition of specimen energy were calculated, and adiabatic heating and strain rate are discussed

  12. Computer-based tests: The impact of test design and problem of equivalency

    Czech Academy of Sciences Publication Activity Database

    Květon, Petr; Jelínek, Martin; Vobořil, Dalibor; Klimusová, H.

    -, č. 23 (2007), s. 32-51 ISSN 0747-5632 R&D Projects: GA ČR(CZ) GA406/99/1052; GA AV ČR(CZ) KSK9058117 Institutional research plan: CEZ:AV0Z7025918 Keywords : Computer-based assessment * speeded test * equivalency Subject RIV: AN - Psychology Impact factor: 1.344, year: 2007

  13. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test space equipment and carry out scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical feasibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, using solid-fuel motors in an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediate step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as the title shows, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional subsystems which will later be integrated in the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital
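
    A heavily simplified stand-in for such a model is sketched below: vertical ascent of a variable-mass rocket integrated with forward Euler, one degree of freedom instead of six, no drag, and invented motor data.

        # Vertical ascent of a variable-mass rocket, forward-Euler integration
        g, isp, thrust = 9.81, 220.0, 30e3     # gravity (m/s^2), specific impulse (s), N
        m, m_dry, dt = 500.0, 200.0, 0.05      # launch mass, burnout mass (kg), step (s)
        mdot = thrust / (isp * g)              # propellant mass flow rate (kg/s)

        v, h, t = 0.0, 0.0, 0.0
        while h >= 0.0:                        # integrate until impact
            burning = m > m_dry
            a = (thrust / m if burning else 0.0) - g
            v += a * dt
            h += v * dt
            if burning:
                m = max(m_dry, m - mdot * dt)
            t += dt

        print(f"flight time {t:.1f} s, burnout mass {m:.0f} kg")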

  14. Developmental toxicology: adequacy of current methods.

    Science.gov (United States)

    Peters, P W

    1998-01-01

    Toxicology embraces several disciplines such as carcinogenicity, mutagenicity and reproductive toxicity. Reproductive toxicology is concerned with possible effects of substances on the reproductive process, i.e. on sexual organs and their functions, endocrine regulation, fertilization, transport of the fertilized ovum, implantation, and embryonic, fetal and postnatal development, until the end-differentiation of the organs is achieved. Reproductive toxicology is divided into areas related to male and female fertility, and developmental toxicology. Developmental toxicology can be further broken down into prenatal and postnatal toxicology. Today, much new information is available about the origins of developmental disorders resulting from chemical exposure. While these findings seem to promise important new developments in methodology and research, there is a danger of losing sight of the precepts and principles established in the light of existing knowledge. There is also a danger that we may fail to correct shortcomings in our existing procedures and practice. The aim of this presentation is to emphasize the importance of testing substances for their impact in advance of their use and to underline that we must use the best existing tools for carrying out risk assessments. Moreover, it needs to be stressed that there are many substances that are never assessed with respect to reproductive and developmental toxicity. Similarly, our programmes for post-marketing surveillance with respect to developmental toxicology are grossly inadequate. Our ability to identify risks to normal development and reproduction would be much improved, first if a number of straightforward precepts were always followed and second, if we had a clearer understanding of what we mean by risk and acceptable levels of risk in the context of development. Other aims of this paper are: to stress the complexity of the different stages of normal prenatal development; to note the principles that are

  15. Test and control computer user's guide for a digital beam former test system

    Science.gov (United States)

    Alexovich, Robert E.; Mallasch, Paul G.

    1992-01-01

    A Digital Beam Former Test System was developed to determine the effects of noise, interferers and distortions, and digital implementations of beam forming as applied to the Tracking and Data Relay Satellite 2 (TDRS 2) architectures. The investigation of digital beam forming with application to TDRS 2 architectures, as described in TDRS 2 advanced concept design studies, was conducted by the NASA/Lewis Research Center for NASA/Goddard Space Flight Center. A Test and Control Computer (TCC) was used as the main controlling element of the Digital Beam Former Test System. The Test and Control Computer User's Guide for a Digital Beam Former Test System provides an organized description of the Digital Beam Former Test System commands. It is written for users who wish to conduct tests of the digital beam forming test processor using the TCC. The document describes the function, use, and syntax of the TCC commands available to the user, while summarizing and demonstrating the use of the commands within DOS batch files.

  16. Comparison of computer code calculations with FEBA test data

    International Nuclear Information System (INIS)

    Zhu, Y.M.

    1988-06-01

    The FEBA forced feed reflood experiments included base line tests with unblocked geometry. The experiments consisted of separate effect tests on a full-length 5x5 rod bundle. Experimental cladding temperatures and heat transfer coefficients of FEBA test No. 216 are compared with the analytical data postcalculated utilizing the SSYST-3 computer code. The comparison indicates a satisfactory matching of the peak cladding temperatures, quench times and heat transfer coefficients for nearly all axial positions. This agreement was made possible by the use of an artificially adjusted value of the empirical code input parameter in the heat transfer for the dispersed flow regime. A limited comparison of test data and calculations using the RELAP4/MOD6 transient analysis code are also included. In this case the input data for the water entrainment fraction and the liquid weighting factor in the heat transfer for the dispersed flow regime were adjusted to match the experimental data. On the other hand, no fitting of the input parameters was made for the COBRA-TF calculations which are included in the data comparison. (orig.)

  17. Genetic Toxicology in the 21st Century: Reflections and Future Directions

    Science.gov (United States)

    Mahadevan, Brinda; Snyder, Ronald D.; Waters, Michael D.; Benz, R. Daniel; Kemper, Raymond A.; Tice, Raymond R.; Richard, Ann M.

    2011-01-01

    A symposium at the 40th anniversary of the Environmental Mutagen Society, held from October 24–28, 2009 in St. Louis, MO, surveyed the current status and future directions of genetic toxicology. This article summarizes the presentations and provides a perspective on the future. An abbreviated history is presented, highlighting the current standard battery of genotoxicity assays and persistent challenges. Application of computational toxicology to safety testing within a regulatory setting is discussed as a means for reducing the need for animal testing and human clinical trials, and current approaches and applications of in silico genotoxicity screening approaches across the pharmaceutical industry were surveyed and are reported here. The expanded use of toxicogenomics to illuminate mechanisms and bridge genotoxicity and carcinogenicity, and new public efforts to use high-throughput screening technologies to address the lack of toxicity evaluation for the backlog of thousands of industrial chemicals in the environment, are detailed. The Tox21 project involves coordinated efforts of four U.S. Government regulatory/research entities to use new and innovative assays to characterize key steps in toxicity pathways, including genotoxic and nongenotoxic mechanisms for carcinogenesis. Progress to date, highlighting preliminary test results from the National Toxicology Program, is summarized. Finally, an overview is presented of ToxCast™, a related research program of the U.S. Environmental Protection Agency, using a broad array of high throughput and high content technologies for toxicity profiling of environmental chemicals, and computational toxicology modeling. Progress and challenges, including the pressing need to incorporate metabolic activation capability, are summarized. PMID:21538556

  18. Summary introduction to environmental toxicology

    International Nuclear Information System (INIS)

    Heinzow, B.; Jessen, H.; Wendorff, D.

    1986-01-01

    The increasing environmental consciousness and the increasing public interest in environmental medicine and toxicology are much appreciated by the Research Institute for Environmental Toxicology. This information brochure gives the reader some insight into the importance of environmental toxicology and into the work of the Research Institute. In response to the current situation, the authors have included an appendix on radiation protection. (orig./PW)

  19. Systems Toxicology: From Basic Research to Risk Assessment

    Science.gov (United States)

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  20. A computer vision based candidate for functional balance test.

    Science.gov (United States)

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

    Balance in humans is a motor skill based on complex multimodal sensing, processing and control. The ability to maintain balance in activities of daily living (ADL) is compromised by aging, diseases, injuries and environmental factors. The Centers for Disease Control and Prevention (CDC) estimated the cost of falls among older adults at $34 billion in 2013, a figure expected to reach $54.9 billion in 2020. In this paper, we present a brief review of balance impairments followed by subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate for a functional balance test. The test takes less than a minute to administer and is expected to be objective, repeatable and highly discriminative in quantifying the ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers, and compare performance with a balance assessment system called the BTrackS Balance Assessment Board. Our results show a high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigation to assess validity in clinical settings, including acute care, long-term care and assisted living care facilities. Our long-term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.
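
    A common way to summarize postural stability from tracked motion is the sway path length. The sketch below is a minimal illustration of that metric only, not the authors' system; the tracked coordinates are simulated stand-ins for the output of any computer-vision tracker.

        import numpy as np

        def sway_path_length(xy):
            """Total excursion (path length) of a tracked body point, a common
            postural-stability summary. `xy` is an (N, 2) array of per-frame
            coordinates from a hypothetical tracker."""
            steps = np.diff(xy, axis=0)
            return float(np.sum(np.linalg.norm(steps, axis=1)))

        # Illustrative use: less sway -> shorter path -> better balance score.
        rng = np.random.default_rng(0)
        track = np.cumsum(rng.normal(0, 0.5, size=(300, 2)), axis=0)  # fake 10 s @ 30 fps
        print(f"sway path length: {sway_path_length(track):.1f} px")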

  1. TRAC, a collaborative computer tool for tracer-test interpretation

    Directory of Open Access Journals (Sweden)

    Fécamp C.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results from hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
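
    As a hedged illustration of the kind of analytical solution such a tool assembles (a textbook result, not TRAC's actual code), the 1D advection-dispersion solution for an instantaneous tracer injection can be evaluated directly; all parameter values below are hypothetical.

        import numpy as np

        def slug_concentration(x, t, M, A, v, D):
            """1D advection-dispersion solution for an instantaneous (slug)
            injection of mass M [kg] over cross-section A [m^2], with pore
            velocity v [m/s] and dispersion coefficient D [m^2/s]."""
            return (M / (A * np.sqrt(4 * np.pi * D * t))) * np.exp(
                -(x - v * t) ** 2 / (4 * D * t))

        # Hypothetical breakthrough curve 25 m downstream of the injection point
        t = np.linspace(1, 20 * 3600, 500)                      # s
        c = slug_concentration(25.0, t, M=1.0, A=2.0, v=1e-3, D=5e-3)
        print(f"peak ~ {c.max():.3g} kg/m^3 at t ~ {t[c.argmax()]/3600:.1f} h")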

  2. Toxicological awakenings: the rebirth of hormesis as a central pillar of toxicology

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2005-01-01

    This paper assesses historical reasons that may account for the marginalization of hormesis as a dose-response model in the biomedical sciences in general and toxicology in particular. The most significant and enduring explanatory factors are the early and close association of the concept of hormesis with the highly controversial medical practice of homeopathy and the difficulty in assessing hormesis with high-dose testing protocols which have dominated the discipline of toxicology, especially regulatory toxicology. The long-standing and intensely acrimonious conflict between homeopathy and 'traditional' medicine (allopathy) led to the exclusion of the hormesis concept from a vast array of medical- and public health-related activities including research, teaching, grant funding, publishing, professional societal meetings, and regulatory initiatives of governmental agencies and their advisory bodies. Recent publications indicate that the hormetic dose-response is far more common and fundamental than the dose-response models [threshold/linear no threshold (LNT)] used in toxicology and risk assessment, and by governmental regulatory agencies in the establishment of exposure standards for workers and the general public. Acceptance of the possibility of hormesis has the potential to profoundly affect the practice of toxicology and risk assessment, especially with respect to carcinogen assessment.

  3. Application of Model Animals in the Study of Drug Toxicology

    Science.gov (United States)

    Song, Yagang; Miao, Mingsan

    2018-01-01

    Drug safety is a key factor in drug research and development, and drug toxicology testing is the main method for evaluating the safety of drugs. The body condition of an animal has important implications for the results of a study. Previous toxicological studies of drugs were carried out in normal animals, which deviates greatly from clinical practice. The purpose of this study is to investigate the necessity of model animals as a substitute for normal animals in toxicological studies. It is expected to provide exact guidance for future drug safety evaluation.

  4. Validation and testing of the VAM2D computer code

    International Nuclear Information System (INIS)

    Kool, J.B.; Wu, Y.S.

    1991-10-01

    This document describes two modeling studies conducted by HydroGeoLogic, Inc. for the US NRC under contract no. NRC-04089-090, entitled ''Validation and Testing of the VAM2D Computer Code.'' VAM2D is a two-dimensional, variably saturated flow and transport code, with applications for performance assessment of nuclear waste disposal. The computer code itself is documented in a separate NUREG document (NUREG/CR-5352, 1989). The studies presented in this report involve application of the VAM2D code to two diverse subsurface modeling problems. The first one involves modeling of infiltration and redistribution of water and solutes in an initially dry, heterogeneous field soil. This application involves detailed modeling over a relatively short, 9-month time period. The second problem pertains to the application of VAM2D to the modeling of a waste disposal facility in a fractured clay, over much larger space and time scales and with particular emphasis on the applicability and reliability of using an equivalent porous medium approach for simulating flow and transport in fractured geologic media. Reflecting the separate and distinct nature of the two problems studied, this report is organized in two separate parts. 61 refs., 31 figs., 9 tabs
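
    For context, and as an assumption about the model class rather than a statement from the report: variably saturated flow codes of this kind typically solve some form of the Richards equation, shown here in mixed form,

        \frac{\partial \theta(\psi)}{\partial t} = \nabla \cdot \left[ K(\psi)\, \nabla(\psi + z) \right] + S

    where \theta is the volumetric water content, \psi the pressure head, K(\psi) the unsaturated hydraulic conductivity, z the elevation head, and S a source/sink term.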

  5. Postmortem Biochemistry and Toxicology

    Directory of Open Access Journals (Sweden)

    Robert Flanagan

    2017-04-01

    The aim of postmortem biochemistry and toxicology is either to help establish the cause of death, or to gain information on events immediately before death. If self-poisoning is suspected, the diagnosis may be straightforward and all that could be required is confirmation of the agents involved. However, if the cause of death is not immediately obvious then suspicion of possible poisoning or of conditions such as alcoholic ketoacidosis is of course crucial. On the other hand, it may be important to investigate adherence to prescribed therapy, for example with anticonvulsants or antipsychotics, hence sensitive methods are required. Blood sampling (needle aspiration, peripheral vein, for example femoral, ideally after proximal ligation before opening the body) minimizes the risk of sample contamination with, for example, gut contents or urine. Other specimens (stomach contents, urine, liver, vitreous humor) may also be valuable and may be needed to corroborate unexpected or unusual findings in the absence of other evidence. The site of sampling should always be recorded. The availability of antemortem specimens should not necessarily preclude postmortem sampling. Appropriate sample preservation, transport, and storage are mandatory. Interpretation of analytical toxicology results must take into account what is known of the pharmacokinetics and toxicology of the agent(s) in question, the circumstances under which death occurred including the mechanism of exposure, and other factors such as the stability of the analyte(s) and the analytical methods used. It is important to realise that changes may occur in the composition of body fluids, even peripheral blood, after death. Such changes are likely to be greater after attempted resuscitation, and with centrally-acting drugs with large volumes of distribution given chronically, and may perhaps be minimised by prompt refrigeration of the body and performing the autopsy quickly.

  6. Toxicology of freshwater cyanobacteria.

    Science.gov (United States)

    Liyanage, H M; Arachchi, D N Magana; Abeysekara, T; Guneratne, L

    2016-07-02

    Many chemical contaminants in drinking water have been shown to cause adverse health effects in humans after prolonged exposure. Cyanobacteria are one of the most potent and diverse groups of photosynthetic prokaryotes. One key component of cyanobacterial success in the environment is the production of potent toxins as secondary metabolites, which have been responsible for numerous adverse health impacts in humans. Anthropogenic activities have led to the increase of eutrophication in freshwater bodies worldwide, causing cyanobacterial blooms to become more frequent. The present article discusses harmful cyanobacteria and their toxicology, with special reference to microcystin, nodularin, and cylindrospermopsin.

  7. Pharmacogenetics and forensic toxicology.

    Science.gov (United States)

    Musshoff, Frank; Stamer, Ulrike M; Madea, Burkhard

    2010-12-15

    Large inter-individual variability in drug response and toxicity, as well as in drug concentrations after application of the same dosage, can be of genetic, physiological, pathophysiological, or environmental origin. Absorption, distribution and metabolism of a drug and interactions with its target often are determined by genetic differences. Pharmacokinetic and pharmacodynamic variations can appear at the level of drug metabolizing enzymes (e.g., the cytochrome P450 system), drug transporters, drug targets or other biomarker genes. Pharmacogenetics or toxicogenetics can therefore be relevant in forensic toxicology. This review presents relevant aspects together with some examples from daily routines. Copyright © 2010. Published by Elsevier Ireland Ltd.

  8. The minipig as a platform for new technologies in toxicology

    DEFF Research Database (Denmark)

    Forster, Roy; Ancian, Philippe; Fredholm, Merete

    2010-01-01

    The potential of the minipig as a platform for future developments in genomics, high density biology, transgenic technology, in vitro toxicology and related emerging technologies was reviewed. Commercial interests in the pig as an agricultural production species have driven scientific progress...... pigs and humans suggest that minipigs will be useful for the testing of biotechnology products (and possibly for in silico toxicology) and (iii) the minipig is the only non-rodent toxicology model where transgenic animals can be readily generated, and reproductive technologies are well developed...... in the pig. These properties should also make the minipig an interesting model for the testing of biotechnology products. These factors all support the idea that the minipig is well placed to meet the challenges of the emerging technologies and the toxicology of the future; it also seems likely...

  9. Airflow Patterns In Nuclear Workplace - Computer Simulation And Qualitative Tests

    International Nuclear Information System (INIS)

    Haim, M.; Szanto, M.; Weiss, Y.; Kravchick, T.; Levinson, S.; German, U.

    1999-01-01

    Concentration of airborne radioactive materials inside a room can vary widely from one location to another, sometimes by orders of magnitude, even for locations that are relatively close. Inappropriately placed samplers can give misleading results and, therefore, the location of air samplers is important. Proper placement of samplers cannot be determined simply by observing the position of room air supply and exhaust vents. Airflow studies, such as the release of smoke aerosols, should be used. The significance of airflow pattern studies depends on the purpose of sampling - for estimating worker intakes, warning of high concentrations, defining airborne radioactive areas, testing for confinement of sealed radioactive materials, etc. When sampling air in rooms with complex airflow patterns, it may be useful to use qualitative airflow studies with smoke tubes, smoke candles or isostatic bubbles. The U.S. Nuclear Regulatory Commission Regulatory Guide 8.25 [1] suggests that an airflow study should be conducted after any changes at the work area, including changes in the setup of work areas, ventilation system changes, etc. The present work presents an airflow pattern study conducted in a typical room using two methods: a computer simulation and a qualitative test using a smoke tube.

  10. Implementation of Computer Assisted Test Selection System in Local Governments

    Directory of Open Access Journals (Sweden)

    Abdul Azis Basri

    2016-05-01

    As an evaluative method for civil servant selection across all government areas, the Computer Assisted Test (CAT) selection system began to be applied in 2013. During its first nationwide implementation in 2014, the system ran into trouble in several areas, such as the registration procedure and the passing grade. The main objective of this essay is to describe the implementation of the new selection system for civil servants in local governments and to assess the level of effectiveness of this selection system. The study combined a literature review with a field survey; data were collected through interviews, observations, and documentation from various sources, and were analyzed through data reduction, data display, and verification to draw conclusions. The results showed that, despite problems in a few parts such as the registration phase, almost all phases of the implementation of the CAT selection system in local government areas worked well, including the preparation, implementation and result processing phases. The system also fulfilled two of the three effectiveness criteria for a selection system: accuracy and trustworthiness. Therefore, this selection system can be considered an effective way to select new civil servants. As a suggestion, local governments should prepare thoroughly for all phases of the test, establish good feedback as an evaluation mechanism, and work together with the central government to seek out, fix and improve supporting infrastructure and the competency of local residents.

  11. Green Toxicology: a strategy for sustainable chemical and material development.

    Science.gov (United States)

    Crawford, Sarah E; Hartung, Thomas; Hollert, Henner; Mathes, Björn; van Ravenzwaay, Bennard; Steger-Hartmann, Thomas; Studer, Christoph; Krug, Harald F

    2017-01-01

    Green Toxicology refers to the application of predictive toxicology in the sustainable development and production of new less harmful materials and chemicals, subsequently reducing waste and exposure. Built upon the foundation of "Green Chemistry" and "Green Engineering", "Green Toxicology" aims to shape future manufacturing processes and safe synthesis of chemicals in terms of environmental and human health impacts. Being an integral part of Green Chemistry, the principles of Green Toxicology amplify the role of health-related aspects for the benefit of consumers and the environment, in addition to being economical for manufacturing companies. Due to the costly development and preparation of new materials and chemicals for market entry, it is no longer practical to ignore the safety and environmental status of new products during product development stages. However, this is only possible if toxicologists and chemists work together early on in the development of materials and chemicals to utilize safe design strategies and innovative in vitro and in silico tools. This paper discusses some of the most relevant aspects, advances and limitations of the emergence of Green Toxicology from the perspective of different industry and research groups. The integration of new testing methods and strategies in product development, testing and regulation stages are presented with examples of the application of in silico, omics and in vitro methods. Other tools for Green Toxicology, including the reduction of animal testing, alternative test methods, and read-across approaches are also discussed.

  12. Cross-Mode Comparability of Computer-Based Testing (CBT) versus Paper-Pencil Based Testing (PPT): An Investigation of Testing Administration Mode among Iranian Intermediate EFL Learners

    Science.gov (United States)

    Khoshsima, Hooshang; Hosseini, Monirosadat; Toroujeni, Seyyed Morteza Hashemi

    2017-01-01

    Advent of technology has caused growing interest in using computers to convert conventional paper and pencil-based testing (Henceforth PPT) into Computer-based testing (Henceforth CBT) in the field of education during last decades. This constant promulgation of computers to reshape the conventional tests into computerized format permeated the…

  13. Distance learning in toxicology: Australia's RMIT program

    International Nuclear Information System (INIS)

    Ahokas, Jorma; Donohue, Diana; Rix, Colin; Wright, Paul

    2005-01-01

    RMIT University was the first to offer a comprehensive Masters of Toxicology in Australasia 19 years ago. In 2001 the program was transformed into two stages, leading to a Graduate Diploma and Master of Applied Science in Toxicology. Now, these programs are fully online and suitable for graduates living and working anywhere in the world. The modular distance-learning courses are specifically designed to equip students with essential skills for entering fields such as chemical and drug evaluation; risk assessment of chemicals in the workplace; environmental and food toxicology. RMIT's online course delivery system has made it possible to deliver the toxicology programs, both nationally and internationally. The learning material and interactive activities (tests and quizzes, discussion boards, chat sessions) use Blackboard and WebBoard, each with a different educational function. Students log in to a Learning Hub to access their courses. The Learning Hub enables students to extend their learning beyond the classroom to the home, workplace, library and any other location with Internet access. The teaching staff log in to the Learning Hub to maintain and administer the online programs and courses which they have developed and/or which they teach. The Learning Hub is also a communication tool for students and staff, providing access to email, a diary and announcements. The early experience of delivering a full toxicology program online is very positive. However this mode of teaching continues to present many interesting technical, educational and cultural challenges, including: the design and presentation of the material; copyright issues; internationalisation of content; interactive participation; and the assessment procedures

  14. Utilisation of Wearable Computing for Space Programmes Test Activities Optimasation

    Science.gov (United States)

    Basso, V.; Lazzari, D.; Alemanni, M.

    2004-08-01

    New technologies are assuming a relevant importance in the Space business domain, including the Assembly Integration and Test (AIT) activities, allowing process optimization and capabilities that were unthinkable only a few years ago. This paper aims to describe the experience gained by Alenia Spazio (ALS) with remote interaction techniques, as a result of collaborations established through European Community (EC) initiatives with Alenia Aeronautica (ALA) and Politecnico di Torino (POLITO). The performance increases and cost reductions of H/W and S/W components, due to the massive utilization of home computing (especially driven by the games business), together with the possibilities of network technology (offered by the web as well as high-speed links and wireless communications), today allow rethinking the traditional AIT process activities in the light of multimedia data exchange: graphics, voice, video, and surely more in the future. The aerospace business confirms its vocation for innovation: in the 1980s it was the cradle of CAD systems, and today it is oriented towards 3D data visualization/interaction technologies and remote visualization/interaction in a collaborative way, on a much more user-friendly basis (i.e. not only for specialists). Fig. 1 collects the extended AIT scenario studied and adopted by ALS in these years. ALS experimented with two kinds of remote visualization/interaction devices: portable screens [e.g. Fig. 2, Personal Digital Assistant (PDA), wearable] and wall screens (e.g. VR-Lab), both used for 2D/3D visualization and interaction, which could support many types of traditional company-internal AIT applications (mainly based on EGSE and PDM/CAD utilization/reports): 1. design review support; 2. facility management; 3. storage management; 4. personnel training; 5. integration sequence definition; 6. assembly and test operations follow-up; 7. documentation review; plus external access to AIT activities for remote operations (e.g. tele-testing).

  15. 3S - Systematic, systemic, and systems biology and toxicology.

    Science.gov (United States)

    Smirnova, Lena; Kleinstreuer, Nicole; Corvi, Raffaella; Levchenko, Andre; Fitzpatrick, Suzanne C; Hartung, Thomas

    2018-01-01

    A biological system is more than the sum of its parts - it accomplishes many functions via synergy. Deconstructing the system down to the molecular mechanism level necessitates the complement of reconstructing functions on all levels, i.e., in our conceptualization of biology and its perturbations, our experimental models and computer modelling. Toxicology contains the somewhat arbitrary subclass "systemic toxicities"; however, there is no relevant toxic insult or general disease that is not systemic. At least inflammation and repair are involved that require coordinated signaling mechanisms across the organism. However, the more body components involved, the greater the challenge to recapitulate such toxicities using non-animal models. Here, the shortcomings of current systemic testing and the development of alternative approaches are summarized. We argue that we need a systematic approach to integrating existing knowledge as exemplified by systematic reviews and other evidence-based approaches. Such knowledge can guide us in modelling these systems using bioengineering and virtual computer models, i.e., via systems biology or systems toxicology approaches. Experimental multi-organ-on-chip and microphysiological systems (MPS) provide a more physiological view of the organism, facilitating more comprehensive coverage of systemic toxicities, i.e., the perturbation on organism level, without using substitute organisms (animals). The next challenge is to establish disease models, i.e., micropathophysiological systems (MPPS), to expand their utility to encompass biomedicine. Combining computational and experimental systems approaches and the challenges of validating them are discussed. The suggested 3S approach promises to leverage 21st century technology and systematic thinking to achieve a paradigm change in studying systemic effects.

  16. In silico toxicology for the pharmaceutical sciences

    International Nuclear Information System (INIS)

    Valerio, Luis G.

    2009-01-01

    The applied use of in silico technologies (a.k.a. computational toxicology, in silico toxicology, computer-assisted tox, e-tox, i-drug discovery, predictive ADME, etc.) for predicting preclinical toxicological endpoints, clinical adverse effects, and metabolism of pharmaceutical substances has become of high interest to the scientific community and the public. The increased accessibility of these technologies for scientists and recent regulations permitting their use for chemical risk assessment supports this notion. The scientific community is interested in the appropriate use of such technologies as a tool to enhance product development and safety of pharmaceuticals and other xenobiotics, while ensuring the reliability and accuracy of in silico approaches for the toxicological and pharmacological sciences. For pharmaceutical substances, this means active and impurity chemicals in the drug product may be screened using specialized software and databases designed to cover these substances through a chemical structure-based screening process and algorithm specific to a given software program. A major goal for use of these software programs is to enable industry scientists not only to enhance the discovery process but also to ensure the judicious use of in silico tools to support risk assessments of drug-induced toxicities and in safety evaluations. However, a great amount of applied research is still needed, and there are many limitations with these approaches which are described in this review. Currently, there is a wide range of endpoints available from predictive quantitative structure-activity relationship models driven by many different computational software programs and data sources, and this is only expected to grow. For example, there are models based on non-proprietary and/or proprietary information specific to assessing potential rodent carcinogenicity, in silico screens for ICH genetic toxicity assays, reproductive and developmental toxicity, theoretical

  17. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    Science.gov (United States)

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTIs programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.
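
    As an informal illustration of the reliability statistic reported above (not the authors' code), the mean pairwise Pearson correlation across raters' Q-sorts can be computed as follows; the data are made up.

        import numpy as np

        def mean_pairwise_r(sorts):
            """Mean Pearson correlation over all rater pairs.
            `sorts` is an (n_raters, n_items) array of Q-sort scores."""
            r = np.corrcoef(sorts)                    # rater-by-rater correlation matrix
            iu = np.triu_indices_from(r, k=1)         # upper triangle, excluding diagonal
            return float(r[iu].mean())

        # Four hypothetical raters sorting 20 diagnostic items
        rng = np.random.default_rng(1)
        base = rng.normal(size=20)                            # shared diagnostic signal
        sorts = base + rng.normal(0, 0.8, size=(4, 20))       # signal + per-rater noise
        print(f"mean inter-rater r = {mean_pairwise_r(sorts):.2f}")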

  18. ICPP radiological and toxicological sabotage analysis

    International Nuclear Information System (INIS)

    Kubiak, V.R.; Mortensen, F.G.

    1995-01-01

    In June of 1993, the Department of Energy (DOE) issued Notice 5630.3A, "Protection of Departmental Facilities Against Radiological and Toxicological Sabotage," which states that all significant radiological and toxicological hazards at Department facilities must be examined for potential sabotage. This analysis has been completed at the Idaho Chemical Processing Plant (ICPP). The ICPP radiological and toxicological hazards include spent government and commercial fuels, Special Nuclear Materials (SNM), high-level liquid wastes, high-level solid wastes, and process and decontamination chemicals. The analysis effort included identification and assessment of quantities of hazardous materials present at the facility; identification and ranking of hazardous material targets; development of worst case scenarios detailing possible sabotage actions and hazard releases; performance of vulnerability assessments using table top and computer methodologies on credible threat targets; evaluation of potential risks to the public, workers, and the environment; evaluation of sabotage risk reduction options; and selection of cost effective prevention and mitigation options.

  19. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    Science.gov (United States)

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  20. Computer automation of ultrasonic testing. [inspection of ultrasonic welding

    Science.gov (United States)

    Yee, B. G. W.; Kerlin, E. E.; Gardner, A. H.; Dunmyer, D.; Wells, T. G.; Robinson, A. R.; Kunselman, J. S.; Walker, T. C.

    1974-01-01

    Report describes a prototype computer-automated ultrasonic system developed for the inspection of weldments. This system can be operated in three modes: manual, automatic, and computer-controlled. In the computer-controlled mode, the system will automatically acquire, process, analyze, store, and display ultrasonic inspection data in real-time. Flaw size (in cross-section), location (depth), and type (porosity-like or crack-like) can be automatically discerned and displayed. The results and pertinent parameters are recorded.

  1. Nuclear toxicology at CEA

    International Nuclear Information System (INIS)

    Giustranti, C.

    2001-01-01

    CEA (the French atomic energy commission) has launched a new program dedicated to the study of the transfer of heavy metals and some radionuclides from the environment to living beings. The substances to be studied are those involved in research, medical activities, and the nuclear industry: iodine, technetium, trans-uranides (uranium and plutonium), fission products (iodine, cesium), carbon, cobalt, boron and beryllium. The program is composed of two axes: the first concerns the bio-geo-chemical cycles involved in transfer, and the second deals with the detoxication processes that appear in animal and human cells. The program will rely on the strong competencies of CEA in chemistry, radiochemistry, biology, physiology and toxicology. (A.C.)

  2. From Classical Toxicology to Tox21: Some Critical Conceptual and Technological Advances in the Molecular Understanding of the Toxic Response Beginning From the Last Quarter of the 20th Century.

    Science.gov (United States)

    Choudhuri, Supratim; Patton, Geoffrey W; Chanderbhan, Ronald F; Mattia, Antonia; Klaassen, Curtis D

    2018-01-01

    Toxicology has made steady advances over the last 60+ years in understanding the mechanisms of toxicity at an increasingly finer level of cellular organization. Traditionally, toxicological studies have used animal models. However, the general adoption of the principles of 3R (Replace, Reduce, Refine) provided the impetus for the development of in vitro models in toxicity testing. The present commentary is an attempt to briefly discuss the transformation in toxicology that began around 1980. Many genes important in cellular protection and metabolism of toxicants were cloned and characterized in the 80s, and gene expression studies became feasible, too. The development of transgenic and knockout mice provided valuable animal models to investigate the role of specific genes in producing toxic effects of chemicals or protecting the organism from the toxic effects of chemicals. Further developments in toxicology came from the incorporation of the tools of "omics" (genomics, proteomics, metabolomics, interactomics), epigenetics, systems biology, computational biology, and in vitro biology. Collectively, the advances in toxicology made during the last 30-40 years are expected to provide more innovative and efficient approaches to risk assessment. A goal of experimental toxicology going forward is to reduce animal use and yet be able to conduct appropriate risk assessments and make sound regulatory decisions using alternative methods of toxicity testing. In that respect, Tox21 has provided a big picture framework for the future. Currently, regulatory decisions involving drugs, biologics, food additives, and similar compounds still utilize data from animal testing and human clinical trials. In contrast, the prioritization of environmental chemicals for further study can be made using in vitro screening and computational tools. Published by Oxford University Press on behalf of the Society of Toxicology 2017. This work is written by US Government employees and is in the

  3. Mass Spectrometry Applications for Toxicology.

    Science.gov (United States)

    Mbughuni, Michael M; Jannetto, Paul J; Langman, Loralie J

    2016-12-01

    Toxicology is a multidisciplinary study of poisons, aimed at correlating the quantitative and qualitative relationships between poisons and their physiological and behavioural effects in living systems. Other key aspects of toxicology focus on elucidation of the mechanisms of action of poisons and development of remedies and treatment plans for associated toxic effects. In these endeavours, mass spectrometry (MS) has become a powerful analytical technique with a wide range of applications in the toxicological analysis of drugs, poisons, and metabolites of both. To date, MS applications have permeated all fields of toxicology, which include environmental, clinical, and forensic toxicology. While many different analytical applications are used in these fields, MS and its hyphenated applications such as gas chromatography MS (GC-MS), liquid chromatography MS (LC-MS), inductively coupled plasma ionization MS (ICP-MS), and tandem mass spectrometry (MS/MS and MSn) have emerged as powerful tools used in toxicology laboratories. This review will focus on these hyphenated MS technologies and their applications for toxicology.

  4. Mass Spectrometry Applications for Toxicology

    Science.gov (United States)

    Mbughuni, Michael M.; Jannetto, Paul J.

    2016-01-01

    Toxicology is a multidisciplinary study of poisons, aimed at correlating the quantitative and qualitative relationships between poisons and their physiological and behavioural effects in living systems. Other key aspects of toxicology focus on elucidation of the mechanisms of action of poisons and development of remedies and treatment plans for associated toxic effects. In these endeavours, mass spectrometry (MS) has become a powerful analytical technique with a wide range of applications in the toxicological analysis of drugs, poisons, and metabolites of both. To date, MS applications have permeated all fields of toxicology, which include environmental, clinical, and forensic toxicology. While many different analytical applications are used in these fields, MS and its hyphenated applications such as gas chromatography MS (GC-MS), liquid chromatography MS (LC-MS), inductively coupled plasma ionization MS (ICP-MS), and tandem mass spectrometry (MS/MS and MSn) have emerged as powerful tools used in toxicology laboratories. This review will focus on these hyphenated MS technologies and their applications for toxicology. PMID:28149262

  5. The use of computers for the performance and analysis of non-destructive testing

    International Nuclear Information System (INIS)

    Edelmann, X.; Pfister, O.

    1988-01-01

    Examples of the use of computers in non-destructive testing are related. Ultrasonic testing is especially addressed. The employment of computers means improvements for the user, the possibility of registering the reflector position, storage of test data and help with documentation. The test can be automated. The introduction of expert systems is expected for the future. 8 figs., 12 refs

  6. [Interest of toxicological analysis for poisonings].

    Science.gov (United States)

    Mégarbane, Bruno; Baud, Frédéric J

    2008-04-30

    The clinical approach to poisoned patients is mainly based on analysis of the circumstances of intoxication and the search for toxidromes. Toxicological analysis aims to detect toxicants or measure their concentrations, in order to confirm the hypothesis of poisoning, to evaluate its severity and to help follow-up regarding treatment efficiency. Emergency toxicological analysis is useful only if the method is specific and the results are rapidly obtained. Therefore, systematic screening using immunochemistry-based tests is not recommended in emergency situations. Measurement of blood concentrations of toxicants is only indicated if it may influence patient management. However, from a research perspective, the study of toxicokinetic/toxicodynamic relationships, i.e. the relationships between a toxicant's effects and its blood concentrations, may be helpful to understand the inter-individual variability of the response to a toxicant.

  7. Pharmacological and toxicological evaluation of Urtica dioica.

    Science.gov (United States)

    Dar, Sabzar Ahmad; Ganai, Farooq Ahmad; Yousuf, Abdul Rehman; Balkhi, Masood-Ul-Hassan; Bhat, Towseef Mohsin; Sharma, Poonam

    2013-02-01

    Medicinal plants are a largely unexplored source of drug repository. Urtica dioica L. (Urticaceae) is used in traditional medicine to treat diverse conditions. The present study describes the antidiabetic, antiinflammatory and antibacterial activities, and toxicological studies, of Urtica dioica. U. dioica leaves were subjected to solvent extraction with hexane, chloroform, ethyl acetate, methanol, and aqueous, respectively, and screened for antidiabetic (300 mg/kg bw by glucose tolerance test; GTT), antiinflammatory (200 mg/kg bw by rat paw edema assay) and antibacterial activities [by disc-diffusion and minimum inhibitory concentration (MIC) assays]. Toxicological studies were carried out on Artemia salina and Wistar rats; phytochemical analyses were carried out using chromatographic and spectroscopic techniques. The aqueous extract of U. dioica (AEUD) significantly (p 1000 μg/mL each on A. salina. Our results showed that U. dioica leaves are an interesting source of bioactive compounds, justifying their use in folk medicine to treat various diseases.

  8. Ninth Triennial Toxicology Salary Survey.

    Science.gov (United States)

    Gad, Shayne Cox; Sullivan, Dexter Wayne

    2016-01-01

    This survey serves as the ninth in a series of toxicology salary surveys conducted at 3-year intervals and beginning in 1988. An electronic survey instrument was distributed to 5919 individuals including members of the Society of Toxicology, American College of Toxicology, and 23 additional professional organizations. Question items inquired about gender, age, degree, years of experience, certifications held, areas of specialization, society membership, employment and income. Overall, 1293 responses were received (response rate 21.8%). The results of the 2014 survey provide insight into the job market and career path for current and future toxicologists. © The Author(s) 2016.

  9. Nails in Forensic Toxicology: An Update.

    Science.gov (United States)

    Solimini, Renata; Minutillo, Adele; Kyriakou, Chrystalla; Pichini, Simona; Pacifici, Roberta; Busardo, Francesco Paolo

    2017-01-01

    The analysis of nails as a keratinized matrix to detect drugs or illicit substances has been increasingly used in forensic and clinical toxicology as a complementary test, especially because this matrix stably accumulates substances over long periods of time. This allows retrospective investigation of chronic drug abuse, monitoring of continuous drug or pharmaceutical use, and detection of in utero drug exposure or environmental exposures. We herein review the recent literature investigating drug incorporation mechanisms and drug detection in nails for forensic toxicological purposes. Mechanisms of drug incorporation have not yet been fully elucidated. However, some research has lately contributed to a better understanding of how substances are incorporated into nails, suggesting three potential mechanisms of drug incorporation: contamination from sweat, incorporation from the nail bed and incorporation from the germinal matrix. In addition, numerous methods dealing with the determination of drugs of abuse, medications and alcohol biomarkers in nails have been reported in studies over the years. The latter methods could find application in clinical and forensic toxicology. The studies herein reviewed point out how important it is to standardize and harmonize the methodologies (either pre-analytical or analytical) for nail analysis and to optimize sampling, as well as to develop proficiency testing programs and determine cut-off values. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet the specific needs of distance students, namely: students' inability to sit for the scheduled test, conflicting test schedules, and students' wish for the flexibility to retake examinations to improve their grades. In 2004, UT initiated a pilot project to develop the system and program for the computer-based testing method. In 2005 and 2006, tryouts of the computer-based testing method were conducted in 7 Regional Offices considered to have sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the test method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by the Regional Office staff. The development of the computer-based testing started with tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. The construction of a test involves the generation and selection of test items from the item bank collection of the UT Examination Center, so that the combination of the selected items constitutes the test specification. Currently, UT offers 250 courses involving the use of computer-based testing. Students expect more courses to be offered with computer-based testing in Regional Offices within easy access.
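
    A minimal sketch (not UT's actual system) of assembling a test form from an item bank against a simple per-topic blueprint; the item bank, topics and field names are hypothetical.

        import random

        # Hypothetical item bank: each item tagged with a topic
        bank = [{"id": i, "topic": t} for i, t in enumerate(
            ["algebra"] * 40 + ["geometry"] * 30 + ["statistics"] * 30)]

        blueprint = {"algebra": 10, "geometry": 5, "statistics": 5}  # items per topic

        def assemble_test(bank, blueprint, seed=None):
            """Draw the blueprint's quota of items per topic, without replacement."""
            rng = random.Random(seed)
            form = []
            for topic, n in blueprint.items():
                pool = [item for item in bank if item["topic"] == topic]
                form.extend(rng.sample(pool, n))
            return form

        form = assemble_test(bank, blueprint, seed=42)
        print(len(form), "items drawn, e.g.:", sorted(i["id"] for i in form)[:5])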

  11. American College of Medical Toxicology

    Science.gov (United States)


  12. Predictive toxicology in drug safety

    National Research Council Canada - National Science Library

    Xu, Jinghai J; Urban, Laszlo

    2011-01-01

    .... It provides information on the present knowledge of drug side effects and their mitigation strategy during drug discovery, gives guidance for risk assessment, and promotes evidence-based toxicology...

  13. Toxicology of Biodiesel Combustion products

    Science.gov (United States)

    The toxicology of combusted biodiesel is an emerging field. Much of the current knowledge about biological responses and health effects stems from studies of exposures to other fuel sources (typically petroleum diesel, gasoline, and wood) incompletely combusted. ...

  14. Computer applications for the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Worth, G.A.; Patterson, J.R.

    1976-01-01

    Computer applications for the FFTF reactor include plant surveillance functions and fuel handling and examination control functions. Plant surveillance systems provide the reactor operator with a selection of over forty continuously updated, formatted displays of correlated data. All data are checked for limits and validity and the operator is advised of any anomaly. Data are also recorded on magnetic tape for historical purposes. The system also provides calculated variables, such as reactor thermal power and anomalous reactivity. Supplementing the basic plant surveillance computer system is a minicomputer system that monitors the reactor cover gas to detect and characterize absorber or fuel pin failures. In addition to plant surveillance functions, computers are used in the FFTF for controlling selected refueling equipment and for post-irradiation fuel pin examination. Four fuel handling or examination systems operate under computer control with manual monitoring and over-ride capability

  15. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
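
    To make "verifying the stabilizers" concrete, the sketch below (an illustration of stabilizer testing in general, not the paper's protocol) builds a small graph state and checks that each stabilizer generator K_i = X_i * prod_{j in N(i)} Z_j leaves it invariant.

        import numpy as np
        from functools import reduce

        I = np.eye(2); X = np.array([[0., 1.], [1., 0.]]); Z = np.diag([1., -1.])
        plus = np.array([1., 1.]) / np.sqrt(2)

        def kron_all(ops):
            return reduce(np.kron, ops)

        def cz(n, i, j):
            """Diagonal CZ gate between qubits i and j (qubit 0 = leftmost)."""
            d = np.ones(2 ** n)
            for b in range(2 ** n):
                if (b >> (n - 1 - i)) & 1 and (b >> (n - 1 - j)) & 1:
                    d[b] = -1.0
            return np.diag(d)

        # 3-qubit path graph 0-1-2: graph state |G> = CZ_01 CZ_12 |+++>
        n, edges = 3, [(0, 1), (1, 2)]
        g = kron_all([plus] * n)
        for i, j in edges:
            g = cz(n, i, j) @ g

        # Each generator K_i = X_i * prod_{j in N(i)} Z_j must satisfy K_i|G> = |G>
        neighbors = {0: [1], 1: [0, 2], 2: [1]}
        for i in range(n):
            ops = [X if q == i else (Z if q in neighbors[i] else I) for q in range(n)]
            assert np.allclose(kron_all(ops) @ g, g)
        print("all stabilizer checks passed")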

  16. Use of computed tomography in nondestructive testing of polymeric materials

    International Nuclear Information System (INIS)

    Persson, S.; Oestman, E.

    1985-01-01

    Computed tomography has been used to detect imperfections and to measure cross-link density gradients in polymeric products, such as airplane tires, rubber shock absorbers, and filament-wound high-pressure tanks.

  17. Mass Spectrometry Applications for Toxicology

    OpenAIRE

    Mbughuni, Michael M.; Jannetto, Paul J.; Langman, Loralie J.

    2016-01-01

    Toxicology is a multidisciplinary study of poisons, aimed at correlating the quantitative and qualitative relationships between poisons and their physiological and behavioural effects in living systems. Other key aspects of toxicology focus on elucidating the mechanisms of action of poisons and developing remedies and treatment plans for associated toxic effects. In these endeavours, mass spectrometry (MS) has become a powerful analytical technique with a wide range of application used i...

  18. Analysis of Statistical Methods Currently used in Toxicology Journals.

    Science.gov (United States)

    Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min

    2014-09-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by published studies are used consistently and conducted on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes of less than 10, with the median and mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was dominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being the most popular (52/93, 56%), yet few studies conducted either normality or equal variance tests. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health.
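    For concreteness, the descriptive and inferential statistics the survey found most common (mean, SD, SEM, one-way ANOVA, plus the often-skipped preliminary checks) look like this in a short sketch; the group data below are fabricated.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Three treatment groups of n = 6, matching the survey's typical sample size.
        groups = [rng.normal(loc=mu, scale=1.0, size=6) for mu in (10.0, 10.5, 13.0)]

        for i, g in enumerate(groups):
            sd = g.std(ddof=1)              # standard deviation
            sem = sd / np.sqrt(len(g))      # standard error of the mean
            print(f"group {i}: mean={g.mean():.2f}, SD={sd:.2f}, SEM={sem:.2f}")

        # Preliminary equal-variance check that the survey notes is often skipped
        # (Shapiro-Wilk would be the analogous normality check).
        print("Levene p =", stats.levene(*groups).pvalue)

        f, p = stats.f_oneway(*groups)      # one-way ANOVA
        print(f"one-way ANOVA: F={f:.2f}, p={p:.4f}")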

  19. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  20. Advancing alternatives analysis: The role of predictive toxicology in selecting safer chemical products and processes.

    Science.gov (United States)

    Malloy, Timothy; Zaunbrecher, Virginia; Beryt, Elizabeth; Judson, Richard; Tice, Raymond; Allard, Patrick; Blake, Ann; Cote, Ila; Godwin, Hilary; Heine, Lauren; Kerzic, Patrick; Kostal, Jakub; Marchant, Gary; McPartland, Jennifer; Moran, Kelly; Nel, Andre; Ogunseitan, Oladele; Rossi, Mark; Thayer, Kristina; Tickner, Joel; Whittaker, Margaret; Zarker, Ken

    2017-09-01

    Alternatives analysis (AA) is a method used in regulation and product design to identify, assess, and evaluate the safety and viability of potential substitutes for hazardous chemicals. It requires toxicological data for the existing chemical and potential alternatives. Predictive toxicology uses in silico and in vitro approaches, computational models, and other tools to expedite toxicological data generation in a more cost-effective manner than traditional approaches. The present article briefly reviews the challenges associated with using predictive toxicology in regulatory AA, then presents 4 recommendations for its advancement. It recommends using case studies to advance the integration of predictive toxicology into AA, adopting a stepwise process to employing predictive toxicology in AA beginning with prioritization of chemicals of concern, leveraging existing resources to advance the integration of predictive toxicology into the practice of AA, and supporting transdisciplinary efforts. The further incorporation of predictive toxicology into AA would advance the ability of companies and regulators to select alternatives to harmful ingredients, and potentially increase the use of predictive toxicology in regulation more broadly. Integr Environ Assess Manag 2017;13:915-925. © 2017 SETAC.

  1. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time; the longer processing time for the paper-pencil version might be the time needed to write the answer down and check that it was transferred correctly. It is still not known why students using the computer version (particularly low
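    The randomized matched-pair allocation described above is easy to sketch: rank students by prior score, then split each adjacent pair between the two formats at random. Names and scores below are fabricated for illustration.

        import random

        # Sketch of randomized matched-pair allocation by prior performance;
        # all data are fabricated, and this is not the study's actual code.
        def matched_pair_assign(students, seed=42):
            """students: list of (name, prior_score) tuples."""
            rng = random.Random(seed)
            ranked = sorted(students, key=lambda s: s[1], reverse=True)
            paper, computer = [], []
            for i in range(0, len(ranked) - 1, 2):
                pair = [ranked[i], ranked[i + 1]]  # adjacent scores form a pair
                rng.shuffle(pair)                  # coin flip within the pair
                paper.append(pair[0])
                computer.append(pair[1])
            return paper, computer

        students = [(f"s{i}", score) for i, score in
                    enumerate([78, 91, 64, 85, 70, 88])]
        paper, computer = matched_pair_assign(students)
        print("paper:", paper)
        print("computer:", computer)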

  2. [Toxicologic blood emergency screening].

    Science.gov (United States)

    Cohen, Sabine; Manat, Aurélie; Dumont, Benoit; Bévalot, Fabien; Manchon, Monique; Berny, Claudette

    2010-01-01

    In order to overcome the withdrawal from the market by the Bio-Rad company of its automated high-performance liquid chromatography system with UV detection (Remedi), we developed a gas chromatography-mass spectrometry (GC-MS) method to detect, and give an approximate quantification of, overdoses of molecules frequently encountered in drug intoxications. Two hundred eighty-seven blood samples were collected over a period of one year, allowing us to evaluate and compare the performance of the two techniques. For identification, GC-MS failed to identify all molecules detected by Remedi in 24.2% of cases; there is a lack of sensitivity for opiates and a systematic absence of certain molecules such as beta-blockers. However, in 75.8% of cases GC-MS detected all molecules found by Remedi, as well as others such as meprobamate, paracetamol, benzodiazepines, and phenobarbital. The concentrations obtained, interpreted in terms of overdose, showed 15.7% discrepancy and 84.3% concordance between the two techniques. The GC-MS technique described here is robust, fast, and relatively simple to implement; identification is facilitated by macro commands, while semi-quantification remains manual. Despite a column-cleaning sequence after each sample, carryover from one sample to the next remains possible. This technique can be used for toxicological screening in acute intoxications; nevertheless, it must be supplemented by HPLC with UV detection if molecules such as beta-blockers are suspected.
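    The concordance and discrepancy figures reported above come down to simple agreement counting between paired method calls, as in this sketch with fabricated data.

        # Toy method-comparison arithmetic; the sample calls are fabricated
        # and do not reproduce the study's data.
        def concordance(results_a, results_b):
            pairs = list(zip(results_a, results_b))
            agree = sum(1 for a, b in pairs if a == b)
            return agree / len(pairs)

        # "overdose" / "none" calls from two hypothetical techniques
        remedi = ["overdose", "none", "overdose", "none", "none"]
        gcms   = ["overdose", "none", "none",     "none", "none"]
        c = concordance(remedi, gcms)
        print(f"concordance: {c:.1%}, discrepancy: {1 - c:.1%}")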

  3. Toxicology of plutonium

    International Nuclear Information System (INIS)

    Bair, W.J.

    1974-01-01

    Data are reviewed from studies on the toxicity of Pu in experimental animals. Of the several plutonium isotopes, only 238Pu and 239Pu have been studied well. Sufficient results have been obtained to show that the behavior of 238Pu in biological systems and the resulting biological effects cannot be precisely predicted from studies of 239Pu. This probably applies also to other radiologically important plutonium isotopes, which have half-lives ranging from 45 days to 10^7 years and decay by β-emission, electron capture, and spontaneous fission, as well as by emission of α-particles. All the biological effects of plutonium described in this review are attributed to alpha-particle radiation emitted by the plutonium. However, since plutonium is a chemically active heavy metal, one cannot ignore the possibility of chemical toxicity of the low-specific-activity isotopes 239Pu, 242Pu, and 244Pu. The preponderance of our knowledge of plutonium toxicology has come from short-term studies at relatively high dosage levels in several animal species. The consequences of high-level internal exposures can be predicted with confidence in experimental animals and probably also in man. However, considering the care with which plutonium is handled in the nuclear industry, a high-level contamination event is unlikely. Considerably less is known about the long-term effects of low levels of contamination. (250 references) (U.S.)

  4. Predictive Models and Computational Toxicology (II IBAMTOX)

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  5. Predictive Toxicology and Computer Simulation of Male ...

    Science.gov (United States)

    The reproductive tract is a complex, integrated organ system with diverse embryology and unique sensitivity to prenatal environmental exposures that disrupt morphoregulatory processes and endocrine signaling. U.S. EPA’s in vitro high-throughput screening (HTS) database (ToxCastDB) was used to profile the bioactivity of 54 chemicals with male developmental consequences across ~800 molecular and cellular features. The in vitro bioactivity on molecular targets could be condensed into 156 gene annotations in a bipartite network. These results highlighted the role of estrogen and androgen signaling pathways in male reproductive tract development, and importantly, broadened the list of molecular targets to include GPCRs, cytochrome-P450s, vascular remodeling proteins, and retinoic acid signaling. A multicellular agent-based model was used to simulate the complex interactions between morphoregulatory, endocrine, and environmental influences during genital tubercle (GT) development. Spatially dynamic signals (e.g., SHH, FGF10, and androgen) were implemented in the model to address differential adhesion, cell motility, proliferation, and apoptosis. Under control of androgen signaling, urethral tube closure was an emergent feature of the model that was linked to gender-specific rates of ventral mesenchymal proliferation and urethral plate endodermal apoptosis. A systematic parameter sweep was used to examine the sensitivity of crosstalk between genetic deficiency and envi

  6. Distributed storage and cloud computing: a test case

    International Nuclear Information System (INIS)

    Piano, S; Della Ricca, G

    2014-01-01

    Since 2003 the computing farm hosted by the INFN Tier3 facility in Trieste has supported the activities of many scientific communities. Hundreds of jobs from 45 different VOs, including those of the LHC experiments, are processed simultaneously. Given that the requirements of the different computational communities are normally not synchronized, the probability that at any given time the resources owned by one of the participants are not fully utilized is quite high. A balanced compensation should in principle allocate the free resources to other users, but there are limits to this mechanism. In fact, the Trieste site may not hold the amount of data needed to attract enough analysis jobs, and even in that case there could be a lack of bandwidth for their access. The Trieste ALICE and CMS computing groups, in collaboration with other Italian groups, aim to overcome the limitations of existing solutions using two approaches: sharing the data among all the participants, taking full advantage of the GARR-X wide area network (10 Gb/s), and integrating the resources dedicated to batch analysis with the ones reserved for dynamic interactive analysis, through modern solutions such as cloud computing.

  7. The quark gluon plasma: Lattice computations put to experimental test

    Indian Academy of Sciences (India)

    I describe how lattice computations are being used to extract experimentally relevant features of the quark gluon plasma. I deal specifically with relaxation times, photon emissivity, strangeness yields, event-by-event fluctuations of conserved quantities and hydrodynamic flow. Finally I give evidence that the plasma is rather ...

  8. Transitioning the GED[R] Mathematics Test to Computer with and without Accommodations: A Pilot Project

    Science.gov (United States)

    Patterson, Margaret Becker; Higgins, Jennifer; Bozman, Martha; Katz, Michael

    2011-01-01

    We conducted a pilot study to see how the GED Mathematics Test could be administered on computer with embedded accessibility tools. We examined test scores and test-taker experience. Nineteen GED test centers across five states and 216 randomly assigned GED test candidates participated in the project. GED candidates completed two GED mathematics…

  9. Cancer and Toxicology Section

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    The Cancer and Toxicology Section is concerned with the investigation of the mechanisms by which chemicals, radiation, and viruses cause the changes broadly identified as cancer. In addition, the study of mechanisms has been extended to include the nontumorigenic effects of various agents associated with fossil energy and fuels. Research in the molecular genetics of carcinogenesis focuses largely on the transposon properties of the genomes of retroviruses. The transposon structure of the DNA genomes of endogenous murine N-tropic and B-tropic type C retroviruses is being elucidated, and their chromosomal location mapped in hamster-mouse cell hybrids. A model of the mechanism of retrovirus induction by radiation and chemicals is being developed, and experiments have established that compounds such as hydroxyurea act as inducers. There is the possibility that transposition of sequences of this endogenous virus may be linked to leukemogenesis. Research in regulation of gene expression aims at defining in molecular terms the mechanisms determining expression of specific genes, how these are regulated by hormones, and the events responsible for dysfunction of gene expression in cancer. In corollary work, a library of cloned cDNAs specific for products of genes of special interest to regulation is being developed. Improvement of reversed-phase chromatography as a means of isolating bacterial plasmids and restriction fragments of DNA is underway. Newly developed techniques permit the isolation of supercoiled plasmid DNA directly from bacterial extracts. Technology has also recently been developed for the photosynthetic growth of the chemoautotrophic organism Rhodospirillum rubrum, and the enzyme ribulose bisphosphate carboxylase has been produced in quantity.

  10. Network architecture test-beds as platforms for ubiquitous computing.

    Science.gov (United States)

    Roscoe, Timothy

    2008-10-28

    Distributed systems research, and in particular ubiquitous computing, has traditionally assumed the Internet as a basic underlying communications substrate. Recently, however, the networking research community has come to question the fundamental design or 'architecture' of the Internet. This has been led by two observations: first, that the Internet as it stands is now almost impossible to evolve to support new functionality; and second, that modern applications of all kinds now use the Internet rather differently, and frequently implement their own 'overlay' networks above it to work around its perceived deficiencies. In this paper, I discuss recent academic projects to allow disruptive change to the Internet architecture, and also outline a radically different view of networking for ubiquitous computing that such proposals might facilitate.

  11. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  12. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  13. Computer simulation and cold model testing of CCL cavities

    International Nuclear Information System (INIS)

    Chang, C.R.; Yao, C.G.; Swenson, D.A.; Funk, L.W.

    1993-01-01

    The SSC coupled-cavity linac (CCL) consists of nine modules with eight tanks in each module. Multicavity, magnetically coupled bridge couplers are used to couple the eight tanks within a module into one RF resonant chain. The operating frequency is 1282.851 MHz. In this paper the authors discuss both computer calculations and cold model measurements used to determine the geometric dimensions of the RF structure.

  14. Training Senior Teachers in Compulsory Computer Based Language Tests

    Science.gov (United States)

    Laborda, Jesus Garcia; Royo, Teresa Magal

    2009-01-01

    The IBT TOEFL has become the principal example of online high-stakes language testing since 2005. Most instructors who prepare students for the IBT TOEFL face two main realities: first, students are eager and highly motivated to take the test because of the prospective implications; and, second, specific studies would be necessary to see if…

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  16. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  19. Computer-assisted organization of the weld seam test

    International Nuclear Information System (INIS)

    Lorenz, H.; Richter, I.; Hansen, W.

    1986-01-01

    The article describes the task set and the solution found for an EDP program to assist non-destructive testing. It covers the activities of test planning, disposition, calculation, test instructions, documentation, and quality statistics. The development and implementation phases have essentially been concluded. The program is not expected to reduce the actual test work; its advantages result rather from complete planning and execution, early detection of deviations, easier documentation, and fast as well as easily surveyed information. Once the program has been fully integrated into the flow schedule of order handling, 'additional work' is expected to be reduced by an amount close to 15 percent of the total work invested. By transferring the invested work from the test and documentation phase to the planning phase, the program system meets the principle of modern quality assurance, namely to intensify measures for preventing errors. (orig.) [de]

  20. Advancing the 3Rs in Regulatory Toxicology - Carcinogenicity Testing: Scope for Harmonisation and Advancing the 3Rs in Regulated Sectors of the European Union

    Science.gov (United States)

    Abstract Different government agencies operating in the European Union regulate different types of chemical products, but all require testing for carcinogenicity to support applications for product marketing and commercialisation. A conference was held in Brussels in 2013 where ...

  1. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    Science.gov (United States)

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion about the applications of NLP techniques for computer assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  2. Systematic Testing should not be a Topic in the Computer Science Curriculum!

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    In this paper we argue that treating "testing" as an isolated topic is a wrong approach in computer science and software engineering teaching. Instead testing should pervade practical topics and exercises in the computer science curriculum to teach students the importance of producing software...

  3. The analog computation and contrast test of leaked electromagnetic noise in the klystron corridor

    International Nuclear Information System (INIS)

    Tao Xiaoping; Wang Guicheng

    2001-01-01

    In order to obtain a better understanding of the characteristics and location of the noise source, the leaked electromagnetic noise in the klystron corridor of NSRL has been computed by analog simulation. The computational method and formula for the high-frequency leaked noise of the modulator are given. On-the-spot contrast tests have been made on the basis of the simulation. The contrast test results show the reasonableness of the simulation and thereby offer a theoretical basis for reducing noise leakage in the corridor.

  4. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    Science.gov (United States)

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  5. 2007 TOXICOLOGY AND RISK ASSESSMENT ...

    Science.gov (United States)

    EPA has announced the 2007 Toxicology and Risk Assessment Conference, Cincinnati Marriott North, West Chester (Cincinnati), OH, April 23-26, 2007. The Annual Toxicology and Risk Assessment Conference is a unique meeting where several government agencies come together to discuss toxicology and risk assessment issues that are of concern not only to the government but also to a broader audience including academia and industry. The theme of this year's conference is Emerging Issues and Challenges in Risk Assessment. The preliminary agenda includes plenary sessions with prominent speakers (tentative) on: Issues of Emerging Chemical Contaminants; Uncertainty and Variability in Risk Assessment; Use of Mechanistic Data in IARC Evaluations. Parallel sessions: Uncertainty and Variability in Dose-Response Assessment; Recent Advances in Toxicity and Risk Assessment of RDX; The Use of Epidemiologic Data for Risk Assessment Applications; Cumulative Health Risk Assessment:

  6. Advancing the 3Rs in regulatory toxicology - Carcinogenicity testing: Scope for harmonisation and advancing the 3Rs in regulated sectors of the European Union.

    Science.gov (United States)

    Annys, Erwin; Billington, Richard; Clayton, Rick; Bremm, Klaus-Dieter; Graziano, Michael; McKelvie, Jo; Ragan, Ian; Schwarz, Michael; van der Laan, Jan Willem; Wood, Charles; Öberg, Mattias; Wester, Piet; Woodward, Kevin N

    2014-07-01

    Different government agencies operating in the European Union regulate different types of chemical products but all require testing for carcinogenicity to support applications for product marketing and commercialisation. A conference was held in Brussels in 2013 where representatives of the pharmaceutical, animal health, chemical and plant protection industries, together with representatives of regulatory agencies, universities and other stakeholders, met under the auspices of The European Partnership for Alternative Approaches to Animal Testing (EPAA) to discuss the varying requirements for carcinogenicity testing, and how these studies might be refined to improve hazard evaluation and risk assessment while implementing principles of the 3Rs (replacement, refinement and reduction in animal studies). While there are some similarities, the regulatory approaches in pharmaceutical, animal health, chemical and plant protection sectors have varying degrees of flexibility in requirements for carcinogenicity testing, to an extent reflecting concerns over the magnitude and duration of human exposure, either directly as in therapeutic exposure to pharmaceuticals, or indirectly through the ingestion of residues of veterinary drugs or plant protection chemicals. The article discusses these differences and other considerations for modified carcinogenicity testing paradigms on the basis of scientific and 3Rs approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Developmental and Reproductive Toxicology Database (DART)

    Data.gov (United States)

    U.S. Department of Health & Human Services — A bibliographic database on the National Library of Medicine's (NLM) Toxicology Data Network (TOXNET) with references to developmental and reproductive toxicology...

  8. IRIS Toxicological Review of Acrolein (2003 Final)

    Science.gov (United States)

    EPA announced the release of the final report, Toxicological Review of Acrolein, in support of the Integrated Risk Information System (IRIS). The updated Summary for Acrolein and the accompanying toxicological review have been added to the IRIS Database.

  9. Computer-controlled environmental test systems - Criteria for selection, installation, and maintenance.

    Science.gov (United States)

    Chapman, C. P.

    1972-01-01

    Applications for presently marketed, new computer-controlled environmental test systems are suggested. It is shown that capital costs of these systems follow an exponential cost-function curve that levels out as additional applications are implemented. Some test laboratory organization changes are recommended in terms of new personnel requirements, and facility modifications are considered in support of a computer-controlled test system. Software for computer-controlled test systems is discussed, and control loop speed constraints are defined for real-time control functions. Suitable input and output devices and memory storage device tradeoffs are also considered.

  10. Comparing Postsecondary Marketing Student Performance on Computer-Based and Handwritten Essay Tests

    Science.gov (United States)

    Truell, Allen D.; Alexander, Melody W.; Davis, Rodney E.

    2004-01-01

    The purpose of this study was to determine if there were differences in postsecondary marketing student performance on essay tests based on test format (i.e., computer-based or handwritten). Specifically, the variables of performance, test completion time, and gender were explored for differences based on essay test format. Results of the study…

  11. Inhalation toxicology. I., Design of a small-animal test system, II. Determination of the relative toxic hazards of 75 aircraft cabin materials.

    Science.gov (United States)

    1977-01-01

    In an effort to further the cause of increased safety for those who ride in commercial aircraft, this paper presents a detailed description of the genesis of a small-scale, laboratory test system that utilizes small animals to evaluate the relative t...

  12. Test experience on an ultrareliable computer communication network

    Science.gov (United States)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imparted to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time by the growth algorithm and the relationship of growth time to network size and topology.

  13. Comptox Chemistry Dashboard: Web-Based Data Integration Hub for Environmental Chemistry and Toxicology Data (ACS Fall meeting 4 of 12)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and da...

  14. Application of computer techniques to charpy impact testing of irradiated pressure vessel steels

    International Nuclear Information System (INIS)

    Landow, M.P.; Fromm, E.O.; Perrin, J.S.

    1982-01-01

    A Rockwell AIM 65 microcomputer has been modified to control a remote Charpy V-notch impact test machine. It controls not only handling and testing of the specimen but also transfer and storage of instrumented Charpy test data. A system of electrical solenoid-activated pneumatic cylinders and switches provides the interface between the computer and the test apparatus. A command language has been designed that allows the operator to direct checkout, test procedure, and data storage via the computer. Automatic compliance with ASTM test procedures is built into the program.

  15. Constraints in Teacher Training for Computer Assisted Language Testing Implementation

    Science.gov (United States)

    Garcia Laborda, Jesus; Litzler, Mary Frances

    2011-01-01

    Many ELT examinations have gone online in the last few years and a large number of educational institutions have also started considering the possibility of implementing their own tests. This paper deals with the training of a group of 24 ELT teachers in the Region of Valencia (Spain). In 2007, the Ministry of Education provided funds to determine…

  16. Collaborative development of predictive toxicology applications.

    Science.gov (United States)

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH
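    Since the Framework exposes compounds, datasets, algorithms, models, and validation as REST resources, a client interaction can be sketched as below. The base URL is a hypothetical placeholder, and the exact resource paths and media types should be checked against the published OpenTox API rather than taken from this sketch.

        import requests

        BASE = "https://opentox.example.org"  # hypothetical service, not a live deployment

        # List available algorithms; a plain-text URI list is the usual form.
        resp = requests.get(f"{BASE}/algorithm",
                            headers={"Accept": "text/uri-list"})
        resp.raise_for_status()
        algorithm_uris = resp.text.splitlines()

        # Train a model by POSTing a dataset URI to an algorithm resource;
        # a task or model URI comes back (dataset id here is illustrative).
        resp = requests.post(
            algorithm_uris[0],
            data={"dataset_uri": f"{BASE}/dataset/1"},
            headers={"Accept": "text/uri-list"},
        )
        resp.raise_for_status()
        print("model/task URI:", resp.text.strip())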

  17. Collaborative development of predictive toxicology applications

    Directory of Open Access Journals (Sweden)

    Hardy Barry

    2010-08-01

    Full Text Available Abstract OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure

  18. 42 CFR 493.1213 - Condition: Toxicology.

    Science.gov (United States)

    2010-10-01

    42 CFR 493.1213 (Public Health; Centers for Medicare & Medicaid Services, Department of Health and Human Services), Condition: Toxicology. If the laboratory provides services in the subspecialty of Toxicology, the...

  19. A Comparison of Computer-Based Classification Testing Approaches Using Mixed-Format Tests with the Generalized Partial Credit Model

    Science.gov (United States)

    Kim, Jiseon

    2010-01-01

    Classification testing has been widely used to make categorical decisions by determining whether an examinee has a certain degree of ability required by established standards. As computer technologies have developed, classification testing has become more computerized. Several approaches have been proposed and investigated in the context of…

  20. Chemical products toxicological tests performed on lake and river fish; Essai toxicologiques de produits chimiques sur des poissons d'eau douce

    Energy Technology Data Exchange (ETDEWEB)

    Teulon, F.; Simeon, C. [Commissariat a l' Energie Atomique, Centre de Pierrelatte (France). Centre d' Etudes Nucleaires

    1966-07-01

    The volume and toxicity of industrial and urban effluents are steadily growing, and the hazards of acute or chronic pollution increase in proportion. It is therefore necessary to determine the minimum lethal doses for fish (1 h or 6 h, according to the applicable conventions) of the possible components of the effluents concerned. The tests described in this report are the following: toxicity of some chemical products tested individually (sodium sulphate, sodium chloride, sodium fluoride, etc.); toxicity of some metal ions (Al3+, Fe2+ and Fe3+, Pb2+, etc.); toxicity of some mixtures for various fish species (sun perch, tench, goldfish, roach, gudgeon, bleak). The figures obtained represent local values and may serve as references and as a working basis for calculating the characteristics of effluents at their release. (author)

  1. Single-dose Intramuscular-injection Toxicology Test of Water-soluble Carthami-flos and Cervi cornu parvum Pharmacopuncture in a Rat Model

    Directory of Open Access Journals (Sweden)

    Sunju Park

    2015-09-01

    Full Text Available Objectives: The aim of the study is to investigate both the single-dose intramuscular injection toxicity and the approximate lethal dose of water-soluble Carthami-flos and Cervi cornu parvum pharmacopuncture (WCFC) in male and female Sprague-Dawley (SD) rats. Methods: The study was conducted at Biotoxtech Co. according to the Good Laboratory Practice (GLP) regulation and the toxicity test guidelines of the Ministry of Food and Drug Safety (MFDS) after approval of the Institutional Animal Care and Use Committee. Dosages for the control, high-dose, middle-dose and low-dose groups were 0.5 mL/animal of saline and 0.5, 0.25 and 0.125 mL/animal of WCFC, respectively. WCFC was injected into the muscle of the left femoral region by using a disposable syringe (1 mL, 26 gauge). The general symptoms and mortality were observed 30 minutes, 1, 2, 4, and 6 hours after the injection and then daily for 14 days. The body weights of the SD rats were measured on the day of the injection (before injection) and on the third, seventh, and fourteenth days after the injection. Serum biochemical and hematologic tests, necropsy examinations, and histopathologic examinations at the injection site were performed after the observation period. Results: No deaths, abnormal clinical symptoms, or significant weight changes were observed in either male or female SD rats in the control or the test (0.125, 0.25, and 0.5 mL/animal) groups during the observation period. No significant differences in hematology and serum biochemistry and no macroscopic abnormalities at necropsy were found. No abnormal reactions at injection sites were noted on the topical tolerance tests. Conclusion: The results of this single-dose toxicity study show that WCFC is safe, its lethal doses in male and female SD rats being estimated to be higher than 0.5 mL/animal.

  2. Pulmonary toxicology of respirable particles

    International Nuclear Information System (INIS)

    Sanders, C.L.; Cross, F.T.; Dagle, G.E.; Mahaffey, J.A.

    1980-09-01

    Separate abstracts were prepared for the 44 papers presented in these proceedings that deal with radioactive particles. The last paper (Stannard) in the proceedings is an historical review of the field of inhalation toxicology and is not included in the analytic records.

  3. Surprises and omissions in toxicology

    Czech Academy of Sciences Publication Activity Database

    Rašková, H.; Zídek, Zdeněk

    2004-01-01

    Roč. 12, - (2004), S94-S96 ISSN 1210-7778. [Interdisciplinary Czech-Slovak Toxicological Conference /8./. Praha, 03.09.2004-05.09.2004] Institutional research plan: CEZ:AV0Z5008914 Keywords: bacterial toxins Subject RIV: FR - Pharmacology; Medicinal Chemistry

  4. Juvenile Toxicology: Relevance and Challenges for Toxicologists and Pathologists

    Science.gov (United States)

    Remick, Amera K.; Catlin, Natasha R.; Quist, Erin M.; Steinbach, Thomas J.; Dixon, Darlene

    2015-01-01

    The Society of Toxicologic Pathology (STP) Education Committee and the STP Reproductive Special Interest Group held a North Carolina regional meeting entitled, “Juvenile Toxicology: Relevance and Challenges for Toxicologists and Pathologists” on March 13, 2015, at the National Institute of Environmental Health Sciences/National Toxicology Program in Research Triangle Park, North Carolina. The purpose of this regional meeting was to familiarize attendees with the topic of juvenile toxicity testing and discuss its relevance to clinical pediatric medicine, regulatory perspectives, challenges of appropriate study design confronted by toxicologists, and challenges of histopathologic examination and interpretation of juvenile tissues faced by pathologists. The 1-day meeting was a success with over 60 attendees representing industry, government, research organizations, and academia. PMID:26220944

  5. A Cerebellar Neuroprosthetic System: Computational Architecture and in vivo Test

    International Nuclear Information System (INIS)

    Herreros, Ivan; Giovannucci, Andrea; Taub, Aryeh H.; Hogri, Roni; Magal, Ari; Bamford, Sim; Prueckl, Robert; Verschure, Paul F. M. J.

    2014-01-01

    Emulating the input–output functions performed by a brain structure opens the possibility for developing neuroprosthetic systems that replace damaged neuronal circuits. Here, we demonstrate the feasibility of this approach by replacing the cerebellar circuit responsible for the acquisition and extinction of motor memories. Specifically, we show that a rat can undergo acquisition, retention, and extinction of the eye-blink reflex even though the biological circuit responsible for this task has been chemically inactivated via anesthesia. This is achieved by first developing a computational model of the cerebellar microcircuit involved in the acquisition of conditioned reflexes and training it with synthetic data generated based on physiological recordings. Secondly, the cerebellar model is interfaced with the brain of an anesthetized rat, connecting the model’s inputs and outputs to afferent and efferent cerebellar structures. As a result, we show that the anesthetized rat, equipped with our neuroprosthetic system, can be classically conditioned to the acquisition of an eye-blink response. However, non-stationarities in the recorded biological signals limit the performance of the cerebellar model. Thus, we introduce an updated cerebellar model and validate it with physiological recordings showing that learning becomes stable and reliable. The resulting system represents an important step toward replacing lost functions of the central nervous system via neuroprosthetics, obtained by integrating a synthetic circuit with the afferent and efferent pathways of a damaged brain region. These results also embody an early example of science-based medicine, where on the one hand the neuroprosthetic system directly validates a theory of cerebellar learning that informed the design of the system, and on the other one it takes a step toward the development of neuro-prostheses that could recover lost learning functions in animals and, in the longer term, humans.

  6. A Cerebellar Neuroprosthetic System: Computational Architecture and in vivo Test

    Energy Technology Data Exchange (ETDEWEB)

    Herreros, Ivan; Giovannucci, Andrea [Synthetic Perceptive, Emotive and Cognitive Systems group (SPECS), Universitat Pompeu Fabra, Barcelona (Spain); Taub, Aryeh H.; Hogri, Roni; Magal, Ari [Psychobiology Research Unit, Tel Aviv University, Tel Aviv (Israel); Bamford, Sim [Physics Laboratory, Istituto Superiore di Sanità, Rome (Italy); Prueckl, Robert [Guger Technologies OG, Graz (Austria); Verschure, Paul F. M. J., E-mail: paul.verschure@upf.edu [Synthetic Perceptive, Emotive and Cognitive Systems group (SPECS), Universitat Pompeu Fabra, Barcelona (Spain); Institució Catalana de Recerca i Estudis Avançats, Barcelona (Spain)

    2014-05-21

    Emulating the input–output functions performed by a brain structure opens the possibility for developing neuroprosthetic systems that replace damaged neuronal circuits. Here, we demonstrate the feasibility of this approach by replacing the cerebellar circuit responsible for the acquisition and extinction of motor memories. Specifically, we show that a rat can undergo acquisition, retention, and extinction of the eye-blink reflex even though the biological circuit responsible for this task has been chemically inactivated via anesthesia. This is achieved by first developing a computational model of the cerebellar microcircuit involved in the acquisition of conditioned reflexes and training it with synthetic data generated based on physiological recordings. Secondly, the cerebellar model is interfaced with the brain of an anesthetized rat, connecting the model’s inputs and outputs to afferent and efferent cerebellar structures. As a result, we show that the anesthetized rat, equipped with our neuroprosthetic system, can be classically conditioned to the acquisition of an eye-blink response. However, non-stationarities in the recorded biological signals limit the performance of the cerebellar model. Thus, we introduce an updated cerebellar model and validate it with physiological recordings showing that learning becomes stable and reliable. The resulting system represents an important step toward replacing lost functions of the central nervous system via neuroprosthetics, obtained by integrating a synthetic circuit with the afferent and efferent pathways of a damaged brain region. These results also embody an early example of science-based medicine, where on the one hand the neuroprosthetic system directly validates a theory of cerebellar learning that informed the design of the system, and on the other one it takes a step toward the development of neuro-prostheses that could recover lost learning functions in animals and, in the longer term, humans.

  7. A survey on microorganisms and their sensitivity by E-test in ventilator-associated pneumonia at Toxicological-Intensive Care Unit of Loghman-Hakim Hospital.

    Science.gov (United States)

    Talaie, Haleh; Sabeti, Shahram; Mahdavinejad, Arezou; Barari, Behjat; Kamalbeik, Sepideh

    2010-12-01

    Ventilator-associated pneumonia (VAP) is the most common nosocomial infection in ICUs, with high mortality and morbidity. The diagnostic method for VAP is based on a combination of clinical, radiological, and microbiological criteria. Lower respiratory tract culture results are useful to confirm the etiology of VAP and to adjust antibiotics. Endotracheal aspiration (EA) is the simplest noninvasive technique for performing lower respiratory tract culture, with high sensitivity and moderately high specificity. The aim of this survey was to evaluate the quantitative cultures of endotracheal aspirates in VAP patients and the sensitivity patterns of microorganisms through E-test. Among 582 ICU-admitted patients who were under mechanical ventilation for more than 48 hours, 72 patients with suspected VAP were prospectively evaluated over a 10-month period. Evaluation of our ICU standards by APACHE III scoring and GCS was carried out on the first day of admission in all patients. Quantitative cultures of EA were performed on all 72 patients. The antibiotic resistance pattern of isolated pathogens was defined by E-test. VAP was confirmed in 46 out of 72 cases (50, 69.4% males and 22, 30.6% females; mean age 33 +/- 12 years) through quantitative cultures of EA samples. The probable incidence of VAP was 7.9% (per ventilated patients >= 48 hours). The mean APACHE III score was 31.28 +/- 16. GCS in most of the patients was between 8 and 12. Staphylococcus aureus was the most frequently isolated organism (58.7%), with high sensitivity to Amikacin, Ciprofloxacin, and Teicoplanin (>92%); Pseudomonas aeruginosa was the second most frequent organism (17.4%); Acinetobacter isolates were potentially drug resistant, and only Amikacin was effective. Tracheal aspirates in combination with clinical findings play an important role in the management of VAP and decrease inappropriate antimicrobial therapy. S. aureus is the main agent leading to VAP in the TICU of the Loghman

  8. Regulatory issues in accreditation of toxicology laboratories.

    Science.gov (United States)

    Bissell, Michael G

    2012-09-01

    Clinical toxicology laboratories and forensic toxicology laboratories operate in a highly regulated environment. This article outlines major US legal/regulatory issues and requirements relevant to accreditation of toxicology laboratories (state and local regulations are not covered in any depth). The most fundamental regulatory distinction involves the purposes for which the laboratory operates: clinical versus nonclinical. The applicable regulations and the requirements and options for operations depend most basically on this consideration, with clinical toxicology laboratories being directly subject to federal law including mandated options for accreditation and forensic toxicology laboratories being subject to degrees of voluntary or state government–required accreditation.

  9. Behavioral Screening for Toxicology.

    Science.gov (United States)

    Screening for behavioral toxicity, or neurotoxicity, has been in use for decades; however, only in the past 20 years has this become a standard practice in toxicology. Current screening batteries, such as the functional observational battery (FOB), are derived from protocols used in pharmacology, toxicology, and psychology. Although there is a range of protocols in use today, all focus on detailed observations and specific tests of reflexes and responses. Several neurological functions are typically assessed, including autonomic, neuromuscular, and sensory, as well as levels of activity and excitability. The tests have been shown to be valid in detecting expected effects of known neurotoxicants, and reliable and reproducible when compared across laboratories. Regardless of the specific protocol used, proper conduct and statistical analyses of the data are critical. Interpretation is based on the information from individual end points as well as the profile, or pattern, of effects observed. As long as continual refinements are made, behavioral screening methods will continue to be important tools with which to protect human health in the future. Keywords: autonomic function; behavior; behavioral phenotypes; behavioral toxicity; excitability; functional observational battery; motor activity; mouse; neuromuscular function; positive controls; rat; screening battery; sensory function.

  10. Integration of QSAR and in vitro toxicology.

    Science.gov (United States)

    Barratt, M D

    1998-01-01

    The principles of quantitative structure-activity relationships (QSAR) are based on the premise that the properties of a chemical are implicit in its molecular structure. Therefore, if a mechanistic hypothesis can be proposed linking a group of related chemicals with a particular toxic end point, the hypothesis can be used to define relevant parameters to establish a QSAR. Ways in which QSAR and in vitro toxicology can complement each other in development of alternatives to live animal experiments are described and illustrated by examples from acute toxicological end points. Integration of QSAR and in vitro methods is examined in the context of assessing mechanistic competence and improving the design of in vitro assays and the development of prediction models. The nature of biological variability is explored together with its implications for the selection of sets of chemicals for test development, optimization, and validation. Methods are described to support the use of data from in vivo tests that do not meet today's stringent requirements of acceptability. Integration of QSAR and in vitro methods into strategic approaches for the replacement, reduction, and refinement of the use of animals is described with examples. PMID:9599692

  11. The need for a paradigm shift in toxicology.

    Science.gov (United States)

    This manuscript briefly reviews the NAS report “Toxicity Testing in the 21st Century: A Vision and A Strategy” and its potential impact on the field of toxicology. This report provides a strategic and tactical framework for attaining the goals of deter...

  12. Strategies for improving approximate Bayesian computation tests for synchronous diversification.

    Science.gov (United States)

    Overcast, Isaac; Bagley, Justin C; Hickerson, Michael J

    2017-08-24

    Estimating the variability in isolation times across co-distributed taxon pairs that may have experienced the same allopatric isolating mechanism is a core goal of comparative phylogeography. The use of hierarchical Approximate Bayesian Computation (ABC) and coalescent models to infer temporal dynamics of lineage co-diversification has been a contentious topic in recent years. Key issues that remain unresolved include the choice of an appropriate prior on the number of co-divergence events (Ψ), as well as the optimal strategies for data summarization. Through simulation-based cross validation we explore the impact of the strategy for sorting summary statistics and the choice of prior on Ψ on the estimation of co-divergence variability. We also introduce a new setting (β) that can potentially improve estimation of Ψ by enforcing a minimal temporal difference between pulses of co-divergence. We apply this new method to three empirical datasets: one dataset each of co-distributed taxon pairs of Panamanian frogs and freshwater fishes, and a large set of Neotropical butterfly sister-taxon pairs. We demonstrate that the choice of prior on Ψ has little impact on inference, but that sorting summary statistics yields substantially more reliable estimates of co-divergence variability despite violations of assumptions about exchangeability. We find the implementation of β improves estimation of Ψ, with improvement being most dramatic given larger numbers of taxon pairs. We find equivocal support for synchronous co-divergence for both of the Panamanian groups, but we find considerable support for asynchronous divergence among the Neotropical butterflies. Our simulation experiments demonstrate that using sorted summary statistics results in improved estimates of the variability in divergence times, whereas the choice of hyperprior on Ψ has negligible effect. Additionally, we demonstrate that estimating the number of pulses of co-divergence across co-distributed taxon
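
    The rejection-ABC logic with sorted summary statistics can be illustrated with a toy example. The sketch below is not the authors' hierarchical coalescent machinery: the simulator simply draws each pair's divergence time from one of Ψ pulses, the prior and acceptance rule are arbitrary, and the β setting is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_pairs, psi):
    """Draw divergence times for n_pairs taxon pairs that fall into
    psi synchronous pulses (toy stand-in for a coalescent simulation)."""
    pulse_times = rng.uniform(0.0, 1.0, size=psi)
    return rng.choice(pulse_times, size=n_pairs)

def summaries(times, sort=True):
    # Sorting makes the statistics exchangeable across taxon pairs.
    return np.sort(times) if sort else times

n_pairs, n_sims = 8, 20000
observed = summaries(simulate(n_pairs, psi=2))   # pretend this is the data

distances, psis = [], []
for _ in range(n_sims):
    psi = rng.integers(1, n_pairs + 1)           # discrete uniform prior on psi
    sim = summaries(simulate(n_pairs, psi))
    distances.append(np.mean(np.abs(sim - observed)))
    psis.append(psi)

# rejection step: keep the 1% of simulations closest to the observed data
cutoff = np.quantile(distances, 0.01)
accepted = [p for p, d in zip(psis, distances) if d <= cutoff]
print("posterior mode of psi:", np.bincount(accepted).argmax())
```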

  13. Computer aided instrumented Charpy test applied dynamic fracture toughness evaluation system

    International Nuclear Information System (INIS)

    Kobayashi, Toshiro; Niinomi, Mitsuo

    1986-01-01

    A microcomputer-aided data treatment system and a personal-computer-aided data analysis system were applied to the traditional instrumented Charpy impact test system. The analysis of Charpy absorbed energy (E_i, E_p, E_t) and load (P_y, P_m), and the evaluation of dynamic toughness through the whole fracture process, i.e. J_Id, the J-R curve and T_mat, were examined using the newly developed computer-aided instrumented Charpy impact test system. E_i, E_p, E_t, P_y and P_m were effectively analyzed using the moving average method and printed out automatically by the microcomputer-aided data treatment system. J_Id, the J-R curve and T_mat could be measured by the stop block test method. Then, J_Id, the J-R curve and T_mat were effectively estimated using the compliance changing rate method and the key curve method on the load-load point displacement curve of a single fatigue-cracked specimen by the personal-computer-aided data analysis system. (author)
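
    As an illustration of the kind of post-processing such a system automates, the sketch below smooths a load-displacement trace with a moving average and integrates it to split the absorbed energy into initiation and propagation parts. The trace and window length are synthetic assumptions, not data from the paper.

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal integration of y over x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def absorbed_energies(load, disp, window=5):
    """Split Charpy absorbed energy into E_i (to maximum load P_m),
    E_p (propagation), and E_t (total) from a smoothed trace."""
    kernel = np.ones(window) / window
    smooth = np.convolve(load, kernel, mode="same")  # moving-average filter
    i_max = int(np.argmax(smooth))                   # index of P_m
    e_t = trapezoid(smooth, disp)
    e_i = trapezoid(smooth[: i_max + 1], disp[: i_max + 1])
    return e_i, e_t - e_i, e_t

# synthetic trace: linear rise to P_m, then an exponential fracture tail
disp = np.linspace(0.0, 10.0, 500)                                  # mm
load = np.where(disp < 4.0, 5.0 * disp, 20.0 * np.exp(4.0 - disp))  # kN
e_i, e_p, e_t = absorbed_energies(load, disp)
print(f"E_i={e_i:.1f}  E_p={e_p:.1f}  E_t={e_t:.1f}  (kN*mm = J)")
```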

  14. Computer analysis on ANO-2 turbine trip test

    International Nuclear Information System (INIS)

    Senda, Yasuhide; Kanda, Keiji; McDonald, T.A.; Tessier, J.H.; Abramson, P.B.

    1983-01-01

    Safety analyses for nuclear power plants usually rely on codes so detailed and large that they can be expensive and time-consuming to run. It is preferable to employ a simplified plant model to save cost and time. In this research, a turbine trip test performed at Arkansas Nuclear One-Unit 2 (ANO-2) was analyzed using RELAP5 with a simplified plant model in order to evaluate that model for the turbine trip. Before the closure of the Main Steam Isolation Valve (MSIV), the calculation results agree well with the experimental data. After the MSIV closure, the results of the calculation explain the experimental data fairly well except for pressure recovery in the pressurizer. (author)

  15. Test computations on the dynamical evolution of star clusters

    International Nuclear Information System (INIS)

    Angeletti, L.; Giannone, P.

    1977-01-01

    Test calculations have been carried out on the evolution of star clusters using the fluid-dynamical method devised by Larson (1970). Large systems of stars have been considered, with specific attention to globular clusters. With reference to the analogous 'standard' model by Larson, the influence on the results of varying, in turn, the free parameters (cluster mass, star mass, tidal radius, mass concentration of the initial model) has been studied. Furthermore, the partial release of some simplifying assumptions with regard to the relaxation time and the distribution of the 'target' stars has been considered. The change of the structural properties is discussed, and the variation of the evolutionary time scale is outlined. An indicative agreement of the results obtained here with structural properties of globular clusters as deduced from previous theoretical models is pointed out. (Auth.)

  16. Radiochromic film calibration for dosimetry in computed tomography tests

    Energy Technology Data Exchange (ETDEWEB)

    Costa, K. C.; Prata M, A. [Federal Center for Technological Education of Minas Gerais, Biomedical Engineering Center, Av. Amazonas 5253, Nova Suica, 30421-169 Belo Horizonte, Minas Gerais (Brazil); Ladino G, A. M. [Federal University of Minas Gerais, Department of Nuclear Engineering, Av. Antonio Carlos 6627, Pampulha, 31270-90 Belo Horizonte, Minas Gerais (Brazil); Costa, K. L., E-mail: apratabhz@gmail.com [University of Itauna, Medicine Department, Rodovia Mg 431 Km 45 s/n, El Dorado, 35680-142 Itauna, Minas Gerais (Brazil)

    2017-10-15

    Radiochromic film applications in dosimetry have become increasingly significant for studies on radiotherapy and diagnostic tests. Due to their sensitivity to ionizing radiation, radiochromic films are commonly used to obtain dose distribution maps. The objective of this study is to obtain the calibration curves of the radiochromic film exposed to the X-ray beam of a computerized tomography (CT) scanner, in order to measure typical doses found in radiodiagnosis tests. Gafchromic Xr-AQ2 film was used, which shows little sensitivity to visible light and a response in the range of 0.1 to 20 Gy for X-ray beams at tube voltages ranging from 20 kV to 200 kV. In the experiments, a cylindrical polymethylmethacrylate (PMMA) head phantom with five openings was used. This phantom was placed at the CT scanner isocenter, and radiochromic film strips were placed into two openings. The irradiations were performed in a Toshiba Asteion scanner that allows acquisitions in helical mode. The central slice of the head phantom was irradiated to obtain the values of air kerma in PMMA measured with a pencil ionization chamber. Thereafter, radiochromic film strips were placed into the central and one peripheral opening, and 10 cm long scans of the central region of the phantom were carried out with a tube voltage of 120 kV. The strips, irradiated with different X-ray tube currents, were scanned and processed using the ImageJ software to obtain the intensity values resulting from the absorbed radiation by optical density analysis. The calibration curves were obtained for both the central and peripheral regions, corresponding to the values of air kerma in PMMA measured with the ionization chamber. With these curves in hand, CT experiments can use radiochromic films as a dosimetry method and then seek the generation of images with lower dose deposition and higher diagnostic quality. (Author)
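
    A minimal sketch of the calibration step described above: fit a smooth curve to (net optical density, air kerma) pairs and invert it to read a dose off a film strip. The calibration points below are invented for illustration and are not the values measured in this study.

```python
import numpy as np

# hypothetical calibration points: air kerma in PMMA (mGy) measured with
# a pencil ionization chamber vs net optical density of the scanned film
kerma = np.array([0.0, 10.0, 20.0, 40.0, 80.0, 120.0])
net_od = np.array([0.000, 0.021, 0.040, 0.075, 0.138, 0.190])

# second-order polynomial fit of dose as a function of optical density,
# so a film reading can be converted directly to dose
coeffs = np.polyfit(net_od, kerma, deg=2)
dose_from_od = np.poly1d(coeffs)

print(f"film reading OD=0.100 -> approx {dose_from_od(0.100):.1f} mGy")
```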

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. Case studies to test: A framework for using structural, reactivity, metabolic and physicochemical similarity to evaluate the suitability of analogs for SAR-based toxicological assessments.

    Science.gov (United States)

    Blackburn, Karen; Bjerke, Donald; Daston, George; Felter, Susan; Mahony, Catherine; Naciff, Jorge; Robison, Steven; Wu, Shengde

    2011-06-01

    A process for evaluating analogs for use in SAR (Structure-Activity Relationship) assessments was previously published (Wu et al. 2010). Subsequently, this process has been updated to include a decision tree for estrogen binding (from US EPA) and flags for developmental and reproductive toxicity (DART). This paper presents the results of blinded case studies designed to test this updated framework. The results of these case studies support the conclusion that the process outlined by Wu et al. (2010) can be successfully applied to develop surrogate values for risk assessment. The read-across results generated by the process were shown to be protective when compared to the actual toxicity data. Successful application of the approach requires significant expertise as well as discipline to not overstep the boundaries of the defined analogs and the rating system. The end result of this rigor can be the inability to read across all endpoints for all chemicals, resulting in data gaps that cannot be filled using read-across; however, this reflects the current state of the science and is preferable to making non-protective decisions. Future work will be targeted towards expanding read-across capabilities. Two examples of a broader category approach are also shown. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Play for Performance: Using Computer Games to Improve Motivation and Test-Taking Performance

    Science.gov (United States)

    Dennis, Alan R.; Bhagwatwar, Akshay; Minas, Randall K.

    2013-01-01

    The importance of testing, especially certification and high-stakes testing, has increased substantially over the past decade. Building on the "serious gaming" literature and the psychology "priming" literature, we developed a computer game designed to improve test-taking performance using psychological priming. The game primed…

  1. A Review of Models for Computer-Based Testing. Research Report 2011-12

    Science.gov (United States)

    Luecht, Richard M.; Sireci, Stephen G.

    2011-01-01

    Over the past four decades, there has been incremental growth in computer-based testing (CBT) as a viable alternative to paper-and-pencil testing. However, the transition to CBT is neither easy nor inexpensive. As Drasgow, Luecht, and Bennett (2006) noted, many design engineering, test development, operations/logistics, and psychometric changes…

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. Computer control system for 60Co industrial DR nondestructive testing system

    CERN Document Server

    Chen Hai Jun

    2002-01-01

    The author presents the application of a 60Co industrial DR nondestructive testing system, including the control of the step motor, electrical protection, and the computer monitoring program. The computer control system has good performance, high reliability and low cost.

  4. Addressing unmet need for HIV testing in emergency care settings: a role for computer-facilitated rapid HIV testing?

    Science.gov (United States)

    Kurth, Ann E; Severynen, Anneleen; Spielberg, Freya

    2013-08-01

    HIV testing in emergency departments (EDs) remains underutilized. The authors evaluated a computer tool to facilitate rapid HIV testing in an urban ED. Nonacute adult ED patients were randomly assigned to a computer tool (CARE) and rapid HIV testing before a standard visit (n = 258) or to a standard visit (n = 259) with chart access. The authors assessed intervention acceptability and compared noted HIV risks. Participants were 56% non-White and 58% male; median age was 37 years. In the CARE arm, nearly all (251/258) of the patients completed the session and received HIV results; four declined to consent to the test. HIV risks were reported by 54% of users; one participant was confirmed HIV-positive, and two were confirmed false-positive (seroprevalence 0.4%, 95% CI [0.01, 2.2]). Half (55%) of the patients preferred computerized rather than face-to-face counseling for future HIV testing. In the standard arm, one HIV test and two referrals for testing occurred. Computer-facilitated HIV testing appears acceptable to ED patients. Future research should assess cost-effectiveness compared with staff-delivered approaches.
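
    The reported seroprevalence interval can be checked with an exact (Clopper-Pearson) binomial confidence interval; a minimal sketch, assuming the one confirmed positive among the 251 CARE-arm participants who received results:

```python
from scipy.stats import beta

def clopper_pearson(x, n, alpha=0.05):
    """Exact (Clopper-Pearson) binomial confidence interval for x/n."""
    lo = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lo, hi

# 1 confirmed HIV-positive result among 251 tested participants
lo, hi = clopper_pearson(1, 251)
print(f"seroprevalence {1/251:.2%}, 95% CI [{lo:.2%}, {hi:.2%}]")
# matches the reported 0.4% with CI of roughly 0.01% to 2.2%
```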

  5. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  6. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  7. Toxicology and Epidemiology: Improving the Science with a Framework for Combining Toxicological and Epidemiological Evidence to Establish Causal Inference

    Science.gov (United States)

    Adami, Hans-Olov; Berry, Sir Colin L.; Breckenridge, Charles B.; Smith, Lewis L.; Swenberg, James A.; Trichopoulos, Dimitrios; Weiss, Noel S.; Pastoor, Timothy P.

    2011-01-01

    Historically, toxicology has played a significant role in verifying conclusions drawn on the basis of epidemiological findings. Agents that were suggested to have a role in human diseases have been tested in animals to firmly establish a causative link. Bacterial pathogens are perhaps the oldest examples, and tobacco smoke and lung cancer and asbestos and mesothelioma provide two more recent examples. With the advent of toxicity testing guidelines and protocols, toxicology took on a role that was intended to anticipate or predict potential adverse effects in humans, and epidemiology, in many cases, served a role in verifying or negating these toxicological predictions. The coupled role of epidemiology and toxicology in discerning human health effects by environmental agents is obvious, but there is currently no systematic and transparent way to bring the data and analysis of the two disciplines together in a way that provides a unified view on an adverse causal relationship between an agent and a disease. In working to advance the interaction between the fields of toxicology and epidemiology, we propose here a five-step “Epid-Tox” process that would focus on: (1) collection of all relevant studies, (2) assessment of their quality, (3) evaluation of the weight of evidence, (4) assignment of a scalable conclusion, and (5) placement on a causal relationship grid. The causal relationship grid provides a clear view of how epidemiological and toxicological data intersect, permits straightforward conclusions with regard to a causal relationship between agent and effect, and can show how additional data can influence conclusions of causality. PMID:21561883

  8. Toxicological effects of Kuwaiti oil fires

    Energy Technology Data Exchange (ETDEWEB)

    Engi, D.; Boozer, D.D.; Church, H.W.; Einfeld, W.; Gotway, C.A.; Spencer, F.W.; Zak, B.D. [Sandia National Labs., Albuquerque, NM (United States); Moore, P.W. [Tech. Reps., Inc., Albuquerque, NM (United States)

    1992-06-01

    The possibility of long-term smoke emissions (from 1 to 3 years) from burning Kuwaiti oil wells has increased concerns regarding personnel exposure and acute and chronic health effects. This document, which is the result of work done in the spring of 1991, addresses those concerns. Part 1 of this document describes follow-on efforts to the pre-war modeling studies of the toxicological hazards to exposed Kuwaiti populations. Part 2 describes a pollutant monitoring program that could be carried out in the summer of 1991 to measure real-time exposure levels and to obtain more detailed information about the pollutant source terms and meteorological conditions that are necessary inputs to model computations.

  9. Y2K issues for real time computer systems for fast breeder test reactor

    International Nuclear Information System (INIS)

    Swaminathan, P.

    1999-01-01

    The presentation shows the classification of real-time systems related to the operation, control and monitoring of the fast breeder test reactor. The software life cycle includes software requirement specification, software design description, coding, commissioning, operation and management. A software scheme in the supervisory computer of the fast breeder test reactor is described, drawing on twenty years of experience in the design, development, installation, commissioning, operation and maintenance of computer-based supervision and control systems for nuclear installations, with a particular emphasis on solving the Y2K problem.

  10. A comprehensive toxicological evaluation of three adhesives using experimental cigarettes.

    Science.gov (United States)

    Coggins, Christopher R E; Jerome, Ann M; Lilly, Patrick D; McKinney, Willie J; Oldham, Michael J

    2013-01-01

    Adhesives are used in several different manufacturing operations in the production of cigarettes. The use of new, "high-speed-manufacture" adhesives (e.g. vinyl acetate based) could affect the smoke chemistry and toxicology of cigarettes, compared with older "low-speed-manufacture" adhesives (e.g. starch based). This study was conducted to determine whether the inclusion of different levels of three adhesives (ethylene vinyl acetate, polyvinyl acetate and starch) in experimental cigarettes results in different smoke chemistry and toxicological responses in in vitro and in vivo assays. A battery of tests (analytical chemistry, in vitro and in vivo assays) was used to compare the chemistry and toxicology of smoke from experimental cigarettes made with different combinations of the three adhesives. Varying levels of the different side-seam adhesives, as well as the transfer of adhesives from packaging materials, were tested. There were differences in some mainstream cigarette smoke constituents as a function of the level of adhesive added to experimental cigarettes and between the tested adhesives. None of these differences translated into statistically significant differences in the in vitro or in vivo assays. The use of newer "high-speed-manufacture" vinyl acetate-based adhesives in cigarettes does not produce toxicological profiles that prevent the adhesives from replacing the older "low-speed-manufacture" adhesives (such as starch).

  11. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  12. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test the correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
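
    A minimal sketch of the quantitative comparison described above: a relative root-mean-square metric between code output and a reference solution. The head profile and noise level are invented placeholders, not actual FLASH results.

```python
import numpy as np

def relative_rms(numerical, reference):
    """Relative root-mean-square difference used to quantify agreement
    between code output and an analytical or experimental reference."""
    numerical, reference = np.asarray(numerical), np.asarray(reference)
    return float(np.sqrt(np.mean((numerical - reference) ** 2))
                 / np.sqrt(np.mean(reference ** 2)))

# hypothetical pressure heads (m) from a 1-D saturated-flow verification case
analytical = np.linspace(10.0, 2.0, 9)             # exact linear head profile
code_output = analytical + np.random.default_rng(0).normal(0.0, 0.02, 9)
print(f"relative RMS = {relative_rms(code_output, analytical):.4f}")
```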

  13. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  14. 77 FR 35395 - Draft Five-Year Plan (2013-2017) for the National Toxicology Program Interagency Center for the...

    Science.gov (United States)

    2012-06-13

    ... (ICCVAM) has developed a draft NICEATM-ICCVAM Five-Year Plan. The plan describes four core strategies to... innovations are driving transformative changes in toxicology and how safety testing is performed. The field of toxicology is evolving from a system based largely on animal testing toward one based on the integration of...

  15. The Chemistry and Toxicology of Depleted Uranium

    Directory of Open Access Journals (Sweden)

    Sidney A. Katz

    2014-03-01

    Natural uranium comprises three radioactive isotopes: 238U, 235U, and 234U. Depleted uranium (DU) is a byproduct of the processes for the enrichment of the naturally occurring 235U isotope. The worldwide stockpile contains some 1½ million tons of depleted uranium. Some of it has been used to dilute weapons-grade uranium (~90% 235U) down to reactor-grade uranium (~5% 235U), and some of it has been used for heavy tank armor and for the fabrication of armor-piercing bullets and missiles. Such weapons were used by the military in the Persian Gulf, the Balkans and elsewhere. The testing of depleted uranium weapons and their use in combat have resulted in environmental contamination and human exposure. Although the chemical and toxicological behaviors of depleted uranium are essentially the same as those of natural uranium, the respective chemical forms and isotopic compositions in which they usually occur are different. The chemical and radiological toxicity of depleted uranium can injure biological systems. Normal functioning of the kidney, liver, lung, and heart can be adversely affected by depleted uranium intoxication. The focus of this review is on the chemical and toxicological properties of depleted and natural uranium and some of the possible consequences of long-term, low-dose exposure to depleted uranium in the environment.

  16. In silico toxicology: comprehensive benchmarking of multi-label classification methods applied to chemical toxicity data

    KAUST Repository

    Raies, Arwa B.

    2017-12-05

    One goal of toxicity testing, among others, is identifying harmful effects of chemicals. Given the high demand for toxicity tests, it is necessary to conduct these tests for multiple toxicity endpoints for the same compound. Current computational toxicology methods aim at developing models mainly to predict a single toxicity endpoint. When chemicals cause several toxicity effects, one model is generated to predict toxicity for each endpoint, which can be labor and computationally intensive when the number of toxicity endpoints is large. Additionally, this approach does not take into consideration possible correlation between the endpoints. Therefore, there has been a recent shift in computational toxicity studies toward generating predictive models able to predict several toxicity endpoints by utilizing correlations between these endpoints. Applying such correlations jointly with compounds' features may improve model's performance and reduce the number of required models. This can be achieved through multi-label classification methods. These methods have not undergone comprehensive benchmarking in the domain of predictive toxicology. Therefore, we performed extensive benchmarking and analysis of over 19,000 multi-label classification models generated using combinations of the state-of-the-art methods. The methods have been evaluated from different perspectives using various metrics to assess their effectiveness. We were able to illustrate variability in the performance of the methods under several conditions. This review will help researchers to select the most suitable method for the problem at hand and provide a baseline for evaluating new approaches. Based on this analysis, we provided recommendations for potential future directions in this area.
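
    A small sketch of the two families of approaches contrasted above, using scikit-learn on synthetic data: binary relevance fits one independent model per endpoint, while a classifier chain lets each endpoint model also see earlier endpoint predictions, exploiting correlations between endpoints. The dataset sizes and base learner are arbitrary choices, not the benchmark's configuration.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain, MultiOutputClassifier

# stand-in for compound features X and multiple toxicity endpoints Y
X, Y = make_multilabel_classification(n_samples=600, n_features=40,
                                      n_classes=5, random_state=0)
Xtr, Xte, Ytr, Yte = train_test_split(X, Y, random_state=0)

base = LogisticRegression(max_iter=1000)
# binary relevance: one independent model per toxicity endpoint
br = MultiOutputClassifier(base).fit(Xtr, Ytr)
# classifier chain: each endpoint model conditions on earlier predictions
cc = ClassifierChain(base, random_state=0).fit(Xtr, Ytr)

for name, model in [("binary relevance", br), ("classifier chain", cc)]:
    f1 = f1_score(Yte, model.predict(Xte), average="micro")
    print(f"{name}: micro-F1 = {f1:.3f}")
```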

  18. An Integrated Chemical Environment to Support 21st-Century Toxicology.

    Science.gov (United States)

    Bell, Shannon M; Phillips, Jason; Sedykh, Alexander; Tandon, Arpit; Sprankle, Catherine; Morefield, Stephen Q; Shapiro, Andy; Allen, David; Shah, Ruchir; Maull, Elizabeth A; Casey, Warren M; Kleinstreuer, Nicole C

    2017-05-25

    Access to high-quality reference data is essential for the development, validation, and implementation of in vitro and in silico approaches that reduce and replace the use of animals in toxicity testing. Currently, these data must often be pooled from a variety of disparate sources to efficiently link a set of assay responses and model predictions to an outcome or hazard classification. To provide a central access point for these purposes, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods developed the Integrated Chemical Environment (ICE) web resource. The ICE data integrator allows users to retrieve and combine data sets and to develop hypotheses through data exploration. Open-source computational workflows and models will be available for download and application to local data. ICE currently includes curated in vivo test data, reference chemical information, in vitro assay data (including Tox21™/ToxCast™ high-throughput screening data), and in silico model predictions. Users can query these data collections focusing on end points of interest such as acute systemic toxicity, endocrine disruption, skin sensitization, and many others. ICE is publicly accessible at https://ice.ntp.niehs.nih.gov. https://doi.org/10.1289/EHP1759.

  19. COMPUTER SIMULATION OF THE THERMAL TESTING PROCESS FOR STUDENTS OF «NONDESTRUCTIVE TESTING AND TECHNICAL DIAGNOSTICS» SPECIALITY

    Directory of Open Access Journals (Sweden)

    Anatolii H. Protasov

    2010-08-01

    This paper is devoted to a computer simulation method for the thermal nondestructive testing procedure. FEMLAB, an interactive software package, is used for the simulation. It allows forming a model of physical objects with given parameters and properties. The proposed method helps students better understand the processes that occur in solids under the action of temperature.
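
    A minimal stand-in for the kind of model such an exercise builds in FEMLAB: an explicit finite-difference solution of 1-D transient heat conduction in a slab. The material properties and boundary conditions are invented for illustration.

```python
import numpy as np

# explicit finite-difference solution of 1-D heat conduction in a solid
L, nx, alpha = 0.01, 51, 1e-5     # slab thickness (m), nodes, diffusivity (m^2/s)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha          # stable time step (r = 0.4 < 0.5)

T = np.full(nx, 20.0)             # initial temperature (deg C)
T[0] = 100.0                      # heated front surface (Dirichlet condition)

for _ in range(2000):             # march in time
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] = T[-2]                 # insulated back surface (zero flux)

print(f"back-surface temperature after {2000 * dt:.1f} s: {T[-1]:.1f} C")
```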

  20. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    Science.gov (United States)

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

    Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test, a score test, with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test (up to 23 more associations), whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but we also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13,500. Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. heckerma@microsoft.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
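
    A minimal sketch of a set-based variance-component score statistic with a linear kernel, with significance assessed by permutation rather than by the score/LR machinery the paper develops; all sizes and effect sizes below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def score_statistic(y, G):
    """Variance-component score statistic Q = r' K r with a linear
    kernel K = G G' over the variant set (SKAT-like form)."""
    r = y - y.mean()   # residuals under the null (no covariates here)
    v = G.T @ r        # avoids forming the n x n kernel explicitly
    return float(v @ v)

def permutation_pvalue(y, G, n_perm=2000):
    q_obs = score_statistic(y, G)
    q_null = [score_statistic(rng.permutation(y), G) for _ in range(n_perm)]
    return (1 + sum(q >= q_obs for q in q_null)) / (1 + n_perm)

n, m = 500, 20                         # individuals, variants in the set
G = rng.binomial(2, 0.2, size=(n, m)).astype(float)   # genotype matrix
beta = np.zeros(m)
beta[:3] = 0.25                        # three weakly associated variants
y = G @ beta + rng.normal(size=n)      # quantitative phenotype
print(f"set-based p-value: {permutation_pvalue(y, G):.4f}")
```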

  1. Safety and Toxicology of Cannabinoids

    OpenAIRE

    Sachs, Jane; McGlade, Erin; Yurgelun-Todd, Deborah

    2015-01-01

    There is extensive research on the safety, toxicology, potency, and therapeutic potential of cannabis. However, uncertainty remains, facilitating continued debate on medical and recreational cannabis policies at the state and federal levels. This review will include a brief description of cannabinoids and the endocannabinoid system; a summary of the acute and long-term effects of cannabis; and a discussion of the therapeutic potential of cannabis. The conclusions about safety and efficacy will...

  2. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. Setting: a university center for memory disorders. Participants: 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, predictive values, positive and negative, were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
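
    The quoted predictive values follow from Bayes' rule given sensitivity, specificity, and the assumed prevalence; a small check, which reproduces the reported figures up to rounding:

```python
def predictive_values(sens, spec, prev):
    """Bayes' rule for screening: positive and negative predictive values."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# figures reported in the abstract: sensitivity 0.83, specificity 0.96,
# assumed prevalence of cognitive impairment 10%
ppv, npv = predictive_values(0.83, 0.96, 0.10)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # close to the reported 0.70 and 0.96
```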

  3. Multiple-Choice versus Constructed-Response Tests in the Assessment of Mathematics Computation Skills.

    Science.gov (United States)

    Gadalla, Tahany M.

    The equivalence of multiple-choice (MC) and constructed response (discrete) (CR-D) response formats as applied to mathematics computation at grade levels two to six was tested. The difference between total scores from the two response formats was tested for statistical significance, and the factor structure of items in both response formats was…

  4. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests that compare field data to computer-generated solutions) were used to determine the operational status of the FLAME computer code; they were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  5. Relative User Ratings of MMPI-2 Computer-Based Test Interpretations

    Science.gov (United States)

    Williams, John E.; Weed, Nathan C.

    2004-01-01

    There are eight commercially available computer-based test interpretations (CBTIs) for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), of which few have been empirically evaluated. Prospective users of these programs have little scientific data to guide choice of a program. This study compared ratings of these eight CBTIs. Test users…

  6. Testing strategies in mutagenicity and genetic toxicology: an appraisal of the guidelines of the European Scientific Committee for Cosmetics and Non-Food Products for the evaluation of hair dyes.

    Science.gov (United States)

    Kirkland, D J; Henderson, L; Marzin, D; Müller, L; Parry, J M; Speit, G; Tweats, D J; Williams, G M

    2005-12-30

    reaction between the chemicals present in hair-dye formulations. Ideally, these should also be tested for genotoxicity, but at present such experiences are very limited. There is also the possibility that one component could mask the genotoxicity of another (e.g. by being more toxic), and so it is not practical at this time to recommend routine testing of complete hair-dye formulations as well. The most sensible approach would be to establish whether any reaction products within the hair-dye formulation penetrate the skin under normal conditions of use and test only those that penetrate at toxicologically relevant levels in the three-test in vitro battery. Recently published data [D. Kirkland, M. Aardema, L. Henderson, L. Müller, Evaluation of the ability of a battery of three in vitro genotoxicity tests to discriminate rodent carcinogens and non-carcinogens. I. Sensitivity, specificity and relative predictivity, Mutat. Res. 584 (2005) 1-256] suggest the three-test battery will produce a significant number of false as well as real positives. Whilst we are aware of the desire to reduce animal experiments, determining the relevance of positive results in any of the three recommended in vitro assays will most likely have to be determined by use of in vivo assays. The bone marrow micronucleus test using routes of administration such as oral or intraperitoneal may be used where the objective is extended hazard identification. If negative results are obtained in this test, then a second in vivo test should be conducted. This could be an in vivo UDS in rat liver or a Comet assay in a relevant tissue. However, for hazard characterisation, tests using topical application with measurement of genotoxicity in the skin would be more appropriate. Such specific site-of-contact in vivo tests would minimise animal toxicity burden and invasiveness, and, especially for hair dyes, be more relevant to human routes of exposure, but there are not sufficient scientific data available to allow

  7. Resource Guide to Careers in Toxicology, 3rd Edition.

    Science.gov (United States)

    Society of Toxicology, Reston, VA.

    This resource guide was prepared by the Tox 90's Educational Issues Task Force of the Society of Toxicology. The introduction provides information on the Society of Toxicology and financial support for graduate students in toxicology. Other sections include career opportunities in toxicology, academic and postdoctoral programs in toxicology, and…

  8. Research on computer aided testing of pilot response to critical in-flight events

    Science.gov (United States)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Tests are reported on the development of: (1) a frame-system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.

  9. Seismic proving test of process computer systems with a seismic floor isolation system

    International Nuclear Information System (INIS)

    Fujimoto, S.; Niwa, H.; Kondo, H.

    1995-01-01

    The authors have carried out seismic proving tests for process computer systems as a Nuclear Power Engineering Corporation (NUPEC) project sponsored by the Ministry of International Trade and Industry (MITI). This paper presents the seismic test results for evaluating the functional capabilities of process computer systems with a seismic floor isolation system. The seismic floor isolation system, used to isolate horizontal motion, was composed of a floor frame (13 m x 13 m), ball bearing units, and spring-damper units. A series of seismic excitation tests was carried out using a large-scale shaking table of NUPEC. From the test results, the functional capabilities of computer systems with a seismic floor isolation system during large earthquakes were verified
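
    A minimal single-degree-of-freedom sketch of why floor isolation works: the spring-damper units give the floor a natural frequency well below the excitation frequency, so the absolute floor acceleration is much smaller than the ground input. All parameter values are invented, not those of the test rig.

```python
import numpy as np

# single-degree-of-freedom model of the isolated floor frame:
# ball bearings decouple horizontal motion, spring-damper units
# provide restoring force and energy dissipation
m, k, c = 2.0e4, 7.9e4, 5.0e3         # mass (kg), stiffness (N/m), damping (N*s/m)
dt, steps = 0.005, 4000
x = v = 0.0                           # displacement/velocity relative to ground

t = np.arange(steps) * dt
ag = 2.0 * np.sin(2 * np.pi * 2.0 * t)   # 2 Hz, 2 m/s^2 ground acceleration

peak = 0.0
for a in ag:                          # semi-implicit Euler time stepping
    acc = -(c * v + k * x) / m - a    # relative equation of motion
    v += acc * dt
    x += v * dt
    peak = max(peak, abs(acc + a))    # absolute floor acceleration
print(f"peak absolute floor acceleration: {peak:.2f} m/s^2 (vs 2.00 input)")
```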

  10. Computer control and data acquisition system for the R.F. Test Facility

    International Nuclear Information System (INIS)

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper

  11. Visualization of flaws within heavy section ultrasonic test blocks using high energy computed tomography

    International Nuclear Information System (INIS)

    House, M.B.; Ross, D.M.; Janucik, F.X.; Friedman, W.D.; Yancey, R.N.

    1996-05-01

    The feasibility of using high-energy computed tomography (9 MeV) to detect volumetric and planar discontinuities in large pressure vessel mock-up blocks was studied. The data supplied by the manufacturer of the test blocks on the intended flaw geometry were compared to manual contact ultrasonic test and computed tomography test data. Subsequently, a visualization program was used to construct fully three-dimensional morphological information, enabling interactive data analysis of the detected flaws. Density isosurfaces show the relative shape and location of the volumetric defects within the mock-up blocks. Such a technique may be used to qualify personnel or newly developed ultrasonic test methods without the associated high cost of destructive evaluation. Data are presented showing the capability of the volumetric data analysis program to overlay the computed tomography and destructive evaluation (serial metallography) data for a direct, three-dimensional comparison

  12. Predictive Systems Toxicology

    KAUST Repository

    Kiani, Narsis A.; Shang, Ming-Mei; Zenil, Hector; Tegner, Jesper

    2018-01-01

    In this review we address to what extent computational techniques can augment our ability to predict toxicity. The first section provides a brief history of empirical observations on toxicity dating back to the dawn of Sumerian civilization. Interestingly, the concept of dose emerged very early on, leading up to the modern emphasis on kinetic properties, which in turn encodes the insight that toxicity is not solely a property of a compound but instead depends on the interaction with the host organism. The next logical step is the current conception of evaluating drugs from a personalized medicine point-of-view. We review recent work on integrating what could be referred to as classical pharmacokinetic analysis with emerging systems biology approaches incorporating multiple omics data. These systems approaches employ advanced statistical analytical data processing complemented with machine learning techniques and use both pharmacokinetic and omics data. We find that such integrated approaches not only provide improved predictions of toxicity but also enable mechanistic interpretations of the molecular mechanisms underpinning toxicity and drug resistance. We conclude the chapter by discussing some of the main challenges, such as how to balance the inherent tension between the predictive capacity of models, which in practice amounts to constraining the number of features in the models versus allowing for rich mechanistic interpretability, i.e. equipping models with numerous molecular features. This challenge also requires patient-specific predictions on toxicity, which in turn requires proper stratification of patients as regards how they respond, with or without adverse toxic effects. In summary, the transformation of the ancient concept of dose is currently successfully operationalized using rich integrative data encoded in patient-specific models.

  14. Postmortem aviation forensic toxicology: an overview.

    Science.gov (United States)

    Chaturvedi, Arvind K

    2010-05-01

    An overview of aviation combustion toxicology, a subtopic of the field of aerospace toxicology, has been published previously. Continuing that work, the present overview summarizes the findings associated with postmortem aviation forensic toxicology. A literature search for the period 1960-2007 was performed, and the important findings related to postmortem toxicology were evaluated. In addition to a brief introduction, this overview is divided into sections on analytical methods; carboxyhemoglobin and blood cyanide ion; ethanol; drugs; result interpretation; glucose and hemoglobin A(1c); and references. Specific details of the subject matter are discussed. It is anticipated that this overview will serve as an outline source for aviation forensic toxicology within the field of aerospace toxicology.

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  16. Statistical test data selection for reliability evaluation of process computer software

    International Nuclear Information System (INIS)

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases for testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches to test data selection, based on knowledge about cases of demand, are outlined: a purely probabilistic method and one based on the mathematics of stratified sampling. (orig.)
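
    As a rough illustration of the stratified approach, the sketch below (in Python; the strata, demand frequencies, and bounds are our own hypothetical values, not from the paper) allocates test cases to strata of process states in proportion to demand frequency and samples input vectors within each stratum:

        # Hypothetical sketch of stratified test-case selection. Process states
        # are vectors of input variables; strata and frequencies are invented.
        import random

        # stratum label -> (relative demand frequency, input-value bounds)
        strata = {
            "normal_operation": (0.90, (0.0, 0.7)),
            "disturbance":      (0.09, (0.7, 0.9)),
            "emergency_demand": (0.01, (0.9, 1.0)),
        }

        def draw_test_cases(n_total, n_inputs=3, seed=42):
            """Allocate cases to strata proportionally, then sample input
            vectors uniformly within each stratum's bounds."""
            rng = random.Random(seed)
            cases = []
            for label, (freq, (lo, hi)) in strata.items():
                for _ in range(round(n_total * freq)):
                    cases.append((label, [rng.uniform(lo, hi) for _ in range(n_inputs)]))
            return cases

        print(draw_test_cases(100)[:3])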

  17. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    Science.gov (United States)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems are presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  18. History of Japanese Society of Toxicology.

    Science.gov (United States)

    Satoh, Tetsuo

    2016-01-01

    Founded in 1981, the Japanese Society of Toxicology (JSOT) has grown into an organization of nearly 3,000 members working together to advance the nation's scientific knowledge and understanding of toxicology through planning that ensures a systematic and efficient expenditure of energies and resources, closely aligned with a strategy for accomplishing the Society's long-range plans. To promote public education in toxicology, the Society organizes public lectures during each year's annual meeting. Other activities include hosting scientific conferences, promoting continuing education, and facilitating international collaboration. Internally, the JSOT operates five standing committees: General Affairs, Educational, Editorial, Finance, and Science and Publicity. To bestow official recognition, the Society established its Toxicologist Certification Program in 1997 and had certified 536 members as Diplomat Toxicologists (DJSOT) as of May 1, 2016; on the same date, 43 JSOT members were certified as Emeritus Diplomats of the JSOT (EDJSOT). The Society has launched two official journals, the "Journal of Toxicological Sciences (JTS)" in 1981 and "Fundamental Toxicological Sciences (Fundam. Toxicol. Sci.)" in 2014. As for participation in international organizations, the JSOT (then known as the Toxicological Research Group) joined the International Union of Toxicology as a charter member in 1980 and became a founding member of the Asian Society of Toxicology at its inauguration in 1994. Into the future, the JSOT will continue working diligently to advance knowledge and understanding of toxicology and to secure its place among the interdisciplinary fields of science, humane studies, and ethics.

  19. Speed test results and hardware/software study of computational speed problem, appendix D

    Science.gov (United States)

    1984-01-01

    The HP9845C is a desktop computer which was tested and evaluated for processing speed. A study was made to determine the availability and approximate cost of computers and/or hardware accessories necessary to meet the 20 ms sample period speed requirement. Additional requirements were that the control algorithm could be programmed in a high-level language and that the machine have sufficient storage to store the data from a complete experiment.

  20. 78 FR 45253 - National Toxicology Program Scientific Advisory Committee on Alternative Toxicological Methods...

    Science.gov (United States)

    2013-07-26

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Toxicology Program... Alternative Methods (ICCVAM), the National Toxicology Program (NTP) Interagency Center for the Evaluation of... Director, National Toxicology Program. [FR Doc. 2013-17919 Filed 7-25-13; 8:45 am] BILLING CODE 4140-01-P ...

  1. Breath biomarkers in toxicology.

    Science.gov (United States)

    Pleil, Joachim D

    2016-11-01

    Exhaled breath has joined blood and urine as a valuable resource for sampling and analyzing biomarkers in human media for assessing exposure, uptake, metabolism, and elimination of toxic chemicals. This article focuses on the current use of exhaled gas, aerosols, and vapor in human breath, the methods for collection, and ultimately the use of the resulting data. Some advantages of breath are the noninvasive and self-administered nature of collection, the essentially inexhaustible supply, and the fact that breath sampling does not produce potentially infectious waste such as needles, wipes, bandages, and glassware. In contrast to blood and urine, breath samples can be collected on demand in rapid succession and so allow toxicokinetic observations of uptake and elimination in any time frame. Furthermore, new technologies now allow capturing condensed breath vapor directly, or just the aerosol fraction alone, to gain access to inorganic species, lung pH, proteins and protein fragments, cellular DNA, and whole microorganisms from the pulmonary microbiome. Future applications are discussed, especially the use of isotopically labeled probes, non-targeted (discovery) analysis, cellular-level toxicity testing, and ultimately assessing the "crowd breath" of groups of people and its relation to dose of airborne and other environmental chemicals at the population level.
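
    Because breath can be sampled in rapid succession, elimination kinetics can be estimated directly from a concentration time series. A minimal sketch (a one-compartment first-order model; the data points and units are invented for illustration):

        # Fit C(t) = C0 * exp(-k t) to a rapid series of breath measurements.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([0, 5, 10, 20, 40, 60.0])         # minutes post-exposure
        c = np.array([12.0, 9.1, 7.0, 4.2, 1.5, 0.6])  # breath level (ppb)

        def one_compartment(t, c0, k):
            return c0 * np.exp(-k * t)

        (c0, k), _ = curve_fit(one_compartment, t, c, p0=(10.0, 0.05))
        print(f"C0 = {c0:.1f} ppb, k = {k:.3f}/min, "
              f"half-life = {np.log(2) / k:.1f} min")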

  2. Recent developments in preclinical toxicological pathology

    International Nuclear Information System (INIS)

    Finch, John M.

    2005-01-01

    In the late nineteenth century, microscopists developed a quaint method for examining the fine structure of biological specimens: paraffin embedding and staining with hematoxylin and eosin. This ancient technology is here to stay for the foreseeable future, because it can and does reveal the truth about biological processes. However, the role of pathology is developing with ever greater worldwide interaction between pathologists, and better communication and agreement on international standards. Furthermore, recent techniques including immunohistochemistry, electron microscopy and image analysis complement the traditional tried and tested tools. There is also in toxicologic pathology a willingness to use pathology methods and skills in new contexts, drug discovery in particular. But even in these days of genetic modification, proteomics and high-throughput screening, pathologists continue to rely on dyes extracted from a Central American logwood used in Mexico before the Spanish invasion in 1520

  3. Aerodynamic research of a racing car based on wind tunnel test and computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    Wang Jianfeng

    2018-01-01

    Wind tunnel testing and computational fluid dynamics (CFD) simulation are the two main methods for studying automotive aerodynamics. CFD software computes its results from the basic theory of aerodynamics, and the calculation inevitably introduces bias; the wind tunnel test, which effectively simulates real driving conditions, remains the most reliable aerodynamics research method. This paper investigates the aerodynamic characteristics of the wing of a racing car. An aerodynamic model of the car is established, and a wind tunnel test is carried out and compared with the CFD simulation results. The deviation between the two methods is small, verifying the accuracy of the CFD simulation. By means of CFD simulation, the coefficients of the six aerodynamic force components are fitted and the aerodynamic equations obtained. Finally, the aerodynamic forces and torques on the racing car traveling through a bend are calculated.

  4. Toxicological relationships between proteins obtained from protein target predictions of large toxicity databases

    International Nuclear Information System (INIS)

    Nigsch, Florian; Mitchell, John B.O.

    2008-01-01

    The combination of models for protein target prediction with large databases containing toxicological information for individual molecules allows the derivation of 'toxicological' profiles, i.e., the extent to which molecules of known toxicity are predicted to interact with a set of protein targets. To predict protein targets of drug-like and toxic molecules, we built a computational multiclass model using the Winnow algorithm based on a dataset of protein targets derived from the MDL Drug Data Report. A 15-fold Monte Carlo cross-validation, using 50% of each class for training and the remaining 50% for testing, provided an assessment of the accuracy of that model. We retained the 3 top-ranking predictions and found that in 82% of all cases the correct target was predicted within these three predictions. The first prediction was the correct one in almost 70% of cases. A model built on the whole protein target dataset was then used to predict the protein targets for 150 000 molecules from the MDL Toxicity Database. We analysed the frequency of the predictions across the panel of protein targets for experimentally determined toxicity classes of all molecules. This allowed us to identify clusters of proteins related by their toxicological profiles, as well as toxicities that are related. Literature-based evidence is provided for some specific clusters to show the relevance of the relationships identified
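
    For readers unfamiliar with the learning method, the sketch below shows the binary Winnow update rule that the paper's multiclass model generalizes: multiplicative promotion/demotion of weights over sparse binary features (e.g., molecular fingerprints). The toy data is ours, not from the study:

        import numpy as np

        def winnow_train(X, y, alpha=2.0, epochs=5):
            """X: 0/1 feature matrix, y: 0/1 labels; threshold = n_features."""
            n, d = X.shape
            w, theta = np.ones(d), float(d)
            for _ in range(epochs):
                for xi, yi in zip(X, y):
                    pred = 1 if w @ xi >= theta else 0
                    if pred != yi:               # multiplicative update
                        w *= alpha ** ((yi - pred) * xi)
            return w, theta

        X = np.array([[1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 1, 0, 1]])
        y = np.array([1, 1, 0, 0])
        w, theta = winnow_train(X, y)
        print(w, (X @ w >= theta).astype(int))   # learned weights, predictions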

  5. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    The article is devoted to the statistical analysis of results of computer-based testing for the evaluation of educational achievements of students. The issue is relevant because computer-based testing in Russian universities has become an important method for evaluating students' educational achievements and the quality of the modern educational process. Usage of modern methods and programs for the statistical analysis of computer-based testing results and assessment of the quality of developed tests is an actual problem for every university teacher. The article shows how the authors solve this problem using their own program “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, formation of queries, and generation of reports, lists, and matrices of answers for statistical analysis of the quality of test items. The methodology, experience, and some results of its usage by university teachers are described in the article. Related topics of test development, models, algorithms, technologies, and software for large-scale computer-based testing have been discussed by the authors in their previous publications, which are presented in the reference list.
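
    The “StatInfo” program itself is not documented here, but the kind of item-quality statistics computed from a matrix of answers can be illustrated with two classical measures: item difficulty (proportion correct) and point-biserial discrimination. A minimal sketch with toy data:

        import numpy as np

        answers = np.array([      # rows = examinees, columns = test items
            [1, 1, 0, 1],
            [1, 0, 0, 1],
            [0, 1, 1, 1],
            [1, 1, 0, 0],
            [0, 0, 0, 1],
        ])

        difficulty = answers.mean(axis=0)    # proportion answering correctly
        total = answers.sum(axis=1)          # each examinee's total score
        # point-biserial: correlation of each item with the total score
        # (total here includes the item itself, a common simplification)
        discrimination = np.array([
            np.corrcoef(answers[:, j], total)[0, 1]
            for j in range(answers.shape[1])
        ])
        print(difficulty, discrimination.round(2))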

  6. Electronic cigarettes in the USA: a summary of available toxicology data and suggestions for the future.

    Science.gov (United States)

    Orr, Michael S

    2014-05-01

    To review the available evidence evaluating the toxicological profiles of electronic cigarettes (e-cigarettes) in order to understand the potential impact of e-cigarettes on individual users and the public health. Systematic literature searches were conducted between October 2012 and October 2013 using five electronic databases. Search terms such as 'e-cigarettes' and 'electronic delivery devices' were used to identify the toxicology information for e-cigarettes. As of October 2013, the scientific literature contains very limited information regarding the toxicity of e-cigarettes commercially available in the USA. While some preliminary toxicology data suggests that e-cigarette users are exposed to lower levels of toxicants relative to cigarette smokers, the data available is extremely limited at this time. At present, there is insufficient toxicological data available to perform thorough risk assessment analyses for e-cigarettes; few toxicology studies evaluating e-cigarettes have been conducted to date, and standard toxicological testing paradigms have not been developed for comparing disparate types of tobacco products such as e-cigarettes and traditional cigarettes. Overall, the limited toxicology data on e-cigarettes in the public domain is insufficient to allow a thorough toxicological evaluation of this new type of tobacco product. In the future, the acquisition of scientific datasets that are derived from scientifically robust standard testing paradigms, include comprehensive chemical characterisation of the aerosol, provide information on users' toxicant exposure levels, and from studies replicated by independent researchers will improve the scientific community's ability to perform robust toxicological evaluations of e-cigarettes.

  7. Transmitted wavefront testing with large dynamic range based on computer-aided deflectometry

    Science.gov (United States)

    Wang, Daodang; Xu, Ping; Gong, Zhidong; Xie, Zhongmin; Liang, Rongguang; Xu, Xinke; Kong, Ming; Zhao, Jun

    2018-06-01

    Transmitted wavefront testing is in demand for the performance evaluation of transmission optics and transparent glass, and the achievable dynamic range is a key issue. A computer-aided deflectometric testing method with fringe projection is proposed for the accurate testing of transmitted wavefronts with a large dynamic range. Ray tracing of the modeled testing system is carried out to achieve virtual ‘null’ testing of transmitted wavefront aberrations. The ray aberration is obtained from the ray-tracing result and the measured slope, from which the test wavefront aberration can be reconstructed. To eliminate testing-system modeling errors, a system geometry calibration based on computer-aided reverse optimization is applied to realize accurate testing. Both numerical simulation and experiments have been carried out to demonstrate the feasibility and high accuracy of the proposed testing method. The proposed method achieves a large dynamic range compared with the interferometric method, providing a simple, low-cost and accurate way to test transmitted wavefronts from various kinds of optics and a large number of industrial transmission elements.
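
    The reconstruction step, recovering a wavefront from measured slopes, can be illustrated with a least-squares integration in one dimension (a simplified stand-in for the paper's ray-tracing-based method; the defocus-like test wavefront below is invented):

        import numpy as np

        def integrate_slopes(slopes, dx):
            """Solve D w = s in the least-squares sense, where D is a
            finite-difference operator; piston (mean) is set to zero."""
            n = len(slopes) + 1
            D = np.zeros((len(slopes), n))
            for i in range(len(slopes)):
                D[i, i], D[i, i + 1] = -1.0 / dx, 1.0 / dx
            w, *_ = np.linalg.lstsq(D, slopes, rcond=None)
            return w - w.mean()

        x = np.linspace(-1, 1, 21)
        true_w = 0.5 * x**2                    # defocus-like wavefront
        slopes = np.diff(true_w) / np.diff(x)  # simulated slope measurements
        print(np.allclose(integrate_slopes(slopes, x[1] - x[0]),
                          true_w - true_w.mean()))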

  8. Experimental and Computational Study of Ductile Fracture in Small Punch Tests

    Directory of Open Access Journals (Sweden)

    Betül Gülçimen Çakan

    2017-10-01

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  9. Experimental and Computational Study of Ductile Fracture in Small Punch Tests.

    Science.gov (United States)

    Gülçimen Çakan, Betül; Soyarslan, Celal; Bargmann, Swantje; Hähner, Peter

    2017-10-17

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  10. Safety and Toxicology of Cannabinoids.

    Science.gov (United States)

    Sachs, Jane; McGlade, Erin; Yurgelun-Todd, Deborah

    2015-10-01

    There is extensive research on the safety, toxicology, potency, and therapeutic potential of cannabis. However, uncertainty remains, fueling continued debate on medical and recreational cannabis policies at the state and federal levels. This review includes a brief description of cannabinoids and the endocannabinoid system; a summary of the acute and long-term effects of cannabis; and a discussion of the therapeutic potential of cannabis. The conclusions about safety and efficacy are then compared with the current social and political climate to suggest future policy directions and general guidelines.

  11. Relationship Between Ocular Surface Disease Index, Dry Eye Tests, and Demographic Properties in Computer Users

    Directory of Open Access Journals (Sweden)

    Hüseyin Simavlı

    2014-03-01

    Objectives: The aim of the present study is to evaluate the ocular surface disease index (OSDI) in computer users and to investigate the correlations of this index with dry eye tests and demographic properties. Materials and Methods: This prospective study included 178 subjects aged 20-40 years who spent most of their daily life in front of computers. All participants underwent a complete ophthalmologic examination including a basal secretion test, tear break-up time test, and ocular surface staining. In addition, all participants completed the OSDI questionnaire. Results: A total of 178 volunteers (101 female, 77 male) with a mean age of 28.8±4.5 years were included in the study. Mean daily computer use was 7.7±1.9 (5-14) hours, and the mean period of computer use was 71.1±39.7 (4-204) months. The mean OSDI score was 44.1±24.7 (0-100). There was a significant negative correlation between the OSDI score and the tear break-up time test in the right (p=0.005, r=-0.21) and left eyes (p=0.003, r=-0.22). There was a significant positive correlation between the OSDI score and gender (p=0.014, r=0.18) and daily computer usage time (p=0.008, r=0.2), as well as between the OSDI score and the ocular surface staining pattern in the right (p=0.03, r=0.16) and left eyes (p=0.03, r=0.17). Age, smoking, type of computer, use of glasses, presence of symptoms, and the basal secretion test were not correlated with the OSDI score. Conclusions: Long-term computer use causes ocular surface problems. In computer users, the OSDI was correlated with the tear break-up time test, gender, daily computer usage time, and the ocular surface staining pattern. (Turk J Ophthalmol 2014; 44: 115-8)
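
    The correlation statistics reported above (r and p values) are straightforward to reproduce on one's own data; a minimal sketch with synthetic numbers chosen to mimic the negative OSDI/tear break-up time relationship:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        osdi = rng.uniform(0, 100, 50)                  # questionnaire scores
        tbut = 12 - 0.05 * osdi + rng.normal(0, 2, 50)  # break-up time (s)

        r, p = stats.pearsonr(osdi, tbut)
        print(f"r = {r:.2f}, p = {p:.3g}")              # expect a negative r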

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  13. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  14. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  15. Current status and prospects of the toxicological assessment of engineered nanomaterials

    International Nuclear Information System (INIS)

    Pacchiarotti, Francesca; Grollino, Maria Giuseppa; Leter, Giorgio

    2015-01-01

    Nanotoxicology is a branch of experimental toxicology dealing with the identification and characterization of the harmful biological effects of engineered nanomaterials. The physico-chemical properties of these materials affect how they interact at the biological level. The first generation of experimental studies showed the need to adapt current toxicological evaluation methodologies and strategies to nanomaterials. Particular challenges are presented by the variety of materials to be tested, the definition of relevant dose quantities, the standardization of the preparation and characterization of nanomaterials in biological sample matrices, and the techniques for determining biodistribution in the body. 'Omics' technologies now offer an innovative tool for a toxicological approach based on understanding mechanisms of action, which will allow the most advanced laboratories to implement high-performance screening tests.

  16. Comparative analysis of quality control tests on computed tomography in accordance with national and international laws

    International Nuclear Information System (INIS)

    Ramos, Fernando S.; Vasconcelos, Rebeca S.; Goncalves, Marcel S.; Oliveira, Marcus V.L. de

    2014-01-01

    The objective of this study is to perform a comparative analysis between Brazilian legislation and international protocols with respect to quality control tests for computed tomography. We used seven references, published from 1998 to 2012: the Protocolo Brasileiro - Portaria 453/98 SVS/MS and the Guia de Radiodiagnostico Medico da ANVISA; the IAEA's Quality Assurance Programme for Computed Tomography: Diagnostic and Therapy Applications; the European protocol European Guidelines on Quality Criteria for Computed Tomography, EUR No. 16262 EN; the European Commission's Radiation Protection No. 162 - Criteria for Acceptability of Medical Radiology, Nuclear Medicine and Radiotherapy; the Protocols de Control de Calidad en Radiodiagnostico IAEA/ARCAL XLIX; and the Protocolo Espanol de Control de Calidad en Radiodiagnostico. The comparative analysis of these documents addressed the tolerances/limits, frequencies, and objectives of the recommended tests. Eighteen tests were found in the Brazilian legislation and were grouped according to their nature (dosimetric/exposure tests, geometric tests, and image quality tests). Among the evaluated protocols, divergences were identified between the tests contained in the documents and the assessment criteria considered in this work. Moreover, certain documents lack tolerances, well-defined methodologies, and even testing frequencies. We conclude that current Brazilian legislation, although it includes a large number of quality control tests, differs in certain respects from the international protocols analyzed.

  17. The Toxicology Education Summit: building the future of toxicology through education.

    Science.gov (United States)

    Barchowsky, Aaron; Buckley, Lorrene A; Carlson, Gary P; Fitsanakis, Vanessa A; Ford, Sue M; Genter, Mary Beth; Germolec, Dori R; Leavens, Teresa L; Lehman-McKeeman, Lois D; Safe, Stephen H; Sulentic, Courtney E W; Eidemiller, Betty J

    2012-06-01

    Toxicology and careers in toxicology, as well as many other scientific disciplines, are undergoing rapid and dramatic changes as new discoveries, technologies, and hazards advance at a blinding rate. There are new and ever-increasing demands on toxicologists to keep pace with expanding global economies, highly fluid policy debates, and increasingly complex global threats to public health. These demands must be met with new paradigms for multidisciplinary, technologically complex, and collaborative approaches that require advanced and continuing education in toxicology and associated disciplines. This requires paradigm shifts in educational programs that support recruitment, development, and training of the modern toxicologist, as well as continued education and retraining of the midcareer professional to keep pace and sustain careers in industry, government, and academia. The Society of Toxicology convened the Toxicology Educational Summit to discuss the state of toxicology education and to strategically address educational needs and the sustained advancement of toxicology as a profession. The Summit focused on core issues of: building for the future of toxicology through educational programs; defining education and training needs; developing the "Total Toxicologist"; continued training and retraining of toxicologists to sustain their careers; and, finally, supporting toxicology education and professional development. This report summarizes the outcomes of the Summit, presents examples of successful programs that advance toxicology education, and concludes with strategies that will ensure the future of toxicology through advanced educational initiatives.

  18. The Emergence of Systematic Review in Toxicology.

    Science.gov (United States)

    Stephens, Martin L; Betts, Kellyn; Beck, Nancy B; Cogliano, Vincent; Dickersin, Kay; Fitzpatrick, Suzanne; Freeman, James; Gray, George; Hartung, Thomas; McPartland, Jennifer; Rooney, Andrew A; Scherer, Roberta W; Verloo, Didier; Hoffmann, Sebastian

    2016-07-01

    The Evidence-based Toxicology Collaboration hosted a workshop on "The Emergence of Systematic Review and Related Evidence-based Approaches in Toxicology," on November 21, 2014 in Baltimore, Maryland. The workshop featured speakers from agencies and organizations applying systematic review approaches to questions in toxicology, speakers with experience in conducting systematic reviews in medicine and healthcare, and stakeholders in industry, government, academia, and non-governmental organizations. Based on the workshop presentations and discussion, here we address the state of systematic review methods in toxicology, historical antecedents in both medicine and toxicology, challenges to the translation of systematic review from medicine to toxicology, and thoughts on the way forward. We conclude with a recommendation that as various agencies and organizations adapt systematic review methods, they continue to work together to ensure that there is a harmonized process for how the basic elements of systematic review methods are applied in toxicology. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  19. The impacts of computer adaptive testing from a variety of perspectives

    Directory of Open Access Journals (Sweden)

    Tetsuo Kimura

    2017-05-01

    Computer adaptive testing (CAT) is a kind of tailored testing: a form of computer-based testing that adapts to each test-taker’s ability level. In this review, the impacts of CAT are discussed from different perspectives in order to illustrate crucial points to keep in mind during the development and implementation of CAT. Test developers and psychometricians often emphasize the efficiency and accuracy of CAT in comparison to traditional linear tests. However, many test-takers report feeling discouraged after taking CATs, and this feeling can reduce learning self-efficacy and motivation. A trade-off must be made between the psychological experiences of test-takers and measurement efficiency. From the perspective of educators and subject matter experts, nonstatistical specifications such as content coverage, content balance, and form length are major concerns. Thus, accreditation bodies may be faced with a discrepancy between the perspectives of psychometricians and those of subject matter experts. In order to improve test-takers’ impressions of CAT, the author proposes increasing the target probability of answering correctly in the item selection algorithm, even if doing so consequently decreases measurement efficiency. Two different methods, CAT with a shadow-test approach and computerized multistage testing, have been developed in order to ensure the satisfaction of subject matter experts. In the shadow-test approach, a full-length test is assembled that meets the constraints and provides maximum information at the current ability estimate, while computerized multistage testing gives subject matter experts an opportunity to review all test forms prior to administration.
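
    Under a Rasch model, the author's proposal amounts to replacing the maximum-information rule (which targets a success probability of 0.5) with selection of the item whose predicted success probability is closest to a higher target such as 0.7. A hedged sketch with an invented item bank:

        import numpy as np

        def p_correct(theta, b):
            return 1.0 / (1.0 + np.exp(-(theta - b)))  # Rasch model

        def pick_item(theta, difficulties, administered, target=0.7):
            probs = p_correct(theta, difficulties)
            probs[list(administered)] = np.nan         # exclude used items
            return int(np.nanargmin(np.abs(probs - target)))

        bank = np.linspace(-3, 3, 25)                  # item difficulties
        print(pick_item(theta=0.0, difficulties=bank, administered={12}))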

  20. Toxicological Evaluation of Lactase Derived from Recombinant Pichia pastoris

    Science.gov (United States)

    Liu, Yifei; Chen, Delong; Luo, Yunbo; Huang, Kunlun; Zhang, Wei; Xu, Wentao

    2014-01-01

    A recombinant lactase was expressed in Pichia pastoris, resulting in enzymatic activity of 3600 U/mL in a 5 L fermenter. The lactase product was subjected to a series of toxicological tests to determine its safety for use as an enzyme preparation in the dairy industry. This recombinant lactase had the highest activity of all recombinant strains reported thus far. Acute oral toxicity, mutagenicity, genotoxicity, and subchronic toxicity tests performed in rats and mice showed no deaths in any group. The median lethal dose (LD50) based on the acute oral toxicity study is greater than 30 mL/kg body weight, which corresponds to a 50 kg human consuming 1500 L of milk daily. The lactase showed no mutagenic activity in the Ames test or a mouse sperm abnormality test at levels of up to 5 mg/plate and 1250 mg/kg body weight, respectively. It also showed no genotoxicity in a bone marrow cell micronucleus test at levels of up to 1250 mg/kg body weight. A 90-day subchronic repeated-dose toxicity study via the diet, with lactase levels up to 1646 mg/kg (1000-fold greater than the mean human exposure), did not show any treatment-related significant toxicological effects on body weight, food consumption, organ weights, hematological and clinical chemistry parameters, or histopathology compared to the control groups. This toxicological evaluation system is comprehensive and can be used in the safety evaluation of other enzyme preparations. The lactase showed no acute, mutagenic, genetic, or subchronic toxicity under our evaluation system. PMID:25184300
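
    The stated safety margin is easy to verify as back-of-envelope arithmetic (the dose values come from the abstract; the function name is our own):

        def margin_of_exposure(no_effect_dose_mg_kg, human_exposure_mg_kg):
            return no_effect_dose_mg_kg / human_exposure_mg_kg

        high_dose = 1646.0               # mg/kg bw/day, no adverse effects
        human_dose = high_dose / 1000.0  # abstract: 1000x mean human exposure
        print(f"human exposure ~{human_dose:.2f} mg/kg bw/day, "
              f"MOE = {margin_of_exposure(high_dose, human_dose):.0f}")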

  1. Toxicological evaluation of lactase derived from recombinant Pichia pastoris.

    Directory of Open Access Journals (Sweden)

    Shiying Zou

    A recombinant lactase was expressed in Pichia pastoris, resulting in enzymatic activity of 3600 U/mL in a 5 L fermenter. The lactase product was subjected to a series of toxicological tests to determine its safety for use as an enzyme preparation in the dairy industry. This recombinant lactase had the highest activity of all recombinant strains reported thus far. Acute oral toxicity, mutagenicity, genotoxicity, and subchronic toxicity tests performed in rats and mice showed no deaths in any group. The median lethal dose (LD50) based on the acute oral toxicity study is greater than 30 mL/kg body weight, which corresponds to a 50 kg human consuming 1500 L of milk daily. The lactase showed no mutagenic activity in the Ames test or a mouse sperm abnormality test at levels of up to 5 mg/plate and 1250 mg/kg body weight, respectively. It also showed no genotoxicity in a bone marrow cell micronucleus test at levels of up to 1250 mg/kg body weight. A 90-day subchronic repeated-dose toxicity study via the diet, with lactase levels up to 1646 mg/kg (1000-fold greater than the mean human exposure), did not show any treatment-related significant toxicological effects on body weight, food consumption, organ weights, hematological and clinical chemistry parameters, or histopathology compared to the control groups. This toxicological evaluation system is comprehensive and can be used in the safety evaluation of other enzyme preparations. The lactase showed no acute, mutagenic, genetic, or subchronic toxicity under our evaluation system.

  2. Computer-aided testing and operational aids for PARR-1 nuclear reactor

    International Nuclear Information System (INIS)

    Ansari, S.A.

    1990-01-01

    The utilization of the plant computer of the Pakistan Research Reactor (PARR-1) for automatic periodic testing of the reactor's nuclear instrumentation is described. Computer algorithms have been developed for on-line acquisition and real-time processing of nuclear channel signals. The mean value, standard deviation, and probability distributions of nuclear channel signals are obtained in real time, and the computer generates a warning message if the signal error exceeds the maximum permissible error. In this way a faulty channel is automatically identified. Other real-time algorithms are also described that assist the operator in safe reactor operation by automatically computing the approach to criticality during reactor start-up and by determining control rod worth
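
    The on-line monitoring idea can be sketched with a running mean and standard deviation (Welford's algorithm) plus a deviation alarm; the threshold and samples below are invented, not PARR-1 values:

        class ChannelMonitor:
            """Running mean/std of a channel signal, with a warning when a
            sample deviates from the mean by more than max_error."""
            def __init__(self, max_error):
                self.n, self.mean, self.m2 = 0, 0.0, 0.0
                self.max_error = max_error

            def update(self, x):
                self.n += 1
                d = x - self.mean
                self.mean += d / self.n
                self.m2 += d * (x - self.mean)
                std = (self.m2 / (self.n - 1)) ** 0.5 if self.n > 1 else 0.0
                if abs(x - self.mean) > self.max_error:
                    print(f"WARNING: sample {x:.2f} deviates from mean "
                          f"{self.mean:.2f}")
                return self.mean, std

        mon = ChannelMonitor(max_error=0.5)
        for sample in [1.00, 1.02, 0.98, 1.01, 2.10]:  # last one: a fault
            mon.update(sample)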

  3. CITHAM - a computer code for calculating fuel depletion: description, tests, modifications and evaluation

    International Nuclear Information System (INIS)

    Alvarenga, M.A.B.

    1984-12-01

    The CITHAM computer code was developed at IPEN (Instituto de Pesquisas Energeticas e Nucleares) to link the HAMMER computer code with a fuel depletion routine and to provide neutron cross sections in the format read by the CITATION code. The problem arose from efforts to adapt the new version, denominated HAMMER-TECHION, to that routine. The HAMMER-TECHION computer code was elaborated by the Haifa Institute, Israel, within a project with EPRI. This version is at CNEN to be used for multigroup constant generation for neutron diffusion calculations within the scope of the new methodology to be adopted by CNEN. The theoretical formulation of the CITHAM computer code, tests and modifications are described. (Author)

  4. Gender, general theory of crime and computer crime: an empirical test.

    Science.gov (United States)

    Moon, Byongook; McCluskey, John D; McCluskey, Cynthia P; Lee, Sangwon

    2013-04-01

    Regarding the gender gap in computer crime, studies consistently indicate that boys are more likely than girls to engage in various types of computer crime; however, few studies have examined the extent to which traditional criminology theories account for gender differences in computer crime or the applicability of these theories in explaining computer crime across gender. Using a panel of 2,751 Korean youths, the current study tests the applicability of the general theory of crime in explaining the gender gap in computer crime and assesses the theory's utility in explaining computer crime across gender. Analyses show that self-control theory performs well in predicting illegal use of others' resident registration numbers (RRN) online for both boys and girls, as predicted by the theory. However, low self-control, a dominant criminogenic factor in the theory, fails to mediate the relationship between gender and computer crime and is inadequate in explaining illegal downloading of software in both the boy and girl models. Theoretical implications of the findings and directions for future research are discussed.
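
    The mediation test referred to here follows the usual regression logic: if low self-control mediated the gender effect, the gender coefficient would shrink once the mediator enters the model. An illustrative sketch on synthetic data (variable names mirror the abstract; the data and effect sizes are invented):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 1000
        gender = rng.integers(0, 2, n)                  # 1 = boy (toy coding)
        low_self_control = 0.4 * gender + rng.normal(0, 1, n)
        crime = 0.5 * low_self_control + 0.3 * gender + rng.normal(0, 1, n)

        def coefs(y, *xs):
            X = sm.add_constant(np.column_stack(xs))
            return sm.OLS(y, X).fit().params[1:]        # drop the intercept

        print("gender only:      ", coefs(crime, gender))
        print("gender + mediator:", coefs(crime, gender, low_self_control))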

  5. Blood transcriptomics: applications in toxicology

    Science.gov (United States)

    Joseph, Pius; Umbright, Christina; Sellamuthu, Rajendran

    2015-01-01

    The number of new chemicals that are being synthesized each year has been steadily increasing. While chemicals are of immense benefit to mankind, many of them have a significant negative impact, primarily owing to their inherent chemistry and toxicity, on the environment as well as human health. In addition to chemical exposures, human exposures to numerous non-chemical toxic agents take place in the environment and workplace. Given that human exposure to toxic agents is often unavoidable and many of these agents are found to have detrimental human health effects, it is important to develop strategies to prevent the adverse health effects associated with toxic exposures. Early detection of adverse health effects as well as a clear understanding of the mechanisms, especially at the molecular level, underlying these effects are key elements in preventing the adverse health effects associated with human exposure to toxic agents. Recent developments in genomics, especially transcriptomics, have prompted investigations into this important area of toxicology. Previous studies conducted in our laboratory and elsewhere have demonstrated the potential application of blood gene expression profiling as a sensitive, mechanistically relevant and practical surrogate approach for the early detection of adverse health effects associated with exposure to toxic agents. The advantages of blood gene expression profiling as a surrogate approach to detect early target organ toxicity and the molecular mechanisms underlying the toxicity are illustrated and discussed using recent studies on hepatotoxicity and pulmonary toxicity. Furthermore, the important challenges this emerging field in toxicology faces are presented in this review article. PMID:23456664

  6. Evolution of toxicology information systems

    Energy Technology Data Exchange (ETDEWEB)

    Wassom, J.S.; Lu, P.Y. [Oak Ridge National Laboratory, TN (United States)

    1990-12-31

    Society today is faced with new health risk situations that have been brought about by recent scientific and technical advances. Federal and state governments are required to assess the many potential health risks to exposed populations from the products (chemicals) and by-products (pollutants) of these advances. Because a sound analysis of any potential health risk should be based on the use of relevant information, it behooves those individuals responsible for making the risk assessments to know where to obtain needed information. This paper reviews the origins of toxicology information systems and explores the specialized information center concept that was proposed in 1963 as a means of providing ready access to scientific and technical information. As a means of illustrating this concept, the operation of one specialized information center (the Environmental Mutagen Information Center at Oak Ridge National Laboratory) will be discussed. Insights into how toxicological information resources came into being, their design and makeup, will be of value to those seeking to acquire information for risk assessment purposes. 7 refs., 1 fig., 4 tabs.

  7. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under different parameters that can be defined by the user. The paper first gives brief information about post-hoc simulations, then describes the working principle of the software and shows a sample simulation with the required input files. Finally, the output files are described
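
    Conceptually (this is not the simulate_CAT program itself), a post-hoc simulation re-administers items adaptively while drawing each simulated response from the examinee's recorded answers. A minimal sketch under a Rasch model, with an invented item bank and a deliberately crude ability update:

        import numpy as np

        def posthoc_cat(recorded, difficulties, n_items=5):
            """recorded: the examinee's full 0/1 paper-and-pencil responses."""
            theta, administered = 0.0, []
            for _ in range(n_items):
                p = 1 / (1 + np.exp(-(theta - difficulties)))  # P(correct)
                info = p * (1 - p)              # Rasch item information
                info[administered] = -1         # exclude items already used
                j = int(np.argmax(info))
                administered.append(j)
                # stepwise update driven by the real recorded answer
                theta += 0.5 if recorded[j] == 1 else -0.5
            return theta, administered

        bank = np.linspace(-2, 2, 20)
        responses = (np.random.default_rng(1).random(20) < 0.6).astype(int)
        print(posthoc_cat(responses, bank))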

  8. [Vision test program for ophthalmologists on Apple II, IIe and IIc computers].

    Science.gov (United States)

    Huber, C

    1985-03-01

    A microcomputer program for the Apple II family of computers on a monochrome and a color screen is described. The program draws most of the tests used by ophthalmologists and is offered as an alternative to a projector system. One advantage of the electronic generation of drawings is that true random orientation of Pflueger's E is possible. Tests are included for visual acuity (Pflueger's E, Landolt rings, numbers and children's drawings). Colored tests include a duochrome test, simple color vision tests, a fixation help with a musical background, a cobalt blue test and a Worth figure. In the astigmatic dial a mobile pointer helps to determine the axis. New tests can be programmed by the user and exchanged on disks among colleagues.

  9. Toxicological profile for thorium. Draft report (Final)

    International Nuclear Information System (INIS)

    1990-10-01

    The ATSDR Toxicological Profile for Thorium is intended to characterize succinctly the toxicological and health effects information for the substance. It identifies and reviews the key literature that describes the substance's toxicological properties. Other literature is presented but described in less detail. The profile is not intended to be an exhaustive document; however, more comprehensive sources of specialty information are referenced. The profile begins with a public health statement, which describes in nontechnical language the substance's relevant toxicological properties. Following the statement is material that presents levels of significant human exposure and, where known, significant health effects. The adequacy of information to determine the substance's health effects is described. Research gaps in nontoxic and health effects information are described. Research gaps that are of significance to the protection of public health will be identified in a separate effort. The focus of the document is on health and toxicological information

  10. Toxicological profile for uranium. Final report

    International Nuclear Information System (INIS)

    1990-12-01

    The ATSDR Toxicological Profile for Uranium is intended to characterize succinctly the toxicological and health effects information for the substance. It identifies and reviews the key literature that describes the substance's toxicological properties. Other literature is presented but described in less detail. The profile is not intended to be an exhaustive document; however, more comprehensive sources of specialty information are referenced. The profile begins with a public health statement, which describes in nontechnical language the substance's relevant toxicological properties. Following the statement is material that presents levels of significant human exposure and, where known, significant health effects. The adequacy of information to determine the substance's health effects is described. Research gaps in nontoxic and health effects information are described. Research gaps that are of significance to the protection of public health will be identified in a separate effort. The focus of the document is on health and toxicological information

  11. Toxicological profile for radon. Final report

    International Nuclear Information System (INIS)

    1990-12-01

    The ATSDR Toxicological Profile for Radon is intended to characterize succinctly the toxicological and health effects information for the substance. It identifies and reviews the key literature that describes the substance's toxicological properties. Other literature is presented but described in less detail. The profile is not intended to be an exhaustive document; however, more comprehensive sources of specialty information are referenced. The profile begins with a public health statement, which describes in nontechnical language the substance's relevant toxicological properties. Following the statement is material that presents levels of significant human exposure and, where known, significant health effects. The adequacy of information to determine the substance's health effects is described. Research gaps in nontoxic and health effects information are described. Research gaps that are of significance to the protection of public health will be identified in a separate effort. The focus of the document is on health and toxicological information

  12. Toxicological profile for plutonium. Final report

    International Nuclear Information System (INIS)

    1990-12-01

    The ATSDR Toxicological Profile for Plutonium is intended to characterize succinctly the toxicological and health effects information for the substance. It identifies and reviews the key literature that describes the substance's toxicological properties. Other literature is presented but described in less detail. The profile is not intended to be an exhaustive document; however, more comprehensive sources of specialty information are referenced. The profile begins with a public health statement, which describes in nontechnical language the substance's relevant toxicological properties. Following the statement is material that presents levels of significant human exposure and, where known, significant health effects. The adequacy of information to determine the substance's health effects is described. Research gaps in nontoxic and health effects information are described. Research gaps that are of significance to the protection of public health will be identified in a separate effort. The focus of the document is on health and toxicological information

  13. Toxicological profile for radium. Final report

    International Nuclear Information System (INIS)

    1990-12-01

    The ATSDR Toxicological Profile for Radium is intended to characterize succinctly the toxicological and health effects information for the substance. It identifies and reviews the key literature that describes the substance's toxicological properties. Other literature is presented but described in less detail. The profile is not intended to be an exhaustive document; however, more comprehensive sources of specialty information are referenced. The profile begins with a public health statement, which describes in nontechnical language the substance's relevant toxicological properties. Following the statement is material that presents levels of significant human exposure and, where known, significant health effects. The adequacy of information to determine the substance's health effects is described. Research gaps in nontoxic and health effects information are described. Research gaps that are of significance to the protection of public health will be identified in a separate effort. The focus of the document is on health and toxicological information

  14. Toxicology

    Science.gov (United States)

    Macewen, J. W.

    1973-01-01

    Oxygen toxicity is examined, including the effects of oxygen partial pressure variations on toxicity and oxygen effects on ozone and nitrogen dioxide toxicity. Toxicity of fuels and oxidizers, such as hydrazines, are reported. Carbon monoxide, spacecraft threshold limit values, emergency exposure limits, spacecraft contaminants, and water quality standards for space missions are briefly summarized.

  15. Hair: a complementary source of bioanalytical information in forensic toxicology.

    Science.gov (United States)

    Barroso, Mário; Gallardo, Eugenia; Vieira, Duarte Nuno; López-Rivadulla, Manuel; Queiroz, João António

    2011-01-01

    Hair has been used for years in the assessment and documentation of human exposure to drugs, as it presents characteristics that make it extremely valuable for this purpose: sample collection is performed in a noninvasive manner under close supervision; a specimen reflecting a similar timeline can be collected in the case of claims or suspicion of a break in the chain of custody; and the window of detection for drugs is increased. For these reasons, testing for drugs in hair provides unique and useful information in several fields of toxicology, the most prominent being the possibility of studying individual drug-use histories by means of segmental analysis. This paper reviews the unique role of hair as a complementary sample in documenting human exposure to drugs in the fields of clinical and forensic toxicology and workplace drug testing.

  16. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    NARCIS (Netherlands)

    de Beurs, D.P.; de Vries, A.L.M.; de Groot, M.H.; de Keijser, J.; Kerkhof, A.J.F.M.

    2014-01-01

    Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce

  17. Infrared Testing of the Wide-field Infrared Survey Telescope Grism Using Computer Generated Holograms

    Science.gov (United States)

    Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.

    2017-01-01

    Infrared computer-generated holograms (CGHs) were designed, manufactured and used to measure the performance of the grism (grating prism) prototype, which included testing its diffractive optical elements (DOEs). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.

  18. Use of Standardized Test Scores to Predict Success in a Computer Applications Course

    Science.gov (United States)

    Harris, Robert V.; King, Stephanie B.

    2016-01-01

    The purpose of this study was to see if a relationship existed between American College Testing (ACT) scores (i.e., English, reading, mathematics, science reasoning, and composite) and student success in a computer applications course at a Mississippi community college. The study showed that while the ACT scores were excellent predictors of…

  19. Design of and normative data for a new computer based test of ocular torsion.

    Science.gov (United States)

    Vaswani, Reena S; Mudgil, Ananth V

    2004-01-01

    To evaluate a new, clinically practical and dynamic test for quantifying torsional binocular eye alignment changes which may occur in the change from monocular to binocular viewing conditions. The test was developed using a computer with Lotus Freelance software, and binoculars with prisms and colored filters. The subject looks through the binoculars at the computer screen two meters away. For monocular vision, six concentric blue circles, a blue horizontal line and a tilted red line are displayed on the screen; for binocular vision, white circles replace the blue circles. The subject is asked to orient the lines parallel to each other. The difference in tilt (degrees) between the subjective parallel and the fixed horizontal position is the torsional alignment of the eye. The time to administer the test is approximately two minutes. In 70 normal subjects with an average age of 16 years, the mean cyclodeviation tilt in the right eye was 0.6 degrees under monocular viewing conditions and 0.7 degrees under binocular viewing conditions, with a standard deviation of approximately one degree. There was no statistically significant difference between monocular and binocular viewing. This simple, non-invasive, computer-based test has potential for use in the diagnosis of cyclovertical strabismus. Currently, there is no commercially available test for this purpose.

  20. Adaptation, testing and application of the two-dimensional FE computer program system for steam generator tube testing

    International Nuclear Information System (INIS)

    Betzold, K.

    1987-01-01

    The 2d-FE computer program system, taken over from EPRI, is used for the improvement of the eddy current testing of steam generator tubes. The investigations focus on test tasks in the area of the tube plate and the scrap mark, among them: accumulation of mud in the cracking area and above the tube plate, and circulating slots with and without accumulation of mud. The interactions of the factors of influence given by the test object with the parameters selectable by the tester, for example coil length and base spacing for absolute and differential coils as well as test frequencies, are calculated, and the forms of the signal locus curves and the dynamic curves are compiled in a sample catalogue. Selected examples demonstrate that the sample catalogue contributes to the test-specific design of the coil and the choice of the test frequencies; to the interpretation of measured signals; and to deepening the knowledge of the physical processes in eddy current testing. (orig./HP)

  1. Postmortem diagnosis and toxicological validation of illicit substance use

    OpenAIRE

    Lehrmann, E; Afanador, ZR; Deep-Soboslay, A; Gallegos, G; Darwin, WD; Lowe, RH; Barnes, AJ; Huestis, MA; Cadet, JL; Herman, MM; Hyde, TM; Kleinman, JE; Freed, WJ

    2008-01-01

    The present study examines the diagnostic challenges of identifying ante-mortem illicit substance use in human postmortem cases. Substance use, assessed by clinical case history reviews, structured next-of-kin interviews, by general toxicology of blood, urine, and/or brain, and by scalp hair testing, identified 33 cocaine, 29 cannabis, 10 phencyclidine and 9 opioid cases. Case history identified 42% cocaine, 76% cannabis, 10% phencyclidine, and 33% opioid cases. Next-of-kin interviews identif...

  2. The power of an ontology-driven developmental toxicity database for data mining and computational modeling

    Science.gov (United States)

    Modeling of developmental toxicology presents a significant challenge to computational toxicology due to endpoint complexity and lack of data coverage. These challenges largely account for the relatively few modeling successes using the structure–activity relationship (SAR) parad...

  3. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of the employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  4. [Continuous challenges in Japanese forensic toxicology practice: strategy to address specific goals].

    Science.gov (United States)

    Kageura, Mitsuyoshi

    2002-09-01

    In this paper, the status quo of forensic toxicology in Japan and the West is surveyed and a strategy to address future goals of Japanese forensic toxicology is proposed. Forensic toxicology in the West consists of three main areas: post-mortem forensic toxicology, human-performance forensic toxicology and forensic urine drug testing. In Japan, post-mortem forensic toxicology is practiced in university forensic medicine departments, while most human-performance forensic toxicology is carried out in police laboratories. However, at least at present, strictly controlled workplace urine drug testing is not being performed, despite the abuse of drugs even by uniformed members of the National Defence Forces and police. For several years, the author has been introducing Western forensic toxicology guidelines and recommendations, translated into Japanese with the help of Western forensic toxicologists, to Japanese forensic toxicologists. Western forensic toxicology practice is at an advanced stage, whereas Japanese practice is in a critical condition and holds many problems awaiting solution, as exemplified by the urine drug testing in police laboratories. No sample is ever left for re-examination by the defence, even though the initial volume of the urine sample available for examination is 30-50 ml. Only one organisation carries out everything from sampling to reporting and, in addition, the parent drug and its metabolites are not quantified. It is clear that the police laboratories do not work within good laboratory practice guidelines, nor do they have quality manuals or standard operating procedures manuals. A basic change in Japanese forensic toxicology practice is now essential. The author strongly recommends that, first of all, Japanese toxicologists should prepare forensic toxicology guidelines based on the Western models. The guidelines would progress the following objectives for forensic toxicology laboratories: 1) to have documented good

  5. Real-time prediction of Physicochemical and Toxicological Endpoints Using the Web-based CompTox Chemistry Dashboard (ACS Fall meeting) 10 of 12

    Science.gov (United States)

    The EPA CompTox Chemistry Dashboard developed by the National Center for Computational Toxicology (NCCT) provides access to data for ~750,000 chemical substances. The data include experimental and predicted data for physicochemical, environmental fate and transport and toxicologi...

  6. Concentrator optical characterization using computer mathematical modelling and point source testing

    Science.gov (United States)

    Dennison, E. W.; John, S. L.; Trentelman, G. F.

    1984-01-01

    The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels and to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray-tracing computer programs to calculate the mathematical model is described. These ray-tracing programs can include any type of optical configuration, from simple paraboloids to arrays of spherical facets, and can be adapted to microcomputers or larger computers, which can graphically display real-time comparisons of calculated and measured data.
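
    To make the ray-tracing idea concrete, the following toy Python sketch traces parallel rays off a paraboloid with a small Gaussian slope error and estimates one point on an intercept factor curve (the fraction of reflected rays landing within a given focal-plane radius). All names and parameter values are illustrative; the programs described in the abstract are far more general.

      import numpy as np

      def intercept_factor(f, aperture_r, target_r, slope_err_rad, n=100_000, seed=0):
          """Fraction of reflected rays landing within target_r at the focal plane."""
          rng = np.random.default_rng(seed)
          # sample ray footprints uniformly over the circular aperture
          r = aperture_r * np.sqrt(rng.random(n))
          th = 2.0 * np.pi * rng.random(n)
          x, y = r * np.cos(th), r * np.sin(th)
          z = (x**2 + y**2) / (4.0 * f)            # paraboloid z = r^2 / (4 f)
          # surface normals, perturbed by a small random slope error
          nx, ny, nz = -x / (2.0 * f), -y / (2.0 * f), np.ones(n)
          nx += rng.normal(0.0, slope_err_rad, n)
          ny += rng.normal(0.0, slope_err_rad, n)
          norm = np.sqrt(nx**2 + ny**2 + nz**2)
          nx, ny, nz = nx / norm, ny / norm, nz / norm
          # reflect the incoming direction d = (0, 0, -1): d' = d - 2 (d.n) n
          dot = -nz
          dx, dy, dz = -2.0 * dot * nx, -2.0 * dot * ny, -1.0 - 2.0 * dot * nz
          # propagate to the focal plane z = f and measure the landing radius
          t = (f - z) / dz
          rx, ry = x + t * dx, y + t * dy
          return float(np.mean(np.hypot(rx, ry) <= target_r))

      print(intercept_factor(f=3.0, aperture_r=2.0, target_r=0.02, slope_err_rad=2e-3))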

  7. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  8. Computer-aided, single-specimen controlled bending test for fracture-kinetics measurement in ceramics

    International Nuclear Information System (INIS)

    Borovik, V.G.; Chushko, V.M.; Kovalev, S.P.

    1995-01-01

    Fracture testing of ceramics by using controlled crack growth is proposed to allow study of crack-kinetics behavior under a given loading history. A computer-aided, real-time data acquisition system improves the quality of crack-growth parameters obtained in a simple, single-specimen bend test. Several ceramic materials were tested in the present study: aluminum nitride as a linear-elastic material; and alumina and yttria-stabilized zirconia, both representative of ceramics with microstructure-dependent nonlinear fracture properties. Ambiguities in the crack-growth diagrams are discussed to show the importance of accounting for crack-growth history in correctly describing nonequilibrium fracture behavior

  9. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    Directory of Open Access Journals (Sweden)

    Jack Burston

    2014-09-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2 English ability of incoming students is a computer-based test, since such evaluation can be administered quickly, corrected automatically, and the outcome known as soon as the test is completed. While the option of using a commercial CAT is available to institutions with the ability to pay substantial annual fees, or the means of passing these expenses on to their students, language instructors without these resources can only avail themselves of the advantages of CAT evaluation by creating their own tests. As is demonstrated by the E-CAT project described in this paper, this is a viable alternative even for those lacking any computer programming expertise. However, language teaching experience and testing expertise are critical to such an undertaking, which requires considerable effort and, above all, collaborative teamwork to succeed. A number of practical skills are also required. Firstly, the operation of a CAT authoring programme must be learned. Once this is done, test makers must master the art of creating a question database and assigning difficulty levels to test items. Lastly, if multimedia resources are to be exploited in a CAT, test creators need to be able to locate suitable copyright-free resources and re-edit them as needed.
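
    As a sketch of what the core of such a CAT engine does once difficulty levels have been assigned, the Python fragment below selects the item closest to the current ability estimate and updates that estimate from each response under a one-parameter (Rasch) model. The function names and the update step are illustrative, not taken from the E-CAT project's code.

      import math

      def prob_correct(ability, difficulty):
          # Rasch model: P(correct) = 1 / (1 + exp(-(ability - difficulty)))
          return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

      def next_item(ability, items):
          # items: dict mapping item id -> difficulty (logits); pick the
          # item whose difficulty is closest to the current ability estimate
          return min(items, key=lambda i: abs(items[i] - ability))

      def update_ability(ability, difficulty, correct, step=0.5):
          # one gradient step on the Rasch log-likelihood for this response
          return ability + step * ((1.0 if correct else 0.0) - prob_correct(ability, difficulty))

      ability, bank = 0.0, {"q1": -1.2, "q2": 0.1, "q3": 1.4}
      item = next_item(ability, bank)                       # -> "q2"
      ability = update_ability(ability, bank.pop(item), correct=True)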

  10. Environmental Sciences Division Toxicology Laboratory standard operating procedures

    International Nuclear Information System (INIS)

    Kszos, L.A.; Stewart, A.J.; Wicker, L.F.; Logsdon, G.M.

    1989-09-01

    This document was developed to provide the personnel working in the Environmental Sciences Division's Toxicology Laboratory with documented methods for conducting toxicity tests. The document consists of two parts. The first part includes the standard operating procedures (SOPs) that are used by the laboratory in conducting toxicity tests. The second part includes reference procedures from the US Environmental Protection Agency document entitled Short-Term Methods for Estimating the Chronic Toxicity of Effluents and Receiving Waters to Freshwater Organisms, upon which the Toxicology Laboratory's SOPs are based. Five of the SOPs relate to the Ceriodaphnia survival and reproduction test: preparing Ceriodaphnia food (SOP-3), maintaining Ceriodaphnia cultures (SOP-4), conducting the toxicity test (SOP-13), analyzing the test data (SOP-13), and conducting a Ceriodaphnia reference test (SOP-15). Five additional SOPs relate specifically to the fathead minnow (Pimephales promelas) larval survival and growth test: preparing fathead minnow larvae food (SOP-5), maintaining fathead minnow cultures (SOP-6), conducting the toxicity test (SOP-9), analyzing the test data (SOP-12), and conducting a fathead minnow reference test (SOP-14). The six remaining SOPs describe methods that are used with either or both tests: preparation of control/dilution water (SOP-1), washing of glassware (SOP-2), collection and handling of samples (SOP-7), preparation of samples (SOP-8), performance of chemical analyses (SOP-11), and data logging and care of technical notebooks (SOP-16).

  11. Method Development in Forensic Toxicology.

    Science.gov (United States)

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high quality analytical methods is a thorough method development. The presented article will provide an overview on the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems as well as establishing a versatile sample preparation. Method development is concluded by an optimization process after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  12. Information resources in toxicology--Italy

    International Nuclear Information System (INIS)

    Preziosi, Paolo; Dracos, Adriana; Marcello, Ida

    2003-01-01

    The purpose of the present paper is to provide an overview of current resources in the field of toxicology in Italy. The discussion will begin with a brief history of toxicology in this country, which includes the study of the toxicity of plants and other natural substances, and the birth of industrial and forensic toxicology. We will also provide information on research, education, and hazard control in the field of toxicology. Within this context we will examine the public bodies responsible for surveillance and regulatory activities, state-owned and private structures involved in toxicological research, and the educational programs and research activities of universities. Particular emphasis will be placed on the activities of the National Health Service, which plays an important role in areas such as clinical toxicology, food safety, and animal health, as well as those of national and regional agencies dedicated to the protection of the environment. The presentation will be organized as follows: - A Brief History of Toxicology in Italy; - Professional Societies; - National Health Service; - National Bodies; - Resources for the Environment; - Biomedical Websites; - Recent Publications; - Research Structures; - Graduate and Postgraduate Programs; - Legislation

  13. Predictive toxicology: the paths of the future

    International Nuclear Information System (INIS)

    Detilleux, Ph.; Vallier, L.; Legallais, C.; Leclerc, E.; Prot, J.M.; Choucha, L.; Baudoin, R.; Dufresne, M.; Gautier, A.; Carpentier, B.; Mansuy, D.; Pery, A.; Brochot, C.; Manivet, Ph.; Rabilloud, Th.; Spire, C.; Coumoul, X.; Junot, Ch.; Laprevote, O.; Le pape, A.; Le Guevel, R.; Tourneur, E.; Ben Mkaddem, S.; Chassin, C.; Aloulou, M.; Goujon, J.M.; Hertif, A.; Ouali, N.; Vimont, S.; Monteiro, R.; Rondeau, E.; Elbim, C.; Werts, C.; Vandewalle, A.; Ben Mkaddem, S.; Pedruzzi, E.; Coant, N.; Bens, M.; Cluzeaud, F.; Ogier-Denis, E.; Pongnimitprasert, N.; Babin-Chevaye, C.; Fay, M.; Bernard, M.; Dupuy, C.; Ei Benna, J.; Gougerot-Pocidale, M.A.; Braut-Boucher, F.; Pinton, Ph.; Lucioli, J.; Tsybulskyy, D.; Joly, B.; Laffitte, J.; Bourges-Abella, N.; Oswald, I.P.; Kolf-Clauw, M.; Pierre, St.; Bats, A.S.; Chevallier, A.; Bui, L.Ch.; Ambolet-Camoit, A.; Garlatti, M.; Aggerbeck, M.; Barouki, R.; Al Khansa, I.; Blanck, O.; Guillouzo, A.; Bars, R.; Rouas, C.; Bensoussan, H.; Suhard, D.; Tessier, C.; Grandcolas, L.; Pallardy, M.; Gueguen, Y.; Sparfel, L.; Pinel-Marie, M.L.; Boize, M.; Koscielny, S.; Desmots, S.; Pery, A.; Fardel, O.; Alvergnas, M.; Rouleau, A.; Lucchi, G.; Mantion, G.; Heyd, B.; Richert, L.; Ducoroy, P.; Martin, H.; Val, St.; Martinon, L.; Cachier, H.; Yahyaoui, A.; Marfaing, H.; Baeza-Squiban, A.; Martin-Chouly, C.; Bonvallet, M.; Morzadec, C.; Fardel, O.; Vernhet, L.; Baverel, G.; El Hage, M.; Nazaret, R.; Conjard-Duplany, A.; Ferrier, B.; Martin, G.; Legendre, A.; Desmots, S.; Lecomte, A.; Froment, P.; Habert, R.; Lemazurier, E.; Robinel, F.; Dupont, O.; Sanfins, E.; Dairou, J.; Chaffotte, A.F.; Busi, F.; Rodrigues Lima, F.; Dupret, J.M.; Mayati, A.; Le Ferrec, E.; Levoin, N.; Paris, H.; Uriac, Ph.; N'Diaye, M.; Lagadic-Gossmann, D.; Fardel, O.; Assemat, E.; Boublil, L.; Borot, M.C.; Marano, F.; Baeza-Squiban, A.; Martiny, V.Y.; Moroy, G.; Badel, A.; Miteva, M.A.; Hussain, S.; Ferecatu, I.; Borot, C.; Andreau, K.; Baeza-Squiban, A.; Marano, F.; Boland, S.; Leroux, M.; Zucchini-Pascal, N.; Peyre, L.; Rahmani, R.; Buron, N.; Porcedou, M.; Fromenty, B.; Borgne-Sanchez, A.; Rogue, A.; Spire, C.; Claude, N.; Guillouzo, A.

    2010-01-01

    Prevention of possible noxious effects related to exposure to one or several chemical, physical or biological agents present in our domestic or professional environment is one of today's big public health stakes. Another is the better assessment of the risks linked to the use of health-care products. The efficacy and predictiveness of toxicology studies are directly related to the combination of alternative complementary methods and animal experiments (obtaining data from different species and with different models: in vitro, ex vivo and in vivo). Despite important efforts, toxicological evaluation remains perfectible. The proceedings of this 2010 congress of the French Society of cell pharmaco-toxicology deal with recent advances, both scientific and technological, in 'predictive toxicology'. Four main topics are addressed: cell and organ models, 'omics', in silico modeling, and new technologies (imaging, cell chips, high-throughput processing). Among the different presentations, 3 abstracts present recent advances in imaging techniques applied to toxicology studies. These are: 1 - first uses in toxicology of TOF-SIMS mass spectrometry imaging (O. Laprevote, Paris-Descartes Univ. (FR)); 2 - small animal imaging, a tool for predictive toxicology (A. Le Pape, CNRS Orleans (FR)); 3 - uranium localization at the cell level using the SIMS imaging technique (C. Rouas et al., IRSN Fontenay-aux-Roses (FR)). (J.S.)

  14. A primer on systematic reviews in toxicology.

    Science.gov (United States)

    Hoffmann, Sebastian; de Vries, Rob B M; Stephens, Martin L; Beck, Nancy B; Dirven, Hubert A A M; Fowle, John R; Goodman, Julie E; Hartung, Thomas; Kimber, Ian; Lalu, Manoj M; Thayer, Kristina; Whaley, Paul; Wikoff, Daniele; Tsaioun, Katya

    2017-07-01

    Systematic reviews, pioneered in the clinical field, provide a transparent, methodologically rigorous and reproducible means of summarizing the available evidence on a precisely framed research question. Having matured to a well-established approach in many research fields, systematic reviews are receiving increasing attention as a potential tool for answering toxicological questions. In the larger framework of evidence-based toxicology, the advantages and obstacles of, as well as the approaches for, adapting and adopting systematic reviews to toxicology are still being explored. To provide the toxicology community with a starting point for conducting or understanding systematic reviews, we herein summarized available guidance documents from various fields of application. We have elaborated on the systematic review process by breaking it down into ten steps, starting with planning the project, framing the question, and writing and publishing the protocol, and concluding with interpretation and reporting. In addition, we have identified the specific methodological challenges of toxicological questions and have summarized how these can be addressed. Ultimately, this primer is intended to stimulate scientific discussions of the identified issues to fuel the development of toxicology-specific methodology and to encourage the application of systematic review methodology to toxicological issues.

  15. Long Non-Coding RNAs: A Novel Paradigm for Toxicology.

    Science.gov (United States)

    Dempsey, Joseph L; Cui, Julia Yue

    2017-01-01

    Long non-coding RNAs (lncRNAs) are over 200 nucleotides in length and are transcribed from the mammalian genome in a tissue-specific and developmentally regulated pattern. There is growing recognition that lncRNAs are novel biomarkers and/or key regulators of toxicological responses in humans and animal models. Lacking protein-coding capacity, the numerous types of lncRNAs possess a myriad of transcriptional regulatory functions that include cis and trans gene expression, transcription factor activity, chromatin remodeling, imprinting, and enhancer up-regulation. LncRNAs also influence mRNA processing, post-transcriptional regulation, and protein trafficking. Dysregulation of lncRNAs has been implicated in various human health outcomes such as cancers, Alzheimer's disease, cardiovascular disease, autoimmune diseases, as well as intermediary metabolism such as glucose, lipid, and bile acid homeostasis. Interestingly, emerging evidence in the literature over the past five years has shown that lncRNA regulation is impacted by exposures to various chemicals such as polycyclic aromatic hydrocarbons, benzene, cadmium, chlorpyrifos-methyl, bisphenol A, phthalates, phenols, and bile acids. Recent technological advancements, including next-generation sequencing technologies and novel computational algorithms, have enabled the profiling and functional characterizations of lncRNAs on a genomic scale. In this review, we summarize the biogenesis and general biological functions of lncRNAs, highlight the important roles of lncRNAs in human diseases and especially during the toxicological responses to various xenobiotics, evaluate current methods for identifying aberrant lncRNA expression and molecular target interactions, and discuss the potential to implement these tools to address fundamental questions in toxicology. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved.

  16. Historical perspectives on cadmium toxicology

    International Nuclear Information System (INIS)

    Nordberg, Gunnar F.

    2009-01-01

    The first health effects of cadmium (Cd) were reported as early as 1858. Respiratory and gastrointestinal symptoms occurred among persons using a Cd-containing polishing agent. The first experimental toxicological studies date from 1919. Bone effects and proteinuria in humans were reported in the 1940s. After World War II, a bone disease with fractures and severe pain, the itai-itai disease, a form of Cd-induced renal osteomalacia, was identified in Japan. Subsequently, the toxicokinetics and toxicodynamics of Cd were described, including its binding to the protein metallothionein. International warnings of health risks from Cd pollution were issued in the 1970s. Reproductive and carcinogenic effects were studied at an early stage, but a quantitative assessment of these effects in humans is still subject to considerable uncertainty. The World Health Organization, in its International Program on Chemical Safety, WHO/IPCS (1992) (Cadmium. Environmental Health Criteria Document 134, IPCS. WHO, Geneva, 1-280.), identified renal dysfunction as the critical effect, and a crude quantitative evaluation was presented. In the 1990s and 2000s, several epidemiological studies have reported adverse health effects, sometimes at low environmental exposures to Cd, in population groups in Japan, China, Europe and the USA (reviewed in other contributions to the present volume). The early identification of an important role of metallothionein in cadmium toxicology formed the basis for recent studies using biomarkers of susceptibility to the development of Cd-related renal dysfunction, such as gene expression of metallothionein in peripheral lymphocytes and autoantibodies against metallothionein in blood plasma. Findings in these studies indicate that very low exposure levels to cadmium may give rise to renal dysfunction among sensitive subgroups of human populations, such as persons with diabetes.

  17. 20170312 - Computer Simulation of Developmental ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and on translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  18. Computer Assisted Testing of Spoken English: A Study of the SFLEP College English Oral Test System in China

    Directory of Open Access Journals (Sweden)

    John Lowe

    2009-06-01

    This paper reports on the on-going evaluation of a computer-assisted system (CEOTS) for assessing spoken English skills among Chinese university students. This system is being developed to deal with the negative backwash effects of the present system of assessment of speaking skills, which is only available to a tiny minority. We present data from a survey of students at the developing institution (USTC), with follow-up interviews and further interviews with English language teachers, to gauge reactions to the test and its impact on language learning. We identify the key issue as one of validity, with a tension existing between the construct and consequential validities of the existing system and of CEOTS. We argue that a computer-based system seems to offer the only solution to the negative backwash problem, but the development of the technology required to meet current construct validity demands makes this a very long term prospect. We suggest that a compromise between the competing forms of validity must therefore be accepted, probably well before a computer-based system can deliver the level of interaction with the examinees that would emulate the present face-to-face mode.

  19. Computer processing of 14C data; statistical tests and corrections of data

    International Nuclear Information System (INIS)

    Obelic, B.; Planinic, J.

    1977-01-01

    The described computer program calculates the age of samples and performs statistical tests and corrections of data. Data are obtained from a proportional counter that measures anticoincident pulses per 20 minute intervals. After every 9th interval the counter measures the total number of counts per interval. Input data are punched on cards. The output list contains the input data schedule and the following results: mean CPM value, correction of CPM for normal pressure and temperature (NTP), sample age calculation based on 14C half-lives of 5570 and 5730 years, age correction for NTP, dendrochronological corrections and the relative radiocarbon concentration. All results are given with one standard deviation. An input data test (Chauvenet's criterion), gas purity test, standard deviation test and a test of the data processor are also included in the program. (author)
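
    The age step described above reduces to the standard radioactive decay law. A minimal Python sketch, with hypothetical count rates and omitting the NTP and dendrochronological corrections the program also applies, is:

      import math

      def radiocarbon_age(sample_cpm, modern_cpm, half_life):
          # A = A0 * exp(-lambda * t) with lambda = ln 2 / T_half,
          # so t = ln(A0 / A) / lambda
          decay_const = math.log(2.0) / half_life
          return math.log(modern_cpm / sample_cpm) / decay_const

      for t_half in (5570.0, 5730.0):   # both half-lives used by the program
          print(t_half, round(radiocarbon_age(12.5, 13.56, t_half)), "years")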

  20. Zebrafish neurobehavioral phenomics for aquatic neuropharmacology and toxicology research.

    Science.gov (United States)

    Kalueff, Allan V; Echevarria, David J; Homechaudhuri, Sumit; Stewart, Adam Michael; Collier, Adam D; Kaluyeva, Aleksandra A; Li, Shaomin; Liu, Yingcong; Chen, Peirong; Wang, JiaJia; Yang, Lei; Mitra, Anisa; Pal, Subharthi; Chaudhuri, Adwitiya; Roy, Anwesha; Biswas, Missidona; Roy, Dola; Podder, Anupam; Poudel, Manoj K; Katare, Deepshikha P; Mani, Ruchi J; Kyzar, Evan J; Gaikwad, Siddharth; Nguyen, Michael; Song, Cai

    2016-01-01

    Zebrafish (Danio rerio) are rapidly emerging as an important model organism for aquatic neuropharmacology and toxicology research. The behavioral/phenotypic complexity of zebrafish allows for thorough dissection of complex human brain disorders and drug-evoked pathological states. As numerous zebrafish models become available with a wide spectrum of behavioral, genetic, and environmental methods to test novel drugs, here we discuss recent zebrafish phenomics methods to facilitate drug discovery, particularly in the field of biological psychiatry. Additionally, behavioral, neurological, and endocrine endpoints are becoming increasingly well-characterized in zebrafish, making them an inexpensive, robust and effective model for toxicology research and pharmacological screening. We also discuss zebrafish behavioral phenotypes, experimental considerations, pharmacological candidates and relevance of zebrafish neurophenomics to other 'omics' (e.g., genomic, proteomic) approaches. Finally, we critically evaluate the limitations of utilizing this model organism, and outline future strategies of research in the field of zebrafish phenomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Enclosure environment characterization testing for the base line validation of computer fire simulation codes

    International Nuclear Information System (INIS)

    Nowlen, S.P.

    1987-03-01

    This report describes a series of fire tests conducted under the direction of Sandia National Laboratories for the US Nuclear Regulatory Commission. The primary purpose of these tests was to provide data against which to validate computer fire environment simulation models to be used in the analysis of nuclear power plant enclosure fire situations. Examples of the data gathered during three of the tests are presented, though the primary objective of this report is to provide a timely description of the test effort itself. These tests were conducted in an enclosure measuring 60x40x20 feet constructed at the Factory Mutual Research Corporation fire test facility in Rhode Island. All of the tests utilized forced ventilation conditions. The ventilation system was designed to simulate typical nuclear power plant installation practices and ventilation rates. A total of 22 tests using simple gas burner, heptane pool, methanol pool, and PMMA solid fires were conducted. Four of these tests were conducted with a full-scale control room mockup in place. Parameters varied during testing were fire intensity, enclosure ventilation rate, and fire location. Data gathered include air temperatures, air velocities, radiative and convective heat flux levels, optical smoke densities, inner and outer enclosure surface temperatures, enclosure surface heat flux levels, and gas concentrations within the enclosure and in the exhaust stream

  2. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  3. ENDF/B Pre-Processing Codes: Implementing and testing on a Personal Computer

    International Nuclear Information System (INIS)

    McLaughlin, P.K.

    1987-05-01

    This document describes the contents of the diskettes containing the ENDF/B Pre-Processing codes by D.E. Cullen, and example data for use in implementing and testing these codes on a Personal Computer of the type IBM-PC/AT. Upon request the codes are available from the IAEA Nuclear Data Section, free of charge, on a series of 7 diskettes. (author)

  4. IMPACT TESTING OF MATERIALS USING AN EIGHT-INCH AIR GUN AND COMPUTER REDUCTION OF DATA

    Energy Technology Data Exchange (ETDEWEB)

    Thorne, L. F.

    1973-10-01

    A mechanical shock actuator has been converted into an air gun capable of firing 8-inch-diameter (20.32 cm) projectiles to velocities exceeding 1000 fps (304.8 m/s). This new capability has been used to study the effect of impact velocity upon the energy absorbed by crushable materials. Shock-pulse data are reduced by computer techniques and test results are displayed in either tabular or graphic format by use of the CDC 6600 Calcomp plotter.
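
    One typical reduction step consistent with this description is integrating the recorded deceleration pulse to obtain the velocity history and, from it, the energy absorbed by the crushable sample. The sketch below assumes a signed acceleration record in SI units; the function and array names are illustrative, not those of the original program.

      import numpy as np

      def absorbed_energy(t, accel, mass, v0):
          """t [s], accel [m/s^2] (signed record), mass [kg], v0 [m/s] at impact."""
          # trapezoidal integration of a(t) gives the velocity change
          dv = np.concatenate(([0.0], np.cumsum(0.5 * np.diff(t) * (accel[1:] + accel[:-1]))))
          v = v0 + dv
          # kinetic energy lost between impact and the end of the record [J]
          return 0.5 * mass * (v0**2 - v[-1]**2)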

  5. Computer-based testing of the modified essay question: the Singapore experience.

    Science.gov (United States)

    Lim, Erle Chuen-Hian; Seet, Raymond Chee-Seong; Oh, Vernon M S; Chia, Boon-Lock; Aw, Marion; Quak, Seng-Hock; Ong, Benjamin K C

    2007-11-01

    The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall. Although it is traditionally conducted as a pen-and-paper examination, our university has run the MEQ using computer-based testing (CBT) since 2003. We describe our experience with running the MEQ examination using the IVLE, or integrated virtual learning environment (https://ivle.nus.edu.sg), provide a blueprint for universities intending to conduct computer-based testing of the MEQ, and detail how our MEQ examination has evolved since its inception. An MEQ committee, comprising specialists in key disciplines from the departments of Medicine and Paediatrics, was formed. We utilized the IVLE, developed for our university in 1998, as the online platform on which we ran the MEQ. We calculated the number of man-hours (academic and support staff) required to run the MEQ examination, using either a computer-based or pen-and-paper format. With the support of our university's information technology (IT) specialists, we have successfully run the MEQ examination online, twice a year, since 2003. Initially, we conducted the examination with short-answer questions only, but have since expanded the MEQ examination to include multiple-choice and extended matching questions. A total of 1268 man-hours was spent in preparing for, and running, the MEQ examination using CBT, compared to 236.5 man-hours to run it using a pen-and-paper format. Despite being more labour-intensive, our students and staff prefer CBT to the pen-and-paper format. The MEQ can be conducted using a computer-based testing scenario, which offers several advantages over a pen-and-paper format. We hope to increase the number of questions and incorporate audio and video files, featuring clinical vignettes, to the MEQ examination in the near future.

  6. The in vitro toxicology of Swedish snus

    Science.gov (United States)

    Coggins, Christopher R. E.; Ballantyne, Mark; Curvall, Margareta; Rutqvist, Lars-Erik

    2012-01-01

    Three commercial brands of Swedish snus (SWS), an experimental SWS, and the 2S3 reference moist snuff were each tested in four in vitro toxicology assays. These assays were: Salmonella reverse mutation, mouse lymphoma, in vitro micronucleus, and cytotoxicity. Water extractions of each of the 5 products were tested using several different concentrations; the experimental SWS was also extracted using dimethyl sulfoxide (DMSO). Extraction procedures were verified by nicotine determinations. Results for SWS in the mutagenicity assays were broadly negative: there were occasional positive responses, but these were effectively at the highest concentration only (concentrations well above those suggested by regulatory guidelines), and were often associated with cytotoxicity. The 2S3 reference was unequivocally positive in one of the three conditions of the micronucleus assay (MNA), at the highest concentration only. Positive controls produced the expected responses in each assay. The SWS data are contrasted with data reported for combusted tobacco in the form of cigarettes, where strongly positive responses have been routinely reported for mutagenicity and cytotoxicity. These negative findings in a laboratory setting concur with the large amount of epidemiological data from Sweden, data showing that SWS are associated with considerably lower carcinogenic potential when compared with cigarettes. PMID:22400986

  7. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    Science.gov (United States)

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English-major college students at a Taiwanese university. Computer-based foreign language listening comprehension tests designed in MOODLE, a dynamic e-learning environment, with or without immediate feedback, were administered together with the State-Trait Anxiety Inventory (STAI) and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners increase attention in foreign language listening comprehension.

  8. Computer-Aided System of Virtual Testing of Gas Turbine Engines

    Directory of Open Access Journals (Sweden)

    Rybakov Viktor N.

    2016-01-01

    The article describes the concept of a virtual lab that includes a subsystem of gas turbine engine simulation, a subsystem of experiment planning, a subsystem of measurement error simulation, a subsystem of simulator identification, and others. The basis for the virtual lab development is the computer-aided system of thermogasdynamic research and analysis "ASTRA". The features of the gas turbine engine transient-mode simulator are described. The principal difference between simulators of transient and stationary modes of gas turbine engines is that the energy balance of the compressor and turbine no longer applies. The computer-aided system of virtual gas turbine engine testing was created using the developed transient-mode simulator. This system solves the tasks of calculating operational characteristics (throttling, speed, climatic, altitude), analyzing transient dynamics, and selecting optimal control laws. Besides, the system of virtual gas turbine engine testing is a clear demonstration of the gas turbine engine working process and of the regularities of engine element collaboration. The interface of the system of virtual gas turbine engine testing is described in the article and some screenshots of the interface elements are provided. The developed system of virtual gas turbine engine testing provides means for reducing the laboriousness of gas turbine engine testing. Besides, the implementation of this system in the learning process allows the diversification of lab work and therefore improves the quality of training.

  9. VALIDITY IN COMPUTER-BASED TESTING: A LITERATURE REVIEW OF COMPARABILITY ISSUES AND EXAMINEE PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    Ika Kana Trisnawati

    2015-05-01

    These past years have seen the growing popularity of Computer-Based Tests (CBTs) in various disciplines, for various purposes, although Paper-and-Pencil Based Tests (P&Ps) are still in use. However, many question whether CBTs outperform the P&Ps in effectiveness, or whether a CBT can become a valid measuring tool compared to a P&P. This paper presents a comparison of the CBTs and the P&Ps and their respective examinee perspectives in order to determine whether doubts should arise with the emergence of CBTs over the classic P&Ps. Findings showed that CBTs are advantageous in that they are both efficient (reducing testing time) and effective (maintaining test reliability) relative to the P&P versions. Nevertheless, CBTs still need to have their variables well designed (e.g., study design, computer algorithm) in order for the scores to be comparable to those from P&P tests, since score equivalence is one of the validity evidences needed in a CBT.

  10. Testing a computer-based ostomy care training resource for staff nurses.

    Science.gov (United States)

    Bales, Isabel

    2010-05-01

    Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year of acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post-test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived mean knowledge scores averaged 3.8; after viewing the resource program, post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post-test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post-testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.
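
    For readers wanting to reproduce this kind of pre/post comparison, a paired analysis along the following lines would be typical. The arrays below are hypothetical Likert-scale scores, not the study's data, and a paired t-test stands in for the repeated-measures F test reported (for a two-level within-subjects factor, F = t^2).

      import numpy as np
      from scipy import stats

      pre = np.array([4, 3, 4, 4, 3, 4, 5, 3], dtype=float)   # hypothetical 1-5 scores
      post = np.array([5, 4, 5, 4, 4, 5, 5, 4], dtype=float)

      t, p = stats.ttest_rel(post, pre)
      print(f"t = {t:.2f}, F = {t**2:.2f}, p = {p:.4f}")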

  11. The application of digital computers to near-real-time processing of flutter test data

    Science.gov (United States)

    Hurley, S. R.

    1976-01-01

    Procedures used in monitoring, analyzing, and displaying flight and ground flutter test data are presented. These procedures include three digital computer programs developed to process structural response data in near real time. Qualitative and quantitative modal stability data are derived from time-history response data resulting from rapid sinusoidal frequency sweep forcing functions, tuned-mode quick stops, and pilot-induced control pulses. The techniques have been applied to both fixed- and rotary-wing aircraft during flight, whirl tower rotor system tests, and wind tunnel flutter model tests. A hydraulically driven oscillatory aerodynamic vane excitation system utilized during the flight flutter test programs accomplished during Lockheed L-1011 and S-3A development is described.

  12. Techniques for Investigating Molecular Toxicology of Nanomaterials.

    Science.gov (United States)

    Wang, Yanli; Li, Chenchen; Yao, Chenjie; Ding, Lin; Lei, Zhendong; Wu, Minghong

    2016-06-01

    Nanotechnology has been a rapidly developing field in the past few decades, resulting in ever greater human exposure to nanomaterials. The increased applications of nanomaterials for industrial, commercial and life purposes, such as fillers, catalysts, semiconductors, paints, cosmetic additives and drug carriers, have caused both obvious and potential impacts on human health and the environment. Nanotoxicology, which studies the safety of nanomaterials, has grown accordingly. Molecular toxicology is a new subdiscipline that studies the interactions and impacts of materials at the molecular level. To better relate molecular toxicology to nanomaterials, this review summarizes the typical techniques and methods in molecular toxicology that are applied when investigating the toxicology of nanomaterials, in six categories: genetic mutation detection, gene expression analysis, DNA damage detection, chromosomal aberration analysis, proteomics, and metabolomics. Each category involves several experimental techniques and methods.

  13. Space Toxicology: Human Health during Space Operations

    Science.gov (United States)

    Khan-Mayberry, Noreen; James, John T.; Tyl, ROchelle; Lam, Chiu-Wing

    2010-01-01

    Space toxicology is a unique and targeted discipline for spaceflight, space habitation and occupation of celestial bodies including planets, moons and asteroids. Astronaut explorers face distinctive health challenges and limited resources for rescue and medical care during space operations. A central goal of space toxicology is to protect the health of the astronaut by assessing potential chemical exposures during spaceflight and setting safe limits that will protect the astronaut against chemical exposures in a physiologically altered state. In order to maintain sustained occupation in space on the International Space Station (ISS), toxicological risks must be assessed and managed within the context of isolation, continuous exposures, reuse of air and water, limited rescue options, and the need to use highly toxic compounds for propulsion. As we begin to explore other celestial bodies, in situ toxicological risks, such as inhalation of reactive mineral dusts, must also be managed.

  14. Pulmonary toxicology of respirable particles. [Lead abstract]

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, C.L.; Cross, F.T.; Dagle, G.E.; Mahaffey, J.A. (eds.)

    1980-09-01

    Separate abstracts were prepared for the 44 papers presented in these proceedings. The last paper (Stannard) in the proceedings is an historical review of the field of inhalation toxicology and is not included in the analytics. (DS)

  15. Environmental chemistry and toxicology of mercury

    National Research Council Canada - National Science Library

    Liu, Guangliang; Cai, Yong; O'Driscoll, Nelson J

    2012-01-01

    ... employed in recent studies. The coverage discusses the environmental behavior and toxicological effects of mercury on organisms, including humans, and provides case studies at the end of each chapter...

  16. IRIS Toxicological Review of Chloroform (Final Report)

    Science.gov (United States)

    EPA is announcing the release of the final report, Toxicological Review of Chloroform: in support of the Integrated Risk Information System (IRIS). The updated Summary for Chloroform and accompanying Quickview have also been added to the IRIS Database.

  17. Toxicology in the 21st century - Working our way towards a visionary reality

    NARCIS (Netherlands)

    Berg, N.; Wever, B.de; Fuchs, H.W.; Gaca, M.; Krul, C.A.M.; Roggen, E.L.

    2011-01-01

    In November 2009 the In Vitro Testing Industrial Platform (IVTIP) organized a meeting entitled 'Toxicology in the 21st century - working our way towards a visionary reality'. Participating delegates included scientists, key opinion leaders, developers and users of 3Rs-related tests and testing

  18. Modern Instrumental Methods in Forensic Toxicology*

    Science.gov (United States)

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  19. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  20. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  1. HEALTH AND ENVIRONMENTAL IMPACT OF NANOTECHNOLOGY: TOXICOLOGICAL ASSESSMENT OF MANUFACTURED NANOPARTICLES

    Science.gov (United States)

    The microtechnology of the second half of the 20th century produced a technical revolution that has led to computers and the Internet and has taken us into a new emerging era of nanotechnology. This issue of Toxicological Sciences includes two articles, "Pulmonar...

  2. Interpretation of postmortem forensic toxicology results for injury prevention research.

    Science.gov (United States)

    Drummer, Olaf H; Kennedy, Briohny; Bugeja, Lyndal; Ibrahim, Joseph Elias; Ozanne-Smith, Joan

    2013-08-01

    Forensic toxicological data provides valuable insight into the potential contribution of alcohol and drugs to external-cause deaths. There is a paucity of material that guides injury researchers on the principles that need to be considered when examining the presence and contribution of alcohol and drugs to these deaths. This paper aims to describe and discuss strengths and limitations of postmortem forensic toxicology sample selection, variations in analytical capabilities and data interpretation for injury prevention research. Issues to be considered by injury researchers include: the circumstances surrounding death (including the medical and drug use history of the deceased person); time and relevant historical factors; postmortem changes (including redistribution and instability); laboratory practices; specimens used; drug concentration; and attribution of contribution to death. This paper describes the range of considerations for testing and interpreting postmortem forensic toxicology, particularly when determining impairment or toxicity as possible causal factors in injury deaths. By describing these considerations, this paper has application to decisions about study design and case inclusion in injury prevention research, and to the interpretation of research findings.

  3. MicroRNAs and toxicology: A love marriage

    Directory of Open Access Journals (Sweden)

    Elisabeth Schraml

    With the dawn of personalized medicine, secreted microRNAs (miRNAs) have come into the very focus of biomarker development for various diseases. MiRNAs fulfil key requirements of diagnostic tools such as (i) non- or minimally-invasive accessibility, (ii) robust, standardized and non-expensive quantitative analysis, (iii) rapid turnaround of the test result and (iv) most importantly, provision of a comprehensive snapshot of the ongoing physiologic processes in cells and tissues that package and release miRNAs into cell-free space. These characteristics have also established circulating miRNAs as promising biomarker candidates for toxicological studies, where they are used as biomarkers of drug- or chemical-induced tissue injury for safety assessment. The tissue-specificity and early release of circulating miRNAs upon tissue injury, when damage is still reversible, are the main factors for their clinical utility in toxicology. Here we summarize, in brief, current knowledge of this field. Keywords: microRNAs, Biomarker, Toxicology, Minimal-invasive, DILI

  4. Computer-enhanced thallium scintigrams in asymptomatic men with abnormal exercise tests

    International Nuclear Information System (INIS)

    Uhl, G.S.; Kay, T.N.; Hickman, J.R. Jr.

    1981-01-01

    The use of treadmill testing in asymptomatic patients and those with an atypical chest pain syndrome is increasing, yet the proportion of false positive stress electrocardiograms increases as the prevalence of disease decreases. To determine the diagnostic accuracy of computer-enhanced thallium perfusion scintigraphy in this subgroup of patients, multigated thallium scans were obtained after peak exercise and 3 or 4 hours after exercise, and the raw images were enhanced by a computer before interpretations were made. The patient group consisted of 191 asymptomatic U.S. Air Force aircrewmen who had an abnormal exercise electrocardiogram. Of these, 135 had normal coronary angiographic findings, 15 had subcritical coronary stenosis (less than 50 percent diameter narrowing) and 41 had significant coronary artery disease. Use of computer enhancement resulted in only four false positive and two false negative scintigrams. The small subgroup with subcritical coronary disease had equivocal results on thallium scintigraphy, 10 men having abnormal scans and 5 showing no defects. The clinical significance of such subcritical disease is unclear, but it can be detected with thallium scintigraphy. Thallium scintigrams that have been enhanced by readily available computer techniques are an accurate diagnostic tool even in asymptomatic patients with an easily interpretable abnormal maximal stress electrocardiogram. Thallium scans can be effectively used in counseling asymptomatic patients on the likelihood of their having coronary artery disease.
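
    The diagnostic accuracy implied by these counts is easy to verify (excluding the equivocal subcritical group):

      tp, fn = 41 - 2, 2        # 41 men with significant disease, 2 false negatives
      tn, fp = 135 - 4, 4       # 135 angiographically normal men, 4 false positives

      sensitivity = tp / (tp + fn)   # ~0.95
      specificity = tn / (tn + fp)   # ~0.97
      print(f"sensitivity {sensitivity:.3f}, specificity {specificity:.3f}")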

  5. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two-degrees-of-freedom system, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative random vibration test based on force limitation, when the description of the adjacent structure (source) is more or less unknown. Marchand formulated a conservative estimate of C2 based on the maximum modal effective mass and damping of the test item (load), when no description of the supporting structure (source) is available [13]. Marchand discussed the formal description of obtaining C2, using the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2. However, finite element models are needed to compute the spectra of the PSD of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus-patch models (parallel-oscillator representations) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies. When the random acceleration vibration specification is given, the CSMA
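
    For orientation, the semi-empirical form in which C2 is normally applied converts the interface acceleration specification into a force specification that is flat below the turnover frequency and rolls off above it. A minimal Python sketch of this standard relation, with an illustrative roll-off exponent, is:

      import numpy as np

      def force_limit_psd(freq, s_aa, m0, c2, f0, n=2.0):
          """Semi-empirical force-limit PSD.

          freq : frequencies [Hz]; s_aa : acceleration PSD spec [(m/s^2)^2/Hz]
          m0 : total load mass [kg]; c2 : the factor C^2
          f0 : turnover frequency [Hz]; n : roll-off exponent (2 is common)
          """
          freq = np.asarray(freq, dtype=float)
          s_ff = c2 * m0**2 * np.asarray(s_aa, dtype=float)   # flat region, f <= f0
          above = freq > f0
          s_ff[above] *= (f0 / freq[above]) ** n              # roll-off above f0
          return s_ff                                         # force PSD [N^2/Hz]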

  6. Differential computation method used to calibrate the angle-centroid relationship in coaxial reverse Hartmann test

    Science.gov (United States)

    Li, Xinji; Hui, Mei; Zhao, Zhu; Liu, Ming; Dong, Liquan; Kong, Lingqin; Zhao, Yuejin

    2018-05-01

    A differential computation method is presented to improve the precision of calibration for the coaxial reverse Hartmann test (RHT). In the calibration, the accuracy of the distance measurement greatly influences the surface shape test, as demonstrated in the mathematical analyses. However, high-precision absolute distance measurement is difficult in the calibration. Thus, a differential computation method that only requires the relative distance was developed. In the proposed method, a liquid crystal display screen successively displayed two regular dot matrix patterns with different dot spacing. In a special case, images on the detector exhibited similar centroid distributions during the reflector translation. Thus, the critical value of the relative displacement distance and the centroid distributions of the dots on the detector were utilized to establish the relationship between the rays at certain angles and the detector coordinates. Experiments revealed the approximately linear behavior of the centroid variation with the relative displacement distance. With the differential computation method, we increased the precision of traditional calibration to 10⁻⁵ rad root mean square. The precision of the RHT was increased by approximately 100 nm.
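
    The reported linearity of centroid motion with relative displacement suggests a simple fitting step; a hedged sketch of recovering that slope from relative-only distance data, with entirely synthetic numbers:

    ```python
    import numpy as np

    # Synthetic data: centroid x-position of one dot [px] at several relative
    # displacements of the reflector [mm]; only differences enter the fit, so
    # no absolute distance measurement is needed.
    dz = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    cx = np.array([210.1, 214.9, 220.2, 225.0, 229.8])

    slope, intercept = np.polyfit(dz, cx, 1)
    print(f"centroid sensitivity ~ {slope:.2f} px/mm")
    # Mapping this slope to ray angles requires the system geometry and the
    # known dot spacings of the two displayed patterns (not modeled here).
    ```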

  7. Programmed temperature control of capsule in irradiation test with personal computer at JMTR

    International Nuclear Information System (INIS)

    Saito, H.; Uramoto, T.; Fukushima, M.; Obata, M.; Suzuki, S.; Nakazaki, C.; Tanaka, I.

    1992-01-01

    The capsule irradiation facility is one of various pieces of equipment employed at the Japan Materials Testing Reactor (JMTR). The capsule facility has been used in irradiation tests of both nuclear fuels and materials. The capsule to be irradiated consists of the specimen and outer and inner tubes with an annular space between them. The temperature of the specimen is controlled by varying the degree of pressure (below atmospheric pressure) of He gas in the annular space (vacuum-controlled). Besides this, in another system the temperature of the specimen is controlled with electric heaters mounted around the specimen (heater-controlled). The use of a personal computer in the capsule facility has led to the development of a versatile temperature control system at the JMTR. The features of this newly developed temperature control system are the following: the temperature control mode for an operation period can be preset prior to the operation, and the vacuum-controlled irradiation facility can be used in cooperation with the heater-controlled one. The introduction of the personal computer has brought in automatic heat-up and cool-down operations of the capsule, replacing the manual operations which had been conducted by the operators. As a result, the various requirements for higher accuracy and efficiency in irradiation can be met by fully exploiting the capabilities incorporated into the facility, which allow cyclic or delicate changes in the temperature. This paper deals with a capsule temperature control system based on a personal computer. (author)
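
    The record does not give the control algorithm itself; purely as an illustration of a preset ramp-and-hold temperature program driving a feedback loop, here is a toy PI controller against a made-up first-order plant (all names, gains, and values hypothetical, not the JMTR system):

    ```python
    # Illustrative only: NOT the JMTR controller, just a preset-schedule PI loop.
    def setpoint(t_min: float) -> float:
        """Preset program: 60-minute heat-up ramp, then hold at 400 deg C."""
        return 20.0 + (400.0 - 20.0) * min(t_min, 60.0) / 60.0

    kp, ki = 0.8, 0.02                    # made-up PI gains
    integral, temp, dt = 0.0, 20.0, 0.5   # dt in minutes
    for step in range(int(120 / dt)):
        err = setpoint(step * dt) - temp
        integral += err * dt
        u = kp * err + ki * integral      # actuator demand (heater power or He pressure)
        temp += dt * (0.1 * u - 0.01 * (temp - 20.0))  # toy plant response
    print(f"temperature after 2 h: {temp:.1f} deg C (program target 400.0)")
    ```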

  8. [Toxicological evaluation in the childhood].

    Science.gov (United States)

    Arroyo, Amparo; Rodrigo, Carlos; Marrón, M Teresa

    2014-03-01

    Intoxications in childhood require urgent medical treatment within national health systems. In our country they represent 0.3% of paediatric emergencies. Most of them are accidental intoxications, but it is not infrequent to find some related to child abuse or to suicidal intentions, especially in adolescence. The objectives of the study are to evaluate both the clinical health care and the medico-legal aspects of intoxications in childhood. Medical assistance is described, including clinical diagnosis, the typology of the most common toxic agents, percentages, and referral to social work and emergency care units of the Ministry of Social Welfare and the Department of Health or, where appropriate, directly to prosecutors and courts for their intervention. In cases of detection of alcohol, drugs or medication in children, the importance of the correct interpretation of the toxicological findings is discussed. Several studies on the interpretation of results concerning the detection of these toxic agents are reported. Both the legal aspects and the forensic medical opinion are assessed. The findings will be analysed by the judicial authority in order to assign responsibilities or to take appropriate decisions concerning the protection of children's interests. In conclusion, intoxication in childhood can lead to legal proceedings requiring specific actions for the child's protection. Both physicians and hospitals must comply with the legal requirement of submitting judicial reports to the court. On the other hand, this information is an interesting step toward reinforcing public health surveillance. Copyright © 2014 Elsevier España, S.L. All rights reserved.

  9. Good Practices in Forensic Toxicology.

    Science.gov (United States)

    Drummer, Olaf H

    2017-01-01

    This manuscript provides an overview, for analysts, medical and scientific investigators, and laboratory administrators, of the range of factors that should be considered to implement best-practice forensic toxicology. These include laboratory influence over the collection of specimens and their proper transport and chain of custody before arrival in the laboratory. In addition, the laboratory needs to ensure that properly trained staff use suitably validated and documented analytical procedures that meet the intended purpose and type of case, in an accredited or suitably quality-oriented management system. To assist investigating officers, laboratory results require interpretation of their possible significance when sufficient details are available about the circumstances of the case. This requires a thorough understanding of the various factors that influence concentrations of substances and ultimately their likely physiological effect. These include consideration of the route of ingestion, the influence of chronic use on tissue concentrations and tolerance, possible combined drug effects or likely adverse reactions, and consideration of relevant genetic factors that may have influenced the pharmacokinetic or pharmacodynamic response. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. Introduction: biomarkers in neurodevelopment toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Needleman, H.L.

    1987-10-01

    The search for markers of toxicant exposure and effect upon the development of organisms presents a set of challenges that differ in many ways from those encountered in the study of markers in reproduction or pregnancy. These latter two fields specify a relatively narrow set of organs or biological systems. The term development, on the other hand, can apply to any organ system, or to any set of phenomena that changes in an ordered way over time. For this reason the papers presented in the session on development were chosen to narrow the focus to neurodevelopmental markers, as such markers may be altered by neurotoxic exposure. In attempting to meet this task, the authors have been able to select a group of investigators who work at the leading edges of their respective fields of developmental neuroanatomy, neurotoxicology, neuroendocrinology, neuropsychology, and infant development. The notion that toxicants could affect behavior certainly is not new. Recent knowledge that behavioral aberrations can occur at exposures below those which produce organic changes, and that behavioral observation might provide early markers of effect, has given rise to two new fields: behavioral toxicology and behavioral teratology.

  11. Finite element simulation of nanoindentation tests using a macroscopic computational model

    International Nuclear Information System (INIS)

    Khelifa, Mourad; Fierro, Vanessa; Celzard, Alain

    2014-01-01

    The aim of this work was to develop a numerical procedure to simulate nanoindentation tests using a macroscopic computational model. Both the theoretical and the numerical aspects of the proposed methodology, based on the coupling of isotropic elasticity and anisotropic plasticity described with the quadratic criterion of Hill, are presented. The anisotropic plastic behaviour accounts for mixed nonlinear hardening (isotropic and kinematic) under large plastic deformation. Nanoindentation tests were simulated to analyse the nonlinear mechanical behaviour of an aluminium alloy. The predicted results of the finite element (FE) modelling are in good agreement with the experimental data, thereby confirming the accuracy of the suggested FE method of analysis. The effects of some technological and mechanical parameters known to have an influence during nanoindentation tests were also investigated.
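
    For reference, the quadratic criterion of Hill mentioned above is commonly written in its standard form as (the specific coefficients fitted in the paper are not given in this record):

    \[ F(\sigma_{22}-\sigma_{33})^2 + G(\sigma_{33}-\sigma_{11})^2 + H(\sigma_{11}-\sigma_{22})^2 + 2L\sigma_{23}^2 + 2M\sigma_{31}^2 + 2N\sigma_{12}^2 = 1 \]

    where F, G, H, L, M and N are anisotropy parameters identified from directional yield stresses, and the \(\sigma_{ij}\) are the stress components in the material frame.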

  12. Computer aided testing of steel samples deformation at coexistence liquid and solid phase

    International Nuclear Information System (INIS)

    Hojny, M.; Glowacki, M.

    2007-01-01

    The paper reports the results of experimental and theoretical work leading to the construction of a CAE system dedicated to the numerical simulation of plastic deformation of steel at the coexistence of liquid and solid phases. A coupled thermal-mechanical model including an inverse analysis technique was adopted for the solver. The advantage of the solution is the analytical form of both the incompressibility and the mass conservation conditions. This prevents the usual FEM variational solution problem of unintentional specimen volume loss caused by numerical errors. The only well-known machine allowing tests in the discussed temperature range is the GLEEBLE thermo-mechanical simulator. Experiments on the deformation of steel in the semi-solid state using this machine are very expensive. Therefore, the application of a dedicated computer simulation system with the inverse method makes tests possible and lowers testing costs.

  13. Data governance in predictive toxicology: A review.

    Science.gov (United States)

    Fu, Xin; Wojak, Anna; Neagu, Daniel; Ridley, Mick; Travis, Kim

    2011-07-13

    Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality mainly refers to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data source development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper, data governance is identified as the new challenge in

  14. Data governance in predictive toxicology: A review

    Directory of Open Access Journals (Sweden)

    Fu Xin

    2011-07-01

    Full Text Available Abstract Background Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality mainly refers to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). Results This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data source development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. Conclusions While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper

  15. Botulinum Neurotoxins: Biology, Pharmacology, and Toxicology.

    Science.gov (United States)

    Pirazzini, Marco; Rossetto, Ornella; Eleopra, Roberto; Montecucco, Cesare

    2017-04-01

    The study of botulinum neurotoxins (BoNT) is rapidly progressing in many aspects. Novel BoNTs are being discovered owing to next-generation sequencing, but their biologic and pharmacological properties remain largely unknown. The molecular structure of the large protein complexes that the toxin forms with accessory proteins, which are included in some BoNT type A1 and B1 pharmacological preparations, has been determined. By far the largest effort has been dedicated to the testing and validation of BoNTs as therapeutic agents in an ever-increasing number of applications, including pain therapy. BoNT type A1 has also been exploited in a variety of cosmetic treatments, alone or in combination with other agents, and this specific market has reached the size of the one dedicated to the treatment of medical syndromes. The pharmacological properties and mode of action of BoNTs have shed light on general principles of neuronal transport and protein-protein interactions and are stimulating basic science studies. Moreover, the wide array of BoNTs discovered and to be discovered and the production of recombinant BoNTs endowed with specific properties suggest novel uses in therapeutics with increasing disease/symptom specificity. These recent developments are reviewed here to provide an updated picture of the biologic mechanism of action of BoNTs, of their increasing use in pharmacology and in cosmetics, and of their toxicology. Copyright © 2017 by The Author(s).

  16. Computer-enhanced thallium scintigrams in asymptomatic men with abnormal exercise tests

    International Nuclear Information System (INIS)

    Uhl, G.S.; Kay, T.N.; Hickman, J.R. Jr.

    1981-01-01

    The usefulness of computer-enhanced thallium-201 myocardial perfusion scintigraphy in excluding the diagnosis of coronary artery disease in asymptomatic patients showing abnormal exercise electrocardiograms is evaluated. Multigated thallium scans were obtained immediately following and 3 or 4 hours after maximal exercise testing in 191 consecutive asymptomatic Air Force aircrew members who had shown abnormal exercise electrocardiograms and who were due to undergo coronary angiography. Computer enhancement of the raw images is found to lead to four false positive and two false negative scintigrams as revealed by angiographic results, while the group of 15 with subcritical coronary disease exhibited equivocal results. The results reveal that enhanced thallium scintigrams are an accurate diagnostic tool for detecting myocardial ischemia in asymptomatic patients and may be used in counseling asymptomatic patients on their likelihood of having coronary artery disease.

  17. A computational environment for creating and testing reduced chemical kinetic mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, C.J.; Swensen, D.A.; Harding, T.V.; Cremer, M.A.; Bockelie, M.J. [Reaction Engineering International, Salt Lake City, UT (USA)

    2002-02-01

    This paper describes software called the Computer Assisted Reduced Mechanism Problem Solving Environment (CARM-PSE) that gives the engineer the ability to rapidly set up, run and examine large numbers of problems comparing detailed and reduced (approximate) chemistry. CARM-PSE integrates the automatic chemical mechanism reduction code CARM and codes that simulate perfectly stirred reactors and plug flow reactors into a user-friendly computational environment. CARM-PSE gives the combustion engineer the ability to easily test chemical approximations over many hundreds of combinations of inputs in a multidimensional parameter space. The demonstration problems compare detailed and reduced chemical kinetic calculations for methane-air combustion, including nitrogen oxide formation, in a stirred reactor, and for selective non-catalytic reduction of NOx in coal combustion flue gas.
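
    CARM-PSE itself is not publicly documented in this record; as a rough, hedged illustration of the kind of detailed-versus-reduced comparison it automates, here is a minimal sketch using the open-source Cantera library with GRI-Mech 3.0 as the detailed methane mechanism (the reduced mechanism file name is a placeholder standing in for a CARM-generated mechanism):

    ```python
    import cantera as ct

    def ignition_delay(mech: str, T0: float = 1400.0) -> float:
        """Constant-pressure ignition delay of stoichiometric CH4/air [s]."""
        gas = ct.Solution(mech)
        gas.TPX = T0, ct.one_atm, "CH4:1, O2:2, N2:7.52"
        reactor = ct.IdealGasConstPressureReactor(gas)
        net = ct.ReactorNet([reactor])
        while net.time < 0.1:
            net.step()
            if reactor.T > T0 + 400.0:   # simple temperature-rise criterion
                return net.time
        return float("nan")

    tau_det = ignition_delay("gri30.yaml")        # detailed chemistry
    tau_red = ignition_delay("reduced_ch4.yaml")  # hypothetical reduced mechanism
    print(f"detailed {tau_det:.3e} s vs reduced {tau_red:.3e} s")
    ```

    Sweeping such a comparison over a grid of temperatures, pressures, and equivalence ratios reproduces, in miniature, the multidimensional parameter studies the abstract describes.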

  18. LIMBO computer code for analyzing coolant-voiding dynamics in LMFBR safety tests

    International Nuclear Information System (INIS)

    Bordner, G.L.

    1979-10-01

    The LIMBO (liquid metal boiling) code for the analysis of two-phase flow phenomena in an LMFBR reactor coolant channel is presented. The code uses a nonequilibrium, annular, two-phase flow model, which allows for slip between the phases. Furthermore, the model is intended to be valid for both quasi-steady boiling and rapid coolant voiding of the channel. The code was developed primarily for the prediction of, and the posttest analysis of, coolant-voiding behavior in the SLSF P-series in-pile safety test experiments. The program was conceived to be simple, efficient, and easy to use. It is particularly suited for parametric studies requiring many computer runs and for the evaluation of the effects of model or correlation changes that require modification of the computer program. The LIMBO code, of course, lacks the sophistication and model detail of the reactor safety codes, such as SAS, and is therefore intended to complement these safety codes.

  19. Computed tomography (CT) as a nondestructive test method used for composite helicopter components

    Science.gov (United States)

    Oster, Reinhold

    1991-09-01

    The first components of primary helicopter structures to be made of glass fiber reinforced plastics were the main and tail rotor blades of the Bo105 and BK 117 helicopters. These blades are now successfully produced in series. New developments in rotor components, e.g., the rotor blade technology of the Bo108 and PAH2 programs, make use of very complex fiber reinforced structures to achieve simplicity and strength. Computed tomography was found to be an outstanding nondestructive test method for examining the internal structure of components. A CT scanner generates x-ray attenuation measurements which are used to produce computer-reconstructed images of any desired part of an object. The system images a range of flaws in composites in a number of views and planes. Several CT investigations and their results are reported, taking composite helicopter components as examples.

  20. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    Gould, S.S.; Layman, L.R.; Million, D.L.

    1983-01-01

    The utilization of computers for data acquisition and control is of paramount importance in large-scale fusion experiments because they feature the ability to acquire data from a large number of sensors at various sample rates and provide for flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system.

  1. OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia.

    Science.gov (United States)

    Tcheremenskaia, Olga; Benigni, Romualdo; Nikolova, Ivelina; Jeliazkova, Nina; Escher, Sylvia E; Batke, Monika; Baier, Thomas; Poroikov, Vladimir; Lagunin, Alexey; Rautenberg, Micha; Hardy, Barry

    2012-04-24

    The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data, and its automatic processing. The following related ontologies have been developed for OpenTox: a) Toxicological ontology - listing the toxicological endpoints; b) Organs system and Effects ontology - addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology - representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology - representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink-ToxCast assays ontology; and f) OpenToxipedia community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, seamless integration of new algorithms and scientifically sound validation routines, and provide a flexible framework which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists). The OpenTox toxicological ontology projects may be accessed via the Open
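
    Each OpenTox resource is described as a resolvable URI whose RDF representation can be retrieved over REST; a minimal sketch of such a request (the URI below is a made-up placeholder, not a live endpoint):

    ```python
    import requests

    # Hypothetical compound URI following the REST pattern in the abstract.
    uri = "http://example.org/opentox/compound/123"

    # Content negotiation: ask the service for an RDF serialization.
    resp = requests.get(uri, headers={"Accept": "application/rdf+xml"}, timeout=30)
    resp.raise_for_status()
    print(resp.text[:400])   # beginning of the RDF/XML document
    ```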

  2. Adjusting for cross-cultural differences in computer-adaptive tests of quality of life.

    Science.gov (United States)

    Gibbons, C J; Skevington, S M

    2018-04-01

    Previous studies using the WHOQOL measures have demonstrated that the relationship between individual items and the underlying quality of life (QoL) construct may differ between cultures. If unaccounted for, these differing relationships can lead to measurement bias which, in turn, can undermine the reliability of results. We used item response theory (IRT) to assess differential item functioning (DIF) in WHOQOL data from diverse language versions collected in the UK, Zimbabwe, Russia, and India (total N = 1332). Data were fitted to the partial credit 'Rasch' model. We used four item banks previously derived from the WHOQOL-100 measure, which provided excellent measurement for the physical, psychological, social, and environmental quality of life domains (40 items overall). Cross-cultural differential item functioning was assessed using analysis of variance for item residuals and post hoc Tukey tests. Simulated computer-adaptive tests (CATs) were conducted to assess the efficiency and precision of the four item banks. Splitting item parameters by DIF resulted in four linked item banks without DIF or other breaches of IRT model assumptions. Simulated CATs were more precise and efficient than longer paper-based alternatives. Assessing differential item functioning using item response theory can identify measurement invariance between cultures which, if uncontrolled, may undermine accurate comparisons in computer-adaptive testing assessments of QoL. We demonstrate how compensating for DIF using item anchoring allowed data from all four countries to be compared on a common metric, thus facilitating assessments which were both sensitive to cultural nuance and comparable between countries.
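
    As a toy illustration of the DIF screening step described (analysis of variance on item residuals across countries, followed by post hoc tests), with synthetic residuals standing in for real Rasch fit output:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic standardized residuals for one item, grouped by country;
    # a mean shift in one group mimics differential item functioning.
    groups = {
        "UK":       rng.normal(0.0, 1.0, 300),
        "Zimbabwe": rng.normal(0.0, 1.0, 300),
        "Russia":   rng.normal(0.6, 1.0, 300),   # simulated DIF
        "India":    rng.normal(0.0, 1.0, 300),
    }

    f_stat, p_val = stats.f_oneway(*groups.values())
    print(f"ANOVA on item residuals: F = {f_stat:.1f}, p = {p_val:.2e}")
    # A significant result would be followed by post hoc Tukey tests and,
    # as in the paper, by splitting that item's parameters by country.
    ```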

  3. Use of computer-assisted prediction of toxic effects of chemical substances

    International Nuclear Information System (INIS)

    Simon-Hettich, Brigitte; Rothfuss, Andreas; Steger-Hartmann, Thomas

    2006-01-01

    The current revision of the European policy for the evaluation of chemicals (REACH) has led to a controversy with regard to the need for additional animal safety testing. To avoid increases in animal testing but also to save time and resources, alternative in silico or in vitro tests for the assessment of toxic effects of chemicals are advocated. The draft of the original document, issued on 29 October 2003 by the European Commission, foresees the use of alternative methods but does not further specify which methods should be used. Computer-assisted prediction models, so-called predictive tools, besides in vitro models, will likely play an essential role in the proposed repertoire of 'alternative methods'. The current discussion has urged the Advisory Committee of the German Toxicology Society to present its position on the use of predictive tools in toxicology. Acceptable prediction models already exist for those toxicological endpoints which are based on well-understood mechanisms, such as mutagenicity and skin sensitization, whereas mechanistically more complex endpoints such as acute, chronic or organ toxicities currently cannot be satisfactorily predicted. A potential strategy for assessing such complex toxicities will lie in their dissection into models for the different steps or pathways leading to the final endpoint. Integration of these models should result in higher predictivity. Despite these limitations, computer-assisted prediction tools already play a complementary role today in the assessment of chemicals for which no data are available or for which toxicological testing is impractical due to the lack of availability of sufficient compounds for testing. Furthermore, predictive tools offer support in the screening and subsequent prioritization of compounds for further toxicological testing, as expected within the scope of the European REACH program. This program will also lead to the collection of high-quality data which will broaden the

  4. Precision toxicology based on single cell sequencing: an evolving trend in toxicological evaluations and mechanism exploration.

    Science.gov (United States)

    Zhang, Boyang; Huang, Kunlun; Zhu, Liye; Luo, Yunbo; Xu, Wentao

    2017-07-01

    In this review, we introduce a new concept, precision toxicology: the mode of action of chemical- or drug-induced toxicity can be sensitively and specifically investigated by isolating a small group of cells or even a single cell with typical phenotype of interest followed by a single cell sequencing-based analysis. Precision toxicology can contribute to the better detection of subtle intracellular changes in response to exogenous substrates, and thus help researchers find solutions to control or relieve the toxicological effects that are serious threats to human health. We give examples for single cell isolation and recommend laser capture microdissection for in vivo studies and flow cytometric sorting for in vitro studies. In addition, we introduce the procedures for single cell sequencing and describe the expected application of these techniques to toxicological evaluations and mechanism exploration, which we believe will become a trend in toxicology.

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  6. Dynamic behaviour of raft and pile foundations tests and computational models. Pt. 1

    International Nuclear Information System (INIS)

    Betbeder, J.; Garnier, J.C.; Gauvain, J.; Jeandidier, C.

    1981-01-01

    Pile foundations are commonly used for many types of buildings where the bearing capacity of the soil is poor. For nuclear power plant buildings, however, there seems to be a fairly general reluctance to accept designs on piles, as it is considered difficult to demonstrate the safety of these foundations with respect to earthquakes, due to the relative lack of validation of the currently available aseismic design methods. Being conscious that pile foundations might be worth considering for future nuclear sites in France and that the reliability of design methods should be backed by experimental data, ELECTRICITE DE FRANCE decided in 1978 to undertake a series of tests, aimed at assessing the validity of computational models for the seismic behaviour of pile foundations and trying to define better models if necessary. These tests on reduced-scale structures, including various types of raft and pile foundations and different kinds of dynamic excitation (harmonic, earthquake simulation, impulsive release of a static force), have been made at the NICE airport site. The present paper deals with the general description of the tests and the first part of the interpretation work, limited to in-structure harmonic excitation and earthquake simulation tests analyzed by simple spring-dashpot analytical models. The two following papers (K5-6 and K5-7) are devoted to specialized topics related to the interpretation of the tests, i.e., ground motion analysis for earthquake simulation and research work on a new computational model. Although preliminary conclusions can be drawn from the results obtained so far, further work will be necessary to reach a conclusive assessment on this difficult subject. (orig.)

  7. A benchmark test of computer codes for calculating average resonance parameters

    International Nuclear Information System (INIS)

    Ribon, P.; Thompson, A.

    1983-01-01

    A set of resonance parameters has been generated from known, but secret, average values; the parameters have then been adjusted to mimic experimental data by including the effects of Doppler broadening, resolution broadening and statistical fluctuations. Average parameters calculated from the dataset by various computer codes are compared with each other, and also with the true values. The benchmark test is fully described in the report NEANDC160-U (NEA Data Bank Newsletter No. 27 July 1982); the present paper is a summary of this document. (Auth.)

  8. Hardware synthesis from DDL. [Digital Design Language for computer aided design and test of LSI

    Science.gov (United States)

    Shah, A. M.; Shiva, S. G.

    1981-01-01

    The details of digital systems can be conveniently input into a design automation system by means of Hardware Description Languages (HDL). The Computer Aided Design and Test (CADAT) system at NASA MSFC is used for LSI design. The Digital Design Language (DDL) has been selected as the HDL for the CADAT system. The DDL translator output can be used for the hardware implementation of the digital design. This paper addresses the problem of selecting standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system.

  9. State-of-the-art review of nondestructive testing with computer-assisted tomography

    International Nuclear Information System (INIS)

    Bansal, A; Islam, M.R.

    1991-01-01

    Computer-assisted tomography (CAT) has been a revolutionary technique in medical radiology. Recently, CAT scanners have come into use as nondestructive testing facilities in various industrial applications. Most applications of the new technology are in the area of petroleum engineering. The majority of this CAT scanner technology has been developed in the U.S.A.; however, several Canadian companies have acquired CAT scanners and are using them for novel commercial as well as research applications. This paper presents a comprehensive review of advances made in CAT scan technology as applied to the petroleum industry.

  10. Development of a KSC test and flight engineering oriented computer language, Phase 1

    Science.gov (United States)

    Case, C. W.; Kinney, E. L.; Gyure, J.

    1970-01-01

    Ten primarily test-oriented computer languages reviewed during the phase 1 study effort are described. Fifty characteristics of ATOLL, ATLAS, and CLASP are compared. Unique characteristics of the other languages, including deficiencies, problems, safeguards, and checking provisions, are identified. Programming aids related to these languages are reported, and the conclusions resulting from this phase of the study are discussed. A glossary and bibliography are included. For the reports on phase 2 of the study, see N71-35027 and N71-35029.

  11. Installation and testing of the ERANOS computer code for fast reactor calculations

    International Nuclear Information System (INIS)

    Gren, Milan

    2010-12-01

    The French ERANOS computer code was acquired and tested by solving benchmark problems. Five problems were calculated: a 1D XZ model, a 1D RZ model, the 3D HEX SNR 300 reactor, and 2D HEX and 3D HEX VVER 440 reactor models. The multigroup diffusion approximation was used. The multiplication coefficients were compared within the first problem, the neutron flux density at the calculation points was obtained within the second problem, and the powers in the various reactor regions and in the assemblies were calculated within the remaining problems. (P.A.)

  12. Low cost phantom for computed radiology; Objeto de teste de baixo custo para radiologia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Travassos, Paulo Cesar B.; Magalhaes, Luis Alexandre G., E-mail: pctravassos@ufrj.br [Universidade do Estado do Rio de Janeiro (IBRGA/UERJ), RJ (Brazil). Laboratorio de Ciencias Radiologicas; Augusto, Fernando M.; Sant' Yves, Thalis L.A.; Goncalves, Elicardo A.S. [Instituto Nacional de Cancer (INCA), Rio de Janeiro, RJ (Brazil); Botelho, Marina A. [Hospital Universitario Pedro Ernesto (UERJ), Rio de Janeiro, RJ (Brazil)

    2012-08-15

    This article presents the results obtained from a low-cost phantom used to analyze computed radiography (CR) equipment. The phantom was constructed to test a few parameters related to image quality, as described in [1-9]. Materials which can be easily purchased were used in the construction of the phantom, with a total cost of approximately US$100.00. A bar pattern was included only to verify the efficacy of the grids in determining spatial resolution, and was not included in the budget because the data were acquired from the grids. (author)

  13. IRIS TOXICOLOGICAL REVIEW AND SUMMARY ...

    Science.gov (United States)

    The Draft Toxicological Review was developed to evaluate both the cancer and non-cancer human health risks from environmental exposure to vinyl chloride. A reference concentration (RfC) and a reference dose (RfD) were developed based upon induction of liver cell polymorphism in a chronic dietary study utilizing Wistar rats. An RfC of 1E-1 mg/m3 and an RfD of 5E-3 mg/kg-day are recommended. On the basis of sufficient evidence for carcinogenicity in human epidemiology studies, vinyl chloride is reaffirmed to be a known human carcinogen. Cancer potencies were derived for oral and inhalation exposure. An oral slope factor of 1.3 per (mg/kg-day) for continuous exposure during adulthood and 2.5 per (mg/kg-day) for continuous lifetime exposure from birth, based upon a chronic dietary study in female Wistar rats, is recommended; an inhalation unit risk of 4.3E-6 per (µg/m3) for continuous exposure during adulthood and 8.7E-6 per (µg/m3) for continuous lifetime exposure from birth is also recommended, based upon exposure of male and female Sprague-Dawley rats and Swiss mice, via inhalation, for a lifetime. A PBPK model was used in the derivation of the RfC, RfD, and cancer potency estimates. Its use is based on the assumption that equal tissue concentrations of the reactive metabolites, chloroethylene oxide or chloroacetaldehyde, at the critical target site will result in equivalent toxicity between species.

  14. Pesticides and Arthropods: Sublethal Effects and Demographic Toxicology

    Directory of Open Access Journals (Sweden)

    Dejan Marčić

    2007-01-01

    refers to an evaluation of individuals, rather than populations, and it is the latter that is required for a more reliable evaluation of the effectiveness of pesticides in real life. A demographic-toxicological approach has therefore been proposed as a way of integrating the effects that a toxicant may cause at the population level, which includes the construction of life tables and the computation of population growth parameters, with the intrinsic rate of increase (rm) as a crucial parameter. Compared to other laboratory toxicity tests, the demographic-toxicological bioassay has been found superior in terms of its capacity to evaluate the overall effects of pesticides, and such an approach to evaluating pesticide effects is crucial for environmentally based programmes of integrated plant protection and a competent evaluation of the ecotoxicological risks of pesticide applications.
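
    The intrinsic rate of increase rm is obtained from life-table data through the Euler-Lotka equation, Σx e^(−r·x)·lx·mx = 1; a small sketch solving it for a made-up arthropod life table:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Hypothetical life table: age x [days], survivorship lx, fecundity mx.
    x  = np.array([10, 12, 14, 16, 18, 20], dtype=float)
    lx = np.array([0.90, 0.85, 0.80, 0.70, 0.50, 0.30])
    mx = np.array([0.0, 2.0, 5.0, 6.0, 4.0, 1.0])  # female offspring/female/day

    def euler_lotka(r: float) -> float:
        # The root of this function is the intrinsic rate of increase rm.
        return float(np.sum(np.exp(-r * x) * lx * mx) - 1.0)

    rm = brentq(euler_lotka, -1.0, 1.0)
    print(f"intrinsic rate of increase rm = {rm:.4f} per day")
    ```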

  15. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  16. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. SASSYS-1 computer code verification with EBR-II test data

    International Nuclear Information System (INIS)

    Warinner, D.K.; Dunn, F.E.

    1985-01-01

    The EBR-II natural circulation experiment, XX08 Test 8A, is simulated with the SASSYS-1 computer code and the results for the latter are compared with published data taken during the transient at selected points in the core. The SASSYS-1 results provide transient temperature and flow responses for all points of interest simultaneously during one run, once such basic parameters as pipe sizes, initial core flows, and elevations are specified. The SASSYS-1 simulation results for the EBR-II experiment XX08 Test 8A, conducted in March 1979, are within the published plant data uncertainties and, thereby, serve as a partial verification/validation of the SASSYS-1 code

  19. Cutting Edge PBPK Models and Analyses: Providing the Basis for Future Modeling Efforts and Bridges to Emerging Toxicology Paradigms

    Directory of Open Access Journals (Sweden)

    Jane C. Caldwell

    2012-01-01

    Full Text Available Physiologically based pharmacokinetic (PBPK) models are used for predictions of internal or target dose from environmental and pharmacologic chemical exposures. Their use in human risk assessment is dependent on the nature of the databases (animal or human) used to develop and test them, and includes extrapolations across species and experimental paradigms, and determination of variability of response within human populations. Integration of state-of-the-science PBPK modeling with emerging computational toxicology models is critical for extrapolation between in vitro exposures, in vivo physiologic exposure, whole organism responses, and long-term health outcomes. This special issue contains papers that can provide the basis for future modeling efforts and provide bridges to emerging toxicology paradigms. In this overview paper, we present an overview of the field and an introduction to these papers, including discussions of model development, best practices, risk-assessment applications of PBPK models, and limitations and bridges of modeling approaches for future applications. Specifically, issues addressed include: (a) increased understanding of human variability of pharmacokinetics and pharmacodynamics in the population, (b) exploration of mode of action hypotheses (MOA), (c) application of biological modeling in the risk assessment of individual chemicals and chemical mixtures, and (d) identification and discussion of uncertainties in the modeling process.
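
    The PBPK models discussed are chemical- and tissue-specific; purely as a generic illustration of the kinetic machinery involved, here is a one-compartment sketch with first-order absorption and elimination (all parameter values invented, and far simpler than a real PBPK model with separate blood, liver, and fat compartments):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    ka, ke, V = 1.0, 0.2, 42.0   # absorption [1/h], elimination [1/h], volume [L]
    dose = 100.0                 # oral dose [mg]

    def rhs(t, y):
        gut, central = y
        return [-ka * gut, ka * gut - ke * central]

    sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], dense_output=True)
    t = np.linspace(0.0, 24.0, 97)
    conc = sol.sol(t)[1] / V     # plasma concentration [mg/L]
    print(f"Cmax ~ {conc.max():.2f} mg/L at t ~ {t[conc.argmax()]:.1f} h")
    ```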

  20. Test Methods for Effects on Organisms:Springtails

    OpenAIRE

    Krogh, P. H.

    1997-01-01

    NORD-UTTE is the Nordic Co-ordination Group for the Development of Test Methods in Toxicology and Ecotoxicology.

  1. Accelerating the Development of 21st-Century Toxicology: Outcome of a Human Toxicology Project Consortium Workshop

    Science.gov (United States)

    Stephens, Martin L.; Barrow, Craig; Andersen, Melvin E.; Boekelheide, Kim; Carmichael, Paul L.; Holsapple, Michael P.; Lafranconi, Mark

    2012-01-01

    The U.S. National Research Council (NRC) report “Toxicity Testing in the 21st Century” calls for a fundamental shift in the way that chemicals are tested for human health effects and evaluated in risk assessments. The new approach would move toward in vitro methods, typically using human cells in a high-throughput context. The in vitro methods would be designed to detect significant perturbations to “toxicity pathways,” i.e., key biological pathways that, when sufficiently perturbed, lead to adverse health outcomes. To explore progress on the report’s implementation, the Human Toxicology Project Consortium hosted a workshop on 9–10 November 2010 in Washington, DC. The Consortium is a coalition of several corporations, a research institute, and a non-governmental organization dedicated to accelerating the implementation of 21st-century toxicology as aligned with the NRC vision. The goal of the workshop was to identify practical and scientific ways to accelerate implementation of the NRC vision. The workshop format consisted of plenary presentations, breakout group discussions, and concluding commentaries. The program faculty was drawn from industry, academia, government, and public interest organizations. Most presentations summarized ongoing efforts to modernize toxicology testing and approaches, each with some overlap with the NRC vision. In light of these efforts, the workshop identified recommendations for accelerating implementation of the NRC vision, including greater strategic coordination and planning across projects (facilitated by a steering group), the development of projects that test the proof of concept for implementation of the NRC vision, and greater outreach and communication across stakeholder communities. PMID:21948868

  2. Banki-Michell Optimal Design by Computational Fluid Dynamics Testing and Hydrodynamic Analysis

    Directory of Open Access Journals (Sweden)

    Tullio Tucciarelli

    2013-04-01

    Full Text Available In hydropower, the exploitation of small power sources requires the use of small turbines that combine efficiency and economy. Banki-Michell turbines represent a possible choice for their simplicity and for their good efficiency under variable load conditions. Several experimental and numerical tests have already been designed for examining the best geometry and optimal design of cross-flow type machines, but a theoretical framework for a sequential design of the turbine parameters, taking full advantage of recently expanded computational capabilities, is still missing. To this aim, after a review of the available criteria for Banki-Michell parameter design, a novel two-step procedure is described. In the first step, the initial and final blade angles, the outer impeller diameter and the shape of the nozzle are selected using a simple hydrodynamic analysis, based on a very strong simplification of reality. In the second step, the inner diameter, as well as the number of blades and their shape, are selected by testing single options using computational fluid dynamics (CFD simulations, starting from the suggested literature values. Good efficiency is attained not only for the design discharge, but also for a large range of variability around the design value.

  3. Impulse-response analysis of planar computed tomography for nondestructive test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Cheon; Kim, Seung Ho; Kim, Ho Kyung [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    It has been reported that the use of radiation imaging such as digital radiography, computed tomography (CT), and digital tomosynthesis (DTS) for nondestructive testing (NDT) is spreading widely. These methods have merits and demerits of their own in terms of image quality and inspection speed; therefore, images from these methods for NDT should have acceptable image quality and be acquired at high speed. In this study, we quantitatively evaluate impulse responses of images reconstructed with filtered backprojection (FBP), which is the most widely used reconstruction method in planar computed tomography (pCT) systems. We first evaluate image performance metrics related to contrast and depth resolution, and then we design a figure of merit including image performance and system parameters, such as tube load and reconstruction speed. The final goal of this study is the application of these methods to nondestructive testing. In order to accomplish it, further study is needed: first, results of the artifact spread function (ASF) for various numbers of views; second, the analysis of the modulation transfer function, noise power spectrum, and detective quantum efficiency for various angular ranges and numbers of views.
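
    Of the metrics named above, the modulation transfer function is commonly obtained as the normalized magnitude of the Fourier transform of the measured impulse (line spread) response; a generic sketch with a synthetic line spread function:

    ```python
    import numpy as np

    # Synthetic line spread function (LSF) sampled at 0.1 mm pitch; a Gaussian
    # stands in for the measured impulse response of the reconstruction.
    dx = 0.1
    xx = np.arange(-5.0, 5.0, dx)
    lsf = np.exp(-0.5 * (xx / 0.4) ** 2)

    # MTF = |FFT(LSF)|, normalized to unity at zero frequency.
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(lsf.size, d=dx)   # [cycles/mm]

    f50 = freqs[np.argmax(mtf < 0.5)]         # frequency where MTF drops to 50%
    print(f"MTF50 ~ {f50:.2f} cycles/mm")
    ```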

  4. Computer Aided Analysis and Prototype Testing of an Improved Biogas Reactor For Biomass System

    Directory of Open Access Journals (Sweden)

    Jeremy (Zheng Li

    2015-05-01

    Full Text Available Alternative fuel resources substituting for conventional fuels are required because the availability of fuel resources is lower than market demand. Many countries around the world must import large amounts of crude oil and petroleum products, and environmental pollution is another serious problem associated with the use of petroleum products. Biogas, with a composition of 54.5% CH4, 39.5% CO2, and 6% other constituents (i.e., H2, N2, H2S, and O2), is a clean green fuel that can substitute for regular petroleum fuels and reduce pollutant emissions. Biogas can be produced by performing enriching, scrubbing, and bottling processes. A purification process can be further applied to remove the pollutants in biogas. The purified biogas analyzed in this research is compressed to 2950 psi while being filled into gas cylinders. The daily biogas production capacity is around 5480 ft3, and the processing efficacy is affected by the surrounding environment and other factors. The design and development of this biogas system are assisted through mathematical analysis, 3D modeling, computational simulation, and prototype testing. Both computer-aided analysis and prototype testing show close results, which validates the feasibility of this biogas system in biomass applications.

  5. The impact of CFD on development test facilities - A National Research Council projection. [computational fluid dynamics

    Science.gov (United States)

    Korkegi, R. H.

    1983-01-01

    The results of a National Research Council study on the effect that advances in computational fluid dynamics (CFD) will have on conventional aeronautical ground testing are reported. Current CFD capabilities include the depiction of linearized inviscid flows and a boundary layer, initial use of Euler codes on supercomputers to automatically generate a grid, research and development on the Reynolds-averaged Navier-Stokes (N-S) equations, and preliminary research on solutions to the full N-S equations. Improvement in the range of CFD usage is dependent on the development of more powerful supercomputers, exceeding even the projected abilities of the NASA Numerical Aerodynamic Simulator (1 BFLOP/sec). Full representation of the Reynolds-averaged N-S equations will require over one million grid points, a computing level predicted to be available in 15 yr. Present capabilities allow identification of data anomalies, confirmation of data accuracy, and assessment of the adequacy of model design in wind tunnel trials. Wall effects and the Reynolds number in any flight regime can be accounted for during simulation. CFD can actually be more accurate than instrumented tests, since all points in a flow can be modeled with CFD, while they cannot all be monitored with instrumentation in a wind tunnel.

  6. Computer-assisted static/dynamic renal imaging: a screening test for renovascular hypertension

    International Nuclear Information System (INIS)

    Keim, H.J.; Johnson, P.M.; Vaughan, E.D. Jr.; Beg, K.; Follett, D.A.; Freeman, L.M.; Laragh, J.H.

    1979-01-01

    Computer-assisted static/dynamic renal imaging with [197Hg]chlormerodrin and [99mTc]pertechnetate was evaluated prospectively as a screening test for renovascular hypertension. Results are reported for 51 patients: 33 with benign essential hypertension and 18 with renovascular hypertension, and for 21 normal controls. All patients underwent renal arteriography. Patients with significant obesity, renal insufficiency, or renoparenchymal disease were excluded from this study. Independent visual analyses of renal gamma images and time-activity transit curves identified 17 of the 18 patients with renovascular hypertension; one study was equivocal. There were five equivocal and three false-positive results in the essential hypertension and normal control groups. The sensitivity of the method was 94% and the specificity 85%. Since the prevalence of the renovascular subset of hypertension is approximately 5%, the predictive value is only 25%. Inclusion of computer-generated data did not improve this result. Accordingly, this method is not recommended as a primary screening test for renovascular hypertension.
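
    The 25% predictive value quoted above follows from Bayes' rule given 94% sensitivity, 85% specificity, and 5% prevalence; a quick check:

    ```python
    sens, spec, prev = 0.94, 0.85, 0.05

    # Positive predictive value via Bayes' rule.
    ppv = (sens * prev) / (sens * prev + (1.0 - spec) * (1.0 - prev))
    print(f"PPV = {ppv:.1%}")   # ~24.8%, matching the ~25% quoted
    ```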

  7. Testing a bedside personal computer Clinical Care Classification System for nursing students using Microsoft Access.

    Science.gov (United States)

    Feeg, Veronica D; Saba, Virginia K; Feeg, Alan N

    2008-01-01

    This study tested a personal computer-based version of the Sabacare Clinical Care Classification System on students' performance of charting patient care plans. The application was designed as an inexpensive alternative to teach electronic charting for use on any laptop or personal computer with Windows and Microsoft Access. The data-based system was tested in a randomized trial with the control group using a type-in text-based-only system also mounted on a laptop at the bedside in the laboratory. Student care plans were more complete using the data-based system over the type-in text version. Students were more positive but not necessarily more efficient with the data-based system. The results demonstrate that the application is effective for improving student nursing care charting using the nursing process and capturing patient care information with a language that is standardized and ready for integration with other patient electronic health record data. It can be implemented on a bedside stand in the clinical laboratory or used to aggregate care planning over a student's clinical experience.

  8. Risk assessment and toxicology databases for health effects assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lu, P.Y.; Wassom, J.S. [Oak Ridge National Laboratory, TN (United States)]

    1990-12-31

    Scientific and technological developments bring unprecedented stress to our environment. Society has to predict the results of potential health risks from technologically based actions that may have serious, far-reaching consequences. The potential for error in making such predictions or assessments is great and multiplies with the increasing size and complexity of the problem being studied. Because of this, the availability and use of reliable data is the key to any successful forecasting effort. Scientific research and development generate new data and information. Much of the scientific data being produced daily is stored in computers for subsequent analysis. This situation provides both an invaluable resource and an enormous challenge. With large amounts of government funds being devoted to health and environmental research programs, and with maintenance of our living environment at stake, we must make maximum use of the resulting data to forecast and avert catastrophic effects. The most efficient means of obtaining the data necessary for assessing the health effects of chemicals is to utilize readily available resources, including the toxicology databases and information files developed at ORNL. To make the most efficient use of the data and information that have already been prepared, attention and resources should be directed toward projects that meticulously evaluate the available data and create specialized, peer-reviewed, value-added databases. Such projects include the National Library of Medicine's Hazardous Substances Data Bank and the U.S. Air Force Installation Restoration Toxicology Guide. These and similar value-added toxicology databases were developed at ORNL and are being maintained and updated. These databases and supporting information files, as well as some data evaluation techniques, are discussed in this paper with special focus on how they are used to assess potential health effects of environmental agents. 19 refs., 5 tabs.

  9. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    Science.gov (United States)

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell...
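
    To make the motif idea concrete, the following minimal Python sketch integrates a single negative-feedback loop, in which a stressor drives production of a species X that represses its own synthesis. The steady-state output varies only about three-fold while the input spans a fifty-fold range, the kind of homeostatic nonlinearity the abstract attributes to network motifs. All rate constants are illustrative assumptions, not values from the paper:

        def steady_state_x(stress, k_prod=10.0, K=1.0, n=4, k_deg=1.0,
                           dt=0.01, t_end=50.0):
            # Forward-Euler integration of dX/dt until the motif settles.
            x = 0.0
            for _ in range(int(t_end / dt)):
                production = stress * k_prod / (1.0 + (x / K) ** n)  # X represses itself
                x += (production - k_deg * x) * dt
            return x

        # A 50-fold range of stressor input compresses to ~3-fold output change.
        for s in (0.1, 0.5, 1.0, 2.0, 5.0):
            print(f"stressor {s:4.1f} -> steady-state X = {steady_state_x(s):.2f}")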

  10. Optimisation and assessment of three modern touch screen tablet computers for clinical vision testing.

    Directory of Open Access Journals (Sweden)

    Humza J Tahir

    Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to 2 other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration.

  11. Optimisation and assessment of three modern touch screen tablet computers for clinical vision testing.

    Science.gov (United States)

    Tahir, Humza J; Murray, Ian J; Parry, Neil R A; Aslam, Tariq M

    2014-01-01

    Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to 2 other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration.
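
    As a rough illustration of the calibration protocol described in the two records above, the sketch below recovers a display's gamma exponent from luminance readings at a handful of gray levels, assuming the usual power-law model L = L_max (v/255)^gamma. The readings are synthetic (generated from a gamma of 2.2 and a peak luminance of 150 cd/m^2), not the authors' measurements:

        import numpy as np

        levels = np.array([32, 64, 96, 128, 160, 192, 224, 255])  # gray levels
        luminance = np.array([1.6, 7.2, 17.5, 32.9, 53.8, 80.3, 112.8, 150.0])  # cd/m^2

        # Linearise the power law: log L = log L_max + gamma * log(v / 255),
        # then recover gamma and L_max by least squares.
        x = np.log(levels / 255.0)
        y = np.log(luminance)
        gamma, log_lmax = np.polyfit(x, y, 1)
        print(f"fitted gamma ~ {gamma:.2f}, L_max ~ {np.exp(log_lmax):.0f} cd/m^2")
        # -> gamma ~ 2.20, L_max ~ 150 cd/m^2 for this synthetic data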

  12. An experimental platform for triaxial high-pressure/high-temperature testing of rocks using computed tomography

    Science.gov (United States)

    Glatz, Guenther; Lapene, Alexandre; Castanier, Louis M.; Kovscek, Anthony R.

    2018-04-01

    A conventional high-pressure/high-temperature experimental apparatus for combined geomechanical and flow-through testing of rocks is not X-ray compatible. Additionally, current X-ray transparent systems for computed tomography (CT) of cm-sized samples are limited to design temperatures below 180 °C. We describe a novel, high-temperature (>400 °C), high-pressure (>2000 psi/>13.8 MPa confining, >10 000 psi/>68.9 MPa vertical load) triaxial core holder suitable for X-ray CT scanning. The new triaxial system permits time-lapse imaging to capture the role of effective stress on fluid distribution and porous medium mechanics. System capabilities are demonstrated using ultimate compressive strength (UCS) tests of Castlegate sandstone. In this case, flooding the porous medium with a radio-opaque gas such as krypton before and after the UCS test improves the discrimination of rock features such as fractures. The results of high-temperature tests are also presented. A Uintah Basin sample of immature oil shale is heated from room temperature to 459 °C under uniaxial compression. The sample contains kerogen that pyrolyzes as temperature rises, releasing hydrocarbons. Imaging reveals the formation of stress bands as well as the evolution and connectivity of the fracture network within the sample as a function of time.
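
    A small sketch related to the quantities in this record: it cross-checks the quoted psi-to-MPa pressure ratings and evaluates a Biot/Terzaghi effective stress, the variable the apparatus is designed to probe. The Biot coefficient and pore pressure are assumed for illustration; they are not values from the paper:

        PSI_TO_MPA = 6.894757e-3  # 1 psi in MPa

        def effective_stress(total_mpa, pore_pressure_mpa, biot_alpha=1.0):
            # Biot form sigma' = sigma - alpha * p; alpha = 1 recovers Terzaghi.
            return total_mpa - biot_alpha * pore_pressure_mpa

        # Cross-check the pressure ratings quoted in the record:
        print(f"2000 psi  = {2000 * PSI_TO_MPA:.1f} MPa")   # 13.8 MPa confining
        print(f"10000 psi = {10000 * PSI_TO_MPA:.1f} MPa")  # 68.9 MPa vertical load

        # Hypothetical state: 13.8 MPa confining against 5 MPa pore pressure.
        print(f"effective confining stress = {effective_stress(13.8, 5.0):.1f} MPa")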

  13. Toxicology research projects directory, 1978. Monthly repts

    International Nuclear Information System (INIS)

    1978-01-01

    The Toxicology Research Projects Directory is a monthly publication of ongoing research projects in toxicology and related fields selected from the files of the Smithsonian Science Information Exchange (SSIE). Each issue lists toxicology-related research projects reported to SSIE during the one-month period preceding that issue. Each of the summaries is categorized by scientific discipline and assigned a unique identification number for cross-referencing from the Directory Indexes--Subject, Investigator, Performing Organization, Supporting Agency, and Master Grant Number. The thirteenth issue consists of Cumulative Indexes for the entire volume, with references to projects in all of the previous twelve issues. The emphasis of the Directory is on the manifestations of the exposure of man and animals to toxic substances. Projects are classified by toxic agents, research orientation, and areas of environmental concern.

  14. Prospects for applying synthetic biology to toxicology

    DEFF Research Database (Denmark)

    Behrendorff, James Bruce Yarnton H; Gillam, Elizabeth M.J.

    2017-01-01

    In the 30 years since the inception of Chemical Research in Toxicology, game-changing advances in chemical and molecular biology, the fundamental disciplines underpinning molecular toxicology, have been made. While these have led to important advances in the study of mechanisms by which chemicals... damage cells and systems, there has been less focus on applying these advances to prediction, detection, and mitigation of toxicity. Over the last ∼15 years, synthetic biology, the repurposing of biological "parts" in systems engineered for useful ends, has been explored in other areas of the biomedical... and life sciences, for such applications as detecting metabolites, drug discovery and delivery, investigating disease mechanisms, improving medical treatment, and producing useful chemicals. These examples provide models for the application of synthetic biology to toxicology, which, for the most part, has...

  15. Resolution of aviation forensic toxicology findings with the aid of DNA profiling.

    Science.gov (United States)

    Chaturvedi, Arvind K; Craft, Kristi J; Kupfer, Doris M; Burian, Dennis; Canfield, Dennis V

    2011-03-20

    Body components of aviation accident fatalities are often scattered, disintegrated, commingled, contaminated, and/or putrefied at accident scenes. These situations may impose difficulties in victim identification and tissue matching. The prevalence of misidentification in relation to aviation accident forensic toxicology has not been well established. Therefore, the Civil Aerospace Medical Institute (CAMI) toxicology database was searched for the 1998-2008 period for those cases wherein DNA profiling was performed to resolve identity issues with the samples submitted to CAMI for toxicological evaluation. During this period, biological samples from the casualties of a total of 3523 accidents were submitted to CAMI. The submitted samples were primarily from pilots. Of the 3523 accidents, at least one fatality had occurred in 3366 (≈ 96%); thus, these accidents were considered fatal accidents. Accordingly, biological samples from 3319 pilots (3319 of the 3366 accidents) were received at CAMI for toxicological testing. Of these 3319 pilots, 3275 (≈ 99%) were fatally injured. DNA profiling was performed in 15 (≈ 0.5%) of the 3319 accidents. The profiling was conducted upon the requests of families in two accidents, accident investigators in three, and pathologists in four; in six accidents, contradictory toxicological findings led CAMI to initiate DNA profiling. The requests made by families and investigators were primarily triggered by inconsistency between the toxicological results and the victims' history of drug use, while those made by pathologists arose from commingling of samples. In three (20%) of the 15 accidents, at least one submitted sample was misidentified or mislabeled. The present study demonstrated that the number of aviation accident cases requiring DNA profiling was small and that DNA profiling was effectively applied in resolving aviation toxicology findings associated with those accidents. Published by Elsevier Ireland Ltd.

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise, and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real-data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps needed to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing-model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data-taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and to make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested in remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid-year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since the last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for site readiness. It also reviewed the policy under which Tier-2s are associated with Physics Groups. Such associations are decided twice per ye...

  20. Quasi-brittle material behavior under cyclic loading: from virtual testing to structural computation

    International Nuclear Information System (INIS)

    Vassaux, Maxime

    2015-01-01

    Macroscopic constitutive laws are developed not only because they allow for large-scale computations but also because they condense the dissipative mechanisms observed at lower scales. Within the framework of this study, the development of such models is carried out in the context of seismic loading, that is to say reverse cyclic loading, applied to quasi-brittle materials, more precisely concrete-like materials. Nowadays, robust and predictive macroscopic constitutive laws are still rare because of the complexity of cracking-related phenomena. Among the challenges to face, the identification of material parameters is far from being the easiest, owing to the lack of experimental data. Indeed, the difficulties in carrying out cyclic tests on concrete-like materials are numerous. To overcome these difficulties, a virtual testing approach based on a refined model is proposed in this study in order to feed continuum models with the missing material parameters. Adopting a microscopic point of view, a representative volume element is seen as a structure. The microscopic model has been developed with the aim of requiring a minimal number of material parameters, which need only basic mechanical tests to be identified. Starting from an existing lattice model developed to deal with monotonic loading, several enhancements have been made to extend its range of applicability, making it capable of dealing with complex multi-axial cyclic loadings. The microscopic model has been validated as a virtual testing machine able to support the identification procedure for continuous constitutive laws. This identification approach has been applied to a new constitutive law developed within the framework of isotropic continuum damage mechanics and accounting for cyclic effects. In particular, the concept of a regularized unilateral effect has been introduced to describe progressive crack closure. The macroscopic model has been calibrated with the help of the aforementioned virtual testing...
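
    To illustrate the kind of law being described, here is a minimal one-dimensional sketch of an isotropic damage model with a regularized unilateral effect: tensile stiffness degrades as damage grows, while compressive stiffness is progressively recovered as cracks close. The smooth blending function and every parameter value are illustrative assumptions, not the thesis's actual formulation:

        import math

        E = 30e3       # Young's modulus [MPa], a typical order for concrete (assumed)
        EPS_0 = 1e-4   # tensile strain threshold at damage onset (assumed)

        def damage(eps_max):
            # Exponential damage evolution driven by the largest tensile strain seen.
            if eps_max <= EPS_0:
                return 0.0
            return 1.0 - (EPS_0 / eps_max) * math.exp(-(eps_max - EPS_0) / EPS_0)

        def stress(eps, d, eps_c=5e-5):
            # Regularised unilateral effect: the weight h -> 1 in tension (damaged
            # stiffness) and h -> 0 in compression (stiffness recovered), with a
            # smooth transition of width eps_c instead of an abrupt switch at 0.
            h = 0.5 * (1.0 + math.tanh(eps / eps_c))
            return (h * (1.0 - d) + (1.0 - h)) * E * eps

        # One tension-compression excursion: damage accumulates in tension, yet
        # the compressive branch progressively regains the undamaged stiffness.
        path = [0.0, 1e-4, 2e-4, 3e-4, 4e-4, 2e-4, 0.0, -2e-4, -4e-4]
        eps_max = 0.0
        for eps in path:
            eps_max = max(eps_max, eps)
            print(f"eps = {eps:+.1e}  sigma = {stress(eps, damage(eps_max)):+8.3f} MPa")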