WorldWideScience

Sample records for computational exposure model

  1. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.
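
    As an illustration of the spreadsheet-level checks described above, the sketch below computes an annual ingestion dose from a unit radionuclide concentration in water; the intake rate and dose conversion factor are illustrative placeholders, not inputs taken from the report.

    # Spreadsheet-level ingestion-dose check for a unit radionuclide
    # concentration in water, of the kind compared in the report.
    # All numerical values are illustrative placeholders.

    def ingestion_dose(conc_bq_per_l, intake_l_per_yr, dcf_sv_per_bq):
        """Annual committed dose (Sv/yr) = concentration x intake x dose factor."""
        return conc_bq_per_l * intake_l_per_yr * dcf_sv_per_bq

    dose = ingestion_dose(conc_bq_per_l=1.0,        # unit concentration, Bq/L
                          intake_l_per_yr=730.0,    # ~2 L/day, assumed
                          dcf_sv_per_bq=2.8e-8)     # placeholder dose factor
    print(f"Committed effective dose: {dose:.2e} Sv/yr")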

  2. A way forward for the development of an exposure computational model to computed tomography dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, C.C., E-mail: cassio.c.ferreira@gmail.co [Nucleo de Fisica, Universidade Federal de Sergipe, Itabaiana-SE, CEP 49500-000 (Brazil); Galvao, L.A., E-mail: lailagalmeida@gmail.co [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil); Vieira, J.W., E-mail: jose.wilson59@uol.com.b [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife-PE, CEP 50740-540 (Brazil); Escola Politecnica de Pernambuco, Universidade de Pernambuco, Recife-PE, CEP 50720-001 (Brazil); Maia, A.F., E-mail: afmaia@ufs.b [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil)

    2011-04-15

    A way forward for the development of an exposure computational model (ECM) for computed tomography (CT) dosimetry is presented. The ECM was developed and validated through comparison with experimental results. For its development, X-ray spectra generator codes were evaluated and the head bow tie filter was modelled through a mathematical equation. EGS4 and EGSnrc were used to simulate the radiation transport in the ECM. Geometrical phantoms commonly used in CT dosimetry were modelled with the IDN software. MAX06 was also used to simulate an adult male patient submitted to CT examinations. The evaluation of the X-ray spectra generator codes in CT dosimetry showed a dependence on tube filtration (or HVL value); in general, as the total filtration (or HVL value) increases, X-raytbc becomes the best X-ray spectra generator code for CT dosimetry. The EGSnrc/X-raytbc combination calculated C_{100,c} values in better agreement with C_{100,c} measured in two different CT scanners: the average percentage difference between calculated and measured values was 8.2% for a Toshiba CT scanner and 10.4% for a GE CT scanner. From measurements of air kerma through a prototype head bow tie filter, a third-order exponential decay equation was found. C_{100,c} and C_{100,p} values calculated by the ECM are in good agreement with values measured on a specific CT scanner. A maximum percentage difference of 2% was found in the PMMA CT head phantoms, demonstrating effective modelling of the head bow tie filter by the equation. The absorbed and effective doses calculated by the ECM developed in this work were compared to those calculated by the ECM of Jones and Shrimpton for an adult male patient. For a head examination the absorbed dose values calculated by the
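
    The bow tie filter in this work is represented by a third-order exponential decay fitted to air-kerma measurements. The sketch below shows what such a fit can look like with scipy; the functional form follows the abstract, but the angular positions and kerma values are synthetic, not the paper's measurements.

    # Fit a third-order exponential decay to air-kerma values measured across
    # a head bow tie filter, as described in the abstract. Data are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def third_order_decay(x, y0, a1, t1, a2, t2, a3, t3):
        return y0 + a1 * np.exp(-x / t1) + a2 * np.exp(-x / t2) + a3 * np.exp(-x / t3)

    angle = np.linspace(0.0, 25.0, 26)        # fan angle, degrees (assumed axis)
    kerma = third_order_decay(angle, 0.1, 0.5, 3.0, 0.3, 8.0, 0.1, 20.0)
    kerma += np.random.default_rng(0).normal(0.0, 0.005, angle.size)

    popt, _ = curve_fit(third_order_decay, angle, kerma,
                        p0=[0.1, 0.5, 3.0, 0.3, 8.0, 0.1, 20.0])
    print("Fitted parameters:", np.round(popt, 3))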

  3. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  4. ADDRESSING HUMAN EXPOSURE TO AIR POLLUTANTS AROUND BUILDINGS IN URBAN AREAS WITH COMPUTATIONAL FLUID DYNAMICS (CFD) MODELS

    Science.gov (United States)

    Computational Fluid Dynamics (CFD) simulations provide a number of unique opportunities for expanding and improving capabilities for modeling exposures to environmental pollutants. The US Environmental Protection Agency's National Exposure Research Laboratory (NERL) has been c...

  5. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  6. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  7. Synthetic digital radiographs using exposure computer models of Voxels / EGS4 Phantoms

    International Nuclear Information System (INIS)

    Kenned, Roberto; Vieira, Jose W.; Lima, Fernando R.A.; Loureiro, Eduardo

    2008-01-01

    The objective of this work is to produce synthetic digital radiographs from voxel phantoms with the use of a computational model of exposure (ECM). The literature describes such a model as consisting of a phantom, a Monte Carlo code and an algorithm for the radiation source. In this work the FAX phantom (Female Adult voXel) was used, together with the EGS4 code system (Electron Gamma Shower, version 4) and an external source similar to that used in diagnostic radiology. The implementation of the ECM creates files with information on the energy deposited by the external source in the voxels of the phantom used, here called EnergiaPorVoxel.dat. These files, along with the segmented phantom (fax.sgi), serve as input data for the DIP (Digital Image Processing) software to build synthetic images based on the deposited energy and on the effective dose. In this way each slice of the stack of images of these synthetic phantoms can be saved; these have been called synthetic digital radiographs. From these it is possible to apply spatial enhancement techniques to increase the contrast or delineate contours between organs and tissues. The practical use of these images is to allow planning of examinations performed in clinics and hospitals, reducing unnecessary exposure of patients caused by errors in radiographic technique. (author)

  8. SOILD: A computer model for calculating the effective dose equivalent from external exposure to distributed gamma sources in soil

    International Nuclear Information System (INIS)

    Chen, S.Y.; LePoire, D.; Yu, C.; Schafetz, S.; Mehta, P.

    1991-01-01

    The SOILD computer model was developed for calculating the effective dose equivalent from external exposure to distributed gamma sources in soil. It is designed to assess external doses under various exposure scenarios that may be encountered in environmental restoration programs. The model's four major functional features address (1) dose versus source depth in soil, (2) shielding by clean cover soil, (3) area of contamination, and (4) nonuniform distribution of sources. The model is also capable of adjusting doses when there are variations in soil densities for both source and cover soils. The model is supported by a database of approximately 500 radionuclides. 4 refs
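
    The cover-soil shielding feature can be pictured with a toy attenuation calculation. The sketch below is not the SOILD algorithm (which draws on a radionuclide database and more complete physics); it only illustrates how a clean cover layer reduces the surface dose rate, using an assumed effective attenuation coefficient.

    # Toy illustration of cover-soil shielding: exponential attenuation of the
    # uncovered dose rate by a clean soil layer. Buildup and source-depth
    # effects are ignored; the attenuation coefficient is an assumed value.
    import math

    def covered_dose_rate(uncovered_rate, mu_per_cm, cover_cm):
        """Relative dose rate behind a clean soil cover of given thickness."""
        return uncovered_rate * math.exp(-mu_per_cm * cover_cm)

    mu = 0.12   # assumed effective linear attenuation coefficient for soil, 1/cm
    for cover in (0.0, 15.0, 30.0, 100.0):
        print(f"{cover:5.0f} cm cover -> relative dose {covered_dose_rate(1.0, mu, cover):.3e}")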

  9. Accidental release of chlorine in Chicago: Coupling of an exposure model with a Computational Fluid Dynamics model

    Science.gov (United States)

    Sanchez, E. Y.; Colman Lerner, J. E.; Porta, A.; Jacovkis, P. M.

    2013-01-01

    The adverse health effects of releases of hazardous substances into the atmosphere continue to be a matter of concern, especially in densely populated urban regions. Emergency responders need estimates of these adverse health effects in the local population to aid planning, emergency response, and recovery efforts. For this purpose, models that predict the transport and dispersion of hazardous materials are as necessary as those that estimate the adverse health effects in the population. In this paper, we present the results obtained by coupling a Computational Fluid Dynamics model, FLACS (FLame ACceleration Simulator), with an exposure model, DDC (Damage Differential Coupling). This coupled model system is applied to a scenario of a hypothetical release of chlorine among obstacles such as buildings, and the results show that it is capable of predicting the atmospheric dispersion of hazardous chemicals and the adverse health effects in the exposed population, to support decision makers in charge of both emergency planning and real-time response. The results also show how knowledge of the influence of obstacles on the trajectory of the toxic cloud and on the diffusion of the transported pollutants, together with dynamic information on the potentially affected population and the associated symptoms, contributes to improving the planning of protection and response measures.

  10. Suitable exposure conditions for CB Throne? New model cone beam computed tomography unit for dental use

    International Nuclear Information System (INIS)

    Tanabe, Kouji; Nishikawa, Keiichi; Yajima, Aya; Mizuta, Shigeru; Sano, Tsukasa; Yajima, Yasutomo; Nakagawa, Kanichi; Kousuge, Yuuji

    2008-01-01

    The CB Throne is a cone beam computed tomography unit for dental use and the smaller version of the CB MercuRay developed by Hitachi Medico Co. We investigated which exposure conditions are suitable for clinical use. Suitable exposure conditions were determined by simple subjective comparisons. The right temporomandibular joint of a head phantom was scanned at all possible combinations of tube voltage (60, 80, 100, 120 kV) and tube current (10, 15 mA). Oblique-sagittal images of the same position were obtained using the multiplanar reconstruction (MPR) function. Images obtained at 120 kV and 15 mA, the highest exposure conditions and therefore certain to produce images of the best quality, were used as the standard. Eight oral radiologists observed each image and the standard image on an LCD monitor. They subjectively compared spatial resolution and noise between each image and the standard image using a 10 cm scale. Evaluation points were obtained from the marked positions on the scales. The Steel method was used to determine significant differences. The images at 60 kV/10 mA and 80 kV/15 mA showed significantly lower evaluation points for spatial resolution. The images at 60 kV/10 mA, 60 kV/15 mA and 80 kV/10 mA showed significantly lower evaluation points for noise. In conclusion, even if the exposure conditions are reduced to 100 kV/10 mA, 100 kV/15 mA or 120 kV/10 mA, the CB Throne will produce images of the best quality. (author)

  11. Computational Exposure Science: An Emerging Discipline to ...

    Science.gov (United States)

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory's (NERL's) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source

  12. Construction of a computational exposure model for dosimetric calculations using the EGS4 Monte Carlo code and voxel phantoms

    International Nuclear Information System (INIS)

    Vieira, Jose Wilson

    2004-07-01

    The MAX phantom has been developed from existing segmented images of a male adult body, in order to achieve a representation as close as possible to the anatomical properties of the reference adult male specified by the ICRP. In computational dosimetry, MAX can simulate the geometry of a human body under exposure to ionizing radiations, internal or external, with the objective of calculating the equivalent dose in organs and tissues for occupational, medical or environmental purposes of radiation protection. This study presents the methodology used to build a new computational exposure model, MAX/EGS4: the geometric construction of the phantom; the development of algorithms for one-directional, divergent, and isotropic radioactive sources; new methods for calculating the equivalent dose in the red bone marrow and in the skin; and the coupling of the MAX phantom with the EGS4 Monte Carlo code. Finally, some radiation protection results, in the form of conversion coefficients between equivalent dose (or effective dose) and free air-kerma for external photon irradiation, are presented and discussed. Comparing these results with similar data from other human phantoms, it is possible to conclude that the MAX/EGS4 coupling is satisfactory for the calculation of equivalent dose in radiation protection. (author)

  13. New format for storage of voxel phantom, and exposure computer model EGS4/MAX to EGSnrc/MASH update

    International Nuclear Information System (INIS)

    Leal Neto, Viriato; Vieira, Jose W.; Lima, Fernando R.A.; Lima, Lindeval F.

    2011-01-01

    In order to estimate the dose absorbed by individuals exposed to ionizing radiation, it is necessary to perform simulations using an exposure computational model (ECM). Such models consist essentially of an anthropomorphic phantom and a Monte Carlo (MC) code. Coupling a voxel phantom with an MC code is a complex process and often amounts to solving a specific problem. This is partly due to the way the voxel phantom is stored on a computer: a substantial amount of disk space is usually required to store a static representation of the human body, and a significant amount of memory is needed for reading and processing it in a given simulation. This paper presents a new way to store the data describing the irradiated geometry (similar to the repeated-structures technique used in the geometry of the MCNP code), reducing by 52% the disk space required for storage compared to the previous format used by the Grupo de Dosimetria Numerica (GDN/CNPq). On the other hand, research in numerical dosimetry leads to constant improvement in the resolution of voxel phantoms and thus to a new requirement, namely, to develop new dose estimates. Therefore, this work also updates the MAX (Male Adult voXel)/EGS4 ECM to the MASH (Male Adult meSH)/EGSnrc ECM and presents examples of dosimetric evaluations using the new ECM. Besides the update of the phantom and the MC code, the algorithm of the source used has also been improved in comparison with previous publications. (author)
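
    The abstract does not spell out the storage format beyond its similarity to MCNP's repeated-structures technique, so the sketch below only illustrates the general idea with a simple run-length encoding of organ IDs, where long runs of identical voxel values compress well.

    # Minimal run-length encoding of a voxel phantom slice: long runs of the
    # same organ ID (e.g. air or soft tissue) compress well. This illustrates
    # the general idea of repeated-structure storage only; it is not the
    # format described in the paper.
    import numpy as np

    def rle_encode(ids):
        """Return (organ_id, run_length) pairs for a 1-D array of voxel IDs."""
        ids = np.asarray(ids).ravel()
        change = np.flatnonzero(np.diff(ids)) + 1
        starts = np.concatenate(([0], change))
        counts = np.diff(np.concatenate((starts, [ids.size])))
        return list(zip(ids[starts].tolist(), counts.tolist()))

    slice_ids = np.array([0] * 120 + [3] * 40 + [7] * 8 + [3] * 32 + [0] * 100)  # toy slice
    encoded = rle_encode(slice_ids)
    print(f"{slice_ids.size} voxels -> {len(encoded)} (id, run) pairs: {encoded}")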

  14. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure computational models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo (MC) code. This paper presents a tutorial for the software Espectro de Raios-X (ERX). This software reads, and performs numerical and graphical analysis of, text files containing diagnostic X-ray spectra for use in the radiation source algorithms of the ECMs of the Grupo de Dosimetria Numerica (GDN). ERX allows the user to select one among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of ERX there are two types of input file: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the probability density function (PDF) and cumulative distribution function (CDF) of a selected spectrum, as well as a table with the values of these functions and of the spectrum. In addition, ERX allows the user to make comparative analyses between the PDF graphics of the two available spectrum catalogues and can perform dosimetric evaluations with the selected spectrum. Software of this kind is an important computational tool for researchers in numerical dosimetry because of the diversity of diagnostic X-ray machines, which implies a highly diverse mass of input data. Because of this, ERX gives the group independence from the origin of the data contained in the catalogues created, making it unnecessary to resort to other sources. (author)
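
    The PDF/CDF tables that ERX builds are the standard normalisation step behind source-sampling algorithms. A minimal sketch of that step is shown below with a made-up spectrum; ERX's actual file formats (mspectra.dat, Geant4 output) are not reproduced.

    # Build the probability density function (PDF) and cumulative distribution
    # function (CDF) of a discrete X-ray spectrum and sample photon energies
    # from it. The spectrum values are illustrative, not taken from ERX.
    import numpy as np

    energy_kev = np.arange(20, 81, 5)                                      # energy bins
    fluence = np.array([1, 4, 9, 15, 20, 22, 20, 16, 11, 7, 4, 2, 1], float)

    pdf = fluence / fluence.sum()      # normalise to unit area
    cdf = np.cumsum(pdf)               # monotone, ends at 1.0

    # Sample photon energies by inverting the CDF (table look-up).
    rng = np.random.default_rng(42)
    samples = energy_kev[np.searchsorted(cdf, rng.random(5))]
    print("PDF sum:", round(pdf.sum(), 6), "| sampled energies (keV):", samples)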

  15. Computational Model Prediction and Biological Validation Using Simplified Mixed Field Exposures for the Development of a GCR Reference Field

    Science.gov (United States)

    Hada, M.; Rhone, J.; Beitman, A.; Saganti, P.; Plante, I.; Ponomarev, A.; Slaba, T.; Patel, Z.

    2018-01-01

    The yield of chromosomal aberrations has been shown to increase in the lymphocytes of astronauts after long-duration missions of several months in space. Chromosome exchanges, especially translocations, are positively correlated with many cancers and are therefore a potential biomarker of cancer risk associated with radiation exposure. Although extensive studies have been carried out on the induction of chromosomal aberrations by low- and high-LET radiation in human lymphocytes, fibroblasts, and epithelial cells exposed in vitro, there is a lack of data on chromosome aberrations induced by low dose-rate chronic exposure and by mixed field beams such as those expected in space. Chromosome aberration studies at NSRL will provide the biological validation needed to extend the computational models over a broader range of experimental conditions (more complicated mixed fields leading up to the galactic cosmic ray (GCR) simulator), helping to reduce uncertainties in radiation quality effects and dose-rate dependence in cancer risk models. These models can then be used to answer some of the open questions regarding requirements for a full GCR reference field, including particle type and number, energy, dose rate, and delivery order. In this study, we designed a simplified mixed field beam with a combination of proton, helium, oxygen, and iron ions with shielding, or proton, helium, oxygen, and titanium without shielding. Human fibroblast cells were irradiated with these mixed field beams as well as with each single beam at acute and chronic dose rates, and chromosome aberrations (CA) were measured with 3-color fluorescence in situ hybridization (FISH) chromosome painting methods. The frequency and types of CA induced at acute and chronic dose rates with single and mixed field beams will be discussed. A computational chromosome and radiation-induced DNA damage model, BDSTRACKS (Biological Damage by Stochastic Tracks), was updated to simulate various types of CA induced by

  16. Computed Radiography Exposure Indices in Mammography | Koen ...

    African Journals Online (AJOL)

    Computed Radiography Exposure Indices in Mammography. L Koen, C Herbst, W Rae. Abstract. Background. Studies indicate that computed radiography (CR) can lead to increased radiation dose to patients. It is therefore important to relate the exposure indicators provided by CR manufacturers to the radiation dose ...

  17. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling to investigate the properties of plant cell walls, which are the source of biofuels and biomaterials. Quantum mechanical models are used to study chemical and electronic properties and processes to reduce barriers.

  18. Predicting Adaptive Response to Fadrozole Exposure: Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...

  19. Predicting Adaptive Response to Fadrozole Exposure: Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...

  20. Development of an external exposure computational model for studying the entrance skin dose in chest and vertebral column radiographs

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Menezes, Claudio J.M.; Vieira, Jose W.

    2014-01-01

    Dosimetric measurements cannot always be made directly in the human body. Such assessments can instead be performed using anthropomorphic models (phantoms) in computational exposure models (ECMs) together with Monte Carlo techniques for virtual simulations. These processing techniques, coupled with more powerful and affordable computers, make the Monte Carlo method one of the most widely used tools in the radiation transport area. In this work, the Monte Carlo EGS4 code was used to develop a computational model of external exposure to study the entrance skin dose for chest and vertebral column radiography, aiming to optimize these practices by reducing doses to patients, the professionals involved and the general public. The results obtained experimentally with a Radcal electrometer, model 9015, associated with the ionization chamber for radiology, model 10X5-6, showed that the proposed computational model can be used in quality assurance programs in radiodiagnostics, evaluating the entrance skin dose when parameters of the radiation beam such as peak kilovoltage (kVp), current-time product (mAs), total filtration and source-to-surface distance (DFS) are varied, optimizing radiodiagnostic practices and meeting current regulations.

  1. Computational modeling and statistical analyses on individual contact rate and exposure to disease in complex and confined transportation hubs

    Science.gov (United States)

    Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.

    2018-01-01

    Crowded transportation hubs such as metro stations are regarded as ideal places for the development and spread of epidemics. However, because of their special features, a complex spatial layout and a confined environment with a large number of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease-spreading dynamics there were little explored in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and contact duration for transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, a Weibull distribution fitted the histogram values of individual-based exposure very well in each case. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and refine existing disease-spreading models.
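
    The two statistical results reported, a Weibull fit to the individual exposure histograms and a linear dependence on average crowd density, can be reproduced in outline as follows; the data below are synthetic stand-ins, not CityFlow output.

    # Fit a Weibull distribution to individual-exposure values and a straight
    # line to mean exposure vs. crowd density, mirroring the analyses in the
    # abstract. All data are synthetic, not simulation results.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    exposure = stats.weibull_min.rvs(c=1.8, scale=12.0, size=500, random_state=rng)

    shape, loc, scale = stats.weibull_min.fit(exposure, floc=0.0)
    print(f"Weibull fit: shape={shape:.2f}, scale={scale:.2f}")

    density = np.array([0.5, 1.0, 1.5, 2.0, 2.5])           # persons/m^2, assumed
    mean_exposure = np.array([4.1, 8.3, 12.2, 16.5, 20.4])  # assumed values
    slope, intercept, r, *_ = stats.linregress(density, mean_exposure)
    print(f"Linear fit: slope={slope:.2f}, intercept={intercept:.2f}, r^2={r**2:.3f}")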

  2. Probabilistic dietary exposure models

    NARCIS (Netherlands)

    Boon, Polly E.; Voet, van der H.

    2015-01-01

    Exposure models are used to calculate the amount of potential harmful chemicals ingested by a human population. Examples of harmful chemicals are residues of pesticides, chemicals entering food from the environment (such as dioxins, cadmium, lead, mercury), and chemicals that are generated via

  3. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  4. Modelling exposure opportunities

    DEFF Research Database (Denmark)

    Sabel, Clive E.; Gatrell, Anthony C.; Löytönen, Markku

    2000-01-01

    This paper addresses the issues surrounding an individual's exposure to potential environmental risk factors, which can be implicated in the aetiology of a disease. We hope to further elucidate the 'lag' or latency period between the initial exposure to potential pathogens and the physical ... boundaries. We use kernel estimation to model space-time patterns. Raised relative risk is assessed by adopting appropriate adjustments for the underlying population at risk, with the use of controls. Significance of the results is assessed using Monte Carlo simulation, and comparisons are made with results...

  5. Thinking Through Computational Exposure as an Evolving Paradigm Shift for Exposure Science: Development and Application of Predictive Models from Big Data

    Science.gov (United States)

    Symposium Abstract: Exposure science has evolved from a time when the primary focus was on measurements of environmental and biological media and the development of enabling field and laboratory methods. The Total Exposure Assessment Method (TEAM) studies of the 1980s were class...

  6. Computing transient exposure to indoor pollutants

    International Nuclear Information System (INIS)

    Owczarski, P.C.; Parker, G.B.

    1983-03-01

    A computer code, CORRAL, is used to compute the transient levels of gases and respirable particulates in a residence. Predictions of time-varying exposure to radon (from the outside air, soil and well water) and respirable particulates (from outside air, wood stove operation and cigarette smoke) for a mother and child over 24 hours are made. Average 24-hour radon exposures are 13 times background (0.75 pCi/l) for the child and 4.5 times background for the mother. Average 24-hour respirable particulate exposures are 5.6 times background (100 μg/m³) for the mother and 4.2 times background for the child. The controlling parameters examined are source location, flow rates between rooms, air infiltration rate and lifestyle. The first three are shown to influence the formation of local pockets of high concentration of radon and particulates, and the last parameter shows that lifestyle patterns ultimately govern individual exposure to these pockets of high concentrations. The code is useful for examination of mitigation measures to reduce exposure and examination of the effects that the controlling parameters have on exposure to indoor pollutants.
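
    The kind of transient indoor-concentration calculation CORRAL performs can be illustrated with a single-zone (well-mixed room) mass balance; the room volume, source rate and air-exchange rate below are assumptions, not parameters from the report.

    # Single-zone mass balance, dC/dt = S/V - a*C, where S is the source rate,
    # V the room volume and a the air-exchange rate. This only illustrates the
    # type of transient calculation described; parameters are assumed.
    S = 200.0   # particulate source rate, ug/h (e.g. wood stove, assumed)
    V = 50.0    # room volume, m^3
    a = 0.5     # air exchange rate, 1/h
    C = 0.0     # initial indoor concentration, ug/m^3
    dt = 0.1    # time step, h

    for step in range(int(24 / dt) + 1):
        if step % 60 == 0:                      # print every 6 hours
            print(f"t = {step * dt:4.1f} h  C = {C:6.2f} ug/m^3")
        C += dt * (S / V - a * C)               # explicit Euler update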

  7. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  8. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    Science.gov (United States)

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  9. AirPEx. Air Pollution Exposure Model

    Energy Technology Data Exchange (ETDEWEB)

    Freijer, J.I.; Bloemen, H.J.Th.; De Loos, S.; Marra, M.; Rombout, P.J.A.; Steentjes, G.M.; Van Veen, M.P.

    1997-12-01

    Analysis of inhalatory exposure to air pollution is an important area of investigation when assessing the risks of air pollution for human health. Inhalatory exposure research focuses on the exposure of humans to air pollutants and the entry of these pollutants into the human respiratory tract. The principal grounds for studying the inhalatory exposure of humans to air pollutants are formed by the need for realistic exposure/dose estimates to evaluate the health effects of these pollutants. The AirPEx (Air Pollution Exposure) model, developed to assess the time- and space-dependence of inhalatory exposure of humans to air pollution, has been implemented for use as a Windows 3.1 computer program. The program is suited to estimating various exposure and dose quantities for individuals, as well as for populations and subpopulations. This report describes the fundamentals of the AirPEx model and provides a user manual for the computer program. Several examples included in the report illustrate the possibilities of the AirPEx model in exposure assessment. The model will be used at the National Institute of Public Health and the Environment as a tool in analysing the current exposure of the Dutch population to air pollutants. 57 refs.
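
    At its core, this type of model combines a time-activity pattern with concentrations and inhalation rates; a minimal sketch of such a time-weighted calculation is given below with assumed values that do not come from AirPEx itself.

    # Time-weighted inhaled dose: sum over activities of concentration x
    # inhalation rate x duration. All values are assumed for illustration.
    activities = [
        # (label, concentration ug/m^3, inhalation rate m^3/h, duration h)
        ("sleeping indoors", 20.0, 0.4, 8.0),
        ("commuting",        60.0, 1.0, 1.5),
        ("office work",      25.0, 0.6, 8.0),
        ("outdoor exercise", 45.0, 2.5, 1.0),
        ("home, evening",    22.0, 0.5, 5.5),
    ]

    inhaled_dose = sum(c * rate * dur for _, c, rate, dur in activities)
    avg_exposure = sum(c * dur for _, c, _, dur in activities) / sum(d for *_, d in activities)
    print(f"24-h inhaled dose: {inhaled_dose:.0f} ug, "
          f"time-weighted exposure: {avg_exposure:.1f} ug/m^3")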

  10. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  11. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  12. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The designed overhead crane system consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion of the specified mechanisms, derived through the Lagrange equation of the second kind, it is possible to build an overhead crane computer model. The computer model was obtained using Matlab software. Transients of coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload sway were obtained with respect to the vertical axis. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.

  13. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  14. Computer program for diagnostic X-ray exposure conversion

    International Nuclear Information System (INIS)

    Lewis, S.L.

    1984-01-01

    Presented is a computer program designed to convert any given set of diagnostic X-ray exposure factors sequentially into another, yielding either an equivalent photographic density or one increased or decreased by a specifiable proportion. In addition to containing the wherewithal with which to manipulate a set of exposure factors, the facility to print hard (paper) copy is included, enabling the results to be pasted into a notebook and used at any time. This program was originally written as an investigative exercise examining the potential use of computers for practical radiographic purposes as conventionally encountered. At the same time, its possible use as an educational tool was borne in mind. To these ends, the current version of the program may be used as a means whereby exposure factors used in a diagnostic department are altered to suit a particular requirement, or may be used in the school as a mathematical model to describe the behaviour of exposure factors under manipulation without exposing patients. (author)
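
    The abstract does not document the program's algorithm, but the conversion it describes can be approximated with two standard radiographic rules of thumb, the inverse-square law for distance and the 15% kVp rule; the sketch below is an assumed reconstruction along those lines, not the original code.

    # Illustrative exposure-factor conversion using two rules of thumb: the
    # inverse-square law for focus-film distance and the "15% rule" (a 15%
    # increase in kVp roughly halves the mAs needed for the same density).
    # This is an assumed reconstruction, not the program described above.
    import math

    def convert_mas(mas, kvp_old, kvp_new, ffd_old_cm, ffd_new_cm, density_factor=1.0):
        """Return the mAs giving an equivalent (or scaled) photographic density."""
        kv_factor = 0.5 ** (math.log(kvp_new / kvp_old) / math.log(1.15))
        distance_factor = (ffd_new_cm / ffd_old_cm) ** 2
        return mas * kv_factor * distance_factor * density_factor

    # Example: 70 kVp, 20 mAs at 100 cm converted to 81 kVp at 150 cm.
    print(f"{convert_mas(20, 70, 81, 100, 150):.1f} mAs")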

  15. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2013-12-01

    We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  18. Development of personnel exposure management system with personal computer

    International Nuclear Information System (INIS)

    Yamato, Ichiro; Yamamoto, Toshiki

    1992-01-01

    In nuclear power plants, large scale personnel exposure management systems have been developed and established by the utilities. Though common at their base, the implementations are plant-specific. Contractors must control their workers' exposures with their own methods and systems. To comply with the utilities' parent systems, contractors' systems tend to differ from plant to plant, which makes it difficult for contractors to design a standard system common to all relevant plants. Under these circumstances, however, we have developed a system that is applicable to various customer utilities with minimal variations, using personal computers with database management and data communication software, at relatively low cost. We hope that this system will develop into the standard model for all Japanese contractors' personnel exposure management systems. (author)

  19. Modeled population exposures to ozone

    Data.gov (United States)

    U.S. Environmental Protection Agency — Population exposures to ozone from APEX modeling for combinations of potential future air quality and demographic change scenarios. This dataset is not publicly...

  20. Computed tomography of the chest with model-based iterative reconstruction using a radiation exposure similar to chest X-ray examination: preliminary observations

    Energy Technology Data Exchange (ETDEWEB)

    Neroladaki, Angeliki; Botsikas, Diomidis; Boudabbous, Sana; Becker, Christoph D.; Montet, Xavier [Geneva University Hospital, Department of Radiology, Geneva 4 (Switzerland)

    2013-02-15

    The purpose of this study was to assess the diagnostic image quality of ultra-low-dose chest computed tomography (ULD-CT) obtained with a radiation dose comparable to chest radiography and reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR), in comparison with standard dose diagnostic CT (SDD-CT) or low-dose diagnostic CT (LDD-CT) reconstructed with FBP alone. Unenhanced chest CT images of 42 patients acquired with ULD-CT were compared with images obtained with SDD-CT or LDD-CT in the same examination. Noise measurements and image quality, based on the conspicuity of chest lesions on all CT data sets, were assessed on a five-point scale. The radiation dose of ULD-CT was 0.16 ± 0.006 mSv compared with 11.2 ± 2.7 mSv for SDD-CT (P < 0.0001) and 2.7 ± 0.9 mSv for LDD-CT. Image quality of ULD-CT increased significantly when using MBIR compared with FBP or ASIR (P < 0.001). ULD-CT reconstructed with MBIR enabled the detection of as many non-calcified pulmonary nodules as seen on SDD-CT or LDD-CT. However, the image quality of ULD-CT was clearly inferior for the characterisation of ground glass opacities or emphysema. Model-based iterative reconstruction allows detection of pulmonary nodules with ULD-CT at a radiation exposure in the range of a posterior-anterior (PA) and lateral chest X-ray. (orig.)

  1. A PHYSIOLOGICALLY BASED COMPUTATIONAL MODEL OF THE BPG AXIS IN FATHEAD MINNOWS: PREDICTING EFFECTS OF ENDOCRINE DISRUPTING CHEMICAL EXPOSURE ON REPRODUCTIVE ENDPOINTS

    Science.gov (United States)

    This presentation describes development and application of a physiologically-based computational model that simulates the brain-pituitary-gonadal (BPG) axis and other endpoints important in reproduction such as concentrations of sex steroid hormones, 17β-estradiol, testosterone, a...

  2. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Chaos Modelling with Computers: Unpredictable Behaviour of Deterministic Systems. Balakrishnan Ramasamy and T S K V Iyer. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 29-39.

  3. Efficient Computation of Exposure Profiles for Counterparty Credit Risk

    NARCIS (Netherlands)

    de Graaf, C.S.L.; Feng, Q.; Kandhai, D.; Oosterlee, C.W.

    2014-01-01

    Three computational techniques for approximation of counterparty exposure for financial derivatives are presented. The exposure can be used to quantify so-called Credit Valuation Adjustment (CVA) and Potential Future Exposure (PFE), which are of utmost importance for modern risk management in the

  4. Efficient computation of exposure profiles for counterparty credit risk

    NARCIS (Netherlands)

    C.S.L. de Graaf (Kees); Q. Feng (Qian); B.D. Kandhai; C.W. Oosterlee (Cornelis)

    2014-01-01

    Three computational techniques for approximation of counterparty exposure for financial derivatives are presented. The exposure can be used to quantify so-called Credit Valuation Adjustment (CVA) and Potential Future Exposure (PFE), which are of utmost importance for modern risk
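
    The expected exposure (EE) and potential future exposure (PFE) profiles mentioned in the two records above are commonly estimated by Monte Carlo; the toy sketch below does this for a single forward contract under geometric Brownian motion and is not a reproduction of the cited papers' techniques.

    # Monte Carlo expected-exposure (EE) and potential-future-exposure (PFE)
    # profile for a long forward contract whose underlying follows geometric
    # Brownian motion. A toy sketch with assumed parameters.
    import numpy as np

    rng = np.random.default_rng(7)
    s0, strike, mu, sigma = 100.0, 100.0, 0.02, 0.25
    times = np.linspace(0.0, 2.0, 25)          # exposure dates, years
    n_paths = 20_000

    dt = np.diff(times)
    z = rng.standard_normal((n_paths, dt.size))
    log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    spots = s0 * np.exp(np.hstack([np.zeros((n_paths, 1)), log_paths]))

    exposure = np.maximum(spots - strike, 0.0)       # only positive value is at risk
    ee = exposure.mean(axis=0)                       # expected exposure profile
    pfe_97 = np.quantile(exposure, 0.97, axis=0)     # 97% potential future exposure
    print("EE at 1y / 2y:", ee[12].round(2), "/", ee[-1].round(2))
    print("PFE(97%) at 2y:", pfe_97[-1].round(2))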

  5. Modelling computer networks

    International Nuclear Information System (INIS)

    Max, G

    2011-01-01

    Traffic in computer networks can be described as a complicated system. Such systems show non-linear features, and simulating their behaviour is also difficult. Before installing network equipment, users want to know the capability of their computer network; they do not want the servers to be overloaded during temporary traffic peaks when more requests arrive than the server is designed for. As a starting point for our study, a non-linear system model of network traffic is established to examine the behaviour of the planned network. The paper presents the setting up of a non-linear simulation model that helps us to observe dataflow problems in networks. This simple model captures the relationship between the competing traffic and the input and output dataflow. In this paper, we also focus on measuring the bottleneck of the network, defined as the difference between the link capacity and the competing traffic volume on the link that limits end-to-end throughput. We validate the model using measurements on a working network. The results show that the initial model estimates well the main behaviours and critical parameters of the network. Based on this study, we propose to develop a new algorithm which experimentally determines and predicts the available parameters of the modelled network.

  6. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  7. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  8. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  9. Sensitivity Analysis of Personal Exposure Assessment Using a Computer Simulated Person

    DEFF Research Database (Denmark)

    Brohus, Henrik; Jensen, H. K.

    2009-01-01

    The paper considers uncertainties related to personal exposure assessment using a computer simulated person. CFD is used to simulate a uniform flow field around a human being to determine the personal exposure to a contaminant source. For various vertical locations of a point contaminant source, three additional factors are varied, namely the velocity, details of the computer simulated person, and the CFD model of the wind channel. The personal exposure is found to be highly dependent on the relative source location. Variation in the range of two orders of magnitude is found. The exposure...

  10. Plasticity modeling & computation

    CERN Document Server

    Borja, Ronaldo I

    2013-01-01

    There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.

  11. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  12. Computer code for calculating personnel doses due to tritium exposures

    International Nuclear Information System (INIS)

    Graham, C.L.; Parlagreco, J.R.

    1977-01-01

    This report describes a computer code written in LLL modified Fortran IV that can be used on a CDC 7600 for calculating personnel doses due to internal exposures to tritium. The code is capable of handling various exposure situations and is also capable of detecting a large variety of data input errors that would lead to errors in the dose assessment. The critical organ is the body water.
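
    The report does not describe the code's internals, but internal dose from tritiated water is conventionally treated as uniform irradiation of body water with first-order elimination; the sketch below uses nominal textbook parameters (about a 10-day biological half-life, 42 kg of body water, a mean beta energy near 5.7 keV) and is only an order-of-magnitude illustration, not the LLL code's method.

    # Rough committed dose from an acute tritiated-water (HTO) intake, assuming
    # uniform mixing in body water and first-order elimination. Parameters are
    # nominal textbook values, not those used by the code described above.
    import math

    intake_bq = 1.0e6          # hypothetical acute intake of HTO, Bq
    half_life_d = 10.0         # assumed biological half-life of body water, days
    body_water_kg = 42.0       # nominal adult body water mass (~1 kg per litre)
    e_beta_j = 9.1e-16         # mean tritium beta energy per decay (~5.7 keV), J

    lam = math.log(2) / (half_life_d * 86400.0)    # elimination constant, 1/s
    total_decays = intake_bq / lam                 # decays occurring in the body
    dose_gy = total_decays * e_beta_j / body_water_kg
    print(f"Committed dose ~ {dose_gy:.2e} Gy (numerically Sv for beta, w_R = 1)")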

  13. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  14. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamic and scale coupling methods.

  15. Significant RF-EMF and thermal levels observed in a computational model of a person with a tibial plate for grounded 40 MHz exposure.

    Science.gov (United States)

    McIntosh, Robert L; Iskra, Steve; Anderson, Vitas

    2014-05-01

    Using numerical modeling, a worst-case scenario is considered when a person with a metallic implant is exposed to a radiofrequency (RF) electromagnetic field (EMF). An adult male standing on a conductive ground plane was exposed to a 40 MHz vertically polarized plane wave field, close to whole-body resonance where maximal induced current flows are expected in the legs. A metal plate (50-300 mm long) was attached to the tibia in the left leg. The findings from this study re-emphasize the need to ensure compliance with limb current reference levels for exposures near whole-body resonance, and not just rely on compliance with ambient electric (E) and magnetic (H) field reference levels. Moreover, we emphasize this recommendation for someone with a tibial plate, as failure to comply may result in significant tissue damage (increases in the localized temperature of 5-10 °C were suggested by the modeling for an incident E-field of 61.4 V/m root mean square (rms)). It was determined that the occupational reference level for limb current (100 mA rms), as stipulated in the 1998 guidelines of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), is satisfied if the plane wave incident E-field levels are no more than 29.8 V/m rms without an implant and 23.4 V/m rms for the model with a 300 mm implant. © 2014 Wiley Periodicals, Inc.

  16. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  17. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  18. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in the design and analysis of chemicals-based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working ... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines ... In this contribution, the concept of template-based modeling is presented and its application is highlighted for the specific case of catalytic membrane fixed-bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse ...

  19. Construction of a computational exposure model for dosimetric calculations using the EGS4 Monte Carlo code and voxel phantoms; Construcao de um modelo computacional de exposicao para calculos dosimetricos utilizando o codigo Monte Carlo EGS4 e fantomas de voxels

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Jose Wilson

    2004-07-15

    The MAX phantom has been developed from existing segmented images of a male adult body, in order to achieve a representation as close as possible to the anatomical properties of the reference adult male specified by the ICRP. In computational dosimetry, MAX can simulate the geometry of a human body under exposure to ionizing radiations, internal or external, with the objective of calculating the equivalent dose in organs and tissues for occupational, medical or environmental radiation protection purposes. This study presents the methodology used to build the new computational exposure model MAX/EGS4: the geometric construction of the phantom; the development of algorithms for unidirectional, divergent and isotropic radioactive sources; new methods for calculating the equivalent dose in the red bone marrow and in the skin; and the coupling of the MAX phantom with the EGS4 Monte Carlo code. Finally, some radiation protection results, in the form of conversion coefficients between equivalent dose (or effective dose) and air kerma free-in-air for external photon irradiation, are presented and discussed. Comparing the presented results with similar data from other human phantoms, it is possible to conclude that the MAX/EGS4 coupling is satisfactory for the calculation of the equivalent dose in radiation protection. (author)
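
    As a numerical aside, the conversion coefficients mentioned above relate a protection quantity (effective dose) to a measurable field quantity (air kerma). The sketch below shows that bookkeeping only; the organ doses, the air kerma value and the reduced set of ICRP-60-style tissue weighting factors are invented placeholders, and this is not the MAX/EGS4 code.

        # Sketch of E = sum_T w_T * H_T and of a dose-per-air-kerma conversion coefficient.
        # Organ equivalent doses would come from the Monte Carlo phantom run; the numbers
        # below, and the reduced weighting-factor set, are placeholders for illustration.
        TISSUE_WEIGHTS = {"gonads": 0.20, "red_marrow": 0.12, "lung": 0.12,
                          "skin": 0.01, "remainder": 0.55}   # illustrative subset, sums to 1.0

        def effective_dose(organ_dose_sv):
            """Effective dose E = sum over tissues of w_T * H_T."""
            return sum(TISSUE_WEIGHTS[t] * h for t, h in organ_dose_sv.items())

        organ_dose_per_photon = {"gonads": 1.1e-12, "red_marrow": 0.9e-12, "lung": 1.0e-12,
                                 "skin": 1.2e-12, "remainder": 1.0e-12}   # Sv per source photon (made up)
        air_kerma_per_photon = 1.0e-12                                    # Gy per source photon (made up)

        e_per_photon = effective_dose(organ_dose_per_photon)
        print("E / K_air conversion coefficient:", e_per_photon / air_kerma_per_photon, "Sv/Gy")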

  20. Development of a spatial stochastic multimedia exposure model to assess population exposure at a regional scale

    International Nuclear Information System (INIS)

    Caudeville, Julien; Bonnard, Roseline; Boudet, Céline; Denys, Sébastien; Govaert, Gérard; Cicolella, André

    2012-01-01

    Analyzing the relationship between the environment and health has become a major focus of public health efforts in France, as evidenced by the national action plans for health and the environment. These plans have identified the following two priorities: (1) identify and manage geographic areas where hotspot exposures are a potential risk to human health; and (2) reduce exposure inequalities. The aim of this study is to develop a spatial stochastic multimedia exposure model for detecting vulnerable populations and analyzing exposure determinants at a fine resolution and regional scale. A multimedia exposure model was developed by INERIS to assess the transfer of substances from the environment to humans through inhalation and ingestion pathways. The RESPIR project adds a spatial dimension by linking GIS (Geographic Information System) to the model. Tools are developed using modeling, spatial analysis and geostatistical methods to build and discretize variables and indicators of interest from different supports and resolutions on a 1-km² regular grid. We applied this model to the risk assessment of exposure to metals (cadmium, lead and nickel) using data from a region in France (Nord-Pas-de-Calais). The considered exposure pathways include inhalation of atmospheric contaminants and ingestion of soil, vegetation, meat, eggs, milk, fish and drinking water. Exposure scenarios are defined for different reference groups (age, dietary properties, and the fraction of food produced locally). The two largest risks correspond to a former industrial site (Metaleurop) and the Lille agglomeration. In these areas, cadmium, vegetation ingestion and soil contamination are the principal determinants of the computed risk. Highlights: ► We present a multimedia exposure model for mapping environmental disparities. ► We perform a risk assessment on a region of France at a fine scale for three metals. ► We examine exposure determinants and detect vulnerable populations. ► The largest ...
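
    The pathway aggregation used by multimedia models of this kind follows a standard average-daily-dose calculation. The sketch below is a generic illustration of that structure, not the INERIS/RESPIR implementation; concentrations, intake rates, body weight and the reference dose are hypothetical placeholders.

        # Generic multimedia exposure sketch: average daily dose (ADD) summed over pathways
        # and a hazard quotient against a reference dose. All numbers are placeholders.
        BODY_WEIGHT_KG = 70.0
        EXPOSURE_FREQ = 350.0 / 365.0          # fraction of the year exposed (assumed)

        # pathway: (medium concentration, intake rate), units chosen so the product is mg/day
        pathways = {
            "soil_ingestion":   (0.5e-3, 50e-3),   # mg/g soil * g soil/day
            "vegetable_intake": (0.02,   0.25),    # mg/kg fresh weight * kg/day
            "drinking_water":   (0.003,  1.5),     # mg/L * L/day
            "inhalation":       (5e-6,   20.0),    # mg/m3 * m3/day
        }

        add_mg_per_kg_day = (sum(conc * rate for conc, rate in pathways.values())
                             * EXPOSURE_FREQ / BODY_WEIGHT_KG)
        REFERENCE_DOSE = 1.0e-3                # mg/kg/day, hypothetical toxicity reference value
        print(f"ADD = {add_mg_per_kg_day:.2e} mg/kg/day, HQ = {add_mg_per_kg_day / REFERENCE_DOSE:.2f}")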

  1. Computational Fluid Dynamics Modeling of Bacillus anthracis ...

    Science.gov (United States)

    Journal Article Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. Four different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Despite the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways of the human at the same air concentration of anthrax spores. This greater deposition of spores in the upper airways in the human resulted in lower penetration and deposition in the tracheobronchial airways and the deep lung than that predict

  2. RFQ modeling computer program

    International Nuclear Information System (INIS)

    Potter, J.M.

    1985-01-01

    The mathematical background for a multiport-network-solving program is described. A method for accurately numerically modeling an arbitrary, continuous, multiport transmission line is discussed. A modification to the transmission-line equations to accommodate multiple rf drives is presented. An improved model for the radio-frequency quadrupole (RFQ) accelerator that corrects previous errors is given. This model permits treating the RFQ as a true eight-port network for simplicity in interpreting the field distribution and ensures that all modes propagate at the same velocity in the high-frequency limit. The flexibility of the multiport model is illustrated by simple modifications to otherwise two-dimensional systems that permit modeling them as linear chains of multiport networks
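
    The transmission-line bookkeeping behind such multiport models can be illustrated, in much reduced form, by cascading two-port ABCD matrices of short line segments. The sketch below is a generic lossless-line cascade with made-up impedance and frequency values; it shows the chaining idea only and is not the eight-port RFQ model described above.

        import numpy as np

        def abcd_lossless_line(z0: float, beta: float, length: float) -> np.ndarray:
            """ABCD (transmission) matrix of one lossless transmission-line segment."""
            bl = beta * length
            return np.array([[np.cos(bl),           1j * z0 * np.sin(bl)],
                             [1j * np.sin(bl) / z0, np.cos(bl)]])

        def cascade(segments) -> np.ndarray:
            """Overall ABCD matrix of a chain of segments (ordered matrix product)."""
            total = np.eye(2, dtype=complex)
            for seg in segments:
                total = total @ seg
            return total

        # model a non-uniform line as many short pieces with slowly varying impedance (illustrative)
        f = 425.0e6                              # RF frequency, Hz (made-up value)
        beta = 2 * np.pi * f / 3.0e8             # phase constant, air-filled line assumed
        pieces = [abcd_lossless_line(50.0 + 0.1 * k, beta, 0.01) for k in range(100)]
        print(cascade(pieces))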

  3. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    GENERAL ARTICLE. Computer Based ... universities, and later did system analysis ... personal computers (PC) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are ... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  4. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into spaceflight-related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short- and long-duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.

  5. Modeling Exposure to Heat Stress with a Simple Urban Model

    Directory of Open Access Journals (Sweden)

    Peter Hoffmann

    2018-01-01

    Full Text Available As a first step in modeling health-related urban well-being (UrbWellth), a mathematical model is constructed that dynamically simulates heat stress exposure of commuters in an idealized city. This is done by coupling the Simple Urban Radiation Model (SURM), which computes the mean radiant temperature (T_mrt), with a newly developed multi-class multi-mode traffic model. Simulation results with parameters chosen for the city of Hamburg for a hot summer day show that commuters are potentially most exposed to heat stress in the early afternoon when T_mrt has its maximum. Varying the morphology with respect to street width and building height shows that a more compact city configuration reduces T_mrt and therefore the exposure to heat stress. The impact of changes in the city structure on traffic is also simulated to determine the time spent outside during the commute. While the time in traffic jams increases for compact cities, the total commuting time decreases due to shorter distances between home and workplace. Concerning adaptation measures, it is shown that increases in the albedo of the urban surfaces lead to an increase in daytime heat stress. Dramatic increases in heat stress exposure are found when both wall and street albedo are increased.
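
    For reference, mean radiant temperature is commonly obtained from the short- and long-wave radiant fluxes absorbed by a person; the sketch below implements that standard relation with generic angular weights and invented flux values, not the SURM parameterisation.

        # Minimal mean radiant temperature (T_mrt) sketch from directional radiant fluxes.
        # Flux values are invented; weights are the usual six-direction factors for a standing person.
        SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m-2 K-4
        A_K, EPS_P = 0.7, 0.97   # shortwave absorption coefficient and emissivity of a person (typical)

        def mean_radiant_temperature(shortwave_wm2, longwave_wm2, weights):
            """T_mrt (deg C) from angularly weighted short- and long-wave fluxes."""
            s_str = sum(w * (A_K * k + EPS_P * l)
                        for w, k, l in zip(weights, shortwave_wm2, longwave_wm2))
            return (s_str / (EPS_P * SIGMA)) ** 0.25 - 273.15

        weights = [0.22, 0.22, 0.22, 0.22, 0.06, 0.06]   # four lateral directions, up, down
        K = [250, 80, 60, 40, 600, 120]      # W/m2 shortwave, illustrative midday street canyon
        L = [420, 430, 430, 425, 380, 470]   # W/m2 longwave, illustrative
        print(f"T_mrt ~ {mean_radiant_temperature(K, L, weights):.1f} deg C")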

  6. Computational modelling in fluid mechanics

    International Nuclear Information System (INIS)

    Hauguel, A.

    1985-01-01

    The modelling of the greatest part of environmental or industrial flow problems gives very similar types of equations. The considerable increase in computing capacity over the last ten years consequently allowed numerical models of growing complexity to be processed. The varied group of computer codes presented are now a complementary tool of experimental facilities to achieve studies in the field of fluid mechanics. Several codes applied in the nuclear field (reactors, cooling towers, exchangers, plumes...) are presented among others [fr

  7. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Chaos is one of the major scientific discoveries of our times. In fact, many scientists ... But there are other natural phenomena that are not predictable though ... characteristics of chaos. ... The position and velocity are all that are needed to determine the motion of a ... a system of equations that modelled the earth's weather ...

  8. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  9. Computer model for ductile fracture

    International Nuclear Information System (INIS)

    Moran, B.; Reaugh, J. E.

    1979-01-01

    A computer model is described for predicting ductile fracture initiation and propagation. The computer fracture model is calibrated by simple and notched round-bar tension tests and a precracked compact tension test. The model is used to predict fracture initiation and propagation in a Charpy specimen, and the results are compared with experiments. The calibrated model provides a correlation between Charpy V-notch (CVN) fracture energy and any measure of fracture toughness, such as J_Ic. A second, simpler empirical correlation was obtained using the energy to initiate fracture in the Charpy specimen rather than the total CVN energy, and the results were compared with the empirical correlation of Rolfe and Novak.

  10. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  11. Computational assessment of pregnant woman models exposed to uniform ELF-magnetic fields: compliance with the European current exposure regulations for the general public and occupational exposures at 50 Hz

    International Nuclear Information System (INIS)

    Liorni, Ilaria; Parazzini, Marta; Fiocchi, Serena; Ravazzani, Paolo; Douglas, Mark; Capstick, Myles; Kuster, Niels

    2016-01-01

    The Recommendation 1999/529/EU and the Directive 2013/35/EU suggest limits for both general public and occupational exposures to extremely low-frequency magnetic fields, but without special limits for pregnant women. This study aimed to assess the compliance of pregnant women to the current regulations, when exposed to uniform MF at 50 Hz (100 μT for EU Recommendation and 1 and 6 mT for EU Directive). For general public, exposure of pregnant women and fetus always resulted in compliance with EU Recommendation. For occupational exposures, (1) Electric fields in pregnant women were in compliance with the Directive, with exposure variations due to fetal posture of 40 % in head tissues, (3) Electric fields in fetal CNS tissues of head are above the ICNIRP 2010 limits for general public at 1 mT (in 7 and 9 months gestational age) and at 6 mT (in all gestational ages). (authors)

  12. Radiation exposure of infants and children in computed tomography

    International Nuclear Information System (INIS)

    Schmidt, T.; Stieve, F.E.

    1980-01-01

    The radiation exposure of infants and small children in Computed Tomography differs in several aspects from that of adults undergoing an identical examination. The surface dose at radiation entrance is higher in children because of a smaller body diameter for the same dose rate at the tube. The critical organ dose in the directly irradiated area is higher in children than in adults. The exposure of organs outside the examined area is also higher in children (because of short intervals) than in adults. The absorbed energy, i.e. the integral dose, however, is lower in children than in adults because of the smaller volume. The differences between conventional procedures and Computed Tomography are greater in children than in adults; here, CT shows higher values than conventional examinations. As a result of the low number of examinations with CT, the contribution towards a genetically significant dose is currently, at least, relatively small [fr]

  13. 76 FR 365 - Exposure Modeling Public Meeting

    Science.gov (United States)

    2011-01-04

    ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OPP-2009-0879; FRL-8860-5] Exposure Modeling Public Meeting. AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: An Exposure Modeling ... classification for ecological risk assessments using aerial photography and GIS data. Dermal contact, movement...

  14. Trust models in ubiquitous computing.

    Science.gov (United States)

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  15. Ch. 33 Modeling: Computational Thermodynamics

    International Nuclear Information System (INIS)

    Besmann, Theodore M.

    2012-01-01

    This chapter considers methods and techniques for computational modeling for nuclear materials with a focus on fuels. The basic concepts for chemical thermodynamics are described and various current models for complex crystalline and liquid phases are illustrated. Also included are descriptions of available databases for use in chemical thermodynamic studies and commercial codes for performing complex equilibrium calculations.

  16. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  17. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in this article. These problems are characteristic of various areas of human activity, in particular problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.

  18. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  19. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

    Energy Technology Data Exchange (ETDEWEB)

    Simicevic, Neven [Center for Applied Physics Studies, Louisiana Tech University, Ruston, LA 71272 (United States)], E-mail: neven@phys.latech.edu

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

  20. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses.

    Science.gov (United States)

    Simicevic, Neven

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

  1. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

    International Nuclear Information System (INIS)

    Simicevic, Neven

    2008-01-01

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW
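
    A one-dimensional update loop is enough to illustrate the FDTD scheme these records refer to. The sketch below propagates a pulse in free space only, with no geometry and no Debye-dispersive tissue model (both of which the eye-exposure study includes), and is not the author's code.

        import numpy as np

        # Bare-bones 1D FDTD (Ez/Hy, free space): illustrates the leapfrog update only.
        c0, eps0, mu0 = 3.0e8, 8.854e-12, 4.0e-7 * np.pi
        nx, nt = 400, 800
        dx = 0.5e-3                # 0.5 mm cells (illustrative)
        dt = dx / (2.0 * c0)       # Courant number 0.5, below the stability limit

        ez = np.zeros(nx)
        hy = np.zeros(nx)

        for n in range(nt):
            # update magnetic field from the spatial difference of E
            hy[:-1] += (ez[1:] - ez[:-1]) * dt / (mu0 * dx)
            # update electric field from the spatial difference of H
            ez[1:] += (hy[1:] - hy[:-1]) * dt / (eps0 * dx)
            # additive Gaussian pulse source near the left edge
            ez[20] += np.exp(-((n - 60) / 20.0) ** 2)

        print("peak |Ez| on the grid after", nt, "steps:", np.abs(ez).max())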

  2. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  3. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this work proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  4. Getting computer models to communicate

    International Nuclear Information System (INIS)

    Caremoli, Ch.; Erhard, P.

    1999-01-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution: to couple the existing validated numerical models together so that they work as one. (authors)

  5. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors, as well as in occupational and in-transit environments. Fate...

  6. Development of a spatial stochastic multimedia exposure model to assess population exposure at a regional scale.

    Science.gov (United States)

    Caudeville, Julien; Bonnard, Roseline; Boudet, Céline; Denys, Sébastien; Govaert, Gérard; Cicolella, André

    2012-08-15

    Analyzing the relationship between the environment and health has become a major focus of public health efforts in France, as evidenced by the national action plans for health and the environment. These plans have identified the following two priorities: (1) identify and manage geographic areas where hotspot exposures are a potential risk to human health; and (2) reduce exposure inequalities. The aim of this study is to develop a spatial stochastic multimedia exposure model for detecting vulnerable populations and analyzing exposure determinants at a fine resolution and regional scale. A multimedia exposure model was developed by INERIS to assess the transfer of substances from the environment to humans through inhalation and ingestion pathways. The RESPIR project adds a spatial dimension by linking GIS (Geographic Information System) to the model. Tools are developed using modeling, spatial analysis and geostatistical methods to build and discretize variables and indicators of interest from different supports and resolutions on a 1-km² regular grid. We applied this model to the risk assessment of exposure to metals (cadmium, lead and nickel) using data from a region in France (Nord-Pas-de-Calais). The considered exposure pathways include inhalation of atmospheric contaminants and ingestion of soil, vegetation, meat, eggs, milk, fish and drinking water. Exposure scenarios are defined for different reference groups (age, dietary properties, and the fraction of food produced locally). The two largest risks correspond to a former industrial site (Metaleurop) and the Lille agglomeration. In these areas, cadmium, vegetation ingestion and soil contamination are the principal determinants of the computed risk. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  8. Reconstructing exposures from biomarkers using exposure-pharmacokinetic modeling--A case study with carbaryl.

    Science.gov (United States)

    Brown, Kathleen; Phillips, Martin; Grulke, Christopher; Yoon, Miyoung; Young, Bruce; McDougall, Robin; Leonard, Jeremy; Lu, Jingtao; Lefew, William; Tan, Yu-Mei

    2015-12-01

    Sources of uncertainty involved in exposure reconstruction for short half-life chemicals were characterized using computational models that link external exposures to biomarkers. Using carbaryl as an example, an exposure model, the Cumulative and Aggregate Risk Evaluation System (CARES), was used to generate time-concentration profiles for 500 virtual individuals exposed to carbaryl. These exposure profiles were used as inputs into a physiologically based pharmacokinetic (PBPK) model to predict urinary biomarker concentrations. The matching dietary intake levels and biomarker concentrations were used to (1) compare three reverse dosimetry approaches based on their ability to predict the central tendency of the intake dose distribution; and (2) identify parameters necessary for a more accurate exposure reconstruction. This study illustrates the trade-offs between non-iterative reverse dosimetry methods, which are fast but less precise, and iterative methods, which are slow but more precise. This study also highlights the necessity of including urine flow rate and the elapsed time between last dose and urine sampling as part of biomarker sample collection, for better interpretation of urinary biomarker data for short biological half-life chemicals. Resolution of these critical data gaps can allow exposure reconstruction methods to better predict population-level intake doses from large biomonitoring studies. Published by Elsevier Inc.
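
    The reverse-dosimetry step can be made concrete with a one-compartment stand-in for the exposure-to-biomarker link. The sketch below is deliberately simplified and is not the CARES/PBPK pipeline of the study; the elimination half-life and urinary excretion fraction are assumed values.

        import math

        # Simplified one-compartment reverse dosimetry (not the CARES or PBPK models above).
        HALF_LIFE_H = 8.0      # assumed elimination half-life of the biomarker's parent, hours
        F_URINE = 0.7          # assumed fraction of the dose excreted in urine as the biomarker
        K_ELIM = math.log(2) / HALF_LIFE_H

        def intake_from_urine(biomarker_ug: float, t_start_h: float, t_end_h: float) -> float:
            """Back-calculate intake (ug) from the biomarker amount in a urine void collected
            over [t_start_h, t_end_h] after a single dose, assuming first-order kinetics."""
            excreted_fraction = F_URINE * (math.exp(-K_ELIM * t_start_h) - math.exp(-K_ELIM * t_end_h))
            return biomarker_ug / excreted_fraction

        # 120 ug of metabolite in a void spanning 10-14 h after exposure; the biomarker amount is
        # urine concentration times void volume, which is why flow/volume and timing data matter.
        print(f"estimated intake ~ {intake_from_urine(120.0, 10.0, 14.0):.0f} ug")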

  9. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential.

    Science.gov (United States)

    Mitchell, Jade; Arnot, Jon A; Jolliet, Olivier; Georgopoulos, Panos G; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A; Vallero, Daniel A

    2013-08-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals was prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e., through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential

    Science.gov (United States)

    Mitchell, Jade; Arnot, Jon A.; Jolliet, Olivier; Georgopoulos, Panos G.; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A.; Vallero, Daniel A.

    2014-01-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals was prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e., through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches. PMID:23707726
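
    The comparative analysis of rankings described in these two records is essentially a rank-correlation exercise. The sketch below compares two hypothetical priority rankings with Spearman and Kendall statistics; the chemical names and scores are made up.

        from scipy.stats import spearmanr, kendalltau

        # Hypothetical exposure-potential scores for the same chemicals from two models.
        chemicals = ["chem_A", "chem_B", "chem_C", "chem_D", "chem_E", "chem_F"]
        model_1 = [0.92, 0.40, 0.75, 0.10, 0.55, 0.30]   # e.g. a far-field multimedia model
        model_2 = [0.80, 0.60, 0.70, 0.05, 0.85, 0.20]   # e.g. a near-field consumer-use model

        rho, p_rho = spearmanr(model_1, model_2)
        tau, p_tau = kendalltau(model_1, model_2)
        print(f"Spearman rho = {rho:.2f} (p = {p_rho:.2f}), Kendall tau = {tau:.2f} (p = {p_tau:.2f})")

        # list the chemicals in model_1 priority order to inspect where the two models disagree
        for name, s1, s2 in sorted(zip(chemicals, model_1, model_2), key=lambda r: -r[1]):
            print(f"{name}: model_1 score {s1:.2f}, model_2 score {s2:.2f}")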

  11. CONSEXPO 3.0, consumer exposure and uptake models

    NARCIS (Netherlands)

    Veen MP van; LBM

    2001-01-01

    The report provides a modelling approach to consumer exposure to chemicals, based on mathematical contact, exposure and uptake models. For each route of exposure, a number of exposure and uptake models are included. A general framework joins the exposure and uptake models selected by the user. By
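
    A typical building block of consumer-exposure frameworks of this kind is a single-zone air model for an indoor release. The sketch below is a generic illustration under assumed room, ventilation, emission and inhalation parameters; it is not the CONSEXPO 3.0 implementation.

        import math

        # Generic single-room inhalation exposure sketch (not the CONSEXPO 3.0 models).
        ROOM_VOLUME_M3 = 20.0        # assumed room volume
        AIR_EXCHANGE_PER_H = 0.6     # assumed ventilation rate, 1/h
        EMISSION_MG_PER_H = 50.0     # assumed constant release rate during product use
        INHALATION_M3_PER_H = 0.8    # assumed inhalation rate of the user

        def room_concentration(t_h: float) -> float:
            """Well-mixed room concentration (mg/m3) at time t for a constant source started at t=0."""
            q = AIR_EXCHANGE_PER_H * ROOM_VOLUME_M3        # ventilation flow, m3/h
            return (EMISSION_MG_PER_H / q) * (1.0 - math.exp(-AIR_EXCHANGE_PER_H * t_h))

        def inhaled_dose(duration_h: float, steps: int = 1000) -> float:
            """Inhaled amount (mg) over the exposure duration, by simple numerical integration."""
            dt = duration_h / steps
            return sum(room_concentration((i + 0.5) * dt) * INHALATION_M3_PER_H * dt
                       for i in range(steps))

        print(f"concentration after 1 h: {room_concentration(1.0):.2f} mg/m3, "
              f"inhaled over 1 h: {inhaled_dose(1.0):.2f} mg")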

  12. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation that includes both discrete-event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is needed to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  13. Cosmic logic: a computational model

    International Nuclear Information System (INIS)

    Vanchurin, Vitaly

    2016-01-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines that halt in finite time and immortal machines that run forever. In the context of eternal inflation, this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps

  14. Development of a spatial stochastic multimedia exposure model to assess population exposure at a regional scale

    Energy Technology Data Exchange (ETDEWEB)

    Caudeville, Julien, E-mail: Julien.CAUDEVILLE@ineris.fr [INERIS (French National Institute for Industrial Environment and Risks), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Joint research unit UMR 6599, Heudiasyc (Heuristic and Diagnoses of Complex Systems), University of Technology of Compiegne and CNRS, Rue du Dr Schweitzer, 60200 Compiegne (France); Bonnard, Roseline, E-mail: Roseline.BONNARD@ineris.fr [INERIS (French National Institute for Industrial Environment and Risks), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Boudet, Celine, E-mail: Celine.BOUDET@ineris.fr [INERIS (French National Institute for Industrial Environment and Risks), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Denys, Sebastien, E-mail: Sebastien.DENYS@ineris.fr [INERIS (French National Institute for Industrial Environment and Risks), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Govaert, Gerard, E-mail: gerard.govaert@utc.fr [Joint research unit UMR 6599, Heudiasyc (Heuristic and Diagnoses of Complex Systems), University of Technology of Compiegne and CNRS, Rue du Dr Schweitzer, 60200 Compiegne (France); Cicolella, Andre, E-mail: Andre.CICOLELLA@ineris.fr [INERIS (French National Institute for Industrial Environment and Risks), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France)

    2012-08-15

    Analyzing the relationship between the environment and health has become a major focus of public health efforts in France, as evidenced by the national action plans for health and the environment. These plans have identified the following two priorities: (1) identify and manage geographic areas where hotspot exposures are a potential risk to human health; and (2) reduce exposure inequalities. The aim of this study is to develop a spatial stochastic multimedia exposure model for detecting vulnerable populations and analyzing exposure determinants at a fine resolution and regional scale. A multimedia exposure model was developed by INERIS to assess the transfer of substances from the environment to humans through inhalation and ingestion pathways. The RESPIR project adds a spatial dimension by linking GIS (Geographic Information System) to the model. Tools are developed using modeling, spatial analysis and geostatistical methods to build and discretize variables and indicators of interest from different supports and resolutions on a 1-km² regular grid. We applied this model to the risk assessment of exposure to metals (cadmium, lead and nickel) using data from a region in France (Nord-Pas-de-Calais). The considered exposure pathways include inhalation of atmospheric contaminants and ingestion of soil, vegetation, meat, eggs, milk, fish and drinking water. Exposure scenarios are defined for different reference groups (age, dietary properties, and the fraction of food produced locally). The two largest risks correspond to a former industrial site (Metaleurop) and the Lille agglomeration. In these areas, cadmium, vegetation ingestion and soil contamination are the principal determinants of the computed risk. Highlights: ► We present a multimedia exposure model for mapping environmental disparities. ► We perform a risk assessment on a region of France at a fine scale for three metals. ► We examine exposure determinants and detect vulnerable populations. ► The largest ...

  15. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
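
    The second-order maximum-noise-entropy model described above is a logistic function of a quadratic form in the stimulus. The sketch below fits such a model to synthetic data by gradient ascent on the Bernoulli log-likelihood; it illustrates the functional form only and is not the authors' analysis code.

        import numpy as np

        rng = np.random.default_rng(0)

        def quadratic_features(s):
            """Constant, linear and pairwise second-order terms of each stimulus vector."""
            n, d = s.shape
            pairs = [s[:, i] * s[:, j] for i in range(d) for j in range(i, d)]
            return np.column_stack([np.ones(n), s] + pairs)

        # synthetic "neuron": spike probability is logistic in a quadratic form of a 2D stimulus
        stim = rng.normal(size=(5000, 2))
        x = quadratic_features(stim)
        true_w = np.array([-1.0, 0.8, -0.5, 0.6, 0.3, -0.4])
        spikes = rng.random(len(x)) < 1.0 / (1.0 + np.exp(-x @ true_w))

        # fit the second-order logistic (maximum-noise-entropy) model by gradient ascent
        w = np.zeros(x.shape[1])
        for _ in range(5000):
            p = 1.0 / (1.0 + np.exp(-x @ w))
            w += 0.1 * x.T @ (spikes - p) / len(x)   # gradient of the log-likelihood

        print("true weights:  ", true_w)
        print("fitted weights:", np.round(w, 2))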

  16. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

    Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of
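
    For readers unfamiliar with the flow law named above, the Drucker-Prager yield condition is usually written as f = sqrt(J2) - (alpha*I1 + k), with I1 the first stress invariant and J2 the second deviatoric invariant. The tiny sketch below evaluates it for a trial stress state; the mapping of alpha and k to cohesion and friction angle follows one common Mohr-Coulomb match and should be checked against the convention used in any particular study.

        import numpy as np

        def drucker_prager(sigma: np.ndarray, cohesion: float, friction_deg: float) -> float:
            """Yield function f = sqrt(J2) - (alpha*I1 + k); f >= 0 indicates yielding.
            alpha, k matched to Mohr-Coulomb on the compressive meridian (one common convention)."""
            phi = np.radians(friction_deg)
            alpha = 2.0 * np.sin(phi) / (np.sqrt(3.0) * (3.0 - np.sin(phi)))
            k = 6.0 * cohesion * np.cos(phi) / (np.sqrt(3.0) * (3.0 - np.sin(phi)))
            i1 = np.trace(sigma)
            dev = sigma - (i1 / 3.0) * np.eye(3)
            j2 = 0.5 * np.sum(dev * dev)
            return np.sqrt(j2) - (alpha * i1 + k)

        # trial principal stresses (compression taken positive), Pa; made-up rock parameters
        sigma = np.diag([40.0e6, 55.0e6, 70.0e6])
        print("f =", drucker_prager(sigma, cohesion=20.0e6, friction_deg=30.0), "Pa (f < 0: elastic)")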

  18. Media Exposure: How Models Simplify Sampling

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    1998-01-01

    In media planning, the distribution of exposures across multiple ad spots in multiple media (print, TV, radio) is crucial to the evaluation of a campaign. If such information were to be sampled, it would only be possible in expensive panel studies (e.g., TV-meter panels). Alternatively, the distribution of exposures may be modelled statistically, using the Beta distribution combined with the Binomial distribution. Examples are given.
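
    The Beta-Binomial exposure model mentioned in this record has a simple closed form; a short sketch with illustrative reach/frequency parameters is given below.

        from math import comb, exp, lgamma

        def log_beta(a: float, b: float) -> float:
            return lgamma(a) + lgamma(b) - lgamma(a + b)

        def beta_binomial_pmf(k: int, n: int, alpha: float, beta: float) -> float:
            """P(k exposures out of n ad spots) under the Beta-Binomial exposure model."""
            return comb(n, k) * exp(log_beta(k + alpha, n - k + beta) - log_beta(alpha, beta))

        # illustrative campaign: 10 spots, Beta(0.6, 1.4) heterogeneity in individual exposure probability
        n, alpha, beta = 10, 0.6, 1.4
        dist = [beta_binomial_pmf(k, n, alpha, beta) for k in range(n + 1)]
        print("P(0 exposures) =", round(dist[0], 3))
        print("P(3+ exposures) =", round(sum(dist[3:]), 3))
        print("mean exposures  =", round(sum(k * p for k, p in enumerate(dist)), 2))  # n*alpha/(alpha+beta)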

  19. Modelling of indoor exposure to nitrogen dioxide in the UK

    Science.gov (United States)

    Dimitroulopoulou, C.; Ashmore, M. R.; Byrne, M. A.; Kinnersley, R. P.

    A dynamic multi-compartment computer model has been developed to describe the physical processes determining indoor pollutant concentrations as a function of outdoor concentrations, indoor emission rates and building characteristics. The model has been parameterised for typical UK homes and workplaces and linked to a time-activity model to calculate exposures for a representative homemaker, schoolchild and office worker, with respect to NO2. The estimates of population exposures, for selected urban and rural sites, are expressed in terms of annual means and frequency of hours in which air quality standards are exceeded. The annual mean exposures are estimated to fall within the range of 5-21 ppb for homes with no source, and 21-27 ppb for homes with gas cooking, varying across sites and population groups. The contribution of outdoor exposure to annual mean NO2 exposure varied from 5 to 24%, that of indoor penetration of outdoor air from 17 to 86% and that of gas cooking from 0 to 78%. The frequency of exposure to 1 h mean concentrations above 150 ppb was very low, except for people cooking with gas.
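
    The core of such indoor models is a mass balance on each compartment. A single-compartment sketch is shown below to make that structure concrete; the penetration, air-exchange, deposition and cooker-emission values are assumed, order-of-magnitude figures, and this is not the model described in the record.

        # Single-compartment indoor NO2 mass balance:
        #   dC_in/dt = P*a*C_out + S/V - (a + k_dep)*C_in
        # All parameter values are assumed, order-of-magnitude figures.
        P = 1.0              # penetration factor for NO2 (dimensionless)
        A = 1.0              # air exchange rate, 1/h
        K_DEP = 0.8          # indoor deposition/decay rate of NO2, 1/h
        V = 250.0            # house volume, m3
        S_COOKING = 50000.0  # NO2 source strength while the gas hob is on, ug/h (assumed)

        def step(c_in, c_out, source, dt_h=0.1):
            dcdt = P * A * c_out + source / V - (A + K_DEP) * c_in
            return c_in + dcdt * dt_h

        c_in, c_out = 10.0, 30.0      # starting indoor level and a fixed outdoor level, ug/m3
        series = []
        for i in range(240):          # simulate 24 h in 0.1-h steps
            cooking = 170 <= i < 180  # gas hob on from 17:00 to 18:00
            c_in = step(c_in, c_out, S_COOKING if cooking else 0.0)
            series.append(c_in)

        print(f"max indoor NO2 ~ {max(series):.0f} ug/m3, 24-h mean ~ {sum(series) / len(series):.0f} ug/m3")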

  20. External exposure model for various geometries of contaminated materials

    International Nuclear Information System (INIS)

    LePoire, D.; Kamboj, S.; Yu, C.

    1996-01-01

    A computational model for external exposure was developed for the U.S. Department of Energy's residual radioactive material guideline computer code (RESRAD) on the basis of dose coefficients from Federal Guidance Report No. 12 and the point-kernel method. This model includes the effects of different materials and exposure distances, as well as source geometry (cover thickness, source depth, area, and shape). A material factor is calculated on the basis of the point-kernel method using material-specific photon cross-section data and buildup factors. The present model was incorporated into RESRAD-RECYCLE (a RESRAD family code used for computing radiological impacts of metal recycling) and is being incorporated into RESRAD-BUILD (a DOE recommended code for computing impacts of building decontamination). The model was compared with calculations performed with the Monte Carlo N-Particle Code (MCNP) and the Microshield code for three different source geometries, three different radionuclides (²³⁴U, ²³⁸U, and ⁶⁰Co, representing low, medium, and high energy, respectively), and five different source materials (iron, copper, aluminum, water, and soil). The comparison shows that results of this model are in very good agreement with MCNP calculations (within 5% for ⁶⁰Co and within 30% for ²³⁸U and ²³⁴U for all materials and source geometries). Significant differences (greater than 100%) were observed with Microshield for thin ²³⁴U sources.
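
    The point-kernel structure behind this model can be sketched briefly: the dose rate at a receptor is a sum over source volume elements of an attenuated 1/(4*pi*r^2) kernel multiplied by a buildup factor. The example below integrates over a small contaminated slab using a deliberately crude linear buildup approximation, a single attenuation medium and made-up constants; it shows the structure only and is not the RESRAD or Federal Guidance Report No. 12 implementation.

        import math

        # Crude point-kernel integration over a contaminated soil slab (structure only).
        MU = 8.0                   # linear attenuation coefficient of the medium, 1/m (assumed)
        K_FLUX_TO_DOSE = 1.0e-12   # fluence-rate-to-dose conversion, arbitrary placeholder units
        S_V = 1.0e6                # volumetric source strength, photons / (m3 s) (assumed)

        def buildup(mu_r: float) -> float:
            """Very rough linear buildup approximation (real codes use tabulated buildup factors)."""
            return 1.0 + mu_r

        def dose_rate(detector_height: float, slab_side: float, slab_depth: float, n: int = 20) -> float:
            total = 0.0
            dx = slab_side / n
            dz = slab_depth / n
            dv = dx * dx * dz
            for i in range(n):
                for j in range(n):
                    for k in range(n):
                        x = -slab_side / 2 + (i + 0.5) * dx
                        y = -slab_side / 2 + (j + 0.5) * dx
                        z = -(k + 0.5) * dz                    # source element below the surface
                        r = math.sqrt(x * x + y * y + (detector_height - z) ** 2)
                        kernel = buildup(MU * r) * math.exp(-MU * r) / (4.0 * math.pi * r * r)
                        total += S_V * kernel * dv
            return K_FLUX_TO_DOSE * total

        print("relative dose rate 1 m above a 5 m x 5 m x 0.15 m slab:", dose_rate(1.0, 5.0, 0.15))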

  1. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  2. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    Science.gov (United States)

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  3. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  4. Development of a new fuzzy exposure model

    International Nuclear Information System (INIS)

    Vasconcelos, Wagner Eustaquio de; Lira, Carlos Alberto Brayner de Oliveira; Texeira, Marcello Goulart

    2007-01-01

    The main topic of this study is the development of a fuzzy exposure model to evaluate the exposure of inhabitants in an area containing uranium, which presents a high natural background. In this work, a fuzzy model was created based on some of the following main factors: activity concentration of uranium, physiological factors and characteristic customs of the exposed individuals. An inference block was created to evaluate some factors of radiation exposure. For this, the fuzzy-AHP (Analytic Hierarchy Process) technique was used and its application was demonstrated for a population subjected to radiation from natural uranium. A Mamdani-type fuzzy model was also created from the opinion of specialists. The Monte Carlo method was used to generate statistics for the input data, and the daily average exposure served as the comparison parameter between the three techniques. The output fuzzy sets were expressed in the form of linguistic variables, such as high, medium and low. In the qualitative analysis, the results obtained were satisfactory in translating the opinion of the specialists. In the quantitative analysis, the values obtained belong to the same fuzzy set as the values found in the literature. The overall results suggest that this type of fuzzy model is highly promising for analysis of exposure to ionizing radiation. (author)
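    A minimal Mamdani-style inference sketch is shown below, with a single input (uranium activity concentration), triangular membership functions and centroid defuzzification; the membership ranges and the one-input rule base are illustrative assumptions, whereas the model in the record combines several factors through AHP weighting.

```python
# Minimal Mamdani-style sketch: one input (activity concentration), one output (exposure
# index), triangular memberships, centroid defuzzification. All ranges are illustrative.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function defined by points (a, b, c)."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

conc = 180.0                                  # input: activity concentration, Bq/kg (assumed)
x_out = np.linspace(0.0, 100.0, 501)          # output universe: exposure index, 0-100

# Rule base: low concentration -> low exposure, medium -> medium, high -> high
fire_low  = tri(conc,   0.0,   0.0, 150.0)
fire_med  = tri(conc, 100.0, 250.0, 400.0)
fire_high = tri(conc, 300.0, 600.0, 600.0)

out_low  = np.minimum(fire_low,  tri(x_out,  0.0,   0.0,  40.0))
out_med  = np.minimum(fire_med,  tri(x_out, 20.0,  50.0,  80.0))
out_high = np.minimum(fire_high, tri(x_out, 60.0, 100.0, 100.0))

aggregated = np.maximum.reduce([out_low, out_med, out_high])
crisp = np.sum(x_out * aggregated) / np.sum(aggregated)       # centroid defuzzification
label = ["low", "medium", "high"][int(np.argmax([fire_low, fire_med, fire_high]))]
print(f"exposure index ~ {crisp:.1f} ({label})")
```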

  5. Development of a new fuzzy exposure model

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Wagner Eustaquio de; Lira, Carlos Alberto Brayner de Oliveira [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Engenharia de Reatores], E-mail: wagner@ufpe.br, E-mail: cabol@ufpe.br; Texeira, Marcello Goulart [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Terrestrial Modelling Group], E-mail: marcellogt@ime.eb.br

    2007-07-01

    The main topic of this study is the development of a fuzzy exposure model to evaluate the exposure of inhabitants in an area containing uranium, which presents a high natural background. In this work, a fuzzy model was created based on some of the following main factors: activity concentration of uranium, physiological factors and characteristic customs of the exposed individuals. An inference block was created to evaluate some factors of radiation exposure. For this, the fuzzy-AHP (Analytic Hierarchy Process) technique was used and its application was demonstrated for a population subjected to radiation from natural uranium. A Mamdani-type fuzzy model was also created from the opinion of specialists. The Monte Carlo method was used to generate statistics for the input data, and the daily average exposure served as the comparison parameter between the three techniques. The output fuzzy sets were expressed in the form of linguistic variables, such as high, medium and low. In the qualitative analysis, the results obtained were satisfactory in translating the opinion of the specialists. In the quantitative analysis, the values obtained belong to the same fuzzy set as the values found in the literature. The overall results suggest that this type of fuzzy model is highly promising for analysis of exposure to ionizing radiation. (author)

  6. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.
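    The first (layout) stage can be illustrated with a toy evolutionary heuristic: the sketch below searches glovebox orderings along a line of equally spaced positions to minimise flow-weighted material movement. The flow matrix and geometry are assumptions for illustration, not data from the Los Alamos study.

```python
# Minimal sketch of an evolutionary heuristic for a line-layout problem: find a glovebox
# ordering that minimises flow-weighted travel distance. The flow matrix is a toy assumption.
import random

flow = [                      # material moves per shift between gloveboxes i and j (assumed)
    [0, 8, 2, 1],
    [8, 0, 5, 1],
    [2, 5, 0, 6],
    [1, 1, 6, 0],
]
n = len(flow)

def cost(order):
    """Total flow-weighted travel distance for gloveboxes placed at positions 0..n-1."""
    pos = {box: slot for slot, box in enumerate(order)}
    return sum(flow[i][j] * abs(pos[i] - pos[j]) for i in range(n) for j in range(i + 1, n))

def evolve(generations=200, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # keep the best half
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.sample(range(n), 2)        # mutate by swapping two positions
            child[a], child[b] = child[b], child[a]
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
print("best ordering:", best, "movement cost:", cost(best))
```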

  7. Computer modeling for optimal placement of gloveboxes

    International Nuclear Information System (INIS)

    Hench, K.W.; Olivas, J.D.; Finch, P.R.

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units

  8. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Full Text Available Quantitative exposure data is important for evaluating toxicity risk and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein binding. Sensitivity analysis identified that both protein binding and pKa (for weak acids and bases) have significant impact on determining partitioning and species-dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance for xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human
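    A simplified stand-in for the partitioning calculation is sketched below using the classical pH-partition (Henderson-Hasselbalch) relation with plasma protein binding; it is not the full tissue-composition Schmitt algorithm cited in the record, and the pKa and unbound-fraction values are illustrative.

```python
# Simplified sketch of plasma-to-saliva partitioning for a passively diffusing chemical,
# using the classical pH-partition relation with plasma protein binding. This is an
# illustrative stand-in for, not a reproduction of, the Schmitt-based algorithm.
def saliva_plasma_ratio(pKa, acid=True, pH_plasma=7.4, pH_saliva=6.8,
                        fu_plasma=0.1, fu_saliva=1.0):
    """Steady-state saliva:plasma concentration ratio for a weak acid or base.

    fu_plasma / fu_saliva are the unbound fractions in each fluid (assumed values).
    Only the un-ionised, unbound species is taken to cross the acinar epithelium.
    """
    if acid:
        ion_p = 1.0 + 10.0 ** (pH_plasma - pKa)
        ion_s = 1.0 + 10.0 ** (pH_saliva - pKa)
    else:  # weak base
        ion_p = 1.0 + 10.0 ** (pKa - pH_plasma)
        ion_s = 1.0 + 10.0 ** (pKa - pH_saliva)
    return (ion_s / ion_p) * (fu_plasma / fu_saliva)

# A heavily protein-bound weak acid (pKa 4.5) partitions poorly into saliva:
print(f"S/P weak acid : {saliva_plasma_ratio(4.5, acid=True,  fu_plasma=0.05):.3f}")
# A weak base (pKa 9.0) is trapped in the slightly more acidic saliva:
print(f"S/P weak base : {saliva_plasma_ratio(9.0, acid=False, fu_plasma=0.8):.2f}")
```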

  9. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics using the identified objectives of computing which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  10. Virtual Reality versus Computer-Aided Exposure Treatments for Fear of Flying

    Science.gov (United States)

    Tortella-Feliu, Miquel; Botella, Cristina; Llabres, Jordi; Breton-Lopez, Juana Maria; del Amo, Antonio Riera; Banos, Rosa M.; Gelabert, Joan M.

    2011-01-01

    Evidence is growing that two modalities of computer-based exposure therapies--virtual reality and computer-aided psychotherapy--are effective in treating anxiety disorders, including fear of flying. However, they have not yet been directly compared. The aim of this study was to analyze the efficacy of three computer-based exposure treatments for…

  11. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17-19, 2015. The book collects innovations in the broad research areas of computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  12. Computer modeling of liquid crystals

    International Nuclear Information System (INIS)

    Al-Barwani, M.S.

    1999-01-01

    In this thesis, we investigate several aspects of the behaviour of liquid crystal molecules near interfaces using computer simulation. We briefly discuss experimental, theoretical and computer simulation studies of some of the liquid crystal interfaces. We then describe three essentially independent research topics. The first of these concerns extensive simulations of a liquid crystal formed by long flexible molecules. We examined the bulk behaviour of the model and its structure. Studies of a film of smectic liquid crystal surrounded by vapour were also carried out. Extensive simulations were also done for a long-molecule/short-molecule mixture; studies were then carried out to investigate the liquid-vapour interface of the mixture. Next, we report the results of large scale simulations of soft spherocylinders of two different lengths. We examined the bulk coexistence of the nematic and isotropic phases of the model. Once the bulk coexistence behaviour was known, properties of the nematic-isotropic interface were investigated. This was done by fitting order parameter and density profiles to appropriate mathematical functions and calculating the biaxial order parameter. We briefly discuss the ordering at the interfaces and make attempts to calculate the surface tension. Finally, in our third project, we study the effects of different surface topographies on creating bistable nematic liquid crystal devices. This was carried out using a model based on the discretisation of the free energy on a lattice. We use simulation to find the lowest energy states and investigate if they are degenerate in energy. We also test our model by studying the Frederiks transition and comparing with analytical and other simulation results. (author)

  13. Computational and Organotypic Modeling of Microcephaly ...

    Science.gov (United States)

    Microcephaly is associated with reduced cortical surface area and ventricular dilations. Many genetic and environmental factors precipitate this malformation, including prenatal alcohol exposure and maternal Zika infection. This complexity motivates the engineering of computational and experimental models to probe the underlying molecular targets, cellular consequences, and biological processes. We describe an Adverse Outcome Pathway (AOP) framework for microcephaly derived from literature on all gene, chemical, or viral effects and brain development. Overlap with NTDs is likely, although the AOP connections identified here focused on microcephaly as the adverse outcome. A query of the Mammalian Phenotype Browser database for ‘microcephaly’ (MP:0000433) returned 85 gene associations; several function in microtubule assembly and the centrosome cycle regulated by microcephalin (MCPH1), a gene for primary microcephaly in humans. The developing ventricular zone is the likely target. In this zone, neuroprogenitor cells (NPCs) self-replicate during the first trimester, setting brain size, followed by neural differentiation of the neocortex. Recent studies with human NPCs confirmed infectivity with Zika virions invoking critical cell loss (apoptosis) of precursor NPCs; similar findings have been shown with fetal alcohol or methylmercury exposure in rodent studies, leading to mathematical models of NPC dynamics in size determination of the ventricular zone. A key event

  14. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  15. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject's true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein binding. Sensitivity analysis of key model parameters specifically identified that both protein binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species-dependent differences based upon physiological variance between

  16. Modeling residential exposure to secondhand tobacco smoke

    Science.gov (United States)

    Klepeis, Neil E.; Nazaroff, William W.

    We apply a simulation model to explore the effect of a house's multicompartment character on a nonsmoker's inhalation exposure to secondhand tobacco smoke (SHS). The model tracks the minute-by-minute movement of people and pollutants among multiple zones of a residence and generates SHS pollutant profiles for each room in response to room-specific smoking patterns. In applying the model, we consider SHS emissions of airborne particles, nicotine, and carbon monoxide in two hypothetical houses, one with a typical four-room layout and one dominated by a single large space. We use scripted patterns of room-to-room occupant movement and a cohort of 5000 activity patterns sampled from a US nationwide survey. The results for scripted and cohort simulation trials indicate that the multicompartment nature of homes, manifested as inter-room differences in pollutant levels and the movement of people among zones, can cause substantial variation in nonsmoker SHS exposure.
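    The exposure bookkeeping itself is simple to sketch: given per-room concentration series and a nonsmoker's room-occupancy track, integrate the concentration actually breathed. The concentration series and activity pattern below are synthetic placeholders, not output of the multizone model in the record.

```python
# Minimal sketch of occupancy-weighted SHS exposure: combine per-room concentration
# series with a scripted room-occupancy track. All series here are synthetic.
import numpy as np

minutes = 8 * 60                               # an 8-hour evening at home
conc = {
    "kitchen": 30 + 120 * np.exp(-np.arange(minutes) / 90.0),   # ug/m^3, decaying SHS peak
    "living":  20 + 60  * np.exp(-np.arange(minutes) / 150.0),
    "bedroom": np.full(minutes, 8.0),
    "outside": np.zeros(minutes),
}
# Nonsmoker's scripted movement: living room, then kitchen, then bedroom
occupancy = (["living"] * 180) + (["kitchen"] * 60) + (["bedroom"] * 240)

breathed = np.array([conc[room][t] for t, room in enumerate(occupancy)])
time_avg = breathed.mean()                      # average exposure concentration, ug/m^3
integrated = breathed.sum() / 60.0              # integrated exposure, ug/m^3 * h
print(f"average exposure {time_avg:.1f} ug/m^3, integrated {integrated:.0f} ug/m^3*h")
```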

  17. PEDIC - A COMPUTER PROGRAM TO ESTIMATE THE EFFECT OF EVACUATION ON POPULATION EXPOSURE FOLLOWING ACUTE RADIONUCLIDE RELEASES TO THE ATMOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D. L.; Peloquin, R. A.

    1981-01-01

    The computer program PEDIC is described for estimation of the effect of evacuation on population exposure. The program uses joint frequency, annual average meteorological data and a simple population evacuation model to estimate exposure reduction due to movement of people away from radioactive plumes following an acute release of activity. Atmospheric dispersion is based on a sector averaged Gaussian model with consideration of plume rise and building wake effects. Appendices to the report provide details of the computer program design, a program listing, input card preparation instructions and sample problems.
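    A sector-averaged Gaussian estimate of the dilution factor chi/Q can be sketched in a few lines; the sigma-z power law and release parameters below are illustrative assumptions, and the plume-rise and building-wake corrections used by PEDIC are omitted.

```python
# Minimal sketch of a sector-averaged Gaussian plume estimate (16 wind sectors, ground
# reflection). The sigma-z fit and release parameters are illustrative assumptions.
import math

def sector_avg_chi_over_q(x_m, u_ms, release_height_m, stability="D"):
    """Sector-averaged dilution factor chi/Q (s/m^3) at downwind distance x."""
    # Very rough sigma_z power law per stability class (assumed fit, not Pasquill-Gifford tables)
    sigma_z_coeff = {"D": (0.14, 0.85), "F": (0.05, 0.80)}
    a, b = sigma_z_coeff[stability]
    sigma_z = a * x_m ** b
    sector_width = 2.0 * math.pi / 16.0                     # 22.5-degree sectors
    return (math.sqrt(2.0 / math.pi) / (sigma_z * u_ms * x_m * sector_width)
            * math.exp(-release_height_m ** 2 / (2.0 * sigma_z ** 2)))

# Annual-average style use would weight this by the joint frequency of
# (sector, stability class, wind speed); here a single combination is evaluated.
chi_q = sector_avg_chi_over_q(x_m=1000.0, u_ms=3.0, release_height_m=30.0)
print(f"chi/Q at 1 km: {chi_q:.2e} s/m^3")
```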

  18. Development of an external exposure computational model for studying the entrance skin dose for chest and spinal column radiographs; Desenvolvimento de um modelo computacional de exposicao externa para estudo da dose de entrada na pele para radiografias de torax e coluna

    Energy Technology Data Exchange (ETDEWEB)

    Muniz, Bianca C.; Menezes, Claudio J.M., E-mail: bianca.cm95@gmail.com, E-mail: cjmm@cnen.gov.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Vieira, Jose W., E-mail: jwvieira@br.inter.net [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco (IFPE), Recife, PE (Brazil)

    2014-07-01

    Dosimetric measurements cannot always be made directly in the human body. Such assessments can instead be performed using anthropomorphic models (phantoms) in computational exposure models (MCE) that apply Monte Carlo techniques in virtual simulations. These processing techniques, coupled with more powerful and affordable computers, make the Monte Carlo method one of the most widely used tools in the radiation transport area. In this work, the Monte Carlo EGS4 code was used to develop a computational model of external exposure to study the entrance skin dose for chest and spinal column radiography, aiming to optimize these practices by reducing doses to patients, the professionals involved and the general public. The results obtained experimentally with a Radcal electrometer, model 9015, associated with the model 10X5-6 ionization chamber for radiology, showed that the proposed computational model can be used in quality assurance programs in diagnostic radiology, evaluating the entrance skin dose when varying parameters of the radiation beam such as peak kilovoltage (kVp), current-time product (mAs), total filtration and source-to-surface distance (DFS), optimizing practices in diagnostic radiology and meeting the current regulation.

  19. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    Science.gov (United States)

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which differed only with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, only including the self-reported factors and software-recorded computer usage patterns, that are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical model only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that only for some
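    The evaluation step (fit a linear prediction model, then report R2, RMSE and low/medium/high classification agreement) can be sketched as follows; the synthetic predictors stand in for the self-reported and software-recorded variables used in the study.

```python
# Minimal sketch: fit a linear prediction model for one exposure outcome, report R^2,
# RMSE and agreement of low/medium/high exposure tertiles. The data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 117
X = np.column_stack([
    np.ones(n),
    rng.normal(6.0, 2.0, n),        # self-reported computer hours/day (assumed predictor)
    rng.normal(55.0, 15.0, n),      # software-recorded keystrokes/min (assumed predictor)
    rng.normal(170.0, 9.0, n),      # body height, cm (assumed predictor)
])
true_beta = np.array([5.0, 0.8, 0.05, 0.02])
y = X @ true_beta + rng.normal(0.0, 1.5, n)     # e.g. a wrist-posture summary, degrees

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
rmse = np.sqrt(np.mean((y - y_hat) ** 2))

def tertile(v):
    """Classify values into low/medium/high (0/1/2) by their own tertiles."""
    return np.digitize(v, np.quantile(v, [1 / 3, 2 / 3]))

agreement = np.mean(tertile(y) == tertile(y_hat))
print(f"R^2 = {r2:.2f}, RMSE = {rmse:.2f}, tertile agreement = {agreement:.2f}")
```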

  20. Real-time exposure fusion on a mobile computer

    CSIR Research Space (South Africa)

    Bachoo, AK

    2009-12-01

    Full Text Available information in these scenarios. An image captured using a short exposure time will not saturate bright image regions, while an image captured with a long exposure time will show more detail in the dark regions. The pixel depth provided by most camera... The auto exposure also creates strong blown-out highlights in the foreground (the grass patch). The short shutter time (Exposure 1) correctly exposes the grass while the long shutter time (Exposure 3) is able to correctly expose the camouflaged dummy...
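    A single-scale sketch of exposure fusion is given below: each pixel of the fused image is a weighted average of the bracketed exposures, with weights favouring well-exposed (mid-range) values. This is a simplified stand-in for the real-time method in the record, without pyramid blending or colour handling.

```python
# Minimal sketch of exposure fusion for a bracketed grayscale stack: per-pixel weighted
# average, with weights favouring mid-range pixel values. Single-scale, synthetic data.
import numpy as np

def fuse_exposures(stack, sigma=0.2):
    """stack: (k, h, w) float array in [0, 1]; returns a fused (h, w) image."""
    # Well-exposedness weight: Gaussian bump centred on mid-grey
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    return (weights * stack).sum(axis=0)

# Synthetic bracket: short, auto and long exposures of the same scene
rng = np.random.default_rng(42)
scene = rng.uniform(0.0, 1.0, (120, 160))          # latent radiance, arbitrary units
short = np.clip(scene * 0.4, 0.0, 1.0)             # dark regions crushed
auto  = np.clip(scene * 1.0, 0.0, 1.0)
long_ = np.clip(scene * 2.5, 0.0, 1.0)             # bright regions blown out
fused = fuse_exposures(np.stack([short, auto, long_]))
print("fused image range:", float(fused.min()), "-", float(fused.max()))
```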

  1. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  2. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  3. Evaluation of automatic exposure control systems in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Reina, Thamiris Rosado

    2014-07-01

    The development of computed tomography (CT) technology has brought wider possibilities to diagnostic medicine. It is a non-invasive method for seeing the human body in detail. As CT applications increase, so does the concern about patient dose, because of the higher dose levels imparted compared with other diagnostic imaging modalities. The radiology community (radiologists, medical physicists and manufacturers) is working together to find the lowest dose level possible without compromising diagnostic image quality. The greatest and relatively new advance for lowering patient dose is the automatic exposure control (AEC) system in CT. These systems are designed to weight the dose distribution along the patient scan and between patients, taking into account their sizes and irradiated tissue densities. Because of the CT scanning geometry, AEC systems are very complex and their functioning is not yet fully understood. This work aims to evaluate the clinical performance of AEC systems and their susceptibilities, to assist in possible patient dose optimizations. The approach used to evaluate the AEC systems of three of the leading CT manufacturers in Brazil (General Electric, Philips and Toshiba) was the extraction of tube current modulation data from the DICOM standard image sequences, measurement and analysis of the image noise of those image sequences, and measurement of the dose distribution along the scan length on the surface and inside of two different phantom configurations. The tube current modulation of each CT scanner, associated with the resulting image quality, provides the performance of the AEC system. The dose distribution measurements provide the dose profile due to the tube current modulation. Dose measurements with the AEC system ON and OFF were made to quantify the impact of these systems on patient dose. The results attained give rise to optimizations in the use of AEC systems and, by consequence, decrease the patient dose without

  4. Evaluation of automatic exposure control systems in computed tomography

    International Nuclear Information System (INIS)

    Reina, Thamiris Rosado

    2014-01-01

    The development of computed tomography (CT) technology has brought wider possibilities to diagnostic medicine. It is a non-invasive method for seeing the human body in detail. As CT applications increase, so does the concern about patient dose, because of the higher dose levels imparted compared with other diagnostic imaging modalities. The radiology community (radiologists, medical physicists and manufacturers) is working together to find the lowest dose level possible without compromising diagnostic image quality. The greatest and relatively new advance for lowering patient dose is the automatic exposure control (AEC) system in CT. These systems are designed to weight the dose distribution along the patient scan and between patients, taking into account their sizes and irradiated tissue densities. Because of the CT scanning geometry, AEC systems are very complex and their functioning is not yet fully understood. This work aims to evaluate the clinical performance of AEC systems and their susceptibilities, to assist in possible patient dose optimizations. The approach used to evaluate the AEC systems of three of the leading CT manufacturers in Brazil (General Electric, Philips and Toshiba) was the extraction of tube current modulation data from the DICOM standard image sequences, measurement and analysis of the image noise of those image sequences, and measurement of the dose distribution along the scan length on the surface and inside of two different phantom configurations. The tube current modulation of each CT scanner, associated with the resulting image quality, provides the performance of the AEC system. The dose distribution measurements provide the dose profile due to the tube current modulation. Dose measurements with the AEC system ON and OFF were made to quantify the impact of these systems on patient dose. The results attained give rise to optimizations in the use of AEC systems and, by consequence, decrease the patient dose without

  5. Predictive Modeling and Computational Toxicology

    Science.gov (United States)

    Embryonic development is orchestrated via a complex series of cellular interactions controlling behaviors such as mitosis, migration, differentiation, adhesion, contractility, apoptosis, and extracellular matrix remodeling. Any chemical exposure that perturbs these cellular proce...

  6. Computer modeling of the gyrocon

    International Nuclear Information System (INIS)

    Tallerico, P.J.; Rankin, J.E.

    1979-01-01

    A gyrocon computer model is discussed in which the electron beam is followed from the gun output to the collector region. The initial beam may be selected either as a uniform circular beam or may be taken from the output of an electron gun simulated by the program of William Herrmannsfeldt. The fully relativistic equations of motion are then integrated numerically to follow the beam successively through a drift tunnel, a cylindrical rf beam deflection cavity, a combination drift space and magnetic bender region, and an output rf cavity. The parameters for each region are variable input data from a control file. The program calculates power losses in the cavity wall, power required by beam loading, power transferred from the beam to the output cavity fields, and electronic and overall efficiency. Space-charge effects are approximated if selected. Graphical displays of beam motions are produced. We discuss the Los Alamos Scientific Laboratory (LASL) prototype design as an example of code usage. The design shows a gyrocon of about two-thirds megawatt output at 450 MHz with up to 86% overall efficiency

  7. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  8. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  9. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  10. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the facts that a computer can be infected by both infected and exposed computers, and that some computers in the susceptible and exposed states can gain immunity through antivirus protection, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spread of computer viruses on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computer population disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, this model has only one viral equilibrium P*, which means that the computer virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
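    A susceptible-exposed-infected sketch with an immune (antivirus-protected) class, integrated by a simple Euler scheme, is given below to illustrate the threshold behaviour described in the record; the rate constants and the R0 expression are for this toy formulation, not taken from the paper.

```python
# Minimal sketch of a susceptible-exposed-infected computer virus model with an immune
# (antivirus-protected) class, integrated by explicit Euler. All rates are illustrative.
beta_se = 0.30    # infection rate of susceptible computers by infected ones
sigma   = 0.20    # rate at which exposed computers become actively infected
gamma   = 0.10    # cleanup rate of infected computers
mu      = 0.05    # rate of gaining immunity via antivirus (from S and E)

# Threshold for this particular toy formulation (near S ~ 1)
R0 = (beta_se / (sigma + mu)) * (sigma / gamma)
print(f"R0 ~ {R0:.2f}")

S, E, I, R = 0.99, 0.0, 0.01, 0.0                 # fractions of the computer population
dt, T = 0.1, 400.0
for _ in range(int(T / dt)):
    new_exposed = beta_se * S * I
    dS = -new_exposed - mu * S
    dE = new_exposed - (sigma + mu) * E
    dI = sigma * E - gamma * I
    dR = mu * (S + E) + gamma * I
    S, E, I, R = S + dt * dS, E + dt * dE, I + dt * dI, R + dt * dR
print(f"infected fraction at t = {T:.0f}: I = {I:.4f}")
```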

  11. The IceCube Computing Infrastructure Model

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online which need to define new computing models to meet the demands on processing and storage requirements of those experiments. We present the hybrid computing model of IceCube, which leverages GRID models with a more flexible direct user model, as an example of a possible solution. In IceCube a central datacenter at UW-Madison serves as Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  12. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics methods and describes the technologies with an emphasis on how they work and their key benefits.

  13. Determinants of Dermal Exposure Relevant for Exposure Modelling in Regulatory Risk Assessment

    NARCIS (Netherlands)

    Marquart, J.; Brouwer, D.H.; Gijsbers, J.H.J.; Links, I.H.M.; Warren, N.; Hemmen, J.J. van

    2003-01-01

    Risk assessment of chemicals requires assessment of the exposure levels of workers. In the absence of adequate specific measured data, models are often used to estimate exposure levels. For dermal exposure only a few models exist, which are not validated externally. In the scope of a large European

  14. Patient dose, gray level and exposure index with a computed radiography system

    Science.gov (United States)

    Silva, T. R.; Yoshimura, E. M.

    2014-02-01

    Computed radiography (CR) is gradually replacing conventional screen-film systems in Brazil. To assess image quality, manufacturers provide the calculation of an exposure index through the acquisition software of the CR system. The objective of this study is to verify whether the CR image can also be used as an evaluator of patient absorbed dose, through a relationship between the entrance skin dose and the exposure index or the gray level values obtained in the image. The CR system used for this study (Agfa model 30-X with NX acquisition software) calculates an exposure index called Log of the Median (lgM), related to the dose absorbed by the imaging plate (IP). The lgM value depends on the average gray level (the Scan Average Level, SAL) of the segmented pixel value histogram of the whole image. A Rando male phantom was used to simulate a human body (chest and head), and was irradiated with an X-ray equipment, using usual radiologic techniques for chest exams. Thermoluminescent dosimeters (LiF, TLD100) were used to evaluate entrance skin dose and exit dose. The results showed a logarithmic relation between entrance dose and SAL in the image center, regardless of the beam filtration. The exposure index varies linearly with the entrance dose, but the slope (angular coefficient) is beam-quality dependent. We conclude that, with an adequate calibration, the CR system can be used to evaluate the patient absorbed dose.
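    The calibration idea can be sketched as a least-squares fit of the logarithmic dose-SAL relation, which is then inverted to estimate dose from a new image's SAL; the (dose, SAL) pairs below are synthetic placeholders, not the measured phantom data.

```python
# Minimal sketch: fit SAL = a + b * ln(dose) by least squares, then invert the fit to
# estimate entrance dose from a measured SAL. The (dose, SAL) pairs are synthetic.
import numpy as np

dose_mgy = np.array([0.05, 0.10, 0.20, 0.40, 0.80, 1.60])        # entrance skin dose, mGy
sal = 450.0 + 310.0 * np.log(dose_mgy / 0.05) + np.random.default_rng(3).normal(0, 8, 6)

# Linear least squares in ln(dose)
A = np.column_stack([np.ones_like(dose_mgy), np.log(dose_mgy)])
(a, b), *_ = np.linalg.lstsq(A, sal, rcond=None)

def dose_from_sal(sal_value):
    """Invert the fitted calibration to estimate entrance dose (mGy) from SAL."""
    return float(np.exp((sal_value - a) / b))

print(f"fit: SAL = {a:.1f} + {b:.1f} ln(dose)")
print(f"estimated dose at SAL = 1200: {dose_from_sal(1200.0):.2f} mGy")
```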

  15. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term ‘pervasive’, e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser’s original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown’s (1997) terms, ‘invisible’ ... the emergence in the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one ...

  16. Modeling Exposure of Mammalian Predatorsto Anticoagulant Rodenticides

    DEFF Research Database (Denmark)

    Topping, Christopher John; Elmeros, Morten

    2016-01-01

    Anticoagulant rodenticides (AR) are a widespread and effective method of rodent control, but there is concern about the impact these may have on non-target organisms, in particular secondary poisoning of rodent predators. Incidence and concentration of AR in free-living predators in Denmark is very high. We postulate that this is caused by widespread exposure due to widespread use of AR in Denmark in and around buildings. To investigate this theory a spatio-temporal model of AR use and mammalian predator distribution was created. This model was supported by data from an experimental study of mice ... as vectors of AR, and was used to evaluate likely impacts of restrictions imposed on AR use in Denmark banning the use of rodenticides for plant protection in woodlands and tree-crops. The model uses input based on frequencies and timings of baiting for rodent control for urban, rural and woodland locations ...

  17. A Spatial Model of the Mere Exposure Effect.

    Science.gov (United States)

    Fink, Edward L.; And Others

    1989-01-01

    Uses a spatial model to examine the relationship between stimulus exposure, cognition, and affect. Notes that this model accounts for cognitive changes that a stimulus may acquire as a result of exposure. Concludes that the spatial model is useful for evaluating the mere exposure effect and that affective change does not require cognitive change.…

  18. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  19. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's

  20. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  1. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  2. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  3. Task-based dermal exposure models for regulatory risk assessment.

    Science.gov (United States)

    Warren, Nicholas D; Marquart, Hans; Christopher, Yvette; Laitinen, Juha; VAN Hemmen, Joop J

    2006-07-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of new measurements of dermal exposure together with detailed contextual information. This article describes the development of a set of generic task-based models capable of predicting potential dermal exposure to both solids and liquids in a wide range of situations. To facilitate modelling of the wide variety of dermal exposure situations, six separate models were made for groupings of exposure scenarios called Dermal Exposure Operation units (DEO units). These task-based groupings cluster exposure scenarios with regard to the expected routes of dermal exposure and the expected influence of exposure determinants. Within these groupings linear mixed effect models were used to estimate the influence of various exposure determinants and to estimate components of variance. The models predict median potential dermal exposure rates for the hands and the rest of the body from the values of relevant exposure determinants. These rates are expressed as mg or µl of product per minute. Using these median potential dermal exposure rates and an accompanying geometric standard deviation allows a range of exposure percentiles to be calculated.
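    Turning a predicted median rate and a geometric standard deviation into exposure percentiles follows directly from the lognormal assumption implied by such mixed-effects models; the median and GSD values in the sketch below are illustrative, not taken from the RISKOFDERM models.

```python
# Minimal sketch: exposure percentiles from a predicted median rate and a GSD, assuming a
# lognormal exposure distribution. The median and GSD values are illustrative.
from math import exp, log
from statistics import NormalDist

def exposure_percentile(median_rate, gsd, percentile):
    """Exposure rate (e.g. mg product/min) at the given percentile of a lognormal."""
    z = NormalDist().inv_cdf(percentile / 100.0)
    return median_rate * exp(z * log(gsd))     # equivalent to median * gsd**z

median_hands = 2.5      # mg/min, predicted median potential dermal exposure to the hands
gsd = 4.0               # geometric standard deviation from the model's variance components
for p in (50, 75, 90, 95):
    print(f"P{p}: {exposure_percentile(median_hands, gsd, p):6.1f} mg/min")
```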

  4. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Only fragments of this article are preserved in the record: "where x increases from zero to N, the saturation value"; "Box 1. Matrix Methods"; "such as Laplace transforms and non-linear differential equations with ..."; "atomic bomb project in the US in the early ..."; "his work on game theory and computers".

  5. A computer system for occupational radiation exposure information

    International Nuclear Information System (INIS)

    Hunt, H.W.

    1984-01-01

    A computerized occupational radiation exposure information system has been developed to maintain records for contractors at the U.S. Department of Energy's (DOE) Hanford Site. The system also allows indexing and retrieval of three million documents from microfilm, thus significantly reducing storage needs and costs. The users are linked by display terminals to the database, permitting them instant access to dosimetry and other radiation exposure information. Personnel dosimeter and bioassay results, radiation training, respirator fittings, skin contaminations and other radiation occurrence records are included in the database. The system yields immediate analysis of radiological exposures for operating management and health physics personnel, thereby releasing personnel to use their time more effectively

  6. Effect of exposure time reduction towards sensitivity and SNR for computed radiography (CR) application in NDT

    International Nuclear Information System (INIS)

    Sapizah Rahim; Khairul Anuar Mohd Salleh; Noorhazleena Azaman; Shaharudin Sayuti; Siti Madiha Muhammad Amir; Arshad Yassin; Abdul Razak Hamzah

    2010-01-01

    A signal-to-noise ratio (SNR) and sensitivity study of a Computed Radiography (CR) system with reduced exposure time is presented. The purposes of this research are to determine the behavior of the SNR for three different thicknesses (step wedge: 5, 10 and 15 mm) and the ability of the CR system to recognize a hole-type penetrameter when the exposure time is decreased by up to 80% relative to the exposure chart (D7; ISOVOLT Titan E). It is shown that the SNR decreases as the exposure time is reduced, but a high-quality image is still achieved at up to an 80% reduction in exposure time. (author)

  7. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  8. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, Victor; Bacigalupo, David; Wills, Gary; De Roure, David

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  9. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  10. Creation of 'Ukrytie' objects computer model

    International Nuclear Information System (INIS)

    Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.

    1999-01-01

    A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for the works related to the stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, for analyzing, forecasting and controlling the processes occurring in the 'Ukrytie' object. Elements and structures of the 'Ukrytie' object were designed and input into the model

  11. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    Full Text Available The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  12. FDTD computation of temperature elevation in the elderly for far-field RF exposures

    International Nuclear Information System (INIS)

    Nomura, Tomoki; Laakso, Ilkka; Hirata, Akimasa

    2014-01-01

    Core temperature elevation and perspiration in younger and older adults are investigated for plane-wave exposure at a whole-body averaged specific absorption rate of 0.4 W kg⁻¹. A numerical Japanese male model is considered together with a thermoregulatory response formula proposed in the authors' previous study. The frequencies considered were 65 MHz and 2 GHz, where the total power absorption in humans becomes maximal for the allowable power density prescribed in the international guidelines. From the computational results obtained here, the core temperature elevation in the older adult model was larger than that in the younger one at both frequencies. The reason for this difference is attributable to differences in sweating, which originate from the difference in the threshold activating sweating and from the decline in sweating in the legs. (authors)

  13. FDTD computation of temperature elevation in the elderly for far-field RF exposures.

    Science.gov (United States)

    Nomura, Tomoki; Laakso, Ilkka; Hirata, Akimasa

    2014-03-01

    Core temperature elevation and perspiration in younger and older adults are investigated for plane-wave exposure at a whole-body averaged specific absorption rate of 0.4 W kg⁻¹. A numerical Japanese male model is considered together with a thermoregulatory response formula proposed in the authors' previous study. The frequencies considered were 65 MHz and 2 GHz, where the total power absorption in humans becomes maximal for the allowable power density prescribed in the international guidelines. From the computational results obtained here, the core temperature elevation in the older adult model was larger than that in the younger one at both frequencies. The reason for this difference is attributable to differences in sweating, which originate from the difference in the threshold activating sweating and from the decline in sweating in the legs.
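    The thermal side of such a computation can be sketched with a one-dimensional explicit finite-difference solution of the Pennes bioheat equation under a uniform SAR of 0.4 W kg⁻¹; the tissue properties, perfusion term and boundary handling below are generic textbook-style assumptions, not the anatomical model or thermoregulatory formula used in the study.

```python
# Minimal 1D sketch: explicit finite-difference solution of the Pennes bioheat equation
# for a tissue slab under a uniform SAR of 0.4 W/kg. All properties are generic assumptions.
import numpy as np

rho, c, k = 1050.0, 3600.0, 0.5      # density (kg/m^3), heat capacity (J/kg/K), conductivity (W/m/K)
b = 2700.0                           # blood perfusion term (W/m^3/K), assumed constant
sar = 0.4                            # whole-body averaged SAR, W/kg
T_blood = 37.0

nx, dx = 100, 0.002                  # 20 cm slab, 2 mm grid
dt = 5.0                             # s; satisfies the explicit stability limit for these values
T = np.full(nx, 37.0)

for _ in range(int(3600.0 / dt)):    # one hour of exposure
    lap = (np.roll(T, 1) - 2.0 * T + np.roll(T, -1)) / dx ** 2
    dT = (k * lap + rho * sar - b * (T - T_blood)) / (rho * c)
    T = T + dt * dT
    T[0] = T[-1] = 37.0              # crude fixed-temperature (core-like) boundaries

print(f"peak tissue temperature elevation after 1 h: {T.max() - 37.0:.3f} K")
```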

  14. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of mathematical properties expressed in the models to animated objects; and finally 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  15. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, and to translate findings obtained in such experiments to the human context. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  16. AirPEx: Air Pollution Exposure Model

    NARCIS (Netherlands)

    Freijer JI; Bloemen HJTh; Loos S de; Marra M; Rombout PJA; Steentjes GM; Veen MP van; LBO

    1997-01-01

    Analysis of inhalatory exposure to air pollution is an important area of investigation when assessing the risks of air pollution for human health. Inhalatory exposure research focuses on the exposure of humans to air pollutants and the entry of these pollutants into the human respiratory tract. The

  17. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. When targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging as the number of available computational models increases, and even more so given the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to choose the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
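
    A minimal sketch of the kind of Information Retrieval ranking described above, using TF-IDF over free-text model annotations and cosine similarity. The model identifiers and annotation strings are hypothetical placeholders, and this is not the BioModels Database implementation.

```python
# Minimal sketch of IR-style ranked retrieval over model annotations.
# The model ids and annotation strings below are hypothetical examples,
# not entries from BioModels Database.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Each "document" concatenates a model's name, MIRIAM-style annotations
# and free-text meta-information into one searchable string.
models = {
    "MODEL_A": "glycolysis Saccharomyces cerevisiae kinetic ODE hexokinase",
    "MODEL_B": "MAPK signalling cascade phosphorylation ultrasensitivity",
    "MODEL_C": "cell cycle budding yeast cyclin CDK ODE",
}

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(models.values())

def rank(query, top_k=3):
    """Return model ids sorted by cosine similarity to the query."""
    q = vectorizer.transform([query])
    scores = cosine_similarity(q, doc_matrix).ravel()
    order = scores.argsort()[::-1][:top_k]
    ids = list(models.keys())
    return [(ids[i], float(scores[i])) for i in order]

print(rank("yeast cell cycle ODE model"))
```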

  18. 49 CFR Appendix A to Part 227 - Noise Exposure Computation

    Science.gov (United States)

    2010-10-01

    The noise dose is computed as D = 100 × (C1/T1 + C2/T2 + ... + Cn/Tn), where Cn indicates the total time of exposure at a specific noise level and Tn indicates the reference duration for that level. The excerpt also contains a table of A-weighted sound levels with their reference durations in hours (e.g. 127 dB(A): 0.047 h; 128 dB(A): 0.041 h; 129 dB(A): 0.036 h; 130 dB(A): 0.031 h). Numbers above 115 dB(A) are italicized to indicate that they are noise levels that are not permitted; the italicized numbers are included only...
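
    A short sketch of the dose computation in the excerpt. The reference-duration relation T = 8 / 2^((L − 90)/5) hours is an assumption that reproduces the tabulated values shown (e.g. about 0.047 h at 127 dB(A)); the regulation itself remains the authoritative source for the table and the 115 dB(A) ceiling.

```python
# Sketch of the noise-dose computation D = 100 * (C1/T1 + ... + Cn/Tn).
# The reference duration T(L) = 8 / 2**((L - 90) / 5) hours reproduces the
# tabulated values in the excerpt (e.g. ~0.047 h at 127 dB(A)); consult the
# regulation itself for the authoritative table and the 115 dB(A) ceiling.

def reference_duration_hours(level_dba: float) -> float:
    """Permissible (reference) exposure duration for a given A-weighted level."""
    return 8.0 / 2.0 ** ((level_dba - 90.0) / 5.0)

def noise_dose(exposures):
    """exposures: iterable of (level_dBA, hours_at_that_level) pairs."""
    return 100.0 * sum(c / reference_duration_hours(l) for l, c in exposures)

# Example: 4 h at 90 dB(A) plus 1 h at 95 dB(A) -> D = 100*(4/8 + 1/4) = 75
print(noise_dose([(90, 4.0), (95, 1.0)]))
print(round(reference_duration_hours(127), 3))  # ~0.047
```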

  19. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  20. Modelling of aircrew radiation exposure during solar particle events

    Science.gov (United States)

    Al Anid, Hani Khaled

    show a very different response during anisotropic events, leading to variations in aircrew radiation doses that may be significant for dose assessment. To estimate the additional exposure due to solar flares, a model was developed using a Monte-Carlo radiation transport code, MCNPX. The model transports an extrapolated particle spectrum based on satellite measurements through the atmosphere using the MCNPX analysis. This code produces the estimated flux at a specific altitude where radiation dose conversion coefficients are applied to convert the particle flux into effective and ambient dose-equivalent rates. A cut-off rigidity model accounts for the shielding effects of the Earth's magnetic field. Comparisons were made between the model predictions and actual flight measurements taken with various types of instruments used to measure the mixed radiation field during Ground Level Enhancements 60 and 65. An anisotropy analysis that uses neutron monitor responses and the pitch angle distribution of energetic solar particles was used to identify particle anisotropy for a solar event in December 2006. In anticipation of future commercial use, a computer code has been developed to implement the radiation dose assessment model for routine analysis. Keywords: Radiation Dosimetry, Radiation Protection, Space Physics.

  1. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  2. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  3. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  4. Air Pollution Exposure Modeling for Health Studies | Science ...

    Science.gov (United States)

    Dr. Michael Breen is leading the development of air pollution exposure models, integrated with novel personal sensor technologies, to improve exposure and risk assessments for individuals in health studies. He is co-investigator for multiple health studies assessing the exposure and effects of air pollutants. These health studies include participants with asthma, diabetes, and coronary artery disease living in various U.S. cities. He has developed, evaluated, and applied novel exposure modeling and time-activity tools, which include the Exposure Model for Individuals (EMI), the GPS-based Microenvironment Tracker (MicroTrac), and the Exposure Tracker model. At this seminar, Dr. Breen will present the development and application of these models to predict individual-level personal exposures to particulate matter (PM) for two health studies in central North Carolina. These health studies examine the association between PM and adverse health outcomes for susceptible individuals. During Dr. Breen's visit, he will also have the opportunity to establish additional collaborations with researchers at Harvard University that may benefit from the use of exposure models for cohort health studies. These research projects that link air pollution exposure with adverse health outcomes benefit EPA by developing model-predicted exposure-dose metrics for individuals in health studies to improve the understanding of exposure-response behavior of air pollutants, and to reduce participant

  5. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  6. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...

  7. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,

  8. Computational aspects of premixing modelling

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, D.F. [Sydney Univ., NSW (Australia). Dept. of Chemical Engineering; Witt, P.J.

    1998-01-01

    In the steam explosion research field there is currently considerable effort being devoted to the modelling of premixing. Practically all models are based on the multiphase flow equations which treat the mixture as an interpenetrating continuum. Solution of these equations is non-trivial and a wide range of solution procedures are in use. This paper addresses some numerical aspects of this problem. In particular, we examine the effect of the differencing scheme for the convective terms and show that use of hybrid differencing can cause qualitatively wrong solutions in some situations. Calculations are performed for the Oxford tests, the BNL tests, a MAGICO test and to investigate various sensitivities of the solution. In addition, we show that use of a staggered grid can result in a significant error which leads to poor predictions of 'melt' front motion. A correction is given which leads to excellent convergence to the analytic solution. Finally, we discuss the issues facing premixing model developers and highlight the fact that model validation is hampered more by the complexity of the process than by numerical issues. (author)
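
    For readers unfamiliar with the hybrid differencing of convective terms discussed above, the sketch below applies the scheme (central differencing for cell Peclet numbers below 2, upwind otherwise) to a generic one-dimensional steady advection-diffusion problem and compares it with the exact solution. This is a textbook illustration under assumed constant properties, not the multiphase premixing model itself.

```python
# 1-D steady advection-diffusion sketch with the hybrid differencing scheme
# (central differencing for cell Peclet number |Pe| < 2, upwind otherwise),
# compared with the exact solution. This is a generic textbook illustration
# of the scheme discussed above, not the multiphase premixing model itself.
import numpy as np

N, L, rho_u, gamma = 21, 1.0, 2.5, 0.1   # nodes, length, convective flux, diffusivity
dx = L / (N - 1)
F, D = rho_u, gamma / dx                 # convective and diffusive conductances

aE = max(-F, D - F / 2.0, 0.0)           # hybrid scheme neighbour coefficients
aW = max(F, D + F / 2.0, 0.0)
aP = aE + aW

A = np.zeros((N, N)); b = np.zeros(N)
A[0, 0] = A[-1, -1] = 1.0; b[0], b[-1] = 0.0, 1.0   # phi(0) = 0, phi(L) = 1
for i in range(1, N - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -aW, aP, -aE

phi = np.linalg.solve(A, b)
x = np.linspace(0.0, L, N)
Pe = rho_u * L / gamma
exact = (np.exp(Pe * x / L) - 1.0) / (np.exp(Pe) - 1.0)
print(f"cell Peclet number = {rho_u * dx / gamma:.2f}, "
      f"max |error| = {np.max(np.abs(phi - exact)):.3f}")
```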

  9. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    particle flow, and numerical techniques allowing the modeling of particles suspended in a fluid. The general concept behind each family of techniques is described. Pros and cons for each technique are given along with examples and references to applications to fresh cementitious materials....

  10. The ability of non-computer tasks to increase biomechanical exposure variability in computer-intensive office work.

    Science.gov (United States)

    Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Nogueira, Helen Cristina; Oliveira, Ana Beatriz

    2015-01-01

    Postures and muscle activity in the upper body were recorded from 50 academic office workers during 2 hours of normal work, categorised by observation into computer work (CW) and three non-computer (NC) tasks (NC seated work, NC standing/walking work and breaks). NC tasks differed significantly in exposures from CW, with standing/walking NC tasks representing the largest contrasts for most of the exposure variables. For the majority of workers, exposure variability was larger in their present job than in CW alone, as measured by the job variance ratio (JVR), i.e. the ratio between min-min variabilities in the job and in CW. Calculations of JVRs for simulated jobs containing different proportions of CW showed that variability could, indeed, be increased by redistributing available tasks, but that substantial increases could only be achieved by introducing more vigorous tasks into the job, in this case illustrated by cleaning.
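
    An illustrative sketch of the job variance ratio idea: the variability of a biomechanical exposure across the whole job relative to computer work alone. The synthetic minute-level data and the use of between-minute variance are assumptions for illustration, not the exact metric definition or data from the study.

```python
# Illustrative sketch of a job variance ratio (JVR): variability of a
# biomechanical exposure in the whole job relative to computer work alone.
# The synthetic data and the use of between-minute variance are assumptions
# for illustration, not the exact metric definition used in the study.
import numpy as np

rng = np.random.default_rng(0)

def minute_means(samples_per_min, n_minutes, mean, sd):
    x = rng.normal(mean, sd, size=(n_minutes, samples_per_min))
    return x.mean(axis=1)

# Hypothetical trapezius activity (%MVE), minute by minute, per task.
cw        = minute_means(60, 90, mean=4.0, sd=1.0)   # computer work
nc_seated = minute_means(60, 20, mean=6.0, sd=2.0)   # non-computer seated
nc_stand  = minute_means(60, 10, mean=10.0, sd=4.0)  # standing/walking

job = np.concatenate([cw, nc_seated, nc_stand])

jvr = job.var(ddof=1) / cw.var(ddof=1)
print(f"job variance ratio (JVR) = {jvr:.2f}")  # >1 means NC tasks add variability
```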

  11. Variation in calculated human exposure. Comparison of calculations with seven European human exposure models

    NARCIS (Netherlands)

    Swartjes F; ECO

    2003-01-01

    Twenty scenarios, differing with respect to land use, soil type and contaminant, formed the basis for calculating human exposure from soil contaminants with the use of models contributed by seven European countries (one model per country). Here, the human exposures to children and children

  12. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate

  13. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.

  14. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    The paper analyses the Minority Game and focuses on the analysis and computational modelling of several variants (variable payoff, coalition-based and ternary voting) of the Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for the formal specification of software gamification, and the UAREI visual modelling language is used for the graphical representation of game mechanics. The UAREI model also provides an embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.

  15. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    In recent years, we have seen a significant number of new technological ideas appearing in the literature discussing the future of education. For example, e-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in the educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution's physical laboratory. For a university without a computing lab, obtaining hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received in a university campus computing lab requires a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a "social cloud", which draws on all cloud computing service models, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  16. Validation of the dermal exposure model in ECETOC TRA.

    Science.gov (United States)

    Marquart, Hans; Franken, Remy; Goede, Henk; Fransman, Wouter; Schinkel, Jody

    2017-08-01

    The ECETOC TRA model (presently version 3.1) is often used to estimate worker inhalation and dermal exposure in regulatory risk assessment. The dermal model in ECETOC TRA has not yet been validated by comparison with independent measured exposure levels. This was the goal of the present study. Measured exposure levels and relevant contextual information were gathered via literature search, websites of relevant occupational health institutes and direct requests for data to industry. Exposure data were clustered in so-called exposure cases, which are sets of data from one data source that are expected to have the same values for input parameters in the ECETOC TRA dermal exposure model. For each exposure case, the 75th percentile of measured values was calculated, because the model intends to estimate these values. The input values for the parameters in ECETOC TRA were assigned by an expert elicitation and consensus building process, based on descriptions of relevant contextual information. From more than 35 data sources, 106 useful exposure cases were derived, which were used for direct comparison with the model estimates. The exposure cases covered a large part of the ECETOC TRA dermal exposure model. The model explained 37% of the variance in the 75th percentiles of measured values. In around 80% of the exposure cases, the model estimate was higher than the 75th percentile of measured values. In the remaining exposure cases, the model estimate may not be sufficiently conservative. The model was shown to have a clear bias towards (severe) overestimation of dermal exposure at low measured exposure values, while all cases of apparent underestimation by the ECETOC TRA dermal exposure model occurred at high measured exposure values. This can be partly explained by a built-in bias in the effect of concentration of substance in product used, duration of exposure and the use of protective gloves in the model. The effect of protective gloves was calculated to be on average a
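
    A hedged sketch of the validation logic described above: for each exposure case, compare the model estimate with the 75th percentile of measured values, count how often the model is at least as high, and compute the variance explained on a log scale. The arrays are synthetic placeholders, not the study's data or the ECETOC TRA model itself.

```python
# Sketch of the validation logic described above: per exposure case, compare
# a model estimate with the 75th percentile of measured dermal exposures.
# The arrays below are synthetic placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(1)
n_cases = 106

# Hypothetical measured 75th percentiles and model estimates (log-normal-ish).
p75_measured = rng.lognormal(mean=0.0, sigma=1.5, size=n_cases)
model_estimate = p75_measured * rng.lognormal(mean=0.8, sigma=1.0, size=n_cases)

conservative = np.mean(model_estimate >= p75_measured)

# "Variance explained" as the squared correlation on a log scale.
r = np.corrcoef(np.log(p75_measured), np.log(model_estimate))[0, 1]

print(f"fraction of cases with model >= P75: {conservative:.0%}")
print(f"variance explained (log scale):      {r**2:.2f}")
```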

  17. Computational modeling of epiphany learning.

    Science.gov (United States)

    Chen, Wei James; Krajbich, Ian

    2017-05-02

    Models of reinforcement learning (RL) are prevalent in the decision-making literature, but not all behavior seems to conform to the gradual convergence that is a central feature of RL. In some cases learning seems to happen all at once. Limited prior research on these "epiphanies" has shown evidence of sudden changes in behavior, but it remains unclear how such epiphanies occur. We propose a sequential-sampling model of epiphany learning (EL) and test it using an eye-tracking experiment. In the experiment, subjects repeatedly play a strategic game that has an optimal strategy. Subjects can learn over time from feedback but are also allowed to commit to a strategy at any time, eliminating all other options and opportunities to learn. We find that the EL model is consistent with the choices, eye movements, and pupillary responses of subjects who commit to the optimal strategy (correct epiphany) but not always of those who commit to a suboptimal strategy or who do not commit at all. Our findings suggest that EL is driven by a latent evidence accumulation process that can be revealed with eye-tracking data.
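
    The EL model is described as a sequential-sampling model; the sketch below is a generic evidence-accumulation illustration of that model class (a noisy drift toward a commitment threshold), with hypothetical parameters rather than the authors' fitted specification.

```python
# Generic evidence-accumulation sketch of "epiphany learning": latent evidence
# for the optimal strategy drifts upward with feedback noise, and the agent
# commits once a threshold is crossed. Parameters are hypothetical; this is
# the class of model (sequential sampling), not the authors' fitted version.
import numpy as np

def simulate_commitment(drift=0.08, noise=0.5, threshold=3.0,
                        n_trials=100, seed=None):
    rng = np.random.default_rng(seed)
    evidence = 0.0
    for trial in range(1, n_trials + 1):
        evidence += drift + noise * rng.standard_normal()
        if evidence >= threshold:
            return trial          # trial on which the epiphany occurs
    return None                   # never commits within the experiment

commit_trials = [simulate_commitment(seed=s) for s in range(1000)]
committed = [t for t in commit_trials if t is not None]
print(f"committed in {len(committed) / 10:.1f}% of runs, "
      f"median commitment trial = {int(np.median(committed))}")
```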

  18. Air pollution exposure modeling of individuals

    Science.gov (United States)

    Air pollution epidemiology studies of ambient fine particulate matter (PM2.5) often use outdoor concentrations as exposure surrogates. These surrogates can induce exposure error since they do not account for (1) time spent indoors with ambient PM2.5 levels attenuated from outdoor...

  19. Disclosive Computer Ethics: The Exposure and Evaluation of Embedded Normativity in Computer Technology.

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2000-01-01

    This essay provides a critique of mainstream computer ethics and argues for the importance of a complementary approach called disclosive computer ethics, which is concerned with the moral deciphering of embedded values and norms in computer systems, applications and practices. Also, four key values

  20. Computational algorithm for lifetime exposure to antimicrobials in pigs using register data − the LEA algorithm

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Dalhoff Andersen, Vibe; Hisham Beshara Halasa, Tariq

    2017-01-01

    Accurate and detailed data on antimicrobial exposure in pig production are essential when studying the association between antimicrobial exposure and antimicrobial resistance. Due to difficulties in obtaining primary data on antimicrobial exposure in a large number of farms, there is a need for a robust and valid method to estimate the exposure using register data. An approach that estimates the antimicrobial exposure in every rearing period during the lifetime of a pig using register data was developed into a computational algorithm. In this approach, data from national registers on antimicrobial purchases, movements of pigs and farm demographics registered at farm level are used. The algorithm traces batches of pigs retrospectively from slaughter to the farm(s) that housed the pigs during their finisher, weaner, and piglet period. Subsequently, the algorithm estimates the antimicrobial exposure...
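
    A highly simplified sketch of the tracing idea: follow a slaughter batch back to the farm that housed the pigs in each rearing period and sum the antimicrobial purchases falling inside that period. The farm identifiers, dates and doses are hypothetical placeholders; the actual LEA algorithm works on national register data with more detailed rules.

```python
# Highly simplified sketch of the tracing idea behind the LEA algorithm:
# a slaughter batch is traced back to the farm(s) housing the pigs in each
# rearing period, and antimicrobial purchases in that period are summed.
# All data structures, farm ids and doses are hypothetical placeholders.
from datetime import date

# farm id -> list of (purchase date, animal daily doses purchased)
purchases = {
    "farm_finisher": [(date(2016, 5, 1), 120.0), (date(2016, 7, 3), 80.0)],
    "farm_weaner":   [(date(2016, 3, 10), 60.0)],
    "farm_piglet":   [(date(2016, 1, 20), 30.0)],
}

# rearing periods of one traced batch: (period name, farm, start, end)
batch_periods = [
    ("piglet",   "farm_piglet",   date(2016, 1, 1), date(2016, 2, 28)),
    ("weaner",   "farm_weaner",   date(2016, 3, 1), date(2016, 4, 30)),
    ("finisher", "farm_finisher", date(2016, 5, 1), date(2016, 8, 15)),
]

def exposure_per_period(batch_periods, purchases):
    """Sum purchased doses falling inside each rearing period of the batch."""
    out = {}
    for period, farm, start, end in batch_periods:
        out[period] = sum(dose for d, dose in purchases.get(farm, [])
                          if start <= d <= end)
    return out

print(exposure_per_period(batch_periods, purchases))
```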

  1. Computational models of airway branching morphogenesis.

    Science.gov (United States)

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. A Comparison of Two Strategies for Building an Exposure Prediction Model.

    Science.gov (United States)

    Heiden, Marina; Mathiassen, Svend Erik; Garza, Jennifer; Liv, Per; Wahlström, Jens

    2016-01-01

    Cost-efficient assessments of job exposures in large populations may be obtained from models in which 'true' exposures assessed by expensive measurement methods are estimated from easily accessible and cheap predictors. Typically, the models are built on the basis of a validation study comprising 'true' exposure data as well as an extensive collection of candidate predictors from questionnaires or company data, which cannot all be included in the models due to restrictions in the degrees of freedom available for modeling. In these situations, predictors need to be selected using procedures that can identify the best possible subset of predictors among the candidates. The present study compares two strategies for selecting a set of predictor variables. One strategy relies on stepwise hypothesis testing of associations between predictors and exposure, while the other uses cluster analysis to reduce the number of predictors without relying on empirical information about the measured exposure. Both strategies were applied to the same dataset on biomechanical exposure and candidate predictors among computer users, and they were compared in terms of identified predictors of exposure as well as the resulting model fit using bootstrapped resamples of the original data. The identified predictors were, to a large extent, different between the two strategies, and the initial model fit was better for the stepwise testing strategy than for the clustering approach. Internal validation of the models using bootstrap resampling with fixed predictors revealed an equally reduced model fit in resampled datasets for both strategies. However, when predictor selection was incorporated in the validation procedure for the stepwise testing strategy, the model fit was reduced to the extent that both strategies showed similar model fit. Thus, the two strategies would both be expected to perform poorly with respect to predicting biomechanical exposure in other samples of computer users.
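
    An illustrative contrast of the two predictor-selection strategies on synthetic data: (a) a stepwise-style selection that keeps predictors strongly associated with the measured exposure, and (b) hierarchical clustering of the predictors, keeping one representative per cluster without using the exposure data. The data, thresholds and selection rules are assumptions for illustration, not the study's procedure.

```python
# Illustrative contrast of two predictor-selection strategies on synthetic
# data: (a) stepwise-style selection based on correlation with the measured
# exposure, and (b) hierarchical clustering of predictors with one
# representative per cluster, chosen without using the exposure data.
# Thresholds and data are assumptions for illustration only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
n, p = 120, 12
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.3 * rng.normal(size=n)      # correlated predictor block
X[:, 2] = X[:, 0] + 0.3 * rng.normal(size=n)
y = 1.5 * X[:, 0] - 1.0 * X[:, 5] + rng.normal(size=n)   # "true" exposure

# (a) stepwise-style: keep predictors whose correlation with y is strong.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(p)])
stepwise = np.where(corr > 0.3)[0]

# (b) cluster predictors on 1 - |correlation| and keep one per cluster.
d = 1.0 - np.abs(np.corrcoef(X, rowvar=False))
Z = linkage(d[np.triu_indices(p, 1)], method="average")
labels = fcluster(Z, t=0.5, criterion="distance")
cluster_reps = [np.where(labels == c)[0][0] for c in np.unique(labels)]

print("stepwise-selected predictors:", stepwise.tolist())
print("cluster representatives:     ", cluster_reps)
```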

  3. Computational multiscale modeling of intergranular cracking

    International Nuclear Information System (INIS)

    Simonovski, Igor; Cizelj, Leon

    2011-01-01

    A novel computational approach for simulation of intergranular cracks in a polycrystalline aggregate is proposed in this paper. The computational model includes a topological model of the experimentally determined microstructure of a 400 μm diameter stainless steel wire and automatic finite element discretization of the grains and grain boundaries. The microstructure was spatially characterized by X-ray diffraction contrast tomography and contains 362 grains and some 1600 grain boundaries. Available constitutive models currently include isotropic elasticity for the grain interior and cohesive behavior with damage for the grain boundaries. The experimentally determined lattice orientations are employed to distinguish between resistant low energy and susceptible high energy grain boundaries in the model. The feasibility and performance of the proposed computational approach is demonstrated by simulating the onset and propagation of intergranular cracking. The preliminary numerical results are outlined and discussed.

  4. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range off software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  5. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  6. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  7. Improved heat transfer modeling of the eye for electromagnetic wave exposures.

    Science.gov (United States)

    Hirata, Akimasa

    2007-05-01

    This study proposed an improved heat transfer model of the eye for exposure to electromagnetic (EM) waves. Particular attention was paid to the differences from the simplified heat transfer model commonly used in this field. Our computational results show that the temperature elevation in the eye calculated with the simplified heat transfer model was strongly influenced by the EM absorption outside the eyeball, whereas this was not the case with our improved model.

  8. Validation of the computational model ALDERSON/EGSnrc for chest radiography

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Santos, André L. dos; Menezes, Claudio J.M.

    2017-01-01

    To perform dose studies for situations of exposure to radiation without exposing individuals, numerical dosimetry uses Computational Exposure Models (ECMs). Composed essentially of an algorithm simulating the radioactive source, a voxel phantom representing the human anatomy and a Monte Carlo code, an ECM must be validated to determine how reliably it represents the physical set-up. The objective of this work is to validate the ALDERSON/EGSnrc ECM through comparisons between experimental measurements obtained with an ionization chamber and virtual simulations using the Monte Carlo method, to determine the ratio of the input and output radiation dose. Preliminary results of these comparisons showed that the ECM reproduced the results of the experimental measurements performed with the physical phantom with a relative error of less than 10%, validating the use of this model for simulations of chest radiographs and estimates of radiation doses to tissues in the irradiated structures.

  9. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the number of cloud virtual machines (VMs) keeps growing, and virtual machine security and management face enormous challenges. In order to address the security issues of the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling, and studies the corresponding virtual machine security architecture based on AHP (Analytic Hierarchy Process) virtual machine de...

  10. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues

  11. Light reflection models for computer graphics.

    Science.gov (United States)

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
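
    A minimal sketch of the kind of direct-lighting reflection model surveyed above: a Lambertian diffuse term plus a Phong-style specular term for a single point light. It is a generic textbook shading function, not an algorithm taken from the article, and it ignores global effects such as interreflection, shadows and radiosity.

```python
# Minimal direct-lighting reflection sketch of the kind surveyed above:
# Lambertian diffuse plus a Phong-style specular term for one point light.
# Global effects (interreflection, shadows, radiosity) are not modelled here.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade(normal, to_light, to_eye, kd=0.7, ks=0.3, shininess=32, intensity=1.0):
    n, l, e = map(normalize, (normal, to_light, to_eye))
    diffuse = kd * max(np.dot(n, l), 0.0)
    r = 2.0 * np.dot(n, l) * n - l            # mirror reflection of the light
    specular = ks * max(np.dot(r, e), 0.0) ** shininess
    return intensity * (diffuse + specular)

print(shade(normal=np.array([0.0, 0.0, 1.0]),
            to_light=np.array([1.0, 0.0, 1.0]),
            to_eye=np.array([0.0, 0.0, 1.0])))
```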

  12. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular. .
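
    The case study is the decay ODE u'(t) = −a·u(t); a minimal sketch of the standard θ-rule finite difference scheme for it is shown below (θ = 0 gives Forward Euler, θ = 1 Backward Euler, θ = 0.5 Crank-Nicolson). The parameter values are arbitrary illustrations, not examples taken from the book.

```python
# theta-rule finite differences for the decay ODE u'(t) = -a*u(t), u(0) = I:
# theta = 0 gives Forward Euler, 1 Backward Euler, 0.5 Crank-Nicolson.
import numpy as np

def solve_decay(I=1.0, a=2.0, T=4.0, dt=0.1, theta=0.5):
    n = int(round(T / dt))
    u = np.empty(n + 1)
    u[0] = I
    # (u[k+1]-u[k])/dt = -a*(theta*u[k+1] + (1-theta)*u[k])  =>  constant factor
    factor = (1.0 - (1.0 - theta) * a * dt) / (1.0 + theta * a * dt)
    for k in range(n):
        u[k + 1] = factor * u[k]
    t = np.linspace(0.0, n * dt, n + 1)
    return t, u

t, u = solve_decay(theta=0.5)
exact = 1.0 * np.exp(-2.0 * t)
print(f"max error (Crank-Nicolson, dt=0.1): {np.max(np.abs(u - exact)):.2e}")
```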

  13. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  14. Computational estimation of decline in sweating in the elderly from measured body temperatures and sweating for passive heat exposure

    International Nuclear Information System (INIS)

    Hirata, Akimasa; Nomura, Tomoki; Laakso, Ilkka

    2012-01-01

    Several studies reported the difference in heat tolerance between younger and older adults, which may be attributable to the decline in the sweating rate. One of the studies suggested a hypothesis that the dominant factor causing the decline in sweating was the decline in thermal sensitivity due to a weaker signal from the periphery to the regulatory centres. However, no quantitative investigation of the skin temperature threshold for activating the sweating has been conducted in previous studies. In this study, we developed a computational code to simulate the time evolution of the temperature variation and sweating in realistic human models under heat exposure, in part by comparing the computational results with measured data from younger and older adults. Based on our computational results, the difference in the threshold temperatures for activating the thermophysiological response, especially for sweating, is examined between older and younger adults. The threshold for activating sweating in older individuals was found to be about 1.5 °C higher than that in younger individuals. However, our computation did not suggest that it was possible to evaluate the central alteration with ageing by comparing the computation with the measurements for passive heat exposure, since the sweating rate is marginally affected by core temperature elevation at least for the scenarios considered here. The computational technique developed herein is useful for understanding the thermophysiological response of older individuals from measured data. (note)

  15. Quantum Vertex Model for Reversible Classical Computing

    Science.gov (United States)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic with one direction corresponding to computational time, and with transverse boundaries storing the computation's input and output. The model displays no finite temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, "annealing with learning", to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  16. Computer aided dosimetry and verification of exposure to radiation. Technical report

    International Nuclear Information System (INIS)

    Waller, D.; Stodilka, R.Z.; Leach, K.E.; Prud'homme-Lalonde, L.

    2002-06-01

    In the timeframe following the September 11th attacks on the United States, increased emphasis has been placed on Chemical, Biological, Radiological and Nuclear (CBRN) preparedness. Of prime importance is rapid field assessment of potential radiation exposure to Canadian Forces field personnel. This work set up a framework for generating an 'expert' computer system for aiding and assisting field personnel in determining the extent of radiation insult to military personnel. Data was gathered by review of the available literature, discussions with medical and health physics personnel having hands-on experience dealing with radiation accident victims, and from experience of the principal investigator. Flow charts and generic data fusion algorithms were developed. Relationships between known exposure parameters, patient interview and history, clinical symptoms, clinical work-ups, physical dosimetry, biological dosimetry, and dose reconstruction as critical data indicators were investigated. The data obtained was examined in terms of information theory. A main goal was to determine how best to generate an adaptive model (i.e. when more data becomes available, how is the prediction improved). Consideration was given to determination of predictive algorithms for health outcome. In addition, the concept of coding an expert medical treatment advisor system was developed. (author)

  17. Computer aided dosimetry and verification of exposure to radiation. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Waller, D. [SAIC Canada (Canada); Stodilka, R.Z.; Leach, K.E.; Prud' homme-Lalonde, L. [Defence R and D Canada (DRDC), Radiation Effects Group, Space Systems and Technology, Ottawa, Ontario (Canada)

    2002-06-15

    In the timeframe following the September 11th attacks on the United States, increased emphasis has been placed on Chemical, Biological, Radiological and Nuclear (CBRN) preparedness. Of prime importance is rapid field assessment of potential radiation exposure to Canadian Forces field personnel. This work set up a framework for generating an 'expert' computer system for aiding and assisting field personnel in determining the extent of radiation insult to military personnel. Data was gathered by review of the available literature, discussions with medical and health physics personnel having hands-on experience dealing with radiation accident victims, and from experience of the principal investigator. Flow charts and generic data fusion algorithms were developed. Relationships between known exposure parameters, patient interview and history, clinical symptoms, clinical work-ups, physical dosimetry, biological dosimetry, and dose reconstruction as critical data indicators were investigated. The data obtained was examined in terms of information theory. A main goal was to determine how best to generate an adaptive model (i.e. when more data becomes available, how is the prediction improved). Consideration was given to determination of predictive algorithms for health outcome. In addition, the concept of coding an expert medical treatment advisor system was developed. (author)

  18. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Background: Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. The Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results: The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion: During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  19. Data Sources Available for Modeling Environmental Exposures in Older Adults

    Science.gov (United States)

    This report, “Data Sources Available for Modeling Environmental Exposures in Older Adults,” focuses on information sources and data available for modeling environmental exposures in the older U.S. population, defined here to be people 60 years and older, with an emphasis on those...

  20. Towards The Deep Model : Understanding Visual Recognition Through Computational Models

    OpenAIRE

    Wang, Panqu

    2017-01-01

    Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks. I first describe how a neurocomputat...

  1. Hybrid computer modelling in plasma physics

    International Nuclear Information System (INIS)

    Hromadka, J; Ibehej, T; Hrach, R

    2016-01-01

    Our contribution is devoted to the development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low-temperature argon plasma at different pressures by means of particle and fluid computer models. We discuss the differences in the results obtained by these methods and propose a way to improve the results of fluid models in the low-pressure regime. It is possible to employ the Chapman-Enskog method to find appropriate closure relations for the fluid equations in cases where the particle distribution function is not Maxwellian. We follow this route to enhance the fluid model and to use it further in a hybrid plasma model. (paper)

  2. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book. -Hsun-Hsien Chang, Computing Reviews, March 2012. My favorite chapters were on dynamic linear models and vector AR and vector ARMA models. -William Seaver, Technometrics, August 2011. … a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  3. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state-of-the-art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, Ph.D students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  4. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational Algebraic Geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials; these computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, the Groebner basis, the Hilbert dimension, and the Hilbert polynomials. It is hoped that the results obtained in this paper become of importance for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.
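
    The paper performs these computations in Maple; the sketch below shows the same kind of calculation in Python with SymPy, computing a Groebner basis and the steady states of a standard SIR model with vital dynamics (population normalized to 1). The model choice and symbols are a generic textbook example and are not necessarily one of the Schistosomiasis or Dengue models analyzed in the paper.

```python
# Sketch of the algebraic-geometry approach in Python/SymPy (the paper itself
# uses Maple): Groebner basis of the steady-state equations of a standard
# SIR model with vital dynamics, with total population normalized to 1.
import sympy as sp

S, I = sp.symbols("S I")
beta, gamma, mu = sp.symbols("beta gamma mu", positive=True)

eqs = [
    mu - beta * S * I - mu * S,          # dS/dt = 0
    beta * S * I - (gamma + mu) * I,     # dI/dt = 0
]

G = sp.groebner(eqs, S, I, order="lex")
print(G)

# Basic reproduction number for this textbook model; the endemic solution
# (I > 0) exists only when R0 > 1.
R0 = beta / (gamma + mu)
steady_states = sp.solve(eqs, [S, I], dict=True)
print("R0 =", R0)
print(steady_states)
```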

  5. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and Refrigeration system performance models in these simulations tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-state and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented

  6. A statistical framework for the validation of a population exposure model based on personal exposure data

    Science.gov (United States)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations and the infiltration of outdoor air pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, and personal exposure data should therefore become more and more accessible. In view of planned cohort field campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data, which are estimated at the county population scale, to the individual scale appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.
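
    A toy version of the downscaling step, under strong simplifying assumptions: the county-scale modeled exposure acts as a normal prior for one individual and a single personal-sensor measurement updates it via a conjugate normal-normal model. The full framework in the abstract is a Bayesian hierarchical model; all numbers here are hypothetical.

```python
# Toy version of the downscaling idea: the modeled population-scale exposure
# serves as a prior for one individual, and a personal-sensor measurement
# updates it (conjugate normal-normal model). The abstract's framework is a
# full Bayesian hierarchical model; the numbers here are hypothetical.
import numpy as np

# Prior: county-scale modeled PM2.5 exposure for the person's group (ug/m3).
prior_mean, prior_sd = 14.0, 4.0
# Likelihood: one noisy personal-sensor measurement.
measurement, sensor_sd = 22.0, 3.0

prior_prec, meas_prec = 1.0 / prior_sd**2, 1.0 / sensor_sd**2
post_prec = prior_prec + meas_prec
post_mean = (prior_prec * prior_mean + meas_prec * measurement) / post_prec
post_sd = np.sqrt(1.0 / post_prec)

print(f"posterior individual exposure: {post_mean:.1f} +/- {post_sd:.1f} ug/m3")
```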

  7. Radiation exposure modeling and project schedule visualization

    International Nuclear Information System (INIS)

    Jaquish, W.R.; Enderlin, V.R.

    1995-10-01

    This paper discusses two applications using IGRIP (Interactive Graphical Robot Instruction Program) to assist environmental remediation efforts at the Department of Energy (DOE) Hanford Site. In the first application, IGRIP is used to calculate the estimated radiation exposure to workers conducting tasks in radiation environments. In the second, IGRIP is used as a configuration management tool to detect interferences between equipment and personnel work areas for multiple projects occurring simultaneously in one area. Both of these applications have the capability to reduce environmental remediation costs by reducing personnel radiation exposure and by providing a method to effectively manage multiple projects in a single facility

  8. A dermatotoxicokinetic model of human exposures to jet fuel.

    Science.gov (United States)

    Kim, David; Andersen, Melvin E; Nylander-French, Leena A

    2006-09-01

    Workers, both in the military and the commercial airline industry, are exposed to jet fuel by inhalation and dermal contact. We present a dermatotoxicokinetic (DTK) model that quantifies the absorption, distribution, and elimination of aromatic and aliphatic components of jet fuel following dermal exposures in humans. Kinetic data were obtained from 10 healthy volunteers following a single dose of JP-8 to the forearm over a surface area of 20 cm2. Blood samples were taken before exposure (t = 0 h), after exposure (t = 0.5 h), and every 0.5 h for up to 3.5 h postexposure. The DTK model that best fit the data included five compartments: (1) surface, (2) stratum corneum (SC), (3) viable epidermis, (4) blood, and (5) storage. The DTK model was used to predict blood concentrations of the components of JP-8 based on dermal-exposure measurements made in occupational-exposure settings in order to better understand the toxicokinetic behavior of these compounds. Monte Carlo simulations of dermal exposure and cumulative internal dose demonstrated no overlap among the low-, medium-, and high-exposure groups. The DTK model provides a quantitative understanding of the relationship between the mass of JP-8 components in the SC and the concentrations of each component in the systemic circulation. The model may be used for the development of a toxicokinetic modeling strategy for multiroute exposure to jet fuel.
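
    The compartment structure described above lends itself to a linear system of ordinary differential equations. The sketch below sets up a generic five-compartment model in that spirit; the rate constants are made up for illustration, not the fitted JP-8 values from the study.

        # A generic linear five-compartment sketch in the spirit of the DTK structure
        # above (surface, stratum corneum, viable epidermis, blood, storage).
        # Rate constants (1/h) are illustrative, not the study's fitted parameters.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = dict(sfc_sc=0.5, sc_ve=0.3, ve_bl=0.8, bl_st=0.2, st_bl=0.05, bl_elim=0.4)

        def dtk(t, y):
            surface, sc, ve, blood, storage = y
            return [
                -k["sfc_sc"] * surface,                                   # loss from skin surface
                k["sfc_sc"] * surface - k["sc_ve"] * sc,                  # stratum corneum
                k["sc_ve"] * sc - k["ve_bl"] * ve,                        # viable epidermis
                k["ve_bl"] * ve + k["st_bl"] * storage
                    - (k["bl_st"] + k["bl_elim"]) * blood,                # blood
                k["bl_st"] * blood - k["st_bl"] * storage,                # storage compartment
            ]

        sol = solve_ivp(dtk, (0.0, 8.0), [1.0, 0, 0, 0, 0], t_eval=np.linspace(0, 8, 81))
        print("peak blood fraction of applied dose:", round(sol.y[3].max(), 3))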

  9. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  10. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules, etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  11. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present

  12. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
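
    Cell Collective models are logical (Boolean-style) networks; as a minimal illustration of the kind of dynamics students simulate, the sketch below synchronously updates a hypothetical three-node circuit. It is not a model taken from the platform.

        # Tiny, hypothetical signal -> kinase -> gene circuit updated synchronously,
        # in the spirit of the logical models built in Cell Collective.
        def step(state):
            """One synchronous Boolean update of the toy circuit."""
            signal, kinase, gene = state
            return (
                signal,                 # external input held constant
                signal and not gene,    # kinase on when signal present, inhibited by gene product
                kinase,                 # gene follows kinase with a one-step delay
            )

        state = (True, False, False)
        for t in range(6):
            print(t, state)
            state = step(state)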

  13. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  14. Computer Modelling of Photochemical Smog Formation

    Science.gov (United States)

    Huebert, Barry J.

    1974-01-01

    Discusses a computer program that has been used in environmental chemistry courses as an example of modelling as a vehicle for teaching chemical dynamics, and as a demonstration of some of the factors which affect the production of smog. (Author/GS)

  15. A Computational Model of Fraction Arithmetic

    Science.gov (United States)

    Braithwaite, David W.; Pyke, Aryn A.; Siegler, Robert S.

    2017-01-01

    Many children fail to master fraction arithmetic even after years of instruction, a failure that hinders their learning of more advanced mathematics as well as their occupational success. To test hypotheses about why children have so many difficulties in this area, we created a computational model of fraction arithmetic learning and presented it…

  16. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking – Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  17. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  18. Computer Modeling of Platinum Reforming Reactors | Momoh ...

    African Journals Online (AJOL)

    This paper, instead of using a theoretical approach, considers a computer model as a means of assessing the reformate composition for three-stage fixed bed reactors in a platforming unit. This is done by identifying many possible hydrocarbon transformation reactions that are peculiar to the process unit, identify the ...

  19. Particle modeling of plasmas computational plasma physics

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1991-01-01

    Recently, through the development of supercomputers, a powerful new method for exploring plasmas has emerged; it is computer modeling of plasmas. Such modeling can duplicate many of the complex processes that go on in a plasma and allow scientists to understand what the important processes are. It helps scientists gain an intuition about this complex state of matter. It allows scientists and engineers to explore new ideas on how to use plasma before building costly experiments; it allows them to determine if they are on the right track. It can duplicate the operation of devices and thus reduce the need to build complex and expensive devices for research and development. This is an exciting new endeavor that is in its infancy, but which can play an important role in the scientific and technological competitiveness of the US. There is a wide range of plasma models in use: particle models, fluid models, and hybrid particle-fluid models. These can come in many forms, such as explicit models, implicit models, reduced dimensional models, electrostatic models, magnetostatic models, electromagnetic models, and almost an endless variety of other models. Here the author will only discuss particle models. He will give a few examples of the use of such models; these will be taken from work done by the Plasma Modeling Group at UCLA because he is most familiar with that work. However, this gives only a small view of the wide range of work being done around the US, or for that matter around the world

  20. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  1. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science, outstanding results are yielded by advanced simulation methods, based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environment changes.

  2. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹

  3. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With widespread availability of computers and cost effective simulation software it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The Lathe process operation is indicative of

  4. Excess lifetime cancer mortality risk attributable to radiation exposure from computed tomography examinations in children

    NARCIS (Netherlands)

    Chodick, Gabriel; Ronckers, Cécile M.; Shalev, Varda; Ron, Elaine

    2007-01-01

    The use of computed tomography in Israel has been growing rapidly during recent decades. The major drawback of this important technology is the exposure to ionizing radiation, especially among children who have increased organ radiosensitivity and a long lifetime to potentially develop

  5. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  6. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  7. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure on the human body has become one of the most important issues for the radiation protection and radiotherapy fields, and is helpful to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female called Rad-HUMAN was created by using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure for the human body in radiation protection. (authors)

  8. A computer-assisted procedure for estimating patient exposure and fetal dose in radiographic examinations

    International Nuclear Information System (INIS)

    Glaze, S.; Schneiders, N.; Bushong, S.C.

    1982-01-01

    A computer program for calculating patient entrance exposure and fetal dose for 11 common radiographic examinations was developed. The output intensity measured at 70 kVp and a 30-inch (76-cm) source-to-skin distance was entered into the program. The change in output intensity with changing kVp was examined for 17 single-phase and 12 three-phase x-ray units. The relationships obtained from a least squares regression analysis of the data, along with the technique factors for each examination, were used to calculate patient exposure. Fetal dose was estimated using published fetal dose in mrad (10 -5 Gy) per 1,000 mR (258 μC/kg) entrance exposure values. The computations are fully automated and individualized to each radiographic unit. The information provides a ready reference in large institutions and is particularly useful at smaller facilities that do not have available physicists who can make the calculations immediately
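
    A hedged sketch of the kind of calculation such a program automates is shown below: entrance exposure from a fitted output-versus-kVp relation scaled by mAs and inverse-square distance, and fetal dose from a published mrad-per-1000-mR factor. The power-law form of the output relation, the coefficients and the fetal-dose factor are placeholders, not the paper's calibration data.

        # Sketch of an automated entrance-exposure and fetal-dose estimate. The fitted
        # coefficients (a, n), reference distance and mrad-per-1000-mR factor below are
        # illustrative placeholders.
        def entrance_exposure_mR(kvp, mAs, ssd_cm, a=0.003, n=2.0, ref_ssd_cm=76.0):
            """Entrance skin exposure in mR for one radiographic view."""
            output_mR_per_mAs = a * kvp**n            # fitted tube output at the reference distance
            return output_mR_per_mAs * mAs * (ref_ssd_cm / ssd_cm) ** 2

        def fetal_dose_mrad(entrance_mR, mrad_per_1000mR=150.0):
            return entrance_mR * mrad_per_1000mR / 1000.0

        exp_mR = entrance_exposure_mR(kvp=75, mAs=40, ssd_cm=76)
        print(round(exp_mR, 1), "mR entrance;", round(fetal_dose_mrad(exp_mR), 1), "mrad fetal")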

  9. A computer-assisted procedure for estimating patient exposure and fetal dose in radiographic examinations

    International Nuclear Information System (INIS)

    Glaze, S.; Schneiders, N.; Bushong, S.C.

    1982-01-01

    A computer program for calculating patient entrance exposure and fetal dose for 11 common radiographic examinations was developed. The output intensity measured at 70 kVp and a 30-inch (76-cm) source-to-skin distance was entered into the program. The change in output intensity with changing kVp was examined for 17 single-phase and 12 three-phase x-ray units. The relationships obtained from a least squares regression analysis of the data, along with the technique factors for each examination, were used to calculate patient exposure. Fetal dose was estimated using published fetal dose in mrad (10(-5) Gy) per 1,000 mR (258 microC/kg) entrance exposure values. The computations are fully automated and individualized to each radiographic unit. The information provides a ready reference in large institutions and is particularly useful at smaller facilities that do not have available physicists who can make the calculations immediately

  10. Grid computing in large pharmaceutical molecular modeling.

    Science.gov (United States)

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.

  11. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...... attacker remain somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world, and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation

  12. Climate models on massively parallel computers

    International Nuclear Information System (INIS)

    Vitart, F.; Rouvillois, P.

    1993-01-01

    First results obtained on massively parallel computers (Multiple Instruction Multiple Data and Single Instruction Multiple Data) make it possible to consider building coupled models with high resolutions. This would make possible the simulation of thermohaline circulation and other interaction phenomena between atmosphere and ocean. The increase in computer power, and thus the improvement of resolution, will lead us to revise our approximations. The hydrostatic approximation (in ocean circulation) will no longer be valid when the grid mesh is of a dimension lower than a few kilometers: we shall have to find other models. The expertise gained in numerical analysis at the Center of Limeil-Valenton (CEL-V) will be used again to devise global models taking into account atmosphere, ocean, ice floe and biosphere, allowing climate simulation down to a regional scale

  13. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research owing to the richness of huge amounts of medical information about the symptoms of diseases and how to distinguish between them in order to diagnose them correctly. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts settle treatment decisions. This paper introduces four hybrid Rough – Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithm and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different knowledge discovery techniques for data pre-processing, reduction, and data mining supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the framework of KDD processes for supervised learning using the Granular Computing methodology.

  14. 40 CFR 194.23 - Models and computer codes.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  15. Animal Model Selection for Inhalational HCN Exposure

    Science.gov (United States)

    2016-08-01

    effects. Following acute inhalation exposure in humans and animals, cyanide is found in the lung, heart, blood, kidneys, and brain (Ballantyne, 1983...Pritchard, 2007). Other direct or secondary effects associated with CN are reacting with the ferric and carbonyl group of enzymes (e.g. catalase...mechanisms occurs before myocardial depression. Clinically, an initial period of bradycardia and hypertension may occur, followed by hypotension with reflex

  16. Determinants of dermal exposure relevant for exposure modelling in regulatory risk assessment.

    Science.gov (United States)

    Marquart, J; Brouwer, D H; Gijsbers, J H J; Links, I H M; Warren, N; van Hemmen, J J

    2003-11-01

    Risk assessment of chemicals requires assessment of the exposure levels of workers. In the absence of adequate specific measured data, models are often used to estimate exposure levels. For dermal exposure only a few models exist, which are not validated externally. In the scope of a large European research programme, an analysis of potential dermal exposure determinants was made based on the available studies and models and on the expert judgement of the authors of this publication. Only a few potential determinants appear to have been studied in depth. Several studies have included clusters of determinants into vaguely defined parameters, such as 'task' or 'cleaning and maintenance of clothing'. Other studies include several highly correlated parameters, such as 'amount of product handled', 'duration of task' and 'area treated', and separation of these parameters to study their individual influence is not possible. However, based on the available information, a number of determinants could clearly be defined as proven or highly plausible determinants of dermal exposure in one or more exposure situation. This information was combined with expert judgement on the scientific plausibility of the influence of parameters that have not been extensively studied and on the possibilities to gather relevant information during a risk assessment process. The result of this effort is a list of determinants relevant for dermal exposure models in the scope of regulatory risk assessment. The determinants have been divided into the major categories 'substance and product characteristics', 'task done by the worker', 'process technique and equipment', 'exposure control measures', 'worker characteristics and habits' and 'area and situation'. To account for the complex nature of the dermal exposure processes, a further subdivision was made into the three major processes 'direct contact', 'surface contact' and 'deposition'.

  17. Exporters’ exposures to currencies: Beyond the loglinear model

    NARCIS (Netherlands)

    Boudt, K.M.R.; Liu, F.; Sercu, P.

    2016-01-01

    We extend the constant-elasticity regression that is the default choice when equities' exposure to currencies is estimated. In a proper real-option-style model for the exporters' equity exposure to the foreign exchange rate, we argue, the convexity of the relationship implies that the elasticity

  18. ConsExpo - Consumer Exposure and Uptake Models -Program Manual

    NARCIS (Netherlands)

    Delmaar JE; Park MVDZ; Engelen JGM van; SIR

    2006-01-01

    This report provides guidance to the use of ConsExpo 4.0, successor to ConsExpo 3.0, a computer program that was developed to assist in the exposure assessment of compounds in non-food consumer products. The wide range of available consumer products is associated with an even wider variation in

  19. Computational Aerodynamic Modeling of Small Quadcopter Vehicles

    Science.gov (United States)

    Yoon, Seokkwan; Ventura Diaz, Patricia; Boyd, D. Douglas; Chan, William M.; Theodore, Colin R.

    2017-01-01

    High-fidelity computational simulations have been performed which focus on rotor-fuselage and rotor-rotor aerodynamic interactions of small quad-rotor vehicle systems. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, low Mach number preconditioning, and hybrid turbulence modeling. Computational results for isolated rotors are shown to compare well with available experimental data. Computational results in hover reveal the differences between a conventional configuration where the rotors are mounted above the fuselage and an unconventional configuration where the rotors are mounted below the fuselage. Complex flow physics in forward flight is investigated. The goal of this work is to demonstrate that understanding of interactional aerodynamics can be an important factor in design decisions regarding rotor and fuselage placement for next-generation multi-rotor drones.

  20. SAR exposure from UHF RFID reader in adult, child, pregnant woman, and fetus anatomical models.

    Science.gov (United States)

    Fiocchi, Serena; Markakis, Ioannis A; Ravazzani, Paolo; Samaras, Theodoros

    2013-09-01

    The spread of radio frequency identification (RFID) devices in ubiquitous applications without their simultaneous exposure assessment could give rise to public concerns about their potential adverse health effects. Among the various RFID system categories, the ultra high frequency (UHF) RFID systems have recently started to be widely used in many applications. This study addresses a computational exposure assessment of the electromagnetic radiation generated by a realistic UHF RFID reader, quantifying the exposure levels in different exposure scenarios and subjects (two adults, four children, and two anatomical models of women 7 and 9 months pregnant). The results of the computations are presented in terms of the whole-body and peak spatial specific absorption rate (SAR) averaged over 10 g of tissue to allow comparison with the basic restrictions of the exposure guidelines. The SAR levels in the adults and children were below 0.02 and 0.8 W/kg in whole-body SAR and maximum peak SAR levels, respectively, for all tested positions of the antenna. On the contrary, exposure of pregnant women and fetuses resulted in maximum peak SAR(10 g) values close to the values suggested by the guidelines (2 W/kg) in some of the exposure scenarios with the antenna positioned in front of the abdomen and with a 100% duty cycle and 1 W radiated power. Copyright © 2013 Wiley Periodicals, Inc.
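
    The reported quantity, the specific absorption rate, is defined point-wise as SAR = σ|E|²/ρ; the sketch below simply evaluates that definition for an illustrative tissue voxel. The tissue values and field strength are assumptions, not taken from the study.

        # Local SAR from tissue conductivity, RMS electric field and mass density.
        def sar_w_per_kg(sigma_s_per_m, e_rms_v_per_m, density_kg_per_m3):
            """SAR = sigma * |E_rms|^2 / rho, in W/kg."""
            return sigma_s_per_m * e_rms_v_per_m**2 / density_kg_per_m3

        # Muscle-like tissue at UHF: sigma ~ 0.9 S/m, rho ~ 1050 kg/m3, E_rms = 30 V/m
        print(round(sar_w_per_kg(0.9, 30.0, 1050.0), 3), "W/kg")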

  1. Measuring and modeling exposure from environmental radiation on tidal flats

    International Nuclear Information System (INIS)

    Gould, T.J.; Hess, C.T.

    2005-01-01

    To examine the shielding effects of the tide cycle, a high pressure ion chamber was used to measure the exposure rate from environmental radiation on tidal flats. A theoretical model is derived to predict the behavior of exposure rate as a function of time for a detector placed one meter above ground on a tidal flat. The numerical integration involved in this derivation results in an empirical formula which implies exposure rate ∝ tan⁻¹(sin t). We propose that calculating the total exposure incurred on a tidal flat requires measurements of only the slope of the tidal flat and the exposure rate when no shielding occurs. Experimental results are consistent with the model
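
    As a numerical illustration of the quoted empirical form, the sketch below evaluates an exposure rate proportional to tan⁻¹(sin t) and integrates it over two tidal cycles. The unshielded rate, the scaling to the [0, unshielded] range and the tidal period are assumed for illustration only, not the authors' fitted model.

        # Illustrative exposure-rate curve of the form arctan(sin(t)), rescaled to swing
        # between ~0 (high tide, fully shielded) and an assumed unshielded low-tide rate.
        import numpy as np

        def exposure_rate_uR_per_h(t_h, unshielded=10.0, period_h=12.42):
            """Exposure rate one metre above a tidal flat as a function of time (hours)."""
            phase = 2 * np.pi * t_h / period_h
            return unshielded * (np.arctan(np.sin(phase)) / np.arctan(1.0) + 1.0) / 2.0

        t = np.linspace(0.0, 24.84, 1000)             # two tidal cycles
        rate = exposure_rate_uR_per_h(t)
        total_uR = np.sum((rate[1:] + rate[:-1]) / 2 * np.diff(t))  # trapezoidal integral
        print(round(total_uR, 1), "uR over two tidal cycles")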

  2. Literature review on induced exposure models, Task 2 HS-270

    Science.gov (United States)

    1982-02-01

    Sections 1, 2 and 3 of this report describe the development of induced exposure models, together with a discussion of questions of validity. These sections focus on the most important and relevant results from the literature, while Appendix A c...

  3. Computational algorithm for lifetime exposure to antimicrobials in pigs using register data - The LEA algorithm.

    Science.gov (United States)

    Birkegård, Anna Camilla; Andersen, Vibe Dalhoff; Halasa, Tariq; Jensen, Vibeke Frøkjær; Toft, Nils; Vigre, Håkan

    2017-10-01

    Accurate and detailed data on antimicrobial exposure in pig production are essential when studying the association between antimicrobial exposure and antimicrobial resistance. Due to difficulties in obtaining primary data on antimicrobial exposure in a large number of farms, there is a need for a robust and valid method to estimate the exposure using register data. An approach that estimates the antimicrobial exposure in every rearing period during the lifetime of a pig using register data was developed into a computational algorithm. In this approach data from national registers on antimicrobial purchases, movements of pigs and farm demographics registered at farm level are used. The algorithm traces batches of pigs retrospectively from slaughter to the farm(s) that housed the pigs during their finisher, weaner, and piglet period. Subsequently, the algorithm estimates the antimicrobial exposure as the number of Animal Defined Daily Doses for treatment of one kg pig in each of the rearing periods. Thus, the antimicrobial purchase data at farm level are translated into antimicrobial exposure estimates at batch level. A batch of pigs is defined here as pigs sent to slaughter on the same day from the same farm. In this study we present, validate, and optimise a computational algorithm that calculates the lifetime exposure of antimicrobials for slaughter pigs. The algorithm was evaluated by comparing the computed estimates to data on antimicrobial usage from farm records in 15 farm units. We found a good positive correlation between the two estimates. The algorithm was run for Danish slaughter pigs sent to slaughter in January to March 2015 from farms with more than 200 finishers to estimate the proportion of farms that it was applicable for. In the final process, the algorithm was successfully run for batches of pigs originating from 3026 farms with finisher units (77% of the initial population). This number can be increased if more accurate register data can be
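
    As a toy illustration of the dose metric described above, the sketch below converts a single antimicrobial purchase into Animal Defined Daily Doses per kg of pig present in a rearing period. The ADD value, pig weights and herd size are invented, and the real algorithm additionally traces batches through the movement and demographic registers.

        # Convert an antimicrobial purchase into ADDs per kg pig-day for one rearing period.
        def add_per_kg(amount_mg, add_mg_per_kg, kg_pig_days):
            """ADDs for treating 1 kg of pig, normalised by the biomass-days present."""
            total_adds = amount_mg / add_mg_per_kg   # number of 1-kg-pig daily doses in the purchase
            return total_adds / kg_pig_days          # exposure per kg pig-day

        # 500 g of a product dosed at 10 mg per kg pig per day, used in a finisher unit
        # holding on average 1000 pigs of 70 kg over a 30-day period (all values invented)
        print(round(add_per_kg(500_000, 10.0, 1000 * 70 * 30), 5), "ADD per kg pig-day")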

  4. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  5. Computer model for harmonic ultrasound imaging.

    Science.gov (United States)

    Li, Y; Zagzebski, J A

    2000-01-01

    Harmonic ultrasound imaging has received great attention from ultrasound scanner manufacturers and researchers. In this paper, we present a computer model that can generate realistic harmonic images. In this model, the incident ultrasound is modeled after the "KZK" equation, and the echo signal is modeled using linear propagation theory because the echo signal is much weaker than the incident pulse. Both time domain and frequency domain numerical solutions to the "KZK" equation were studied. Realistic harmonic images of spherical lesion phantoms were generated for scans by a circular transducer. This model can be a very useful tool for studying the harmonic buildup and dissipation processes in a nonlinear medium, and it can be used to investigate a wide variety of topics related to B-mode harmonic imaging.

  6. Computer modelling of superconductive fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Weller, R.A.; Campbell, A.M.; Coombs, T.A.; Cardwell, D.A.; Storey, R.J. [Cambridge Univ. (United Kingdom). Interdisciplinary Research Centre in Superconductivity (IRC); Hancox, J. [Rolls Royce, Applied Science Division, Derby (United Kingdom)

    1998-05-01

    Investigations are being carried out on the use of superconductors for fault current limiting applications. A number of computer programs are being developed to predict the behavior of different `resistive` fault current limiter designs under a variety of fault conditions. The programs achieve solution by iterative methods based around real measured data rather than theoretical models in order to achieve accuracy at high current densities. (orig.) 5 refs.

  7. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. For permission

  8. Comprehensive European dietary exposure model (CEDEM) for food additives.

    Science.gov (United States)

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
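
    A hedged sketch of the general deterministic calculation such models perform, summing consumption times use level over food groups and dividing by body weight, is given below. The food groups, use levels and the 60 kg body weight are illustrative assumptions, not CEDEM inputs or the EFSA algorithm itself.

        # Deterministic additive-exposure estimate: sum(consumption x use level) / body weight.
        def dietary_exposure_mg_per_kg_bw_day(consumption_g_per_day, use_level_mg_per_kg_food,
                                              body_weight_kg=60.0):
            total_mg = sum(c / 1000.0 * level           # g -> kg food, then mg additive
                           for c, level in zip(consumption_g_per_day, use_level_mg_per_kg_food))
            return total_mg / body_weight_kg

        consumption = [250.0, 120.0, 40.0]   # e.g. beverages, dairy desserts, confectionery (g/day)
        use_levels = [150.0, 300.0, 500.0]   # assumed use levels (mg/kg food)
        print(round(dietary_exposure_mg_per_kg_bw_day(consumption, use_levels), 2), "mg/kg bw/day")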

  9. The study of radiographic technique with low exposure using computed panoramic tomography

    International Nuclear Information System (INIS)

    Saito, Yasuhiro

    1987-01-01

    A new imaging system for the dental field that combines recent advances in both electronics and computer technologies was developed. This new imaging system is a computed panoramic tomography process based on a newly developed laser-scan system. In this study a quantitative image evaluation was performed comparing anatomical landmarks in computed panoramic tomography at a low exposure (LPT) and in conventional panoramic tomography at a routine exposure (CPT), and the following results were obtained: 1. The diagnostic value of the CPT decreased with decreasing exposure, particularly with regard to the normal anatomical landmarks of such microstructural parts as the periodontal space, lamina dura and the enamel-dentin border. 2. The LPT had high diagnostic value for all normal anatomical landmarks, averaging about twice as valuable diagnostically as the CPT. 3. The visual diagnostic value of the periodontal space, lamina dura, enamel-dentin border and the anatomical morphology of the teeth on the LPT was only slightly dependent on the spatial frequency enhancement rank. 4. The LPT formed images with almost the same range of density as the CPT. 5. Computed panoramic tomographs taken at a low exposure revealed more information about the trabecular bone pattern than conventional panoramic tomographs taken under routine conditions in the visual spatial frequency range (0.1 - 5.0 cycle/mm). (author) 67 refs.

  10. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application

  11. Modeling exposure to persistent chemicals in hazard and risk assessment.

    Science.gov (United States)

    Cowan-Ellsberry, Christina E; McLachlan, Michael S; Arnot, Jon A; Macleod, Matthew; McKone, Thomas E; Wania, Frank

    2009-10-01

    Fate and exposure modeling has not, thus far, been explicitly used in the risk profile documents prepared for evaluating the significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of persistent organic pollutants (POP) and persistent, bioaccumulative, and toxic (PBT) chemicals in the environment. The goal of this publication is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include 1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; 2) directly estimating the exposure of the environment, biota, and humans to provide information to complement measurements or where measurements are not available or are limited; 3) to identify the key processes and chemical or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and 4) forecasting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and

  12. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
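
    A classic example of the equation-level modeling such a book introduces is the M/M/1 queue, where utilisation and mean response time follow directly from the arrival and service rates. The sketch below evaluates those standard formulas for arbitrary rates; it is not taken from the book.

        # M/M/1 queue metrics from arrival rate (lambda) and service rate (mu).
        def mm1_metrics(arrival_rate, service_rate):
            assert arrival_rate < service_rate, "queue is unstable"
            rho = arrival_rate / service_rate           # utilisation
            mean_response = 1.0 / (service_rate - arrival_rate)
            mean_in_system = rho / (1.0 - rho)          # mean number of jobs in the system
            return rho, mean_response, mean_in_system

        rho, r, n = mm1_metrics(arrival_rate=80.0, service_rate=100.0)  # requests/s
        print(f"utilisation={rho:.0%}, mean response={r*1000:.1f} ms, mean in system={n:.1f}")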

  13. The deterministic computational modelling of radioactivity

    International Nuclear Information System (INIS)

    Damasceno, Ralf M.; Barros, Ricardo C.

    2009-01-01

    This paper describes a computational application (software) that models simple radioactive decay, stable nuclei decay, and directly coupled chain decay with an upper limit of thirteen radioactive decays, together with an internal data bank of the decay constants of the various existing decays. This considerably facilitates use of the program by people who are not connected to the nuclear area and do not have detailed knowledge of it. The paper presents numerical results for typical model problems
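
    As a minimal sketch of the kind of chain-decay calculation such a program performs, the example below evaluates the Bateman solution for a two-member chain A → B → stable. The half-lives and initial inventory are illustrative only.

        # Two-member decay chain A -> B -> stable, solved with the Bateman equations.
        import numpy as np

        def chain_inventory(t_h, n_a0, half_life_a_h, half_life_b_h):
            """Number of A and B atoms at time t (hours)."""
            la = np.log(2.0) / half_life_a_h
            lb = np.log(2.0) / half_life_b_h
            n_a = n_a0 * np.exp(-la * t_h)
            n_b = n_a0 * la / (lb - la) * (np.exp(-la * t_h) - np.exp(-lb * t_h))  # Bateman
            return n_a, n_b

        t = np.linspace(0.0, 48.0, 5)
        for ti, (na, nb) in zip(t, zip(*chain_inventory(t, 1e6, 6.0, 2.0))):
            print(f"t={ti:5.1f} h  N_A={na:10.0f}  N_B={nb:10.0f}")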

  14. Cloud Computing, Tieto Cloud Server Model

    OpenAIRE

    Suikkanen, Saara

    2013-01-01

    The purpose of this study is to find out what cloud computing is. To be able to make wise decisions when moving to the cloud or considering it, companies need to understand what the cloud consists of: which model suits their company best, what should be taken into account before moving to the cloud, what the cloud broker role is, and what a SWOT analysis of the cloud shows. To be able to answer customer requirements and business demands, IT companies should develop and produce new service models. IT house T...

  15. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times the execution time of the reference model for each response of interest; for a single response this is far less than the roughly 3000 model runs that would otherwise be needed to determine the derivatives by parameter perturbations. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.
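
    A hedged illustration of the adjoint idea (not ADGEN itself) is sketched below: for a response J = cᵀx with A(p)x = b, a single adjoint solve gives the derivative of J with respect to every parameter, instead of one perturbed model run per parameter. The matrices and parameters are invented for the example.

        # Adjoint sensitivities for J = c^T x with A(p) x = b:
        # solve A^T lam = c once, then dJ/dp_i = -lam^T (dA/dp_i) x for every parameter.
        import numpy as np

        p = np.array([2.0, 0.5])                         # two illustrative parameters
        A = np.array([[p[0], 1.0], [1.0, p[1] + 3.0]])   # model operator depending on p
        b = np.array([1.0, 2.0])
        c = np.array([1.0, 0.0])                         # response picks out x[0]

        x = np.linalg.solve(A, b)                        # forward solve
        lam = np.linalg.solve(A.T, c)                    # single adjoint solve

        dA_dp = [np.array([[1.0, 0.0], [0.0, 0.0]]),     # dA/dp0
                 np.array([[0.0, 0.0], [0.0, 1.0]])]     # dA/dp1
        grad = np.array([-(lam @ dAi @ x) for dAi in dA_dp])
        print("dJ/dp via adjoint:", grad)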

  16. ADGEN: ADjoint GENerator for computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times the execution time of the reference model for each response of interest; for a single response this is far less than the roughly 3000 model runs that would otherwise be needed to determine the derivatives by parameter perturbations. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs

  17. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine how the computational processes within this field can be advanced to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches we have developed and studied over recent years. The outcome is a set of new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by both responsibility towards processes and the consequences they initiate.

  18. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depend on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood flowing through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous in vitro microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  19. Advanced computational modeling for in vitro nanomaterial dosimetry.

    Science.gov (United States)

    DeLoid, Glen M; Cohen, Joel M; Pyrgiotakis, Georgios; Pirela, Sandra V; Pal, Anoop; Liu, Jiying; Srebric, Jelena; Demokritou, Philip

    2015-10-24

    Accurate and meaningful dose metrics are a basic requirement for in vitro screening to assess potential health risks of engineered nanomaterials (ENMs). Correctly and consistently quantifying what cells "see" during an in vitro exposure requires standardized preparation of stable ENM suspensions, accurate characterization of agglomerate sizes and effective densities, and predictive modeling of mass transport. Earlier transport models provided a marked improvement over administered concentration or total mass, but included assumptions that could produce sizable inaccuracies, most notably that all particles at the bottom of the well are adsorbed or taken up by cells, which would drive transport downward and result in overestimation of deposition. Here we present the development, validation and results of two robust computational transport models. Both a three-dimensional computational fluid dynamics (CFD) model and a newly developed one-dimensional Distorted Grid (DG) model were used to estimate delivered dose metrics for industry-relevant metal oxide ENMs suspended in culture media. Both models allow simultaneous modeling of full size distributions for polydisperse ENM suspensions, and provide deposition metrics as well as concentration metrics over the extent of the well. The DG model also emulates the biokinetics at the particle-cell interface using a Langmuir isotherm, governed by a user-defined dissociation constant, K(D), and allows modeling of ENM dissolution over time. Dose metrics predicted by the two models were in remarkably close agreement. The DG model was also validated by quantitative analysis of flash-frozen, cryosectioned columns of ENM suspensions. Results of simulations based on agglomerate size distributions differed substantially from those obtained using mean sizes. The effect of cellular adsorption on delivered dose was negligible for K(D) values consistent with non-specific binding (> 1 nM), whereas smaller values (≤ 1 nM) typical of specific high
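
    For illustration only, the sketch below solves a one-dimensional sedimentation-diffusion problem of the kind such transport models address, using explicit finite differences and a perfectly absorbing bottom boundary in place of the DG model's Langmuir-isotherm cell interface; the agglomerate size, effective density, media height and exposure time are hypothetical placeholders, not values from the study.

    ```python
    import numpy as np

    def delivered_fraction(h=3e-3, d=250e-9, rho_eff=1400.0, rho_m=1000.0,
                           mu=9.0e-4, T=310.0, t_end=24 * 3600.0, nz=120):
        """Fraction of initially suspended mass deposited at the well bottom after t_end
        seconds, for a monodisperse suspension and a perfectly absorbing bottom."""
        kB = 1.380649e-23
        D = kB * T / (3.0 * np.pi * mu * d)                  # Stokes-Einstein diffusivity
        v = (rho_eff - rho_m) * 9.81 * d**2 / (18.0 * mu)    # Stokes settling velocity
        dz = h / nz
        dt = 0.25 * min(dz * dz / D, dz / v)                 # explicit stability limit
        c = np.ones(nz)                                      # normalized concentration per cell
        m0 = c.sum()
        for _ in range(int(t_end / dt)):
            f_adv = np.zeros(nz + 1)                         # downward flux at cell faces
            f_adv[1:] = v * c                                # upwind: use the cell above each face
            f_dif = np.zeros(nz + 1)
            f_dif[1:-1] = -D * (c[1:] - c[:-1]) / dz         # interior diffusive flux
            f_dif[-1] = -D * (0.0 - c[-1]) / (0.5 * dz)      # absorbing wall (c = 0 at the bottom)
            flux = f_adv + f_dif
            c = np.clip(c - (dt / dz) * (flux[1:] - flux[:-1]), 0.0, None)
        return 1.0 - c.sum() / m0

    print(delivered_fraction())                              # deposited mass fraction after 24 h
    ```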

  20. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  1. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of how ligands interact with this receptor, is a subject of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models built on different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study were: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. For the tested DOR models, the best correlations were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from the docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation therefore suggests a reliable model of DOR. The newly generated DOR model can be used for further in silico experiments and should enable faster and more accurate design of selective and effective ligands for the δ-opioid receptor.
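
    A minimal sketch of the correlation step described above, using made-up placeholder numbers rather than the study's docking scores or efficacies:

    ```python
    # Hypothetical Fitness scores and relative efficacies for five ligands (placeholders).
    from scipy.stats import pearsonr

    fitness = [55.2, 48.7, 60.1, 42.3, 51.0]
    erel = [0.45, 0.70, 0.30, 0.95, 0.60]

    r, p = pearsonr(fitness, erel)
    print(f"Pearson r = {r:.3f}, p = {p:.4f}")
    ```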

  2. Modelling of individual subject ozone exposure response kinetics.

    Science.gov (United States)

    Schelegle, Edward S; Adams, William C; Walby, William F; Marion, M Susan

    2012-06-01

    A better understanding of individual subject ozone (O(3)) exposure-response kinetics will provide insight into how to improve models used in the risk assessment of ambient ozone exposure. The objective was to develop a simple two-compartment exposure-response model that describes individual subject decrements in forced expiratory volume in one second (FEV(1)) induced by acute inhalation of O(3) lasting up to 8 h. FEV(1) measurements of 220 subjects who participated in 14 previously completed studies were fit to the model using both particle swarm and nonlinear least squares optimization techniques to identify three subject-specific coefficients producing minimum "global" and local errors, respectively. Observed and predicted decrements in FEV(1) of the 220 subjects were used for validation of the model. Further validation was provided by comparing the observed O(3)-induced FEV(1) decrements in an additional eight studies with predicted values obtained using model coefficients estimated from the 220 subjects used in cross-validation. Overall, the measured and modeled individual subject FEV(1) decrements were highly correlated (mean R(2) of 0.69 ± 0.24). In addition, it was shown that a matrix of individual subject model coefficients can be used to predict the mean and variance of group decrements in FEV(1). This modeling approach provides insight into individual subject O(3) exposure-response kinetics and a potential starting point for improving the risk assessment of environmental O(3) exposure.
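
    Purely as an illustration of fitting subject-specific coefficients by nonlinear least squares, the sketch below uses a generic first-order compartment with a threshold as the response function; this stand-in and all of the data are hypothetical and should not be read as the authors' two-compartment formulation.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def fev1_decrement(params, t, dose_rate):
        """Generic stand-in response: a compartment builds up with dose and decays,
        and the FEV1 decrement is proportional to the amount above a threshold."""
        a, k, threshold = params
        x = np.zeros_like(t)
        for i in range(1, t.size):
            dt = t[i] - t[i - 1]
            x[i] = x[i - 1] + dt * (dose_rate[i - 1] - k * x[i - 1])
        return a * np.maximum(x - threshold, 0.0)

    t = np.linspace(0.0, 8.0, 33)                              # hours
    dose_rate = np.where(t < 6.6, 1.0, 0.0)                    # exposure, then recovery (arbitrary units)
    observed = fev1_decrement([8.0, 0.4, 0.8], t, dose_rate)   # synthetic "subject"
    observed += np.random.default_rng(1).normal(0.0, 0.5, t.size)

    fit = least_squares(lambda prm: fev1_decrement(prm, t, dose_rate) - observed,
                        x0=[5.0, 0.2, 0.5], bounds=([0, 0, 0], [50, 5, 5]))
    print(fit.x)                                               # recovered subject-specific coefficients
    ```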

  3. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg/kg TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  4. Computer models for optimizing radiation therapy

    International Nuclear Information System (INIS)

    Duechting, W.

    1998-01-01

    The aim of this contribution is to outline how methods of system analysis, control theory and modelling can be applied to simulate normal and malignant cell growth and to optimize cancer treatment such as radiation therapy. Based on biological observations and cell kinetic data, several types of models have been developed describing the growth of tumor spheroids and the cell renewal of normal tissue. The irradiation model is represented by the so-called linear-quadratic model describing the survival fraction as a function of the dose. Based thereon, numerous simulation runs for different treatment schemes can be performed. Thus, it is possible to study the radiation effect on tumor and normal tissue separately. Finally, this method enables a computer-assisted recommendation for an optimal patient-specific treatment schedule prior to clinical therapy. (orig.)
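
    The linear-quadratic survival fraction mentioned above, S = exp[-n(alpha*d + beta*d^2)] for n fractions of dose d, can be evaluated directly; the alpha and beta values below are generic illustrative numbers, not tissue-specific data from this work.

    ```python
    import numpy as np

    def surviving_fraction(dose_per_fraction, n_fractions, alpha=0.3, beta=0.03):
        """Linear-quadratic cell survival after n equal fractions (alpha in 1/Gy, beta in 1/Gy^2)."""
        d = dose_per_fraction
        return np.exp(-n_fractions * (alpha * d + beta * d * d))

    # Compare a conventional 30 x 2 Gy schedule with a hypofractionated 15 x 3 Gy one.
    for d, n in [(2.0, 30), (3.0, 15)]:
        print(f"{n} x {d} Gy: surviving fraction = {surviving_fraction(d, n):.2e}")
    ```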

  5. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.

    2010-08-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models coupled to fire behavior models to simulate fire behavior. NWP models are capable of modeling very high resolution (< 100 m) atmospheric flows. The wildland fire component is based upon semi-empirical formulas for fireline rate of spread, post-frontal heat release, and a canopy fire. The fire behavior is coupled to the atmospheric model such that low level winds drive the spread of the surface fire, which in turn releases sensible heat, latent heat, and smoke fluxes into the lower atmosphere, feeding back to affect the winds directing the fire. These coupled dynamic models capture the rapid spread downwind, flank runs up canyons, bifurcations of the fire into two heads, and rough agreement in area, shape, and direction of spread at periods for which fire location data is available. Yet, intriguing computational science questions arise in applying such models in a predictive manner, including physical processes that span a vast range of scales, processes such as spotting that cannot be modeled deterministically, estimating the consequences of uncertainty, the efforts to steer simulations with field data ("data assimilation"), lingering issues with short term forecasting of weather that may show skill only on the order of a few hours, and the difficulty of gathering pertinent data for verification and initialization in a dangerous environment. © 2010 IEEE.

  6. The modelling of external exposure and inhalation pathways in COSYMA

    International Nuclear Information System (INIS)

    Brown, J.; Simmonds, JR.; Ehrhardt, J.; Hasemann, I.

    1991-01-01

    Following an accidental release of radionuclides to the atmosphere, the major direct exposure pathways of concern are: external irradiation from material in the cloud; internal exposure following inhalation of material in the cloud; external irradiation from material deposited on the ground; and external irradiation due to contamination of skin and clothes. In addition, material resuspended from the ground can be inhaled and lead to internal exposure. In this paper the way that these exposure pathways are modelled in COSYMA is described. At present in COSYMA, external exposure from deposited material is modelled using a dataset of doses per unit deposit of various radionuclides. This dataset is based on activity deposited on undisturbed soil. The basic data are for doses outdoors, and shielding factors are used to estimate doses for people indoors. Various groups of people spending different amounts of time indoors and outdoors can be considered, and shielding factors appropriate to three building types can be adopted. A more complex model has also been developed to predict radiation exposure following deposition to different surfaces in the environment. This model, called EXPURT, is briefly described in this paper. Using EXPURT, doses as a function of time after a single deposit have been calculated for people living in three types of area. These results are described in the paper and compared with those that are currently used in COSYMA. The paper also discusses what future work is required in this area and the adequacy of existing models.
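
    A minimal sketch of the dose-per-unit-deposit approach described above; the deposition, dose coefficient, occupancy fraction and shielding factor are hypothetical placeholders rather than COSYMA or EXPURT data.

    ```python
    def groundshine_dose(deposition_bq_m2, dose_per_unit_deposit_sv_per_bq_m2,
                         fraction_indoors=0.9, shielding_factor=0.2):
        """Dose from deposited activity for one population group: the outdoor dose is
        unshielded, the indoor dose is reduced by a building-type shielding factor."""
        occupancy_weight = (1.0 - fraction_indoors) + fraction_indoors * shielding_factor
        return deposition_bq_m2 * dose_per_unit_deposit_sv_per_bq_m2 * occupancy_weight

    # Example: 1e5 Bq/m2 of a nuclide with an integrated coefficient of 1e-8 Sv per (Bq/m2).
    print(groundshine_dose(1e5, 1e-8), "Sv")
    ```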

  7. Modelling survival: exposure pattern, species sensitivity and uncertainty.

    Science.gov (United States)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight; Cedergreen, Nina; Charles, Sandrine; Ducrot, Virginie; Focks, Andreas; Gabsi, Faten; Gergs, André; Goussen, Benoit; Jager, Tjalling; Kramer, Nynke I; Nyman, Anna-Maija; Poulsen, Veronique; Reichenberger, Stefan; Schäfer, Ralf B; Van den Brink, Paul J; Veltman, Karin; Vogel, Sören; Zimmer, Elke I; Preuss, Thomas G

    2016-07-06

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test the ability of GUTS to predict survival of aquatic organisms across different pesticide exposure patterns, time scales and species. Firstly, using synthetic data, we identified experimental data requirements which allow for the estimation of all parameters of the GUTS proper model. Secondly, we assessed how well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates to build species sensitivity distributions for different exposure patterns. We find that GUTS adequately predicts survival across exposure patterns that vary over time. When toxicity is assessed for time-variable concentrations species may differ in their responses depending on the exposure profile. This can result in different species sensitivity rankings and safe levels. The interplay of exposure pattern and species sensitivity deserves systematic investigation in order to better understand how organisms respond to stress, including humans.
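
    As a rough illustration, the sketch below implements one commonly used reduced GUTS variant (scaled damage driving a stochastic-death hazard); it is not the calibrated model from this study, and all parameter values are placeholders.

    ```python
    import numpy as np

    def guts_red_sd_survival(conc, dt, kd=0.5, kk=0.3, z=2.0, hb=0.01):
        """Survival probability over time for a time-variable exposure profile.
        conc: external concentration per time step of length dt (days); kd: dominant
        rate constant, kk: killing rate, z: damage threshold, hb: background hazard."""
        D, H, survival = 0.0, 0.0, [1.0]
        for c in conc:
            D += dt * kd * (c - D)                    # toxicokinetics: scaled damage
            H += dt * (kk * max(D - z, 0.0) + hb)     # toxicodynamics: cumulative hazard
            survival.append(np.exp(-H))
        return np.array(survival)

    # Example: a 2-day exposure pulse followed by 8 days of clean water (dt = 0.1 d).
    conc = np.array([5.0] * 20 + [0.0] * 80)
    print(guts_red_sd_survival(conc, dt=0.1)[-1])     # predicted survival at day 10
    ```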

  8. Neurotoxicity in Preclinical Models of Occupational Exposure to Organophosphorus Compounds

    Science.gov (United States)

    Voorhees, Jaymie R.; Rohlman, Diane S.; Lein, Pamela J.; Pieper, Andrew A.

    2017-01-01

    Organophosphorus (OP) compounds are widely used as insecticides, plasticizers, and fuel additives. These compounds potently inhibit acetylcholinesterase (AChE), the enzyme that inactivates acetylcholine at neuronal synapses, and acute exposure to high OP levels can cause cholinergic crisis in humans and animals. Evidence further suggests that repeated exposure to lower OP levels insufficient to cause cholinergic crisis, frequently encountered in the occupational setting, also poses serious risks to people. For example, multiple epidemiological studies have identified associations between occupational OP exposure and neurodegenerative disease, psychiatric illness, and sensorimotor deficits. Rigorous scientific investigation of the basic science mechanisms underlying these epidemiological findings requires valid preclinical models in which tightly-regulated exposure paradigms can be correlated with neurotoxicity. Here, we review the experimental models of occupational OP exposure currently used in the field. We found that animal studies simulating occupational OP exposures do indeed show evidence of neurotoxicity, and that utilization of these models is helping illuminate the mechanisms underlying OP-induced neurological sequelae. Still, further work is necessary to evaluate exposure levels, protection methods, and treatment strategies, which taken together could serve to modify guidelines for improving workplace conditions globally. PMID:28149268

  9. Computational and Organotypic Modeling of Microcephaly (Teratology Society)

    Science.gov (United States)

    Microcephaly is associated with reduced cortical surface area and ventricular dilations. Many genetic and environmental factors precipitate this malformation, including prenatal alcohol exposure and maternal Zika infection. This complexity motivates the engineering of computation...

  10. Modelling indoor exposure to natural radioactive nuclides

    International Nuclear Information System (INIS)

    Hofmann, W.; Daschil, F.

    1986-01-01

    Radon enters buildings from several sources, primarily from building materials and from the soil or rocks that underlie or surround building foundations. A multicompartment model was developed which describes the fate of radon and attached or free radon decay products in a model room by a set of linear time-dependent differential equations. Time-dependent coefficients allow temporal parameter changes to be modelled, e.g. the opening of windows or a sudden pressure drop leading to enhanced exhalation. While steady-state models were used to study the effect of parameter changes on steady-state nuclide concentrations, the time-dependent models provided additional information on the time scale of these changes. (author)
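
    A minimal sketch in the spirit of the compartment model described above: one radon compartment and one aggregated decay-product compartment, with a time-dependent ventilation coefficient representing the opening of windows; the entry rate and loss constants are hypothetical placeholders.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    LAMBDA_RN = 7.55e-3      # radon-222 decay constant [1/h]
    LAMBDA_D = 0.5           # aggregated progeny decay/deposition loss [1/h] (placeholder)
    ENTRY = 10.0             # radon entry rate [Bq m-3 h-1] (placeholder)

    def ventilation(t):
        """Air-exchange rate [1/h]; windows are opened between t = 8 h and t = 9 h."""
        return 2.0 if 8.0 <= t <= 9.0 else 0.3

    def rhs(t, y):
        rn, prog = y
        lv = ventilation(t)
        d_rn = ENTRY - (LAMBDA_RN + lv) * rn
        d_prog = LAMBDA_RN * rn - (LAMBDA_D + lv) * prog
        return [d_rn, d_prog]

    sol = solve_ivp(rhs, (0.0, 24.0), [40.0, 15.0], max_step=0.05)
    print(sol.y[:, -1])      # radon and progeny concentrations [Bq m-3] at the end of the day
    ```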

  11. Underwater Sound Propagation Modeling Methods for Predicting Marine Animal Exposure.

    Science.gov (United States)

    Hamm, Craig A; McCammon, Diana F; Taillefer, Martin L

    2016-01-01

    The offshore exploration and production (E&P) industry requires comprehensive and accurate ocean acoustic models for determining the exposure of marine life to the high levels of sound used in seismic surveys and other E&P activities. This paper reviews the types of acoustic models most useful for predicting the propagation of undersea noise sources and describes current exposure models. The severe problems caused by model sensitivity to the uncertainty in the environment are highlighted to support the conclusion that it is vital that risk assessments include transmission loss estimates with statistical measures of confidence.
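
    For orientation only, the back-of-envelope sketch below uses spherical spreading plus an absorption term (TL = 20 log10 r + alpha r); real exposure assessments rely on the full propagation models reviewed above, whose outputs are far more sensitive to the environment, and the source level and absorption coefficient here are illustrative placeholders.

    ```python
    import numpy as np

    def transmission_loss_db(range_m, alpha_db_per_km=0.5):
        """Spherical spreading plus absorption; range in metres, alpha in dB/km."""
        r = np.asarray(range_m, dtype=float)
        return 20.0 * np.log10(r) + alpha_db_per_km * r / 1000.0

    source_level_db = 230.0                       # dB re 1 uPa at 1 m (illustrative seismic source)
    ranges_m = np.array([100.0, 1000.0, 10000.0])
    received_db = source_level_db - transmission_loss_db(ranges_m)
    for r, rl in zip(ranges_m, received_db):
        print(f"{r:7.0f} m : received level {rl:6.1f} dB")
    ```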

  12. Development of a computational model for astronaut reorientation.

    Science.gov (United States)

    Stirling, Leia; Willcox, Karen; Newman, Dava

    2010-08-26

    The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration. 2010 Elsevier Ltd. All rights reserved.

  13. Computer models in the design of FXR

    International Nuclear Information System (INIS)

    Vogtlin, G.; Kuenning, R.

    1980-01-01

    Lawrence Livermore National Laboratory is developing a 15 to 20 MeV electron accelerator with a beam current goal of 4 kA. This accelerator will be used for flash radiography and has a requirement of high reliability. Components being developed include spark gaps, Marx generators, water Blumleins and oil insulation systems. A SCEPTRE model was developed that takes into consideration the non-linearity of the ferrite and the time dependency of the emission from a field emitter cathode. This model was used to predict an optimum charge time to obtain maximum magnetic flux change from the ferrite. This model and its application will be discussed. JASON was used extensively to determine optimum locations and shapes of supports and insulators. It was also used to determine stress within bubbles adjacent to walls in oil. Computer results will be shown and bubble breakdown will be related to bubble size

  14. Computational modeling of a forward lunge

    DEFF Research Database (Denmark)

    Alkjær, Tine; Wieland, Maja Rose; Andersen, Michael Skipper

    2012-01-01

    The purpose of the present study was to establish a musculoskeletal model of the forward lunge to computationally investigate the complete mechanical force equilibrium of the tibia during the movement and to examine the loading pattern of the cruciate ligaments. A healthy female was selected from a group of healthy subjects who all performed a forward lunge on a force platform, targeting a knee flexion angle of 90°. Skin-markers were placed on anatomical landmarks on the subject and the movement was recorded by five video cameras. The three-dimensional kinematic data describing the forward lunge movement were extracted and used to develop a biomechanical model of the lunge movement. The model comprised two legs including femur, crus, rigid foot segments and the pelvis. Each leg had 35 independent muscle units, which were recruited according to a minimum fatigue criterion...

  15. Computational fluid dynamic modelling of cavitation

    Science.gov (United States)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  16. Hydroquinone PBPK model refinement and application to dermal exposure.

    Science.gov (United States)

    Poet, Torka S; Carlton, Betsy D; Deyo, James A; Hinderliter, Paul M

    2010-11-01

    A physiologically based pharmacokinetic (PBPK) model for hydroquinone (HQ) was refined to include an expanded description of HQ-glucuronide metabolites and a description of dermal exposures to support route-to-route and cross-species extrapolation. Total urinary excretion of metabolites from in vivo rat dermal exposures was used to estimate a percutaneous permeability coefficient (K(p); 3.6×10(-5) cm/h). The human in vivo K(p) was estimated to be 1.62×10(-4) cm/h, based on in vitro skin permeability data in rats and humans and rat in vivo values. The projected total multi-substituted glutathione (which was used as an internal dose surrogate for the toxic glutathione metabolites) was modeled following an exposure scenario based on submersion of both hands in a 5% aqueous solution of HQ (similar to black and white photographic developing solution) for 2 h, a worst-case exposure scenario. Total multi-substituted glutathione following this human dermal exposure scenario was several orders of magnitude lower than the internal total glutathione conjugates in rats following an oral exposure to the rat NOEL of 20 mg/kg. Thus, under more realistic human dermal exposure conditions, it is unlikely that toxic glutathione conjugates (primarily the di- and, to a lesser degree, the tri-glutathione conjugate) will reach significant levels in target tissues. Copyright © 2010. Published by Elsevier Ltd.
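
    For orientation, the steady-state flux approximation J = K(p) × C can be used to gauge dermal uptake from the reported permeability coefficient; the exposed skin area, and the use of the simple flux law itself, are simplifying assumptions here and not part of the paper's PBPK model.

    ```python
    def dermal_uptake_mg(kp_cm_per_h, conc_mg_per_cm3, area_cm2, hours):
        """Amount absorbed through skin assuming steady-state flux J = Kp * C."""
        return kp_cm_per_h * conc_mg_per_cm3 * area_cm2 * hours

    # Both hands (assumed ~840 cm2) submerged for 2 h in a 5% (w/v) HQ solution
    # (50 mg/cm3), using the human Kp of 1.62e-4 cm/h estimated in the study.
    print(dermal_uptake_mg(1.62e-4, 50.0, 840.0, 2.0), "mg")
    ```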

  17. Reduction of exposure of pet staff by computer-aided injection of radiotracers

    International Nuclear Information System (INIS)

    Balduyck, S.; Sarracanie, M.; Trevisan, L.

    2006-01-01

    Full text of publication follows: Positron Emission Tomography (PET) is now widely used, and the everyday employment of positron emitters has brought new needs for radiation protection. Analysis of the dosimetry of PET staff in the course of medical examinations showed that the injection step is the most irradiating part. The necessity to monitor the safety of the patient during the injection of the radiotracer prevents us from reducing exposure time or inserting complete shielding between the source and the operator. To lower the dose by increasing the distance, we designed a system based on two syringe pumps: one for the tracer and one for saline solution. Both are remotely controlled by computer. The exposure of staff during the injection step is thus reduced to the five seconds necessary to place the shielded syringe in the pump. The sequence of injection is fully automatic, but all relevant parameters (such as pressure, volume, flow, occlusion detection, etc.) are continuously monitored and the operator can safely interrupt the sequence at any time. A visual, dialogue-based interface has been designed to provide a convenient full monitoring service even for users unfamiliar with computers, without lowering the security of the patient. With this computer-aided injection system, the mean exposure dose to PET staff was divided by eight. Such a system can be used for all radiopharmaceutical injections encountered in nuclear medicine or radiotherapy facilities, by adapting the injection sequence and the number of syringe pumps. (authors)

  18. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as in computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  19. Computational model of a whole tree combustor

    Energy Technology Data Exchange (ETDEWEB)

    Bryden, K.M.; Ragland, K.W. [Univ. of Wisconsin, Madison, WI (United States)

    1993-12-31

    A preliminary computational model has been developed for the whole tree combustor and compared to test results. In the simulation model presented, hardwood logs 15 cm in diameter are burned in a 4 m deep fuel bed. Solid and gas temperature, solid and gas velocity, CO, CO₂, H₂O, HC and O₂ profiles are calculated. This deep, fixed-bed combustor obtains high energy release rates per unit area due to the high inlet air velocity and extended reaction zone. The lowest portion of the overall bed is an oxidizing region and the remainder of the bed acts as a gasification and drying region. The overfire air region completes the combustion. Approximately 40% of the energy is released in the lower oxidizing region. The wood consumption rate obtained from the computational model is 4,110 kg/m²-hr, which matches well the consumption rate of 3,770 kg/m²-hr observed during the peak test period of the Aurora, MN test. The predicted heat release rate is 16 MW/m² (5.0×10⁶ Btu/hr-ft²).

  20. Modeling population exposures to silver nanoparticles present in consumer products

    Science.gov (United States)

    Royce, Steven G.; Mukherjee, Dwaipayan; Cai, Ting; Xu, Shu S.; Alexander, Jocelyn A.; Mi, Zhongyuan; Calderon, Leonardo; Mainelis, Gediminas; Lee, KiBum; Lioy, Paul J.; Tetley, Teresa D.; Chung, Kian Fan; Zhang, Junfeng; Georgopoulos, Panos G.

    2014-11-01

    Exposures of the general population to manufactured nanoparticles (MNPs) are expected to keep rising due to increasing use of MNPs in common consumer products (PEN 2014). The present study focuses on characterizing ambient and indoor population exposures to silver MNPs (nAg). For situations where detailed, case-specific exposure-related data are not available, as in the present study, a novel tiered modeling system, Prioritization/Ranking of Toxic Exposures with GIS (geographic information system) Extension (PRoTEGE), has been developed: it employs a product life cycle analysis (LCA) approach coupled with basic human life stage analysis (LSA) to characterize potential exposures to chemicals of current and emerging concern. The PRoTEGE system has been implemented for ambient and indoor environments, utilizing available MNP production, usage, and properties databases, along with laboratory measurements of potential personal exposures from consumer spray products containing nAg. Modeling of environmental and microenvironmental levels of MNPs employs probabilistic material flow analysis combined with product LCA to account for releases during manufacturing, transport, usage, disposal, etc. Human exposure and dose characterization further employ screening microenvironmental modeling and intake fraction methods combined with LSA for potentially exposed populations, to assess differences associated with gender, age, and demographics. Population distributions of intakes, estimated using the PRoTEGE framework, are consistent with published individual-based intake estimates, demonstrating that PRoTEGE is capable of capturing realistic exposure scenarios for the US population. Distributions of intakes are also used to calculate biologically relevant population distributions of uptakes and target tissue doses through human airway dosimetry modeling that takes into account product MNP size distributions and age-relevant physiological parameters.

  1. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t

  2. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of buildings, transportation, machinery, business, and life-maintaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, an attempt to model computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses were collected on the Internet and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.

  3. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model of group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with abstract agents, which permit the model to include social concepts (e.g. a gang) or institutional concepts (e.g. a school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkit to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to different scenario development for inner-city gang recruitment.

  4. Separation of uncertainty and interindividual variability in human exposure modeling.

    NARCIS (Netherlands)

    Ragas, A.M.J.; Brouwer, F.P.E.; Buchner, F.L.; Hendriks, H.W.; Huijbregts, M.A.J.

    2009-01-01

    The NORMTOX model predicts the lifetime-averaged exposure to contaminants through multiple environmental media, that is, food, air, soil, drinking and surface water. The model was developed to test the coherence of Dutch environmental quality objectives (EQOs). A set of EQOs is called coherent if

  5. Validation of Computed Radiography (CR) Exposure Chart for Stainless Steel and Aluminium

    International Nuclear Information System (INIS)

    Nassir, M.A.; Khairul Anuar Mohd Salleh; Arshad Yassin

    2015-01-01

    Computed radiography (CR) is a technique currently used to complement conventional radiography in Non-Destructive Testing (NDT). With CR, a phosphor-based imaging plate (IP) is used to acquire digital radiographic images. The degree of absorption by the IP is proportional to the intensity of the dose received. The IP stores a latent image which is subsequently digitized by the CR reader. Prior to a radiographic exposure, the radiation dosage is determined by referring to the exposure chart. The exposure chart is one of the most important tools for achieving acceptable-quality radiographs; it is therefore important to have a reliable and accurate exposure chart. The aim of this study is to test and validate exposure charts that were developed based on statistical analysis of the digital radiographic grey values. The charts produced were for stainless steel and aluminium. According to EN 14784-2:2005, the acceptable normalized signal-to-noise ratio (SNR(N)) for testing classes A and B shall be a minimum of 70 and 120 respectively. (author)
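
    A small sketch of how an exposure point might be checked against the SNR classes quoted above; the grey values are synthetic, and the normalization by basic spatial resolution (SNR(N) = SNR × 88.6 µm / SR(b), with an assumed SR(b) of 70 µm) follows common CR practice rather than data from this study.

    ```python
    import numpy as np

    roi = np.random.default_rng(0).normal(loc=12000.0, scale=90.0, size=(100, 100))
    snr_measured = roi.mean() / roi.std()        # SNR from linearized grey values in a uniform ROI
    snr_n = snr_measured * 88.6 / 70.0           # normalize by assumed basic spatial resolution of 70 um
    print(f"SNR_N = {snr_n:.0f}, class A ok: {snr_n >= 70}, class B ok: {snr_n >= 120}")
    ```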

  6. Mechanistic models for cancer development after short time radiation exposure

    International Nuclear Information System (INIS)

    Kottbauer, M. M.

    1997-12-01

    In this work two biologically based models were developed: first the single-hit model for solid tumors (SHM-S) and second the single-hit model for leukemia (SHM-L). These models are a further development of the Armitage-Doll model for the special case of a short-time radiation exposure. The basis of the models is the multistage process of carcinogenesis. The single-hit models provide simultaneously the age-dependent cancer rate of spontaneous and radiation-induced tumors as well as the dose-effect relationships at any age after exposure. The SHM-S leads to a biologically based dose-effect relationship, which is similar to the relative risk model suggested by ICRP 60. The SHM-S describes the increased mortality rate of the bomb survivors more accurately than the relative risk model. The SHM-L results in an additive dose-effect relationship. It is shown that only small differences in the derivation of the two models lead to the two dose-effect relationships. Besides the radiation exposure, the new models consider the decrease of the cancer mortality rate at higher ages (age>75), which can be traced back mainly to three causes: competitive causes of death, reduction of cell proliferation and reduction of risk groups. The single-hit models also consider childhood cancer, the different rates of incidence and mortality, the influence of the immune system and the cell-killing effect. (author)

  7. Stoffenmanager exposure model: company-specific exposure assessments using a Bayesian methodology.

    NARCIS (Netherlands)

    Ven, P. van de; Fransman, W.; Schinkel, J.; Rubingh, C.; Warren, N.; Tielemans, E.

    2010-01-01

    The web-based tool "Stoffenmanager" was initially developed to assist small- and medium-sized enterprises in the Netherlands to make qualitative risk assessments and to provide advice on control at the workplace. The tool uses a mechanistic model to arrive at a "Stoffenmanager score" for exposure.

  8. Getting computer models to communicate; Faire communiquer les modeles numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Caremoli, Ch. [Electricite de France (EDF), 75 - Paris (France). Dept. Mecanique et Modeles Numeriques; Erhard, P. [Electricite de France (EDF), 75 - Paris (France). Dept. Physique des Reacteurs

    1999-07-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution: to couple the existing validated numerical models together so that they work as one. (authors)

  9. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated and the stability conditions of the equilibria are derived. To illustrate the theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
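
    A minimal susceptible-infected compartment sketch in the spirit of the model described above (new computers join the network, old ones are removed, and antivirus software returns infected machines to the susceptible class); the equations and parameter values are illustrative, not the paper's.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    b, d, beta, r = 0.02, 0.02, 0.4, 0.1       # recruitment, removal, contact and cure rates [1/day]

    def rhs(t, y):
        S, I = y
        N = S + I
        dS = b * N - beta * S * I / N - d * S + r * I
        dI = beta * S * I / N - (d + r) * I
        return [dS, dI]

    R0 = beta / (d + r)                        # basic reproduction number for this sketch
    sol = solve_ivp(rhs, (0.0, 200.0), [990.0, 10.0], max_step=0.5)
    print(f"R0 = {R0:.2f}, infected computers at t = 200: {sol.y[1, -1]:.1f}")
    ```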

  10. Modeling Reality: How Computers Mirror Life

    International Nuclear Information System (INIS)

    Inoue, J-I

    2005-01-01

    Modeling Reality: How Computers Mirror Life covers a wide range of modern subjects in complex systems, suitable not only for undergraduate students who want to learn about modelling 'reality' by using computer simulations, but also for researchers who want to learn something about subjects outside of their majors and need a simple guide. Readers are not required to have specialized training before they start the book. Each chapter is organized so as to train the reader to grasp the essential idea of simulating phenomena and guide him/her towards more advanced areas. The topics presented in this textbook fall into two categories. The first is at graduate level, namely probability, statistics, information theory, graph theory, and the Turing machine, which are standard topics in the course of information science and information engineering departments. The second addresses more advanced topics, namely cellular automata, deterministic chaos, fractals, game theory, neural networks, and genetic algorithms. Several topics included here (neural networks, game theory, information processing, etc) are now some of the main subjects of statistical mechanics, and many papers related to these interdisciplinary fields are published in Journal of Physics A: Mathematical and General, so readers of this journal will be familiar with the subject areas of this book. However, each area is restricted to an elementary level and if readers wish to know more about the topics they are interested in, they will need more advanced books. For example, on neural networks, the text deals with the back-propagation algorithm for perceptron learning. Nowadays, however, this is a rather old topic, so the reader might well choose, for example, Introduction to the Theory of Neural Computation by J Hertz et al (Perseus books, 1991) or Statistical Physics of Spin Glasses and Information Processing by H Nishimori (Oxford University Press, 2001) for further reading. Nevertheless, this book is worthwhile

  11. Electromagnetic Physics Models for Parallel Computing Architectures

    International Nuclear Information System (INIS)

    Amadio, G; Bianchini, C; Iope, R; Ananya, A; Apostolakis, J; Aurora, A; Bandieramonte, M; Brun, R; Carminati, F; Gheata, A; Gheata, M; Goulas, I; Nikitina, T; Bhattacharyya, A; Mohanty, A; Canal, P; Elvira, D; Jun, S Y; Lima, G; Duhem, L

    2016-01-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well. (paper)

  12. Electromagnetic Physics Models for Parallel Computing Architectures

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well.

  13. A COMPUTATIONAL MODEL OF MOTOR NEURON DEGENERATION

    Science.gov (United States)

    Le Masson, Gwendal; Przedborski, Serge; Abbott, L.F.

    2014-01-01

    To explore the link between bioenergetics and motor neuron degeneration, we used a computational model in which detailed morphology and ion conductance are paired with intracellular ATP production and consumption. We found that reduced ATP availability increases the metabolic cost of a single action potential and disrupts K+/Na+ homeostasis, resulting in a chronic depolarization. The magnitude of the ATP shortage at which this ionic instability occurs depends on the morphology and intrinsic conductance characteristic of the neuron. If ATP shortage is confined to the distal part of the axon, the ensuing local ionic instability eventually spreads to the whole neuron and involves fasciculation-like spiking events. A shortage of ATP also causes a rise in intracellular calcium. Our modeling work supports the notion that mitochondrial dysfunction can account for salient features of the paralytic disorder amyotrophic lateral sclerosis, including motor neuron hyperexcitability, fasciculation, and differential vulnerability of motor neuron subpopulations. PMID:25088365

  14. A computational model of motor neuron degeneration.

    Science.gov (United States)

    Le Masson, Gwendal; Przedborski, Serge; Abbott, L F

    2014-08-20

    To explore the link between bioenergetics and motor neuron degeneration, we used a computational model in which detailed morphology and ion conductance are paired with intracellular ATP production and consumption. We found that reduced ATP availability increases the metabolic cost of a single action potential and disrupts K+/Na+ homeostasis, resulting in a chronic depolarization. The magnitude of the ATP shortage at which this ionic instability occurs depends on the morphology and intrinsic conductance characteristic of the neuron. If ATP shortage is confined to the distal part of the axon, the ensuing local ionic instability eventually spreads to the whole neuron and involves fasciculation-like spiking events. A shortage of ATP also causes a rise in intracellular calcium. Our modeling work supports the notion that mitochondrial dysfunction can account for salient features of the paralytic disorder amyotrophic lateral sclerosis, including motor neuron hyperexcitability, fasciculation, and differential vulnerability of motor neuron subpopulations. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Computational models of intergroup competition and warfare.

    Energy Technology Data Exchange (ETDEWEB)

    Letendre, Kenneth (University of New Mexico); Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease importantly predicts the frequency of civil conflict and tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  16. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  17. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method further comprises determining a first connection element of the first construction element and a second connection element of the second construction element located in a predetermined proximity of each other; and retrieving connectivity information of the corresponding connection types of the first...
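
    Purely as an illustration of the kind of data structure the method describes, the sketch below encodes construction elements with typed connection elements and matches them by type compatibility and proximity; all names, connection types and thresholds are invented.

    ```python
    from dataclasses import dataclass
    from math import dist

    @dataclass
    class ConnectionElement:
        position: tuple      # (x, y, z) in the placed element's coordinates
        conn_type: str       # predetermined connection type, e.g. "stud" or "tube"

    @dataclass
    class ConstructionElement:
        name: str
        connections: list    # list of ConnectionElement

    COMPATIBLE = {("stud", "tube"), ("tube", "stud")}
    PROXIMITY = 0.5          # matching threshold in model units (assumed)

    def find_connections(a, b):
        """Pairs of connection elements that are type-compatible and close enough."""
        return [(ca, cb) for ca in a.connections for cb in b.connections
                if (ca.conn_type, cb.conn_type) in COMPATIBLE
                and dist(ca.position, cb.position) <= PROXIMITY]

    brick1 = ConstructionElement("brick-2x2", [ConnectionElement((0, 0, 1.0), "stud")])
    brick2 = ConstructionElement("brick-2x2", [ConnectionElement((0, 0, 1.2), "tube")])
    print(find_connections(brick1, brick2))
    ```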

  18. Direct modeling for computational fluid dynamics

    Science.gov (United States)

    Xu, Kun

    2015-06-01

    All fluid dynamic equations are valid under their modeling scales, such as the particle mean free path and mean collision time scale of the Boltzmann equation and the hydrodynamic scale of the Navier-Stokes (NS) equations. The current computational fluid dynamics (CFD) focuses on the numerical solution of partial differential equations (PDEs), and its aim is to get the accurate solution of these governing equations. Under such a CFD practice, it is hard to develop a unified scheme that covers flow physics from kinetic to hydrodynamic scales continuously because there is no such governing equation which could make a smooth transition from the Boltzmann to the NS modeling. The study of fluid dynamics needs to go beyond the traditional numerical partial differential equations. The emerging engineering applications, such as air-vehicle design for near-space flight and flow and heat transfer in micro-devices, do require further expansion of the concept of gas dynamics to a larger domain of physical reality, rather than the traditional distinguishable governing equations. At the current stage, the non-equilibrium flow physics has not yet been well explored or clearly understood due to the lack of appropriate tools. Unfortunately, under the current numerical PDE approach, it is hard to develop such a meaningful tool due to the absence of valid PDEs. In order to construct multiscale and multiphysics simulation methods similar to the modeling process of constructing the Boltzmann or the NS governing equations, the development of a numerical algorithm should be based on the first principle of physical modeling. In this paper, instead of following the traditional numerical PDE path, we introduce direct modeling as a principle for CFD algorithm development. Since all computations are conducted in a discretized space with limited cell resolution, the flow physics to be modeled has to be done in the mesh size and time step scales. Here, the CFD is more or less a direct

  19. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  20. Modelling exposure of mammalian predators to anticoagulant rodenticide

    Directory of Open Access Journals (Sweden)

    Christopher John Topping

    2016-12-01

    Full Text Available Anticoagulant rodenticides (AR) are a widespread and effective method of rodent control but there is concern about the impact these may have on non-target organisms, in particular secondary poisoning of rodent predators. Incidence and concentration of AR in free-living predators in Denmark are very high. We postulate that this is caused by widespread exposure due to widespread use of AR in Denmark in and around buildings. To investigate this theory a spatio-temporal model of AR use and mammalian predator distribution was created. This model was supported by data from an experimental study of mice as vectors of AR, and was used to evaluate likely impacts of restrictions imposed on AR use in Denmark banning the use of rodenticides for plant protection in woodlands and tree-crops. The model uses input based on frequencies and timings of baiting for rodent control for urban, rural and woodland locations and creates an exposure map based on spatio-temporal modelling of movement of mice-vectored AR (based on Apodemus flavicollis). Simulated predator territories are super-imposed over this exposure map to create an exposure index. Predictions from the model concur with field studies of AR prevalence both before and after the change in AR use. In most cases incidence of exposure to AR is predicted to be greater than 90%, although cessation of use in woodlots and Christmas tree plantations should reduce mean exposure concentrations. Model results suggest that the driver of high AR incidence in non-target small mammal predators is likely to be the pattern of use and not the distance AR is vectored. Reducing baiting frequency by 75% had different effects depending on the landscape simulated, but with at most a 12% reduction in exposure incidence, and in one landscape a maximum reduction of <2%. We discuss sources of uncertainty in the model and directions for future development of predictive models for environmental impact assessment of rodenticides. The
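    A schematic illustration of the superposition step described above is sketched below. The grid values, territory masks, and threshold are entirely invented placeholders, not the Danish landscape data or the published model's parameters.

```python
import numpy as np

# Illustrative only: exposure_map holds a mouse-vectored AR index on a landscape grid;
# each simulated predator territory is a boolean mask over the same grid.
rng = np.random.default_rng(1)
exposure_map = rng.gamma(shape=0.5, scale=1.0, size=(100, 100))
territories = [np.zeros((100, 100), dtype=bool) for _ in range(3)]
territories[0][10:30, 10:30] = True
territories[1][40:70, 20:50] = True
territories[2][60:90, 60:95] = True

def exposure_index(exp_map, mask, threshold=0.5):
    """Mean AR index inside the territory, and whether the predator counts as exposed."""
    mean_value = exp_map[mask].mean()
    return mean_value, bool(mean_value > threshold)

for i, mask in enumerate(territories):
    print(f"territory {i}: {exposure_index(exposure_map, mask)}")
```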

  1. FOODCHAIN: a Monte Carlo model to estimate individual exposure to airborne pollutants via the foodchain pathway

    International Nuclear Information System (INIS)

    Dixon, E.; Holton, G.A.

    1984-01-01

    Ingestion of contaminated food due to the airborne release of radionuclides or chemical pollutants is a particularly difficult human exposure pathway to quantify. There are a number of important physical and biological processes such as atmospheric deposition and plant uptake to consider. These processes are approximated by techniques encoded in the computer program TEREX. Once estimates of pollutant concentrations are made, the problem can be reduced to computing exposure from ingestion of the food. Some assessments do not account for where the contaminated food is eaten, while others limit consumption to meat and vegetables produced within the affected area. While those approaches lead to an upper bound of exposure, a more realistic assumption is that if locally produced food is not sufficient to meet the dietary needs of the local populace, then uncontaminated food will be imported. This is the approach taken by the computer model FOODCHAIN. Exposures via ingestion of six basic types of food are modeled: beef, milk, grains, leafy vegetables, exposed produce (edible parts are exposed to atmospheric deposition), and protected produce (edible parts are protected from atmospheric deposition). Intake requirements for these six foods are based on a standard diet. Using TEREX-produced site-specific crop production values and food contamination values, FOODCHAIN randomly samples pollutant concentrations in each of the six foodstuffs in an iterative manner. Consumption of a particular food is weighted by a factor proportional to the total production of that food within the area studied. The exposures due to consumption of each of the six foodstuffs are summed to produce the total exposure for each randomly calculated diet.
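    A compact sketch of the sampling scheme outlined above: concentrations in each of the six foodstuffs are sampled at random, consumption is weighted by the locally produced fraction (with any shortfall assumed to be met by uncontaminated imports), and the contributions are summed per simulated diet. All numerical inputs below are placeholders, not TEREX or FOODCHAIN values.

```python
import numpy as np

rng = np.random.default_rng(42)
foods = ["beef", "milk", "grains", "leafy_veg", "exposed_produce", "protected_produce"]

# Placeholder inputs: annual intake (kg), fraction of dietary need met by local
# production (the remainder is assumed uncontaminated), and lognormal parameters
# (mu, sigma of ln concentration, Bq/kg) for the sampled concentration in each food.
intake = dict(zip(foods, [80, 110, 70, 20, 40, 30]))
local_fraction = dict(zip(foods, [0.6, 1.0, 0.3, 0.9, 0.8, 0.7]))
conc_params = {f: (0.0, 0.5) for f in foods}

def sample_total_intake(n_trials=10_000):
    totals = np.zeros(n_trials)
    for f in foods:
        mu, sigma = conc_params[f]
        conc = rng.lognormal(mu, sigma, n_trials)  # sampled concentration in food f
        frac = min(local_fraction[f], 1.0)         # imported remainder assumed clean
        totals += conc * intake[f] * frac          # activity ingested via food f
    return totals

ingested = sample_total_intake()
print(f"median {np.median(ingested):.0f} Bq/yr, "
      f"95th percentile {np.percentile(ingested, 95):.0f} Bq/yr")
```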

  2. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance stages. The...

  3. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  4. An improved model for the reconstruction of past radon exposure.

    Science.gov (United States)

    Cauwels, P; Poffijn, A

    2000-05-01

    If the behavior of long-lived radon progeny was well understood, measurements of these could be used in epidemiological studies to estimate past radon exposure. Field measurements were done in a radon-prone area in the Ardennes (Belgium). The surface activity of several glass sheets was measured using detectors that were fixed on indoor glass surfaces. Simultaneously the indoor radon concentration was measured using diffusion chambers. By using Monte Carlo techniques, it could be proven that there is a discrepancy between this data set and the room model calculations, which are normally used to correlate surface activity and past radon exposure. To solve this, a modification of the model is proposed.

  5. Integrated multiscale modeling of molecular computing devices

    International Nuclear Information System (INIS)

    Cummings, Peter T; Leng Yongsheng

    2005-01-01

    Molecular electronics, in which single organic molecules are designed to perform the functions of transistors, diodes, switches and other circuit elements used in current silicon-based microelectronics, is drawing wide interest as a potential replacement technology for conventional silicon-based lithographically etched microelectronic devices. In addition to their nanoscopic scale, a further advantage of molecular electronics devices compared to silicon-based lithographically etched devices is the promise of being able to produce them cheaply on an industrial scale using wet chemistry methods (i.e., self-assembly from solution). The design of molecular electronics devices, and the processes to make them on an industrial scale, will require a thorough theoretical understanding of the molecular and higher level processes involved. Hence, the development of modeling techniques for molecular electronics devices is a high priority from both a basic science point of view (to understand the experimental studies in this field) and from an applied nanotechnology (manufacturing) point of view. Modeling molecular electronics devices requires computational methods at all length scales - electronic structure methods for calculating electron transport through organic molecules bonded to inorganic surfaces, molecular simulation methods for determining the structure of self-assembled films of organic molecules on inorganic surfaces, mesoscale methods to understand and predict the formation of mesoscale patterns on surfaces (including interconnect architecture), and macroscopic scale methods (including finite element methods) for simulating the behavior of molecular electronic circuit elements in a larger integrated device. Here we describe a large Department of Energy project involving six universities and one national laboratory aimed at developing integrated multiscale methods for modeling molecular electronics devices. The project is funded equally by the Office of Basic

  6. Computational modeling of intraocular gas dynamics

    International Nuclear Information System (INIS)

    Noohi, P; Abdekhodaie, M J; Cheng, Y L

    2015-01-01

    The purpose of this study was to develop a computational model to simulate the dynamics of intraocular gas behavior in the pneumatic retinopexy (PR) procedure. The presented model predicted intraocular gas volume at any time and determined the tolerance angle within which a patient can maneuver and still have the gas completely cover the tear(s). Computational fluid dynamics calculations were conducted to describe the PR procedure. The geometrical model was constructed based on the rabbit and human eye dimensions. SF_6, either pure or diluted with air, was considered as the injected gas. The presented results indicated that the composition of the injected gas affected the gas absorption rate and gas volume. After injection of pure SF_6, the bubble expanded to 2.3 times its initial volume during the first 23 h, but when diluted SF_6 was used, no significant expansion was observed. Also, head positioning for the treatment of retinal tear influenced the rate of gas absorption. Moreover, the determined tolerance angle depended on the bubble and tear size. Greater bubble expansion and a smaller retinal tear resulted in a greater tolerance angle. For example, after 23 h, for the tear size of 2 mm the tolerance angle of using pure SF_6 is 1.4 times more than that of using diluted SF_6 with 80% air. Composition of the injected gas and conditions of the tear in PR may dramatically affect the gas absorption rate and gas volume. Quantifying these effects helps to predict the tolerance angle and improve treatment efficiency. (paper)

  7. Applied exposure modeling for residual radioactivity and release criteria

    International Nuclear Information System (INIS)

    Lee, D.W.

    1989-01-01

    Protecting public health and the environment when materials with residual radioactivity are released for recycling, or disposed of as wastes whose radioactive content is not of concern, presents a formidable challenge. Existing regulatory criteria are based on technical judgment concerning detectability and simple modeling. Recently, exposure modeling methodologies have been developed to provide a more consistent level of health protection. Release criteria derived from the application of exposure modeling methodologies share the same basic elements of analysis but are developed to serve a variety of purposes. Models for the support of regulations for all applications rely on conservative interpretations of generalized conditions while models developed to show compliance incorporate specific conditions not likely to be duplicated at other sites. Research models represent yet another type of modeling which strives to simulate the actual behavior of released material. In spite of these differing purposes, exposure modeling permits the application of sound and reasoned principles of radiation protection to the release of materials with residual levels of radioactivity. Examples of the similarities and differences of these models are presented and an application to the disposal of materials with residual levels of uranium contamination is discussed. 5 refs., 2 tabs

  8. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
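    For reference, a commonly quoted form of the Landau-Lifshitz-Gilbert equation on which such phase-field micromagnetic models are based (the report's exact formulation and parameter values may differ) is

```latex
\frac{\partial \mathbf{M}}{\partial t}
  = -\gamma \,\mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
  + \frac{\alpha}{M_s}\,\mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t},
```

    where M is the magnetization, H_eff the effective field, gamma the gyromagnetic ratio, alpha the Gilbert damping constant, and M_s the saturation magnetization.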

  9. Parallel Computing for Terrestrial Ecosystem Carbon Modeling

    International Nuclear Information System (INIS)

    Wang, Dali; Post, Wilfred M.; Ricciuto, Daniel M.; Berry, Michael

    2011-01-01

    Terrestrial ecosystems are a primary component of research on global environmental change. Observational and modeling research on terrestrial ecosystems at the global scale, however, has lagged behind its counterparts for oceanic and atmospheric systems, largely because of the unique challenges associated with the tremendous diversity and complexity of terrestrial ecosystems. There are 8 major types of terrestrial ecosystem: tropical rain forest, savannas, deserts, temperate grassland, deciduous forest, coniferous forest, tundra, and chaparral. The carbon cycle is an important mechanism in the coupling of terrestrial ecosystems with climate through biological fluxes of CO2. The influence of terrestrial ecosystems on atmospheric CO2 can be modeled via several means at different timescales. Important processes include plant dynamics, change in land use, as well as ecosystem biogeography. Over the past several decades, many terrestrial ecosystem models (see the 'Model developments' section) have been developed to understand the interactions between terrestrial carbon storage and CO2 concentration in the atmosphere, as well as the consequences of these interactions. Early TECMs generally adapted simple box-flow exchange models, in which photosynthetic CO2 uptake and respiratory CO2 release are simulated in an empirical manner with a small number of vegetation and soil carbon pools. Demands on kinds and amount of information required from global TECMs have grown. Recently, along with the rapid development of parallel computing, spatially explicit TECMs with detailed process-based representations of carbon dynamics have become attractive, because those models can readily incorporate a variety of additional ecosystem processes (such as dispersal, establishment, growth, mortality etc.) and environmental factors (such as landscape position, pest populations, disturbances, resource manipulations, etc.), and provide information to frame policy options for climate change
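    The "simple box-flow exchange models" mentioned for early TECMs can be illustrated with a minimal two-pool sketch. The pools, fluxes, and rate constants below are invented for illustration and do not correspond to any particular published model.

```python
import numpy as np

# Minimal two-pool box-flow carbon model (illustrative parameters): vegetation gains
# carbon via NPP and loses it as litterfall; soil gains litterfall and loses carbon
# via heterotrophic respiration. Units: Pg C and Pg C per year.
def run_box_model(years=200, npp=60.0, k_litter=0.1, k_resp=0.03,
                  veg0=550.0, soil0=1500.0, dt=1.0):
    veg, soil = veg0, soil0
    history = []
    for _ in range(int(years / dt)):
        litterfall = k_litter * veg
        respiration = k_resp * soil
        veg += (npp - litterfall) * dt
        soil += (litterfall - respiration) * dt
        history.append((veg, soil))
    return np.array(history)

traj = run_box_model()
print("final vegetation and soil pools (Pg C):", traj[-1])
```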

  10. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models to predict an operator's situation awareness in a situation, the result of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, incorporating significant cognitive factors, uses a Bayesian belief network (BBN) as its architecture. It is believed that communication between nuclear power plant operators affects operators' situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior, and this paper shows the details
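    The BBN-style belief revision described above can be illustrated with a minimal two-hypothesis Bayesian update in which a communicated observation from a crew member is folded in as an additional likelihood term. All states and probabilities below are invented for illustration and are not taken from the proposed model.

```python
# Minimal Bayesian-update sketch (illustrative numbers) of how indicator readings
# and communicated observations could revise a belief over two hypothetical plant states.
def bayes_update(prior, likelihood):
    """prior: {state: P(state)}, likelihood: {state: P(observation | state)}."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

prior = {"LOCA": 0.05, "normal": 0.95}
# Likelihood of observing "pressurizer pressure low" under each state (assumed values).
own_reading = {"LOCA": 0.90, "normal": 0.10}
belief = bayes_update(prior, own_reading)

# A colleague's communicated observation enters as a further likelihood term.
communicated = {"LOCA": 0.80, "normal": 0.30}
belief = bayes_update(belief, communicated)
print(belief)
```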

  11. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming an understanding of modeling methodology in computer science lessons. The necessity of studying computer modeling stems from current trends that strengthen the general educational and worldview functions of computer science, which call for additional research of the…

  12. Chlorpyrifos PBPK/PD model for multiple routes of exposure.

    Science.gov (United States)

    Poet, Torka S; Timchalk, Charles; Hotchkiss, Jon A; Bartels, Michael J

    2014-10-01

    1. Chlorpyrifos (CPF) is an important pesticide used to control crop insects. Human exposures to CPF will occur primarily through oral exposure to residues on foods. A physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) model has been developed that describes the relationship between oral, dermal and inhalation doses of CPF and key events in the pathway for cholinergic effects. The model was built on a prior oral model that addressed age-related changes in metabolism and physiology. This multi-route model was developed in rats and humans to validate all scenarios in a parallelogram design. 2. Critical biological effects from CPF exposure require metabolic activation to CPF oxon, and small amounts of metabolism in tissues will potentially have a great effect on pharmacokinetic and pharmacodynamic outcomes. Metabolism (bioactivation and detoxification) was therefore added in diaphragm, brain, lung and skin compartments. Pharmacokinetic data are available for controlled human exposures via the oral and dermal routes and from oral and inhalation studies in rats. The validated model was then used to determine relative dermal versus inhalation uptake from human volunteers exposed to CPF in an indoor scenario.
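    As a highly simplified illustration of the pharmacokinetic bookkeeping involved (and emphatically not the published multi-route CPF PBPK/PD model), a one-compartment oral absorption and elimination sketch with invented rate constants is given below.

```python
import numpy as np

# Toy one-compartment PK sketch (illustrative rate constants) showing how an
# oral dose could be propagated through time with forward-Euler steps.
def one_compartment_oral(dose_mg=1.0, ka=1.2, ke=0.15, vd_l=40.0,
                         t_end_h=48.0, dt=0.05):
    times = np.arange(0.0, t_end_h, dt)
    gut, body = dose_mg, 0.0
    conc = []
    for _ in times:
        absorbed = ka * gut * dt      # first-order absorption from the gut
        eliminated = ke * body * dt   # first-order elimination from the body
        gut -= absorbed
        body += absorbed - eliminated
        conc.append(body / vd_l)      # plasma concentration, mg/L
    return times, np.array(conc)

t, c = one_compartment_oral()
print(f"Cmax = {c.max():.4f} mg/L at t = {t[c.argmax()]:.1f} h")
```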

  13. Estimation of radiation exposure from lung cancer screening program with low-dose computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Su Yeon; Jun, Jae Kwan [Graduate School of Cancer Science and Policy, National Cancer Center, Seoul (Korea, Republic of)

    2016-12-15

    The National Lung Screening Trial (NLST) demonstrated that screening with low-dose computed tomography (LDCT) reduced lung cancer mortality in a high-risk population. Recently, the United States Preventive Services Task Force (USPSTF) gave a B recommendation for annual LDCT screening for individuals at high risk. With the promising results, Korea developed a lung cancer screening guideline and is planning a pilot study for implementation of national lung cancer screening. With widespread adoption of lung cancer screening with LDCT, there are concerns about harms of screening, including high false-positive rates and radiation exposure. Over the 3 rounds of screening in the NLST, 96.4% of positive results were false-positives. Although the initial screening is performed at low dose, subsequent diagnostic examinations following positive results additively contribute to a patient's lifetime exposure. When implementing a large-scale screening program, there is a lack of established risk assessment of the effect of radiation exposure from a long-term screening program. Thus, the purpose of this study was to estimate the cumulative radiation exposure from an annual LDCT lung cancer screening program over a 20-year period.
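    The cumulative-dose bookkeeping can be sketched as below. The per-scan dose, positive rate, and follow-up dose are placeholders for illustration, not the study's estimates or results.

```python
# Back-of-envelope sketch (illustrative dose values) of how cumulative effective dose
# from an annual LDCT screening program might be tallied over the program duration.
def cumulative_dose(years=20, ldct_dose_msv=1.5, positive_rate=0.24,
                    followup_dose_msv=8.0):
    total = 0.0
    for _ in range(years):
        total += ldct_dose_msv                      # the screening scan itself
        total += positive_rate * followup_dose_msv  # expected diagnostic work-up dose
    return total

print(f"~{cumulative_dose():.0f} mSv over 20 years of annual screening (illustrative)")
```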

  14. Model to Implement Virtual Computing Labs via Cloud Computing Services

    OpenAIRE

    Washington Luna Encalada; José Luis Castillo Sequera

    2017-01-01

    In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the...

  15. Online self-report questionnaire on computer work-related exposure (OSCWE): validity and internal consistency.

    Science.gov (United States)

    Mekhora, Keerin; Jalayondeja, Wattana; Jalayondeja, Chutima; Bhuanantanondh, Petcharatana; Dusadiisariyavong, Asadang; Upiriyasakul, Rujiret; Anuraktam, Khajornyod

    2014-07-01

    To develop an online, self-report questionnaire on computer work-related exposure (OSCWE) and to determine the internal consistency, face and content validity of the questionnaire. The online, self-report questionnaire was developed to determine the risk factors related to musculoskeletal disorders in computer users. It comprised five domains: personal, work-related, work environment, physical health and psychosocial factors. The questionnaire's content was validated by an occupational medical doctor and three physical therapy lecturers involved in ergonomic teaching. Twenty-five lay people examined the feasibility of computer administration and the user-friendliness of the language. The item correlation in each domain was analyzed using the internal consistency coefficient (Cronbach's alpha). The content of the questionnaire was considered congruent with the testing purposes. Eight hundred and thirty-five computer users at the PTT Exploration and Production Public Company Limited registered for the online self-report questionnaire. The internal consistency of the five domains was: personal (alpha = 0.58), work-related (alpha = 0.348), work environment (alpha = 0.72), physical health (alpha = 0.68) and psychosocial factors (alpha = 0.93). The findings suggested that the OSCWE had acceptable internal consistency for work environment and psychosocial factors. The OSCWE is available to use in population-based survey research among computer office workers.
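    For reference, Cronbach's alpha for a domain can be computed from the item variances and the variance of the summed score. The sketch below uses invented response data, not the study's dataset.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores for one questionnaire domain."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Illustrative data: 6 respondents answering a 4-item domain on a 5-point scale.
scores = np.array([[4, 4, 5, 4],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 3, 2, 2],
                   [4, 3, 4, 4],
                   [3, 4, 3, 3]])
print(round(cronbach_alpha(scores), 2))
```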

  16. Computer modelling of eddy current probes

    International Nuclear Information System (INIS)

    Sullivan, S.P.

    1992-01-01

    Computer programs have been developed for modelling impedance and transmit-receive eddy current probes in two-dimensional axis-symmetric configurations. These programs, which are based on analytic equations, simulate bobbin probes in infinitely long tubes and surface probes on plates. They calculate probe signal due to uniform variations in conductor thickness, resistivity and permeability. These signals depend on probe design and frequency. A finite element numerical program has been procured to calculate magnetic permeability in non-linear ferromagnetic materials. Permeability values from these calculations can be incorporated into the above analytic programs to predict signals from eddy current probes with permanent magnets in ferromagnetic tubes. These programs were used to test various probe designs for new testing applications. Measurements of magnetic permeability in magnetically biased ferromagnetic materials have been performed by superimposing experimental signals, from special laboratory ET probes, on impedance plane diagrams calculated using these programs. (author). 3 refs., 2 figs

  17. The MESORAD dose assessment model: Computer code

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Bander, T.J.; Scherpelz, R.I.

    1988-10-01

    MESORAD is a dose equivalent model for emergency response applications that is designed to be run on minicomputers. It has been developed by the Pacific Northwest Laboratory for use as part of the Intermediate Dose Assessment System in the US Nuclear Regulatory Commission Operations Center in Washington, DC, and the Emergency Management System in the US Department of Energy Unified Dose Assessment Center in Richland, Washington. This volume describes the MESORAD computer code and contains a listing of the code. The technical basis for MESORAD is described in the first volume of this report (Scherpelz et al. 1986). A third volume of the documentation is planned. That volume will contain utility programs and input and output files that can be used to check the implementation of MESORAD. 18 figs., 4 tabs

  18. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  19. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    Science.gov (United States)

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly

  20. Modeling Techniques for a Computational Efficient Dynamic Turbofan Engine Model

    Directory of Open Access Journals (Sweden)

    Rory A. Roberts

    2014-01-01

    Full Text Available A transient two-stream engine model has been developed. Individual component models developed exclusively in MATLAB/Simulink including the fan, high pressure compressor, combustor, high pressure turbine, low pressure turbine, plenum volumes, and exit nozzle have been combined to investigate the behavior of a turbofan two-stream engine. Special attention has been paid to the development of transient capabilities throughout the model, increasing the fidelity of the physics models, eliminating algebraic constraints, and reducing simulation time by enabling the use of advanced numerical solvers. The lessening of computation time is paramount for conducting future aircraft system-level design trade studies and optimization. The new engine model is simulated for a fuel perturbation and a specified mission while tracking critical parameters. These results, as well as the simulation times, are presented. The new approach significantly reduces the simulation time.

  1. Sex Differences in Adolescent Depression: Stress Exposure and Reactivity Models

    Science.gov (United States)

    Hankin, Benjamin L.; Mermelstein, Robin; Roesch, Linda

    2007-01-01

    Stress exposure and reactivity models were examined as explanations for why girls exhibit greater levels of depressive symptoms than boys. In a multiwave, longitudinal design, adolescents' depressive symptoms, alcohol usage, and occurrence of stressors were assessed at baseline, 6, and 12 months later (N=538; 54.5% female; ages 13-18, average…

  2. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  3. Population-based nutrikinetic modeling of polyphenol exposure

    NARCIS (Netherlands)

    van Velzen, E.J.J.; Westerhuis, J.A.; Grün, C.H.; Jacobs, D.M.; Eilers, P.H.C.; Mulder, Th.P.; Foltz, M.; Garczarek, U.; Kemperman, R.; Vaughan, E. E.; van Duynhoven, J.P.M.; Smilde, A.K.

    2014-01-01

    The beneficial health effects of fruits and vegetables have been attributed to their polyphenol content. These compounds undergo many bioconversions in the body. Modeling polyphenol exposure of humans upon intake is a prerequisite for understanding the modulating effect of the food matrix and the

  4. Population-based nutrikinetic modelling of phytochemical exposure

    NARCIS (Netherlands)

    Velzen, van E.J.J.; Westerhuis, J.A.; Grün, C.H.; Duynhoven, van J.P.M.; Jacobs, D.M.; Eilers, P.H.C.; Mulder, T.P.; Foltz, M.; Garczarek, U.; Kemperman, R.; Vaughan, E.E.; Smilde, A.K.

    2014-01-01

    The beneficial health effects of fruits and vegetables have been attributed to their polyphenol content. These compounds undergo many bioconversions in the body. Modeling polyphenol exposure of humans upon intake is a prerequisite for understanding the modulating effect of the food matrix and the

  5. Population models for time-varying pesticide exposure

    NARCIS (Netherlands)

    Jager T; Jong FMW de; Traas TP; LER; SEC

    2007-01-01

    A model has recently been developed at RIVM to predict the effects of variable exposure to pesticides of plant and animal populations in surface water. Before a pesticide is placed on the market, the environmental risk of the substance has to be assessed. This risk is estimated by comparing

  6. Parameterization models for pesticide exposure via crop consumption.

    Science.gov (United States)

    Fantke, Peter; Wieland, Peter; Juraske, Ronnie; Shaddick, Gavin; Itoiz, Eva Sevigné; Friedrich, Rainer; Jolliet, Olivier

    2012-12-04

    An approach for estimating human exposure to pesticides via consumption of six important food crops is presented that can be used to extend multimedia models applied in health risk and life cycle impact assessment. We first assessed the variation of model output (pesticide residues per kg applied) as a function of model input variables (substance, crop, and environmental properties) including their possible correlations using matrix algebra. We identified five key parameters responsible for between 80% and 93% of the variation in pesticide residues, namely time between substance application and crop harvest, degradation half-lives in crops and on crop surfaces, overall residence times in soil, and substance molecular weight. Partition coefficients also play an important role for fruit trees and tomato (Kow), potato (Koc), and lettuce (Kaw, Kow). Focusing on these parameters, we develop crop-specific models by parametrizing a complex fate and exposure assessment framework. The parametric models thereby reflect the framework's physical and chemical mechanisms and predict pesticide residues in harvest using linear combinations of crop, crop surface, and soil compartments. Parametric model results correspond well with results from the complex framework for 1540 substance-crop combinations with total deviations between a factor 4 (potato) and a factor 66 (lettuce). Predicted residues also correspond well with experimental data previously used to evaluate the complex framework. Pesticide mass in harvest can finally be combined with reduction factors accounting for food processing to estimate human exposure from crop consumption. All parametric models can be easily implemented into existing assessment frameworks.
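    The general shape of such a parametric model, a linear combination of compartment contributions that each decay first-order between application and harvest, can be sketched as follows. The compartment split, half-lives, and function name are placeholders for illustration, not the published crop-specific parameterizations.

```python
import math

# Illustrative sketch of the general structure: residues in harvest as a sum of
# compartment contributions (crop interior, crop surface, soil), each decaying
# first-order between application and harvest.
def residue_per_kg_applied(dt_days, half_lives, initial_fractions):
    """dt_days: time between application and harvest (days).
    half_lives / initial_fractions: per-compartment dicts (crop, surface, soil)."""
    total = 0.0
    for comp, f0 in initial_fractions.items():
        k = math.log(2.0) / half_lives[comp]  # first-order rate constant, 1/day
        total += f0 * math.exp(-k * dt_days)
    return total

half_lives = {"crop": 5.0, "surface": 2.0, "soil": 30.0}        # days (assumed)
initial_fractions = {"crop": 0.2, "surface": 0.5, "soil": 0.3}  # assumed deposition split
print(residue_per_kg_applied(dt_days=14, half_lives=half_lives,
                             initial_fractions=initial_fractions))
```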

  7. Exposure to traffic pollution: comparison between measurements and a model.

    Science.gov (United States)

    Alili, F; Momas, I; Callais, F; Le Moullec, Y; Sacre, C; Chiron, M; Flori, J P

    2001-01-01

    French researchers from the Building Scientific and Technical Center have produced a traffic-exposure index. To achieve this, they used an air pollution dispersion model that enabled them to calculate automobile pollutant concentrations in front of subjects' residences and places of work. Researchers used this model, which was tested at 27 Paris canyon street sites, and compared nitrogen oxides measurements obtained with passive samplers during a 6-wk period and calculations derived from the model. There was a highly significant correlation (r = .83) between the 2 series of values; their mean concentrations were not significantly different. The results suggested that the aforementioned model could be a useful epidemiological tool for the classification of city dwellers by present, or even cumulative, exposure to automobile air pollution.

  8. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using BLACKBOARD architecture and a qualitative model, an expert system was developed to assist the user in defining the computing method for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs.

  9. COGMIR: A computer model for knowledge integration

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z.X.

    1988-01-01

    This dissertation explores some aspects of knowledge integration, namely, accumulation of scientific knowledge and performing analogical reasoning on the acquired knowledge. Knowledge to be integrated is conveyed by paragraph-like pieces referred to as documents. By incorporating some results from cognitive science, the Deutsch-Kraft model of information retrieval is extended to a model for knowledge engineering, which integrates acquired knowledge and performs intelligent retrieval. The resulting computer model is termed COGMIR, which stands for a COGnitive Model for Intelligent Retrieval. A scheme, named query invoked memory reorganization, is used in COGMIR for knowledge integration. Unlike some other schemes which realize knowledge integration through subjective understanding by representing new knowledge in terms of existing knowledge, the proposed scheme suggests at storage time only recording the possible connection of knowledge acquired from different documents. The actual binding of the knowledge acquired from different documents is deferred to query time. There is only one way to store knowledge and numerous ways to utilize the knowledge. Each document can be represented as a whole as well as by its meaning. In addition, since facts are constructed from the documents, document retrieval and fact retrieval are treated in a unified way. When the requested knowledge is not available, query invoked memory reorganization can generate suggestions based on available knowledge through analogical reasoning. This is done by revising the algorithms developed for document retrieval and fact retrieval, and by incorporating Gentner's structure mapping theory. Analogical reasoning is treated as a natural extension of intelligent retrieval, so that two previously separate research areas are combined. A case study is provided. All the components are implemented as list structures similar to relational databases.

  10. Estimation of effectiveness of automatic exposure control in computed tomograph scanner

    International Nuclear Information System (INIS)

    Akhilesh, Philomina; Sharma, S.D.; Datta, D.; Kulkarni, Arti

    2018-01-01

    With the advent of multiple detector array technology, the use of Computed Tomography (CT) scanning has increased tremendously. Computed Tomography examinations deliver a relatively high radiation dose to patients in comparison with conventional radiography. It is therefore required to reduce the dose delivered in CT scans without compromising the image quality. Several parameters like applied potential, tube current, scan length, pitch etc. influence the dose delivered in CT scans. For optimization of patient dose and image quality, all modern CT scanners are enabled with Automatic Exposure Control (AEC) systems. The aim of this work is to compare the dose delivered during CT scans performed with and without AEC in order to estimate the effectiveness of AEC techniques used in CT scanners of various manufacturers

  11. The use of conduction model in laser weld profile computation

    Science.gov (United States)

    Grabas, Bogusław

    2007-02-01

    Profiles of joints resulting from deep penetration laser beam welding of a flat workpiece of carbon steel were computed. A semi-analytical conduction model solved with Green's function method was used in computations. In the model, the moving heat source was attenuated exponentially in accordance with Beer-Lambert law. Computational results were compared with those in the experiment.
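    For reference, the exponential attenuation of the absorbed beam with depth according to the Beer-Lambert law (the attenuation coefficient used in the paper is not reproduced here) takes the form

```latex
I(z) = I_0 \, e^{-\beta z},
```

    where I_0 is the intensity absorbed at the surface, z the depth below the surface, and beta the attenuation coefficient.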

  12. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  13. Computer modeling of the Cabriolet Event

    International Nuclear Information System (INIS)

    Kamegai, M.

    1979-01-01

    Computer modeling techniques are described for calculating the results of underground nuclear explosions at depths shallow enough to produce cratering. The techniques are applied to the Cabriolet Event, a well-documented nuclear excavation experiment, and the calculations give good agreement with the experimental results. It is concluded that, given data obtainable by outside observers, these modeling techniques are capable of verifying the yield and depth of underground nuclear cratering explosions, and that they could thus be useful in monitoring another country's compliance with treaty agreements on nuclear testing limitations. Several important facts emerge from the study: (1) seismic energy is produced by only a fraction of the nuclear yield, a fraction depending strongly on the depth of shot and the mechanical properties of the surrounding rock; (2) temperature of the vented gas can be predicted accurately only if good equations of state are available for the rock in the detonation zone; and (3) temperature of the vented gas is strongly dependent on the cooling effect, before venting, of mixing with melted rock in the expanding cavity and, to a lesser extent, on the cooling effect of water in the rock

  14. Random matrix model of adiabatic quantum computing

    International Nuclear Information System (INIS)

    Mitchell, David R.; Adami, Christoph; Lue, Waynn; Williams, Colin P.

    2005-01-01

    We present an analysis of the quantum adiabatic algorithm for solving hard instances of 3-SAT (an NP-complete problem) in terms of random matrix theory (RMT). We determine the global regularity of the spectral fluctuations of the instantaneous Hamiltonians encountered during the interpolation between the starting Hamiltonians and the ones whose ground states encode the solutions to the computational problems of interest. At each interpolation point, we quantify the degree of regularity of the average spectral distribution via its Brody parameter, a measure that distinguishes regular (i.e., Poissonian) from chaotic (i.e., Wigner-type) distributions of normalized nearest-neighbor spacings. We find that for hard problem instances - i.e., those having a critical ratio of clauses to variables - the spectral fluctuations typically become irregular across a contiguous region of the interpolation parameter, while the spectrum is regular for easy instances. Within the hard region, RMT may be applied to obtain a mathematical model of the probability of avoided level crossings and concomitant failure rate of the adiabatic algorithm due to nonadiabatic Landau-Zener-type transitions. Our model predicts that if the interpolation is performed at a uniform rate, the average failure rate of the quantum adiabatic algorithm, when averaged over hard problem instances, scales exponentially with increasing problem size
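    For reference, the Brody parameter quantifies where the spacing statistics lie between Poissonian (beta = 0) and Wigner-like (beta = 1) behavior; one standard form of the Brody distribution is

```latex
P_\beta(s) = (\beta + 1)\, b\, s^{\beta} \exp\!\left(-b\, s^{\beta+1}\right),
\qquad
b = \left[\Gamma\!\left(\tfrac{\beta + 2}{\beta + 1}\right)\right]^{\beta + 1},
```

    where s is the normalized nearest-neighbor spacing (the paper's exact conventions may differ).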

  15. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  16. A dermal model for spray painters, part I : subjective exposure modelling of spray paint deposition

    NARCIS (Netherlands)

    Brouwer, D.H.; Semple, S.; Marquart, J.; Cherrie, J.W.

    2001-01-01

    The discriminative power of existing dermal exposure models is limited. Most models only allow occupational hygienists to rank workers between and within workplaces according to broad bands of dermal exposure. No allowance is made for the work practices of different individuals. In this study a

  17. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel; Buse, Gerrit; Pfluger, Dirk

    2012-01-01

    of the input parameters. Such an exploration process is however not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as substitute

  18. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamic models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  19. Computer models for kinetic equations of magnetically confined plasmas

    International Nuclear Information System (INIS)

    Killeen, J.; Kerbel, G.D.; McCoy, M.G.; Mirin, A.A.; Horowitz, E.J.; Shumaker, D.E.

    1987-01-01

    This paper presents four working computer models developed by the computational physics group of the National Magnetic Fusion Energy Computer Center. All of the models employ a kinetic description of plasma species. Three of the models are collisional, i.e., they include the solution of the Fokker-Planck equation in velocity space. The fourth model is collisionless and treats the plasma ions by a fully three-dimensional particle-in-cell method

  20. Risk assessment through drinking water pathway via uncertainty modeling of contaminant transport using soft computing

    International Nuclear Information System (INIS)

    Datta, D.; Ranade, A.K.; Pandey, M.; Sathyabama, N.; Kumar, Brij

    2012-01-01

    The basic objective of an environmental impact assessment (EIA) is to build guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source, to prevent deterministic health effects, and to reduce the risk of stochastic health effects (e.g., cancer and severe hereditary effects) as much as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). The measure of exposure being the basic tool to take any appropriate decisions related to risk reduction, EIA is traditionally expressed in terms of radiation exposure to the member of the public. However, models used to estimate the exposure received by the member of the public are governed by parameters, some of which are deterministic with relative uncertainty and some of which are stochastic as well as imprecise (insufficient knowledge). In an admixture environment of this type, it is essential to assess the uncertainty of a model to estimate the bounds of the exposure to the public to invoke a decision during a nuclear or radiological emergency. With this in view, a soft computing technique, evidence-theory-based assessment of model parameters, is used to compute the risk or exposure to the member of the public. The possible pathway of exposure to the member of the public in the aquatic food stream is the drinking of water. Accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory finally expresses the uncertainty in terms of a lower bound of exposure (belief measure) and an upper bound (plausibility measure). In this work EIA is presented using evidence theory. A data fusion technique is used to aggregate the knowledge on the uncertain information. Uncertainty of concentration and exposure is expressed as an interval bounded by belief and plausibility
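    For reference, the belief and plausibility bounds of Dempster-Shafer evidence theory are obtained from a basic probability assignment m over subsets of the frame of discernment:

```latex
\mathrm{Bel}(A) = \sum_{B \subseteq A} m(B),
\qquad
\mathrm{Pl}(A) = \sum_{B \cap A \neq \emptyset} m(B),
\qquad
\mathrm{Bel}(A) \le \mathrm{Pl}(A),
```

    so that an exposure estimate is reported as the interval [Bel, Pl] rather than a single value.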

  1. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  2. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.

  3. SHEDS-HT: an integrated probabilistic exposure model for prioritizing exposures to chemicals with near-field and dietary sources.

    Science.gov (United States)

    Isaacs, Kristin K; Glen, W Graham; Egeghy, Peter; Goldsmith, Michael-Rock; Smith, Luther; Vallero, Daniel; Brooks, Raina; Grulke, Christopher M; Özkaynak, Halûk

    2014-11-04

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirect exposures from near-field sources, SHEDS-HT employs a fugacity-based module to estimate concentrations in indoor environmental media. The concentration estimates, along with relevant exposure factors and human activity data, are then used by the model to rapidly generate probabilistic population distributions of near-field indirect exposures via dermal, nondietary ingestion, and inhalation pathways. Pathway-specific estimates of near-field direct exposures from consumer products are also modeled
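    The probabilistic pattern described above, generating population distributions of exposure by sampling usage, composition, and exposure-factor distributions, can be sketched as below. All distributions, parameter values, and the function name are invented placeholders, not SHEDS-HT inputs or outputs.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical sketch: per-person exposure to a chemical in a consumer product is
# sampled from assumed usage and composition distributions (values illustrative).
def simulate_population_exposure(n=100_000):
    uses_per_day = rng.poisson(lam=1.5, size=n)                  # frequency of use
    mass_per_use_g = rng.lognormal(mean=0.0, sigma=0.6, size=n)  # product amount per use
    chem_fraction = rng.uniform(0.001, 0.02, size=n)             # chemical weight fraction
    dermal_transfer = rng.beta(2, 8, size=n)                     # fraction reaching skin
    body_weight_kg = rng.normal(75, 12, size=n).clip(40, 150)
    dose_mg_kg_day = (uses_per_day * mass_per_use_g * 1000 * chem_fraction
                      * dermal_transfer / body_weight_kg)
    return dose_mg_kg_day

dose = simulate_population_exposure()
print(f"median {np.median(dose):.3f}, 95th pct {np.percentile(dose, 95):.3f} mg/kg/day")
```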

  4. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  5. Vehicle - Bridge interaction, comparison of two computing models

    Science.gov (United States)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second represents the bridge as a lumped mass model with one degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models, and the results are mutually compared and quantitatively evaluated.
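
    The lumped-mass variant can be sketched as a single mass-spring-damper at mid-span driven by the vehicle weight weighted by the first mode shape as it crosses the span. All numerical values below are assumed for illustration and are not taken from the paper.

```python
# Sketch of the "lumped mass" bridge variant: one mass-spring-damper at
# mid-span driven by the vehicle weight weighted by the first mode shape as
# the vehicle crosses. All numerical values are assumed for illustration.
import numpy as np
from scipy.integrate import solve_ivp

L = 30.0                            # span (m)
m = 4.0e4                           # equivalent lumped mass at mid-span (kg)
k = 2.0e7                           # equivalent stiffness (N/m)
c = 2.0 * 0.02 * np.sqrt(k * m)     # 2% damping ratio
P = 1.2e5                           # vehicle weight (N)
v = 20.0                            # vehicle speed (m/s)

def rhs(t, y):
    u, du = y
    x = v * t                                               # vehicle position
    phi = np.sin(np.pi * x / L) if 0.0 <= x <= L else 0.0   # mode-shape weighting
    return [du, (P * phi - c * du - k * u) / m]

t_end = L / v                                               # time to cross the span
sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0],
                t_eval=np.linspace(0.0, t_end, 500), max_step=1e-3)
print("max mid-span deflection: %.2f mm" % (1e3 * np.max(sol.y[0])))
```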

  6. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2006-06-05

    This analysis is one of the technical reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), referred to in this report as the biosphere model. ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. ''Inhalation Exposure Input Parameters for the Biosphere Model'' is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the biosphere model is presented in Figure 1-1 (based on BSC 2006 [DIRS 176938]). This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and how this analysis report contributes to biosphere modeling. This analysis report defines and justifies values of atmospheric mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of the biosphere model to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception. This
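
    The inhalation pathway described above reduces, in its simplest form, to multiplying the mass loading by the soil activity concentration to obtain an air concentration, and then applying a breathing rate, exposure time, and dose coefficient. The sketch below uses placeholder values, not the parameter values developed in the report.

```python
# Back-of-the-envelope version of the inhalation pathway described above:
# airborne activity from resuspended soil via mass loading, then dose through
# a breathing rate, exposure time and dose coefficient. All values are
# placeholders, not parameters from the biosphere model report.
mass_loading   = 60.0e-6    # kg of dust per m^3 of air (60 ug/m^3, assumed)
soil_activity  = 1.0e3      # Bq per kg of soil for the radionuclide (assumed)
breathing_rate = 1.2        # m^3/h (assumed)
exposure_time  = 2000.0     # h/year (assumed)
dose_coeff     = 5.0e-8     # Sv per Bq inhaled (placeholder, nuclide-specific)

air_conc = mass_loading * soil_activity                 # Bq/m^3 in inhaled air
intake   = air_conc * breathing_rate * exposure_time    # Bq inhaled per year
dose     = intake * dose_coeff                          # Sv/year

print("air concentration %.3e Bq/m^3, annual dose %.3e Sv" % (air_conc, dose))
```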

  7. Inhalation Exposure Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    M. Wasiolek

    2006-01-01

    This analysis is one of the technical reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), referred to in this report as the biosphere model. ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. ''Inhalation Exposure Input Parameters for the Biosphere Model'' is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the biosphere model is presented in Figure 1-1 (based on BSC 2006 [DIRS 176938]). This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and how this analysis report contributes to biosphere modeling. This analysis report defines and justifies values of atmospheric mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of the biosphere model to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception. This report is concerned primarily with the

  8. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...

  9. International Nuclear Model personal computer (PCINM): Model documentation

    International Nuclear Information System (INIS)

    1992-08-01

    The International Nuclear Model (INM) was developed to assist the Energy Information Administration (EIA), U.S. Department of Energy (DOE) in producing worldwide projections of electricity generation, fuel cycle requirements, capacities, and spent fuel discharges from commercial nuclear reactors. The original INM was developed, maintained, and operated on a mainframe computer system. In spring 1992, a streamlined version of INM was created for use on a microcomputer utilizing CLIPPER and PCSAS software. This new version is known as PCINM. This documentation is based on the new PCINM version. This document is designed to satisfy the requirements of several categories of users of the PCINM system including technical analysts, theoretical modelers, and industry observers. This document assumes the reader is familiar with the nuclear fuel cycle and each of its components. This model documentation contains four chapters and seven appendices. Chapter Two presents the model overview containing the PCINM structure and process flow, the areas for which projections are made, and input data and output reports. Chapter Three presents the model technical specifications showing all model equations, algorithms, and units of measure. Chapter Four presents an overview of all parameters, variables, and assumptions used in PCINM. The appendices present the following detailed information: variable and parameter listings, variable and equation cross reference tables, source code listings, file layouts, sample report outputs, and model run procedures. 2 figs

  10. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  11. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... Keywords: virus lifecycle, Petri nets, modeling, simulation. ... complex process. Figure 2 ... by creating Matlab files for five different computer ...

  12. Regenerating computer model of the thymus

    International Nuclear Information System (INIS)

    Lumb, J.R.

    1975-01-01

    This computer model simulates the cell population kinetics of the development and later degeneration of the thymus. Nutritional factors are taken into account by the growth of blood vessels in the simulated thymus. The stem cell population is kept at its maximum by allowing some stem cells to divide into two stem cells until the population reaches its maximum, thus regenerating the thymus after an insult such as irradiation. After a given number of population doublings the maximum allowed stem cell population is gradually decreased in order to simulate the degeneration of the thymus. Results show that the simulated thymus develops and degenerates in a pattern similar to that of the natural thymus. This simulation is used to evaluate cellular kinetic data for the thymus. The results from testing the internal consistency of available data are reported. The number of generations which most represents the natural thymus includes seven dividing generations of lymphocytes and one mature, nondividing generation of small lymphocytes. The size of the resulting developed thymus can be controlled without affecting other variables by changing the maximum stem cell population allowed. In addition, recovery from irradiation is simulated

  13. Computational modeling of epidural cortical stimulation

    Science.gov (United States)

    Wongsarnpigoon, Amorn; Grill, Warren M.

    2008-12-01

    Epidural cortical stimulation (ECS) is a developing therapy to treat neurological disorders. However, it is not clear how the cortical anatomy or the polarity and position of the electrode affects current flow and neural activation in the cortex. We developed a 3D computational model simulating ECS over the precentral gyrus. With the electrode placed directly above the gyrus, about half of the stimulus current flowed through the crown of the gyrus while current density was low along the banks deep in the sulci. Beneath the electrode, neurons oriented perpendicular to the cortical surface were depolarized by anodic stimulation, and neurons oriented parallel to the boundary were depolarized by cathodic stimulation. Activation was localized to the crown of the gyrus, and neurons on the banks deep in the sulci were not polarized. During regulated voltage stimulation, the magnitude of the activating function was inversely proportional to the thickness of the CSF and dura. During regulated current stimulation, the activating function was not sensitive to the thickness of the dura but was slightly more sensitive than during regulated voltage stimulation to the thickness of the CSF. Varying the width of the gyrus and the position of the electrode altered the distribution of the activating function due to changes in the orientation of the neurons beneath the electrode. Bipolar stimulation, although often used in clinical practice, reduced spatial selectivity as well as selectivity for neuron orientation.

  14. Geometric modeling for computer aided design

    Science.gov (United States)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes. This results in a streamlined exchange of data between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  15. Review of computational thermal-hydraulic modeling

    International Nuclear Information System (INIS)

    Keefer, R.H.; Keeton, L.W.

    1995-01-01

    Corrosion of heat transfer tubing in nuclear steam generators has been a persistent problem in the power generation industry, assuming many different forms over the years depending on chemistry and operating conditions. Whatever the corrosion mechanism, a fundamental understanding of the process is essential to establish effective management strategies. To gain this fundamental understanding requires an integrated investigative approach that merges technology from many diverse scientific disciplines. An important aspect of an integrated approach is characterization of the corrosive environment at high temperature. This begins with a thorough understanding of local thermal-hydraulic conditions, since they affect deposit formation, chemical concentration, and ultimately corrosion. Computational Fluid Dynamics (CFD) can and should play an important role in characterizing the thermal-hydraulic environment and in predicting the consequences of that environment. The evolution of CFD technology now allows accurate calculation of steam generator thermal-hydraulic conditions and the resulting sludge deposit profiles. Similar calculations are also possible for model boilers, so that tests can be designed to be prototypic of the heat exchanger environment they are supposed to simulate. This paper illustrates the utility of CFD technology by way of examples in each of these two areas. This technology can be further extended to produce more detailed local calculations of the chemical environment in support plate crevices, beneath thick deposits on tubes, and deep in tubesheet sludge piles. Knowledge of this local chemical environment will provide the foundation for development of mechanistic corrosion models, which can be used to optimize inspection and cleaning schedules and focus the search for a viable fix

  16. Impact of heart rate and rhythm on radiation exposure in prospectively ECG triggered computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Luecke, Christian, E-mail: neep@gmx.de [University of Leipzig – Heart Center, Department of Diagnostic and Interventional Radiology, Strümpellstrasse 39, D-04289, Leipzig (Germany); Andres, Claudia; Foldyna, Borek [University of Leipzig – Heart Center, Department of Diagnostic and Interventional Radiology, Strümpellstrasse 39, D-04289, Leipzig (Germany); Nagel, Hans Dieter [Wissenschaft and Technik für die Radiologie, Buchhholz i.d.N (Germany); Hoffmann, Janine; Grothoff, Matthias; Nitzsche, Stefan; Gutberlet, Matthias; Lehmkuhl, Lukas [University of Leipzig – Heart Center, Department of Diagnostic and Interventional Radiology, Strümpellstrasse 39, D-04289, Leipzig (Germany)

    2012-09-15

    Purpose: To evaluate the influence of different heart rates and arrhythmias on scanner performance, image acquisition and applied radiation exposure in prospectively ECG triggered computed tomography (pCT). Materials and methods: An ECG simulator (EKG Phantom 320, Müller and Sebastiani Elektronik GmbH, Munich, Germany) was used to generate different heart rhythms and arrhythmias: sinus rhythm (SR) at 45, 60, 75, 90 and 120/min, supraventricular arrhythmias (e.g. sinus arrhythmia, atrial fibrillation) and ventricular arrhythmias (e.g. ventricular extrasystoles), pacemaker-ECGs, ST-changes and technical artifacts. The analysis of the image acquisition process was performed on a 64-row multidetector CT (Brilliance, Philips Medical Systems, Cleveland, USA). A prospectively triggered scan protocol as used for routine was applied (120 kV; 150 mA s; 0.4 s rotation and exposure time per scan; image acquisition predominantly in end-diastole at 75% R-R-interval, in arrhythmias with a mean heart rate above 80/min in systole at 45% of the R-R-interval; FOV 25 cm). The mean dose length product (DLP) and its percentage increase from baseline (SR at 60/min) were determined. Result: Radiation exposure can increase significantly when the heart rhythm deviates from sinus rhythm. ECG-changes leading to a significant DLP increase (p < 0.05) were bifocal pacemaker (61%), pacemaker dysfunction (22%), SVES (20%), ventricular salvo (20%), and atrial fibrillation (14%). Significantly (p < 0.05) prolonged scan time (>8 s) could be observed in bifocal pacemaker (12.8 s), pacemaker dysfunction (10.7 s), atrial fibrillation (10.3 s) and sinus arrhythmia (9.3 s). Conclusion: In prospectively ECG triggered CT, heart rate and rhythm can provoke different types of scanner performance, which can significantly alter radiation exposure and scan time. These results might have an important implication for indication, informed consent and contrast agent injection protocols.
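
    The dose metrics reported above can be reproduced with simple arithmetic: the percentage DLP increase over the sinus-rhythm baseline, and a rough effective dose via a chest conversion coefficient. The baseline DLP and the k factor of 0.014 mSv/(mGy·cm) below are assumptions, not values from the study.

```python
# Percentage DLP increase over the sinus-rhythm baseline, plus a rough
# effective dose via a chest conversion coefficient. The baseline DLP and the
# k factor of 0.014 mSv/(mGy*cm) are assumptions, not values from the study.
dlp_baseline = 250.0     # mGy*cm at sinus rhythm 60/min (assumed)
dlp_measured = 402.5     # mGy*cm for, e.g., a bifocal pacemaker rhythm (assumed)

increase_pct   = 100.0 * (dlp_measured - dlp_baseline) / dlp_baseline
k_chest        = 0.014   # mSv per mGy*cm (assumed adult chest coefficient)
effective_dose = k_chest * dlp_measured

print("DLP increase %.0f%%, effective dose %.1f mSv" % (increase_pct, effective_dose))
```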

  17. Impact of heart rate and rhythm on radiation exposure in prospectively ECG triggered computed tomography

    International Nuclear Information System (INIS)

    Luecke, Christian; Andres, Claudia; Foldyna, Borek; Nagel, Hans Dieter; Hoffmann, Janine; Grothoff, Matthias; Nitzsche, Stefan; Gutberlet, Matthias; Lehmkuhl, Lukas

    2012-01-01

    Purpose: To evaluate the influence of different heart rates and arrhythmias on scanner performance, image acquisition and applied radiation exposure in prospectively ECG triggered computed tomography (pCT). Materials and methods: An ECG simulator (EKG Phantom 320, Müller and Sebastiani Elektronik GmbH, Munich, Germany) was used to generate different heart rhythms and arrhythmias: sinus rhythm (SR) at 45, 60, 75, 90 and 120/min, supraventricular arrhythmias (e.g. sinus arrhythmia, atrial fibrillation) and ventricular arrhythmias (e.g. ventricular extrasystoles), pacemaker-ECGs, ST-changes and technical artifacts. The analysis of the image acquisition process was performed on a 64-row multidetector CT (Brilliance, Philips Medical Systems, Cleveland, USA). A prospectively triggered scan protocol as used for routine was applied (120 kV; 150 mA s; 0.4 s rotation and exposure time per scan; image acquisition predominantly in end-diastole at 75% R-R-interval, in arrhythmias with a mean heart rate above 80/min in systole at 45% of the R-R-interval; FOV 25 cm). The mean dose length product (DLP) and its percentage increase from baseline (SR at 60/min) were determined. Result: Radiation exposure can increase significantly when the heart rhythm deviates from sinus rhythm. ECG-changes leading to a significant DLP increase (p < 0.05) were bifocal pacemaker (61%), pacemaker dysfunction (22%), SVES (20%), ventricular salvo (20%), and atrial fibrillation (14%). Significantly (p < 0.05) prolonged scan time (>8 s) could be observed in bifocal pacemaker (12.8 s), pacemaker dysfunction (10.7 s), atrial fibrillation (10.3 s) and sinus arrhythmia (9.3 s). Conclusion: In prospectively ECG triggered CT, heart rate and rhythm can provoke different types of scanner performance, which can significantly alter radiation exposure and scan time. These results might have an important implication for indication, informed consent and contrast agent injection protocols

  18. SU-E-I-27: Establishing Target Exposure Index Values for Computed Radiography

    International Nuclear Information System (INIS)

    Murphy, N; Tchou, P; Belcher, K; Scott, A

    2014-01-01

    Purpose: To develop a standard set of target exposure index (TEI) values to be applied to Agfa Computed Radiography (CR) readers in accordance with International Electrotechnical Commission (IEC) 62494-1 (ed. 1.0). Methods: A large data cohort was collected from six USAF Medical Treatment Facilities that exclusively use Agfa CR Readers. Dose monitoring statistics were collected from each reader. The data were analyzed based on anatomic region, view, and processing speed class. The Agfa specific exposure metric, logarithmic mean (LGM), was converted to exposure index (EI) for each data set. The optimum TEI value was determined by minimizing the number of studies that fell outside the acceptable deviation index (DI) range of ±2 for phototimed techniques or a range of ±3 for fixed techniques. An anthropomorphic radiographic phantom was used to corroborate the TEI recommendations. Images were acquired of several anatomic regions and views using standard techniques. The images were then evaluated by two radiologists as either acceptable or unacceptable. The acceptable image with the lowest exposure and EI value was compared to the recommended TEI values using a passing DI range. Results: Target EI values were determined for a comprehensive list of anatomic regions and views. Conclusion: Target EI values must be established on each CR unit in order to provide a positive feedback system for the technologist. This system will serve as a mechanism to prevent under- or overexposure of patients. The TEI recommendations are a first attempt at a large scale process improvement with the goal of setting reasonable and standardized TEI values. The implementation and effectiveness of the recommended TEI values should be monitored and adjustments made as necessary
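
    The deviation index used above follows IEC 62494-1, DI = 10·log10(EI/TEI). A minimal sketch, assuming an example TEI of 400 and using the ±2 acceptance window for phototimed techniques mentioned in the abstract:

```python
# Deviation index arithmetic per IEC 62494-1: DI = 10 * log10(EI / TEI).
# The TEI of 400 and the example EI values are assumptions; the +/-2 window
# for phototimed techniques follows the abstract.
import math

def deviation_index(ei, tei):
    """Deviation index as defined in IEC 62494-1."""
    return 10.0 * math.log10(ei / tei)

tei = 400.0
for ei in (250.0, 400.0, 650.0):
    di = deviation_index(ei, tei)
    status = "OK" if abs(di) <= 2.0 else "outside the +/-2 window"
    print("EI=%5.0f  DI=%+.1f  %s" % (ei, di, status))
```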

  19. Automatic exposure control in computed tomography - an evaluation of systems from different manufacturers

    Energy Technology Data Exchange (ETDEWEB)

    Soederberg, Marcus; Gunnarsson, Mikael (Dept. of Medical Radiation Physics, Skaane Univ. Hospital, Malmoe (Sweden)), e-mail: marcus.soderberg@med.lu.se

    2010-07-15

    Background: Today, practically all computed tomography (CT) systems are delivered with automatic exposure control (AEC) systems operating with tube current modulation in three dimensions. Each of these systems has different specifications and operates somewhat differently. Purpose: To evaluate AEC systems from four different CT scanner manufacturers: General Electric (GE), Philips, Siemens, and Toshiba, considering their potential for reducing radiation exposure to the patient while maintaining adequate image quality. Material and Methods: The dynamics (adaptation along the longitudinal axis) of tube current modulation of each AEC system were investigated by scanning an anthropomorphic chest phantom using both 16- and 64-slice CT scanners from each manufacturer with the AEC systems activated and inactivated. The radiation dose was estimated using the parameters in the DICOM image information and image quality was evaluated based on image noise (standard deviation of CT numbers) calculated in 0.5 cm2 circular regions of interest situated throughout the spine region of the chest phantom. Results: We found that tube current modulation dynamics were similar among the different AEC systems, especially between GE and Toshiba systems and between Philips and Siemens systems. Furthermore, the magnitude of the reduction in the exposure dose was considerable, in the range of 35-60%. However, in general the image noise increased when the AEC systems were used, especially in regions where the tube current was greatly decreased, such as the lung region. However, the variation in image noise among images obtained along the scanning direction was lower when using the AEC systems compared with fixed mAs. Conclusion: The AEC systems available in modern CT scanners can contribute to a significant reduction in radiation exposure to the patient and the image noise becomes more uniform within any given scan.
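
    The image-noise metric used above (standard deviation of CT numbers within small circular regions of interest) is straightforward to compute. The sketch below runs on a synthetic image; with real data the array would come from the DICOM pixel data, and the ROI center and radius here are arbitrary.

```python
# Image noise as the standard deviation of CT numbers inside a circular ROI,
# as described above. The image here is synthetic; with real data the array
# would come from the DICOM pixel data (e.g. read with pydicom).
import numpy as np

def roi_noise(image, center, radius_px):
    """Standard deviation of pixel values inside a circular ROI."""
    yy, xx = np.indices(image.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius_px ** 2
    return float(np.std(image[mask]))

rng = np.random.default_rng(2)
image = 40.0 + 12.0 * rng.standard_normal((512, 512))   # fake soft-tissue HU map
print("ROI noise: %.1f HU" % roi_noise(image, center=(256, 256), radius_px=12))
```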

  20. Impact of temporal upscaling and chemical transport model horizontal resolution on reducing ozone exposure misclassification

    Science.gov (United States)

    Xu, Yadong; Serre, Marc L.; Reyes, Jeanette M.; Vizuete, William

    2017-10-01

    We have developed a Bayesian Maximum Entropy (BME) framework that integrates observations from a surface monitoring network and predictions from a Chemical Transport Model (CTM) to create improved exposure estimates that can be resolved into any spatial and temporal resolution. The flexibility of the framework allows for input of data in any choice of time scales and CTM predictions of any spatial resolution with varying associated degrees of estimation error and cost in terms of implementation and computation. This study quantifies the impact on exposure estimation error due to these choices by first comparing estimations errors when BME relied on ozone concentration data either as an hourly average, the daily maximum 8-h average (DM8A), or the daily 24-h average (D24A). Our analysis found that the use of DM8A and D24A data, although less computationally intensive, reduced estimation error more when compared to the use of hourly data. This was primarily due to the poorer CTM model performance in the hourly average predicted ozone. Our second analysis compared spatial variability and estimation errors when BME relied on CTM predictions with a grid cell resolution of 12 × 12 km2 versus a coarser resolution of 36 × 36 km2. Our analysis found that integrating the finer grid resolution CTM predictions not only reduced estimation error, but also increased the spatial variability in daily ozone estimates by 5 times. This improvement was due to the improved spatial gradients and model performance found in the finer resolved CTM simulation. The integration of observational and model predictions that is permitted in a BME framework continues to be a powerful approach for improving exposure estimates of ambient air pollution. The results of this analysis demonstrate the importance of also understanding model performance variability and its implications on exposure error.
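
    As a drastically simplified stand-in for the data-fusion idea above, one can blend a monitor-derived estimate with a CTM prediction by weighting each with the inverse of its error variance. Real BME operates on full probability distributions; the estimates and variances below are purely illustrative.

```python
# Drastically simplified stand-in for the observation/CTM fusion idea above:
# an inverse-variance weighted blend of a monitor-based estimate and a CTM
# prediction. Real BME works with full probability distributions; the numbers
# below are purely illustrative.
def precision_weighted_blend(obs_est, obs_var, ctm_pred, ctm_var):
    w_obs, w_ctm = 1.0 / obs_var, 1.0 / ctm_var
    estimate = (w_obs * obs_est + w_ctm * ctm_pred) / (w_obs + w_ctm)
    variance = 1.0 / (w_obs + w_ctm)
    return estimate, variance

# DM8A ozone (ppb): monitors suggest 62 with variance 16, the CTM says 70 with variance 36.
est, var = precision_weighted_blend(62.0, 16.0, 70.0, 36.0)
print("fused estimate %.1f ppb, variance %.1f" % (est, var))
```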

  1. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Human Exposure Model (HEM): A modular, web-based application to characterize near-field chemical exposures and releases

    Science.gov (United States)

    The U.S. EPA’s Chemical Safety and Sustainability research program is developing the Human Exposure Model (HEM) to assess near-field exposures to chemicals that occur in various populations over the entire life cycle of a consumer product. The model will be implemented as a...

  3. Computational Intelligence Agent-Oriented Modelling

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2006-01-01

    Vol. 5, No. 2 (2006), pp. 430-433. ISSN 1109-2777. R&D Projects: GA MŠk 1M0567. Institutional research plan: CEZ:AV0Z10300504. Keywords: multi-agent systems * adaptive agents * computational intelligence. Subject RIV: IN - Informatics, Computer Science

  4. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  5. Deployment Models: Towards Eliminating Security Concerns From Cloud Computing

    OpenAIRE

    Zhao, Gansen; Chunming, Rong; Jaatun, Martin Gilje; Sandnes, Frode Eika

    2010-01-01

    Cloud computing has become a popular choice as an alternative to investing new IT systems. When making decisions on adopting cloud computing related solutions, security has always been a major concern. This article summarizes security concerns in cloud computing and proposes five service deployment models to ease these concerns. The proposed models provide different security related features to address different requirements and scenarios and can serve as reference models for deployment. D...

  6. Cloud Computing Adoption Business Model Factors: Does Enterprise Size Matter?

    OpenAIRE

    Bogataj Habjan, Kristina; Pucihar, Andreja

    2017-01-01

    This paper presents the results of research investigating the impact of business model factors on cloud computing adoption. The introduced research model consists of 40 cloud computing business model factors, grouped into eight factor groups. Their impact and importance for cloud computing adoption were investigated among enterprises in Slovenia. Furthermore, differences in opinion according to enterprise size were investigated. Research results show no statistically significant impacts of in...

  7. A physiologically based pharmacokinetic (PB/PK) model for multiple exposure routes for soman in multiple species

    NARCIS (Netherlands)

    Sweeney, R.E.; Langenberg, J.P.; Maxwell, D.M.

    2006-01-01

    A physiologically based pharmacokinetic (PB/PK) model has been developed in the Advanced Continuous Simulation Language (ACSL) to describe blood and tissue concentration-time profiles of the C(±)P(-) stereoisomers of soman after inhalation, subcutaneous and intravenous exposures at low (0.8-1.0 × LD50),

  8. Econometric model for age- and population-dependent radiation exposures

    International Nuclear Information System (INIS)

    Sandquist, G.M.; Slaughter, D.M.; Rogers, V.C.

    1991-01-01

    The economic impact associated with ionizing radiation exposures in a given human population depends on numerous factors including the individual's mean economic status as a function of age, the age distribution of the population, the future life expectancy at each age, and the latency period for the occurrence of radiation-induced health effects. A simple mathematical model has been developed that provides an analytical methodology for estimating the societal econometrics associated with such exposures, so that radiation effects can be assessed and compared for economic evaluation

  9. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  10. Computational Models for Nonlinear Aeroelastic Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  11. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  12. Security Issues Model on Cloud Computing: A Case of Malaysia

    OpenAIRE

    Komeil Raisian; Jamaiah Yahaya

    2015-01-01

    With the development of cloud computing, the viewpoint of many people regarding infrastructure architectures, software distribution and improvement models has changed significantly. Cloud computing builds on pioneering deployment architectures such as grid computing, utility computing and autonomic computing. The fast transition towards it has increased concerns regarding security, a critical issue for the effective adoption of cloud computing. From the security v...

  13. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    Energy Technology Data Exchange (ETDEWEB)

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for Ultrasupercritical (USC) application to mitigate corrosion since these boilers will operate at much higher temperatures and pressures than supercritical (565 C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al{sub 2}O{sub 3} or Cr{sub 2}O{sub 3}. The computational modeling results identified a new series of Fe-25Cr-40Ni with or without 10 wt.% Al nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of continuous Al{sub 2}O{sub 3} scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed and, among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al{sub 2}O{sub 3} scale, than widely used MCrAlY coatings. However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

  14. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is very important for achieving intelligent information processing and harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  15. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  16. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  18. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry.

    Science.gov (United States)

    Caracappa, Peter F; Rhodes, Ashley; Fiedler, Derek

    2014-09-21

    Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too large to accurately represent the dimensions of small features such as the eye. The recently reduced recommended dose limits for the lens of the eye, which is a radiosensitive tissue with a significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method are compared against existing computational phantoms.
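
    The multi-resolution idea can be sketched as a coarse whole-body voxel array plus a finer sub-array for the eye region, with a lookup that prefers the fine grid when a point falls inside it. The array shapes, voxel sizes, tissue IDs, and grid origin below are invented for illustration and do not correspond to the phantom in the paper.

```python
# Coarse whole-body voxel array plus a finer eye sub-grid, with a lookup that
# prefers the fine grid when a point falls inside it. Shapes, voxel sizes,
# tissue IDs and the grid origin are invented for illustration.
import numpy as np

coarse   = np.zeros((60, 30, 30), dtype=np.int16)    # 5 mm voxels (assumed)
eye_fine = np.zeros((20, 20, 20), dtype=np.int16)    # 0.5 mm voxels (assumed)
eye_fine[8:12, 8:12, 8:12] = 7                       # lens tissue ID (assumed)
eye_origin_mm = np.array([250.0, 60.0, 70.0])        # fine-grid corner in body coordinates

def tissue_at(point_mm):
    """Return the tissue ID at a point, using the high-resolution eye grid when possible."""
    local = (np.asarray(point_mm, dtype=float) - eye_origin_mm) / 0.5
    idx = np.floor(local).astype(int)
    if np.all((idx >= 0) & (idx < eye_fine.shape)):
        return int(eye_fine[tuple(idx)])
    return int(coarse[tuple((np.asarray(point_mm, dtype=float) / 5.0).astype(int))])

print(tissue_at([254.5, 64.5, 74.5]))   # falls inside the lens sub-volume -> 7
```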

  19. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Science.gov (United States)

    McNally, Kevin; Cotton, Richard; Cocker, John; Jones, Kate; Bartels, Mike; Rick, David; Price, Paul; Loizou, George

    2012-01-01

    There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure. PMID:22719759
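
    A minimal illustration of this kind of exposure reconstruction replaces the PBPK model with a one-compartment stand-in and infers the unknown air concentration by random-walk Metropolis sampling. The kinetic constants, the lognormal measurement-error model, the prior, and the "observed" biomarker values below are all assumptions for the sketch, not quantities from the study.

```python
# Exposure reconstruction by Bayesian MCMC with a one-compartment stand-in
# for the PBPK model. Kinetic constants, the measurement-error model, the
# prior and the "observed" biomarker values are all assumptions.
import numpy as np

rng = np.random.default_rng(3)
k_up, k_el, t_exp = 0.8, 0.5, 4.0          # uptake (1/h), elimination (1/h), exposure hours

def biomarker(c_air):
    """Predicted blood level after t_exp hours of exposure to concentration c_air."""
    return (k_up * c_air / k_el) * (1.0 - np.exp(-k_el * t_exp))

obs = np.array([31.0, 28.5, 34.2])         # "measured" blood levels (made up)
sigma = 0.15                               # lognormal measurement error (assumed)

def log_post(c_air):
    if c_air <= 0.0:
        return -np.inf
    loglik = -0.5 * np.sum(((np.log(obs) - np.log(biomarker(c_air))) / sigma) ** 2)
    return loglik - np.log(c_air)          # vague 1/c prior (assumed)

# Random-walk Metropolis on the log scale for the unknown exposure concentration.
chain, c = [], 10.0
lp = log_post(c)
for _ in range(20_000):
    prop = c * np.exp(0.1 * rng.standard_normal())
    lp_prop = log_post(prop)
    # log(prop/c) corrects for the asymmetric multiplicative proposal.
    if np.log(rng.random()) < lp_prop - lp + np.log(prop / c):
        c, lp = prop, lp_prop
    chain.append(c)

post = np.array(chain[5_000:])             # discard burn-in
print("posterior median %.1f, 95%% interval (%.1f, %.1f)"
      % (np.median(post), *np.percentile(post, [2.5, 97.5])))
```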

  20. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Kevin McNally

    2012-01-01

    Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.

  1. Computed tomography of head: impact of the use of automatic exposure control

    International Nuclear Information System (INIS)

    Souza, Giordana Salvi de Souza; Froner, Ana Paula Pastre; Silva, Ana Maria Marques da

    2017-01-01

    The use of dose reduction systems in computed tomography is particularly important for pediatric patients, who have a high radiosensitivity, since they are in the growing phase. The aim of this study was to evaluate the impact of using the Siemens AEC Care Dose system on dose estimation in head computed tomography scans of pediatric patients. A retrospective study was conducted with 199 computed tomography head scans of pediatric patients, divided into different age groups, on a 16-channel Siemens Emotion scanner. It was possible to observe, for all age groups, that the use of the AEC Care Dose system reduced not only the CTDIvol but also the interquartile range, reducing the total dose in the investigated population. In conclusion, the AEC Care Dose system modulates the tube current according to the patient's dimensions, reducing individual and collective dose without affecting diagnostic quality. (author)

  2. Kriged and modeled ambient air levels of benzene in an urban environment: an exposure assessment study

    Directory of Open Access Journals (Sweden)

    Lai Dejian

    2011-03-01

    Full Text Available Abstract. Background: There is increasing concern regarding the potential adverse health effects of air pollution, particularly hazardous air pollutants (HAPs). However, quantifying exposure to these pollutants is problematic. Objective: Our goal was to explore the utility of kriging, a spatial interpolation method, for exposure assessment in epidemiologic studies of HAPs. We used benzene as an example and compared census tract-level kriged predictions to estimates obtained from the 1999 U.S. EPA National Air Toxics Assessment (NATA) Assessment System for Population Exposure Nationwide (ASPEN) model. Methods: Kriged predictions were generated for 649 census tracts in Harris County, Texas using estimates of annual benzene air concentrations from 17 monitoring sites operating in Harris and surrounding counties from 1998 to 2000. Year 1999 ASPEN modeled estimates were also obtained for each census tract. Spearman rank correlation analyses were performed on the modeled and kriged benzene levels. Weighted kappa statistics were computed to assess agreement between discretized kriged and modeled estimates of ambient air levels of benzene. Results: There was modest correlation between the predicted and modeled values across census tracts. Overall, 56.2%, 40.7%, 31.5% and 28.2% of census tracts were classified as having 'low', 'medium-low', 'medium-high' and 'high' ambient air levels of benzene, respectively, when comparing predicted and modeled benzene levels. The weighted kappa statistic was 0.26 (95% confidence interval (CI) = 0.20, 0.31), indicating poor agreement between the two methods. Conclusions: There was a lack of concordance between predicted and modeled ambient air levels of benzene. Applying methods of spatial interpolation for assessing exposure to ambient air pollutants in health effect studies is hindered by the placement and number of existing stationary monitors collecting HAP data. Routine monitoring needs to be expanded if we are to use these data
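
    The two comparison statistics used above, Spearman rank correlation and a weighted kappa on discretized categories, can be computed directly with scipy and scikit-learn. The census-tract values and category cut points below are invented for the example.

```python
# Spearman rank correlation and linearly weighted kappa on discretized
# categories, the two comparison statistics used above. The census-tract
# values and the category cut points are invented for the example.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(4)
kriged  = rng.lognormal(0.0, 0.4, 649)               # ug/m^3, toy values
modeled = kriged * rng.lognormal(0.0, 0.5, 649)      # imperfectly related values

rho, pval = spearmanr(kriged, modeled)

cuts = np.quantile(kriged, [0.25, 0.5, 0.75])        # 'low' .. 'high' categories
kriged_cat  = np.digitize(kriged, cuts)
modeled_cat = np.digitize(modeled, cuts)
kappa = cohen_kappa_score(kriged_cat, modeled_cat, weights="linear")

print("Spearman rho=%.2f (p=%.3g), weighted kappa=%.2f" % (rho, pval, kappa))
```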

  3. Reduction of radiation exposure and image quality using dose reduction tool on computed tomography fluoroscopy

    International Nuclear Information System (INIS)

    Sakabe, Daisuke; Tochihara, Syuichi; Ono, Michiaki; Tokuda, Masaki; Kai, Noriyuki; Nakato, Kengo; Hashida, Masahiro; Funama, Yoshinori; Murazaki, Hiroo

    2012-01-01

    The purpose of our study was to measure the reduction rate of radiation dose and the variability of image noise using angular beam modulation (ABM) in computed tomography (CT) fluoroscopy. The Alderson-Rando phantom and a homemade phantom were used in our study. These phantoms were scanned at on-center and off-center positions at -12 cm along the y-axis with and without the ABM technique. With the technique, the x-ray tube is turned off in a 100-degree angle sector centered at the 12 o'clock, 10 o'clock, and 2 o'clock positions during CT fluoroscopy. CT fluoroscopic images were obtained with tube voltage, 120 kV; tube current-time product per reconstructed image, 30 mAs; rotation time, 0.5 s/rot; slice thickness, 4.8 mm; and reconstruction kernel B30s in each scanning. After CT scanning, radiation exposure and image noise were measured and the image artifacts were evaluated with and without the technique. The reduction rate in radiation exposure with the technique relative to without was 75-80% at the on-center position, regardless of the angle position. In the case of the off-center position at -12 cm, the reduction rate was 50%. In contrast, image noise remained constant with and without the technique. Visual inspection of image artifacts showed almost the same scores with and without the technique, and no statistically significant difference was found between the two techniques (p>0.05). ABM is an appropriate tool for reducing radiation exposure while maintaining image noise and artifacts during CT fluoroscopy. (author)

  4. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  5. Assessment of radiation exposure on a dual-source computed tomography-scanner performing coronary computed tomography-angiography

    International Nuclear Information System (INIS)

    Kirchhoff, S.; Herzog, P.; Johnson, T.; Boehm, H.; Nikolaou, K.; Reiser, M.F.; Becker, C.H.

    2010-01-01

    Objective: The radiation exposure of a dual-source 64-channel multi-detector computed tomography scanner (Somatom Definition, Siemens, Germany) was assessed in a phantom study performing coronary CT angiography (CTCA) in comparison to patients' data randomly selected from routine scanning. Methods: 240 CT acquisitions of a computed tomography dose index (CTDI) phantom (PTW, Freiburg, Germany) were performed using a synthetically generated electrocardiography (ECG) signal with variable heart rates (30-180 beats per minute (bpm)). 120 measurements were acquired using continuous tube output; 120 measurements were performed using ECG-synchronized tube modulation. The pulsing window was set at minimum duration at 65% of the cardiac cycle between 30 and 75 bpm. From 90-180 bpm the pulsing window was set at 30-70% of the cardiac cycle. Automated pitch adaptation was always used. A comparison between phantom CTDI and two patient groups' CTDI corresponding to the two pulsing groups was performed. Results: Without ECG tube modulation, CTDI values were affected by heart-rate changes, resulting in 85.7 mGray (mGy) at 30 and 45 bpm, 65.5 mGy/60 bpm, 54.7 mGy/75 bpm, 46.5 mGy/90 bpm, 34.2 mGy/120 bpm, 27.0 mGy/150 bpm and 22.1 mGy/180 bpm, equal to effective doses between 14.5 mSievert (mSv) at 30/45 bpm and 3.6 mSv at 180 bpm. Using ECG tube modulation, the CTDI values were: 32.6 mGy/30 bpm, 36.6 mGy/45 bpm, 31.4 mGy/60 bpm, 26.8 mGy/75 bpm, 23.7 mGy/90 bpm, 19.4 mGy/120 bpm, 17.2 mGy/150 bpm and 15.6 mGy/180 bpm, equal to effective doses between 5.5 mSv at 30 bpm and 2.6 mSv at 180 bpm. Significant CTDI differences were found between patients with lower/moderate and higher heart rates in comparison to the phantom CTDI results. Conclusions: Dual source CTCA is particularly dose efficient at high heart rates when automated pitch adaptation is used, especially in combination with ECG-based tube modulation. However in clinical routine scanning for patients with higher heart rates

  6. Predictive modeling of terrestrial radiation exposure from geologic materials

    Science.gov (United States)

    Haber, Daniel A.

    Aerial gamma ray surveys are an important tool for national security, scientific, and industrial interests in determining locations of both anthropogenic and natural sources of radioactivity. There is a relationship between radioactivity and geology and in the past this relationship has been used to predict geology from an aerial survey. The purpose of this project is to develop a method to predict the radiologic exposure rate of the geologic materials in an area by creating a model using geologic data, images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), geochemical data, and pre-existing low spatial resolution aerial surveys from the National Uranium Resource Evaluation (NURE) Survey. Using these data, geospatial areas, referred to as background radiation units, homogenous in terms of K, U, and Th are defined and the gamma ray exposure rate is predicted. The prediction is compared to data collected via detailed aerial survey by our partner National Security Technologies, LLC (NSTec), allowing for the refinement of the technique. High resolution radiation exposure rate models have been developed for two study areas in Southern Nevada that include the alluvium on the western shore of Lake Mohave, and Government Wash north of Lake Mead; both of these areas are arid with little soil moisture and vegetation. We determined that by using geologic units to define radiation background units of exposed bedrock and ASTER visualizations to subdivide radiation background units of alluvium, regions of homogeneous geochemistry can be defined allowing for the exposure rate to be predicted. Soil and rock samples have been collected at Government Wash and Lake Mohave as well as a third site near Cameron, Arizona. K, U, and Th concentrations of these samples have been determined using inductively coupled mass spectrometry (ICP-MS) and laboratory counting using radiation detection equipment. In addition, many sample locations also have
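
    A minimal sketch of how radioelement concentrations can be turned into a predicted terrestrial gamma dose rate. The linear conversion coefficients below are typical literature values of the kind tabulated by the IAEA/UNSCEAR, not values taken from this thesis, and the example inputs are invented.

```python
# Hedged sketch: converting K, U, Th concentrations (e.g. from ICP-MS or lab
# counting) into an air-absorbed dose rate using linear conversion factors.
# Coefficients are typical literature values, NOT taken from this study.

COEF_nGy_per_h = {
    "K_pct":  13.1,   # nGy/h per % K
    "U_ppm":   5.7,   # nGy/h per ppm eU
    "Th_ppm":  2.5,   # nGy/h per ppm eTh
}

def dose_rate_nGy_h(k_pct: float, u_ppm: float, th_ppm: float) -> float:
    """Predicted terrestrial gamma dose rate in air, roughly 1 m above ground."""
    return (COEF_nGy_per_h["K_pct"] * k_pct
            + COEF_nGy_per_h["U_ppm"] * u_ppm
            + COEF_nGy_per_h["Th_ppm"] * th_ppm)

# Example: a hypothetical alluvial background-radiation unit
print(f"{dose_rate_nGy_h(k_pct=2.1, u_ppm=2.5, th_ppm=9.0):.0f} nGy/h")
```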

  7. Reducing overlay sampling for APC-based correction per exposure by replacing measured data with computational prediction

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Oh, Jong Hun; Kim, Hyun Sik; Sung, Jun Ha; Kea, Marc

    2016-03-01

    One of the keys to successful mass production of sub-20nm nodes in the semiconductor industry is the development of an overlay correction strategy that can meet specifications, reduce the number of layers that require dedicated chuck overlay, and minimize measurement time. Three important aspects of this strategy are: correction per exposure (CPE), integrated metrology (IM), and the prioritization of automated correction over manual subrecipes. The first and third aspects are accomplished through an APC system that uses measurements from production lots to generate CPE corrections that are dynamically applied to future lots. The drawback of this method is that production overlay sampling must be extremely high in order to provide the system with enough data to generate CPE. That drawback makes IM particularly difficult because of the throughput impact that can be created on expensive bottleneck photolithography process tools. The goal is to realize the cycle time and feedback benefits of IM coupled with the enhanced overlay correction capability of automated CPE without impacting process tool throughput. This paper will discuss the development of a system that sends measured data with reduced sampling via an optimized layout to the exposure tool's computational modelling platform to predict and create "upsampled" overlay data in a customizable output layout that is compatible with the fab user CPE APC system. The result is dynamic CPE without the burden of extensive measurement time, which leads to increased utilization of IM.
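
    A hedged sketch of the "upsampling" idea described above: fit a simple per-field overlay model to sparse measured points and evaluate it on a dense correction-per-exposure layout. The second-order polynomial model is purely illustrative and is not the scanner vendor's actual computational modelling platform.

```python
# Illustrative only: least-squares fit of sparse overlay measurements,
# then prediction ("upsampling") at a dense CPE layout.
import numpy as np

def fit_overlay_model(xy_meas, dxdy_meas):
    """Fit dx, dy as 2nd-order polynomials of field position (least squares)."""
    x, y = xy_meas[:, 0], xy_meas[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, dxdy_meas, rcond=None)
    return coeffs                       # shape (6, 2): columns are dx, dy

def predict_overlay(coeffs, xy_dense):
    x, y = xy_dense[:, 0], xy_dense[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return A @ coeffs                   # "upsampled" overlay at the dense layout

# Example with synthetic measurements on a normalised wafer coordinate system
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(20, 2))                       # sparse measured sites
dxdy = 0.5 * xy + 0.1 * xy[:, ::-1] + rng.normal(0, 0.05, (20, 2))
dense = np.mgrid[-1:1:9j, -1:1:9j].reshape(2, -1).T         # dense output layout
print(predict_overlay(fit_overlay_model(xy, dxdy), dense).shape)   # (81, 2)
```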

  8. Ocean Modeling and Visualization on Massively Parallel Computer

    Science.gov (United States)

    Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.

    1997-01-01

    Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.

  9. Predictive Models and Tools for Screening Chemicals under TSCA: Consumer Exposure Models 1.5

    Science.gov (United States)

    CEM contains a combination of models and default parameters which are used to estimate inhalation, dermal, and oral exposures to consumer products and articles for a wide variety of product and article use categories.

  10. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rautenstrauch

    2004-09-10

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception.

  11. Inhalation Exposure Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    K. Rautenstrauch

    2004-01-01

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception.
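
    A minimal sketch of the inhalation pathway chain described in these two records (it is not the ERMYN code itself): the air activity concentration follows from mass loading and soil activity, and the dose contribution from breathing rate, exposure time and an inhalation dose coefficient. All numerical values in the example are assumed, illustrative inputs.

```python
# Minimal, hedged sketch of the inhalation pathway (not ERMYN):
# air concentration = mass loading x soil activity concentration,
# dose = air concentration x breathing rate x exposure time x dose coefficient.

def inhalation_dose_Sv(
    soil_activity_Bq_per_kg: float,   # radionuclide concentration in surface soil
    mass_loading_kg_per_m3: float,    # resuspended particulate mass concentration
    breathing_rate_m3_per_h: float,   # receptor breathing rate
    exposure_time_h: float,           # annual time spent breathing contaminated air
    dose_coeff_Sv_per_Bq: float,      # inhalation dose coefficient for the nuclide
) -> float:
    air_conc_Bq_per_m3 = soil_activity_Bq_per_kg * mass_loading_kg_per_m3
    intake_Bq = air_conc_Bq_per_m3 * breathing_rate_m3_per_h * exposure_time_h
    return intake_Bq * dose_coeff_Sv_per_Bq

# Example: 1000 Bq/kg in soil, mass loading of 3e-8 kg/m3 (~30 ug/m3)
print(inhalation_dose_Sv(1000, 3e-8, 1.2, 2000, 5e-8), "Sv per year")
```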

  12. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  13. Airfoil Computations using the γ - Reθ Model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.

    computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64- 018, NACA64-218, NACA64-418 and NACA64-618 and the results...

  14. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
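
    In the spirit of the paper above, but not the authors' code, the following short Python sketch simulates a two-phase (tandem) queueing system with Poisson arrivals and exponential service times and estimates the mean sojourn time by Monte Carlo; the arrival and service rates are arbitrary example values.

```python
# Illustrative Monte Carlo simulation of a two-phase tandem queue
# (Poisson arrivals, exponential service at each phase, FIFO, one server per phase).
import random

def simulate_tandem_queue(n_customers=100_000, lam=0.8, mu=(1.2, 1.5), seed=1):
    random.seed(seed)
    arrival = 0.0
    free_at = [0.0, 0.0]           # time each phase's server next becomes free
    total_sojourn = 0.0
    for _ in range(n_customers):
        arrival += random.expovariate(lam)       # next Poisson arrival
        t = arrival
        for k, rate in enumerate(mu):            # pass through each phase in turn
            start = max(t, free_at[k])           # wait if the server is busy
            finish = start + random.expovariate(rate)
            free_at[k] = finish
            t = finish
        total_sojourn += t - arrival
    return total_sojourn / n_customers

print(f"mean sojourn time ~ {simulate_tandem_queue():.2f} time units")
```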

  15. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    Science.gov (United States)

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of dose conversion factors associated with radon progeny inhalation has become possible due to advancements in epidemiological health risk estimates in recent years. The enhancement of computational power and the development of numerical techniques allow computing dose conversion factors with increasing reliability. The objective of this study was to develop an integrated model and software based on a self-developed airway deposition code, the authors' own bronchial dosimetry model and the computational methods accepted by the International Commission on Radiological Protection (ICRP) to calculate dose conversion coefficients for different exposure conditions. The model was tested by applying it to exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named the PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called the PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses.
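
    As a hedged illustration of how a dose conversion factor of this kind is applied (not part of the authors' software), the sketch below converts a radon concentration, equilibrium factor and occupancy into an exposure in working level months (WLM) and multiplies it by a DCF; the 3700 Bq/m³ and 170 h constants are the conventional WL/WLM definitions, while the example inputs are assumed.

```python
# Hedged sketch: applying a dose conversion factor (mSv per WLM) to a
# cumulative radon-progeny exposure estimated from concentration and occupancy.

def exposure_WLM(radon_Bq_m3: float, equilibrium_factor: float, hours: float) -> float:
    # 1 WL corresponds to ~3700 Bq/m3 equilibrium-equivalent radon;
    # 1 WLM is 170 hours of exposure at 1 WL.
    working_level = radon_Bq_m3 * equilibrium_factor / 3700.0
    return working_level * hours / 170.0

def annual_effective_dose_mSv(radon_Bq_m3, equilibrium_factor, hours, dcf_mSv_per_WLM):
    return exposure_WLM(radon_Bq_m3, equilibrium_factor, hours) * dcf_mSv_per_WLM

# Example: a dwelling at 200 Bq/m3, F = 0.4, 7000 h/yr, home DCF of 8 mSv/WLM
print(f"{annual_effective_dose_mSv(200, 0.4, 7000, 8):.1f} mSv per year")
```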

  16. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of systems based on computational intelligence is to process and interpret data of various kinds, so that CI is closely connected with the growth of available data as well as the capability to process them, two mutually supportive factors. Developed theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, requesting systems in banking, diagnostic systems, expert systems and many other practical implementations. This book consists of 16 contributed chapters by subject experts who are specialized in the various topics addressed in this book. The special chapters have been brought ...

  17. COMPUTATIONAL MODELING OF AIRFLOW IN NONREGULAR SHAPED CHANNELS

    Directory of Open Access Journals (Sweden)

    A. A. Voronin

    2013-05-01

    Full Text Available The basic approaches to computational modeling of airflow in the human nasal cavity are analyzed. Different models of turbulent flow which may be used in order to calculate air velocity and pressure are discussed. Experimental measurement results of airflow temperature are illustrated. A geometrical model of the human nasal cavity reconstructed from computed tomography scans and numerical simulation results of airflow inside this model are also given. Spatial distributions of velocity and temperature for inhaled and exhaled air are shown.

  18. Modeling emission rates and exposures from outdoor cooking

    Science.gov (United States)

    Edwards, Rufus; Princevac, Marko; Weltman, Robert; Ghasemian, Masoud; Arora, Narendra K.; Bond, Tami

    2017-09-01

    Approximately 3 billion individuals rely on solid fuels for cooking globally. For a large portion of these - an estimated 533 million - cooking is outdoors, where emissions from cookstoves pose a health risk to both cooks and other household and village members. Models that estimate emission rates from stoves in indoor environments that would meet WHO air quality guidelines (AQG) explicitly do not account for outdoor cooking. The objectives of this paper are to link health-based exposure guidelines with emissions from outdoor cookstoves, using a Monte Carlo simulation of cooking times from Haryana, India, coupled with inverse Gaussian dispersion models. Mean emission rates for outdoor cooking that would result in incremental increases in personal exposure equivalent to the WHO AQG during a 24-h period were 126 ± 13 mg/min for cooking while squatting and 99 ± 10 mg/min while standing. Emission rates modeled for outdoor cooking are substantially higher than emission rates for indoor cooking to meet AQG, because the models estimate the impact of emissions on personal exposure concentrations rather than microenvironment concentrations, and because the smoke disperses more readily outdoors compared to indoor environments. As a result, many more stoves, including the best performing solid-fuel biomass stoves, would meet AQG when cooking outdoors, but may also result in substantial localized neighborhood pollution depending on housing density. Inclusion of the neighborhood impact of pollution should be addressed more formally both in guidelines on emission rates from stoves that would be protective of health, and also in wider health impact evaluation efforts and burden of disease estimates. Emissions guidelines should better represent the different contexts in which stoves are being used, especially because in these contexts the best performing solid fuel stoves have the potential to provide significant benefits.
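
    A hedged illustration of the inverse-dispersion idea: for a ground-level Gaussian plume the centreline concentration is C = Q / (π u σy σz), so the emission rate consistent with a target incremental exposure is Q = C_target · π · u · σy · σz. This is a textbook simplification, not the paper's coupled Monte Carlo model, and the dispersion parameters and target concentration below are placeholders.

```python
# Simplified ground-level Gaussian plume, inverted for the emission rate that
# yields a target centreline concentration. Illustrative parameters only.
import math

def emission_rate_mg_min(c_target_ug_m3: float, wind_m_s: float,
                         sigma_y_m: float, sigma_z_m: float) -> float:
    """Emission rate giving the target centreline concentration at the cook."""
    q_ug_s = c_target_ug_m3 * math.pi * wind_m_s * sigma_y_m * sigma_z_m
    return q_ug_s * 60.0 / 1000.0      # ug/s -> mg/min

# Example: 35 ug/m3 PM2.5 target, light wind, near-field dispersion of ~1-2 m
print(f"{emission_rate_mg_min(35.0, 1.0, 1.5, 1.0):.1f} mg/min")
```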

  19. Development of a computational code for the internal doses assessment of the main radionuclides of occupational exposure at IPEN

    International Nuclear Information System (INIS)

    Claro, Thiago Ribeiro

    2011-01-01

    The dose resulting from internal contamination can be estimated with the use of biokinetic models combined with experimental results obtained from bioanalysis and assessment of the time of incorporation. The biokinetic models are represented by a set of compartments expressing the transport, retention and elimination of radionuclides from the body. ICRP Publications 66, 78 and 100 present compartmental models for the respiratory tract, the gastrointestinal tract and systemic distribution for an array of radionuclides of interest for radiological protection. The objective of this work is to develop a computational code for the internal dose assessment of the main radionuclides of occupational exposure at IPEN, thereby serving as an agile and efficient tool for the design, visualization and resolution of compartmental models of any nature. The architecture of the system was conceived as two independent software components: CBT, responsible for the setup and manipulation of models, and SSID, responsible for the mathematical solution of the models. Four different techniques are offered for the resolution of systems of equations, including semi-analytical and numerical methods, allowing for comparison of precision and performance of both. The software was developed in C#, using a Microsoft Access database and XML standards for file exchange with other applications. Compartmental models for uranium, thorium and iodine radionuclides were generated for the validation of the CBT software. The models were subsequently solved via the SSID software and the results compared with the values published in ICRP Publication 78. In all cases the system replicated the values published by the ICRP. (author)
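
    A minimal sketch of the kind of first-order compartmental system such software solves, written here in Python rather than the C# of the CBT/SSID code described above: dA/dt = M·A, where A holds the activity in each compartment and M the transfer-rate matrix. The three-compartment layout and its rate constants are purely illustrative and radioactive decay is omitted.

```python
# Hedged, illustrative compartment model solved numerically (not the CBT/SSID code).
import numpy as np
from scipy.integrate import solve_ivp

# Compartments: 0 = blood, 1 = organ, 2 = excreted (illustrative rates, per day)
k_blood_organ, k_blood_excr, k_organ_blood = 0.5, 1.0, 0.05
M = np.array([
    [-(k_blood_organ + k_blood_excr),  k_organ_blood, 0.0],
    [  k_blood_organ,                 -k_organ_blood, 0.0],
    [  k_blood_excr,                   0.0,           0.0],
])

A0 = np.array([1.0, 0.0, 0.0])          # unit intake into blood at t = 0
sol = solve_ivp(lambda t, A: M @ A, t_span=(0, 30), y0=A0,
                t_eval=np.linspace(0, 30, 7))

for t, a in zip(sol.t, sol.y.T):
    print(f"day {t:4.1f}: blood {a[0]:.3f}  organ {a[1]:.3f}  excreted {a[2]:.3f}")
```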

  20. Model Checking Quantified Computation Tree Logic

    NARCIS (Netherlands)

    Rensink, Arend; Baier, C; Hermanns, H.

    2006-01-01

    Propositional temporal logic is not suitable for expressing properties on the evolution of dynamically allocated entities over time. In particular, it is not possible to trace such entities through computation steps, since this requires the ability to freely mix quantification and temporal

  1. Computational compliance criteria in water hammer modelling

    Directory of Open Access Journals (Sweden)

    Urbanowicz Kamil

    2017-01-01

    Full Text Available Among many numerical methods (finite: difference, element, volume etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is most appreciated. With its help, it is possible to examine the effect of numerical discretisation carried over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, indicate also that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to the authors' own and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.

  2. Computational compliance criteria in water hammer modelling

    Science.gov (United States)

    Urbanowicz, Kamil

    2017-10-01

    Among many numerical methods (finite: difference, element, volume etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is most appreciated. With its help, it is possible to examine the effect of numerical discretisation carried over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, indicate also that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to the authors' own and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
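
    A small sketch of a compliance check in the spirit of the CCC suggested in the two records above: verify that each pipe is divided into at least 10 reaches and that the Courant-Friedrichs-Lewy number a·Δt/Δx stays at (or very close to) one. The pipe data in the example are assumed values, and the tolerance is an arbitrary choice.

```python
# Illustrative grid-compliance check for a method-of-characteristics model.

def check_moc_grid(length_m: float, wave_speed_m_s: float, dt_s: float,
                   min_reaches: int = 10) -> dict:
    dx = wave_speed_m_s * dt_s               # reach length implied by dt at CFL = 1
    n_reaches = round(length_m / dx)         # integer number of reaches per pipe
    cfl = wave_speed_m_s * dt_s / (length_m / n_reaches)
    return {"reaches": n_reaches,
            "CFL": round(cfl, 3),
            "compliant": n_reaches >= min_reaches and abs(cfl - 1.0) < 0.05}

# Example: 100 m pipe, pressure wave speed 1200 m/s, time step 8 ms
print(check_moc_grid(length_m=100.0, wave_speed_m_s=1200.0, dt_s=0.008))
```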

  3. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real-world applications, based upon their strong background in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation, where tools such as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  4. Lyssavirus infection: 'low dose, multiple exposure' in the mouse model.

    Science.gov (United States)

    Banyard, Ashley C; Healy, Derek M; Brookes, Sharon M; Voller, Katja; Hicks, Daniel J; Núñez, Alejandro; Fooks, Anthony R

    2014-03-06

    The European bat lyssaviruses (EBLV-1 and EBLV-2) are zoonotic pathogens present within bat populations across Europe. The maintenance and transmission of lyssaviruses within bat colonies is poorly understood. Cases of repeated isolation of lyssaviruses from bat roosts have raised questions regarding the maintenance and intraspecies transmissibility of these viruses within colonies. Furthermore, the significance of seropositive bats in colonies remains unclear. Due to the protected nature of European bat species, and hence restrictions to working with the natural host for lyssaviruses, this study analysed the outcome following repeat inoculation of low doses of lyssaviruses in a murine model. A standardized dose of virus, EBLV-1, EBLV-2 or a 'street strain' of rabies (RABV), was administered via a peripheral route to attempt to mimic what is hypothesized as natural infection. Each mouse (n=10/virus/group/dilution) received four inoculations, two doses in each footpad over a period of four months, alternating footpad with each inoculation. Mice were tail bled between inoculations to evaluate antibody responses to infection. Mice succumbed to infection after each inoculation, with 26.6% of mice developing clinical disease following the initial exposure across all dilutions (RABV, 32.5% (n=13/40); EBLV-1, 35% (n=13/40); EBLV-2, 12.5% (n=5/40)). Interestingly, the lowest dose caused clinical disease in some mice upon first exposure (RABV, 20% (n=2/10) after the first inoculation; RABV, 12.5% (n=1/8) after the second inoculation; EBLV-2, 10% (n=1/10) after the primary inoculation). Furthermore, five mice developed clinical disease following the second exposure to live virus (RABV, n=1; EBLV-1, n=1; EBLV-2, n=3), although histopathological examination indicated that the primary inoculation was the most probable cause of death due to the levels of inflammation and virus antigen distribution observed. All the remaining mice (RABV, n=26; EBLV-1, n=26; EBLV-2, n=29) survived the tertiary and

  5. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to freeform curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. … -Computer-Aided Design, 42, 2010 … Many books concentrate on computer programming and soon beco

  6. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. As discussed in the paper, the new categories of services introduced will gradually replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. This paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  7. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

    SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented and an application for smart cars is also introduced. The book can serve as a valuable reference work for resea

  8. Demonstration of a computer model for residual radioactive material guidelines, RESRAD

    International Nuclear Information System (INIS)

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III

    1989-01-01

    A computer model was developed to calculate residual radioactive material guidelines for the US Department of Energy (DOE). This model, called RESRAD, can be run on an IBM or IBM-compatible microcomputer. Seven potential exposure pathways from contaminated soil are analyzed, including external radiation exposure and internal radiation exposure from inhalation and food ingestion. The RESRAD code has been applied to several DOE sites to derive soil cleanup guidelines. The experience gained indicates that a comprehensive set of site-specific hydrogeologic and geochemical input parameters must be used for a realistic pathway analysis. The RESRAD code is a useful tool; it is easy to run and very user-friendly. 6 refs., 12 figs

  9. Computer modeling of ORNL storage tank sludge mobilization and mixing

    International Nuclear Information System (INIS)

    Terrones, G.; Eyler, L.L.

    1993-09-01

    This report presents and analyzes the results of the computer modeling of mixing and mobilization of sludge in horizontal, cylindrical storage tanks using submerged liquid jets. The computer modeling uses the TEMPEST computational fluid dynamics computer program. The horizontal, cylindrical storage tank configuration is similar to the Melton Valley Storage Tanks (MVST) at Oak Ridge National Laboratory (ORNL). The MVST tank contents exhibit non-homogeneous, non-Newtonian rheology characteristics. The eventual goals of the simulations are to determine under what conditions sludge mobilization using submerged liquid jets is feasible in tanks of this configuration, and to estimate mixing times required to approach homogeneity of the contents of the tanks.

  10. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.; Douglas, Craig C.

    2010-01-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models

  11. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regressive), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of these is that a tree can be regarded as a fractal object, which is a collection of self-similar objects and combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between the mathematical models of crown growth and light propagation through the canopy. The computer approach gives the possibility to visualize crown development and to calibrate the model on experimental data. In the paper different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs

  12. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We introduce various models that were developed during the past decades. According to their target architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  13. Patentability aspects of computational cancer models

    Science.gov (United States)

    Lishchuk, Iryna

    2017-07-01

    Multiscale cancer models, implemented in silico, simulate tumor progression at various spatial and temporal scales. Given their innovative substance and their potential to be applied as decision support tools in clinical practice, patenting and obtaining patent rights in cancer models seems prima facie possible. In this paper we inquire what legal hurdles cancer models need to overcome in order to be patented.

  14. SHEDS-HT: An Integrated Probabilistic Exposure Model for Prioritizing Exposures to Chemicals with Near-Field and Dietary Sources

    Science.gov (United States)

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologi...

  15. High Throughput Exposure Modeling of Semi-Volatile Chemicals in Articles of Commerce (ACS)

    Science.gov (United States)

    Risk due to chemical exposure is a function of both chemical hazard and exposure. Near-field exposures to chemicals in consumer products are identified as the main drivers of exposure and yet are not well quantified or understood. The ExpoCast project is developing a model that e...

  16. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  17. Estimation of radiation exposure of prospectively triggered 128-slice computed tomography coronary angiography

    Energy Technology Data Exchange (ETDEWEB)

    Ketelsen, D.; Fenchel, M.; Thomas, C.; Boehringer, N.; Tsiflikas, I.; Kaempf, M.; Syha, R.; Claussen, C.D.; Heuschmid, M. [Tuebingen Univ. (Germany). Abt. fuer Diagnostische und Interventionelle Radiologie]; Buchgeister, M. [Tuebingen Univ. (Germany). Medizinische Physik]

    2010-12-15

    Purpose: To estimate the effective dose of prospectively triggered computed tomography coronary angiography (CTCA) in step-and-shoot (SAS) mode, depending on the tube current and tube voltage modulation. Materials and Methods: For dose measurements, an Alderson-Rando phantom equipped with thermoluminescent dosimeters was used. The effective dose was calculated according to ICRP 103. Exposure was performed on a 128-slice single-source scanner providing a collimation of 128 x 0.6 mm and a rotation time of 0.38 seconds. CTCA in the SAS mode was acquired with variation of the tube current (160, 240, 320 mAs) and tube voltage (100, 120, 140 kV) at a simulated heart rate of 60 beats per minute and a scan range of 13.5 cm. Results: Depending on gender, tube current and tube voltage, the effective dose of a CTCA in SAS mode varies from 2.8 to 10.8 mSv. Due to breast tissue in the primary scan range, exposure in the case of females showed an increase of up to 60.0 ± 0.4% compared to males. The dose reduction achieved by a reduction of tube current showed a significant positive, linear correlation to effective dose, with a possible decrease in the effective dose of up to 60.4% (r = 0.998; p = 0.044). The estimated effective dose can be reduced disproportionately by using a lower tube voltage, with a dose reduction of up to 52.4%. Conclusion: Further substantial dose reduction of low-dose CTCA in SAS mode can be achieved by adapting the tube current and tube voltage and should be implemented in the clinical routine, i.e. adapting those protocol parameters to patient body weight. (orig.)
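
    The effective dose quoted above follows the ICRP 103 tissue-weighting scheme, E = Σ w_T · H_T over the organ equivalent doses measured with the thermoluminescent dosimeters. The sketch below illustrates that sum for a few tissues only; the weighting factors are the ICRP 103 values for those tissues, while the organ doses are invented placeholders, not the study's measurements.

```python
# Hedged sketch of a (partial) ICRP 103 effective-dose calculation.

W_T = {"breast": 0.12, "lung": 0.12, "stomach": 0.12,
       "red_bone_marrow": 0.12, "oesophagus": 0.04, "liver": 0.04}

def partial_effective_dose_mSv(organ_dose_mSv: dict) -> float:
    """Effective-dose contribution from the tissues listed in W_T only."""
    return sum(W_T[t] * organ_dose_mSv.get(t, 0.0) for t in W_T)

# Placeholder organ equivalent doses (mSv) for one scan protocol
organ_doses = {"breast": 18.0, "lung": 22.0, "stomach": 4.0,
               "red_bone_marrow": 3.0, "oesophagus": 6.0, "liver": 5.0}
print(f"partial effective dose ~ {partial_effective_dose_mSv(organ_doses):.1f} mSv")
```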

  18. Computational model of cellular metabolic dynamics

    DEFF Research Database (Denmark)

    Li, Yanjun; Solomon, Thomas; Haus, Jacob M

    2010-01-01

    of the cytosol and mitochondria. The model simulated skeletal muscle metabolic responses to insulin corresponding to human hyperinsulinemic-euglycemic clamp studies. Insulin-mediated rate of glucose disposal was the primary model input. For model validation, simulations were compared with experimental data......: intracellular metabolite concentrations and patterns of glucose disposal. Model variations were simulated to investigate three alternative mechanisms to explain insulin enhancements: Model 1 (M.1), simple mass action; M.2, insulin-mediated activation of key metabolic enzymes (i.e., hexokinase, glycogen synthase......, by application of mechanism M.3, the model predicts metabolite concentration changes and glucose partitioning patterns consistent with experimental data. The reaction rate fluxes quantified by this detailed model of insulin/glucose metabolism provide information that can be used to evaluate the development...

  19. High burnup models in computer code fair

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, B K; Swami Prasad, P; Kushwaha, H S; Mahajan, S C; Kakodar, A [Bhabha Atomic Research Centre, Bombay (India)]

    1997-08-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins of both collapsible clad, as in PHWR and free standing clad as in LWR. The main emphasis in the development of this code is on evaluating the fuel performance at extended burnups and modelling of the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling the fission gas release, three different models are implemented, namely Physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled by using the model RADAR. For modelling pellet clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of sheath, necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of EPRI project "Light water reactor fuel rod modelling code evaluation" and also the analytical simulation of threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models, on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded through these case studies. (author). 12 refs, 5 figs.

  20. High burnup models in computer code fair

    International Nuclear Information System (INIS)

    Dutta, B.K.; Swami Prasad, P.; Kushwaha, H.S.; Mahajan, S.C.; Kakodar, A.

    1997-01-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins of both collapsible clad, as in PHWR and free standing clad as in LWR. The main emphasis in the development of this code is on evaluating the fuel performance at extended burnups and modelling of the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling the fission gas release, three different models are implemented, namely Physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled by using the model RADAR. For modelling pellet clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of sheath, necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of EPRI project "Light water reactor fuel rod modelling code evaluation" and also the analytical simulation of threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models, on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded through these case studies. (author). 12 refs, 5 figs

  1. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  2. Indicators of residential traffic exposure: Modelled NOX, traffic proximity, and self-reported exposure in RHINE III

    Science.gov (United States)

    Carlsen, Hanne Krage; Bäck, Erik; Eneroth, Kristina; Gislason, Thorarinn; Holm, Mathias; Janson, Christer; Jensen, Steen Solvang; Johannessen, Ane; Kaasik, Marko; Modig, Lars; Segersson, David; Sigsgaard, Torben; Forsberg, Bertil; Olsson, David; Orru, Hans

    2017-10-01

    Few studies have investigated associations between self-reported and modelled exposure to traffic pollution. The objective of this study was to examine correlations between self-reported traffic exposure and modelled (a) NOX and (b) traffic proximity in seven different northern European cities; Aarhus (Denmark), Bergen (Norway), Gothenburg, Umeå, and Uppsala (Sweden), Reykjavik (Iceland), and Tartu (Estonia). We analysed data from the RHINE III (Respiratory Health in Northern Europe, http://www.rhine.nu)

  3. Computational fluid dynamics modeling of Bacillus anthracis spore deposition in rabbit and human respiratory airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, S.; Suffield, S. R.; Recknagle, K. P.; Jacob, R. E.; Einstein, D. R.; Kuprat, A. P.; Carson, J. P.; Colby, S. M.; Saunders, J. H.; Hines, S. A.; Teeguarden, J. G.; Straub, T. M.; Moe, M.; Taft, S. C.; Corley, R. A.

    2016-09-01

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived respectively from computed tomography (CT) and µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation–exhalation breathing conditions using average species-specific minute volumes. Two different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the nasal sinus compared to the human at the same air concentration of anthrax spores. In contrast, higher spore deposition was predicted in the lower conducting airways of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology for deposition.

  4. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  5. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  6. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Directory of Open Access Journals (Sweden)

    Jiunn-Woei Lian PhD

    2017-01-01

    Full Text Available The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  7. Evaluation of the individual tube current setting in electrocardiogram-gated cardiac computed tomography estimated from plain chest computed tomography using computed tomography automatic exposure control

    International Nuclear Information System (INIS)

    Kishimoto, Junichi; Sakou, Toshio; Ohta, Yasutoshi

    2013-01-01

    The aim of this study was to estimate the tube current for cardiac computed tomography (CT) from a plain chest CT using CT automatic exposure control (CT-AEC), to obtain consistent image noise, and to optimize the scan tube current by individualizing the tube current. Sixty-five patients (Group A) underwent cardiac CT at a fixed tube current. The mAs value for plain chest CT using CT-AEC (AEC value) and the cardiac CT image noise were measured. The tube current needed to obtain the intended level of image noise in the cardiac CT was determined from their correlation. Another 65 patients (Group B) underwent cardiac CT with tube currents individually determined from the AEC value. Image noise was compared between Groups A and B. Image noise of cardiac CT in Group B was 24.4±3.1 Hounsfield units (HU) and was more uniform than in Group A (21.2±6.1 HU). The error relative to the desired image noise of 25 HU was lower in Group B (2.4%) than in Group A (15.2%). Individualized tube current selection based on the AEC value thus provided consistent image noise and a scan tube current optimized for cardiac CT. (author)
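
    A hedged sketch of the individualisation step described above: fit the relationship between the CT-AEC mAs value of the chest scan and the cardiac-CT image noise, then invert it to select the tube current-time product that should give the target noise of 25 HU. The calibration data and the assumed noise model (noise scaling with 1/sqrt(mAs)) are invented for illustration and are not the study's data or exact method.

```python
# Illustrative only: calibration fit followed by tube-current individualisation.
import numpy as np

# Group A style calibration data: (AEC value of chest CT, noise at a fixed reference mAs)
aec_values = np.array([20, 30, 40, 55, 70, 90], dtype=float)   # mAs (CT-AEC), assumed
noise_hu   = np.array([15, 18, 21, 25, 28, 32], dtype=float)   # HU at 300 mAs, assumed

# Linear fit: noise_at_reference_mAs ~ a * AEC_value + b
a, b = np.polyfit(aec_values, noise_hu, 1)

def individual_mAs(aec_value: float, target_noise_hu: float = 25.0,
                   reference_mAs: float = 300.0) -> float:
    predicted_noise = a * aec_value + b            # noise expected at the reference mAs
    # Assume noise scales ~ 1/sqrt(mAs): solve for the mAs giving the target noise
    return reference_mAs * (predicted_noise / target_noise_hu) ** 2

print(f"suggested tube current-time product: {individual_mAs(65.0):.0f} mAs")
```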

  8. Computer modeling of Cannabinoid receptor type 1

    Directory of Open Access Journals (Sweden)

    Sapundzhi Fatima

    2018-01-01

    Full Text Available Cannabinoid receptors are an important class of receptors as they are involved in various physiological processes such as appetite, pain sensation, mood, and memory. It is important to design receptor-selective ligands in order to treat a particular disorder. The aim of the present study is to model the structure of the cannabinoid receptor CB1 and to perform docking between the obtained models and known ligands. Two models of CB1 were prepared with two different methods (Modeller via Chimera, and MOE). They were used for docking with GOLD 5.2. A high correlation was established between the inhibitory constant Ki of CB1 cannabinoid ligands and the ChemScore scoring function of GOLD, which holds for both models. This suggests that the models of the CB1 receptor obtained could be used for docking studies and in further investigation and design of new potential, selective and active cannabinoids with the desired effects.

  9. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    OpenAIRE

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality...

  10. Computational modeling of shallow geothermal systems

    CERN Document Server

    Al-Khoury, Rafid

    2011-01-01

    A Step-by-step Guide to Developing Innovative Computational Tools for Shallow Geothermal Systems Geothermal heat is a viable source of energy and its environmental impact in terms of CO2 emissions is significantly lower than conventional fossil fuels. Shallow geothermal systems are increasingly utilized for heating and cooling of buildings and greenhouses. However, their utilization is inconsistent with the enormous amount of energy available underneath the surface of the earth. Projects of this nature are not getting the public support they deserve because of the uncertainties associated with

  11. Utero-fetal unit and pregnant woman modeling using a computer graphics approach for dosimetry studies.

    Science.gov (United States)

    Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle

    2009-01-01

    Potential sanitary effects related to electromagnetic fields exposure raise public concerns, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians, for anatomical accuracy and representativeness.

  12. An investigation of automatic exposure control calibration for chest imaging with a computed radiography system

    International Nuclear Information System (INIS)

    Moore, C S; Wood, T J; Beavis, A W; Saunderson, J R; Avery, G; Balcam, S; Needler, L

    2014-01-01

    The purpose of this study was to examine the use of three physical image quality metrics in the calibration of an automatic exposure control (AEC) device for chest radiography with a computed radiography (CR) imaging system. The metrics assessed were signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and mean effective noise equivalent quanta (eNEQm), all measured using a uniform chest phantom. Subsequent calibration curves were derived to ensure each metric was held constant across the tube voltage range. Each curve was assessed for its clinical appropriateness by generating computer simulated chest images with correct detector air kermas for each tube voltage, and grading these against reference images which were reconstructed at detector air kermas correct for the constant detector dose indicator (DDI) curve currently programmed into the AEC device. All simulated chest images contained clinically realistic projected anatomy and anatomical noise and were scored by experienced image evaluators. Constant DDI and CNR curves do not appear to provide optimized performance across the diagnostic energy range. Conversely, constant eNEQm and SNR do appear to provide optimized performance, with the latter being the preferred calibration metric given that it is easier to measure in practice. Medical physicists may use the SNR image quality metric described here when setting up and optimizing AEC devices for chest radiography CR systems with a degree of confidence that the resulting clinical image quality will be adequate for the required clinical task. However, this must be done with close cooperation of expert image evaluators, to ensure appropriate levels of detector air kerma. (paper)

  13. An investigation of automatic exposure control calibration for chest imaging with a computed radiography system.

    Science.gov (United States)

    Moore, C S; Wood, T J; Avery, G; Balcam, S; Needler, L; Beavis, A W; Saunderson, J R

    2014-05-07

    The purpose of this study was to examine the use of three physical image quality metrics in the calibration of an automatic exposure control (AEC) device for chest radiography with a computed radiography (CR) imaging system. The metrics assessed were signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and mean effective noise equivalent quanta (eNEQm), all measured using a uniform chest phantom. Subsequent calibration curves were derived to ensure each metric was held constant across the tube voltage range. Each curve was assessed for its clinical appropriateness by generating computer simulated chest images with correct detector air kermas for each tube voltage, and grading these against reference images which were reconstructed at detector air kermas correct for the constant detector dose indicator (DDI) curve currently programmed into the AEC device. All simulated chest images contained clinically realistic projected anatomy and anatomical noise and were scored by experienced image evaluators. Constant DDI and CNR curves do not appear to provide optimized performance across the diagnostic energy range. Conversely, constant eNEQm and SNR do appear to provide optimized performance, with the latter being the preferred calibration metric given as it is easier to measure in practice. Medical physicists may use the SNR image quality metric described here when setting up and optimizing AEC devices for chest radiography CR systems with a degree of confidence that resulting clinical image quality will be adequate for the required clinical task. However, this must be done with close cooperation of expert image evaluators, to ensure appropriate levels of detector air kerma.
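
    One common way to compute the simpler of the metrics discussed in these two records, shown here as a hedged sketch rather than the authors' measurement protocol: SNR from a single uniform-phantom region of interest (ROI), CNR from a signal ROI and a background ROI. eNEQm additionally requires MTF and noise power spectrum measurements and is not shown; the pixel data below are synthetic stand-ins.

```python
# Illustrative SNR and CNR calculations from phantom ROIs (synthetic data).
import numpy as np

def snr(roi: np.ndarray) -> float:
    """Signal-to-noise ratio of a uniform ROI: mean / standard deviation."""
    return roi.mean() / roi.std(ddof=1)

def cnr(roi_signal: np.ndarray, roi_background: np.ndarray) -> float:
    """Contrast-to-noise ratio between two ROIs, using pooled noise."""
    noise = np.sqrt(0.5 * (roi_signal.var(ddof=1) + roi_background.var(ddof=1)))
    return abs(roi_signal.mean() - roi_background.mean()) / noise

# Synthetic pixel values standing in for measured ROIs
rng = np.random.default_rng(42)
roi_a = rng.normal(loc=1000.0, scale=25.0, size=(64, 64))
roi_b = rng.normal(loc=900.0,  scale=25.0, size=(64, 64))
print(f"SNR ~ {snr(roi_a):.1f}, CNR ~ {cnr(roi_a, roi_b):.1f}")
```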

  14. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  15. The Architectural Designs of a Nanoscale Computing Model

    Directory of Open Access Journals (Sweden)

    Mary M. Eshaghian-Wilner

    2004-08-01

    Full Text Available A generic nanoscale computing model is presented in this paper. The model consists of a collection of fully interconnected nanoscale computing modules, where each module is a cube of cells made out of quantum dots, spins, or molecules. The cells dynamically switch between two states by quantum interactions among their neighbors in all three dimensions. This paper includes a brief introduction to the field of nanotechnology from a computing point of view and presents a set of preliminary architectural designs for fabricating the nanoscale model studied.

  16. Virtual reality exposure treatment of agoraphobia: a comparison of computer automatic virtual environment and head-mounted display

    NARCIS (Netherlands)

    Meyerbröker, K.; Morina, N.; Kerkhof, G.; Emmelkamp, P.M.G.; Wiederhold, B.K.; Bouchard, S.; Riva, G.

    2011-01-01

    In this study the effects of virtual reality exposure therapy (VRET) were investigated in patients with panic disorder and agoraphobia. The level of presence in VRET was compared between using either a head-mounted display (HMD) or a computer automatic virtual environment (CAVE). Results indicate

  17. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil (~6000 km²), with undulating relief and heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example for the study area show the local geoid model computed by the GRAVTool package, using 1377 terrestrial gravity observations, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was determined by geometric levelling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
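
    For context, the Remove-Compute-Restore decomposition referred to above is commonly summarized in three steps; the notation below is generic and not necessarily that used by the GRAVTool authors.

```latex
% Remove: residual gravity anomaly after subtracting the global model and terrain signals
\Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTM}}
% Compute: residual geoid height, e.g. by Stokes integration of the residual anomalies
N_{\mathrm{res}} = \mathcal{S}\{\Delta g_{\mathrm{res}}\}
% Restore: add back the long-wavelength (GGM) and short-wavelength (terrain) contributions
N = N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{RTM}}
```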

  18. Computational Modeling of Turbulent Spray Combustion

    NARCIS (Netherlands)

    Ma, L.

    2016-01-01

    The objective of the research presented in this thesis is development and validation of predictive models or modeling approaches of liquid fuel combustion (spray combustion) in hot-diluted environments, known as flameless combustion or MILD combustion. The goal is to combine good physical insight,

  19. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    -based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability...

  20. Design, development and validation of software for modelling dietary exposure to food chemicals and nutrients.

    Science.gov (United States)

    McNamara, C; Naddy, B; Rohan, D; Sexton, J

    2003-10-01

    The Monte Carlo computational system for stochastic modelling of dietary exposure to food chemicals and nutrients is presented. This system was developed through a European Commission-funded research project. It is accessible as a Web-based application service. The system allows and supports very significant complexity in the data sets used as the model input, but provides a simple, general purpose, linear kernel for model evaluation. Specific features of the system include the ability to enter (arbitrarily) complex mathematical or probabilistic expressions at each and every input data field, automatic bootstrapping on subjects and on subject food intake diaries, and custom kernels to apply brand information such as market share and loyalty to the calculation of food and chemical intake.
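
    A minimal sketch of the kind of linear Monte Carlo kernel described (per-food consumption times concentration, summed and normalized by body weight) is shown below; the food groups, distributions and body weight are hypothetical placeholders, not inputs or outputs of the system itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical inputs: daily consumption (g/day) and chemical concentration (mg/g)
# are expressed as probability distributions rather than point values.
foods = {
    "bread":  {"consumption": lambda n: rng.lognormal(4.5, 0.4, n),
               "concentration": lambda n: rng.uniform(0.001, 0.003, n)},
    "cheese": {"consumption": lambda n: rng.lognormal(3.0, 0.6, n),
               "concentration": lambda n: rng.uniform(0.002, 0.005, n)},
}

def simulate_exposure(n_subjects=10_000, body_weight_kg=70.0):
    """Monte Carlo estimate of daily intake (mg/kg bw/day):
    sum over foods of consumption x concentration, divided by body weight."""
    total = np.zeros(n_subjects)
    for food in foods.values():
        total += food["consumption"](n_subjects) * food["concentration"](n_subjects)
    return total / body_weight_kg

doses = simulate_exposure()
print(f"median = {np.median(doses):.4f} mg/kg bw/day, "
      f"P97.5 = {np.percentile(doses, 97.5):.4f} mg/kg bw/day")
```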

  1. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-09-24

    This analysis is one of the nine reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2003a) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents a set of input parameters for the biosphere model, and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for a Yucca Mountain repository. This report, ''Inhalation Exposure Input Parameters for the Biosphere Model'', is one of the five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the ''Technical Work Plan: for Biosphere Modeling and Expert Support'' (BSC 2003b). It should be noted that some documents identified in Figure 1-1 may be under development at the time this report is issued and therefore not available at that time. This figure is included to provide an understanding of how this analysis report contributes to biosphere modeling in support of the license application, and is not intended to imply that access to the listed documents is required to understand the contents of this analysis report. This analysis report defines and justifies values of mass loading, which is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Measurements of mass loading are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air surrounding crops and concentrations in air
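
    For orientation, a mass-loading approach of the kind described typically relates the activity concentration of a radionuclide in air to its concentration in surface soil through the mass loading S; the symbols below are generic and are not claimed to match the notation of the ERMYN reports.

```latex
C_{\mathrm{air},i} = S \, C_{\mathrm{soil},i},
\qquad
\left[\mathrm{Bq\,m^{-3}}\right] = \left[\mathrm{kg\,m^{-3}}\right] \times \left[\mathrm{Bq\,kg^{-1}}\right]
```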

  2. Inhalation Exposure Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis is one of the nine reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2003a) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents a set of input parameters for the biosphere model, and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for a Yucca Mountain repository. This report, ''Inhalation Exposure Input Parameters for the Biosphere Model'', is one of the five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the ''Technical Work Plan: for Biosphere Modeling and Expert Support'' (BSC 2003b). It should be noted that some documents identified in Figure 1-1 may be under development at the time this report is issued and therefore not available at that time. This figure is included to provide an understanding of how this analysis report contributes to biosphere modeling in support of the license application, and is not intended to imply that access to the listed documents is required to understand the contents of this analysis report. This analysis report defines and justifies values of mass loading, which is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Measurements of mass loading are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air surrounding crops and concentrations in air inhaled by a receptor. Concentrations in air to which the

  3. Computing broadband accelerograms using kinematic rupture modeling

    International Nuclear Information System (INIS)

    Ruiz Paredes, J.A.

    2007-05-01

    In order to make broadband kinematic rupture modeling more realistic with respect to dynamic modeling, physical constraints are added to the rupture parameters. To improve the modeling of the slip velocity function (SVF), an evolution of the k⁻² source model is proposed, which consists of decomposing the slip as a sum of sub-events by band of k. This model yields SVFs close to the solution proposed by Kostrov for a crack, while preserving the spectral characteristics of the radiated wave field, i.e. an ω² model with spectral amplitudes at high frequency scaled to the coefficient of directivity Cd. To better control directivity effects, a composite source description is combined with a scaling law defining the extent of the nucleation area for each sub-event. The resulting model allows the apparent coefficient of directivity to be reduced to a fraction of Cd, as well as reproducing the standard deviation of the new empirical attenuation relationships proposed for Japan. To make source models more realistic, a variable rupture velocity in agreement with the physics of the rupture must be considered. The approach followed, based on an analytical relation between fracture energy, slip and rupture velocity, leads to higher values of peak ground acceleration in the vicinity of the fault. Finally, to better account for the interaction of the wave field with the geological medium, a semi-empirical methodology is developed combining a composite source model with empirical Green functions, and is applied to the Yamaguchi Mw 5.9 earthquake. The modeled synthetics satisfactorily reproduce the main observed characteristics of the ground motions. (author)
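
    As background, the k⁻² slip spectrum mentioned above is often written schematically as roughly flat below a corner wavenumber k_c tied to the fault dimension and decaying as k⁻² above it; this generic form is shown only for orientation and is not necessarily the exact parameterization used in the thesis.

```latex
\left| \tilde{D}(k) \right| \;\propto\; \frac{\bar{D}}{1 + (k / k_{c})^{2}}
\qquad \Rightarrow \qquad
\left| \tilde{D}(k) \right| \sim k^{-2} \quad \text{for } k \gg k_{c}
```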

  4. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  5. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  6. The European computer model for optronic system performance prediction (ECOMOS)

    NARCIS (Netherlands)

    Kessler, S.; Bijl, P.; Labarre, L.; Repasi, E.; Wittenstein, W.; Bürsing, H.

    2017-01-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The

  7. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have found wide application in the study of the biological sciences and other exact-science fields such as agriculture, mathematics, computer science and the like. In this write-up, a list of computer programs for predicting, for instance, the structure of proteins is provided. Discussions on ...

  8. Methods for teaching geometric modelling and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Rotkov, S.I.; Faitel'son, Yu. Ts.

    1992-05-01

    This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.

  9. Use of an aggregate exposure model to estimate consumer exposure to fragrance ingredients in personal care and cosmetic products.

    Science.gov (United States)

    Safford, B; Api, A M; Barratt, C; Comiskey, D; Daly, E J; Ellis, G; McNamara, C; O'Mahony, C; Robison, S; Smith, B; Thomas, R; Tozer, S

    2015-08-01

    Ensuring the toxicological safety of fragrance ingredients used in personal care and cosmetic products is essential in product development and design, as well as in the regulatory compliance of the products. This requires an accurate estimation of consumer exposure which, in turn, requires an understanding of consumer habits and use of products. Where ingredients are used in multiple product types, it is important to take account of aggregate exposure in consumers using these products. This publication investigates the use of a newly developed probabilistic model, the Creme RIFM model, to estimate aggregate exposure to fragrance ingredients using the example of 2-phenylethanol (PEA). The output shown demonstrates the utility of the model in determining systemic and dermal exposure to fragrances from individual products, and aggregate exposure. The model provides valuable information not only for risk assessment, but also for risk management. It should be noted that data on the concentrations of PEA in products used in this article were obtained from limited sources and not the standard, industry wide surveys typically employed by the fragrance industry and are thus presented here to illustrate the output and utility of the newly developed model. They should not be considered an accurate representation of actual exposure to PEA. Copyright © 2015 Elsevier Inc. All rights reserved.
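
    The aggregate-exposure logic described (summing contributions from each product type used by a simulated consumer) can be sketched as follows; the product list, usage amounts, concentrations and retention/absorption factors are invented placeholders, not RIFM or Creme survey values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-product inputs: daily amount applied (g), fragrance ingredient
# concentration (fraction), retention on skin, and dermal absorption fraction.
products = [
    {"name": "shampoo",     "amount": (8.0, 2.0), "conc": 0.005, "retention": 0.01, "absorption": 0.5},
    {"name": "body lotion", "amount": (7.5, 2.5), "conc": 0.004, "retention": 1.00, "absorption": 0.5},
    {"name": "deodorant",   "amount": (1.5, 0.5), "conc": 0.010, "retention": 1.00, "absorption": 0.5},
]

def aggregate_exposure(n=10_000, bw_kg=60.0):
    """Sum systemic exposure contributions (mg/kg bw/day) over all products
    used by a simulated consumer on a given day."""
    total = np.zeros(n)
    for p in products:
        amount_g = np.clip(rng.normal(p["amount"][0], p["amount"][1], n), 0, None)
        dose_mg = amount_g * 1000.0 * p["conc"] * p["retention"] * p["absorption"]
        total += dose_mg / bw_kg
    return total

exposure = aggregate_exposure()
print(f"mean aggregate exposure = {exposure.mean():.3f} mg/kg bw/day")
```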

  10. PBDE exposure from food in Ireland: optimising data exploitation in probabilistic exposure modelling.

    Science.gov (United States)

    Trudel, David; Tlustos, Christina; Von Goetz, Natalie; Scheringer, Martin; Hungerbühler, Konrad

    2011-01-01

    Polybrominated diphenyl ethers (PBDEs) are a class of brominated flame retardants added to plastics, polyurethane foam, electronics, textiles, and other products. These products release PBDEs into the indoor and outdoor environment, thus causing human exposure through food and dust. This study models PBDE dose distributions from ingestion of food for Irish adults on a congener basis, using two probabilistic methods and one semi-deterministic method. One of the probabilistic methods was newly developed and is based on summary statistics of food consumption combined with a model generating realistic daily energy supply from food. Median (intermediate) doses of total PBDEs are in the range of 0.4-0.6 ng/kg(bw)/day for Irish adults. The 97.5th percentiles of total PBDE doses lie in a range of 1.7-2.2 ng/kg(bw)/day, which is comparable to doses derived for Belgian and Dutch adults. BDE-47 and BDE-99 were identified as the congeners contributing most to estimated intakes, accounting for more than half of the total doses. The most influential food groups contributing to this intake are lean fish and salmon, which together account for about 22-25% of the total doses.

  11. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance

  12. Transport modeling and advanced computer techniques

    International Nuclear Information System (INIS)

    Wiley, J.C.; Ross, D.W.; Miner, W.H. Jr.

    1988-11-01

    A workshop was held at the University of Texas in June 1988 to consider the current state of transport codes and whether improved user interfaces would make the codes more usable and accessible to the fusion community. Also considered was the possibility that a software standard could be devised to ease the exchange of routines between groups. It was noted that two of the major obstacles to exchanging routines now are the variety of geometrical representations and choices of units. While the workshop formulated no standards, it was generally agreed that good software engineering would aid in the exchange of routines, and that a continued exchange of ideas between groups would be worthwhile. It seems that before we begin to discuss software standards we should review the current state of computer technology, both hardware and software, to see what influence recent advances might have on our software goals. This is done in this paper

  13. Predictive Models and Computational Toxicology (II IBAMTOX)

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  14. Computational Models of Human Organizational Dynamics

    National Research Council Canada - National Science Library

    Courand, Gregg

    2000-01-01

    .... This is the final report for our Phase II SBIR project, conducted over three years. Our research program has contributed theory, methodology, and technology for organizational modeling and analysis...

  15. Computational Model for Spacecraft/Habitat Volume

    Data.gov (United States)

    National Aeronautics and Space Administration — Please note that funding to Dr. Simon Hsiang, a critical co-investigator for the development of the Spacecraft Optimization Layout and Volume (SOLV) model, was...

  16. Computational modeling and engineering in pediatric and congenital heart disease.

    Science.gov (United States)

    Marsden, Alison L; Feinstein, Jeffrey A

    2015-10-01

    Recent methodological advances in computational simulations are enabling increasingly realistic simulations of hemodynamics and physiology, driving increased clinical utility. We review recent developments in the use of computational simulations in pediatric and congenital heart disease, describe the clinical impact in modeling in single-ventricle patients, and provide an overview of emerging areas. Multiscale modeling combining patient-specific hemodynamics with reduced order (i.e., mathematically and computationally simplified) circulatory models has become the de-facto standard for modeling local hemodynamics and 'global' circulatory physiology. We review recent advances that have enabled faster solutions, discuss new methods (e.g., fluid structure interaction and uncertainty quantification), which lend realism both computationally and clinically to results, highlight novel computationally derived surgical methods for single-ventricle patients, and discuss areas in which modeling has begun to exert its influence including Kawasaki disease, fetal circulation, tetralogy of Fallot (and pulmonary tree), and circulatory support. Computational modeling is emerging as a crucial tool for clinical decision-making and evaluation of novel surgical methods and interventions in pediatric cardiology and beyond. Continued development of modeling methods, with an eye towards clinical needs, will enable clinical adoption in a wide range of pediatric and congenital heart diseases.

  17. Computational numerical modelling of plasma focus

    International Nuclear Information System (INIS)

    Brollo, Fabricio

    2005-01-01

    Several models for calculating the dynamics of the plasma focus have been developed. All of them start from the same physical principle: the current sheet runs down the anode length, ionizing and collecting the gas it finds in its way. This is known as the snow-plow model. Concerning the pinch compression, an MHD model is proposed, in which the plasma is treated as a fluid, specifically as a highly ionized gas. However, there are not many models that, taking into account thermal equilibrium inside the plasma, make approximate calculations of the maximum temperatures reached in the pinch. Besides, there are no models which use those temperatures to estimate the thermonuclear fusion neutron yield for the deuterium or deuterium-tritium gas-filled cases. In the PLADEMA network (Dense Magnetized Plasmas) a code was developed with the objective of describing the plasma focus dynamics at a conceptual engineering stage. The code calculates the principal variables (currents, time to focus, etc.) and estimates the neutron yield in deuterium-filled plasma focus devices. The code's experimental validation, in its axial and radial stages, was very successful. However, it was accepted that the compression stage should be reformulated to resolve a large variation of a parameter related to the velocity profiles of the particles trapped inside the pinch. The objectives of this work can be stated as follows: check the hypotheses of the compression model and develop a new model; implement the new model in the code; and compare results against experimental data from plasma focus devices around the world.

  18. Computer models and output, Spartan REM: Appendix B

    Science.gov (United States)

    Marlowe, D. S.; West, E. J.

    1984-01-01

    A computer model of the Spartan Release Engagement Mechanism (REM) is presented in a series of numerical charts and engineering drawings. A crack growth analysis code is used to predict the fracture mechanics of critical components.

  19. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of the particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  20. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  1. Model and Computing Experiment for Research and Aerosols Usage Management

    Directory of Open Access Journals (Sweden)

    Daler K. Sharipov

    2012-09-01

    Full Text Available The article presents a mathematical model for the study and management of aerosols released into the atmosphere, together with a numerical algorithm implemented as hardware and software systems for conducting computing experiments.

  2. Computational Models for Nonlinear Aeroelastic Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  3. Computational modelling of the impact of AIDS on business.

    Science.gov (United States)

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa is given, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality for a company, based on the anonymous HIV testing of company employees, and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and then the ASSA model projection for each category of employees is adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.
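
    The calibration step described (rescaling an external epidemic-model projection so that it matches the prevalence measured in the company's testing year) can be sketched as follows; the category, years and prevalence values are invented for illustration and are not ASSA outputs.

```python
# Illustrative calibration of external epidemic-model projections to a company's
# measured HIV prevalence (assumed numbers, not ASSA model output).
assa_projection = {          # projected prevalence by year for one employee category
    2005: 0.18, 2006: 0.19, 2007: 0.20, 2008: 0.21,
}
measured_year, measured_prevalence = 2006, 0.15   # anonymous workplace testing

scale = measured_prevalence / assa_projection[measured_year]
company_projection = {year: p * scale for year, p in assa_projection.items()}
print(company_projection)
```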

  4. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
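
    A generic sketch of the first step, estimating a focal coefficient across every combination of control variables to obtain a modeling distribution, is shown below; it uses simulated data and ordinary least squares and is not the authors' software.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                          # focal predictor
controls = {c: rng.normal(size=n) for c in ["z1", "z2", "z3"]}
y = 2.0 * x + controls["z1"] + rng.normal(size=n)

estimates = []
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        # Design matrix: intercept, focal predictor, and the chosen controls
        X = np.column_stack([np.ones(n), x] + [controls[c] for c in subset])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(beta[1])               # coefficient on the focal predictor

print(f"{len(estimates)} models; focal estimate ranges "
      f"from {min(estimates):.2f} to {max(estimates):.2f}")
```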

  5. Computational quench model applicable to the SMES/CICC

    Science.gov (United States)

    Luongo, Cesar A.; Chang, Chih-Lien; Partain, Kenneth D.

    1994-07-01

    A computational quench model accounting for the hydraulic peculiarities of the 200 kA SMES cable-in-conduit conductor has been developed. The model is presented and used to simulate the quench on the SMES-ETM. Conclusions are drawn concerning quench detection and protection. A plan for quench model validation is presented.

  6. Petri Net Modeling of Computer Virus Life Cycle | Ikekonwu ...

    African Journals Online (AJOL)

    The virus life cycle, which refers to the stages of development of a computer virus, is presented as a suitable area for the application of Petri nets. Petri nets, a powerful modeling tool in the field of dynamic system analysis, are applied to model the virus life cycle. Simulation of the derived model is also presented. The intention of ...

  7. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

    This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  8. Computational model of miniature pulsating heat pipes

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, Mario J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Givler, Richard C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-01-01

    The modeling work described herein represents Sandia National Laboratories (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced, thermal ground-plane (TGP), which is a device, of planar configuration, that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two phase (liquid and its vapor) working fluid confined in a closed loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat plate pulsating heat pipes (e.g. dynamic bubble nucleation, evaporation and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, which was demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio and orientation.

  9. Reduced radiation exposure to the mammary glands in thoracic computed tomography using organ-based tube-current modulation

    International Nuclear Information System (INIS)

    Munechika, Jiro; Ohgiya, Yoshimitsu; Gokan, Takehiko; Hashimoto, Toshi; Iwai, Tsugunori

    2013-01-01

    Organ-based tube-current modulation has been used to reduce radiation exposure to specific organs. However, there are no reports yet published on reducing radiation exposure in clinical cases. In this study, we assessed the reduction in radiation exposure to the mammary glands during thoracic computed tomography (CT) using X-CARE. In a phantom experiment, the use of X-CARE reduced radiation exposure at the midline of the precordial region by a maximum of 45.1%. In our corresponding clinical study, CT was performed using X-CARE in 15 patients, and without X-CARE in another 15. Compared to the non-X-CARE group, radiation exposure was reduced in the X-CARE group at the midline of the precordial region by 22.3% (P < 0.05). X-CARE thus reduced radiation exposure at the midline of the precordial region and allowed us to obtain consistent CT values without increasing noise. However, this study revealed increases in radiation exposure at the lateral sides of the breasts. It is conceivable that patients' breasts were laterally displaced by gravity under the standard thoracic imaging conditions. Further studies that consider factors such as body size and adjustment of imaging conditions may be needed in the future. (author)

  10. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  11. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: Radiation exposure and cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)

    2011-06-15

    Computed tomography (CT) was used for preoperative planning of minimal-invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body-mass-index (BMI) of 26.5 kg/m{sup 2} underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland) and patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis-of-variance. A process-cost-analysis from the hospital perspective was done. All CT examinations were of sufficient image quality for 3D-THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) modeled by the BMI (p < 0.0001) was calculated. The presence of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females were not significant (p = 0.08). Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 Euro were recorded. Preoperative CT for THA was associated with a slight and justifiable increase of radiation exposure in comparison to conventional radiographs and low per-patient costs.

  12. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on
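
    For reference, the linear no-threshold assumption being critiqued is usually written as excess risk rising linearly with dose from a background level; this schematic form is shown only as background and is not one of the NEOTRANS model equations.

```latex
R(D) = R_{0} + \alpha D , \qquad \alpha > 0 ,\; D \ge 0
```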

  13. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  14. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    been explored in this thesis by considering them as epidemic-like processes. A mathematical model has been developed based on differential equations, which studies the dynamics of the issues from the very beginning until the issues cease. This study extends classical models of the spread of epidemics...... to describe the phenomenon of contagious public outrage, which eventually leads to the spread of violence following a disclosure of some unpopular political decisions and/or activity. The results shed a new light on terror activity and provide some hint on how to curb the spreading of violence within...

  15. Infrastructure Model and Server Platform Management Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Full Text Available Cloud computing is a new technology that is still growing very rapidly. It makes the Internet the main medium for managing data and applications remotely. Cloud computing allows users to run an application without having to think about the infrastructure and its platforms, and other technical aspects such as memory, storage, and backup and restore can be handled very easily. This research models the infrastructure and management of the server platform in the computer network of the Faculty of Engineering, Jenderal Soedirman University. The first stage of the research is a literature study that surveys implementation models from previous work. The results are then combined with a new approach to the existing resources and implemented directly on the existing server network. The results show that the implementation of cloud computing technology is able to replace the existing platform network.

  16. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  17. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel use of computational cognitive models. In cognitive science, computational models have played the critical role of theories of human cognition. Many computational models have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply the models to complex, realistic phenomena. We call such situations "open-ended situations". In this study, MAC/FAC ("many are called, but few are chosen"), proposed by [Forbus 95], which models the two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story and retrieved cases that they had learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the MAC/FAC algorithms. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption that different similarities are involved in the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognition.
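
    A toy sketch of the MAC ("many are called") stage as summarized above, in which a cheap dot product of content vectors selects a handful of candidate cases for the more expensive structural-evaluation stage, is given below; the vectors, case names and cutoff are invented for illustration.

```python
import numpy as np

# Toy content vectors: predicate counts for each case in memory
# (illustrative; real MAC/FAC vectors are derived from structured representations).
memory = {
    "water-flow":   np.array([3, 1, 0, 2]),
    "heat-flow":    np.array([3, 1, 1, 2]),
    "solar-system": np.array([0, 2, 3, 1]),
}
probe = np.array([3, 1, 1, 1])   # content vector of the cue story

def mac_stage(probe, memory, keep=2):
    """'Many are called': rank cases by content-vector dot product and keep
    the best few for the structural-evaluation (FAC) stage."""
    scores = {name: float(vec @ probe) for name, vec in memory.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:keep]

print(mac_stage(probe, memory))
```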

  18. Global Stability of an Epidemic Model of Computer Virus

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2014-01-01

    Full Text Available With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses will persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is globally asymptotically stable, in accordance with reality. A parameter analysis of the equilibrium is also conducted.
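
    The abstract does not give the model's equations, so the sketch below uses a generic SIRS-style compartment model with node turnover purely to illustrate the kind of propagation model whose endemic equilibrium can be analyzed; the parameter values are arbitrary.

```python
def sirs_virus(S0=0.9, I0=0.1, R0=0.0, beta=0.6, gamma=0.2, delta=0.05,
               mu=0.01, dt=0.01, steps=20_000):
    """Euler integration of a generic SIRS-with-turnover model of virus spread:
    computers join and leave the network at rate mu and lose immunity at rate delta."""
    S, I, R = S0, I0, R0
    for _ in range(steps):
        dS = mu - beta * S * I + delta * R - mu * S
        dI = beta * S * I - gamma * I - mu * I
        dR = gamma * I - delta * R - mu * R
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    return S, I, R

print("long-run (S, I, R) = (%.3f, %.3f, %.3f)" % sirs_virus())
```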

  19. Multiphysics and Thermal Response Models to Improve Accuracy of Local Temperature Estimation in Rat Cortex under Microwave Exposure

    Science.gov (United States)

    Kodera, Sachiko; Gomez-Tames, Jose; Hirata, Akimasa; Masuda, Hiroshi; Arima, Takuji; Watanabe, Soichi

    2017-01-01

    The rapid development of wireless technology has led to widespread concerns regarding adverse human health effects caused by exposure to electromagnetic fields. Temperature elevation in biological bodies is an important factor that can adversely affect health. A thermophysiological model is desired to quantify microwave (MW) induced temperature elevations. In this study, parameters related to thermophysiological responses for MW exposures were estimated using an electromagnetic-thermodynamics simulation technique. To the authors’ knowledge, this is the first study in which parameters related to regional cerebral blood flow in a rat model were extracted at a high degree of accuracy through experimental measurements for localized MW exposure at frequencies exceeding 6 GHz. The findings indicate that the improved modeling parameters yield computed results that match well with the measured quantities during and after exposure in rats. It is expected that the computational model will be helpful in estimating the temperature elevation in the rat brain at multiple observation points (that are difficult to measure simultaneously) and in explaining the physiological changes in the local cortex region. PMID:28358345
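
    Temperature elevation in such coupled electromagnetic-thermal simulations is commonly governed by a Pennes-type bioheat equation with the specific absorption rate (SAR) as the electromagnetic heat source; the standard form below is shown for background (ρ, C, k are tissue density, specific heat and thermal conductivity, ρ_b, C_b, ω the blood density, specific heat and perfusion rate, T_b the blood temperature, Q_m the metabolic heat) and is not claimed to be the authors' exact formulation.

```latex
\rho C \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \rho \, \mathrm{SAR}
  - \rho_{b} C_{b} \, \omega \left( T - T_{b} \right)
  + Q_{m}
```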

  20. Computational Modeling Develops Ultra-Hard Steel

    Science.gov (United States)

    2007-01-01

    Glenn Research Center's Mechanical Components Branch developed a spiral bevel or face gear test rig for testing thermal behavior, surface fatigue, strain, vibration, and noise; a full-scale, 500-horsepower helicopter main-rotor transmission testing stand; a gear rig that allows fundamental studies of the dynamic behavior of gear systems and gear noise; and a high-speed helical gear test for analyzing thermal behavior for rotorcraft. The test rig provides accelerated fatigue life testing for standard spur gears at speeds of up to 10,000 rotations per minute. The test rig enables engineers to investigate the effects of materials, heat treat, shot peen, lubricants, and other factors on the gear's performance. QuesTek Innovations LLC, based in Evanston, Illinois, recently developed a carburized, martensitic gear steel with an ultra-hard case using its computational design methodology, but needed to verify surface fatigue, lifecycle performance, and overall reliability. The Battelle Memorial Institute introduced the company to researchers at Glenn's Mechanical Components Branch and facilitated a partnership allowing researchers at the NASA Center to conduct spur gear fatigue testing for the company. Testing revealed that QuesTek's gear steel outperforms the current state-of-the-art alloys used for aviation gears in contact fatigue by almost 300 percent. With the confidence and credibility provided by the NASA testing, QuesTek is commercializing two new steel alloys. Uses for this new class of steel are limitless in areas that demand exceptional strength for high throughput applications.

  1. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  2. A Computational Model of Spatial Development

    Science.gov (United States)

    Hiraki, Kazuo; Sashima, Akio; Phillips, Steven

    Psychological experiments on children's development of spatial knowledge suggest that experience of self-locomotion with visual tracking is an important factor. Yet the mechanism underlying development is unknown. We propose a robot that learns to mentally track a target object (i.e., maintaining a representation of an object's position when outside the field-of-view) as a model for spatial development. Mental tracking is considered as prediction of an object's position given the previous environmental state and motor commands, and the current environment state resulting from movement. Following Jordan & Rumelhart's (1992) forward modeling architecture, the system consists of two components: an inverse model of sensory input to desired motor commands; and a forward model of motor commands to desired sensory input (goals). The robot was tested on the 'three cups' paradigm (where children are required to select the cup containing the hidden object under various movement conditions). Consistent with child development, without the capacity for self-locomotion the robot's errors are self-centered. When given the ability of self-locomotion, the robot responds allocentrically.

  3. Electricity load modelling using computational intelligence

    NARCIS (Netherlands)

    Ter Borg, R.W.

    2005-01-01

    As a consequence of the liberalisation of the electricity markets in Europe, market players have to continuously adapt their future supply to match their customers' demands. This poses the challenge of obtaining a predictive model that accurately describes electricity loads, which is the topic of this thesis.

  4. Computational Modeling of Fluorescence Loss in Photobleaching

    DEFF Research Database (Denmark)

    Hansen, Christian Valdemar; Schroll, Achim; Wüstner, Daniel

    2015-01-01

    sequences as reaction–diffusion systems on segmented cell images. The cell geometry is extracted from microscopy images using the Chan–Vese active contours algorithm [8]. The PDE model is subsequently solved by the automated Finite Element software package FEniCS [20]. The flexibility of FEniCS allows...

  5. Radiation enhanced conduction in insulators: computer modelling

    International Nuclear Information System (INIS)

    Fisher, A.J.

    1986-10-01

    The report describes the implementation of the Klaffky-Rose-Goland-Dienes [Phys. Rev. B 21, 3610 (1980)] model of radiation-enhanced conduction and the codes used. The approach is demonstrated for the alumina data of Pells, Buckley, Hill and Murphy [AERE R.11715, 1985]. (author)

  6. GPSS and Modeling of Computer Communication Networks.

    Science.gov (United States)

    1982-04-01

    action block in a flowchart of the system being modeled. For instance, the process of capturing a facility for some length of time and then...because of the abundance of tutorial material available; whereas, far less complete tutorial material is available to beginners learning SIMSCRIPT

  7. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  8. Studies on the calibration of mammography automatic exposure mode with computed radiology

    International Nuclear Information System (INIS)

    Zhu Hongzhou; Shao Guoliang; Shi Lei; Liu Qing

    2010-01-01

    Objective: To optimize image quality and radiation dose by correcting mammography automatic exposure for computed radiography (CR), based on the automatic exposure control mode of a mammography film-screen system. Methods: The film-screen system (28 kV) was used to perform automatic exposure of 40 mm of plexiglass and obtain the standard exposure dose; the CR exposure mode, based on LgM = 2.0, was then corrected in 10 steps. A mammary gland phantom (Fluke NA18-220) was examined with CR (26, 28, and 30 kV) using the corrected automatic exposure mode, and the exposure values (mAs) were recorded. The CR images were evaluated in a double-blind manner by 4 radiologists according to the American College of Radiology (ACR) standard. Results: The calibration of mammography automatic exposure was accomplished, with the CR automatic exposure standard dose higher than the traditional film-screen exposure. The test results of the calibrated mode were better than the ACR scoring criteria. Conclusions: The comparative study showed improvement in acquiring high-quality images together with a reduction of radiation dose. The corrected mammography automatic exposure mode might be a better method for clinical use. (authors)

  9. Predictive Modeling of Terrestrial Radiation Exposure from Geologic Materials

    Energy Technology Data Exchange (ETDEWEB)

    Malchow, Russell L. [National Security Technologies, LLC]; Haber, Daniel [University of Nevada, Las Vegas]; Burnley, Pamela [University of Nevada, Las Vegas]; Marsac, Kara [University of Nevada, Las Vegas]; Hausrath, Elisabeth [University of Nevada, Las Vegas]; Adcock, Christopher [University of Nevada, Las Vegas]

    2015-01-01

    Aerial gamma ray surveys are important for those working in nuclear security and industry for determining locations of both anthropogenic radiological sources and natural occurrences of radionuclides. During an aerial gamma ray survey, a low flying aircraft, such as a helicopter, flies in a linear pattern across the survey area while measuring the gamma emissions with a sodium iodide (NaI) detector. Currently, if a gamma ray survey is being flown in an area, the only way to correct for geologic sources of gamma rays is to have flown the area previously. This is prohibitively expensive and would require complete national coverage. This project’s goal is to model the geologic contribution to radiological backgrounds using published geochemical data, GIS software, remote sensing, calculations, and modeling software. K, U and Th are the three major gamma emitters in geologic material. U and Th are assumed to be in secular equilibrium with their daughter isotopes. If K, U, and Th abundance values are known for a given geologic unit the expected gamma ray exposure rate can be calculated using the Grasty equation or by modeling software. Monte Carlo N-Particle Transport software (MCNP), developed by Los Alamos National Laboratory, is modeling software designed to simulate particles and their interactions with matter. Using this software, models have been created that represent various lithologies. These simulations randomly generate gamma ray photons at energy levels expected from natural radiologic sources. The photons take a random path through the simulated geologic media and deposit their energy at the end of their track. A series of nested spheres have been created and filled with simulated atmosphere to record energy deposition. Energies deposited are binned in the same manner as the NaI detectors used during an aerial survey. These models are used in place of the simplistic Grasty equation as they take into account absorption properties of the lithology which the
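
    The record's linear (Grasty-type) relation between K, U and Th abundances and gamma exposure rate can be sketched as a short calculation. The coefficients and the example abundances below are illustrative assumptions of the right order of magnitude, not the published Grasty values; they would need to be replaced by the coefficients from the cited literature.

```python
# Illustrative sketch: converting K, U, Th abundances of a geologic unit into an
# approximate terrestrial gamma exposure rate with a linear (Grasty-type) relation.
# The coefficients are placeholders, NOT the published Grasty values.

def exposure_rate_uR_per_h(k_percent: float, u_ppm: float, th_ppm: float,
                           c_k: float = 1.5, c_u: float = 0.6, c_th: float = 0.3) -> float:
    """Linear combination of K (%), eU (ppm) and eTh (ppm) abundances.

    Assumes the U and Th decay chains are in secular equilibrium, as stated in the record.
    """
    return c_k * k_percent + c_u * u_ppm + c_th * th_ppm

if __name__ == "__main__":
    # Assumed example: a granite-like unit with 3.5 % K, 3 ppm eU and 12 ppm eTh.
    print(f"{exposure_rate_uR_per_h(3.5, 3.0, 12.0):.1f} uR/h (illustrative)")
```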

  10. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and on the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science and Internet of Things (IoT) devices. The proposed model of the human brain assumes a basic similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the game are all very similar to the activities of the human brain. The main objectives of a living being and of the chess player are the same: securing a position, surviving and eliminating the adversaries. The brain pursues these goals and, moreover, the being's movement, actions and speech are sustained by the five vital senses and the sense of equilibrium. Chess strategy helps us understand the human brain better and replicate it more easily in the proposed 'Software and Hardware' (SAH) Model.

  11. Computational Modeling of Supercritical and Transcritical Flows

    Science.gov (United States)

    2017-01-09

    Acentric factor. I. Introduction: Liquid rocket and gas turbine engines operate at high pressures. For gas turbines, the combustor pressure can be 60-100... equation of state for several reduced pressures. The model captures the high density at very low temperatures and the supercritical behavior at high reduced... physical meaning. The temperature range over which the three roots are present is bounded by TL on the low side and TH on the high side. (Figure 2: Roots)
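
    The excerpt refers to a cubic equation of state whose three volume roots exist only over a bounded temperature range. The record does not identify which equation of state is used; as a hedged illustration of the "three roots" behavior, the sketch below solves the Peng-Robinson cubic for the compressibility factor, with a nitrogen-like substance and the operating pressure taken as assumptions for the example.

```python
# Hedged sketch: real roots of the Peng-Robinson cubic equation of state.
# Peng-Robinson is chosen only to illustrate the multi-root behavior mentioned in the
# text; the report itself may use a different equation of state.
import numpy as np

R = 8.314462618  # J/(mol K)

def pr_z_roots(T, P, Tc, Pc, omega):
    """Return the real compressibility-factor roots Z of the Peng-Robinson EOS."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T)**2
    B = b * P / (R * T)
    # Cubic in Z: Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B, -(A * B - B**2 - B**3)]
    return sorted(z.real for z in np.roots(coeffs) if abs(z.imag) < 1e-8)

if __name__ == "__main__":
    # Assumed nitrogen-like parameters: Tc = 126.2 K, Pc = 3.396 MPa, omega = 0.037.
    for T in (100.0, 120.0, 140.0):
        print(T, "K ->", pr_z_roots(T, 2.0e6, 126.2, 3.396e6, 0.037))
```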

  12. On the influence of the exposure model on organ doses

    International Nuclear Information System (INIS)

    Drexler, G.; Eckerl, H.

    1988-01-01

    Based on the design characteristics of the MIRD-V phantom, two sex-specific adult phantoms, ADAM and EVA, were introduced especially for the calculation of organ doses resulting from external irradiation. Although the body characteristics of these phantoms are in good agreement with those of the reference man and woman, they have some disadvantages related to the location and shape of the organs and the form of the whole body. To overcome these disadvantages and to obtain more realistic phantoms, a technique based on computed tomographic data (voxel phantoms) was developed. This technique allows any physical phantom or real body to be converted into computer files. The improvements are of special importance with regard to the skeleton, because a better modeling of the bone surfaces and a separation of hard bone and bone marrow can be achieved. For photon irradiation, the sensitivity of the organ doses, or of the effective dose equivalent, to the exposure model is important for operational radiation protection.
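
    The voxel-phantom idea of converting CT data into computer files can be sketched in a few lines. The Hounsfield-unit thresholds and material labels below are illustrative assumptions only; real voxel phantoms such as those derived from ADAM/EVA successors rely on detailed organ-by-organ segmentation rather than simple thresholding.

```python
# Minimal sketch of building a voxel phantom from a CT volume by thresholding
# Hounsfield units (HU). Thresholds and material IDs are illustrative only.
import numpy as np

AIR, SOFT_TISSUE, BONE = 0, 1, 2

def voxelize(ct_hu: np.ndarray) -> np.ndarray:
    """Map a CT volume (HU values) onto integer material IDs per voxel."""
    phantom = np.full(ct_hu.shape, SOFT_TISSUE, dtype=np.uint8)
    phantom[ct_hu < -400] = AIR    # assumed air/lung threshold
    phantom[ct_hu > 300] = BONE    # assumed hard-bone threshold
    return phantom

if __name__ == "__main__":
    ct = np.random.normal(0, 300, size=(32, 32, 32))  # stand-in for a real CT volume
    phantom = voxelize(ct)
    print({m: int((phantom == m).sum()) for m in (AIR, SOFT_TISSUE, BONE)})
```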

  13. Computational Modeling of Lipid Metabolism in Yeast

    Directory of Open Access Journals (Sweden)

    Vera Schützhold

    2016-09-01

    Full Text Available Lipid metabolism is essential for all major cell functions and has recently gained increasing attention in research and health studies. However, mathematical modeling by means of classical approaches, such as stoichiometric networks and ordinary differential equation systems, has not yet provided satisfactory insights, due to the complexity of lipid metabolism, which is characterized by many different species with only slight differences and by promiscuous multifunctional enzymes. Here, we present an object-oriented stochastic modeling approach as a way to cope with the complex lipid metabolic network. While all lipid species are treated as objects in the model, they can be modified by the respective converting reactions based on reaction rules, a hybrid method that integrates benefits of agent-based and classical stochastic simulation. This approach allows one to follow the dynamics of all lipid species with different fatty acids, different degrees of saturation and different headgroups over time, and to analyze the effect of parameter changes, of potential mutations in the catalyzing enzymes, or of the provision of different precursors. Applied to yeast metabolism during one cell cycle period, we could analyze the distribution of all lipids to the various membranes in a time-dependent manner. The presented approach allows the complexity of cellular lipid metabolism to be treated efficiently and conclusions to be derived on the time- and location-dependent distributions of lipid species and their properties, such as saturation. It is widely applicable, easily extendable and will provide further insights into healthy and diseased states of cell metabolism.
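
    The hybrid idea of lipid objects modified by stochastic reaction rules can be sketched as follows. The species, rules and rate constants are invented for illustration and do not reproduce the published yeast model; the sketch only shows the rule-based, Gillespie-style mechanics the record describes.

```python
# Hedged sketch of a rule-based stochastic simulation: lipid species are objects and
# converting reactions are rules applied at random, Gillespie-style. All names and
# rate constants are invented for illustration.
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class Lipid:
    headgroup: str        # e.g. "PA", "PC"
    acyl_carbons: int     # total carbons in the fatty-acyl chains
    double_bonds: int     # degree of unsaturation

def elongate(l: Lipid) -> Lipid:      # rule: add a C2 unit to an acyl chain
    return Lipid(l.headgroup, l.acyl_carbons + 2, l.double_bonds)

def desaturate(l: Lipid) -> Lipid:    # rule: introduce one double bond
    return Lipid(l.headgroup, l.acyl_carbons, l.double_bonds + 1)

RULES = [(elongate, 0.4), (desaturate, 0.2)]   # (rule, per-molecule rate), assumed

def simulate(pool, t_end=10.0, seed=1):
    """Gillespie-style simulation over a list of lipid objects."""
    rng, t = random.Random(seed), 0.0
    while t < t_end and pool:
        total = sum(rate for _ in pool for _, rate in RULES)   # total propensity
        t += rng.expovariate(total)                            # time to next event
        i = rng.randrange(len(pool))                           # pick a molecule
        rule = rng.choices([r for r, _ in RULES],
                           weights=[w for _, w in RULES])[0]   # pick a rule by rate
        pool[i] = rule(pool[i])
    return pool

if __name__ == "__main__":
    final = simulate([Lipid("PA", 32, 0) for _ in range(50)])
    print(sum(l.double_bonds for l in final) / len(final), "avg double bonds per lipid")
```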

  14. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu/~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  15. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
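
    The article's own examples use MATLAB and R. As a hedged companion sketch, the following Python code shows the same embarrassingly parallel pattern: independent replications of a simulation model farmed out to worker processes. The loss model, replication count and percentile are stand-ins invented for the illustration.

```python
# Hedged sketch of embarrassingly parallel Monte Carlo with the Python standard
# library. Each replication is independent, so the runs can be mapped to worker
# processes without any communication between them.
import random
from concurrent.futures import ProcessPoolExecutor

def run_one_replication(seed: int) -> float:
    """One independent model run; a stand-in 'loss' model keeps the focus on the pattern."""
    rng = random.Random(seed)
    return sum(rng.lognormvariate(0.0, 1.0) for _ in range(1_000))

def run_study(n_replications: int = 10_000, workers: int = 4) -> float:
    with ProcessPoolExecutor(max_workers=workers) as pool:
        losses = list(pool.map(run_one_replication, range(n_replications), chunksize=250))
    losses.sort()
    return losses[int(0.95 * len(losses))]   # e.g. a 95th-percentile outcome

if __name__ == "__main__":   # guard required for process-based parallelism on some platforms
    print("95th percentile:", run_study())
```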

  16. Airfoil computations using the gamma-Retheta model; Wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, Niels N.

    2009-05-15

    The present work addresses the validation of the implementation of the Menter, Langtry et al. gamma-theta correlation-based transition model [1, 2, 3] in the EllipSys2D code. First, the second-order accuracy of the code is verified using a grid refinement study for laminar, turbulent and transitional computations. Based on this, the error in the computations is estimated to be approximately one percent in the attached-flow region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64-018, NACA64-218, NACA64-418 and NACA64-618, and the results are compared to measurements [4] and to computations using the Xfoil code by Drela et al. [5]. In the linear pre-stall region good agreement is observed for both lift and drag, while differences from both the measurements and the Xfoil computations are observed in stalled conditions. (au)
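
    The kind of grid-refinement verification mentioned in the record can be sketched as an observed-order and Richardson-extrapolation calculation. The sample lift-coefficient values and the refinement ratio below are invented for illustration; they are not taken from the report.

```python
# Hedged sketch: estimating the observed order of accuracy and a Richardson-extrapolated
# value from solutions on three uniformly refined grids. Sample values are invented.
import math

def observed_order(f_coarse: float, f_medium: float, f_fine: float, r: float = 2.0) -> float:
    """Observed order p from three grid levels with constant refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium: float, f_fine: float, p: float, r: float = 2.0) -> float:
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

if __name__ == "__main__":
    cl = {"coarse": 1.0520, "medium": 1.0575, "fine": 1.0589}   # assumed sample values
    p = observed_order(cl["coarse"], cl["medium"], cl["fine"])
    f_conv = richardson_extrapolate(cl["medium"], cl["fine"], p)
    err = abs(cl["fine"] - f_conv) / abs(f_conv)
    print(f"observed order ~ {p:.2f}, extrapolated CL ~ {f_conv:.4f}, fine-grid error ~ {err:.2%}")
```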

  17. Assessment of weld thickness loss in offshore pipelines using computed radiography and computational modeling

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Oliveira, D.F.; Silva, A.X.; Lopes, R.T.; Marinho, C.; Camerini, C.S.

    2009-01-01

    In order to guarantee the structural integrity of oil plants, it is crucial to monitor the amount of weld thickness loss in offshore pipelines. However, in spite of its relevance, this parameter is very difficult to determine, due to both the large diameter of most pipes and the complexity of the multi-variable system involved. In this study, computational modeling based on the Monte Carlo MCNPX code is combined with computed radiography to estimate the weld thickness loss in large-diameter offshore pipelines. Results show that computational modeling is a powerful tool for estimating the intensity variations in radiographic images generated by weld thickness variations, and that it can be combined with computed radiography to assess weld thickness loss in offshore and subsea pipelines.
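
    The principle behind relating intensity variations to thickness variations can be sketched with a simple narrow-beam attenuation model. The study itself uses full MCNPX transport combined with computed radiography, which also accounts for scatter, geometry and detector response; the attenuation coefficient and intensity ratio below are assumptions for the illustration only.

```python
# Hedged sketch of the underlying idea: under a narrow-beam (Beer-Lambert) model,
# I = I0 * exp(-mu * t), so a change in transmitted intensity maps to a change in
# penetrated weld thickness.
import math

def thickness_change_mm(i_region: float, i_reference: float, mu_per_mm: float) -> float:
    """Thickness difference (mm) implied by the intensity ratio of two image regions.

    A positive result means material loss (the region transmits more than the reference).
    mu_per_mm is an effective linear attenuation coefficient and is an assumed input here;
    its value depends on the source spectrum and the weld material.
    """
    return math.log(i_region / i_reference) / mu_per_mm

if __name__ == "__main__":
    # Assumed example: a weld region reads 12% brighter than sound weld, mu = 0.05 / mm.
    print(f"estimated loss: {thickness_change_mm(1.12, 1.00, 0.05):.2f} mm")
```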

  18. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the non-scientists involved to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experience gained in three projects with the use of computer models, from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the project designs and moderation schemes employed, and the learning processes observed in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models best serve the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes.

  19. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders composed of 1D finite elements, using the computer-aided analysis applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations yield results for comparing the effects of moving load according to the recommendations of the two standards, SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) was therefore modelled identically in the Tower environment. An important point for the selection of a computer application is that Bridge Designer 2016 (2nd Edition) cannot treat the moving-load model prescribed by the national standard V600.
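
    The basic moving-load simulation idea can be sketched generically: sweep a point load across a simply supported span and record the bending-moment envelope. The span, load magnitude and step size below are invented; the paper itself analyses a truss bridge under the far more detailed load schemes of the SRPS and AASHTO standards.

```python
# Hedged sketch of a moving-load envelope on a simply supported beam. All numbers are
# illustrative and unrelated to the bridge or load schemes analysed in the paper.

def moment_at(x: float, a: float, span: float, p: float) -> float:
    """Bending moment at section x for a point load p located at position a."""
    r_left = p * (span - a) / span          # left support reaction
    return r_left * x if x <= a else r_left * x - p * (x - a)

def envelope(span: float = 30.0, p: float = 100.0, step: float = 0.5):
    """Maximum moment at each section over all load positions (the influence envelope)."""
    sections = [i * step for i in range(int(span / step) + 1)]
    return {x: max(moment_at(x, a, span, p) for a in sections) for x in sections}

if __name__ == "__main__":
    env = envelope()
    x_max = max(env, key=env.get)
    print(f"peak moment {env[x_max]:.1f} kNm at x = {x_max:.1f} m (midspan expected)")
```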

  20. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and examples relevant to emergency care. Also discussed are an introduction to available software modeling platforms, how to explore their use for research, and a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
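
    One of the four approaches named in the record, discrete-event simulation, can be illustrated with a minimal emergency-department model: Poisson arrivals, a fixed number of treatment bays and exponential service times. All parameters are invented for the illustration; this is not the consensus-conference model, only a sketch of the method.

```python
# Hedged sketch of a discrete-event simulation of an emergency department using an
# event queue (heapq). Arrival rate, service rate and bay count are assumed values.
import heapq
import random

def simulate_ed(n_bays=3, arrival_rate=0.2, service_rate=0.08, horizon=8 * 60, seed=7):
    """Return the mean waiting time (minutes) over one simulated shift."""
    rng = random.Random(seed)
    events = []                                   # (time, kind, patient arrival time)
    t0 = rng.expovariate(arrival_rate)
    heapq.heappush(events, (t0, "arrival", t0))
    free_bays, queue, waits = n_bays, [], []
    while events:
        t, kind, arrived = heapq.heappop(events)
        if kind == "arrival":
            if t < horizon:                       # keep generating arrivals until closing
                nxt = t + rng.expovariate(arrival_rate)
                heapq.heappush(events, (nxt, "arrival", nxt))
            queue.append(arrived)
        else:                                     # "departure": a bay becomes free
            free_bays += 1
        while free_bays and queue:                # start treatment, longest wait first
            free_bays -= 1
            arrival_time = queue.pop(0)
            waits.append(t - arrival_time)
            heapq.heappush(events, (t + rng.expovariate(service_rate), "departure", arrival_time))
    return sum(waits) / len(waits) if waits else 0.0

if __name__ == "__main__":
    print(f"mean wait ~ {simulate_ed():.1f} minutes")
```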