WorldWideScience

Sample records for models imaging simulation

  1. Modeling and simulation of TDI CMOS image sensors

    Science.gov (United States)

    Nie, Kai-ming; Yao, Su-ying; Xu, Jiang-tao; Gao, Jing

    2013-09-01

    In this paper, a mathematical model of TDI CMOS image sensors was established at the behavioral level in MATLAB, based on the principle of a TDI CMOS image sensor using temporal oversampling rolling shutter in the along-track direction. The geometric perspective and light energy transmission relationships between the scene and the image on the sensor are included in the proposed model. A graphical user interface (GUI) for the model was also established. A high-resolution satellite image was used to model the virtual scene being photographed. The effectiveness of the proposed model was verified by computer simulations based on this image. To guide the design of TDI CMOS image sensors, the impacts of several sensor parameters, including pixel pitch, pixel photosensitive size, and integration time, on sensor performance were investigated with the proposed model. These impacts were quantified by the sensor's along-track modulation transfer function (MTF), calculated with the slanted-edge method. The simulation results indicated that a TDI CMOS image sensor achieves better performance with a smaller pixel photosensitive size and a shorter integration time. The proposed model is useful in the process of researching and developing a TDI CMOS image sensor.
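
    A minimal sketch of the slanted-edge MTF calculation used above to quantify along-track performance (hypothetical Python/NumPy code, not the authors' implementation; the edge slope is assumed known): pixels are projected across the slanted edge into an oversampled edge spread function (ESF), differentiated into a line spread function (LSF), and Fourier-transformed.

        import numpy as np

        def slanted_edge_mtf(img, edge_slope, oversample=4):
            # Project every pixel's distance from the slanted edge into one
            # oversampled edge spread function (ESF).
            rows, cols = img.shape
            pos, val = [], []
            for r in range(rows):
                edge_x = cols / 2 + edge_slope * (r - rows / 2)  # assumed edge position
                pos.append(np.arange(cols) - edge_x)
                val.append(img[r])
            pos, val = np.concatenate(pos), np.concatenate(val)
            bins = np.floor(pos * oversample).astype(int)
            bins -= bins.min()
            counts = np.bincount(bins)
            esf = np.bincount(bins, weights=val) / np.maximum(counts, 1)
            lsf = np.gradient(esf) * np.hanning(esf.size)         # windowed LSF
            mtf = np.abs(np.fft.rfft(lsf))
            freq = np.fft.rfftfreq(lsf.size, d=1.0 / oversample)  # cycles/pixel
            return freq, mtf / mtf[0]

        # Synthetic slanted edge as a self-contained demo
        rows, cols, slope = 64, 64, 0.08
        img = np.zeros((rows, cols))
        for r in range(rows):
            img[r] = np.clip(np.arange(cols) - (cols / 2 + slope * (r - rows / 2)), 0, 1)
        freq, mtf = slanted_edge_mtf(img, slope)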

  2. Photometric Modeling of Simulated Surface-Resolved Bennu Images

    Science.gov (United States)

    Golish, D.; DellaGiustina, D. N.; Clark, B.; Li, J. Y.; Zou, X. D.; Bennett, C. A.; Lauretta, D. S.

    2017-12-01

    The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) is a NASA mission to study and return a sample of asteroid (101955) Bennu. Imaging data from the mission will be used to develop empirical surface-resolved photometric models of Bennu at a series of wavelengths. These models will be used to photometrically correct panchromatic and color base maps of Bennu, compensating for variations due to shadows and photometric angle differences, thereby minimizing seams in mosaicked images. Well-corrected mosaics are critical to the generation of a global hazard map and a global 1064-nm reflectance map which predicts LIDAR response. These data products directly feed into the selection of a site from which to safely acquire a sample. We also require photometric correction for the creation of color ratio maps of Bennu. Color ratio maps provide insight into the composition and geological history of the surface and allow for comparison to other Solar System small bodies. In advance of OSIRIS-REx's arrival at Bennu, we use simulated images to judge the efficacy of both the photometric modeling software and the mission observation plan. Our simulation software is based on USGS's Integrated Software for Imagers and Spectrometers (ISIS) and uses a synthetic shape model, a camera model, and an empirical photometric model to generate simulated images. This approach gives us the flexibility to create simulated images of Bennu based on analog surfaces from other small Solar System bodies and to test our modeling software under those conditions. Our photometric modeling software fits image data to several conventional empirical photometric models and produces the best fit model parameters. The process is largely automated, which is crucial to the efficient production of data products during proximity operations. The software also produces several metrics on the quality of the observations themselves, such as surface coverage and the
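
    The parameter fitting the abstract describes can be illustrated with a toy least-squares fit of one conventional empirical model, a Lommel-Seeliger disk function with an exponential phase term (a sketch under an assumed functional form and synthetic data; the mission software fits several such models to real image data):

        import numpy as np
        from scipy.optimize import curve_fit

        def lommel_seeliger(angles, albedo, beta):
            # I/F = albedo * mu0/(mu0 + mu) * exp(-beta * g)
            inc, emi, pha = angles                      # radians
            mu0, mu = np.cos(inc), np.cos(emi)
            return albedo * mu0 / (mu0 + mu) * np.exp(-beta * pha)

        rng = np.random.default_rng(0)
        inc = rng.uniform(0, 1.2, 500)                  # incidence angles
        emi = rng.uniform(0, 1.2, 500)                  # emission angles
        pha = rng.uniform(0, 1.5, 500)                  # phase angles
        iof = lommel_seeliger((inc, emi, pha), 0.04, 0.7)
        iof *= rng.normal(1.0, 0.02, iof.size)          # 2% radiometric noise

        popt, pcov = curve_fit(lommel_seeliger, (inc, emi, pha), iof, p0=(0.02, 0.5))
        # popt -> best-fit (albedo, beta); pcov -> parameter covariance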

  3. Microtomographic imaging in the process of bone modeling and simulation

    Science.gov (United States)

    Mueller, Ralph

    1999-09-01

    Micro-computed tomography (μCT) is an emerging technique to nondestructively image and quantify trabecular bone in three dimensions. Where the early implementations of μCT focused more on technical aspects of the systems and required equipment not normally available to the general public, a more recent development emphasized practical aspects of microtomographic imaging. That system is based on a compact fan-beam type of tomograph, also referred to as desktop μCT. Desktop μCT has been used extensively for the investigation of osteoporosis-related health problems, gaining new insight into the organization of trabecular bone and the influence of osteoporotic bone loss on bone architecture and the competence of bone. Osteoporosis is a condition characterized by excessive bone loss and deterioration in bone architecture. The reduced quality of bone increases the risk of fracture. Current imaging technologies do not allow accurate in vivo measurements of bone structure over several decades or the investigation of the local remodeling stimuli at the tissue level. Therefore, computer simulations and new experimental modeling procedures are necessary for determining the long-term effects of age, menopause, and osteoporosis on bone. Microstructural bone models allow us to study not only the effects of osteoporosis on the skeleton but also to assess and monitor the effectiveness of new treatment regimens. The basis for such approaches is realistic models of bone and a sound understanding of the underlying biological and mechanical processes in bone physiology. In this article, strategies for new approaches to bone modeling and simulation in the study and treatment of osteoporosis and age-related bone loss are presented. The focus is on the bioengineering and imaging aspects of osteoporosis research. With the introduction of desktop μCT, a new generation of imaging instruments has entered the arena allowing easy and relatively inexpensive access to

  4. Imaging infrared: Scene simulation, modeling, and real image tracking; Proceedings of the Meeting, Orlando, FL, Mar. 30, 31, 1989

    Science.gov (United States)

    Triplett, Milton J.; Wolverton, James R.; Hubert, August J.

    1989-09-01

    Various papers on scene simulation, modeling, and real image tracking using IR imaging are presented. Individual topics addressed include: tactical IR scene generator, dynamic FLIR simulation in flight training research, high-speed dynamic scene simulation in UV to IR spectra, development of an IR sensor calibration facility, IR celestial background scene description, transmission measurement of optical components at cryogenic temperatures, diffraction model for a point-source generator, silhouette-based tracking for tactical IR systems, use of knowledge in electrooptical trackers, detection and classification of target formations in IR image sequences, SMPRAD: simplified three-dimensional cloud radiance model, IR target generator, recent advances in testing of thermal imagers, generic IR system models with dynamic image generation, modeling realistic target acquisition using IR sensors in multiple-observer scenarios, and novel concept of scene generation and comprehensive dynamic sensor test.

  5. MR imaging of model drug distribution in simulated vitreous

    Directory of Open Access Journals (Sweden)

    Stein Sandra

    2015-09-01

    The in vitro and in vivo characterization of intravitreal injections plays an important role in developing innovative therapy approaches. Using the established vitreous model (VM) and eye movement system (EyeMoS), the distribution of contrast agents with different molecular weights was studied in vitro. The impact of simulated age-related vitreal liquefaction (VL) on drug distribution in the VM was examined with injection either through the gel phase or through the liquid phase. For comparison, the distribution was studied ex vivo in the porcine vitreous. The studies were performed in a magnetic resonance (MR) scanner. As expected, with increasing molecular weight the diffusion velocity and the visual distribution of the injected substances decreased. Similar drug distribution was observed in the VM and in the porcine eye. VL causes enhanced convective flow and faster distribution in the VM. Confirming the importance of the injection technique as VL progresses, injection through the gelatinous phase caused faster distribution into peripheral regions of the VM than injection through the liquefied phase. The VM and MR scanner in combination present a new approach for the in vitro characterization of drug release and distribution of intravitreal dosage forms.

  6. The use of a virtual printer model for the simulation of imaging systems

    Science.gov (United States)

    Hultgren, Bror

    2006-01-01

    In a companion paper we discuss the impact of statistical variability on perceived image quality. Early in a development program, systems may not be capable of rendering images suitable for quality testing. This does not diminish the program need to estimate the perceived quality of the imaging system. During the development of imaging systems, simulations are extremely effective for demonstrating the visual impact of design choices, allowing both the development process to prioritize these choices and management to understand the risks and benefits of such choices. Where the simulation mirrors the mechanisms of image formation, it not only improves the simulation but also informs the understanding of the image formation process. Clearly the simulation process requires display or printing devices whose quality does not limit the simulation. We will present a generalized methodology. When used with common profile making and color management tools, it will provide simulations of both source and destination devices. The device to be simulated is modeled by its response to a fixed set of input stimuli. In the case of a digital still camera (DSC), these are the reflection spectra of a fixed set of color patches (e.g., the MacBeth DCC), and in the case of a printer, the set of image RGBs. We will demonstrate this methodology with examples of various print media systems.

  7. Multiple-point statistical simulation for hydrogeological models: 3-D training image development and conditioning strategies

    Science.gov (United States)

    Høyer, Anne-Sophie; Vignoli, Giulio; Mejer Hansen, Thomas; Thanh Vu, Le; Keefer, Donald A.; Jørgensen, Flemming

    2017-12-01

    Most studies on the application of geostatistical simulations based on multiple-point statistics (MPS) to hydrogeological modelling focus on relatively fine-scale models and concentrate on the estimation of facies-level structural uncertainty. Much less attention is paid to the use of input data and optimal construction of training images. For instance, even though the training image should capture a set of spatial geological characteristics to guide the simulations, the majority of the research still relies on 2-D or quasi-3-D training images. In the present study, we demonstrate a novel strategy for 3-D MPS modelling characterized by (i) realistic 3-D training images and (ii) an effective workflow for incorporating a diverse group of geological and geophysical data sets. The study covers an area of 2810 km2 in the southern part of Denmark. MPS simulations are performed on a subset of the geological succession (the lower to middle Miocene sediments) which is characterized by relatively uniform structures and dominated by sand and clay. The simulated domain is large and each of the geostatistical realizations contains approximately 45 million voxels with size 100 m × 100 m × 5 m. Data used for the modelling include water well logs, high-resolution seismic data, and a previously published 3-D geological model. We apply a series of different strategies for the simulations based on data quality, and develop a novel method to effectively create observed spatial trends. The training image is constructed as a relatively small 3-D voxel model covering an area of 90 km2. We use an iterative training image development strategy and find that even slight modifications in the training image create significant changes in simulations. Thus, this study shows how to include both the geological environment and the type and quality of input information in order to achieve optimal results from MPS modelling. We present a practical workflow to build the training image and
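
    As a loose illustration of how an MPS simulation consults a 3-D training image, the toy function below scans random training-image locations for the neighbourhood most similar to the conditioning data and returns the co-located facies (a Direct Sampling-style sketch with assumed arrays; the study's actual workflow, training image, and software are far more elaborate):

        import numpy as np

        def direct_sample(ti, offsets, cond_values, rng, max_scan=2000):
            # ti: 3-D integer array of facies codes (the training image).
            # offsets: neighbour offsets (dz, dy, dx); cond_values: known facies there.
            nz, ny, nx = ti.shape
            best_val, best_dist = None, np.inf
            for _ in range(max_scan):
                z = rng.integers(1, nz - 1)
                y = rng.integers(1, ny - 1)
                x = rng.integers(1, nx - 1)
                vals = np.array([ti[z + dz, y + dy, x + dx] for dz, dy, dx in offsets])
                dist = np.mean(vals != cond_values)     # fraction of mismatches
                if dist < best_dist:
                    best_val, best_dist = ti[z, y, x], dist
                    if dist == 0:
                        break                           # perfect pattern match
            return best_val

        rng = np.random.default_rng(1)
        ti = (rng.random((30, 40, 40)) < 0.3).astype(int)   # stand-in sand/clay TI
        offsets = [(0, 0, -1), (0, -1, 0), (-1, 0, 0)]
        print(direct_sample(ti, offsets, np.array([1, 0, 1]), rng))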

  8. Comprehensive Modeling and Visualization of Cardiac Anatomy and Physiology from CT Imaging and Computer Simulations.

    Science.gov (United States)

    Xiong, Guanglei; Sun, Peng; Zhou, Haoyin; Ha, Seongmin; Hartaigh, Briain O; Truong, Quynh A; Min, James K

    2017-02-01

    In clinical cardiology, both anatomy and physiology are needed to diagnose cardiac pathologies. CT imaging and computer simulations provide valuable and complementary data for this purpose. However, it remains challenging to gain useful information from the large amount of high-dimensional diverse data. The current tools are not adequately integrated to visualize anatomic and physiologic data from a complete yet focused perspective. We introduce a new computer-aided diagnosis framework, which allows for comprehensive modeling and visualization of cardiac anatomy and physiology from CT imaging data and computer simulations, with a primary focus on ischemic heart disease. The following visual information is presented: (1) Anatomy from CT imaging: geometric modeling and visualization of cardiac anatomy, including four heart chambers, left and right ventricular outflow tracts, and coronary arteries; (2) Function from CT imaging: motion modeling, strain calculation, and visualization of four heart chambers; (3) Physiology from CT imaging: quantification and visualization of myocardial perfusion and contextual integration with coronary artery anatomy; (4) Physiology from computer simulation: computation and visualization of hemodynamics (e.g., coronary blood velocity, pressure, shear stress, and fluid forces on the vessel wall). Substantial feedback from cardiologists has confirmed the practical utility of integrating these features for the purpose of computer-aided diagnosis of ischemic heart disease.

  9. Fast simulation of ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Nikolov, Svetoslav

    2000-01-01

    Realistic B-mode and flow images can be simulated with scattering maps based on optical, CT, or MR images or parametric flow models. The image simulation often involves 200,000 to 1 million point scatterers. One image line typically takes 1800 seconds to compute on a state-of-the-art PC. ... The computation can be split among a number of PCs for speeding up the simulation. A full 3D one-second volume simulation then takes 7,500 seconds on a 32-CPU 600 MHz Pentium III PC cluster.
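
    Because the scattered field is linear in the scatterers, the split-across-PCs strategy amounts to simulating disjoint scatterer subsets and summing the resulting RF lines. A sketch with Python multiprocessing and a stand-in echo model (Field II itself is not called here; all signal parameters are illustrative):

        import numpy as np
        from multiprocessing import Pool

        def simulate_rf(scatterers):
            # Stand-in for one simulator call: delayed, weighted echoes of a subset.
            fs, f0, c = 100e6, 5e6, 1540.0
            t = np.arange(0, 80e-6, 1 / fs)
            rf = np.zeros_like(t)
            for depth, amp in scatterers:
                tau = 2 * depth / c                     # two-way travel time
                pulse = np.sin(2 * np.pi * f0 * (t - tau)) * np.exp(-((t - tau) * f0) ** 2)
                rf += amp * pulse * (t >= tau)
            return rf

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            scat = np.column_stack([rng.uniform(0.01, 0.06, 20000),   # depths (m)
                                    rng.normal(0.0, 1.0, 20000)])     # amplitudes
            chunks = np.array_split(scat, 8)            # one chunk per CPU (or PC)
            with Pool(8) as pool:
                rf_line = np.sum(pool.map(simulate_rf, chunks), axis=0)  # echoes add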

  10. Three-dimensional fused deposition modeling of tissue-simulating phantom for biomedical optical imaging

    Science.gov (United States)

    Dong, Erbao; Zhao, Zuhua; Wang, Minjie; Xie, Yanjun; Li, Shidi; Shao, Pengfei; Cheng, Liuquan; Xu, Ronald X.

    2015-12-01

    Biomedical optical devices are widely used for clinical detection of various tissue anomalies. However, optical measurements have limited accuracy and traceability, partially owing to the lack of effective calibration methods that simulate the actual tissue conditions. To facilitate standardized calibration and performance evaluation of medical optical devices, we develop a three-dimensional fused deposition modeling (FDM) technique for freeform fabrication of tissue-simulating phantoms. The FDM system uses transparent gel wax as the base material, titanium dioxide (TiO2) powder as the scattering ingredient, and graphite powder as the absorption ingredient. The ingredients are preheated, mixed, and deposited at the designated ratios layer-by-layer to simulate tissue structural and optical heterogeneities. By printing the sections of a human brain model based on magnetic resonance images, we demonstrate the capability for simulating tissue structural heterogeneities. By measuring optical properties of multilayered phantoms and comparing with numerical simulation, we demonstrate the feasibility for simulating tissue optical properties. By creating a rat head phantom with embedded vasculature, we demonstrate the potential for mimicking physiologic processes of a living system.

  11. Image-based model of the spectrin cytoskeleton for red blood cell simulation

    Science.gov (United States)

    Stokes, David L.; Peskin, Charles S.

    2017-01-01

    We simulate deformable red blood cells in the microcirculation using the immersed boundary method with a cytoskeletal model that incorporates structural details revealed by tomographic images. The elasticity of red blood cells is known to be supplied by both their lipid bilayer membranes, which resist bending and local changes in area, and their cytoskeletons, which resist in-plane shear. The cytoskeleton consists of spectrin tetramers that are tethered to the lipid bilayer by ankyrin and by actin-based junctional complexes. We model the cytoskeleton as a random geometric graph, with nodes corresponding to junctional complexes and with edges corresponding to spectrin tetramers such that the edge lengths are given by the end-to-end distances between nodes. The statistical properties of this graph are based on distributions gathered from three-dimensional tomographic images of the cytoskeleton by a segmentation algorithm. We show that the elastic response of our model cytoskeleton, in which the spectrin polymers are treated as entropic springs, is in good agreement with the experimentally measured shear modulus. By simulating red blood cells in flow with the immersed boundary method, we compare this discrete cytoskeletal model to an existing continuum model and predict the extent to which dynamic spectrin network connectivity can protect against failure in the case of a red cell subjected to an applied strain. The methods presented here could form the basis of disease- and patient-specific computational studies of hereditary diseases affecting the red cell cytoskeleton. PMID:28991926

  12. Image-based model of the spectrin cytoskeleton for red blood cell simulation.

    Science.gov (United States)

    Fai, Thomas G; Leo-Macias, Alejandra; Stokes, David L; Peskin, Charles S

    2017-10-01

    We simulate deformable red blood cells in the microcirculation using the immersed boundary method with a cytoskeletal model that incorporates structural details revealed by tomographic images. The elasticity of red blood cells is known to be supplied by both their lipid bilayer membranes, which resist bending and local changes in area, and their cytoskeletons, which resist in-plane shear. The cytoskeleton consists of spectrin tetramers that are tethered to the lipid bilayer by ankyrin and by actin-based junctional complexes. We model the cytoskeleton as a random geometric graph, with nodes corresponding to junctional complexes and with edges corresponding to spectrin tetramers such that the edge lengths are given by the end-to-end distances between nodes. The statistical properties of this graph are based on distributions gathered from three-dimensional tomographic images of the cytoskeleton by a segmentation algorithm. We show that the elastic response of our model cytoskeleton, in which the spectrin polymers are treated as entropic springs, is in good agreement with the experimentally measured shear modulus. By simulating red blood cells in flow with the immersed boundary method, we compare this discrete cytoskeletal model to an existing continuum model and predict the extent to which dynamic spectrin network connectivity can protect against failure in the case of a red cell subjected to an applied strain. The methods presented here could form the basis of disease- and patient-specific computational studies of hereditary diseases affecting the red cell cytoskeleton.
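
    The random geometric graph and entropic-spring treatment can be sketched in a few lines: junctional-complex nodes are scattered on a membrane patch, spectrin edges connect nearby nodes, and each edge carries a worm-like-chain tension (illustrative parameter values; the paper derives its node and edge-length distributions from tomographic images):

        import numpy as np

        rng = np.random.default_rng(3)
        n_nodes = 200
        nodes = rng.uniform(0.0, 1.0, (n_nodes, 2))     # junctional complexes (μm)
        cutoff = 0.12                                   # assumed tether range (μm)
        edges = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)
                 if np.linalg.norm(nodes[i] - nodes[j]) < cutoff]

        def wlc_tension(r, lp=0.02, lc=0.2, kbt=4.1e-3):
            # Marko-Siggia worm-like chain: entropic tension at end-to-end distance r.
            # lp: persistence length (μm), lc: contour length (μm), kbt in pN·μm.
            x = np.clip(r / lc, 0.0, 0.99)
            return (kbt / lp) * (0.25 / (1.0 - x) ** 2 - 0.25 + x)

        tensions = np.array([wlc_tension(np.linalg.norm(nodes[i] - nodes[j]))
                             for i, j in edges])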

  13. Image Processing Strategies Based on a Visual Saliency Model for Object Recognition Under Simulated Prosthetic Vision.

    Science.gov (United States)

    Wang, Jing; Li, Heng; Fu, Weizhen; Chen, Yao; Li, Liming; Lyu, Qing; Han, Tingting; Chai, Xinyu

    2016-01-01

    Retinal prostheses have the potential to restore partial vision. Object recognition in scenes of daily life is one of the essential tasks for implant wearers. Still limited by the low-resolution visual percepts provided by retinal prostheses, it is important to investigate and apply image processing methods to convey more useful visual information to the wearers. We proposed two image processing strategies based on Itti's visual saliency map, region of interest (ROI) extraction, and image segmentation. Itti's saliency model generated a saliency map from the original image, in which salient regions were grouped into an ROI by fuzzy c-means clustering. Then GrabCut generated a proto-object from the ROI-labeled image, which was recombined with the background and enhanced in two ways: 8-4 separated pixelization (8-4 SP) and background edge extraction (BEE). Results showed that both 8-4 SP and BEE had significantly higher recognition accuracy in comparison with direct pixelization (DP). Each saliency-based image processing strategy was subject to the performance of image segmentation. Under good and perfect segmentation conditions, BEE and 8-4 SP obtained noticeably higher recognition accuracy than DP, and under bad segmentation conditions, only BEE boosted the performance. The application of saliency-based image processing strategies was verified to be beneficial to object recognition in daily scenes under simulated prosthetic vision. They may help the development of the image processing module for future retinal prostheses, and thus provide more benefit for the patients. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
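
    The processing chain (saliency map -> ROI -> segmentation -> pixelization) can be approximated with stock OpenCV pieces. This sketch substitutes spectral-residual saliency for Itti's model and Otsu thresholding for fuzzy c-means, and assumes a placeholder image path plus the opencv-contrib saliency module:

        import cv2
        import numpy as np

        img = cv2.imread("scene.jpg")                   # placeholder daily-life scene
        sal = cv2.saliency.StaticSaliencySpectralResidual_create()
        ok, sal_map = sal.computeSaliency(img)
        sal_map = (sal_map * 255).astype(np.uint8)

        # Otsu threshold as a stand-in for fuzzy c-means grouping of salient regions
        _, roi = cv2.threshold(sal_map, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        ys, xs = np.nonzero(roi)
        rect = (xs.min(), ys.min(), xs.max() - xs.min() + 1, ys.max() - ys.min() + 1)

        # GrabCut refines the ROI into a proto-object
        mask = np.zeros(img.shape[:2], np.uint8)
        bgd = np.zeros((1, 65), np.float64)
        fgd = np.zeros((1, 65), np.float64)
        cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
        proto = np.where((mask == 1) | (mask == 3), 255, 0).astype(np.uint8)

        # Coarse pixelization approximating the low-resolution percept
        low = cv2.resize(proto, (32, 32), interpolation=cv2.INTER_AREA)
        percept = cv2.resize(low, img.shape[1::-1], interpolation=cv2.INTER_NEAREST)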

  14. An imaging-based computational model for simulating angiogenesis and tumour oxygenation dynamics

    Science.gov (United States)

    Adhikarla, Vikram; Jeraj, Robert

    2016-05-01

    Tumour growth, angiogenesis and oxygenation vary substantially among tumours and significantly impact their treatment outcome. Imaging provides a unique means of investigating these tumour-specific characteristics. Here we propose a computational model to simulate tumour-specific oxygenation changes based on the molecular imaging data. Tumour oxygenation in the model is reflected by the perfused vessel density. Tumour growth depends on its doubling time (Td) and the imaged proliferation. Perfused vessel density recruitment rate depends on the perfused vessel density around the tumour (sMVDtissue) and the maximum VEGF concentration for complete vessel dysfunctionality (VEGFmax). The model parameters were benchmarked to reproduce the dynamics of tumour oxygenation over its entire lifecycle, which is the most challenging test. Tumour oxygenation dynamics were quantified using the peak pO2 (pO2peak) and the time to peak pO2 (tpeak). Sensitivity of tumour oxygenation to model parameters was assessed by changing each parameter by 20%. tpeak was found to be more sensitive to the tumour cell line related doubling time (~30%) as compared to tissue vasculature density (~10%). On the other hand, pO2peak was found to be similarly influenced by the above tumour- and vasculature-associated parameters (~30-40%). Interestingly, both pO2peak and tpeak were only marginally affected by VEGFmax (~5%). The development of a poorly oxygenated (hypoxic) core with tumour growth increased VEGF accumulation, thus disrupting the vessel perfusion as well as further increasing hypoxia with time. The model, with its benchmarked parameters, is applied to hypoxia imaging data obtained using a [64Cu]Cu-ATSM PET scan of a mouse tumour and the temporal development of the vasculature and hypoxia maps are shown. The work underscores the importance of using tumour-specific input for analysing tumour evolution. An extended model incorporating therapeutic effects can serve as a powerful tool for analysing

  15. Cortical imaging on a head template: a simulation study using a resistor mesh model (RMM).

    Science.gov (United States)

    Chauveau, Nicolas; Franceries, Xavier; Aubry, Florent; Celsis, Pierre; Rigaud, Bernard

    2008-09-01

    The T1 head template model used in Statistical Parametric Mapping Version 2000 (SPM2), was segmented into five layers (scalp, skull, CSF, grey and white matter) and implemented in 2 mm voxels. We designed a resistor mesh model (RMM), based on the finite volume method (FVM) to simulate the electrical properties of this head model along the three axes for each voxel. Then, we introduced four dipoles of high eccentricity (about 0.8) in this RMM, separately and simultaneously, to compute the potentials for two sets of conductivities. We used the direct cortical imaging technique (CIT) to recover the simulated dipoles, using 60 or 107 electrodes and with or without addition of Gaussian white noise (GWN). The use of realistic conductivities gave better CIT results than standard conductivities, lowering the blurring effect on scalp potentials and displaying more accurate position areas when CIT was applied to single dipoles. Simultaneous dipoles were less accurately localized, but good qualitative and stable quantitative results were obtained up to 5% noise level for 107 electrodes and up to 10% noise level for 60 electrodes, showing that a compromise must be found to optimize both the number of electrodes and the noise level. With the RMM defined in 2 mm voxels, the standard 128-electrode cap and 5% noise appears to be the upper limit providing reliable source positions when direct CIT is used. The admittance matrix defining the RMM is easy to modify so as to adapt to different conductivities. The next step will be the adaptation of individual real head T2 images to the RMM template and the introduction of anisotropy using diffusion imaging (DI).
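
    A resistor mesh forward solve reduces to a sparse graph-Laplacian system: neighbouring voxels are linked by a conductance, a current dipole is injected, and node potentials are solved. A toy version on a uniform grid with unit spacing (illustrative values, far smaller than the paper's five-layer 2 mm head model):

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 12                                          # nodes per axis (toy grid)
        sigma = np.ones((n, n, n))                      # conductivity map (S/m)
        idx = lambda i, j, k: (i * n + j) * n + k
        rows, cols, vals = [], [], []
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    for di, dj, dk in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
                        ii, jj, kk = i + di, j + dj, k + dk
                        if ii < n and jj < n and kk < n:
                            # Series (harmonic-mean) conductance between voxels
                            g = 2.0 / (1.0 / sigma[i, j, k] + 1.0 / sigma[ii, jj, kk])
                            a, b = idx(i, j, k), idx(ii, jj, kk)
                            rows += [a, b, a, b]
                            cols += [a, b, b, a]
                            vals += [g, g, -g, -g]
        lap = sp.csr_matrix((vals, (rows, cols)), shape=(n ** 3, n ** 3))

        # Current dipole (+/- 1 uA at neighbouring nodes), one grounded reference node
        inj = np.zeros(n ** 3)
        inj[idx(6, 6, 6)], inj[idx(6, 6, 7)] = 1e-6, -1e-6
        lap = lap.tolil()
        lap[0, :] = 0.0
        lap[0, 0] = 1.0
        inj[0] = 0.0
        v = spla.spsolve(lap.tocsc(), inj)              # node potentials (V)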

  16. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Dolly, S; Mutic, S; Anastasio, M; Li, H [Washington University School of Medicine, Saint Louis, MO (United States); Yu, L [Mayo Clinic, Rochester, MN (United States)

    2016-06-15

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation
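
    The AUTOC figure of merit reduces to integrating tumour control probability against normal-tissue complication probability as the prescription is swept. A sketch with illustrative logistic dose-response curves (not the study's actual outcome models):

        import numpy as np

        def tcp(dose, d50=60.0, gamma=2.0):
            # Illustrative logistic tumour control probability
            return 1.0 / (1.0 + (d50 / dose) ** (4.0 * gamma))

        def ntcp(dose, d50=75.0, gamma=2.5):
            # Illustrative logistic normal tissue complication probability
            return 1.0 / (1.0 + (d50 / dose) ** (4.0 * gamma))

        doses = np.linspace(10.0, 150.0, 300)           # swept prescription dose (Gy)
        x, y = ntcp(doses), tcp(doses)                  # TOC curve: TCP vs NTCP
        autoc = np.trapz(y, x)                          # area under the TOC curve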

  17. SOFIA tracking image simulation

    Science.gov (United States)

    Taylor, Charles R.; Gross, Michael A. K.

    2016-09-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) tracking camera simulator is a component of the Telescope Assembly Simulator (TASim). TASim is a software simulation of the telescope optics, mounting, and control software. Currently in its fifth major version, TASim is relied upon for telescope operator training, mission planning and rehearsal, and mission control and science instrument software development and testing. TASim has recently been extended for hardware-in-the-loop operation in support of telescope and camera hardware development and control and tracking software improvements. All three SOFIA optical tracking cameras are simulated, including the Focal Plane Imager (FPI), which has recently been upgraded to the status of a science instrument that can be used on its own or in parallel with one of the seven infrared science instruments. The simulation includes tracking camera image simulation of starfields based on the UCAC4 catalog at real-time rates of 4-20 frames per second. For its role in training and planning, it is important for the tracker image simulation to provide images with a realistic appearance and response to changes in operating parameters. For its role in tracker software improvements, it is vital to have realistic signal and noise levels and precise star positions. The design of the software simulation for precise subpixel starfield rendering (including radial distortion), realistic point-spread function as a function of focus, tilt, and collimation, and streaking due to telescope motion will be described. The calibration of the simulation for light sensitivity, dark and bias signal, and noise will also be presented.
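
    Sub-pixel star rendering with noise calibration, the core of a tracker image simulation, can be sketched as Gaussian PSF splatting followed by shot and read noise (illustrative PSF and noise levels only; the simulator's actual PSF includes focus, tilt, collimation, and motion streaking):

        import numpy as np

        def render_star(frame, x, y, flux, sigma=1.2):
            # Splat one star at sub-pixel position (x, y) with a Gaussian PSF.
            h, w = frame.shape
            yy, xx = np.mgrid[0:h, 0:w]
            psf = np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma ** 2))
            frame += flux * psf / psf.sum()

        rng = np.random.default_rng(4)
        frame = np.full((128, 128), 20.0)               # bias + dark level (counts)
        for _ in range(25):
            render_star(frame, rng.uniform(0, 128), rng.uniform(0, 128),
                        rng.uniform(200, 5000))
        image = rng.poisson(frame) + rng.normal(0.0, 3.0, frame.shape)  # shot + read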

  18. Shadow effects in simulated ultrasound images derived from computed tomography images using a focused beam tracing model

    DEFF Research Database (Denmark)

    Pham, An Hoai; Lundgren, Bo; Stage, Bjarne

    2012-01-01

    Simulation of ultrasound images based on computed tomography (CT) data has previously been performed with different approaches. Shadow effects are normally pronounced in ultrasound images, so they should be included in the simulation. In this study, a method to capture the shadow effects has been developed. ... Ultrasound data were acquired with a ProFocus ultrasound scanner (BK Medical, Herlev, Denmark) equipped with a dedicated research interface giving access to beamformed radio frequency data. CT images were obtained with an Aquilion ONE Toshiba CT scanner (Toshiba Medical Systems Corp., Tochigi, Japan). CT data were mapped from Hounsfield units...

  19. Signal and image processing systems performance evaluation, simulation, and modeling; Proceedings of the Meeting, Orlando, FL, Apr. 4, 5, 1991

    Science.gov (United States)

    Nasr, Hatem N.; Bazakos, Michael E.

    The various aspects of the evaluation and modeling problems in algorithms, sensors, and systems are addressed. Consideration is given to a generic modular imaging IR signal processor, real-time architecture based on the image-processing module family, application of the ProtoWare simulation testbed to the design and evaluation of advanced avionics, development of a fire-and-forget imaging infrared seeker missile simulation, an adaptive morphological filter for image processing, laboratory development of a nonlinear optical tracking filter, a dynamic end-to-end model testbed for IR detection algorithms, wind tunnel model aircraft attitude and motion analysis, an information-theoretic approach to optimal quantization, parametric analysis of target/decoy performance, neural networks for automated target recognition parameters adaptation, performance evaluation of a texture-based segmentation algorithm, evaluation of image tracker algorithms, and multisensor fusion methodologies. (No individual items are abstracted in this volume)

  20. A volumetric CMUT-based ultrasound imaging system simulator with integrated reception and μ-beamforming electronics models.

    Science.gov (United States)

    Matrone, Giulia; Savoia, Alessandro S; Terenzi, Marco; Caliano, Giosuè; Quaglia, Fabio; Magenes, Giovanni

    2014-05-01

    In modern ultrasound imaging devices, two-dimensional probes and electronic scanning allow volumetric imaging of anatomical structures. When dealing with the design of such complex 3-D ultrasound (US) systems, as the number of transducers and channels dramatically increases, new challenges concerning the integration of electronics and the implementation of smart micro-beamforming strategies arise. Hence, the possibility to predict the behavior of the whole system is mandatory. In this paper, we propose and describe an advanced simulation tool for ultrasound system modeling and simulation, which combines US propagation and scattering, signal transduction, electronic signal conditioning, and beamforming in a single environment. In particular, we present the architecture and model of an existing 16-channel integrated receiver, which includes an amplification and micro-beamforming stage, and validate it by comparison with circuit simulations. The developed model is then used in conjunction with the transducer and US field models to perform a system simulation, aimed at estimating the performance of an example 3-D US imaging system that uses a capacitive micromachined ultrasonic transducer (CMUT) 2-D phased-array coupled to the modeled reception front-end. Results of point spread function (PSF) calculations, as well as synthetic imaging of a virtual phantom, show that this tool is actually able to model the complete US image reconstruction process, and that it could be used to quickly provide valuable system-level feedback for an optimized tuning of electronic design parameters.

  1. Simulation model of mammographic calcifications based on the American College of Radiology Breast Imaging Reporting and Data System, or BIRADS.

    Science.gov (United States)

    Kallergi, M; Gavrielides, M A; He, L; Berman, C G; Kim, J J; Clark, R A

    1998-10-01

    The authors developed and evaluated a method for the simulation of calcification clusters based on the guidelines of the Breast Imaging Reporting and Data System of the American College of Radiology. They aimed to reproduce accurately the relative and absolute size, shape, location, number, and intensity of real calcifications associated with both benign and malignant disease. Thirty calcification clusters were simulated by using the proposed model and were superimposed on real, negative mammograms digitized at 30 microns and 16 bits per pixel. The accuracy of the simulation was evaluated by three radiologists in a blinded study. No statistically significant difference was observed in the observers' evaluation of the simulated clusters and the real clusters. The observers' classification of the cluster types seemed to be a good approximation of the intended types from the simulation design. This model can provide simulated calcification clusters with well-defined morphologic, distributional, and contrast characteristics for a variety of applications in digital mammography.
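
    In outline, cluster simulation of this kind samples the number, positions, sizes, and contrasts of calcifications from chosen distributions and superimposes them on a digitized mammogram. A toy sketch with Gaussian blobs on a stand-in background (the authors' model follows the BIRADS descriptors far more closely):

        import numpy as np

        rng = np.random.default_rng(5)

        def add_cluster(mammo, cy, cx, n_calcs=8, spread=20.0):
            # Superimpose a synthetic calcification cluster on a normalized image.
            h, w = mammo.shape
            yy, xx = np.mgrid[0:h, 0:w]
            for _ in range(n_calcs):
                y, x = cy + rng.normal(0, spread), cx + rng.normal(0, spread)
                radius = rng.uniform(2.0, 5.0)          # calcification size (pixels)
                contrast = rng.uniform(0.05, 0.15)      # additive contrast
                mammo += contrast * np.exp(-((xx - x) ** 2 + (yy - y) ** 2)
                                           / (2.0 * (radius / 2.0) ** 2))
            return np.clip(mammo, 0.0, 1.0)

        background = rng.normal(0.5, 0.02, (256, 256))  # stand-in for a real mammogram
        simulated = add_cluster(background, 128, 128)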

  2. Simulation of Sentinel-3 images by four stream surface atmosphere radiative transfer modeling in the optical and thermal domains

    NARCIS (Netherlands)

    Verhoef, W.; Bach, H.

    2012-01-01

    Simulation of future satellite images can be applied in order to validate the general mission concept and to test the performance of advanced multi-sensor algorithms for the retrieval of surface parameters. This paper describes the radiative transfer modeling part of a so-called Land Scene Generator

  3. Tentative study of flow patterns in the North Aegean Sea using NOAA-AVHRR images and 2D model simulation

    Directory of Open Access Journals (Sweden)

    G. Zodiatis

    1996-11-01

    A statistical technique for image processing, the maximum cross correlation (MCC) method, was utilized on sequences of NOAA-AVHRR thermal data in order to explore the surface advective current dynamics at the discharge region of the Hellespont in the North Aegean Sea. A 2D numerical flow model was also used in order to simulate the barotropic flow pattern of the surface water layer. The model was forced with diurnal wind fields obtained for the same period as the satellite infrared images. The currents (magnitude and direction) derived from the two methods compare satisfactorily despite the fact that some model simplifications were made.
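
    The MCC method tracks a template window from one thermal image to the next by maximizing normalized cross correlation; the displacement divided by the time between satellite passes gives a surface velocity. A compact sketch (assumes the window plus search range stays inside both images):

        import numpy as np

        def mcc_displacement(img0, img1, y, x, win=16, search=8):
            # Find the shift that maximizes normalized cross correlation of a
            # template from img0 within a search region of img1.
            tpl = img0[y:y + win, x:x + win].astype(float)
            tpl = (tpl - tpl.mean()) / tpl.std()
            best, best_r = (0, 0), -np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = img1[y + dy:y + dy + win, x + dx:x + dx + win].astype(float)
                    cand = (cand - cand.mean()) / cand.std()
                    r = float(np.mean(tpl * cand))
                    if r > best_r:
                        best_r, best = r, (dy, dx)
            return best      # pixel shift; velocity = shift * pixel_size / delta_t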

  4. Realistic simulation of reduced-dose CT with noise modeling and sinogram synthesis using DICOM CT images

    International Nuclear Information System (INIS)

    Won Kim, Chang; Kim, Jong Hyo

    2014-01-01

    Purpose: Reducing the patient dose while maintaining the diagnostic image quality during CT exams is the subject of a growing number of studies, in which simulations of reduced-dose CT with patient data have been used as an effective technique when exploring the potential of various dose reduction techniques. Difficulties in accessing raw sinogram data, however, have restricted the use of this technique to a limited number of institutions. Here, we present a novel reduced-dose CT simulation technique which provides realistic low-dose images without the requirement of raw sinogram data. Methods: Two key characteristics of CT systems, the noise equivalent quanta (NEQ) and the algorithmic modulation transfer function (MTF), were measured for various combinations of object attenuation and tube currents by analyzing the noise power spectrum (NPS) of CT images obtained with a set of phantoms. Those measurements were used to develop a comprehensive CT noise model covering the reduced x-ray photon flux, object attenuation, system noise, and bow-tie filter, which was then employed to generate a simulated noise sinogram for the reduced-dose condition with the use of a synthetic sinogram generated from a reference CT image. The simulated noise sinogram was filtered with the algorithmic MTF and back-projected to create a noise CT image, which was then added to the reference CT image, finally providing a simulated reduced-dose CT image. The simulation performance was evaluated in terms of the degree of NPS similarity, the noise magnitude, the bow-tie filter effect, and the streak noise pattern at photon starvation sites with the set of phantom images. Results: The simulation results showed good agreement with actual low-dose CT images in terms of their visual appearance and in a quantitative evaluation test. The magnitude and shape of the NPS curves of the simulated low-dose images agreed well with those of real low-dose images, showing discrepancies of less than ±3.2% in
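
    A stripped-down version of sinogram-domain noise injection (ignoring the paper's NEQ/MTF modelling, bow-tie filter, and system noise; the flux and line-integral scaling below are assumptions): forward-project the reference image, compute the extra quantum-noise variance implied by the reduced flux, reconstruct the noise alone, and add it back.

        import numpy as np
        from skimage.transform import radon, iradon

        def simulate_low_dose(ct_img, dose_fraction, full_flux=1e5):
            theta = np.linspace(0.0, 180.0, max(ct_img.shape), endpoint=False)
            p = radon(ct_img, theta=theta)
            scale = 4.0 / p.max()                       # assumed line-integral scaling
            counts = full_flux * np.exp(-scale * p)     # detected photons, full dose
            # Extra variance of the log-sinogram when flux drops by dose_fraction
            var_extra = 1.0 / (dose_fraction * counts) - 1.0 / counts
            rng = np.random.default_rng(6)
            noise_sino = rng.normal(0.0, np.sqrt(var_extra))
            noise_img = iradon(noise_sino, theta=theta, filter_name="ramp")
            return ct_img + noise_img / scale           # back to image grey levels

        phantom = np.zeros((128, 128))
        phantom[32:96, 32:96] = 1.0                     # stand-in reference CT image
        low_dose = simulate_low_dose(phantom, 0.25)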

  5. Science yield modeling with the Exoplanet Open-Source Imaging Mission Simulator (EXOSIMS)

    Science.gov (United States)

    Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Morgan, Rhonda

    2016-08-01

    We report on our ongoing development of EXOSIMS and mission simulation results for WFIRST. We present the interface control and the modular structure of the software, along with corresponding prototypes and class definitions for some of the software modules. More specifically, we focus on describing the main steps of our high-fidelity mission simulator EXOSIMS, i.e., the definition of the completeness, optical system, and zodiacal light modules; the filtering of the target list module; and the creation of a planet population within our simulated universe module. For the latter, we introduce the integration of a recent mass-radius model from the FORECASTER software. We also provide custom modules dedicated to WFIRST using both the Hybrid Lyot Coronagraph (HLC) and the Shaped Pupil Coronagraph (SPC) for detection and characterization, respectively. In that context, we show and discuss the results of some preliminary WFIRST simulations, focusing on comparing different methods of integration time calculation, through ensembles (large numbers) of survey simulations.

  6. Radar Echo Scattering Modeling and Image Simulations of Full-scale Convex Rough Targets at Terahertz Frequencies

    Directory of Open Access Journals (Sweden)

    Gao Jingkun

    2018-02-01

    Echo simulation is a precondition for developing radar imaging systems, algorithms, and subsequent applications. Electromagnetic scattering modeling of the target is key to echo simulation. At terahertz (THz) frequencies, targets are usually of ultra-large electrical size, which makes applying classical electromagnetic calculation methods impractical. In contrast, the short wavelength makes the surface roughness of targets a factor that cannot be ignored, and this makes the traditional echo simulation methods based on the point scattering hypothesis inapplicable. Modeling the scattering characteristics of targets and efficiently generating their radar echoes in THz bands has become a problem that must be solved. In this paper, a hierarchical semi-deterministic modeling method is proposed. A full-wave algorithm for rough surfaces is used to calculate the scattered field of facets. Then, the scattered fields of all facets are transformed into the target coordinate system and coherently summed. Finally, the radar echo containing phase information can be obtained. Using small-scale rough models, our method is compared with a standard high-frequency numerical method, which verifies the effectiveness of the proposed method. Imaging results for a full-scale cone-shaped target are presented, and the scattering modeling and echo generation problem of full-scale convex targets with rough surfaces in THz bands is preliminarily solved; this lays the foundation for future research on imaging regimes and algorithms.

  7. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image; this can be denoted 'image-based simulation'. Different methods to perform this SAR prediction are presented and their advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit, and the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  8. Image simulation and a model of noise power spectra across a range of mammographic beam qualities

    Energy Technology Data Exchange (ETDEWEB)

    Mackenzie, Alistair, E-mail: alistairmackenzie@nhs.net; Dance, David R.; Young, Kenneth C. [National Coordinating Centre for the Physics of Mammography, Royal Surrey County Hospital, Guildford GU2 7XX, United Kingdom and Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Diaz, Oliver [Centre for Vision, Speech and Signal Processing, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH, United Kingdom and Computer Vision and Robotics Research Institute, University of Girona, Girona 17071 (Spain)

    2014-12-15

    Purpose: The aim of this work is to create a model to predict the noise power spectra (NPS) for a range of mammographic radiographic factors. The noise model was necessary to degrade images acquired on one system to match the image quality of different systems for a range of beam qualities. Methods: Five detectors and x-ray systems [Hologic Selenia (ASEh), Carestream computed radiography CR900 (CRc), GE Essential (CSI), Carestream NIP (NIPc), and Siemens Inspiration (ASEs)] were characterized for this study. The signal transfer property was measured as the pixel value against absorbed energy per unit area (E) at a reference beam quality of 28 kV, Mo/Mo or 29 kV, W/Rh with 45 mm polymethyl methacrylate (PMMA) at the tube head. The contributions of the three noise sources (electronic, quantum, and structure) to the NPS were calculated by fitting a quadratic at each spatial frequency of the NPS against E. A quantum noise correction factor which was dependent on beam quality was quantified using a set of images acquired over a range of radiographic factors with different thicknesses of PMMA. The noise model was tested for images acquired at 26 kV, Mo/Mo with 20 mm PMMA and 34 kV, Mo/Rh with 70 mm PMMA for three detectors (ASEh, CRc, and CSI) over a range of exposures. The NPS were modeled with and without the noise correction factor and compared with the measured NPS. A previous method for adapting an image to appear as if acquired on a different system was modified to allow the reference beam quality to be different from the beam quality of the image. The method was validated by adapting the ASEh flat field images with two thicknesses of PMMA (20 and 70 mm) to appear with the imaging characteristics of the CSI and CRc systems. Results: The quantum noise correction factor rises with higher beam qualities, except for CR systems at high spatial frequencies, where a flat response was found against mean photon energy. This is due to the dominance of secondary quantum noise
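
    The per-frequency quadratic decomposition is a one-liner with a polynomial fit: at each spatial frequency, NPS(E) = a + bE + cE² separates electronic (a), quantum (bE), and structure (cE²) noise. A sketch on synthetic data (illustrative shapes and magnitudes, not the measured detectors):

        import numpy as np

        rng = np.random.default_rng(7)
        E = np.array([10.0, 20.0, 40.0, 80.0, 160.0])   # energy per unit area (a.u.)
        f = np.linspace(0.05, 5.0, 50)                  # spatial frequency (1/mm)
        elec, quant, struct = 2.0, 0.5, 1e-3            # assumed true contributions
        nps = elec + quant * E[:, None] * np.exp(-f / 3.0) + struct * E[:, None] ** 2
        nps *= rng.normal(1.0, 0.01, nps.shape)         # measurement scatter

        # Quadratic fit NPS(E) = a + b*E + c*E^2, independently at each frequency
        c2, c1, c0 = np.polyfit(E, nps, 2)
        # c0 -> electronic, c1*E -> quantum, c2*E^2 -> structure noise terms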

  9. Modelling of AlAs/GaAs interfacial structures using high-angle annular dark field (HAADF) image simulations.

    Science.gov (United States)

    Robb, Paul D; Finnie, Michael; Craven, Alan J

    2012-07-01

    High angle annular dark field (HAADF) image simulations were performed on a series of AlAs/GaAs interfacial models using the frozen-phonon multislice method. Three general types of models were considered: perfect, vicinal/sawtooth, and diffusion. These were chosen to demonstrate how HAADF image measurements are influenced by different interfacial structures in the technologically important III-V semiconductor system. For each model, interfacial sharpness was calculated as a function of depth and compared to aberration-corrected HAADF experiments on two types of AlAs/GaAs interfaces. The results show that the sharpness measured from HAADF imaging changes in a complicated manner with thickness for complex interfacial structures. For vicinal structures, it was revealed that the type of material that the probe passes through first has a significant effect on the measured sharpness. An increase in the vicinal angle was also shown to generate a wider interface in the random step model. The Moison diffusion model produced an increase in the interface width with depth which closely matched the experimental results for the AlAs-on-GaAs interface. In contrast, the interface width decreased as a function of depth in the linear diffusion model. Only in the case of the perfect model was it possible to ascertain the underlying structure directly from HAADF image analysis. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Images created in a model eye during simulated cataract surgery can be the basis for images perceived by patients during cataract surgery

    Science.gov (United States)

    Inoue, M; Uchida, A; Shinoda, K; Taira, Y; Noda, T; Ohnuma, K; Bissen-Miyajima, H; Hirakata, A

    2014-01-01

    Purpose To evaluate the images created in a model eye during simulated cataract surgery. Patients and methods This study was conducted as a laboratory investigation and interventional case series. An artificial opaque lens, a clear intraocular lens (IOL), or an irrigation/aspiration (I/A) tip was inserted into the 'anterior chamber' of a model eye with the frosted posterior surface corresponding to the retina. Video images were recorded of the posterior surface of the model eye from the rear during simulated cataract surgery. The video clips were shown to 20 patients before cataract surgery, and the similarity of their visual perceptions to these images was evaluated postoperatively. Results The images of the moving lens fragments and I/A tip and the insertion of the IOL were seen from the rear. The image through the opaque lens and the IOL without moving objects was the light of the surgical microscope from the rear. However, when the microscope light was turned off after IOL insertion, the images of the microscope and operating room were observed by the room illumination from the rear. Seventy percent of the patients answered that the visual perceptions of moving lens fragments were similar to the video clips and 55% reported similarity with the IOL insertion. Eighty percent of the patients recommended that patients watch the video clip before their scheduled cataract surgery. Conclusions The patients' visual perceptions during cataract surgery can be reproduced in the model eye. Watching the video images preoperatively may help relax the patients during surgery. PMID:24788007

  11. Image-based biomechanical modeling of aortic wall stress and vessel deformation: response to pulsatile arterial pressure simulations

    Science.gov (United States)

    Hazer, Dilana; Bauer, Miriam; Unterhinninghofen, Roland; Dillmann, Rüdiger; Richter, Götz-M.

    2008-03-01

    Image-based modeling of cardiovascular biomechanics may be very helpful for patients with aortic aneurysms to predict the risk of rupture and evaluate the necessity of a surgical intervention. In order to provide reliable support, it is necessary to develop exact patient-specific models that simulate biomechanical parameters, provide individual structural analysis of the state of fatigue, and characterize the rupture potential of the aortic wall. The patient-specific geometry used here originates from a CT scan of an Abdominal Aortic Aneurysm (AAA). The computations are based on the Finite Element Method (FEM) and simulate the wall stress distribution and the vessel deformation. The wall transient boundary conditions are based on real time-dependent pressure simulations obtained from a previous computational fluid dynamics study. The physiological wall material properties follow a nonlinear hyperelastic constitutive model, based on realistic ex-vivo analysis of the aneurysmal arterial tissue. The results showed complex deformation and stress distribution on the AAA wall. The maximum stresses occurred at systole and are found around the aneurysmal bulge in regions close to inflection points. Biomechanical modeling based on medical images and coupled with patient-specific hemodynamics allows analysing and quantifying the effects of dilatation of the arterial wall due to the pulsatile aortic pressure. It provides a physical and realistic insight into the wall mechanics and enables predictive simulations of AAA growth and assessment of rupture. Further development integrating endovascular models would help in evaluating non-invasively individual treatment strategies for optimal placement and improved device design.

  12. Validation of the GATE Monte Carlo simulation platform for modelling a CsI(Tl) scintillation camera dedicated to small-animal imaging

    International Nuclear Information System (INIS)

    Lazaro, D; Buvat, I; Loudos, G; Strul, D; Santin, G; Giokaris, N; Donnarieix, D; Maigne, L; Spanoudaki, V; Styliaris, S; Staelens, S; Breton, V

    2004-01-01

    Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 application for tomographic emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small-animal imaging and consisting of a CsI(Tl) crystal array coupled to a position-sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and images of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 μm. The difference between experimental and simulated system sensitivities for different source-to-collimator distances was within 2%. Simulated and experimental scatter fractions in a [98-182 keV] energy window differed by less than 2% for sources located in water. Simulated and experimental energy spectra agreed very well between 40 and 180 keV. These results demonstrate the ability and flexibility of GATE for simulating original detector designs. The main weakness of GATE concerns the long computation time it requires: this issue is currently under investigation by the GEANT4 and the GATE collaborations

  13. Simulation of ultrasound backscatter images from fish

    DEFF Research Database (Denmark)

    Pham, An Hoai; Stage, Bjarne; Hemmsen, Martin Christian

    2011-01-01

    The objective of this work is to investigate ultrasound (US) backscatter in the MHz range from fish to develop a realistic and reliable simulation model. The long-term objective of the work is to develop the needed signal processing for fish species differentiation using US. In in-vitro experiments ... the simulated images reproduce most of the important characteristics of the measured US image.

  14. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    Science.gov (United States)

    2016-06-01

    ... technique to detect special nuclear material using cosmic rays. ... any isotope loaded into its library with that image. The Polaris provides isotope identification, radiation activity levels, a spectrograph, and ... all data used remain offset, but realistic of current technology in the gamma ray imaging realm. The purpose is to identify the factors and/or

  15. Computational fluid dynamic simulations of image-based stented coronary bifurcation models

    Science.gov (United States)

    Chiastra, Claudio; Morlacchi, Stefano; Gallo, Diego; Morbiducci, Umberto; Cárdenes, Rubén; Larrabide, Ignacio; Migliavacca, Francesco

    2013-01-01

    One of the relevant phenomena associated with in-stent restenosis in coronary arteries is altered haemodynamics in the stented region. Computational fluid dynamics (CFD) offers the possibility to investigate the haemodynamics at a level of detail not always accessible with experimental techniques. CFD can quantify and correlate the local haemodynamic structures which might lead to in-stent restenosis. The aim of this work is to study the fluid dynamics of realistic stented coronary artery models which replicate the complete clinical procedure of stent implantation. Two cases of pathologic left anterior descending coronary arteries with their bifurcations are reconstructed from computed tomography angiography and conventional coronary angiography images. Results of wall shear stress and relative residence time show that the wall regions more prone to the risk of restenosis are located next to stent struts, to the bifurcations and to the stent overlapping zone for both investigated cases. Considering a bulk flow analysis, helical flow structures are generated by the curvature of the zone upstream from the stent and by the bifurcation regions. Helical recirculating microstructures are also visible downstream from the stent struts. This study demonstrates the feasibility of virtually investigating the haemodynamics of patient-specific coronary bifurcation geometries. PMID:23676893

  16. Particle Based Image Segmentation with Simulated Annealing

    NARCIS (Netherlands)

    Everts, M.H.; Bekker, H.; Jalba, A.C.; Roerdink, J.B.T.M.

    2007-01-01

    The Charged Particle Model (CPM) is a physically motivated deformable model for shape recovery and segmentation. It simulates a system of charged particles moving in an electric field generated from the input image, whose positions in the equilibrium state are used for curve or surface

  17. SimVascular 2.0: an Integrated Open Source Pipeline for Image-Based Cardiovascular Modeling and Simulation

    Science.gov (United States)

    Lan, Hongzhi; Merkow, Jameson; Updegrove, Adam; Schiavazzi, Daniele; Wilson, Nathan; Shadden, Shawn; Marsden, Alison

    2015-11-01

    SimVascular (www.simvascular.org) is currently the only fully open source software package that provides a complete pipeline from medical image based modeling to patient specific blood flow simulation and analysis. It was initially released in 2007 and has contributed to numerous advances in fundamental hemodynamics research, surgical planning, and medical device design. However, early versions had several major barriers preventing wider adoption by new users, large-scale application in clinical and research studies, and educational access. In recent years, SimVascular 2.0 has made significant progress by integrating open source alternatives for the expensive commercial libraries previously required for anatomic modeling, mesh generation and the linear solver. In addition, it simplified the cross-platform compilation process, improved the graphical user interface and launched a comprehensive documentation website. Many enhancements and new features have been incorporated for the whole pipeline, such as 3-D segmentation, Boolean operation for discrete triangulated surfaces, and multi-scale coupling for closed loop boundary conditions. In this presentation we will briefly overview the modeling/simulation pipeline and the advances of the new SimVascular 2.0.

  18. Medical Image Registration and Surgery Simulation

    DEFF Research Database (Denmark)

    Bro-Nielsen, Morten

    1996-01-01

    This thesis explores the application of physical models in medical image registration and surgery simulation. The continuum models of elasticity and viscous fluids are described in detail, and this knowledge is used as a basis for most of the methods described here. Real-time deformable models......, and the use of selective matrix vector multiplication. Fluid medical image registration: A new and faster algorithm for non-rigid registration using viscous fluid models is presented. This algorithm replaces the core part of the original algorithm with multi-resolution convolution using a new filter, which...... growth is also presented. Using medical knowledge about the growth processes of the mandibular bone, a registration algorithm for time sequence images of the mandible is developed. Since this registration algorithm models the actual development of the mandible, it is possible to simulate the development...
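
    The core idea of convolution-based fluid registration can be illustrated compactly: an image-mismatch force field is smoothed by a filter to yield a fluid-like velocity, which is integrated into a displacement field. The sketch below is a greedy 2D toy in which a Gaussian kernel stands in for the thesis's dedicated filter (an assumption; the original derives its filter from the linearized fluid operator).

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def fluid_register(fixed, moving, steps=200, sigma=4.0, dt=0.5):
    """Greedy fluid-style 2D registration sketch; returns displacement (dy, dx)."""
    yy, xx = np.mgrid[0:fixed.shape[0], 0:fixed.shape[1]].astype(float)
    dy, dx = np.zeros_like(yy), np.zeros_like(xx)
    for _ in range(steps):
        warped = map_coordinates(moving, [yy + dy, xx + dx], order=1, mode='nearest')
        gy, gx = np.gradient(warped)
        diff = fixed - warped                     # intensity mismatch drives the force
        vy = gaussian_filter(diff * gy, sigma)    # convolution replaces the PDE solve
        vx = gaussian_filter(diff * gx, sigma)
        dy += dt * vy                             # integrate velocity into displacement
        dx += dt * vx
    return dy, dx
```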

  19. EMC Simulation and Modeling

    Science.gov (United States)

    Takahashi, Takehiro; Schibuya, Noboru

    EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. As the electromagnetic behaviour calculated by an EMC simulator depends on the EMC model of the equipment given as input, the modeling technique is important for obtaining useful results. In this paper, a brief outline of EMC simulators and EMC models is given. Some modeling techniques for EMC simulation are also described, with the example of an EMC model of a shielded box with an aperture.

  20. Numerical simulation of imaging laser radar system

    Science.gov (United States)

    Han, Shaokun; Lu, Bo; Jiang, Ming; Liu, Xunliang

    2008-03-01

    Rational and effective design is the key issue in imaging laser radar system research, and the design must fully consider the interrelationships between the various parameters before suitable lasers, detectors and other components can be chosen. Mathematical modeling and computer simulation are therefore effective design methods for imaging laser radar systems. Starting from the range equation and using statistical detection methods, this paper builds mathematical models of the laser radar system covering range coverage, detection probability, false-alarm rate and SNR. In setting up the models, the influence of the laser, the atmosphere, the detector and other factors on performance is fully considered, so that the models respond accurately to the real situation. On this basis, simulation software was designed using C# and Matlab.
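
    A rough flavour of such models: a monostatic range equation gives the received power (and hence SNR) as a function of range, and a Gaussian-noise threshold model converts SNR and a chosen false-alarm rate into a detection probability. The sketch below uses a generic extended-Lambertian-target form of the range equation with illustrative coefficients; it is not the paper's specific formulation.

```python
import numpy as np
from scipy.special import erfc, erfcinv

def received_power(Pt, rho, Ar, R, gamma, eta=0.8):
    """Monostatic range equation for an extended Lambertian target.
    Pt: transmit power, rho: target reflectivity, Ar: receiver aperture area (m^2),
    R: range (m), gamma: atmospheric extinction (1/m), eta: optics efficiency."""
    return Pt * rho * eta * Ar * np.exp(-2.0 * gamma * R) / (np.pi * R**2)

def detection_probability(snr, pfa=1e-6):
    """Threshold detection in unit-variance Gaussian noise: fix Pfa, derive Pd."""
    thresh = np.sqrt(2.0) * erfcinv(2.0 * pfa)   # normalized detection threshold
    return 0.5 * erfc((thresh - snr) / np.sqrt(2.0))

print(detection_probability(np.array([4.0, 6.0, 8.0])))  # Pd rises sharply with SNR
```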

  1. Utilizing native fluorescence imaging, modeling and simulation to examine pharmacokinetics and therapeutic regimen of a novel anticancer prodrug

    International Nuclear Information System (INIS)

    Wang, Jing-Hung; Endsley, Aaron N.; Green, Carol E.; Matin, A. C.

    2016-01-01

    Success of cancer prodrugs relying on a foreign gene requires specific delivery of the gene to the cancer, and improvements such as higher-level gene transfer and expression. Attaining these objectives will be facilitated in preclinical studies using our newly discovered CNOB-GDEPT, consisting of the prodrug 6-chloro-9-nitro-5-oxo-5H-benzo-(a)-phenoxazine (CNOB) and its activating enzyme ChrR6, which generates the cytotoxic product 9-amino-6-chloro-5H-benzo[a]phenoxazine-5-one (MCHB). MCHB is fluorescent and can be noninvasively imaged in mice, and here we investigated whether MCHB fluorescence quantitatively reflects its concentration, as this would enhance its reporter value in further development of the CNOB-GDEPT therapeutic regimen. PK parameters were estimated and used to predict more effective CNOB administration schedules. CNOB (3.3 mg/kg) was injected iv in mice implanted with humanized ChrR6 (HChrR6)-expressing 4T1 tumors. Fluorescence was imaged in live mice using IVIS Spectrum and quantified by Living Image 3.2 software. MCHB and CNOB were also quantified by LC/MS/MS analysis. We used a non-compartmental model to estimate PK parameters. Phoenix WinNonlin software was used for simulations to predict a more effective CNOB dosage regimen. CNOB administration significantly prolonged the survival of the mice. MCHB fluorescence quantitatively reflected its exposure levels in the tumor and the plasma, as verified by LC/MS/MS analysis at various time points, including at a low concentration of 2 ng/g tumor. The LC/MS/MS data were used to estimate peak plasma concentrations, exposure (AUC(0-24)), volume of distribution, clearance and half-life in plasma and the tumor. Simulations suggested that CNOB-GDEPT can be a successful therapy without large increases in the prodrug dosage. MCHB fluorescence quantifies this drug, and CNOB can be effective at relatively low doses. MCHB fluorescence characteristics will expedite further development of CNOB-GDEPT by, for example
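
    The non-compartmental quantities listed above follow from a concentration-time profile by standard formulas: a trapezoidal AUC, a terminal slope for the elimination rate, and dose-normalized clearance and volume terms. A hedged sketch of those textbook calculations (not the Phoenix WinNonlin implementation):

```python
import numpy as np

def nca(time_h, conc, dose):
    """Basic non-compartmental PK analysis of an iv concentration-time profile."""
    auc_tlast = np.trapz(conc, time_h)                    # AUC(0-tlast), trapezoid rule
    t_tail, c_tail = time_h[-3:], conc[-3:]               # last 3 points: terminal phase
    lam_z = -np.polyfit(t_tail, np.log(c_tail), 1)[0]     # terminal elimination rate
    auc_inf = auc_tlast + conc[-1] / lam_z                # extrapolate AUC to infinity
    return {"t_half": np.log(2.0) / lam_z,
            "AUC_inf": auc_inf,
            "CL": dose / auc_inf,                         # clearance
            "Vz": dose / (auc_inf * lam_z)}               # terminal volume of distribution

t = np.array([0.25, 0.5, 1, 2, 4, 8, 24.0])               # h (illustrative sampling)
c = 10.0 * np.exp(-0.3 * t)                               # toy mono-exponential profile
print(nca(t, c, dose=3.3))
```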

  2. Simulation of High Quality Ultrasound Imaging

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Kortbek, Jacob; Nikolov, Svetoslav Ivanov

    2010-01-01

    This paper investigates whether the influence of physical transducers on image quality can be simulated with sufficient accuracy to reveal system performance. The influence is investigated in a comparative study between Synthetic Aperture Sequential Beamforming (SASB) and Dynamic Receive Focus...... is modeled by incorporating measured element pulse-echo responses into the simulation software. Validation is performed through measurements on a water phantom with three metal wires, each with a diameter of 0.07 mm. Results show that when comparing measurement and simulation, the lateral beam profile using....

  3. Human eye cataract microstructure modeling and its effect on simulated retinal imaging

    Science.gov (United States)

    Fan, Wen-Shuang; Chang, Chung-Hao; Horng, Chi-Ting; Yao, Hsin-Yu; Sun, Han-Ying; Huang, Shu-Fang; Wang, Hsiang-Chen

    2017-02-01

    We designed a model of the crystalline lens microstructure in cataract lesions and calculated the aberrations of the eye by ray-trace modeling, obtaining the corresponding spherical, coma and trefoil aberration values for different degrees of pathological change. The relationship between microstructure and aberration was then discussed using these values. Calculation results showed that as the number of layers in the microstructure increased, spherical aberration was affected the most. In addition, the influence of a relatively compact microstructure on spherical and coma aberration was small, but its influence on trefoil aberration was large.
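
    Aberration values of this kind feed a retinal-image simulation through Fourier optics: build a pupil function whose phase is the Zernike wavefront error, and the squared magnitude of its Fourier transform is the PSF with which the scene is convolved. A generic sketch with a few hand-coded, unnormalized Zernike terms (coefficients in waves; not the paper's specific model):

```python
import numpy as np

def psf_from_zernike(c_sph=0.10, c_coma=0.05, c_trefoil=0.05, n=256):
    """PSF of an eye-like circular pupil from spherical, coma and trefoil terms."""
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    r, th = np.hypot(x, y), np.arctan2(y, x)
    pupil = (r <= 1.0).astype(float)
    w = (c_sph * (6*r**4 - 6*r**2 + 1)            # spherical aberration, Z(4,0)
         + c_coma * (3*r**3 - 2*r) * np.cos(th)   # coma, Z(3,1)
         + c_trefoil * r**3 * np.cos(3*th))       # trefoil, Z(3,3)
    field = pupil * np.exp(2j * np.pi * w)        # pupil function with phase error
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()
```

Convolving a test scene with this PSF (e.g., with scipy.signal.fftconvolve) then yields the kind of simulated retinal image the title refers to.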

  4. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  5. Multilaboratory particle image velocimetry analysis of the FDA benchmark nozzle model to support validation of computational fluid dynamics simulations.

    Science.gov (United States)

    Hariharan, Prasanna; Giarra, Matthew; Reddy, Varun; Day, Steven W; Manning, Keefe B; Deutsch, Steven; Stewart, Sandy F C; Myers, Matthew R; Berman, Michael R; Burgreen, Greg W; Paterson, Eric G; Malinauskas, Richard A

    2011-04-01

    This study is part of an FDA-sponsored project to evaluate the use and limitations of computational fluid dynamics (CFD) in assessing blood flow parameters related to medical device safety. In an interlaboratory study, fluid velocities and pressures were measured in a nozzle model to provide experimental validation for a companion round-robin CFD study. The simple benchmark nozzle model, which mimicked the flow fields in several medical devices, consisted of a gradual flow constriction, a narrow throat region, and a sudden expansion region where a fluid jet exited the center of the nozzle with recirculation zones near the model walls. Measurements of mean velocity and turbulent flow quantities were made in the benchmark device at three independent laboratories using particle image velocimetry (PIV). Flow measurements were performed over a range of nozzle throat Reynolds numbers (Re_throat) from 500 to 6500, covering the laminar, transitional, and turbulent flow regimes. A standard operating procedure was developed for performing experiments under controlled temperature and flow conditions and for minimizing systematic errors during PIV image acquisition and processing. For laminar (Re_throat = 500) and turbulent flow conditions (Re_throat ≥ 3500), the velocities measured by the three laboratories were similar with an interlaboratory uncertainty of ∼10% at most of the locations. However, for the transitional flow case (Re_throat = 2000), the uncertainty in the size and the velocity of the jet at the nozzle exit increased to ∼60% and was very sensitive to the flow conditions. An error analysis showed that by minimizing the variability in the experimental parameters such as flow rate and fluid viscosity to less than 5% and by matching the inlet turbulence level between the laboratories, the uncertainties in the velocities of the transitional flow case could be reduced to ∼15%. The experimental procedure and flow results from this interlaboratory study (available
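
    At the heart of the PIV measurements is a displacement estimate per interrogation window, classically obtained as the peak of an FFT-based cross-correlation between the two exposures. A generic sketch of that core step (integer-pixel accuracy only; the laboratories' processing chains add sub-pixel fitting, windowing and validation):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer displacement (dy, dx) of win_b relative to win_a via cross-correlation."""
    a = win_a - win_a.mean()                       # remove mean to suppress DC peak
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)                   # put zero shift at the center
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    return peak[0] - center[0], peak[1] - center[1]

# Toy check: a particle pattern shifted by (3, -2) pixels should be recovered.
rng = np.random.default_rng(2)
a = rng.random((64, 64))
b = np.roll(a, (3, -2), axis=(0, 1))
print(piv_displacement(a, b))                      # -> (3, -2)
```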

  6. Research on simulated infrared image utility evaluation using deep representation

    Science.gov (United States)

    Zhang, Ruiheng; Mu, Chengpo; Yang, Yu; Xu, Lixin

    2018-01-01

    Infrared (IR) image simulation is an important data source for various target recognition systems. However, whether simulated IR images can be used as training data for classifiers depends on their fidelity and authenticity. To evaluate IR image features, a deep-representation-based algorithm is proposed. Unlike conventional methods, which usually rely on a priori knowledge or manually designed features, the proposed method can extract essential features and quantitatively evaluate the utility of simulated IR images. First, for data preparation, we employ our IR image simulation system to generate large amounts of IR images. Then, we present the evaluation model for simulated IR images, for which an end-to-end IR feature extraction and target detection model based on a deep convolutional neural network is designed. Finally, the experiments illustrate that our proposed method outperforms other verification algorithms in evaluating simulated IR images. Cross-validation, variable-proportion mixed-data validation, and simulation process contrast experiments are carried out to evaluate the utility and objectivity of the images generated by our simulation system. The optimum mixing ratio between simulated and real data is 0.2≤γ≤0.3, which makes simulation an effective data augmentation method for real IR images.
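
    Assembling a training set at a fixed mixing ratio is straightforward; the sketch below interprets γ as the number of simulated samples per real sample (an assumption about the paper's definition), and the array names are placeholders.

```python
import numpy as np

def mix_training_set(real_x, real_y, sim_x, sim_y, gamma=0.25, seed=0):
    """Augment real IR data with simulated images at ratio gamma = n_sim / n_real."""
    rng = np.random.default_rng(seed)
    n_sim = min(int(round(gamma * len(real_x))), len(sim_x))
    idx = rng.choice(len(sim_x), size=n_sim, replace=False)   # sample simulated images
    x = np.concatenate([real_x, sim_x[idx]])
    y = np.concatenate([real_y, sim_y[idx]])
    perm = rng.permutation(len(x))                            # shuffle the mixed set
    return x[perm], y[perm]
```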

  7. End-to-End Image Simulator for Optical Imaging Systems: Equations and Simulation Examples

    Directory of Open Access Journals (Sweden)

    Peter Coppo

    2013-01-01

    A theoretical description of a simplified end-to-end software tool for simulating data produced by optical instruments, starting from either synthetic or airborne hyperspectral data, is given, and some simulation examples of hyperspectral and panchromatic images for existing and future design instruments are also reported. High spatial/spectral resolution images with low intrinsic noise and the sensor/mission specifications are used as inputs for the simulations. The examples reported in this paper show the capabilities of the tool for simulating target detection scenarios, data quality assessment with respect to classification performance and class discrimination, impact of optical design on image quality, and 3D modelling of optical performances. The simulator is conceived as a tool (during phase 0/A) for the specification and early development of new Earth observation optical instruments, whose compliance with users' requirements is achieved through a process of cost/performance trade-off. The Selex Galileo simulator, as compared with other existing image simulators for phase C/D projects of space-borne instruments, implements all modules necessary for a complete panchromatic and hyperspectral image simulation, and it allows excellent flexibility and expandability for new integrated functions because of the adopted IDL-ENVI software environment.
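
    The sensor end of such an end-to-end chain usually reduces to a radiance-to-digital-number model with shot noise, read noise and quantization. The sketch below is a generic focal-plane model with illustrative parameters, not the Selex Galileo formulation.

```python
import numpy as np

def radiance_to_dn(radiance, qe=0.8, photons_per_unit=1e4,
                   read_noise_e=20.0, full_well_e=6e4, bits=12, seed=0):
    """Toy focal-plane model: radiance -> photons -> electrons -> quantized DN."""
    rng = np.random.default_rng(seed)
    photons = radiance * photons_per_unit                 # optics + integration, linear
    electrons = rng.poisson(photons * qe).astype(float)   # shot noise on detected e-
    electrons += rng.normal(0.0, read_noise_e, electrons.shape)  # additive read noise
    electrons = np.clip(electrons, 0.0, full_well_e)      # saturation at full well
    return np.round(electrons / full_well_e * (2**bits - 1)).astype(np.int32)
```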

  8. A general approach to flaw simulation in castings by superimposing projections of 3D models onto real X-ray images

    International Nuclear Information System (INIS)

    Hahn, D.; Mery, D.

    2003-01-01

    In order to evaluate the sensitivity of defect inspection systems, it is convenient to examine simulated data. This gives the possibility to tune the parameters of the inspection method and to test the performance of the system in critical cases. In this paper, a practical method for the simulation of defects in radioscopic images of aluminium castings is presented. The approach simulates only the flaws and not the whole radioscopic image of the object under test. A 3D mesh is used to model a flaw with complex geometry, which is projected and superimposed onto real radioscopic images of a homogeneous object according to the exponential attenuation law for X-rays. The new grey value of a pixel, where the 3D flaw is projected, depends only on four parameters: (a) the grey value of the original X-ray image without flaw; (b) the linear absorption coefficient of the examined material; (c) the maximal thickness observable in the radioscopic image; and (d) the length of the intersection of the 3D flaw with the modelled X-ray beam that is projected into the pixel. A simulation of a complex flaw modelled as a 3D mesh can be performed at any position of the castings by using the algorithm described in this paper. This allows the evaluation of the performance of defect inspection systems in cases where the detection is known to be difficult. In this paper, we show experimental results on real X-ray images of aluminium wheels, in which 3D flaws like blowholes, cracks and inclusions are simulated.
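
    Under the stated four-parameter model, superimposing a void-like flaw amounts to one multiplication per affected pixel: removing d mm of absorbing material along a ray raises the transmitted intensity by exp(μd). A hedged sketch assuming grey values proportional to transmitted intensity (the paper's calibration via the maximal observable thickness is reduced here to a simple clip):

```python
import numpy as np

def superimpose_flaw(grey, mu, d, grey_max=255.0):
    """Grey value where a cavity-like 3D flaw projects into the image.

    grey : original grey value(s), assumed proportional to transmitted intensity
    mu   : linear absorption coefficient of the casting material (1/mm)
    d    : per-pixel intersection length of the flaw with the X-ray beam (mm)
    """
    # Exponential attenuation law: a void shortens the absorbing path by d,
    # so the transmitted intensity rises by a factor exp(mu * d).
    return np.minimum(grey * np.exp(mu * d), grey_max)
```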

  9. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex...... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated...... Restraint developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design integrated simulation.

  10. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis.  This boo...

  12. Scientific Modeling and Simulations

    CERN Document Server

    Diaz de la Rubia, Tomás

    2009-01-01

    Showcases the conceptual advantages of modeling which, coupled with the unprecedented computing power offered by simulations, allows scientists to tackle the formidable problems of our society, such as the search for hydrocarbons, understanding the structure of a virus, or the intersection between simulations and real data in extreme environments.

  13. MO-G-17A-04: Internal Dosimetric Calculations for Pediatric Nuclear Imaging Applications, Using Monte Carlo Simulations and High-Resolution Pediatric Computational Models

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)

    2014-06-15

    Purpose: Our purpose is to evaluate the absorbed dose administered in pediatric nuclear imaging studies. Monte Carlo simulations incorporating pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5-14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms in GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals commonly used in SPECT and PET applications are being tested, following the EANM pediatric dosage protocol. The biodistributions of the isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl after whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept below 5%. The S-factors for each target organ are calculated in Gy/(MBq·s), with the highest dose being absorbed in the kidneys and pancreas (9.29×10¹⁰ and 0.15×10¹⁰, respectively). Conclusion: An approach to accurate dosimetry in pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and
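
    Organ doses of this kind follow the MIRD formalism: the absorbed dose in a target organ is the cumulated activity in the source organ times an S-factor. A minimal sketch, with an illustrative (not paper-derived) S value and a toy mono-exponential washout:

```python
import numpy as np

def organ_dose(time_s, activity_MBq, s_factor_Gy_per_MBq_s):
    """MIRD-style dose: cumulated activity (trapezoid integral) times S-factor."""
    a_cumulated = np.trapz(activity_MBq, time_s)       # MBq*s in the source organ
    return a_cumulated * s_factor_Gy_per_MBq_s         # Gy in the target organ

t = np.linspace(0.0, 24 * 3600.0, 500)                 # 24 h of sampling, in seconds
a = 100.0 * np.exp(-np.log(2.0) * t / (6.0 * 3600.0))  # 6 h effective half-life (toy)
print(f"dose = {organ_dose(t, a, 1.0e-9):.2e} Gy")     # S value purely illustrative
```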

  14. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance, which have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  15. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and the more popular use of computer-aided systems, more data has become available.

  16. AEGIS geologic simulation model

    International Nuclear Information System (INIS)

    Foley, M.G.

    1982-01-01

    The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine the validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data requirements necessitate detailed characterization of each site before application.

  17. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, interest in computational models and simulations has increased considerably during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  18. Image simulation for automatic license plate recognition

    Science.gov (United States)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
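
    The second step of the framework, imaging transformations and distortions, can be sketched as a perspective warp followed by blur and sensor noise. The sketch below uses OpenCV with illustrative parameters; in the paper these distortion parameters are estimated from measurements of real plate images.

```python
import cv2
import numpy as np

def degrade_plate(img, tilt_px=8, blur_sigma=1.2, noise_sigma=6.0, seed=0):
    """Apply capture-like distortions to a synthetic license plate image."""
    rng = np.random.default_rng(seed)
    h, w = img.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = src + rng.uniform(-tilt_px, tilt_px, src.shape).astype(np.float32)
    M = cv2.getPerspectiveTransform(src, dst)            # random perspective skew
    out = cv2.warpPerspective(img, M, (w, h), borderMode=cv2.BORDER_REPLICATE)
    out = cv2.GaussianBlur(out, (0, 0), blur_sigma)      # optics / motion blur proxy
    out = out.astype(np.float32) + rng.normal(0, noise_sigma, out.shape)
    return np.clip(out, 0, 255).astype(np.uint8)         # re-quantize to 8 bits
```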

  19. Simulated annealing image reconstruction for positron emission tomography

    International Nuclear Information System (INIS)

    Sundermann, E.; Lemahieu, I.; Desmedt, P.

    1994-01-01

    In positron emission tomography (PET), images have to be reconstructed from noisy projection data. The noise on PET data can be modeled by a Poisson distribution. In this paper, we present the results of using the simulated annealing technique to reconstruct PET images. Various parameter settings of the simulated annealing algorithm are discussed and optimized. The reconstructed images are of good quality and high contrast in comparison to other reconstruction techniques. (authors)
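
    The flavor of such a reconstruction can be shown with a deliberately tiny toy: random single-pixel perturbations are accepted or rejected by the Metropolis rule against a data-fit energy under a cooling schedule. Here the "projections" are just row and column sums, far simpler than a real PET system model.

```python
import numpy as np

def sa_reconstruct(proj_rows, proj_cols, shape, iters=20_000, t0=1.0, seed=0):
    """Toy simulated-annealing reconstruction from row/column-sum projections."""
    rng = np.random.default_rng(seed)
    img = np.zeros(shape)
    energy = lambda im: (np.sum((im.sum(1) - proj_rows) ** 2)
                         + np.sum((im.sum(0) - proj_cols) ** 2))
    e = energy(img)
    for k in range(iters):
        temp = t0 / (1.0 + k / 1000.0)                   # cooling schedule
        i, j = rng.integers(shape[0]), rng.integers(shape[1])
        trial = img.copy()
        trial[i, j] = max(0.0, trial[i, j] + rng.normal(0.0, 0.1))   # keep nonnegative
        e_new = energy(trial)
        if e_new < e or rng.random() < np.exp(-(e_new - e) / temp):  # Metropolis rule
            img, e = trial, e_new
    return img

truth = np.zeros((16, 16)); truth[5:9, 6:12] = 1.0       # toy phantom
rec = sa_reconstruct(truth.sum(1), truth.sum(0), truth.shape)
```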

  20. TH-CD-207A-08: Simulated Real-Time Image Guidance for Lung SBRT Patients Using Scatter Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Redler, G; Cifter, G; Templeton, A; Lee, C; Bernard, D; Liao, Y; Zhen, H; Turian, J; Chu, J [Rush University Medical Center, Chicago, IL (United States)

    2016-06-15

    Purpose: To develop a comprehensive Monte Carlo-based model for the acquisition of scatter images of patient anatomy in real time during lung SBRT treatment. Methods: During SBRT treatment, images of patient anatomy can be acquired from scattered radiation. To rigorously examine the utility of scatter images for image guidance, a model is developed using MCNP code to simulate scatter images of phantoms and lung cancer patients. The model is validated by comparing experimental and simulated images of phantoms of different complexity. The differentiation between tissue types is investigated by imaging objects of known composition (water, lung, and bone equivalent). A lung tumor phantom, simulating the materials and geometry encountered during lung SBRT treatments, is used to investigate image noise properties for various quantities of delivered radiation (monitor units, MU). Patient scatter images are simulated using the validated simulation model. 4DCT patient data are converted to an MCNP input geometry accounting for different tissue compositions and densities. Lung tumor phantom images acquired with decreasing imaging time (decreasing MU) are used to model the expected noise amplitude in patient scatter images, producing realistic simulated patient scatter images with varying temporal resolution. Results: Image intensities in simulated and experimental scatter images of tissue-equivalent objects (water, lung, bone) match within the uncertainty (∼3%), as do the lung tumor phantom images; specifically, tumor-to-lung contrast matches within the uncertainty. Adding random noise approximating the quantum noise of experimental images to the simulated patient images shows that scatter imaging can provide images of lung tumors in as little as 0.5 s with CNR ∼2.7. Conclusions: A scatter imaging simulation model is developed and validated using experimental phantom scatter images. Following validation, lung cancer patient scatter images are simulated. These simulated

  1. Three-dimensional modeling and simulation of asphalt concrete mixtures based on X-ray CT microstructure images

    Directory of Open Access Journals (Sweden)

    Hainian Wang

    2014-02-01

    X-ray CT (computed tomography) was used to scan an asphalt mixture specimen to obtain high-resolution continuous cross-section images and the meso-structure. Following the theory of three-dimensional (3D) reconstruction, a 3D reconstruction algorithm was investigated in this paper. The key to the reconstruction technique is the acquisition of the voxel positions and the relationship between each pixel element and node. A three-dimensional numerical model of the asphalt mixture specimen was created by a self-developed program. A splitting test was conducted to predict the stress distributions in the asphalt mixture and verify the rationality of the 3D model.

  2. Dynamic simulation for distortion image with turbulence atmospheric transmission effects

    Science.gov (United States)

    Du, Huijie; Fei, Jindong; Qing, Duzheng; Zhao, Hongming; Yu, Hong; Cheng, Chen

    2013-09-01

    Imaging through atmospheric turbulence is an unavoidable problem for infrared imaging sensors working in a turbulent atmospheric environment. Before light rays enter the window of an imaging sensor, atmospheric turbulence randomly perturbs the transmission of the light waves coming from the objects, causing the distribution of image intensity values on the focal plane to diffuse, the peak value to decrease, the image to blur, and the pixels to shift, making image identification very difficult. Because processing times are long and the atmospheric turbulent flow field is unknown and hard to describe with mathematical models, dynamic simulation of image distortion from turbulent atmospheric transmission effects is difficult and challenging. This paper discusses such a dynamic simulation. First, using the data and the optical transmission model of the turbulent atmosphere, ray tracing is applied to obtain the propagation path of optical rays through the high-speed turbulent flow field, to calculate the OPD from the reference wavefront to the reconstructed wavefront, and to obtain the point spread function (PSF). Second, infrared characteristics models of a typical scene were established according to the theory of infrared physics and heat conduction, and a dynamic infrared image was generated with OpenGL. The last step is to obtain the distorted image with turbulent atmospheric transmission effects: using the results of the atmospheric transmission computation, each frame of the infrared simulation image was processed according to the theory of image processing and real-time image simulation, yielding dynamic distorted simulation images with blurring, jitter and shifting effects. The simulation method described above can provide the theoretical bases for recovering
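
    The OPD-to-PSF step can be illustrated with Fourier optics: treat the accumulated OPD over the aperture as a phase screen, take the squared magnitude of the Fourier-transformed pupil field as the instantaneous PSF, and convolve each frame with it. The sketch below substitutes a smoothed random screen for the ray-traced OPD field (an assumption; the paper derives the OPD from the flow solution):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def turbulent_frame(scene, screen_waves=0.5, n=128, seed=0):
    """Blur one frame with a PSF computed from a random phase screen."""
    rng = np.random.default_rng(seed)
    opd = gaussian_filter(rng.normal(0, 1, (n, n)), 8) * screen_waves  # OPD in waves
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    pupil = np.hypot(x, y) <= 1.0                      # circular aperture
    field = pupil * np.exp(2j * np.pi * opd)           # aberrated pupil field
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    psf /= psf.sum()
    return fftconvolve(scene, psf, mode='same')        # blurred, jittered frame
```

Regenerating the screen per frame (a new seed each time) produces the frame-to-frame blur, jitter and shift the abstract describes.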

  3. Medical image archive node simulation and architecture

    Science.gov (United States)

    Chiang, Ted T.; Tang, Yau-Kuo

    1996-05-01

    It is a well known fact that managed care and new treatment technologies are revolutionizing the health care provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, 'filmless' radiology (and other imagery) systems. They generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data in a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computed Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as the diversified user access requirements, implementation of the MIADS will be a complex procedure. One of the key challenges to implementing a MIADS is to select a cost-effective, scalable system architecture to meet the ingest/retrieval performance requirements. We have performed an in-depth system engineering study, and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results. In particular, we will emphasize system scalability and upgradability issues. Furthermore, we will discuss our simulation results in detail. The simulations study the ingest/retrieval performance requirements based on different system configurations and architectures for variables such as workload, tape

  4. PSH Transient Simulation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-21

    PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.

  5. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.

    We present a consistent, physically based theory for the wake meandering phenomenon, which we consider of crucial importance for the overall description of wind turbine loadings in wind farms. In its present version the model is confined to single wake situations. The model philosophy does, howev...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations from the small Tjæreborg wind farm have been performed, showing satisfactory agreement between predictions and measurements.

  6. Noise simulation in cone beam CT imaging with parallel computing

    International Nuclear Information System (INIS)

    Tu, S.-J.; Shaw, Chris C; Chen, Lingyun

    2006-01-01

    We developed a computer noise simulation model for cone beam computed tomography imaging using a general-purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation, and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat-panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A 24 dual-Xeon PC cluster was used to compute the projections and reconstructed images in parallel, with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat-panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low-contrast objects can be produced and used for image quality evaluation.
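
    The three noise components named here chain together naturally on an analytical projection. A hedged numpy sketch of that chain (illustrative parameters; the paper's detector model and reconstruction are far more detailed):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def noisy_projection(line_integrals, photons_in=1e5, blur_sigma=0.8,
                     additive_sigma=2.0, seed=0):
    """Mono-energetic projection noise model: quantum -> detector blur -> additive."""
    rng = np.random.default_rng(seed)
    expected = photons_in * np.exp(-line_integrals)    # Beer-Lambert attenuation
    detected = rng.poisson(expected).astype(float)     # quantum (Poisson) noise
    blurred = gaussian_filter(detected, blur_sigma)    # flat-panel detector blurring
    return blurred + rng.normal(0.0, additive_sigma, blurred.shape)  # system noise
```

Applying this to each of the 450 analytical projections before Feldkamp reconstruction reproduces the pipeline the abstract outlines.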

  7. Calibration and Validation of a Detailed Architectural Canopy Model Reconstruction for the Simulation of Synthetic Hemispherical Images and Airborne LiDAR Data

    Directory of Open Access Journals (Sweden)

    Magnus Bremer

    2017-02-01

    Canopy density measures such as the Leaf Area Index (LAI) have become standardized mapping products derived from airborne and terrestrial Light Detection And Ranging (aLiDAR and tLiDAR, respectively) data. A specific application of LiDAR point clouds is their integration into radiative transfer models (RTM) of varying complexity. Using, e.g., ray tracing, this allows flexible simulations of sub-canopy light conditions and the simulation of various sensors, such as virtual hemispherical images or waveform LiDAR, on a virtual forest plot. However, the direct use of LiDAR data in RTMs shows some limitations in the handling of noise, the derivation of surface areas per LiDAR point and the discrimination of solid and porous canopy elements. In order to address these issues, a strategy upgrading tLiDAR and Digital Hemispherical Photographs (DHP) into plausible 3D architectural canopy models is suggested. The presented reconstruction workflow creates an almost unbiased virtual 3D representation of branch and leaf surface distributions, minimizing systematic errors due to the object-sensor relationship. The models are calibrated and validated using DHPs. Using the 3D models for simulations, their capability to describe leaf density distributions and to simulate aLiDAR and DHP signatures is shown. At an experimental test site, the suitability of the models for systematically simulating and evaluating aLiDAR-based LAI predictions under various scan settings is proven. This strategy makes it possible to show the importance of laser point sampling density as well as the diversity of scan angles, and their quantitative effects on error margins.

  8. Visual acuity estimation from simulated images

    Science.gov (United States)

    Duncan, William J.

    Simulated images can provide insight into the performance of optical systems, especially those with complicated features. Many modern solutions for presbyopia and cataracts feature sophisticated power geometries or diffractive elements. Some intraocular lenses (IOLs) arrive at multifocality through the use of a diffractive surface, and multifocal contact lenses have a radially varying power profile. These types of elements induce simultaneous vision and affect vision much differently than a monofocal ophthalmic appliance. With the myriad multifocal ophthalmics available on the market, it is difficult to compare or assess performance in ways that affect wearers of such appliances. Here we present software and algorithmic metrics that can be used to qualitatively and quantitatively compare ophthalmic element performance, with specific examples of bifocal intraocular lenses (IOLs) and multifocal contact lenses. We anticipate that this study, its methods, and its results will serve as a starting point for more complex models of vision and visual acuity in settings where modeling is advantageous. Generating simulated images of real-scene scenarios is useful for patients in assessing vision quality with a certain appliance. Visual acuity estimation can serve as an important tool for the manufacturing and design of ophthalmic appliances.

  9. Medical imaging informatics simulators: a tutorial.

    Science.gov (United States)

    Huang, H K; Deshpande, Ruchi; Documet, Jorge; Le, Anh H; Lee, Jasper; Ma, Kevin; Liu, Brent J

    2014-05-01

    A medical imaging informatics infrastructure (MIII) platform is an organized method of selecting tools and synthesizing data from HIS/RIS/PACS/ePR systems with the aim of developing an imaging-based diagnosis or treatment system. Evaluation and analysis of these systems can be made more efficient by designing and implementing imaging informatics simulators. This tutorial introduces the MIII platform and provides the definition of treatment/diagnosis systems, while primarily focusing on the development of the related simulators. A medical imaging informatics (MII) simulator in this context is defined as a system integration of many selected imaging and data components from the MIII platform and clinical treatment protocols, which can be used to simulate patient workflow and data flow starting from diagnostic procedures to the completion of treatment. In these processes, DICOM and HL-7 standards, IHE workflow profiles, and Web-based tools are emphasized. From the information collected in the database of a specific simulator, evidence-based medicine can be hypothesized to choose and integrate optimal clinical decision support components. Other relevant, selected clinical resources in addition to data and tools from the HIS/RIS/PACS and ePRs platform may also be tailored to develop the simulator. These resources can include image content indexing, 3D rendering with visualization, data grid and cloud computing, computer-aided diagnosis (CAD) methods, specialized image-assisted surgical, and radiation therapy technologies. Five simulators will be discussed in this tutorial. The PACS-ePR simulator with image distribution is the cradle of the other simulators. It supplies the necessary PACS-based ingredients and data security for the development of four other simulators: the data grid simulator for molecular imaging, CAD-PACS, radiation therapy simulator, and image-assisted surgery simulator. The purpose and benefits of each simulator with respect to its clinical relevance

  10. Intelligent medical image processing by simulated annealing

    International Nuclear Information System (INIS)

    Ohyama, Nagaaki

    1992-01-01

    Image processing is widely used in the medical field and has already become very important, especially for image reconstruction purposes. In this paper, it is shown that image processing can be classified into 4 categories: passive, active, intelligent and visual image processing. These 4 classes are first explained through several examples. The results show that passive image processing does not give better results than the others. Intelligent image processing is then addressed, and the simulated annealing method is introduced. Owing to the flexibility of simulated annealing, formulated intelligence is shown to be easily introduced into an image reconstruction problem. As a practical example, 3D blood vessel reconstruction from a small number of projections, which is insufficient for conventional methods to give a good reconstruction, is proposed, and computer simulation clearly shows the effectiveness of the simulated annealing method. Prior to the conclusion, medical file systems such as IS&C (Image Save and Carry) are pointed out to have potential for formulating knowledge, which is indispensable for intelligent image processing. This paper concludes by summarizing the advantages of simulated annealing. (author)

  11. Agricultural trade liberalisation on greenhouse gas emissions. A simulation study using the GTAP-IMAGE modelling framework

    International Nuclear Information System (INIS)

    Verburg, R.; Woltjer, G.; Tabeau, A.; Eickhout, B.; Stehfest, E.

    2008-02-01

    This report explores the effects of agricultural trade liberalisation on greenhouse gas emissions and on changing commodity production areas by coupling the modelling tools GTAP (an economic model) and IMAGE (an environmental model). Four scenarios with developments up to 2050 are explored: a baseline, full liberalisation, and two partial liberalisation scenarios in which only trade barriers or only milk quotas are removed by 2015. The results indicate that liberalisation leads to a further increase in greenhouse gas emissions, adding to the increase already observed in the baseline scenario. The increase in CO2 emissions is caused by vegetation clearance due to a rapid expansion of agricultural areas in South America and South East Asia. Increased methane emissions are also calculated in these areas, caused by less intensive cattle farming. Global production of the commodities milk, dairy and beef does not change between full liberalisation and the baseline, but clear shifts from North America and Europe to South America and South East Asia are expected.

  12. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state of the art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of using computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experimental needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); progress status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  13. From 4D Medical Images (CT, MRI, and Ultrasound to 4D Structured Mesh Models of the Left Ventricular Endocardium for Patient-Specific Simulations

    Directory of Open Access Journals (Sweden)

    Federico Canè

    2018-01-01

    With cardiovascular disease (CVD) remaining the primary cause of death worldwide, early detection of CVDs becomes essential. The intracardiac flow is an important component of ventricular function, motion kinetics, wash-out of ventricular chambers, and ventricular energetics. Coupling between computational fluid dynamics (CFD) simulations and medical images can play a fundamental role in patient-specific diagnostic tools. From a technical perspective, CFD simulations with moving boundaries can easily lead to negative-volume errors and the sudden failure of the simulation. The generation of high-quality 4D meshes (3D in space + time) with 1-to-1 vertex correspondence becomes essential for performing a CFD simulation with moving boundaries. In this context, we developed a semiautomatic morphing tool able to create 4D high-quality structured meshes starting from a segmented 4D dataset. To prove its versatility and efficiency, the method was tested on three different 4D datasets (ultrasound, MRI, and CT) by evaluating the quality and accuracy of the resulting 4D meshes. Furthermore, an estimation of some physiological quantities is accomplished for the 4D CT reconstruction. Future research will aim at extending the region of interest, further automating the meshing algorithm, and generating structured hexahedral mesh models for both the blood and the myocardial volume.

  14. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    leaving students. It is a probabilistic model. In the next part of this article, two more models, an 'input/output model' used for production systems or economic studies and a 'discrete event simulation model', are introduced. Aircraft Performance Model.

  15. Model simulations of line-of-sight effects in airglow imaging of acoustic and fast gravity waves from ground and space

    Science.gov (United States)

    Aguilar Guerrero, J.; Snively, J. B.

    2017-12-01

    Acoustic waves (AWs) have been predicted to be detectable by imaging systems for the OH airglow layer [Snively, GRL, 40, 2013], and have been identified in spectrometer data [Pilger et al., JASP, 104, 2013]. AWs are weak in the mesopause region, but can attain large amplitudes in the F region [Garcia et al., GRL, 40, 2013] and have local impacts on the thermosphere and ionosphere. Similarly, fast GWs, with phase speeds over 100 m/s, may propagate to the thermosphere and impart significant local body forcing [Vadas and Fritts, JASTP, 66, 2004]. Both have been clearly identified in ionospheric total electron content (TEC), such as following the 2013 Moore, OK, EF5 tornado [Nishioka et al., GRL, 40, 2013] and following the 2011 Tohoku-Oki tsunami [e.g., Galvan et al., RS, 47, 2012, and references therein], but AWs have yet to be unambiguously imaged in MLT data and fast GWs have low amplitudes near the threshold of detection; nevertheless, recent imaging systems have sufficient spatial and temporal resolution and sensitivity to detect both AWs and fast GWs with short periods [e.g., Pautet et al., AO, 53, 2014]. The associated detectability challenges are related to the transient nature of their signatures and to systematic challenges due to line-of-sight (LOS) effects such as enhancements and cancelations due to integration along aligned or oblique wavefronts and geometric intensity enhancements. We employ a simulated airglow imager framework that incorporates 2D and 3D emission rate data and performs the necessary LOS integrations for synthetic imaging from ground- and space-based platforms to assess relative intensity and temperature perturbations. We simulate acoustic and fast gravity wave perturbations to the hydroxyl layer from a nonlinear, compressible model [e.g., Snively, 2013] for different idealized and realistic test cases. The results show clear signal enhancements when acoustic waves are imaged off-zenith or off-nadir and the temporal evolution of these

  16. New developments in simulating X-ray phase contrast imaging

    International Nuclear Information System (INIS)

    Peterzol, A.; Berthier, J.; Duvauchelle, P.; Babot, D.; Ferrero, C.

    2007-01-01

    A deterministic algorithm simulating phase contrast (PC) X-ray images for complex 3-dimensional (3D) objects is presented. This algorithm has been implemented in a simulation code named VXI (Virtual X-ray Imaging). The physical model chosen to account for the PC technique is based on the Fresnel-Kirchhoff diffraction theory. The algorithm consists mainly of two parts. The first one exploits the VXI ray-tracing approach to compute the object transmission function. The second part simulates the PC image due to the wavefront distortion introduced by the sample. In the first part, the use of computer-aided design (CAD) models enables simulations to be carried out with complex 3D objects. Unlike the original version of VXI, which makes use of an object description via triangular facets, the new code requires a more 'sophisticated' object representation based on Non-Uniform Rational B-Splines (NURBS). As a first step we produce a high-spatial-resolution image by using a point, monochromatic source and an ideal detector. To simulate the polychromatic case, the intensity image is integrated over the considered X-ray energy spectrum. Then, in order to account for the spatial resolution properties of the system, the high-resolution image (mono- or polychromatic) is convolved with the total point spread function of the imaging system under consideration. The results supplied by the presented algorithm are examined with the help of some relevant examples. (authors)

  17. Monte Carlo simulations of medical imaging modalities

    Energy Technology Data Exchange (ETDEWEB)

    Estes, G.P. [Los Alamos National Lab., NM (United States)

    1998-09-01

    Because continuous-energy Monte Carlo radiation transport calculations can be nearly exact simulations of physical reality (within data limitations, geometric approximations, transport algorithms, etc.), it follows that one should be able to closely approximate the results of many experiments from first-principles computations. This line of reasoning has led to various MCNP studies that involve simulations of medical imaging modalities and other visualization methods such as radiography, Anger camera, computerized tomography (CT) scans, and SABRINA particle track visualization. It is the intent of this paper to summarize some of these imaging simulations in the hope of stimulating further work, especially as computer power increases. Improved interpretation and prediction of medical images should ultimately lead to enhanced medical treatments. It is also reasonable to assume that such computations could be used to design new or more effective imaging instruments.

  18. Simulating Galaxies and Active Galactic Nuclei in the LSST Image Simulation Effort

    NARCIS (Netherlands)

    Pizagno II, Jim; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A.; Chang, C.; Gibson, R. R.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, S. K.; Lorenz, S.; Marshall, S.; Shmakova, S. M.; Sylvestri, N.; Todd, N.; Young, M.

    We present an extragalactic source catalog, which includes galaxies and Active Galactic Nuclei, that is used for the Large Synoptic Survey Telescope (LSST) Imaging Simulation effort. The galaxies are taken from the De Lucia et al. (2006) semi-analytic modeling (SAM) of the Millennium Simulation. The LSST

  19. 3D Rapid Prototyping for Otolaryngology—Head and Neck Surgery: Applications in Image-Guidance, Surgical Simulation and Patient-Specific Modeling

    Science.gov (United States)

    Chan, Harley H. L.; Siewerdsen, Jeffrey H.; Vescan, Allan; Daly, Michael J.; Prisman, Eitan; Irish, Jonathan C.

    2015-01-01

    The aim of this study was to demonstrate the role of advanced fabrication technology across a broad spectrum of head and neck surgical procedures, including applications in endoscopic sinus surgery, skull base surgery, and maxillofacial reconstruction. The initial case studies demonstrated three applications of rapid prototyping technology in head and neck surgery: i) a mono-material paranasal sinus phantom for endoscopy training; ii) a multi-material skull base simulator; and iii) 3D patient-specific mandible templates. Digital processing of these phantoms is based on real patient or cadaveric 3D images such as CT or MRI data. Three endoscopic sinus surgeons examined the realism of the endoscopist training phantom. One experienced endoscopic skull base surgeon conducted advanced sinus procedures on the high-fidelity multi-material skull base simulator. Ten patients participated in a prospective clinical study examining patient-specific modeling for mandibular reconstructive surgery. Qualitative feedback to assess the realism of the endoscopy training phantom and high-fidelity multi-material phantom was acquired. Conformance comparisons using assessments from the blinded reconstructive surgeons measured the geometric performance between intra-operative and pre-operative reconstruction mandible plates. Both the endoscopy training phantom and the high-fidelity multi-material phantom received positive feedback on the realistic structure of the phantom models. Results suggested that further improvement of the soft tissue structure of the phantom models is necessary. In the patient-specific mandible template study, the pre-operative plates were judged by two blinded surgeons as providing optimal conformance in 7 out of 10 cases. No statistical differences were found in plate fabrication time and conformance, with pre-operative plating providing the advantage of reducing time spent in the operating room. The applicability of common model design and fabrication techniques

  20. The temporal-spatial dynamics of feature maps during monocular deprivation revealed by chronic imaging and self-organization model simulation.

    Science.gov (United States)

    Tong, Lei; Xie, Yang; Yu, Hongbo

    2016-12-17

    Experiments on the adult visual cortex of cats, ferrets and monkeys have revealed organized spatial relationships between multiple feature maps which can also be reproduced by the Kohonen and elastic net self-organization models. However, attempts to apply these models to simulate the temporal kinetics of monocular deprivation (MD) during the critical period, and their effects on the spatial arrangement of feature maps, have led to conflicting results. In this study, we performed MD and chronic imaging in the ferret visual cortex during the critical period of ocular dominance (OD) plasticity. We also used the Kohonen model to simulate the effects of MD on OD and orientation map development. Both the experiments and simulations demonstrated two general parameter-insensitive findings. Specifically, our first finding demonstrated that the OD index shift resulting from MD, and its subsequent recovery during binocular vision (BV), were both nonlinear, with a significantly stronger shift occurring during the initial period. Meanwhile, spatial reorganization of feature maps led to globally unchanged but locally shifted map patterns. In detail, we found that the periodicity of OD and orientation maps remained unchanged during, and after, deprivation. Relationships between OD and orientation maps remained similar but were significantly weakened due to OD border shifts. These results indicate that orthogonal gradient relationships between maps may be preset and are only mildly modifiable during the critical period. The Kohonen model was able to reproduce these experimental results, hence its role is further extended to the description of cortical feature map dynamics during development.
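A minimal sketch of the Kohonen update used by such models, with a toy three-dimensional feature space (retinotopy plus ocular dominance; orientation omitted) and all parameter values illustrative rather than taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 32                                   # cortical sheet of units
# weight per unit: (retinotopic x, retinotopic y, ocular dominance)
weights = rng.uniform(-0.05, 0.05, (H, W, 3))
weights[..., 0] += np.linspace(0, 1, W)      # rough initial retinotopy
weights[..., 1] += np.linspace(0, 1, H)[:, None]
ys, xs = np.mgrid[0:H, 0:W]

def train(n_steps, od_lo=-1.0, od_hi=1.0, lr=0.05, sigma=2.0):
    """One Kohonen update per step: draw a stimulus, find the
    best-matching unit, pull its neighborhood toward the stimulus."""
    for _ in range(n_steps):
        s = np.array([rng.uniform(0, 1), rng.uniform(0, 1),
                      rng.uniform(od_lo, od_hi)])
        d2 = ((weights - s) ** 2).sum(axis=-1)
        by, bx = np.unravel_index(np.argmin(d2), d2.shape)
        g = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
        weights[...] += lr * g[..., None] * (s - weights)

train(20000)                     # binocular development
od_before = weights[..., 2].mean()
train(5000, od_lo=-0.2)          # crude MD: stimuli biased to one eye
print(od_before, weights[..., 2].mean())   # OD index shifts toward open eye
```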

  1. 3D Rapid Prototyping for Otolaryngology-Head and Neck Surgery: Applications in Image-Guidance, Surgical Simulation and Patient-Specific Modeling.

    Directory of Open Access Journals (Sweden)

    Harley H L Chan

    Full Text Available The aim of this study was to demonstrate the role of advanced fabrication technology across a broad spectrum of head and neck surgical procedures, including applications in endoscopic sinus surgery, skull base surgery, and maxillofacial reconstruction. The initial case studies demonstrated three applications of rapid prototyping technology are in head and neck surgery: i a mono-material paranasal sinus phantom for endoscopy training ii a multi-material skull base simulator and iii 3D patient-specific mandible templates. Digital processing of these phantoms is based on real patient or cadaveric 3D images such as CT or MRI data. Three endoscopic sinus surgeons examined the realism of the endoscopist training phantom. One experienced endoscopic skull base surgeon conducted advanced sinus procedures on the high-fidelity multi-material skull base simulator. Ten patients participated in a prospective clinical study examining patient-specific modeling for mandibular reconstructive surgery. Qualitative feedback to assess the realism of the endoscopy training phantom and high-fidelity multi-material phantom was acquired. Conformance comparisons using assessments from the blinded reconstructive surgeons measured the geometric performance between intra-operative and pre-operative reconstruction mandible plates. Both the endoscopy training phantom and the high-fidelity multi-material phantom received positive feedback on the realistic structure of the phantom models. Results suggested further improvement on the soft tissue structure of the phantom models is necessary. In the patient-specific mandible template study, the pre-operative plates were judged by two blinded surgeons as providing optimal conformance in 7 out of 10 cases. No statistical differences were found in plate fabrication time and conformance, with pre-operative plating providing the advantage of reducing time spent in the operation room. The applicability of common model design and

  2. An efficient simulator for pinhole imaging of PET isotopes.

    Science.gov (United States)

    Goorden, M C; van der Have, F; Kreuger, R; Beekman, F J

    2011-03-21

    Today, small-animal multi-pinhole single photon emission computed tomography (SPECT) can reach sub-half-millimeter image resolution. Recently we have shown that dedicated multi-pinhole collimators can also image PET tracers at sub-mm level. Simulations play a vital role in the design and optimization of such collimators. Here we propose and validate an efficient simulator that models the whole imaging chain from emitted positron to detector signal. This analytical simulator for pinhole positron emission computed tomography (ASPECT) combines analytical models for pinhole and detector response with Monte Carlo (MC)-generated kernels for positron range. Accuracy of ASPECT was validated by means of a MC simulator (MCS) that uses a kernel-based step for detector response with an angle-dependent detector kernel based on experiments. Digital phantom simulations with ASPECT and MCS converge to almost identical images. However, ASPECT converges to an equal image noise level three to four orders of magnitude faster than MCS. We conclude that ASPECT could serve as a practical tool in collimator design and iterative image reconstruction for novel multi-pinhole PET.
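The kernel-based idea behind such analytical simulators can be sketched compactly. Everything below (Gaussian kernels, single-pinhole parallel geometry, parameter values) is an illustrative simplification of the chain described, which uses MC-generated positron-range kernels and measured, angle-dependent detector kernels:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def pinhole_projection(activity, px_mm, mag, range_fwhm_mm, det_fwhm_mm):
    """Kernel-based projection chain: blur the activity map by a
    positron-range kernel (Gaussian stand-in for an MC-generated kernel),
    form ideal line integrals, magnify through the pinhole, then apply
    the detector response kernel."""
    blurred = gaussian_filter(activity, range_fwhm_mm / 2.355 / px_mm)
    proj = blurred.sum(axis=0)               # ideal parallel projection
    proj = zoom(proj, mag, order=1)          # pinhole magnification
    # after zooming by the magnification, sample spacing in the detector
    # plane equals the original px_mm, so the detector kernel sigma is
    # expressed in those samples
    return gaussian_filter(proj, det_fwhm_mm / 2.355 / px_mm)

act = np.zeros((128, 128))
act[60:68, 60:68] = 1.0                      # 0.8 mm square hot region
img = pinhole_projection(act, px_mm=0.1, mag=2.0,
                         range_fwhm_mm=0.54,  # F-18-like blur, illustrative
                         det_fwhm_mm=1.0)
print(img.shape, img.max())
```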

  3. Computer simulation and image guidance for individualised dynamic spinal stabilization.

    Science.gov (United States)

    Kantelhardt, S R; Hausen, U; Kosterhon, M; Amr, A N; Gruber, K; Giese, A

    2015-08-01

    Dynamic implants for the human spine are used to re-establish regular segmental motion. However, the results have often been unsatisfactory and complications such as screw loosening are common. Individualisation of appliances and precision implantation are needed to improve the outcome of this procedure. Computer simulation, virtual implant optimisation and image guidance were used to improve the technique. A human lumbar spine computer model was developed using multi-body simulation software. The model simulates spinal motion under load and degenerative changes. After virtual degeneration of a L4/5 segment, virtual pedicle screw-based implants were introduced. The implants' positions and properties were iteratively optimised. The resulting implant positions were used as operative plan for image guidance and finally implemented in a physical spine model. In the simulation, the introduction and optimisation of virtually designed dynamic implants could partly compensate for the effects of virtual lumbar segment degeneration. The optimised operative plan was exported to two different image-guidance systems for transfer to a physical spine model. Three-dimensional computer graphic simulation is a feasible means to develop operative plans for dynamic spinal stabilization. These operative plans can be transferred to commercially available image-guidance systems for use in implantation of physical implants in a spine model. This concept has important potential in the design of operative plans and implants for individualised dynamic spine stabilization surgery.

  4. Quantification and reduction of the collimator-detector response effect in SPECT by applying a system model during iterative image reconstruction: a simulation study.

    Science.gov (United States)

    Kalantari, Faraz; Rajabi, Hossein; Saghari, Mohsen

    2012-03-01

    Detector blurring and non-ideal collimation decrease the spatial resolution of single-photon emission computed tomography (SPECT) images. Iterative reconstruction algorithms such as ordered subsets expectation maximization (OSEM) can incorporate degrading factors during reconstruction. We investigated the quantitative errors associated with poor SPECT resolution and evaluated the importance of two-dimensional (2D) and three-dimensional (3D) resolution recovery by modelling the system response during iterative image reconstruction. Different phantoms, consisting of the NURBS-based cardiac-torso (NCAT) liver phantom with small tumors, the Zubal brain phantom and the NCAT heart phantom, were used in this study. Monte Carlo simulation was used to create SPECT projections. Gaussian functions were used to model the collimator-detector response (CDR). Modeled CDRs were applied during OSEM. Both noise-free and noisy projections were created. Even with noise-free projections, the conventional OSEM algorithm provided limited quantitative accuracy compared to both 2D and 3D resolution recovery. The 3D implementation of resolution recovery, however, yielded superior results compared to its 2D implementation. For the liver phantom, the ability to distinguish small tumors in both transverse and axial planes was improved. For the brain phantom, the gray-to-white matter activity ratio was increased from 3.14 ± 0.04 in simple OSEM to 3.84 ± 0.06 in 3D OSEM. For the NCAT heart phantom, 3D resolution recovery resulted in images with thinner walls and higher contrast for different noise levels. There are considerable quantitative errors associated with the CDR, especially when the size of the target is comparable with the spatial resolution of the system. Among the different reconstruction algorithms, 3D OSEM, which considers the 3D nature of the CDR, improves both the visual quality and the quantitative accuracy of SPECT studies.
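The essence of modelling the CDR during reconstruction is that the same blur enters both the forward and back projections of every iteration. A compact 2D MLEM sketch with a rotate-and-sum projector and a depth-independent Gaussian CDR (the study itself models depth-dependent 2D/3D kernels within OSEM, so this is a simplified illustration):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, rotate

def forward(img, angles, fwhm_pix):
    """Rotate-and-sum projector with a Gaussian CDR blur per projection."""
    sig = fwhm_pix / 2.355
    return np.stack([gaussian_filter(
        rotate(img, a, reshape=False, order=1).sum(axis=0), sig)
        for a in angles])

def backward(sino, angles, shape, fwhm_pix):
    """Matching back projector: blur, smear along rays, rotate back."""
    sig = fwhm_pix / 2.355
    out = np.zeros(shape)
    for p, a in zip(sino, angles):
        smear = np.tile(gaussian_filter(p, sig), (shape[0], 1))
        out += rotate(smear, -a, reshape=False, order=1)
    return out

def mlem(sino, angles, shape, fwhm_pix, n_iter=20):
    img = np.ones(shape)
    norm = backward(np.ones_like(sino), angles, shape, fwhm_pix)
    for _ in range(n_iter):
        est = forward(img, angles, fwhm_pix)
        ratio = np.where(est > 0, sino / est, 0.0)
        img *= backward(ratio, angles, shape, fwhm_pix) / (norm + 1e-9)
    return img

# toy phantom: two small hot rods on a warm background
ph = np.full((64, 64), 0.1)
ph[30:34, 20:24] = 1.0
ph[30:34, 44:48] = 1.0
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sino = forward(ph, angles, fwhm_pix=3.0)
rec = mlem(sino, angles, ph.shape, fwhm_pix=3.0, n_iter=30)
```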

  5. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    Microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image, and the underlying...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface...

  6. Modelling and simulation of pixelated photon counting X-ray detectors for imaging

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Juergen

    2008-07-22

    First, the physics processes generating the energy deposition in the sensor volume are investigated. The spatial resolution limits of photon interactions and the range of secondary electrons are discussed. The signatures in the energy deposition spectrum in pixelated detectors with direct conversion layers are described. The energy deposition for single events can be generated by the Monte Carlo simulation package ROSI. The basic interactions of photons with matter are evaluated, resulting in the ability to use ROSI as a basis for the simulation of photon counting pixel detectors with direct conversion. In the context of this thesis, a detector class is developed to simulate the response of hybrid photon counting pixel detectors using high-Z sensor materials like Cadmium Telluride (CdTe) or Gallium Arsenide (GaAs) in addition to silicon. To enable the realisation of such a simulation, the relevant physics processes and properties have to be implemented: processes in the sensor layer (provided by EGS4/LSCAT in ROSI), generation of charge carriers as electron-hole pairs, diffusion and repulsion of charge carriers during drift and lifetime. Furthermore, several noise contributions of the electronics can be taken into account. The result is a detector class which allows the simulation of photon counting detectors. In this thesis, the multiplicity framework is developed, including a formula to calculate or measure the zero-frequency detective quantum efficiency (DQE). To enable the measurement of the multiplicity of detected events, a cluster analysis program was developed. Random and systematic errors introduced by the cluster analysis are discussed. It is also shown that the cluster analysis method can be used to determine the averaged multiplicity with high accuracy. The method is applied to experimental data. As an example using the implemented detector class, the discriminator threshold dependency of the DQE and modulation transfer function is investigated in

  7. Modelling and Simulation: An Overview

    OpenAIRE

    McAleer, Michael; Chan, Felix; Oxley, Les

    2013-01-01

    This discussion paper resulted in a publication in 'Selected Papers of the MSSANZ 19th Biennial Conference on Modelling and Simulation Mathematics and Computers in Simulation', 2013, pp. viii. The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal: the emp...

  8. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  9. Image and Dose Simulation in Support of New Imaging Modalities

    International Nuclear Information System (INIS)

    Kuruvilla Verghese

    2002-01-01

    This report summarizes the highlights of the research performed under the 2-year NEER grant from the Department of Energy. The primary outcome of the work was a new Monte Carlo code, MCMIS-DS (Monte Carlo for Mammography Image Simulation including Differential Sampling). The code was written to generate simulated images and dose distributions from two different new digital x-ray imaging modalities, namely, synchrotron imaging (SI) and a slot geometry digital mammography system called Fisher Senoscan. A differential sampling scheme was added to the code to generate multiple images that included variations in the parameters of the measurement system and the object in a single execution of the code. The code is to serve multiple purposes: (1) to answer questions regarding the contribution of scattered photons to images, (2) for use in design optimization studies, and (3) to do up to second-order perturbation studies to assess the effects of design parameter variations and/or physical parameters of the object (the breast) without having to re-run the code for each set of varied parameters. The accuracy and fidelity of the code were validated by a large variety of benchmark studies using published data and also using experimental results from mammography phantoms on both imaging modalities.

  10. Bubble-Bed Structural Models for Hybrid Flow Simulation: An Outlook Based on a CFD-Generated Flow Image

    Czech Academy of Sciences Publication Activity Database

    Staykov, P.; Fialová, Marie; Vlaev, S. D.

    2009-01-01

    Vol. 87, No. 5 (2009), pp. 669-676. ISSN 0263-8762. Institutional research plan: CEZ:AV0Z40720504. Keywords: bubble column * hybrid modeling * phenomenological mixing models. Subject RIV: CI - Industrial Chemistry, Chemical Engineering. Impact factor: 1.223, year: 2009

  11. AUTOMATIC INTERPRETATION OF HIGH RESOLUTION SAR IMAGES: FIRST RESULTS OF SAR IMAGE SIMULATION FOR SINGLE BUILDINGS

    Directory of Open Access Journals (Sweden)

    J. Tao

    2012-09-01

    Due to its all-weather data acquisition capability, high resolution spaceborne Synthetic Aperture Radar (SAR) plays an important role in remote sensing applications like change detection. However, because of the complex geometric mapping of buildings in urban areas, SAR images are often hard to interpret. SAR simulation techniques ease the visual interpretation of SAR images, while fully automatic interpretation is still a challenge. This paper presents a method for supporting the interpretation of high resolution SAR images with simulated radar images using a LiDAR digital surface model (DSM). Line features are extracted from the simulated and real SAR images and used for matching. A single building model is generated from the DSM and used for building recognition in the SAR image. An application of the concept is presented for the city centre of Munich, where comparison of the simulation with the TerraSAR-X data shows good similarity. Based on the result of simulation and matching, special features (e.g., double-bounce lines and shadow areas) can be automatically indicated in the SAR image.

  12. Computational Intelligence for Medical Imaging Simulations.

    Science.gov (United States)

    Chang, Victor

    2017-11-25

    This paper describes how to simulate medical imaging by computational intelligence to explore areas that cannot easily be reached by traditional methods, including gene and protein simulations related to cancer development and immunity. This paper has presented simulations and virtual inspections of BIRC3, BIRC6, CCL4, KLKB1 and CYP2A6 with their outputs and explanations, as well as brain segment intensity due to dancing. Our proposed MapReduce framework with the fusion algorithm can simulate medical imaging. The concept is similar to digital surface theories, simulating how biological units aggregate into bigger units, up to the formation of the entire biological subject. The M-Fusion and M-Update functions of the fusion algorithm achieve good performance, processing and visualizing up to 40 GB of data within 600 s. We conclude that computational intelligence can support effective and efficient healthcare research through simulation and visualization.

  13. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
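The core of such a deterministic code reduces to attenuation-law line integrals per detector pixel, with photon statistics attached analytically afterwards to form CNR maps. A parallel-beam toy sketch; geometry, materials and counts are illustrative, whereas the real code traces rays through CAD models:

```python
import numpy as np

def radiograph(mu_map, dx, I0=1e5):
    """Deterministic parallel-beam radiograph of a 2D attenuation map:
    Beer-Lambert law, I = I0 * exp(-integral of mu along the ray),
    integrated column-wise. Returns noise-free detected intensity."""
    path = mu_map.sum(axis=0) * dx          # line integral of mu
    return I0 * np.exp(-path)

# 2 cm slab (mu = 0.5 /cm) containing a 2 mm air gap as the 'defect'
n, dx = 200, 0.01                           # 100 um pixels, cm units
mu = np.full((n, n), 0.5)
mu[90:110, 95:105] = 0.0
I = radiograph(mu, dx)
I_bg = radiograph(np.full((n, n), 0.5), dx)

# the image itself carries no photon noise, but Poisson statistics can
# be attached analytically: variance = I for a photon-counting detector
cnr = np.abs(I - I_bg) / np.sqrt(I_bg)
print(f"peak CNR over the defect: {cnr.max():.1f}")
```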

  14. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests

  15. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state-of-the-art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease. The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  16. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operati...

  17. Ultrawideband imaging radar based on OFDM: system simulation analysis

    Science.gov (United States)

    Garmatyuk, Dmitriy

    2006-05-01

    Orthogonal frequency-division multiplexing (OFDM) is rapidly emerging as a preferred method of UWB signaling in commercial applications aimed mainly at low-power, high data-rate communications. This paper explores the possibility of applying OFDM to imaging radar technology. The ultra-wideband nature of the signal provides for high resolution of the radar, whereas the multi-subcarrier modulation method allows for dynamic spectrum allocation. Robust multi-path performance of OFDM signals and heavy reliance of transceiver design on digital processors easily implemented in modern VLSI technology make a number of possible applications viable, e.g.: portable high-resolution indoor radar/movement monitoring system; through-the-wall/foliage synthetic aperture imaging radar with a capability of image transmission/broadcasting, etc. Our work aims to provide a proof-of-concept simulation scenario to explore numerous aspects of UWB-OFDM radar imaging through evaluating range and cross-range imaging performance of such a system, with an eventual goal of software-defined radio (SDR) implementation. Stripmap SAR topology was chosen for modeling purposes. Range/cross-range profiles were obtained along with full 2-D images for multi-target in noise scenarios. Model set-up and results of a UWB-OFDM radar imaging simulation study using Matlab/Simulink are presented and discussed in this paper.
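The range-imaging part of such a system can be illustrated in a few lines: build an OFDM pulse from phase-modulated subcarriers, delay-and-sum the target echoes, and matched-filter to obtain a range profile. All parameters are illustrative, and cross-range (SAR) processing is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1e9                         # sample rate; bandwidth N*df = 125 MHz
n_fft, N = 1024, 128             # samples per pulse, active subcarriers
df = fs / n_fft
t = np.arange(n_fft) / fs

# OFDM pulse: QPSK symbols on N subcarriers (dynamic spectrum allocation
# amounts to zeroing out forbidden subcarriers)
sym = np.exp(1j * (np.pi / 2) * rng.integers(0, 4, N))
pulse = sum(s * np.exp(2j * np.pi * (k + 1) * df * t)
            for k, s in enumerate(sym)) / np.sqrt(N)

# echoes from two point targets (two-way delay = 2R/c), plus noise
c = 3e8
echo = np.zeros_like(pulse)
for R, a in [(15.0, 1.0), (18.0, 0.6)]:
    d = int(round(2 * R / c * fs))
    echo[d:] += a * pulse[:n_fft - d]
echo += 0.05 * (rng.standard_normal(n_fft) + 1j * rng.standard_normal(n_fft))

# matched filter -> range profile; resolution ~ c / (2 N df) = 1.2 m here
mf = np.correlate(echo, pulse, mode="full")[n_fft - 1:]
r = np.arange(len(mf)) * c / (2 * fs)
print(f"strongest return near {r[np.abs(mf).argmax()]:.1f} m")
```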

  18. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context.

  19. Utilization of MATLAB for Digital Image Transmission Simulation.

    Directory of Open Access Journals (Sweden)

    T. Kratochvil

    2003-12-01

    The paper deals with the utilization of Matlab for simulation and analysis of digital image transmission and transmission distortions in the DTV (Digital Television) and DVB (Digital Video Broadcasting) area. The simulation model that covers selected phenomena of DVB standard baseband signal processing applied in Matlab is presented and features of the protection against transmission errors are outlined. The practical results of FEC (Forward Error Correction) codes efficiency are presented and at the end the GUI application for experimental simulation and education is outlined with a simulation example.

  20. Simulation - modeling - experiment

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); progress of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  1. Imaging Simulations for the Korean VLBI Network (KVN)

    Directory of Open Access Journals (Sweden)

    Tae-Hyun Jung

    2005-03-01

    The Korean VLBI Network (KVN) will open a new field of research in astronomy, geodesy and earth science using three new 21 m radio telescopes. This will expand our ability to look at the Universe in the millimeter regime. The imaging capability of radio interferometry is highly dependent upon the antenna configuration, source size, declination and the shape of the target. In this paper, imaging simulations are carried out with the KVN system configuration. Five test images were used: a point source, multiple point sources, uniform spheres of two different sizes relative to the synthesized beam of the KVN, and a Very Large Array (VLA) image of Cygnus A. The declination for the full-time simulation was set to +60 degrees and the observation time range was -6 to +6 hours around transit. Simulations have been done at 22 GHz, one of the KVN observing frequencies. All these simulations and data reductions have been run with the Astronomical Image Processing System (AIPS) software package. As the KVN array has a resolution of about 6 mas (milliarcseconds) at 22 GHz, when the model source is approximately the beam size or smaller, the ratio of peak intensity to RMS is about 10000:1 and 5000:1. When the model source is larger than the beam size, this ratio drops to a much lower range of about 115:1 and 34:1. This is due to the lack of short baselines and the small number of antennas. We compare the coordinates of the model images with those of the cleaned images. The result shows nearly perfect correspondence except in the case of the 12 mas uniform sphere. Therefore, the main astronomical targets for the KVN will be compact sources, and the KVN will have excellent astrometric performance for these sources.

  2. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence, e.g. ATR, to aid them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase the mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information, i.e. scenario conditions like class type and position of targets, is necessary for the optimal adaptation of the ATR method. In Summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation by simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that, the most promising tool was benchmarked according to several criteria, e.g. thermal emission model, sensor model, target model, non-radiometric image features etc., resulting in a

  3. Simulation data mapping in virtual cardiac model.

    Science.gov (United States)

    Jiquan, Liu; Jingyi, Feng; Duan, Huilong; Siping, Chen

    2004-01-01

    Although 3D heart and torso models with realistic geometry are the basis of simulation computation in the LFX virtual cardiac model, the simulation results are mostly output in 2D format. To solve this problem and enhance the virtual reality of the LFX virtual cardiac model, methods of voxel mapping and vertex projection mapping are presented. With these methods, the excitation isochrone map (EIM) was mapped from the realistic-geometry heart model to the real Visible Man heart model, and the body surface potential map (BSPM) was mapped from the realistic-geometry torso model to the real Visible Man body surface. By visualizing in 4Dview, a real-time 3D medical image visualization platform, visualization results of the EIM and BSPM simulation data before and after mapping are also provided. According to the visualization results, the output format of the EIM and BSPM simulation data of the LFX virtual cardiac model was extended from 2D to 4D (spatio-temporal) and from the realistic-geometry cardiac model to the real cardiac model, and more realistic and effective simulation was achieved.

  4. Modeling and simulation of gamma camera

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2002-08-01

    Simulation techniques play a vital role in the design of sophisticated instruments and in the training of operating and maintenance staff. Gamma camera systems have been used for functional imaging in nuclear medicine. Functional images are derived from the external counting of a gamma-emitting radioactive tracer that, after introduction into the body, mimics the behavior of a native biochemical compound. The position-sensitive detector yields the coordinates of the gamma ray interaction with the detector, which are used to estimate the point of gamma ray emission within the tracer distribution space. This advanced imaging device is thus dependent on the performance of algorithms for coordinate computation, estimation of the point of emission, image generation and display of the image data. Contemporary systems also have protocols for quality control and clinical evaluation of imaging studies. Simulation of this processing leads to understanding of the basic camera design problems. This report describes a PC-based package for the design and simulation of a gamma camera, along with options for simulating data acquisition and quality control of imaging studies. Image display and data processing, the other options implemented in SIMCAM, will be described in separate reports (under preparation). Gamma camera modeling and simulation in SIMCAM has preset configurations of the design parameters for various crystal detector sizes, with the option to pack the PMTs on a hexagonal or square lattice. Different algorithms for coordinate computation and spatial distortion removal are allowed, in addition to simulation of the energy correction circuit. The user can simulate different static, dynamic, MUGA and SPECT studies. The acquired/simulated data are processed for quality control and clinical evaluation of the imaging studies. Results show that the program can be used to assess these performances. Also, the variations in performance parameters can be assessed due to the induced

  5. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    Science.gov (United States)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer models of the ladar chain to predict the performance of the ladar system. This paper reviews domestic and international developments in laser imaging radar simulation and studies of computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar system simulation has been characterized by small simulation scale, non-unified designs, and applications mostly limited to simple function simulation based on ladar ranging equations. A laser imaging radar simulation design with an open and modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated function inputs and outputs and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was carried out with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection. Its open structure enables the toolbox to be modified for specialized requirements. The modularization gives simulations flexibility.
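The "simple function simulation based on ladar ranging equations" mentioned above usually starts from the laser radar range equation. A short sketch of that core module; the formula variant and all numbers are illustrative assumptions, and this is Python rather than the Matlab toolbox described:

```python
import numpy as np

def received_power(P_t, R, rho, D_rx, eta_t=0.9, eta_r=0.8, T_atm=0.9):
    """One common form of the ladar range equation for an extended
    Lambertian target that fills the beam:
        P_r = P_t * eta_t * T^2 * (rho / pi) * (A_rx / R^2) * eta_r
    Exact factors vary with geometry; values here are illustrative."""
    A_rx = np.pi * D_rx ** 2 / 4.0
    return P_t * eta_t * T_atm ** 2 * (rho / np.pi) * (A_rx / R ** 2) * eta_r

# illustrative numbers: 1 mJ in a 10 ns pulse -> 1e5 W peak power,
# 10 cm receive aperture, 0.3 target reflectivity, 1.55 um wavelength
R = np.logspace(2, 4, 50)                       # ranges 100 m .. 10 km
P_r = received_power(1e5, R, rho=0.3, D_rx=0.1)
E_photon = 6.626e-34 * 3e8 / 1.55e-6
photons = P_r * 10e-9 / E_photon                # photons per range gate
print(f"photons at 1 km: {photons[np.argmin(np.abs(R - 1e3))]:.2e}")
```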

  6. Biocomputing: numerical simulation of glioblastoma growth using diffusion tensor imaging

    Science.gov (United States)

    Bondiau, Pierre-Yves; Clatz, Olivier; Sermesant, Maxime; Marcy, Pierre-Yves; Delingette, Herve; Frenay, Marc; Ayache, Nicholas

    2008-02-01

    Glioblastoma multiforme (GBM) is one of the most aggressive tumors of the central nervous system. It can be represented by two components: a proliferative component with a mass effect on brain structures and an invasive component. GBM has a distinct pattern of spread showing a preferential growth in the white fiber direction for the invasive component. By using the architecture of white matter fibers, we propose a new model to simulate the growth of GBM. This architecture is estimated by diffusion tensor imaging in order to determine the preferred direction for the diffusion component. It is then coupled with a mechanical component. To set up our growth model, we make a brain atlas including brain structures with a distinct response to tumor aggressiveness, white fiber diffusion tensor information and elasticity. In this atlas, we introduce a virtual GBM with a mechanical component coupled with a diffusion component. These two components are complementary, and can be tuned independently. Then, we tune the parameter set of our model on a patient MRI. We compared simulated growth (initialized with the patient MRI) with the growth observed six months later. The average and the odds ratio of image difference between observed and simulated images are computed. Displacements of reference points are compared to those simulated by the model. The results of our simulation have shown a good correlation with tumor growth, as observed on the patient MRI. Different tumor aggressiveness can also be simulated by tuning additional parameters. This work has demonstrated that modeling the complex behavior of brain tumors is feasible, and supports further validation of this new conceptual approach.
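The invasive component of such models is commonly written as an anisotropic reaction-diffusion equation whose diffusion tensor is built from the DTI fiber directions. A 2D toy sketch with an illustrative hand-made tensor; no real DTI data and no mechanical coupling:

```python
import numpy as np

def grow(c, Dxx, Dyy, Dxy, rho, dt, dx, steps):
    """Explicit steps of dc/dt = div(D grad c) + rho c (1 - c) with a
    spatially varying 2x2 diffusion tensor; the mechanical (mass-effect)
    component of the original model is omitted in this sketch."""
    for _ in range(steps):
        gx, gy = np.gradient(c, dx)
        fx = Dxx * gx + Dxy * gy              # flux = D grad c
        fy = Dxy * gx + Dyy * gy
        div = np.gradient(fx, dx, axis=0) + np.gradient(fy, dx, axis=1)
        c = np.clip(c + dt * (div + rho * c * (1.0 - c)), 0.0, 1.0)
    return c

# toy anisotropy: diffusion 10x faster along horizontal 'fibers'
n, dx, dt = 128, 0.05, 0.01                   # grid (cm), step (days)
Dxx = np.full((n, n), 1.0e-2)                 # cm^2/day, illustrative
Dyy = np.full((n, n), 1.0e-3)
Dxy = np.zeros((n, n))
c = np.zeros((n, n))
c[n // 2, n // 2] = 1.0                       # seed tumor cell density
c = grow(c, Dxx, Dyy, Dxy, rho=0.2, dt=dt, dx=dx, steps=2000)
print(f"infiltrated area fraction: {(c > 0.01).mean():.3f}")
```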

  7. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  8. Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere

    Science.gov (United States)

    Fok, Mei-Ching H.

    2011-01-01

    Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves for the 3D distribution in space, energy and pitch angle of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been carried on past and current satellite missions. The global morphology of energetic ions has been revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distribution during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINAMA mission.

  9. Numerical Simulations of MREIT Conductivity Imaging for Brain Tumor Detection

    Science.gov (United States)

    Meng, Zi Jun; Sajib, Saurav Z. K.; Chauhan, Munish; Sadleir, Rosalind J.; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2013-01-01

    Magnetic resonance electrical impedance tomography (MREIT) is a new modality capable of imaging the electrical properties of the human body using MRI phase information in conjunction with external current injection. Recent in vivo animal and human MREIT studies have revealed unique conductivity contrasts related to different physiological and pathological conditions of tissues or organs. When performing in vivo brain imaging, small imaging currents must be injected so as not to stimulate peripheral nerves in the skin, while the delivery of imaging currents to the brain is relatively small due to the skull's low conductivity. As a result, injected imaging currents may induce small phase signals and overall low phase SNR in brain tissues. In this study, we present numerical simulation results on the use of head MREIT for brain tumor detection. We used a realistic three-dimensional head model to compute signal levels produced as a consequence of a predicted doubling of conductivity occurring within simulated tumorous brain tissues. We determined the feasibility of measuring these changes in a time acceptable to human subjects by adding realistic noise levels measured from a candidate 3 T system. We also reconstructed conductivity contrast images, showing that such conductivity differences can be both detected and imaged. PMID:23737862

  10. FASTBUS simulation models in VHDL

    International Nuclear Information System (INIS)

    Appelquist, G.

    1992-11-01

    Four hardware simulation models implementing the FASTBUS protocol are described. The models are written in the VHDL hardware description language to obtain portability, i.e. without relations to any specific simulator. They include two complete FASTBUS devices, a full-duplex segment interconnect and ancillary logic for the segment. In addition, master and slave models using a high level interface to describe FASTBUS operations, are presented. With these models different configurations of FASTBUS systems can be evaluated and the FASTBUS transactions of new devices can be verified. (au)

  11. Space-based laser active imaging simulation system based on HLA

    Science.gov (United States)

    Han, Yi; Sun, Huayan; Li, Yingchun

    2013-09-01

    This paper adopts the High Level Architecture (HLA) to develop a space-based laser active imaging distributed simulation software system, and designs a system framework with a three-step workflow of modeling, experiment and analysis. The paper first puts forward the general requirements of the simulation system, then builds the simulation system architecture based on HLA and constructs 7 simulation federates. The simulation system has the primary functions of space target scattering characteristic analysis, imaging simulation, image processing and target recognition, and system performance analysis, and can support the whole simulation process. The results show that the distributed simulation system can meet the technical requirements of space-based laser imaging simulation.

  12. Monte Carlo simulation of PET images for injection dose optimization

    Czech Academy of Sciences Publication Activity Database

    Boldyš, Jiří; Dvořák, Jiří; Skopalová, M.; Bělohlávek, O.

    2013-01-01

    Vol. 29, No. 9 (2013), pp. 988-999. ISSN 2040-7939. R&D Projects: GA MŠk 1M0572. Institutional support: RVO:67985556. Keywords: positron emission tomography * Monte Carlo simulation * biological system modeling * image quality. Subject RIV: FD - Oncology; Hematology. Impact factor: 1.542, year: 2013. http://library.utia.cas.cz/separaty/2013/ZOI/boldys-0397175.pdf

  13. Image reconstruction using Monte Carlo simulation and artificial neural networks

    International Nuclear Information System (INIS)

    Emert, F.; Missimner, J.; Blass, W.; Rodriguez, A.

    1997-01-01

    PET data sets are subject to two types of distortions during acquisition: the imperfect response of the scanner and attenuation and scattering in the active distribution. In addition, the reconstruction of voxel images from the line projections composing a data set can introduce artifacts. Monte Carlo simulation provides a means for modeling the distortions and artificial neural networks a method for correcting for them as well as minimizing artifacts. (author) figs., tab., refs

  14. Retrievals of formaldehyde from ground-based FTIR and MAX-DOAS observations at the Jungfraujoch station and comparisons with GEOS-Chem and IMAGES model simulations

    Directory of Open Access Journals (Sweden)

    B. Franco

    2015-04-01

    As a ubiquitous product of the oxidation of many volatile organic compounds (VOCs), formaldehyde (HCHO) plays a key role as a short-lived and reactive intermediate in the atmospheric photo-oxidation pathways leading to the formation of tropospheric ozone and secondary organic aerosols. In this study, HCHO profiles have been successfully retrieved from ground-based Fourier transform infrared (FTIR) solar spectra and UV-visible Multi-AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) scans recorded during the July 2010–December 2012 time period at the Jungfraujoch station (Swiss Alps, 46.5° N, 8.0° E, 3580 m a.s.l.). Analysis of the retrieved products has revealed different vertical sensitivities of the two remote sensing techniques. Furthermore, HCHO amounts simulated by two state-of-the-art chemical transport models (CTMs), GEOS-Chem and IMAGES v2, have been compared to FTIR total columns and MAX-DOAS 3.6–8 km partial columns, accounting for the respective vertical resolution of each ground-based instrument. Using the CTM outputs as the intermediate, FTIR and MAX-DOAS retrievals have shown consistent seasonal modulations of HCHO throughout the investigated period, characterized by a summertime maximum and a wintertime minimum. Such comparisons have also highlighted that FTIR and MAX-DOAS provide complementary products for the HCHO retrieval above the Jungfraujoch station. Finally, tests have revealed that the updated IR parameters from the HITRAN 2012 database have a cumulative effect and significantly decrease the retrieved HCHO columns with respect to the use of the HITRAN 2008 compilation.
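Accounting for the respective vertical resolution of each ground-based instrument is conventionally done by smoothing the model profiles with each retrieval's averaging kernels before comparison. A sketch of that step with invented numbers (not Jungfraujoch data):

```python
import numpy as np

def smooth_with_ak(x_model, x_apriori, A):
    """Smooth a model profile with a retrieval's averaging-kernel matrix,
    x_s = x_a + A (x_m - x_a), the standard (Rodgers) way to account for
    an instrument's limited vertical resolution before comparing CTM
    output with FTIR or MAX-DOAS products."""
    return x_apriori + A @ (x_model - x_apriori)

# toy 10-layer profile; all numbers illustrative
z = np.arange(10)                            # layer index, low -> high
x_a = 1e9 * np.exp(-z / 3.0)                 # a priori (molec cm-3)
x_m = 1.4e9 * np.exp(-z / 2.5)               # CTM profile
# crude averaging kernel: each level senses a ~2-layer neighborhood
A = np.exp(-0.5 * (z[:, None] - z[None, :]) ** 2)
A /= 1.3 * A.sum(axis=1, keepdims=True)      # peak sensitivity below 1
x_s = smooth_with_ak(x_m, x_a, A)
print(np.round(x_s / x_m, 3))                # smoothed-to-raw ratio
```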

  15. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Most systems involve parameters and variables, which are random variables due to uncertainties. Probabilistic methods are powerful in modelling such systems. In this second part, we describe probabilistic models and Monte Carlo simulation along with 'classical' matrix methods and differential equations as most real ...

  16. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial

  17. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are

  18. The land-use projections and resulting emissions in the IPCC SRES scenarios as simulated by the IMAGE 2.2 model

    International Nuclear Information System (INIS)

    Strengers, B.; Eickhout, B.; De Vries, B.; Bouwman, L.; Leemans, R.

    2005-01-01

    The Intergovernmental Panel on Climate Change (IPCC) developed a new series of emission scenarios (SRES). Six global models were used to develop SRES but most focused primarily on energy and industry related emissions. Land-use emissions were only covered by three models, where IMAGE included the most detailed, spatially explicit description of global land-use and land-cover dynamics. To complement their calculations the other models used land-use emissions from AIM and IMAGE, leading to inconsistent estimates. Representation of the land-use emissions in SRES is therefore poor. This paper presents details on the IMAGE land-use results to complement the SRES report. The IMAGE SRES scenarios are based on the original IPCC SRES assumptions and narratives using the latest version of IMAGE (IMAGE 2.2). IMAGE provides comprehensive emission estimates because not only emissions are addressed but also the resulting atmospheric concentrations, climate change and impacts. Additionally, in SRES the scenario assumptions were only presented and quantified for 4 'macro-regions'. The IMAGE 2.2 SRES implementation has been extended towards 17 regions. We focus on land-use aspects and show that land-related emissions not only depend on population projections but also on the temporal and spatial dynamics of different land-related sources and sinks of greenhouse gases. We also illustrate the importance of systemic feedbacks and interactions in the climate system that influence land-use emissions, such as deforestation and forest regrowth, soil respiration and CO2-fertilisation

  19. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  20. Modeling and Simulation: An Overview

    OpenAIRE

    Michael McAleer; Felix Chan; Les Oxley

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are born equal: the empirical properties of some estimators of long memory, characterising trader manipulation in a limit-order driven market, measuring bias in a term-structure model of commodity prices through the c...

  1. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
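One of the simple algorithms such reports typically describe for stationary Gaussian processes is the spectral representation method. A self-contained sketch; the target PSD and the grids are arbitrary choices:

```python
import numpy as np

def sample_process(psd, freqs, t, rng):
    """Spectral-representation sampling of a zero-mean stationary
    Gaussian process: X(t) = sum_k sqrt(2 S(f_k) df) cos(2 pi f_k t +
    phi_k) with i.i.d. uniform phases (Shinozuka-type algorithm)."""
    df = freqs[1] - freqs[0]
    phi = rng.uniform(0.0, 2.0 * np.pi, size=len(freqs))
    amp = np.sqrt(2.0 * psd * df)
    arg = 2.0 * np.pi * np.outer(freqs, t) + phi[:, None]
    return (amp[:, None] * np.cos(arg)).sum(axis=0)

rng = np.random.default_rng(0)
f = np.linspace(0.01, 5.0, 500)              # one-sided frequency grid (Hz)
S = 1.0 / (1.0 + (2.0 * np.pi * f) ** 2)     # Lorentzian PSD, OU-like
t = np.linspace(0.0, 60.0, 3000)
x = sample_process(S, f, t, rng)
# sample variance should approach the PSD integral, sum(S) * df
print(x.var(), (S * (f[1] - f[0])).sum())
```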

  2. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance...... eigenfunctions and estimates of the distributions of the corresponding expansion coefficients. The simulation method utilizes the eigenfunction expansion procedure to produce preliminary time histories of the three velocity components simultaneously. As a final step, a spectral shaping procedure is then applied....... The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence....
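
    The decomposition step can be illustrated with a standard SVD-based proper orthogonal decomposition (Karhunen-Loève). The snapshot matrix below is random placeholder data standing in for measured velocity records, and the paper's final spectral-shaping step is omitted.

        import numpy as np

        # snapshot matrix: each column is one velocity record (space x time);
        # random placeholder data stand in for measured turbulence records
        rng = np.random.default_rng(0)
        snapshots = rng.standard_normal((128, 500))
        fluct = snapshots - snapshots.mean(axis=1, keepdims=True)

        # POD / Karhunen-Loeve: left singular vectors are the spatial modes,
        # squared singular values rank the energy carried by each mode
        modes, svals, coeffs = np.linalg.svd(fluct, full_matrices=False)
        energy_fraction = svals**2 / np.sum(svals**2)

        # preliminary time histories re-synthesized from the first k modes
        k = 10
        recon = modes[:, :k] @ np.diag(svals[:k]) @ coeffs[:k, :]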

  3. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing including experimental verification as being carried out as part of a Ph.D. project being written resp. supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire tube boilers. A detailed dynamic...... model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able...... to operate a boiler plant dynamically means that the boiler designs must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...
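
    A heavily simplified sketch of a dynamic drum-boiler model of this general kind, written in Python rather than the authors' Matlab DAE formulation; every constitutive relation and number below is an invented placeholder, not taken from the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        def drum_boiler(t, y, q_fuel, m_feed, k_valve):
            """Toy lumped drum model: state y = [p, V_w] = drum pressure [bar]
            and water volume [m3]; crude proxies stand in for steam tables."""
            p, V_w = y
            h_evap = 2800.0 - 5.0 * p         # latent-heat proxy [kJ/kg]
            m_evap = q_fuel / h_evap          # evaporation rate [kg/s]
            m_steam = k_valve * p             # steam draw-off through the valve
            dpdt = 0.05 * (m_evap - m_steam)  # pressure from steam imbalance
            dVdt = (m_feed - m_evap) / 900.0  # water-side mass balance [m3/s]
            return [dpdt, dVdt]

        sol = solve_ivp(drum_boiler, (0.0, 600.0), [80.0, 20.0],
                        args=(60_000.0, 25.0, 0.3), max_step=1.0)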

  4. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing including experimental verification as being carried out as part of a Ph.D. project being written resp. supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire tube boilers. A detailed dynamic...... to the internal pressure the consequence of the increased volume (i.e. water-/steam space) is an increased wall thickness in the pressure part of the boiler. The stresses introduced in the boiler pressure part as a result of the temperature gradients are proportional to the square of the wall thickness...... model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able...

  5. Modeling control in manufacturing simulation

    NARCIS (Netherlands)

    Zee, Durk-Jouke van der; Chick, S.; Sánchez, P.J.; Ferrin, D.; Morrice, D.J.

    2003-01-01

    A significant shortcoming of traditional simulation languages is the lack of attention paid to the modeling of control structures, i.e., the humans or systems responsible for manufacturing planning and control, their activities and the mutual tuning of their activities. Mostly they are hard coded

  6. Analysis of simulated image sequences from sensors for restricted-visibility operations

    Science.gov (United States)

    Kasturi, Rangachar

    1991-01-01

    A real-time model of the visible output from a 94 GHz sensor, based on a radiometric simulation of the sensor, was developed. A sequence of images as seen from an aircraft as it approaches for landing was simulated using this model. Thirty frames from this sequence of 200 x 200 pixel images were analyzed to identify and track objects in the image, using the Cantata image processing package within the visual programming environment provided by the Khoros software system. The image analysis operations are described.

  7. Arabidopsis Growth Simulation Using Image Processing Technology

    Directory of Open Access Journals (Sweden)

    Junmei Zhang

    2014-01-01

    This paper aims to provide a method to represent the virtual Arabidopsis plant at each growth stage. It includes simulating the shape and providing growth parameters. The shape is described with elliptic Fourier descriptors. First, the plant is segmented from the background with the chromatic coordinates. With the segmentation result, the outer boundary series are obtained by using a boundary-tracking algorithm. Elliptic Fourier analysis is then carried out to extract the coefficients of the contour. The coefficients require less storage than the original contour points and can be used to simulate the shape of the plant. The growth parameters include the total area and the number of leaves of the plant. The total area is obtained from the number of plant pixels and the image calibration result. The number of leaves is derived by detecting the apex of each leaf, achieved by using a wavelet transform to identify the local maxima of the distance signal between the contour points and the region centroid. Experimental results show that this method can record the growth stages of the Arabidopsis plant with less data and provide a visual platform for plant growth research.
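
    The contour-compression idea can be sketched with the simpler complex-FFT Fourier descriptors below (the paper uses the full elliptic Fourier formulation); contour is assumed to be an ordered N x 2 array of boundary points with N well above 2 * n_harmonics.

        import numpy as np

        def fourier_descriptors(contour, n_harmonics=20):
            """Compress a closed boundary into a few Fourier coefficients."""
            z = contour[:, 0] + 1j * contour[:, 1]       # complex (x, y) encoding
            coeffs = np.fft.fft(z) / len(z)
            kept = np.zeros_like(coeffs)
            kept[:n_harmonics] = coeffs[:n_harmonics]    # low positive harmonics
            kept[-n_harmonics:] = coeffs[-n_harmonics:]  # low negative harmonics
            return kept

        def reconstruct(kept):
            """Rebuild an approximate contour from the kept coefficients."""
            z = np.fft.ifft(kept) * len(kept)
            return np.column_stack([z.real, z.imag])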

  8. Polarimetric ISAR: Simulation and image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, David H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-21

    In polarimetric ISAR the illumination platform, typically airborne, carries a pair of antennas that are directed toward a fixed point on the surface as the platform moves. During platform motion, the antennas maintain their gaze on the point, creating an effective aperture for imaging any targets near that point. The interaction between the transmitted fields and targets (e.g. ships) is complicated since the targets are typically many wavelengths in size. Calculation of the field scattered from the target typically requires solving Maxwell’s equations on a large three-dimensional numerical grid. This is prohibitive to use in any real-world imaging algorithm, so the scattering process is typically simplified by assuming the target consists of a cloud of independent, non-interacting, scattering points (centers). Imaging algorithms based on this scattering model perform well in many applications. Since polarimetric radar is not very common, the scattering model is often derived for a scalar field (single polarization) where the individual scatterers are assumed to be small spheres. However, when polarization is important, we must generalize the model to explicitly account for the vector nature of the electromagnetic fields and its interaction with objects. In this note, we present a scattering model that explicitly includes the vector nature of the fields but retains the assumption that the individual scatterers are small. The response of the scatterers is described by electric and magnetic dipole moments induced by the incident fields. We show that the received voltages in the antennas are linearly related to the transmitting currents through a scattering impedance matrix that depends on the overall geometry of the problem and the nature of the scatterers.
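
    For the scalar (single-polarization) special case of the point-scatterer model described above, a received pulse can be sketched as a sum of delayed, phase-shifted chirps; the geometry and scatterer values below are invented for illustration.

        import numpy as np

        c = 3.0e8
        f0, B, T = 10e9, 300e6, 1.0e-4          # carrier, bandwidth, pulse length
        platform = np.array([-8000.0, 3000.0])  # radar position [m], scene at origin
        r0 = np.linalg.norm(platform)           # range to the aim point
        t = 2 * r0 / c + np.linspace(0.0, T, 4096)  # fast-time receive window

        # hypothetical non-interacting point scatterers: (x, y, reflectivity)
        scatterers = [(0.0, 0.0, 1.0), (5.0, -2.0, 0.6), (-3.0, 4.0, 0.8)]

        rx = np.zeros_like(t, dtype=complex)
        for x, y, a in scatterers:
            tau = 2 * np.linalg.norm(platform - (x, y)) / c  # two-way delay
            on = (t >= tau) & (t <= tau + T)                 # pulse support
            # delayed baseband LFM pulse with carrier phase, weighted by a
            rx += (a * np.exp(-2j * np.pi * f0 * tau)
                   * np.exp(1j * np.pi * (B / T) * (t - tau) ** 2) * on)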

  9. Imaging Mouse Models of Cancer.

    Science.gov (United States)

    Lyons, Scott Keith

    2015-01-01

    Mouse models of cancer have proven to be an indispensable resource in furthering both our basic knowledge of cancer biology and the translation of new cancer treatments and imaging approaches into the clinic. As mouse models have developed and improved in their ability to model many diverse aspects of the human disease, so too has the need for robust imaging approaches to measure key biological parameters noninvasively. The aim of this review is to provide a brief overview of the various imaging approaches available to researchers today for imaging preclinical cancer models, highlighting their relative strengths and weaknesses. The very nature of modeling cancer in the mouse is also changing, and brief mention will be made on how imaging can maximize the utility of these new, accurate, and genetically versatile models.

  10. High Resolution Imaging Ground Penetrating Radar Design and Simulation

    OpenAIRE

    Saunders II, Charles Phillip

    2014-01-01

    This paper describes the design and simulation of a microwave band, high resolution imaging ground penetrating radar. A conceptual explanation is given on the mechanics of wave-based imaging, followed by the governing radar equations. The performance specifications for the imaging system are given as inputs to the radar equations, which output the full system specifications. Those specifications are entered into a MATLAB simulation, and the simulation results are discussed with respect to bot...

  11. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

    Classical High Level Architecture (HLA) systems face development problems due to a lack of support for fine-grained component integration and interoperation in large-scale complex simulation applications. To address this issue, an extensible, reusable and composable simulation framework is proposed. To promote reusability from coarse-grained federates to fine-grained components, this paper proposes a modelling & simulation framework which consists of a component-based architecture, modelling methods, and simulation services to support and simplify the process of complex simulation application construction. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation applications.

  12. Design and development of a computer based simulator to support learning of radiographic image quality

    International Nuclear Information System (INIS)

    Costaridou, L.; Pitoura, T.; Panayiotakis, G.; Pallikarakis, N.; Hatzis, K.

    1994-01-01

    A training simulator has been developed to offer a structured and functional approach to radiographic imaging procedures and a comprehensive understanding of the interrelations between the physical and technical input parameters of a radiographic imaging system and the characteristics of image quality. The system addresses the training needs of radiographers and radiology clinicians. The simulator is based on procedural simulation enhanced by a hypertextual model of information organization. It is supported by an image database, which supplies and enriches the simulator. The simulation is controlled by a browsing facility which corresponds to several hierarchical levels of use of the underlying multimodal database, organized as imaging tasks. Representative tasks are: production of a single radiograph, or production of functional sets of radiographs exhibiting parameter effects on image characteristics. System parameters such as patient positioning, focus-to-patient distance, magnification, field dimensions, focal spot size, tube voltage, tube current and exposure time are under user control. (authors)

  13. Monte Carlo simulation of breast imaging using synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Fitousi, N. T.; Delis, H.; Panayiotakis, G. [Department of Medical Physics, Faculty of Medicine, University of Patras, 26504 Patras (Greece)

    2012-04-15

    Purpose: Synchrotron radiation (SR), being the brightest artificial source of x-rays with a very promising geometry, has raised scientific expectations that it could be used for breast imaging with optimized results. The 'in situ' evaluation of this technique is difficult to perform, mostly due to the limited number of SR facilities available worldwide. In this study, a simulation model for SR breast imaging was developed, based on Monte Carlo simulation techniques, and validated using data acquired at the SYRMEP beamline of the Elettra facility in Trieste, Italy. Furthermore, primary results concerning the performance of SR were derived. Methods: The developed model includes the exact setup of the SR beamline, considering that the x-ray source is located almost 23 m from the slit, while the photon energy was considered to originate from a very narrow Gaussian spectrum. Breast phantoms, made of Perspex and filled with air cavities, were irradiated with energies in the range of 16-28 keV. The model included a Gd2O2S detector with the same characteristics as the one available in the SYRMEP beamline. Following the development and validation of the model, experiments were performed in order to evaluate the contrast resolution of SR. A phantom made of adipose tissue and filled with inhomogeneities of several compositions and sizes was designed and utilized to simulate irradiation under conventional mammography and SR conditions. Results: The validation results of the model showed an excellent agreement with the experimental data, with the correlation for contrast being 0.996. Significant differences only appeared at the edges of the phantom, where phase effects occur. The initial evaluation experiments revealed that SR shows very good performance in terms of the image quality indices utilized, namely subject contrast and contrast-to-noise ratio. The response of subject contrast to energy is monotonic; however, this does not stand for

  14. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case. Empirical validation is residual in character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model and can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: it can be performed with DSA, differential sensitivity analysis, or with MCSA, Monte-Carlo sensitivity analysis. Searching for the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis has been performed in both the time domain and the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings is presented, studying the behavior of building components in a test cell at LECE, CIEMAT (Spain). (Author) 17 refs
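
    A minimal sketch of the Monte-Carlo sensitivity analysis (MCSA) step: inputs are perturbed around nominal values and correlated with the model output. The model, nominal values and spreads below are toy placeholders, not the thermal building model of the paper.

        import numpy as np

        def mc_sensitivity(model, nominal, spread, n=2000, rng=None):
            """Rank input influence by correlating sampled inputs with output."""
            rng = np.random.default_rng() if rng is None else rng
            nominal = np.asarray(nominal, float)
            spread = np.asarray(spread, float)
            X = nominal + spread * rng.standard_normal((n, nominal.size))
            y = np.array([model(x) for x in X])
            # Pearson correlation of each input column with the output
            return [np.corrcoef(X[:, j], y)[0, 1] for j in range(nominal.size)]

        # toy stand-in for a thermal building model
        sens = mc_sensitivity(lambda x: x[0]**2 + 0.1 * x[1],
                              nominal=[1.0, 5.0], spread=[0.1, 0.5])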

  15. Modeling and Simulation for Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn T. [Los Alamos National Laboratory

    2012-07-26

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and to introduce (some of) the tools used. Some definitions: (1) modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material etc. (source terms); and (3) assessing detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can then determine the measurement accuracy required to achieve a certain performance.

  16. Model continuity in discrete event simulation: A framework for model-driven development of simulation models

    NARCIS (Netherlands)

    Cetinkaya, D; Verbraeck, A.; Seck, MD

    2015-01-01

    Most of the well-known modeling and simulation (M&S) methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refer to how to

  17. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  18. A numerical simulation method for aircraft infrared imaging

    Science.gov (United States)

    Zhou, Yue; Wang, Qiang; Li, Ting; Hu, Haiyang

    2017-06-01

    Numerical simulation of infrared (IR) emission from aircraft is of great significance for military and civilian applications. In this paper, the narrow band k-distribution (NBK) model is used to calculate radiative properties of non-gray gases in the hot exhaust plume. With model parameters derived from the high resolution spectral database HITEMP 2010, the NBK model is validated by comparisons with exact line by line (LBL) results and experimental data. Based on the NBK model, a new finite volume and back ray tracing (FVBRT) method is proposed to solve the radiative transfer equations and produce IR imaging. Calculated results by the FVBRT method are compared with experimental data and available results in open references, which shows the FVBRT method can maintain good accuracy while producing IR images with better rendering effects. Finally, the NBK model and FVBRT method are integrated to calculate IR signature of an aircraft. The IR images and spatial distributions of radiative intensity are compared and analyzed in both 3 - 5 μm band and 8 - 12 μm band to provide references for engineering applications.

  19. Simulating spin models on GPU

    Science.gov (United States)

    Weigel, Martin

    2011-09-01

    Over the last couple of years it has been realized that the vast computational power of graphics processing units (GPUs) could be harvested for purposes other than the video game industry. This power, which at least nominally exceeds that of current CPUs by large factors, results from the relative simplicity of the GPU architectures as compared to CPUs, combined with a large number of parallel processing units on a single chip. To benefit from this setup for general computing purposes, the problems at hand need to be prepared in a way to profit from the inherent parallelism and hierarchical structure of memory accesses. In this contribution I discuss the performance potential for simulating spin models, such as the Ising model, on GPU as compared to conventional simulations on CPU.
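
    A CPU reference sketch of the kind of simulation discussed: single-spin-flip Metropolis dynamics for the 2D Ising model. A GPU implementation would typically update the two checkerboard sublattices in parallel instead; the parameters below are illustrative (beta near 0.44 is close to the critical coupling).

        import numpy as np

        def metropolis_ising(L=64, beta=0.44, sweeps=200, rng=None):
            """2D Ising model with periodic boundaries, Metropolis updates."""
            rng = np.random.default_rng() if rng is None else rng
            s = rng.choice([-1, 1], size=(L, L))
            for _ in range(sweeps):
                for _ in range(L * L):
                    i, j = rng.integers(L, size=2)
                    nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                          + s[i, (j + 1) % L] + s[i, (j - 1) % L])
                    dE = 2 * s[i, j] * nb            # energy cost of a flip
                    if dE <= 0 or rng.random() < np.exp(-beta * dE):
                        s[i, j] = -s[i, j]
            return s

        spins = metropolis_ising()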

  20. Numerical simulation for neutron pinhole imaging in ICF

    International Nuclear Information System (INIS)

    Chen Faxin; Yang Jianlun; Wen Shuhuai

    2005-01-01

    Pinhole imaging of the neutron production in laser-driven inertial confinement fusion experiments can provide important information about the performance of various capsule designs. In order to obtain good experimental results, the performance of various pinhole designs must be judged qualitatively or quantitatively before the experiment. The imaging calculation can be separated into pinhole imaging and image spectral analysis. In this paper, pinhole imaging is discussed, and codes for neutron pinhole imaging and image display are programmed. The codes can be used to provide a theoretical foundation for pinhole design and simulated data for image analysis. (authors)
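
    In the geometric-optics limit, pinhole image formation can be sketched as the source distribution convolved with the projected aperture (a point source maps to a disc of diameter d*(1+M) at the detector). All geometry and sizes below are invented for illustration.

        import numpy as np
        from scipy.signal import fftconvolve

        L1, L2, d_pin = 10.0, 100.0, 0.01   # source/image distances, pinhole [cm]
        M = L2 / L1                         # geometric magnification
        px = 0.005                          # detector pixel size [cm]

        # projected pinhole aperture on the detector grid
        yy, xx = np.mgrid[-50:51, -50:51] * px
        aperture = ((xx**2 + yy**2) <= (d_pin * (1 + M) / 2) ** 2).astype(float)
        aperture /= aperture.sum()

        source = np.zeros((101, 101))
        source[45:55, 45:55] = 1.0          # toy neutron-emission region
        image = fftconvolve(source, aperture, mode="same")  # blurred image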

  1. MULTISCALE SPARSE APPEARANCE MODELING AND SIMULATION OF PATHOLOGICAL DEFORMATIONS

    Directory of Open Access Journals (Sweden)

    Rami Zewail

    2017-08-01

    Machine learning and statistical modeling techniques have drawn much interest within the medical imaging research community. However, clinically relevant modeling of anatomical structures continues to be a challenging task. This paper presents a novel method for multiscale sparse appearance modeling in medical images, with application to the simulation of pathological deformations in X-ray images of the human spine. The proposed appearance model benefits from the non-linear approximation power of Contourlets and their ability to capture higher-order singularities to achieve a sparse representation while preserving the accuracy of the statistical model. Independent Component Analysis is used to extract statistically independent modes of variation from the sparse Contourlet-based domain. The new model is then used to simulate clinically relevant pathological deformations in radiographic images.

  2. Creating Simulated Microgravity Patient Models

    Science.gov (United States)

    Hurst, Victor; Doerr, Harold K.; Bacal, Kira

    2004-01-01

    The Medical Operational Support Team (MOST) has been tasked by the Space and Life Sciences Directorate (SLSD) at the NASA Johnson Space Center (JSC) to integrate medical simulation into 1) medical training for ground and flight crews and into 2) evaluations of medical procedures and equipment for the International Space Station (ISS). To do this, the MOST requires patient models that represent the physiological changes observed during spaceflight. Despite the presence of physiological data collected during spaceflight, there is no defined set of parameters that illustrate or mimic a 'space normal' patient. Methods: The MOST culled space-relevant medical literature and data from clinical studies performed in microgravity environments. The areas of focus for data collection were in the fields of cardiovascular, respiratory and renal physiology. Results: The MOST developed evidence-based patient models that mimic the physiology believed to be induced by human exposure to a microgravity environment. These models have been integrated into space-relevant scenarios using a human patient simulator and ISS medical resources. Discussion: Despite the lack of a set of physiological parameters representing 'space normal,' the MOST developed space-relevant patient models that mimic microgravity-induced changes in terrestrial physiology. These models are used in clinical scenarios that will medically train flight surgeons, biomedical flight controllers (biomedical engineers; BME) and, eventually, astronaut-crew medical officers (CMO).

  3. Optical physics of scintillation imagers by GEANT4 simulations

    International Nuclear Information System (INIS)

    Lo Meo, Sergio; Baldazzi, Giuseppe; Bennati, Paolo; Bollini, Dante; Cencelli, Valentino O.; Cinti, Maria N.; Moschini, Giuliano; Lanconelli, Nico; Navarria, Francesco L.; Pani, Roberto; Pellegrini, Rosanna; Perrotta, Andrea; Vittorini, Francesca

    2009-01-01

    The recent development of LaBr3:Ce crystals makes their use in gamma-imaging systems for Single Photon Emission Tomography (SPET) applications very attractive, mainly due to their excellent scintillation properties. In this work we use Monte Carlo simulations to model the optical behavior of three crystal configurations. Our goal is to better understand the intrinsic properties of a gamma camera based on LaBr3:Ce crystals coupled to a position-sensitive photomultiplier tube. To this aim, the spatial and energy resolutions obtained under optimum photodetection conditions are investigated.

  4. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real-life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and fie...... as support for decision making. However, several other factors affect decision making, such as ethics, politics and economics. Furthermore, the insight gained when models are built helps point out areas where knowledge is lacking....... of FMD spread that can provide useful and trustworthy advice, there are four important issues which the model should represent: 1) the herd structure of the country in question, 2) the dynamics of animal movements and contacts between herds, 3) the biology of the disease, and 4) the regulations...

  5. The Application of the Technology of 3D Satellite Cloud Imaging in Virtual Reality Simulation

    Directory of Open Access Journals (Sweden)

    Xiao-fang Xie

    2007-05-01

    Using satellite cloud images to simulate clouds is one of the new visual simulation technologies in Virtual Reality (VR). Taking the original data of satellite cloud images as the source, this paper describes in detail the technology of 3D satellite cloud imaging through coordinate transformation and projection, the creation of a DEM (Digital Elevation Model) of the cloud image, and 3D simulation. A Mercator projection was introduced to create the cloud-image DEM, solutions for geodetic problems were introduced to calculate distances, and the outer-trajectory science of rockets was introduced to obtain the elevation of clouds. For demonstration, we report on a computer program that simulates 3D satellite cloud images.
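
    For a spherical Earth, the projection step mentioned above reduces to the standard forward Mercator equations; a minimal sketch follows. The radius value is the usual WGS-84 equatorial radius, with ellipsoidal corrections ignored.

        import numpy as np

        R = 6378137.0   # equatorial radius [m], spherical-Earth approximation

        def mercator(lon_deg, lat_deg):
            """Forward Mercator: x = R*lon, y = R*ln(tan(pi/4 + lat/2))."""
            lon, lat = np.radians(lon_deg), np.radians(lat_deg)
            return R * lon, R * np.log(np.tan(np.pi / 4 + lat / 2))

        x, y = mercator(116.4, 39.9)   # example point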

  6. Primary and scattered component analyses of simulated images with MCNPX

    International Nuclear Information System (INIS)

    Souza, Edmilson M. de; Correa, Samanda C.A.; Silva, Ademir X. da; Lopes, Ricardo T.

    2007-01-01

    MCNPX radiography tallies are an array of point detectors (tally F5) placed close to one another to generate an image based on the point-detector fluxes. The goal of this work is to study the behavior of primary and scattered radiation in images simulated with the MCNPX radiography tally. Monte Carlo simulations were performed to calculate the PSF for different materials, thicknesses and distances between sample and detector. PSF analyses of the simulated images showed that the radiography tallies reproduce the geometric and physical characteristics of experimental images with high fidelity. (author)

  7. Simulated altitude exposure assessment by hyperspectral imaging

    Science.gov (United States)

    Calin, Mihaela Antonina; Macovei, Adrian; Miclos, Sorin; Parasca, Sorin Viorel; Savastru, Roxana; Hristea, Razvan

    2017-05-01

    Testing the human body's reaction to hypoxia (including the one generated by high altitude) is important in aeronautic medicine. This paper presents a method of monitoring blood oxygenation during experimental hypoxia using hyperspectral imaging (HSI) and a spectral unmixing model based on a modified Beer-Lambert law. A total of 20 healthy volunteers (males) aged 25 to 60 years were included in this study. A line-scan HSI system was used to acquire images of the faces of the subjects. The method generated oxyhemoglobin and deoxyhemoglobin distribution maps from the foreheads of the subjects at 5 and 10 min of hypoxia and after recovery in a high oxygen breathing mixture. The method also generated oxygen saturation maps that were validated using pulse oximetry. An interesting pattern of desaturation on the forehead was discovered during the study, showing one of the advantages of using HSI for skin oxygenation monitoring in hypoxic conditions. This could bring new insight into the physiological response to high altitude and may become a step forward in air crew testing.
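
    The unmixing step can be sketched as a least-squares inversion of a Beer-Lambert system A = E c at a few wavelengths. The extinction coefficients below are rough placeholders, not the calibrated spectra a real study would use.

        import numpy as np

        # columns: [eps_HbO2, eps_Hb] at three wavelengths (placeholder values)
        E = np.array([[0.08, 0.81],    # ~660 nm
                      [0.20, 0.18],    # ~810 nm
                      [0.29, 0.18]])   # ~940 nm

        def unmix(absorbance):
            """Solve A = E @ c for hemoglobin concentrations by least squares."""
            c, *_ = np.linalg.lstsq(E, absorbance, rcond=None)
            return c

        c_hbo2, c_hb = unmix(np.array([0.35, 0.19, 0.24]))
        spo2 = c_hbo2 / (c_hbo2 + c_hb)    # oxygen-saturation estimate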

  8. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trace, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  9. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of- knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  10. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    In the present work a framework for optimizing the design of boilers for dynamic operation has been developed. A cost function to be minimized during the optimization has been formulated and for the present design variables related to the Boiler Volume and the Boiler Load Gradient (i.e. firing rate...... on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... performance has been developed. Outputs from the simulations are shrinking and swelling of the water level in the drum during for example a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum define the requirements with respect to drum...

  11. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
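
    The combination described, one parameter varied at a time with no second-order interactions, amounts to root-sum-square propagation of finite-difference sensitivities. A sketch with a toy linewidth model follows; all coefficients and uncertainty values are invented.

        import numpy as np

        def combined_uncertainty(model, nominal, u_params, h=1e-6):
            """First-order propagation: finite-difference sensitivities times
            parameter uncertainties, combined in quadrature."""
            nominal = np.asarray(nominal, float)
            var = 0.0
            for j, u in enumerate(u_params):
                dp = np.zeros_like(nominal)
                dp[j] = h * max(abs(nominal[j]), 1.0)
                sens = (model(nominal + dp) - model(nominal - dp)) / (2 * dp[j])
                var += (sens * u) ** 2
            return np.sqrt(var)

        # toy imaging model: linewidth vs. (focus, NA, wavelength); invented
        lw = lambda p: 100 + 4.0 * p[0] + 250.0 * (p[1] - 0.6) - 0.2 * (p[2] - 193)
        u_lw = combined_uncertainty(lw, [0.0, 0.6, 193.0], [0.05, 0.005, 0.1])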

  12. Hybrid simulation using mixed reality for interventional ultrasound imaging training.

    Science.gov (United States)

    Freschi, C; Parrini, S; Dinelli, N; Ferrari, M; Ferrari, V

    2015-07-01

    Ultrasound (US) imaging offers advantages over other imaging modalities and has become the most widespread modality for many diagnostic and interventional procedures. However, traditional 2D US requires a long training period, especially to learn how to manipulate the probe. A hybrid interactive system based on mixed reality was designed, implemented and tested for hand-eye coordination training in diagnostic and interventional US. A hybrid simulator was developed integrating a physical US phantom and a software application with a 3D virtual scene. In this scene, a 3D model of the probe with its relative scan plane is coherently displayed with a 3D representation of the phantom internal structures. An evaluation study of the diagnostic module was performed by recruiting thirty-six novices and four experts. The performance of novices using the hybrid simulator (HG) was compared with that of novices using the physical simulator (PG). After the training session, each novice was required to visualize a particular target structure. The four experts completed a 5-point Likert scale questionnaire. Seventy-eight percent of the HG novices successfully visualized the target structure, whereas only 45% of the PG reached this goal. The mean scores from the questionnaires were 5.00 for usefulness, 4.25 for ease of use, 4.75 for 3D perception, and 3.25 for phantom realism. The hybrid US training simulator provides ease of use and is effective as a hand-eye coordination teaching tool. Mixed reality can improve US probe manipulation training.

  13. A Digital Sensor Simulator of the Pushbroom Offner Hyperspectral Imaging Spectrometer

    Science.gov (United States)

    Tao, Dongxing; Jia, Guorui; Yuan, Yan; Zhao, Huijie

    2014-01-01

    Sensor simulators can be used in forecasting the imaging quality of a new hyperspectral imaging spectrometer, and generating simulated data for the development and validation of the data processing algorithms. This paper presents a novel digital sensor simulator for the pushbroom Offner hyperspectral imaging spectrometer, which is widely used in the hyperspectral remote sensing. Based on the imaging process, the sensor simulator consists of a spatial response module, a spectral response module, and a radiometric response module. In order to enhance the simulation accuracy, spatial interpolation-resampling, which is implemented before the spatial degradation, is developed to compromise the direction error and the extra aliasing effect. Instead of using the spectral response function (SRF), the dispersive imaging characteristics of the Offner convex grating optical system is accurately modeled by its configuration parameters. The non-uniformity characteristics, such as keystone and smile effects, are simulated in the corresponding modules. In this work, the spatial, spectral and radiometric calibration processes are simulated to provide the parameters of modulation transfer function (MTF), SRF and radiometric calibration parameters of the sensor simulator. Some uncertainty factors (the stability, band width of the monochromator for the spectral calibration, and the integrating sphere uncertainty for the radiometric calibration) are considered in the simulation of the calibration process. With the calibration parameters, several experiments were designed to validate the spatial, spectral and radiometric response of the sensor simulator, respectively. The experiment results indicate that the sensor simulator is valid. PMID:25615727

  14. A Digital Sensor Simulator of the Pushbroom Offner Hyperspectral Imaging Spectrometer

    Directory of Open Access Journals (Sweden)

    Dongxing Tao

    2014-12-01

    Sensor simulators can be used in forecasting the imaging quality of a new hyperspectral imaging spectrometer, and generating simulated data for the development and validation of the data processing algorithms. This paper presents a novel digital sensor simulator for the pushbroom Offner hyperspectral imaging spectrometer, which is widely used in the hyperspectral remote sensing. Based on the imaging process, the sensor simulator consists of a spatial response module, a spectral response module, and a radiometric response module. In order to enhance the simulation accuracy, spatial interpolation-resampling, which is implemented before the spatial degradation, is developed to compromise the direction error and the extra aliasing effect. Instead of using the spectral response function (SRF), the dispersive imaging characteristics of the Offner convex grating optical system is accurately modeled by its configuration parameters. The non-uniformity characteristics, such as keystone and smile effects, are simulated in the corresponding modules. In this work, the spatial, spectral and radiometric calibration processes are simulated to provide the parameters of modulation transfer function (MTF), SRF and radiometric calibration parameters of the sensor simulator. Some uncertainty factors (the stability, bandwidth of the monochromator for the spectral calibration, and the integrating sphere uncertainty for the radiometric calibration) are considered in the simulation of the calibration process. With the calibration parameters, several experiments were designed to validate the spatial, spectral and radiometric response of the sensor simulator, respectively. The experiment results indicate that the sensor simulator is valid.

  15. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such differences diminish as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.
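
    For reference, the optimization procedure the analogy draws on can be sketched in a few lines: random perturbations accepted by the Metropolis criterion at a slowly decreasing temperature. The objective below is a toy multimodal landscape, not anything from the paper.

        import numpy as np

        def anneal(f, x0, t0=5.0, cooling=0.995, steps=5000, rng=None):
            """Minimal simulated annealing on a 1D objective."""
            rng = np.random.default_rng() if rng is None else rng
            x, fx, T = x0, f(x0), t0
            for _ in range(steps):
                y = x + rng.normal(scale=0.5)      # random perturbation
                fy = f(y)
                # accept downhill moves always, uphill moves with prob exp(-dE/T)
                if fy < fx or rng.random() < np.exp((fx - fy) / T):
                    x, fx = y, fy
                T *= cooling                       # cool the system
            return x, fx

        # many local optima, one global minimum near x ~ -0.5
        x_best, f_best = anneal(lambda x: x**2 + 10 * np.sin(3 * x), x0=8.0)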

  16. Impulse pumping modelling and simulation

    International Nuclear Information System (INIS)

    Pierre, B; Gudmundsson, J S

    2010-01-01

    Impulse pumping is a new pumping method based on propagation of pressure waves. Of particular interest is the application of impulse pumping to artificial lift situations, where fluid is transported from wellbore to wellhead using pressure waves generated at wellhead. The motor driven element of an impulse pumping apparatus is therefore located at wellhead and can be separated from the flowline. Thus operation and maintenance of an impulse pump are facilitated. The paper describes the different elements of an impulse pumping apparatus, reviews the physical principles and details the modelling of the novel pumping method. Results from numerical simulations of propagation of pressure waves in water-filled pipelines are then presented for illustrating impulse pumping physical principles, and validating the described modelling with experimental data.

  17. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present...... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process...... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both....

  18. Survey of chemically amplified resist models and simulator algorithms

    Science.gov (United States)

    Croffie, Ebo H.; Yuan, Lei; Cheng, Mosong; Neureuther, Andrew R.

    2001-08-01

    Modeling has become an indispensable tool for chemically amplified resist (CAR) evaluations. It has been used extensively to study acid diffusion and its effects on resist image formation. Several commercial and academic simulators have been developed for CAR process simulation. Commercial simulators such as PROLITH (Finle Technologies) and Solid-C (Sigma-C) allow the user to choose between an empirical model and a concentration-dependent diffusion model. The empirical model is faster but not very accurate for two-dimensional resist simulations; there is a trade-off between the speed of the simulator and the accuracy of the results. An academic simulator such as STORM (U.C. Berkeley) gives the user a choice of algorithms, including a Fast Imaging second-order finite-difference algorithm and a Moving Boundary finite-element algorithm. A user interested in simulating volume shrinkage and polymer stress effects during post-exposure bake will need the Moving Boundary algorithm, whereas a user interested in latent image formation without polymer deformation will find the Fast Imaging algorithm more appropriate. The Fast Imaging algorithm is generally faster and requires less computer memory, so the choice of algorithm presents a trade-off between speed and the level of detail in resist profile prediction. This paper surveys the different models and simulator algorithms available in the literature, including contributions to the characterization of CAR exposure and post-exposure bake (PEB) processes for different resist systems. Several numerical algorithms and their performance are also discussed.
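
    A sketch of the core numerical problem such simulators solve: 1D post-exposure-bake acid diffusion with a concentration-dependent diffusivity, integrated by an explicit flux-conservative finite-difference scheme. The relation D(c) = d0 * exp(alpha * c) and all numbers are illustrative, and the time step is chosen well inside the explicit stability limit.

        import numpy as np

        def diffuse_acid(c, d0, alpha, dt, dx, steps):
            """1D acid diffusion with D(c) = d0 * exp(alpha * c),
            zero-flux boundaries, explicit flux-conservative update."""
            c = c.copy()
            for _ in range(steps):
                D = d0 * np.exp(alpha * c)
                Dm = 0.5 * (D[1:] + D[:-1])       # face-centered diffusivity
                flux = Dm * np.diff(c) / dx       # flux between node pairs
                c[1:-1] += dt / dx * np.diff(flux)
                c[0] += dt / dx * flux[0]         # zero flux through the edges
                c[-1] -= dt / dx * flux[-1]
            return c

        # step profile of acid concentration relaxing during PEB
        profile = diffuse_acid(np.where(np.arange(200) < 100, 1.0, 0.0),
                               d0=1e-3, alpha=1.0, dt=0.01, dx=0.05, steps=1000)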

  19. Synthetic aperture radar imaging simulator for pulse envelope evaluation

    Science.gov (United States)

    Balster, Eric J.; Scarpino, Frank A.; Kordik, Andrew M.; Hill, Kerry L.

    2017-10-01

    A simulator for spotlight synthetic aperture radar (SAR) image formation is presented. The simulator produces radar returns from a virtual radar positioned at an arbitrary distance and altitude. The radar returns are produced from a source image, where the return is a weighted summation of linear frequency-modulated (LFM) pulse signals delayed by the distance of each pixel in the image to the radar. The imagery is resampled into polar format to ensure consistent range profiles to the position of the radar. The SAR simulator provides a capability enabling the objective analysis of formed SAR imagery, comparing it to an original source image. This capability allows for analysis of various SAR signal processing techniques previously assessed by impulse response function (IPF) analysis. The results suggest that IPF analysis provides results that may not be directly related to formed SAR image quality. Instead, the SAR simulator uses image quality metrics, such as peak signal-to-noise ratio (PSNR) and the structural similarity index (SSIM), for formed SAR image quality analysis. To showcase the capability of the SAR simulator, it is used to investigate the performance of various envelopes applied to LFM pulses. A power-raised cosine window with a power p=0.35 and roll-off factor of β=0.15 is shown to maximize the quality of the formed SAR images, improving PSNR by 0.84 dB and SSIM by 0.06, on average, relative to images formed using a rectangular pulse.
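
    The pulse-shaping idea can be sketched as follows: generate a baseband LFM chirp and multiply it by a raised-cosine (Tukey-style) taper raised to a power p. This is one plausible reading of the paper's "power-raised cosine" window, using its quoted p=0.35 and roll-off 0.15; the sampling parameters are invented.

        import numpy as np

        def raised_cosine_window(n, beta, p):
            """Flat-topped taper with cosine edges of width beta*n/2,
            raised elementwise to the power p."""
            w = np.ones(n)
            edge = int(beta * n / 2)
            if edge > 0:
                ramp = 0.5 * (1 - np.cos(np.pi * np.arange(edge) / edge))
                w[:edge], w[-edge:] = ramp, ramp[::-1]
            return w ** p

        fs, T, B = 2e9, 10e-6, 500e6       # sample rate, duration, bandwidth
        t = np.arange(int(fs * T)) / fs
        lfm = np.exp(1j * np.pi * (B / T) * (t - T / 2) ** 2)   # baseband chirp
        shaped = raised_cosine_window(len(t), beta=0.15, p=0.35) * lfm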

  20. Simulation of Langmuir-Blodgett film surface STM images

    International Nuclear Information System (INIS)

    Agabekov, V.E.; Zhavnerko, G.K.; Bar, G.; Cantov, H.J.

    1998-01-01

    The simulation of the STM image of the hydrocarbon tail of a fatty acid was carried out and compared with experimental results. The simulation procedure includes calculation of the electronic density distribution of an isolated molecule by the extended Hückel-Hoffmann method. Agreement was found between the calculated and experimental STM images of a closely packed Langmuir-Blodgett film of cobalt behenate on a graphite surface; the adsorbed molecules constituting bi- and multilayer Langmuir-Blodgett films can be neglected in simulations of STM images. (author)

  1. Application of image simulation in weapon system development

    CSIR Research Space (South Africa)

    Willers, CJ

    2007-09-01

    ...weapon systems, including sensor development, image processing algorithm development and knowledge management. The simulator environment has proved invaluable in the development and evaluation of complex optronics weapon systems...

  2. An investigation into the use of a mixture model for simulating the electrical properties of soil with varying effective saturation levels for sub-soil imaging using ECT

    International Nuclear Information System (INIS)

    Hayes, R R; Newill, P A; Podd, F J W; York, T A; Grieve, B D; Dorn, O

    2010-01-01

    A new visualisation tool is being developed for seed breeders, providing on-line data for each individual plant in a screening programme. It will be used to indicate how efficiently each plant utilises the water and nutrients available in the surrounding soil. This will facilitate early detection of desirable genetic traits with the aim of increased efficiency in identification and delivery of tomorrow's drought tolerant food crops. Visualisation takes the form of Electrical Capacitance Tomography (ECT), a non-destructive and non-intrusive imaging technique. Measurements are to be obtained for an individual plant thus allowing water and nutrient absorption levels for an individual specimen to be inferred. This paper presents the inverse problem, discusses the inherent challenges and presents the early experimental results. Two mixture models are evaluated for the prediction of electrical capacitance measurement data for varying effective soil saturation levels using a finite element model implemented in COMSOL Multiphysics. These early studies have given the research team an understanding of the technical challenges that must now be addressed to take the current research into the world of agri-science and food supply.

  3. MEGACELL: A nanocrystal model construction software for HRTEM multislice simulation

    International Nuclear Information System (INIS)

    Stroppa, Daniel G.; Righetto, Ricardo D.; Montoro, Luciano A.; Ramirez, Antonio J.

    2011-01-01

    Image simulation has an invaluable importance for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to its non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process including the simulation of a nanocrystal model and its comparison with experimental images. However, most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone-axis orientations. This paper presents an open-source citation-ware tool named MEGACELL, which was developed to assist in the construction of nanocrystal models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve their atomic positions either as a plain text file or as an output compatible with the EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool's features, some construction examples and its application to scientific studies are presented. These studies show MEGACELL to be a handy tool, which allows easier construction of complex nanocrystal models and improves quantitative information extraction from HRTEM images. Highlights: a software tool to support HRTEM image simulation of nanocrystals at actual size; MEGACELL allows the construction of complex nanocrystal models for multislice image simulation; examples of improved nanocrystalline system characterization are presented, including the analysis of 3D morphology and growth behavior.

  4. Modeling of Imaging Systems in MATLAB

    Directory of Open Access Journals (Sweden)

    J. Hozman

    2003-12-01

    For many applications in image processing it is necessary to know the model of the imaging system that was used to obtain the image data. Knowledge of the system can be used for the simulation of image data in astronomy (telescope and CCD camera, for example in the near-IR band), or we can treat a compression algorithm as any other imaging system and perform objective image quality measurement.

  5. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  6. Crowd Human Behavior for Modeling and Simulation

    Science.gov (United States)

    2009-08-06

    Elizabeth Mezzacappa, Ph.D., and Gordon Cooke, MEME, Target Behavioral Response Laboratory, ARDEC. Conference presentation (2008-2009) addressing "understanding human behavior" and "model validation and verification", focusing on the modeling and simulation of crowds from a social scientist's perspective.

  7. Three-dimensional medical images and its application for surgical simulation of plastic and reconstructive surgery

    International Nuclear Information System (INIS)

    Kaneko, Tsuyoshi; Kobayashi, Masahiro; Nakajima, Hideo; Fujino, Toyomi

    1992-01-01

    The authors' three surgical simulation systems are presented. First, a computer graphics surgical simulation system has been developed which builds a three-dimensional skull image from CT scans and makes possible arbitrary osteotomies, mobilization of bone segments, and prediction of post-operative appearance. The second system is solid modeling of the skull using laser-curable resin; it is concluded that the life-sized skull model is useful not only for surgical simulation of major craniofacial surgery but also for educational purposes. The third is solid modeling of the ear using non-contact 3D shape measurement with a slit laser scanner. A mirror-image, life-sized wax model is made from the normal side of the ear and an autologous cartilage framework is assembled to simulate the wax model, thus making precise three-dimensional reconstruction of the auricle possible. (author)

  8. Characterization of Compton-scatter imaging with an analytical simulation method

    Science.gov (United States)

    Jones, Kevin C.; Redler, Gage; Templeton, Alistair; Bernard, Damian; Turian, Julius V.; Chu, James C. H.

    2018-01-01

    By collimating the photons scattered when a megavoltage therapy beam interacts with the patient, a Compton-scatter image may be formed without the delivery of an extra dose. To characterize and assess the potential of the technique, an analytical model for simulating scatter images was developed and validated against Monte Carlo (MC). For three phantoms, the scatter images collected during irradiation with a 6 MV flattening-filter-free therapy beam were simulated. Images, profiles, and spectra were compared for different phantoms and different irradiation angles. The proposed analytical method simulates accurate scatter images up to 1000 times faster than MC. Minor differences between MC and analytical simulated images are attributed to limitations in the isotropic superposition/convolution algorithm used to analytically model multiple-order scattering. For a detector placed at 90° relative to the treatment beam, the simulated scattered photon energy spectrum peaks at 140-220 keV, and 40-50% of the photons are the result of multiple scattering. The high energy photons originate at the beam entrance. Increasing the angle between source and detector increases the average energy of the collected photons and decreases the relative contribution of multiple scattered photons. Multiple scattered photons cause blurring in the image. For an ideal 5 mm diameter pinhole collimator placed 18.5 cm from the isocenter, 10 cGy of deposited dose (2 Hz imaging rate for 1200 MU min-1 treatment delivery) is expected to generate an average 1000 photons per mm2 at the detector. For the considered lung tumor CT phantom, the contrast is high enough to clearly identify the lung tumor in the scatter image. Increasing the treatment beam size perpendicular to the detector plane decreases the contrast, although the scatter subject contrast is expected to be greater than the megavoltage transmission image contrast. With the analytical method, real-time tumor tracking may be possible

  9. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  10. Simulation Model for DMEK Donor Preparation.

    Science.gov (United States)

    Mittal, Vikas; Mittal, Ruchi; Singh, Swati; Narang, Purvasha; Sridhar, Priti

    2018-04-09

    To demonstrate a simulation model for donor preparation in Descemet membrane endothelial keratoplasty (DMEK). The inner transparent membrane of the onion (Allium cepa) was used as a simulation model for human Descemet membrane (DM). Surgical video (see Video, Supplemental Digital Content 1, http://links.lww.com/ICO/A663) demonstrating all the steps was recorded. This model closely simulates human DM and helps DMEK surgeons learn the nuances of DM donor preparation steps with ease. The technique is repeatable, and the model is cost-effective. The described simulation model can assist surgeons and eye bank technicians to learn steps in donor preparation in DMEK.

  11. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  12. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  13. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction trades accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any inter-processor communication.

  14. Modelling, simulation and visualisation for electromagnetic non-destructive testing

    International Nuclear Information System (INIS)

    Ilham Mukriz Zainal Abidin; Abdul Razak Hamzah

    2010-01-01

    This paper reviews the state of the art and recent developments in modelling, simulation and visualisation for the eddy current Non-Destructive Testing (NDT) technique. Simulation and visualisation have aided the design and development of electromagnetic sensors, imaging techniques and systems for Electromagnetic Non-Destructive Testing (ENDT), as well as feature extraction and inverse problems for Quantitative Non-Destructive Testing (QNDT). After reviewing the state of the art of electromagnetic modelling and simulation, case studies of research and development in the eddy current NDT technique via magnetic field mapping and thermography for eddy current distribution are discussed. (author)

  15. Simulation and Modeling Application in Agricultural Mechanization

    Directory of Open Access Journals (Sweden)

    R. M. Hudzari

    2012-01-01

    Full Text Available This experiment was conducted to determine the equations relating the Hue digital values of the oil palm fruit surface to the maturity stage of the fruit in the plantation. The FFB images were zoomed and captured using a Nikon digital camera, and the Hue was calculated from the highest-frequency values of the R, G, and B color components obtained with histogram analysis software. A new procedure for monitoring the image pixel values of the oil palm fruit surface color during real-time growth to maturity was developed. The estimated harvesting day was calculated from the developed model relating Hue values to mesocarp oil content. The simulation model is regressed and predicts the day of harvesting, or the number of days before harvest, of the FFB. The results of the mesocarp oil content experiments can be used for real-time oil content determination with the MPOB color meter. The graph for determining the day of harvesting the FFB is presented in this research. The oil was found to start developing in the mesocarp at 65 days before the ripe maturity stage, at which the oil reaches 75% of dry mesocarp.
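
    A minimal sketch of the Hue computation described above, assuming 8-bit RGB input and taking the per-channel histogram modes as in the record; the function name and test values are hypothetical.

```python
import colorsys
import numpy as np

def dominant_hue(image_rgb):
    """Hue [degrees] from the per-channel histogram modes, mirroring the
    record's use of the highest-frequency R, G, B values."""
    modes = []
    for c in range(3):
        hist, edges = np.histogram(image_rgb[..., c], bins=256, range=(0, 256))
        modes.append(edges[np.argmax(hist)])
    r, g, b = [m / 255.0 for m in modes]
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return 360.0 * h

# Example: a synthetic orange-ish fruit patch
img = np.full((64, 64, 3), (200, 120, 30), dtype=np.uint8)
print(dominant_hue(img))  # roughly 30 degrees (orange)
```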

  16. Hybrid statistics-simulations based method for atom-counting from ADF STEM images

    Energy Technology Data Exchange (ETDEWEB)

    De wael, Annelies, E-mail: annelies.dewael@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); De Backer, Annick [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Jones, Lewys; Nellist, Peter D. [Department of Materials, University of Oxford, Parks Road, OX1 3PH Oxford (United Kingdom); Van Aert, Sandra, E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2017-06-15

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. - Highlights: • A hybrid method for atom-counting from ADF STEM images is introduced. • Image simulations are incorporated into a statistical framework in a reliable manner. • Limits of the existing methods for atom-counting are far exceeded. • Reliable counting results from an experimental low dose image are obtained. • Progress towards reliable quantitative analysis of beam-sensitive materials is made.
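
    The statistical backbone of such atom-counting is a mixture-model fit to the scattered intensities of the atomic columns. A rough scikit-learn sketch follows; it selects the model order with BIC (the published statistics-based method uses the ICL criterion) and all data are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Scattering cross-sections per atomic column (synthetic values); in the
# statistics-based approach each mixture component corresponds to one
# column thickness, i.e. one atom count.
rng = np.random.default_rng(0)
cs = np.concatenate([rng.normal(mu, 0.02, 80) for mu in (0.3, 0.5, 0.7)])
X = cs.reshape(-1, 1)

# Order selection by an information criterion (BIC here as a stand-in
# for ICL, which scikit-learn does not provide).
fits = [GaussianMixture(n, random_state=0).fit(X) for n in range(1, 8)]
best = min(fits, key=lambda g: g.bic(X))
counts = best.predict(X)  # component label ~ estimated atom-count class
print(best.n_components, np.bincount(counts))
```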

  17. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation.

    Science.gov (United States)

    Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A

    2012-01-01

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images in voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of (60)Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for axial, coronal and sagittal projections, respectively. Copyright © 2011 Elsevier Ltd. All rights reserved.
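
    As a hedged illustration of the image-to-voxel step such a methodology involves (the Scan2MCNP pipeline and the compression scheme are not reproduced), the sketch below maps one CT slice to coarse material indices with pydicom; the HU thresholds are illustrative only.

```python
import numpy as np
import pydicom

def ct_slice_to_materials(path):
    """Map a CT DICOM slice to coarse material indices, a sketch of the
    image-to-voxel-phantom step preceding MCNP input generation.
    Assumes the rescale tags are present in the file."""
    ds = pydicom.dcmread(path)
    hu = ds.pixel_array * ds.RescaleSlope + ds.RescaleIntercept
    materials = np.zeros(hu.shape, dtype=np.uint8)  # 0 = air
    materials[hu > -400] = 1                        # soft tissue (assumed cut)
    materials[hu > 300] = 2                         # bone (assumed cut)
    return materials
```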

  18. Simulations of multi-contrast x-ray imaging using near-field speckles

    Energy Technology Data Exchange (ETDEWEB)

    Zdora, Marie-Christine [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom and Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Thibault, Pierre [Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Herzen, Julia; Pfeiffer, Franz [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Zanette, Irene [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE (United Kingdom); Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany)

    2016-01-28

    X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes limitations inherent in conventional absorption x-ray imaging, i.e., poor contrast for features with similar density. Speckle-based imaging yields a wealth of information with a simple setup tolerant to polychromatic and divergent beams, and simple data acquisition and analysis procedures. Here, we present a simulation software used to model the image formation with the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help to better understand and optimise the technique itself.
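
    Simulators of this kind are built on free-space wave propagation of a field perturbed by a diffuser and a sample. A minimal angular-spectrum propagation sketch, with an assumed random-phase diffuser, wavelength and geometry (not the authors' implementation):

```python
import numpy as np

def angular_spectrum(u0, wavelength, dx, z):
    """Propagate a complex field u0 a distance z in free space with the
    angular-spectrum method, the standard kernel for wave-optics simulation."""
    n = u0.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz = np.sqrt((k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2).astype(complex))
    return np.fft.ifft2(np.fft.fft2(u0) * np.exp(1j * kz * z))

# A random-phase screen as a crude diffuser generating near-field speckle
rng = np.random.default_rng(1)
u0 = np.exp(1j * 2 * np.pi * rng.random((512, 512)))
intensity = np.abs(angular_spectrum(u0, 0.5e-10, 1e-6, 0.05))**2  # ~25 keV x-rays
```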

  19. Synthetic SAR Image Generation using Sensor, Terrain and Target Models

    DEFF Research Database (Denmark)

    Kusk, Anders; Abulaitijiang, Adili; Dall, Jørgen

    2016-01-01

    A tool to generate synthetic SAR images of objects set on a clutter background is described. The purpose is to generate images for training Automatic Target Recognition and Identification algorithms. The tool employs a commercial electromagnetic simulation program to calculate radar cross sections of the object using a CAD model. The raw measurements are input to a SAR system and terrain model, which models thermal noise, terrain clutter, and SAR focusing to produce synthetic SAR images. Examples of SAR images at 0.3 m and 0.1 m resolution, and a comparison with real SAR imagery from the MSTAR dataset, are presented.

  20. Improved Hyperthermia Treatment Control using SAR/Temperature Simulation and PRFS Magnetic Resonance Thermal Imaging

    Science.gov (United States)

    Li, Zhen; Vogel, Martin; Maccarini, Paolo F.; Stakhursky, Vadim; Soher, Brian J.; Craciunescu, Oana I.; Das, Shiva; Arabe, Omar A.; Joines, Williams T.; Stauffer, Paul R.

    2010-01-01

    Purpose This article explores the feasibility of using coupled electromagnetic and thermodynamic simulations to improve planning and control of hyperthermia treatments for cancer. The study investigates the usefulness of preplanning to improve heat localization in tumor targets in treatments monitored with PRFS-based Magnetic Resonance Thermal Imaging (MRTI). Methods Heating capabilities of a cylindrical radiofrequency (RF) mini-annular phased array (MAPA) applicator were investigated with electromagnetic and thermal simulations of SAR in homogeneous phantom models and two human leg sarcomas. HFSS (Ansoft Corp) was used for electromagnetic simulations and SAR patterns were coupled into EPhysics (Ansoft Corp) for thermal modeling with temperature-dependent variable perfusion. Simulations were accelerated by integrating tumor specific anatomy into a pre-gridded whole body tissue model. To validate this treatment planning approach, simulations were compared with MR thermal images in both homogeneous phantoms and heterogeneous tumors. Results SAR simulations demonstrated excellent agreement with temperature rise distributions obtained with MR thermal imaging in homogeneous phantoms, and clinical treatments of large soft-tissue sarcomas. The results demonstrate feasibility of preplanning appropriate relative phases of antennas for localizing heat in tumor. Conclusions Advances in the accuracy of computer simulation and non-invasive thermometry via MR thermal imaging have provided powerful new tools for optimization of clinical hyperthermia treatments. Simulations agree well with MR thermal images in both homogeneous tissue models and patients with lower leg tumors. This work demonstrates that better quality hyperthermia treatments should be possible when simplified hybrid model simulations are performed routinely as part of the clinical pretreatment plan. PMID:21070140
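
    On the thermal side, such treatment planning typically integrates the Pennes bioheat equation with the simulated SAR as source term. A generic explicit finite-difference step, with textbook soft-tissue parameters and periodic boundaries for brevity (none of it specific to the study):

```python
import numpy as np

def bioheat_step(T, sar, dt, dx, k=0.5, rho=1050.0, c=3600.0,
                 w=0.5e-3, c_b=3600.0, T_a=37.0):
    """One explicit step of the Pennes bioheat equation on a 2D grid:
    rho*c*dT/dt = k*lap(T) - w*rho_b*c_b*(T - T_a) + rho*SAR.
    Generic soft-tissue parameters; np.roll gives periodic boundaries."""
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    dTdt = (k * lap - w * 1050.0 * c_b * (T - T_a) + rho * sar) / (rho * c)
    return T + dt * dTdt  # 1050.0 = assumed blood density [kg/m^3]

T = np.full((64, 64), 37.0)
sar = np.zeros_like(T); sar[28:36, 28:36] = 50.0   # focal SAR spot [W/kg]
for _ in range(600):                               # 60 s of heating
    T = bioheat_step(T, sar, dt=0.1, dx=0.005)
```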

  1. Electron Holography Image Simulation of Nanoparticles

    NARCIS (Netherlands)

    Keimpema, K.; Raedt, H. De; Hosson, J.Th.M. De

    We discuss a real-space and a Fourier-space technique to compute numerically the phase images observed by electron holography of nanoscale particles. An assessment of the applicability and accuracy of these techniques is made by calculating numerical results for simple geometries for which…

  2. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  3. Monte Carlo modeling of human tooth optical coherence tomography imaging

    International Nuclear Information System (INIS)

    Shi, Boya; Meng, Zhuo; Wang, Longzhi; Liu, Tiegen

    2013-01-01

    We present a Monte Carlo model for optical coherence tomography (OCT) imaging of human tooth. The model is implemented by combining the simulation of a Gaussian beam with simulation for photon propagation in a two-layer human tooth model with non-parallel surfaces through a Monte Carlo method. The geometry and the optical parameters of the human tooth model are chosen on the basis of the experimental OCT images. The results show that the simulated OCT images are qualitatively consistent with the experimental ones. Using the model, we demonstrate the following: firstly, two types of photons contribute to the information of morphological features and noise in the OCT image of a human tooth, respectively. Secondly, the critical imaging depth of the tooth model is obtained, and it is found to decrease significantly with increasing mineral loss, simulated as different enamel scattering coefficients. Finally, the best focus position is located below and close to the dental surface by analysis of the effect of focus positions on the OCT signal and critical imaging depth. We anticipate that this modeling will become a powerful and accurate tool for a preliminary numerical study of the OCT technique on diseases of dental hard tissue in human teeth. (paper)
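
    The core of a tissue Monte Carlo model of this kind is the photon random walk: exponentially distributed free paths and Henyey-Greenstein scattering. A minimal sketch, with assumed enamel-like optical parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

def hg_cos_theta(g):
    """Sample the scattering-angle cosine from the Henyey-Greenstein
    phase function, the usual choice in tissue Monte Carlo models."""
    if g == 0:
        return 2 * rng.random() - 1
    s = (1 - g * g) / (1 - g + 2 * g * rng.random())
    return (1 + g * g - s * s) / (2 * g)

def free_path(mu_t):
    """Sample a free path length [mm] for total attenuation mu_t [1/mm]."""
    return -np.log(rng.random()) / mu_t

# Illustrative enamel-like parameters (assumed, not the paper's values)
mu_s, mu_a, g = 3.0, 0.1, 0.96       # scattering, absorption [1/mm], anisotropy
print(free_path(mu_s + mu_a), hg_cos_theta(g))
```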

  4. Monte Carlo modeling of human tooth optical coherence tomography imaging

    Science.gov (United States)

    Shi, Boya; Meng, Zhuo; Wang, Longzhi; Liu, Tiegen

    2013-07-01

    We present a Monte Carlo model for optical coherence tomography (OCT) imaging of human tooth. The model is implemented by combining the simulation of a Gaussian beam with simulation for photon propagation in a two-layer human tooth model with non-parallel surfaces through a Monte Carlo method. The geometry and the optical parameters of the human tooth model are chosen on the basis of the experimental OCT images. The results show that the simulated OCT images are qualitatively consistent with the experimental ones. Using the model, we demonstrate the following: firstly, two types of photons contribute to the information of morphological features and noise in the OCT image of a human tooth, respectively. Secondly, the critical imaging depth of the tooth model is obtained, and it is found to decrease significantly with increasing mineral loss, simulated as different enamel scattering coefficients. Finally, the best focus position is located below and close to the dental surface by analysis of the effect of focus positions on the OCT signal and critical imaging depth. We anticipate that this modeling will become a powerful and accurate tool for a preliminary numerical study of the OCT technique on diseases of dental hard tissue in human teeth.

  5. Comparison of image deconvolution algorithms on simulated and laboratory infrared images

    Energy Technology Data Exchange (ETDEWEB)

    Proctor, D. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    We compare Maximum Likelihood, Maximum Entropy, Accelerated Lucy-Richardson, Weighted Goodness of Fit, and Pixon reconstructions of simple scenes as a function of signal-to-noise ratio for simulated images with randomly generated noise. Reconstruction results of infrared images taken with the TAISIR (Temperature and Imaging System InfraRed) are also discussed.

  6. Simulation of an image network in a medical image information system

    International Nuclear Information System (INIS)

    Massar, A.D.A.; De Valk, J.P.J.; Reijns, G.L.; Bakker, A.R.

    1985-01-01

    The desirability of an integrated (digital) communication system for medical images is widely accepted. In the USA and in Europe several experimental projects are in progress to realize (a part of) such a system, among them the IMAGIS project in the Netherlands. From the conclusions of the preliminary studies performed, some requirements can be formulated that such a system should meet in order to be accepted by its users. For example, the storage resolution of the images should match the maximum resolution of the presently acquired digital images; this determines the amount of data and therefore the storage requirements. Further, the desired images should be there when needed; this time constraint determines the speed requirements to be imposed on the system. As compared to current standards, very large storage capacities and very fast communication media are needed to meet these requirements. By employing caching techniques and suitable data compression schemes for the storage, and by carefully choosing the network protocols, the raw capacity demands can be alleviated. A communication network is needed to make the imaging system available over a larger area. As the network is very likely to become a major bottleneck for system performance, the effects of varying its attributes have to be carefully studied and analysed. After interesting, although preliminary, results had been obtained using a simulation model for a layered storage structure, it was decided to apply simulation to this problem as well. Effects of network topology, access protocols and buffering strategies will be tested. Changes in performance resulting from changes in various network parameters will be studied. Results of this study at its present state are presented

  7. Global options for biofuels from plantations according to IMAGE simulations

    International Nuclear Information System (INIS)

    Battjes, J.J.

    1994-07-01

    In this report the contribution of biofuels to the renewable energy supply and the transition towards it are discussed for the energy crops miscanthus, eucalyptus, poplar, wheat and sugar cane. Bio-electricity appears to be the most suitable option regarding energetic and financial aspects and in terms of avoided CO2 emissions. The IMAGE 2.0 model is a multi-disciplinary, integrated model designed to simulate the dynamics of the global society-biosphere-climate system, and is mainly used here for making more realistic estimates. Dynamic calculations are performed to the year 2100. An IMAGE 2.0-based 'Conventional Wisdom' scenario simulates, among other things, future energy demand and supply, future food production, future land cover patterns and future greenhouse gas emissions. Two biofuel scenarios are described in this report. The first consists of growing energy crops on set-asides. According to the 'Conventional Wisdom' scenario, Canada, the U.S. and Europe, and to a lesser extent Latin America, will experience set-asides due to a declining demand for agricultural area. The second biofuel scenario consists of growing energy crops on set-asides and on 10% of the agricultural area in the developing countries. Growing energy crops on all of the areas listed above leads to an energy production that amounts to about 12% of total non-renewable energy use in 2050, according to the 'Conventional Wisdom' scenario. Furthermore, energy-related CO2 emissions are reduced by about 15% in 2050, compared to the 'Conventional Wisdom' scenario. Financial aspects will have great influence on the success of growing energy crops. However, energy generated from biomass derived from plantations is currently more expensive than energy generated from traditional fuels. Levying taxes on CO2 emissions and giving subsidies to biofuels will reduce the cost price difference between fossil fuels and biofuels

  8. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeling…

  9. An Object-Oriented Simulator for 3D Digital Breast Tomosynthesis Imaging System

    Directory of Open Access Journals (Sweden)

    Saeed Seyyedi

    2013-01-01

    Full Text Available Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast to detect breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct the 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for a 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users to select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods including the algebraic reconstruction technique (ART) and total variation regularized reconstruction techniques (ART+TV) are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating the performances of the methods using mean structural similarity (MSSIM) values.

  10. An object-oriented simulator for 3D digital breast tomosynthesis imaging system.

    Science.gov (United States)

    Seyyedi, Saeed; Cengiz, Kubra; Kamasak, Mustafa; Yildirim, Isa

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast to detect breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct the 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for a 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users to select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods including the algebraic reconstruction technique (ART) and total variation regularized reconstruction techniques (ART+TV) are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating the performances of the methods using mean structural similarity (MSSIM) values.
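
    Of the reconstruction methods such a simulator implements, ART is the simplest to sketch: cyclic Kaczmarz projections onto the ray equations. A toy NumPy version follows (the ART+TV variant would interleave a total-variation denoising step, not shown):

```python
import numpy as np

def art(A, b, n_iter=10, relax=0.5, x0=None):
    """Algebraic reconstruction technique (Kaczmarz sweeps):
    x <- x + relax * (b_i - a_i.x) / ||a_i||^2 * a_i, one ray at a time."""
    x = np.zeros(A.shape[1]) if x0 is None else x0.copy()
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norms[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Tiny 2x2 "image" as a flat vector, three rays (rows are ray-path weights)
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0]])
x_true = np.array([1.0, 0.5, 0.25, 0.75])
print(art(A, A @ x_true, n_iter=50))  # a solution consistent with the ray sums
```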

  11. Biofidelic Human Activity Modeling and Simulation with Large Variability

    Science.gov (United States)

    2014-11-25

    Efforts were made to ensure that the activity models provide an exact match or a close representation and can be integrated into widely used game engines and image generators. Human modeling and simulation (M&S) has been increasingly used in simulation-based training and virtual reality (VR); however, human M&S technology currently used in various…

  12. Modeling amplitude SAR image with the Cauchy-Rayleigh mixture

    Science.gov (United States)

    Peng, Qiangqiang; Du, Qingyu; Yao, Yinwei; Huang, Huang

    2017-10-01

    In this paper, we introduce a novel mixture model for the SAR amplitude image, which is proposed as an approximation to the heavy-tailed Rayleigh model. The limitation of the heavy-tailed Rayleigh model in SAR image applications is discussed. We also present an expectation-maximization (EM) based parameter estimation method for the Cauchy-Rayleigh mixture. We test the new model on simulated data in order to confirm that it is an appropriate approximation to the heavy-tailed Rayleigh model, evaluating the fit with statistics such as the cumulative square error (CSE) and the Kolmogorov-Smirnov (K-S) distance. The performance of the proposed mixture model is then tested on real SAR images and compared with other models, including the heavy-tailed Rayleigh and Nakagami mixture models. The results indicate that the proposed model can be an alternative statistical model for amplitude SAR images.
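
    The EM estimation machinery can be illustrated on a simpler pure-Rayleigh mixture; the paper's Cauchy-Rayleigh M-step differs, so this is only the general pattern, run on synthetic amplitudes:

```python
import numpy as np

def rayleigh_pdf(x, s2):
    return x / s2 * np.exp(-x**2 / (2 * s2))

def em_rayleigh_mixture(x, K=2, n_iter=100):
    """EM for a K-component Rayleigh mixture -- a simplified stand-in for
    the paper's Cauchy-Rayleigh mixture, showing the estimation pattern."""
    s2 = np.quantile(x, np.linspace(0.2, 0.8, K))**2 / 2  # rough init
    w = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        resp = np.stack([wk * rayleigh_pdf(x, sk) for wk, sk in zip(w, s2)])
        resp /= resp.sum(axis=0, keepdims=True)           # E-step
        nk = resp.sum(axis=1)
        s2 = (resp * x**2).sum(axis=1) / (2 * nk)         # M-step (Rayleigh MLE)
        w = nk / x.size
    return w, np.sqrt(s2)

rng = np.random.default_rng(3)
x = np.concatenate([rng.rayleigh(1.0, 4000), rng.rayleigh(3.0, 2000)])
print(em_rayleigh_mixture(x))
```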

  13. System simulation of a 0.2THz imaging radar

    Science.gov (United States)

    Zhu, Li; Deng, Chao; Zhang, Cun-lin; Zhao, Yue-jin

    2009-07-01

    Unlike traditional THz imaging systems, we report a design for a 0.2 THz stepped-frequency radar system and prove its feasibility by simulation. The stepped-frequency radar, working from 200 GHz to 210 GHz, can provide centimeter accuracy. To demonstrate the feasibility of our design, we simulate the system using Advanced Design System (ADS) and Simulink in Matlab. The transmitter chain is simulated in ADS, while the system-level simulation is carried out in Matlab. The transmitter simulation uses parameters from actual products, which helps ensure realistic results. In this paper, we present the methods and results of our simulation. From the results, we conclude that our design is feasible.
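
    The claimed centimeter accuracy follows directly from the 10 GHz of stepped bandwidth, since range resolution is c/2B. A quick check, with an assumed number of frequency steps for the unambiguous-range figure:

```python
C = 3e8  # speed of light [m/s]

def stepped_freq_metrics(f_start, f_stop, n_steps):
    """Range resolution and unambiguous range of a stepped-frequency
    waveform (n_steps is an assumed design parameter, not from the paper)."""
    bandwidth = f_stop - f_start
    df = bandwidth / n_steps
    return C / (2 * bandwidth), C / (2 * df)

res, r_unamb = stepped_freq_metrics(200e9, 210e9, 128)
print(f"resolution = {res * 100:.1f} cm, unambiguous range = {r_unamb:.2f} m")
# resolution = 1.5 cm, consistent with "centimeter accuracy"
```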

  14. Joint model of motion and anatomy for PET image reconstruction

    International Nuclear Information System (INIS)

    Qiao Feng; Pan Tinsu; Clark, John W. Jr.; Mawlawi, Osama

    2007-01-01

    Anatomy-based positron emission tomography (PET) image enhancement techniques have been shown to have the potential for improving PET image quality. However, these techniques assume an accurate alignment between the anatomical and the functional images, which is not always valid when imaging the chest due to respiratory motion. In this article, we present a joint model of both motion and anatomical information by integrating a motion-incorporated PET imaging system model with an anatomy-based maximum a posteriori image reconstruction algorithm. The mismatched anatomical information due to motion can thus be effectively utilized through this joint model. A computer simulation and a phantom study were conducted to assess the efficacy of the joint model, whereby motion and anatomical information were either modeled separately or combined. The reconstructed images in each case were compared to corresponding reference images obtained using a quadratic image prior based maximum a posteriori reconstruction algorithm for quantitative accuracy. Results of these studies indicated that while modeling anatomical information or motion alone improved the PET image quantitation accuracy, a larger improvement in accuracy was achieved when using the joint model. In the computer simulation study and using similar image noise levels, the improvement in quantitation accuracy compared to the reference images was 5.3% and 19.8% when using anatomical or motion information alone, respectively, and 35.5% when using the joint model. In the phantom study, these results were 5.6%, 5.8%, and 19.8%, respectively. These results suggest that motion compensation is important in order to effectively utilize anatomical information in chest imaging using PET. The joint motion-anatomy model presented in this paper provides a promising solution to this problem
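
    MAP reconstructions of this kind build on the basic MLEM update; a generic sketch without the anatomical prior or motion operators that the joint model adds on top:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum-likelihood EM update for emission tomography:
    x <- x / (A^T 1) * A^T (y / (A x))."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)  # guard against empty bins
        x *= (A.T @ ratio) / sens
    return x

# Toy 2-pixel, 3-bin system; a motion-incorporated model would replace A
# by the composition of per-gate warping operators with the static projector.
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
x_true = np.array([4.0, 1.0])
print(mlem(A, A @ x_true, n_iter=200))        # approaches [4, 1]
```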

  15. Modelling and simulation of a heat exchanger

    Science.gov (United States)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First, by spatially lumping a heat exchanger model, a good approximate model with a high system order is produced. Model reduction techniques are then applied to obtain low-order models that are suitable for dynamic analysis and control design. The simulation method is discussed to ensure valid simulation results.
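
    A spatially lumped model of the kind described replaces the distributed exchanger by a chain of well-mixed cells, one ODE per cell. A minimal parallel-flow sketch with invented parameters:

```python
import numpy as np
from scipy.integrate import solve_ivp

def exchanger(t, T, n, mdot_cp_h, mdot_cp_c, UA_cell, C_cell, Th_in, Tc_in):
    """Spatially lumped parallel-flow heat exchanger: n hot cells then
    n cold cells in the state vector. All parameters are illustrative."""
    Th, Tc = T[:n], T[n:]
    Th_up = np.concatenate(([Th_in], Th[:-1]))   # upstream hot temperatures
    Tc_up = np.concatenate(([Tc_in], Tc[:-1]))   # upstream cold temperatures
    q = UA_cell * (Th - Tc)                      # cell-wise heat transfer [W]
    dTh = (mdot_cp_h * (Th_up - Th) - q) / C_cell
    dTc = (mdot_cp_c * (Tc_up - Tc) + q) / C_cell
    return np.concatenate([dTh, dTc])

n = 20
T0 = np.full(2 * n, 20.0)
sol = solve_ivp(exchanger, (0, 300), T0,
                args=(n, 500.0, 400.0, 100.0, 5e4, 90.0, 20.0))
print(sol.y[:n, -1][-1])  # hot-side outlet temperature after 300 s
```

    The high-order lumped model is exactly this kind of cell chain with large n; the reduction step then seeks a low-order state-space approximation of it.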

  16. Model based image restoration for underwater images

    Science.gov (United States)

    Stephan, Thomas; Frühberger, Peter; Werling, Stefan; Heizmann, Michael

    2013-04-01

    The inspection of offshore parks, dam walls and other infrastructure under water is expensive and time consuming, because such constructions must be inspected manually by divers. Underwater buildings have to be examined visually to find small cracks, spalling or other deficiencies. Automation of underwater inspection depends on established waterproof imaging systems. Most underwater imaging systems are based on acoustic sensors (sonar). The disadvantage of such acoustic systems is the loss of the complete visual impression: all information embedded in texture and surface reflectance gets lost. Therefore acoustic sensors are mostly insufficient for these kinds of visual inspection tasks. Imaging systems based on optical sensors offer enormous potential for underwater applications. The applications of visual imaging systems reach from inspection of underwater buildings via marine biological applications through to exploration of the seafloor. The reason for the lack of established optical systems for underwater inspection tasks lies in the technical difficulties of underwater image acquisition and processing. Poor lighting and highly degraded images make computational postprocessing absolutely essential.
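
    One common simplified underwater image-formation model combines exponential attenuation with backscattered veiling light; inverting it is a basic model-based restoration step. The sketch below uses invented per-channel coefficients (this is not the authors' method):

```python
import numpy as np

def restore_underwater(I, beta, B, d):
    """Invert the simple image-formation model
        I = J * exp(-beta * d) + B * (1 - exp(-beta * d)),
    where J is the unattenuated scene radiance, beta a per-channel
    attenuation coefficient, B the backscatter (veiling) light and
    d the range in metres."""
    t = np.exp(-beta * d)                 # per-channel transmission
    return np.clip((I - B * (1 - t)) / t, 0.0, 1.0)

I = np.random.default_rng(4).random((120, 160, 3)) * 0.4 + 0.3
beta = np.array([0.40, 0.12, 0.08])       # red attenuates fastest (assumed)
J = restore_underwater(I, beta, B=np.array([0.2, 0.5, 0.6]), d=3.0)
```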

  17. Millimeter waves sensor modeling and simulation

    Science.gov (United States)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, also using sensors. One important class of sensors is millimeter wave radar systems, which are very efficient for seeing through the atmosphere and/or foliage, for example. This type of high-frequency radar can produce high-quality images with very tricky features such as dihedral and trihedral bright points, shadows and layover effects. Besides, image quality is very dependent on the carrier velocity and trajectory. Such sensor systems are so complex that they need simulation to be tested. This paper presents a state of the art of millimeter wave sensor models. A short presentation of asymptotic methods shows that physical optics support is mandatory to reach realistic results. The SE-Workbench-RF tool is presented and typical examples of results are shown, both in the frame of Synthetic Aperture Radar sensors and Real Beam Ground Mapping radars. Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU) and the tradeoff between physical accuracy and computational performance. Examples of results using SE-Workbench-RF are shown and discussed.

  18. Dual Energy Method for Breast Imaging: A Simulation Study

    Directory of Open Access Journals (Sweden)

    V. Koukou

    2015-01-01

    Full Text Available Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of mammographic units used in previous DE studies, combined with a high resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications. The contrast to noise ratio (CNRtc) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm. In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp filtered with 100 μm cadmium and 1000 μm copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging with mean glandular dose values within acceptable levels.

  19. Dual Energy Method for Breast Imaging: A Simulation Study.

    Science.gov (United States)

    Koukou, V; Martini, N; Michail, C; Sotiropoulou, P; Fountzoula, C; Kalyvas, N; Kandarakis, I; Nikiforidis, G; Fountos, G

    2015-01-01

    Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of mammographic units used in previous DE studies, combined with a high resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications. The contrast to noise ratio (CNRtc) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm. In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp filtered with 100 μm cadmium and 1000 μm copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging with mean glandular dose values within acceptable levels.
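
    The core dual-energy operation is a weighted log subtraction whose weight is chosen to cancel the adipose/glandular contrast. A toy sketch with invented transmission values, scanning the weight rather than deriving it:

```python
import numpy as np

def dual_energy_subtract(I_low, I_high, w):
    """Weighted log subtraction used in dual-energy imaging: choosing w
    to cancel the tissue contrast leaves calcifications visible."""
    return np.log(I_high) - w * np.log(I_low)

# Toy transmissions: [background tissue, calcification] pixels
I_low = np.array([0.30, 0.22])
I_high = np.array([0.55, 0.48])
for w in (0.4, 0.5, 0.6):
    de = dual_energy_subtract(I_low, I_high, w)
    print(w, de[1] - de[0])  # residual calcification-vs-background signal
```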

  20. VHDL simulation with access to transistor models

    Science.gov (United States)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  1. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Boia, L.S.; Menezes, A.F.; Cardoso, M.A.C. [Programa de Engenharia Nuclear/COPPE (Brazil); Rosa, L.A.R. da [Instituto de Radioprotecao e Dosimetria-IRD, Av. Salvador Allende, s/no Recreio dos Bandeirantes, CP 37760, CEP 22780-160 Rio de Janeiro, RJ (Brazil); Batista, D.V.S. [Instituto de Radioprotecao e Dosimetria-IRD, Av. Salvador Allende, s/no Recreio dos Bandeirantes, CP 37760, CEP 22780-160 Rio de Janeiro, RJ (Brazil); Instituto Nacional de Cancer-Secao de Fisica Medica, Praca Cruz Vermelha, 23-Centro, 20230-130 Rio de Janeiro, RJ (Brazil); Cardoso, S.C. [Departamento de Fisica Nuclear, Instituto de Fisica, Universidade Federal do Rio de Janeiro, Bloco A-Sala 307, CP 68528, CEP 21941-972 Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.br [Programa de Engenharia Nuclear/COPPE (Brazil); Departamento de Engenharia Nuclear/Escola Politecnica, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970 Rio de Janeiro, RJ (Brazil); Facure, A. [Comissao Nacional de Energia Nuclear, R. Gal. Severiano 90, sala 409, 22294-900 Rio de Janeiro, RJ (Brazil)

    2012-01-15

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images in voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of {sup 60}Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for axial, coronal and sagittal projections, respectively. - Highlights: • We use a method to optimize the CT image conversion in voxel model for MCNP simulation. • We present a methodology to compress a DICOM image before conversion to input file. • To validate this study an idealized radiosurgery applied to the Alderson phantom was used.

  2. A Visual Attention Model Based Image Fusion

    OpenAIRE

    Rishabh Gupta; M.R.Vimala Devi; M. Devi

    2013-01-01

    To develop an efficient image fusion algorithm based on a visual attention model for images with distinct objects. Image fusion is a process of combining complementary information from multiple images of the same scene into one image, so that the resultant image contains a more accurate description of the scene than any of the individual source images. The two basic fusion techniques are pixel-level and region-level fusion. Pixel-level fusion deals with the operations on each and every pixel separately…

  3. Classification images and bubbles images in the generalized linear model.

    Science.gov (United States)

    Murray, Richard F

    2012-07-09

    Classification images and bubbles images are psychophysical tools that use stimulus noise to investigate what features people use to make perceptual decisions. Previous work has shown that classification images can be estimated using the generalized linear model (GLM), and here I show that this is true for bubbles images as well. Expressing the two approaches in terms of a single statistical model clarifies their relationship to one another, makes it possible to measure classification images and bubbles images simultaneously, and allows improvements developed for one method to be used with the other.
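
    The GLM formulation can be sketched directly: simulate noisy yes/no trials driven by a hidden template and recover the classification image as the logistic-regression weights (template, trial counts and regularization are invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated yes/no experiment: responses driven by a hidden linear template
rng = np.random.default_rng(5)
n_trials, n_pix = 4000, 64
template = np.zeros(n_pix); template[24:40] = 1.0
noise = rng.normal(size=(n_trials, n_pix))          # stimulus noise fields
resp = (noise @ template + rng.normal(size=n_trials) > 0).astype(int)

# GLM estimate of the classification image = logistic-regression weights
glm = LogisticRegression(C=1.0, max_iter=2000).fit(noise, resp)
cimg = glm.coef_.ravel()                            # recovers the template shape
print(np.corrcoef(cimg, template)[0, 1])
```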

  4. Policy advice derived from simulation models

    NARCIS (Netherlands)

    Brenner, T.; Werker, C.

    2009-01-01

    When advising policy we face the fundamental problem that economic processes are connected with uncertainty and thus policy can err. In this paper we show how the use of simulation models can reduce policy errors. We suggest that policy is best based on so-called abductive simulation models, which…

  5. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

    The methods are illustrated with case studies, including work with Sandia National Laboratories and a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems.

  6. Transient Modeling and Simulation of Compact Photobioreactors

    OpenAIRE

    Ribeiro, Robert Luis Lara; Mariano, André Bellin; Souza, Jeferson Avila; Vargas, Jose Viriato Coelho

    2008-01-01

    In this paper, a mathematical model is developed to make possible the simulation of microalgae growth and its dependency on medium temperature and light intensity. The model is utilized to simulate a compact photobioreactor response in time with physicochemical parameters of the microalgae Phaeodactylum tricornutum. The model allows for the prediction of the transient and local evolution of the biomass concentration in the photobioreactor with low computational time. As a result, the model is...

  7. Relationships of virtual reality neuroendoscopic simulations to actual imaging.

    Science.gov (United States)

    Riegel, T; Alberti, O; Retsch, R; Shiratori, V; Hellwig, D; Bertalanffy, H

    2000-12-01

    Advances in computer technology have permitted virtual reality images of the ventricular system. To determine the relevance of these images we have compared virtual reality simulations of the ventricular system with endoscopic findings in three patients. The virtual fly-through can be simulated after definition of waypoints. Flight objects of interest can be viewed from all sides. Important drawbacks are that filigree structures may be missed and blood vessels cannot be distinguished clearly. However, virtual endoscopy can presently be used as a planning tool or for training and has future potential for neurosurgery.

  8. Simulation of scintillating fiber gamma ray detectors for medical imaging

    International Nuclear Information System (INIS)

    Chaney, R.C.; Fenyves, E.J.; Antich, P.P.

    1990-01-01

    This paper reports on plastic scintillating fibers, which have been shown to be effective for detecting gamma rays with high spatial and time resolution. They may be expected to significantly improve the resolution of current medical imaging systems such as PET and SPECT. Monte Carlo simulation of imaging systems using these detectors provides a means to optimize their performance in this application, as well as demonstrate their resolution and efficiency. Monte Carlo results are presented for PET and SPECT systems constructed using these detectors

  9. Simulation of Profiles Data For Computed Tomography Using Object Images

    International Nuclear Information System (INIS)

    Srisatit, Somyot

    2007-08-01

    Full text: A scanning system is necessary to obtain the profile data for computed tomographic images. Good profile data can give good contrast and resolution, but the scanning system requires high-efficiency, and therefore expensive, radiation equipment. Simulated profile data can therefore be used for demonstration purposes, giving CT image quality comparable to that of a real system

  10. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
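
    The second-order SOFI computation itself is compact: a per-pixel lagged autocumulant of the intensity fluctuations. A minimal sketch of just that step (the tool's full pipeline, higher orders and the STORM comparison are not reproduced):

```python
import numpy as np

def sofi2(stack, lag=1):
    """Second-order SOFI image: the lag-tau autocumulant of each pixel's
    intensity fluctuations, G2(r) = < dF(r,t) dF(r,t+tau) >_t."""
    dF = stack - stack.mean(axis=0, keepdims=True)  # fluctuations
    return (dF[:-lag] * dF[lag:]).mean(axis=0)

# stack: (time, y, x) movie of blinking emitters; pure shot noise is
# uncorrelated in time, so this synthetic example gives ~0 everywhere,
# while correlated blinking would survive the cumulant.
stack = np.random.default_rng(6).poisson(5.0, (500, 32, 32)).astype(float)
img = sofi2(stack)
```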

  11. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of buildings was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were…

  12. Laser bistatic two-dimensional scattering imaging simulation of lambert cone

    Science.gov (United States)

    Gong, Yanjun; Zhu, Chongyue; Wang, Mingjun; Gong, Lei

    2015-11-01

    This paper deals with laser bistatic two-dimensional scattering imaging simulation of a Lambert cone. Two-dimensional imaging is also called planar imaging; it can reflect the shape of the target and its material properties, and so has important significance for target recognition. The expression for the bistatic laser scattering intensity of a Lambert cone is obtained from the laser radar equation. The scattering intensity of a surface element on the target can then be obtained; it is related to the local angle of incidence, the local angle of scattering and the infinitesimal area of the element on the cone. From the incident direction of the laser, the scattering direction and the normal of the infinitesimal area, the local incidence and scattering angles can be calculated. Through surface integration and the introduction of the rectangular function, we obtain the intensity of each imaging unit on the imaging surface, and thus a Lambert cone bistatic laser two-dimensional scattering imaging simulation model. We analyze the effect of resolution, incident direction, observation direction and target size on the imaging. The results indicate that the bistatic laser scattering imaging simulation of the Lambert cone is correct.
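
    The facet-level computation described can be sketched directly: each surface element contributes in proportion to the product of its local incidence and scattering cosines. The geometry and angles below are invented for illustration:

```python
import numpy as np

def lambert_cone_return(s_in, s_out, half_angle, n_az=360):
    """Total bistatic Lambertian return from a cone's lateral surface,
    summed over azimuthal facets: I ~ sum max(cos ti, 0) * max(cos ts, 0).
    Facet areas are taken equal; only the angular factors matter here."""
    a = np.deg2rad(half_angle)
    phi = np.linspace(0, 2 * np.pi, n_az, endpoint=False)
    # Outward normals of the cone's lateral surface (apex up, axis +z)
    n = np.stack([np.cos(phi) * np.cos(a),
                  np.sin(phi) * np.cos(a),
                  np.full_like(phi, np.sin(a))], axis=1)
    ci = np.clip(-(n @ s_in), 0.0, None)   # local incidence cosines
    cs = np.clip(n @ s_out, 0.0, None)     # local scattering cosines
    return (ci * cs).sum() / n_az

s_in = np.array([0.0, 0.0, -1.0])                     # nose-on illumination
s_out = np.array([np.sin(0.5), 0.0, np.cos(0.5)])     # ~29 deg bistatic angle
print(lambert_cone_return(s_in, s_out, half_angle=15.0))
```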

  13. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.

  14. Early orthognathic surgery with three-dimensional image simulation during presurgical orthodontics in adults.

    Science.gov (United States)

    Kang, Sang-Hoon; Kim, Moon-Key; Park, Sun-Yeon; Lee, Ji-Yeon; Park, Wonse; Lee, Sang-Hwy

    2011-03-01

    To correct dentofacial deformities, three-dimensional skeletal analysis and computerized orthognathic surgery simulation are used to facilitate accurate diagnoses and surgical plans. Computed tomography imaging of dental occlusion can inform three-dimensional facial analyses and orthognathic surgical simulations. Furthermore, three-dimensional laser scans of a cast model of the predetermined postoperative dental occlusion can be used to increase the accuracy of the preoperative surgical simulation. In this study, we prepared cast models of planned postoperative dental occlusions from 12 patients diagnosed with skeletal class III malocclusions with mandibular prognathism and facial asymmetry that had planned to undergo bimaxillary orthognathic surgery during preoperative orthodontic treatment. The data from three-dimensional laser scans of the cast models were used in three-dimensional surgical simulations. Early orthognathic surgeries were performed based on three-dimensional image simulations using the cast images in several presurgical orthodontic states in which teeth alignment, leveling, and space closure were incomplete. After postoperative orthodontic treatments, intraoral examinations revealed that no patient had a posterior open bite or space. The two-dimensional and three-dimensional skeletal analyses showed that no mandibular deviations occurred between the immediate and final postoperative states of orthodontic treatment. These results showed that early orthognathic surgery with three-dimensional computerized simulations based on cast models of predetermined postoperative dental occlusions could provide early correction of facial deformities and improved efficacy of preoperative orthodontic treatment. This approach can reduce the decompensation treatment period of the presurgical orthodontics and contribute to efficient postoperative orthodontic treatments.

  15. 3D simulation of the image formation in soft x-ray microscopes.

    Science.gov (United States)

    Selin, Mårten; Fogelqvist, Emelie; Holmberg, Anders; Guttmann, Peter; Vogt, Ulrich; Hertz, Hans M

    2014-12-15

    In water-window soft x-ray microscopy the studied object is typically larger than the depth of focus and the sample illumination is often partially coherent. This blurs out-of-focus features and may introduce considerable fringing. Understanding the influence of these phenomena on the image formation is therefore important when interpreting experimental data. Here we present a wave-propagation model operating in 3D for simulating the image formation of thick objects in partially coherent soft x-ray microscopes. The model is compared with present simulation methods as well as with experiments. The results show that our model predicts the image formation of transmission soft x-ray microscopes more accurately than previous models.

  16. Modeling and simulation of blood collection systems.

    Science.gov (United States)

    Alfonso, Edgar; Xie, Xiaolan; Augusto, Vincent; Garraud, Olivier

    2012-03-01

    This paper addresses the modeling and simulation of blood collection systems in France for both fixed site and mobile blood collection with walk in whole blood donors and scheduled plasma and platelet donors. Petri net models are first proposed to precisely describe different blood collection processes, donor behaviors, their material/human resource requirements and relevant regulations. Petri net models are then enriched with quantitative modeling of donor arrivals, donor behaviors, activity times and resource capacity. Relevant performance indicators are defined. The resulting simulation models can be straightforwardly implemented with any simulation language. Numerical experiments are performed to show how the simulation models can be used to select, for different walk in donor arrival patterns, appropriate human resource planning and donor appointment strategies.

  17. Digital image processing: effect on detectability of simulated low-contrast radiographic patterns

    International Nuclear Information System (INIS)

    Ishida, M.; Doi, K.; Loo, L.N.; Metz, C.E.; Lehr, J.L.

    1984-01-01

    Detection studies of simulated low-contrast radiographic patterns were performed with a high-quality digital image processing system. The original images, prepared with conventional screen-film systems, were processed digitally to enhance contrast by a "windowing" technique. The detectability of simulated patterns was quantified in terms of the results of observer performance experiments by using the multiple-alternative forced-choice method. The processed images demonstrated a significant increase in observer detection performance over that for the original images. These results are related to the displayed and perceived signal-to-noise ratios derived from signal detection theory. The improvement in detectability is ascribed to a reduction in the relative magnitude of the human observer's "internal" noise after image processing. The measured dependence of threshold signal contrast on object size and noise level is accounted for by a statistical decision theory model that includes internal noise.
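
    The windowing operation studied is a linear gray-level remapping of a chosen intensity band onto the full display range. A minimal sketch with invented window parameters:

```python
import numpy as np

def window_level(img, center, width):
    """Linear 'windowing' contrast enhancement: map the intensity band
    [center - width/2, center + width/2] onto the full display range."""
    lo = center - width / 2.0
    out = (img.astype(float) - lo) / width
    return np.clip(out, 0.0, 1.0)

# Narrowing the window around a low-contrast pattern raises its displayed
# contrast, the manipulation whose effect on detectability was measured.
enhanced = window_level(np.random.default_rng(7).normal(100, 5, (256, 256)),
                        center=100, width=20)
```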

  18. Microcomputer simulation of nuclear magnetic resonance imaging contrasts

    International Nuclear Information System (INIS)

    Le Bihan, D.

    1985-01-01

    The high information content of magnetic resonance images is due to the multiplicity of its parameters. However, this advantage introduces a difficulty in the interpretation of the contrast: an image is strongly modified according to the visualised parameters. The author proposes a microcomputer simulation program. After recalling the main intrinsic and extrinsic parameters, he shows how the program works and its value as a pedagogic tool and as an aid for contrast optimisation of images as a function of the suspected pathology [fr

  19. [Modeling the eye based on simulated refractive surgery].

    Science.gov (United States)

    Lamard, M; Cochener, B

    2001-10-01

    To achieve three-dimensional modeling of the eyeball (morphology and mechanical behavior) in order to simulate the impact of various refractive surgery techniques and to study the normal and pathological states of the eye. Rebuilding the ocular shell is based on different kinds of imaging (MRI, ultrasound), including information provided by video topography. Image data are treated using suitable digital filters that allow automatic segmentation of the ocular globe edges. Reconstruction is based on specific mathematical functions (B-splines). The mechanical behavior of the reconstructed model is simulated by solving the equations of linearized elasticity with the finite element method. Numerous simulations mimicked different refractive surgical techniques and thereby validated the model. In addition, simulations of various pathologies allowed us to verify certain clinical hypotheses. This work, although still experimental, demonstrates the advantages of such simulations: it will allow novice physicians an easier approach to different surgical techniques and will help them understand their effects. Furthermore, it might be useful for simulating new surgical concepts even before their in vivo evaluation.

  20. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of a matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  1. Image based EFIT simulation for nondestructive ultrasonic testing of austenitic steel

    International Nuclear Information System (INIS)

    Nakahata, Kazuyuki; Hirose, Sohichi; Schubert, Frank; Koehler, Bernd

    2009-01-01

    The ultrasonic testing (UT) of an austenitic steel with welds is difficult due to the acoustic anisotropy and local heterogeneity. The ultrasonic wave in the austenitic steel is skewed along crystallographic directions and scattered by weld boundaries. For reliable UT, a straightforward simulation tool to predict the wave propagation is desired. Here a combined method of elastodynamic finite integration technique (EFIT) and digital image processing is developed as a wave simulation tool for UT. The EFIT is a grid-based explicit numerical method and easily treats different boundary conditions which are essential to model wave propagation in heterogeneous materials. In this study, the EFIT formulation in anisotropic and heterogeneous materials is briefly described and an example of a two dimensional simulation of a phased array UT in an austenitic steel bar is demonstrated. In our simulation, a picture of the surface of the steel bar with a V-groove weld is scanned and fed into the image based EFIT modeling. (author)
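
    EFIT itself is an elastodynamic, anisotropy-capable scheme; the staggered-grid update it generalizes is easiest to show for the simpler 2D acoustic case, where a scanned image supplies per-cell material maps. A rough sketch under those simplifications (not the authors' formulation):

        import numpy as np

        def acoustic_step(p, vx, vy, kappa, rho, dt, dx):
            """One explicit staggered-grid update for 2D acoustics.
            p: pressure (ny, nx); vx: (ny, nx-1); vy: (ny-1, nx);
            kappa, rho: material maps derived from the scanned image."""
            vx -= dt / rho[:, :-1] * (p[:, 1:] - p[:, :-1]) / dx
            vy -= dt / rho[:-1, :] * (p[1:, :] - p[:-1, :]) / dx
            div = np.zeros_like(p)
            div[:, 1:-1] = (vx[:, 1:] - vx[:, :-1]) / dx
            div[1:-1, :] += (vy[1:, :] - vy[:-1, :]) / dx
            p -= dt * kappa * div            # kappa = rho * c**2 per image pixel
            return p, vx, vy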

  2. Virtual X-ray imaging techniques in an immersive casting simulation environment

    International Nuclear Information System (INIS)

    Li, Ning; Kim, Sung-Hee; Suh, Ji-Hyun; Cho, Sang-Hyun; Choi, Jung-Gil; Kim, Myoung-Hee

    2007-01-01

    A computer code was developed to simulate radiograph of complex casting products in a CAVE™-like environment. The simulation is based on the deterministic algorithms and ray tracing techniques. The aim of this study is to examine CAD/CAE/CAM models at the design stage, to optimize the design and inspect predicted defective regions with fast speed, good accuracy and small numerical expense. The present work discusses the algorithms for the radiography simulation of CAD/CAM model and proposes algorithmic solutions adapted from ray-box intersection algorithm and octree data structure specifically for radiographic simulation of CAE model. The stereoscopic visualization of full-size of product in the immersive casting simulation environment as well as the virtual X-ray images of castings provides an effective tool for design and evaluation of foundry processes by engineers and metallurgists
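
    The ray-box intersection the authors adapt is commonly implemented with the slab method; a standard version is shown below (the octree traversal built around it is omitted):

        def ray_box_intersect(origin, direction, box_min, box_max, eps=1e-12):
            """Slab method: return (t_near, t_far) or None if the ray misses."""
            t_near, t_far = float("-inf"), float("inf")
            for o, d, lo, hi in zip(origin, direction, box_min, box_max):
                if abs(d) < eps:                 # ray parallel to this slab pair
                    if o < lo or o > hi:
                        return None
                else:
                    t1, t2 = (lo - o) / d, (hi - o) / d
                    t_near = max(t_near, min(t1, t2))
                    t_far = min(t_far, max(t1, t2))
                    if t_near > t_far:
                        return None
            return t_near, t_far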

  3. The Research of Optical Turbulence Model in Underwater Imaging System

    Directory of Open Access Journals (Sweden)

    Liying Sun

    2014-01-01

    Full Text Available In order to research the effect of turbulence on underwater imaging systems and image restoration, an underwater turbulence model is simulated by computational fluid dynamics. The model is computed for different underwater turbulence intensities and contains the pressure data that determine the refractive index distribution. When the pressure values are converted to refractive index with the refraction formula, the refractive index distribution is obtained. For a given turbulence intensity, the refractive index distribution presents a gradient across the whole region, with disorder and abrupt changes in local regions. As the turbulence intensity increases, the overall variation of the refractive index across the image becomes larger, and the refractive index changes more sharply in local regions. All of the above is illustrated by simulation results using the ray tracing method and the turbulent refractive index model. Analysis at different turbulence intensities proves that turbulence causes image distortion and increases noise.

  4. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo

    2014-01-01

    on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measures the pressure field for three imaging schemes: a fixed focus, single...
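
    For context, the mechanical index checked against the FDA limit of 1.9 is the derated peak rarefactional pressure in MPa divided by the square root of the centre frequency in MHz. A minimal check, using the usual 0.3 dB/(cm·MHz) derating; the numerical values are illustrative, not measurements from this record:

        import math

        def mechanical_index(p_neg_mpa, f_mhz, depth_cm, alpha=0.3):
            """MI = derated peak rarefactional pressure / sqrt(centre frequency)."""
            derate = 10 ** (-alpha * f_mhz * depth_cm / 20.0)  # amplitude derating
            return p_neg_mpa * derate / math.sqrt(f_mhz)

        mi = mechanical_index(p_neg_mpa=2.4, f_mhz=3.0, depth_cm=5.0)
        assert mi <= 1.9, "exceeds FDA track-3 MI limit"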

  5. Mixing characteristics of sludge simulant in a model anaerobic digester.

    Science.gov (United States)

    Low, Siew Cheng; Eshtiaghi, Nicky; Slatter, Paul; Baudez, Jean-Christophe; Parthasarathy, Rajarathinam

    2016-03-01

    This study aims to investigate the mixing characteristics of a transparent sludge simulant in a mechanically agitated model digester using flow visualisation technique. Video images of the flow patterns were obtained by recording the progress of an acid-base reaction and analysed to determine the active and inactive volumes as a function of time. The doughnut-shaped inactive region formed above and below the impeller in low concentration simulant decreases in size with time and disappears finally. The 'cavern' shaped active mixing region formed around the impeller in simulant solutions with higher concentrations increases with increasing agitation time and reaches a steady state equilibrium size, which is a function of specific power input. These results indicate that the active volume is jointly determined by simulant rheology and specific power input. A mathematical correlation is proposed to estimate the active volume as a function of simulant concentration in terms of yield Reynolds number.

  6. Texture Based Quality Analysis of Simulated Synthetic Ultrasound Images Using Local Binary Patterns †

    Directory of Open Access Journals (Sweden)

    Prerna Singh

    2017-12-01

    Full Text Available Speckle noise reduction is an important area of research in the field of ultrasound image processing. Several algorithms for speckle noise characterization and analysis have been recently proposed in the area. Synthetic ultrasound images can play a key role in noise evaluation methods as they can be used to generate a variety of speckle noise models under different interpolation and sampling schemes, and can also provide valuable ground truth data for estimating the accuracy of the chosen methods. However, not much work has been done in the area of modeling synthetic ultrasound images, and in simulating speckle noise generation to get images that are as close as possible to real ultrasound images. An important aspect of simulated synthetic ultrasound images is the requirement for extensive quality assessment for ensuring that they have the texture characteristics and gray-tone features of real images. This paper presents texture feature analysis of synthetic ultrasound images using local binary patterns (LBP and demonstrates the usefulness of a set of LBP features for image quality assessment. Experimental results presented in the paper clearly show how these features could provide an accurate quality metric that correlates very well with subjective evaluations performed by clinical experts.
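
    The LBP features behind such a quality metric can be computed directly with scikit-image; a minimal histogram extractor (the 'uniform' mapping yields P + 2 distinct codes, hence the bin count):

        import numpy as np
        from skimage.feature import local_binary_pattern

        def lbp_histogram(img, P=8, R=1.0):
            """Normalized histogram of uniform LBP codes as a texture signature."""
            codes = local_binary_pattern(img, P, R, method="uniform")
            n_bins = P + 2
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
            return hist  # compare real vs. synthetic signatures, e.g. by chi-square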

  7. A framework for simulating ultrasound imaging based on first order nonlinear pressure–velocity relations

    DEFF Research Database (Denmark)

    Du, Yigang; Fan, Rui; Li, Yong

    2016-01-01

    An ultrasound imaging framework modeled with the first order nonlinear pressure–velocity relations (NPVR) based simulation and implemented by a half-time staggered solution and pseudospectral method is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound...... for the strong nonlinearity, the RMS error for 1 MPa initial pressure amplitude is reduced from 36.6% to 27.3%....
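
    The pseudospectral ingredient of such frameworks is easy to isolate: spatial derivatives are taken in the Fourier domain. A one-dimensional sketch (the paper's solver is of course multidimensional and staggered in time):

        import numpy as np

        def spectral_derivative(f, dx):
            """First derivative of a periodic 1D field via the FFT."""
            k = 2j * np.pi * np.fft.fftfreq(f.size, d=dx)  # spectral wavenumbers
            return np.real(np.fft.ifft(k * np.fft.fft(f)))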

  8. Physically realistic modeling of maritime training simulation

    OpenAIRE

    Cieutat , Jean-Marc

    2003-01-01

    Maritime training simulation is an important part of maritime teaching and requires a broad range of scientific and technical skills. In this framework, where the real-time constraint has to be maintained, not all physical phenomena can be studied; only the most visible phenomena relating to the natural elements and the ship's behaviour are reproduced. Our swell model, based on a surface wave simulation approach, makes it possible to simulate the shape and the propagation of a regular train of waves f...
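
    A regular train of waves of the kind such a swell model reproduces can be sketched as a single Airy component; amplitude, wavelength and period below are illustrative assumptions, not values from the thesis:

        import numpy as np

        def swell_elevation(x, t, amplitude=1.2, wavelength=80.0, period=8.0):
            """Surface elevation of a regular (Airy) wave train at position x, time t."""
            k = 2 * np.pi / wavelength      # wavenumber
            omega = 2 * np.pi / period      # angular frequency
            return amplitude * np.cos(k * x - omega * t)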

  9. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  10. Evaluation of Three-Dimensional Printed Materials for Simulation by Computed Tomography and Ultrasound Imaging.

    Science.gov (United States)

    Mooney, James J; Sarwani, Nabeel; Coleman, Melissa L; Fotos, Joseph S

    2017-06-01

    The use of three-dimensional (3D) printing allows for creation of custom models for clinical care, education, and simulation. Medical imaging, given the significant role it plays in both clinical diagnostics and procedures, remains an important area for such education and simulation. Unfortunately, the materials appropriate for use in simulation involving radiographic or ultrasound imaging remain poorly understood. Therefore, our study was intended to explore the characteristics of readily available 3D printing materials when visualized by computed tomography (CT) and ultrasound. Seven 3D printing materials were examined in standard shapes (cube, cylinder, triangular prism) with a selection of printing methods ("open," "whole," and "solid" forms). For CT imaging, these objects were suspended in a gelatin matrix molded to match a standard human CT phantom. For ultrasound imaging, the objects were placed in acrylic forms filled with a gelatin matrix. All images were examined using OsiriX software. Computed tomography imaging revealed marked variation in materials' Hounsfield units as well as patterning and artifacts. The Hounsfield unit variations revealed a number of materials suitable for simulating various human tissues. Ultrasound imaging showed echogenicity in all materials, with some variability in shadowing and posterior wall visualization. We were able to demonstrate the potential utility for 3D printing in the creation of CT and ultrasound simulation models. The similar appearance of materials via ultrasound supports their broad utility for select tissue types, whereas the more variable appearance via CT suggests greater potential for simulating differing tissues but requiring multiple printer technologies to do so.
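
    The Hounsfield scale used to compare the printed materials is defined from linear attenuation coefficients; a one-line reference implementation (by construction, water maps to 0 HU and air to -1000 HU):

        def hounsfield(mu, mu_water, mu_air):
            """CT number in Hounsfield units from linear attenuation coefficients."""
            return 1000.0 * (mu - mu_water) / (mu_water - mu_air)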

  11. Methods for obtaining 3D training images for multiple-point statistics simulations: a comparative study

    Science.gov (United States)

    Jha, S. K.; Comunian, A.; Mariethoz, G.; Kelly, B. F.

    2013-12-01

    In recent years, multiple-point statistics (MPS) has been used in several studies for characterizing facies heterogeneity in geological formations. MPS uses a conceptual representation of the expected facies distribution, called a training image (TI), to generate patterns of facies heterogeneity. In two-dimensional (2D) simulations the TI can be a hand-drawn image, an analogue outcrop image, or derived from geological reconstructions using a combination of geological analogues and geophysical data. However, obtaining suitable TIs in three dimensions (3D) from geological analogues or geophysical data is harder and has limited the use of MPS for simulating facies heterogeneity in 3D. There have been attempts to generate 3D training images using object-based simulation (OBS). However, determining suitable values for the large number of parameters required by OBS is often challenging. In this study, we compare two approaches for generating three-dimensional training images to model a valley-filling sequence deposited by meandering rivers. The first approach is based on deriving statistical information from two-dimensional TIs. The 3D domain is simulated with a sequence of 2D MPS simulation steps, performed along different directions on slices of the 3D domain. At each 2D simulation step, the facies simulated at the previous steps that lie on the current 2D slice are used as conditioning data. The second approach uses hand-drawn two-dimensional TIs and produces complex patterns resembling the geological structures by applying rotation and affinity transformations in the facies simulation. The two techniques are compared using transition probabilities, facies proportions, and connectivity metrics. In the presentation we discuss the benefits of each approach for generating three-dimensional facies models.

  12. Algorithms of CT value correction for reconstructing a radiotherapy simulation image through axial CT images

    International Nuclear Information System (INIS)

    Ogino, Takashi; Egawa, Sunao

    1991-01-01

    New algorithms of CT value correction for reconstructing a radiotherapy simulation image through axial CT images were developed. One, designated the plane weighting method, corrects the CT value in proportion to the position of the beam element passing through the voxel. The other, designated the solid weighting method, corrects the CT value in proportion to the length of the beam element passing through the voxel and the volume of the voxel. Phantom experiments showed fair spatial resolution in the transverse direction. In the longitudinal direction, however, spatial resolution below the slice thickness could not be obtained. Contrast resolution was equivalent for both methods. In patient studies, the reconstructed radiotherapy simulation image was nearly equivalent, in perceived density resolution, to a simulation film taken by an X-ray simulator. (author)
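
    Read literally, the solid weighting rule is a weighted average of CT values along a divergent beam element; a paraphrase in code (names are mine, and the voxel traversal that produces the path lengths is omitted):

        import numpy as np

        def solid_weighted_value(ct_values, path_lengths, voxel_volumes):
            """Average CT values weighted by path length times voxel volume."""
            w = np.asarray(path_lengths) * np.asarray(voxel_volumes)
            return float(np.sum(w * np.asarray(ct_values)) / np.sum(w))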

  13. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  14. Magnetosphere Modeling: From Cartoons to Simulations

    Science.gov (United States)

    Gombosi, T. I.

    2017-12-01

    Over the last half century, physics-based global computer simulations have become a bridge between experiment and basic theory, and they now represent the "third pillar" of geospace research. Today, many of our scientific publications utilize large-scale simulations to interpret observations, test new ideas, plan campaigns, or design new instruments. Realistic simulations of the complex Sun-Earth system have been made possible by the dramatically increased power of both computing hardware and numerical algorithms. Early magnetosphere models were based on simple E&M concepts (like the Chapman-Ferraro cavity) and hydrodynamic analogies (bow shock). At the beginning of the space age, current system models were developed, culminating in the sophisticated Tsyganenko-type description of the magnetic configuration. The first 3D MHD simulations of the magnetosphere were published in the early 1980s. A decade later there were several competing global models that were able to reproduce many fundamental properties of the magnetosphere. The leading models included the impact of the ionosphere by using a height-integrated electric potential description. Dynamic coupling of global and regional models started in the early 2000s by integrating a ring current and a global magnetosphere model. It has been recognized for quite some time that plasma kinetic effects play an important role. Presently, global hybrid simulations of the dynamic magnetosphere are expected to be possible on exascale supercomputers, while fully kinetic simulations with realistic mass ratios are still decades away. In the 2010s several groups started to experiment with PIC simulations embedded in large-scale 3D MHD models. Presently this integrated MHD-PIC approach is at the forefront of magnetosphere simulations, and this technique is expected to lead to some important advances in our understanding of magnetospheric physics. This talk will review the evolution of magnetosphere modeling from cartoons to current systems

  15. Galaxy Zoo: Morphological Classification of Galaxy Images from the Illustris Simulation

    Science.gov (United States)

    Dickinson, Hugh; Fortson, Lucy; Lintott, Chris; Scarlata, Claudia; Willett, Kyle; Bamford, Steven; Beck, Melanie; Cardamone, Carolin; Galloway, Melanie; Simmons, Brooke; Keel, William; Kruk, Sandor; Masters, Karen; Vogelsberger, Mark; Torrey, Paul; Snyder, Gregory F.

    2018-02-01

    Modern large-scale cosmological simulations model the universe with increasing sophistication and at higher spatial and temporal resolutions. These ongoing enhancements permit increasingly detailed comparisons between the simulation outputs and real observational data. Recent projects such as Illustris are capable of producing simulated images that are designed to be comparable to those obtained from local surveys. This paper tests the degree to which Illustris achieves this goal across a diverse population of galaxies using visual morphologies derived from Galaxy Zoo citizen scientists. Morphological classifications provided by these volunteers for simulated galaxies are compared with similar data for a compatible sample of images drawn from the Sloan Digital Sky Survey (SDSS) Legacy Survey. This paper investigates how simple morphological characterization by human volunteers asked to distinguish smooth from featured systems differs between simulated and real galaxy images. Significant differences are identified, which are most likely due to the limited resolution of the simulation, but which could be revealing real differences in the dynamical evolution of populations of galaxies in the real and model universes. Specifically, for stellar masses M⋆ ≲ 10^11 M⊙, a substantially larger proportion of Illustris galaxies exhibit disk-like morphology or visible substructure, relative to their SDSS counterparts. Toward higher masses, the visual morphologies for simulated and observed galaxies converge and exhibit similar distributions. The stellar mass threshold indicated by this divergent behavior confirms recent works using parametric measures of morphology from Illustris simulated images. When M⋆ ≳ 10^11 M⊙, the Illustris data set contains substantially fewer galaxies that classifiers regard as unambiguously featured. In combination, these results suggest that comparison between the detailed properties of observed and simulated galaxies

  16. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    A familiar example of a feedback loop is the business model in which part of the output or profit is fed back as input or additional capital - for instance, a company may choose to reinvest 10% of the profit for expansion of the business. Such simple models, like ..... would help scientists, engineers and managers towards better.
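
    The reinvestment loop mentioned above takes only a few lines to simulate; the profit margin and horizon below are illustrative assumptions:

        def reinvestment_growth(capital=100.0, margin=0.2, reinvest=0.10, years=10):
            """Feedback loop: each year 10% of the profit is fed back as capital."""
            history = [capital]
            for _ in range(years):
                capital += reinvest * margin * capital   # profit = margin * capital
                history.append(capital)
            return history   # compound growth at rate reinvest * margin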

  17. Complex Simulation Model of Mobile Fading Channel

    Directory of Open Access Journals (Sweden)

    Tomas Marek

    2005-01-01

    Full Text Available In the mobile communication environment the mobile channel is the main obstacle limiting the performance of a wireless system. Modeling of the radio channel consists of two basic fading mechanisms - long-term fading and short-term fading. The contribution deals with the simulation of a complex mobile radio channel, i.e., a channel with all fading components. The simulation model is based on the Clarke-Gans theoretical model for the fading channel and is developed in the MATLAB environment. Simulation results have shown very good agreement with theory. This model was developed for a hybrid adaptation 3G uplink simulator (described in this issue) during the research project VEGA 1/0140/03.
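
    The short-term (Rayleigh) component of a Clarke-Gans channel is often generated as a sum of Doppler-shifted sinusoids; a compact numpy version under that sum-of-sinusoids interpretation (the original simulator is in MATLAB and may differ in detail):

        import numpy as np

        def clarke_fading(n_samples, fd, fs, n_paths=64, seed=0):
            """Flat Rayleigh fading gain: sum of Doppler-shifted sinusoids."""
            rng = np.random.default_rng(seed)
            t = np.arange(n_samples) / fs
            theta = rng.uniform(0, 2 * np.pi, n_paths)  # angles of arrival
            phi = rng.uniform(0, 2 * np.pi, n_paths)    # initial phases
            doppler = fd * np.cos(theta)                # per-path Doppler shifts
            g = np.exp(1j * (2 * np.pi * np.outer(t, doppler) + phi)).sum(axis=1)
            return g / np.sqrt(n_paths)                 # unit-power complex gain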

  18. Simulation Model Development for Mail Screening Process

    National Research Council Canada - National Science Library

    Vargo, Trish; Marvin, Freeman; Kooistra, Scott

    2005-01-01

    STUDY OBJECTIVE: Provide decision analysis support to the Homeland Defense Business Unit, Special Projects Team, in developing a simulation model to help determine the most effective way to eliminate backlog...

  19. SEIR model simulation for Hepatitis B

    Science.gov (United States)

    Side, Syafruddin; Irwan, Mulbar, Usman; Sanusi, Wahidah

    2017-09-01

    Mathematical modelling and simulation of Hepatitis B are discussed in this paper. The population is divided into four compartments, namely Susceptible, Exposed, Infected and Recovered (SEIR). Several factors affecting the population are included in the model: vaccination, immigration and emigration. The SEIR model yields a four-dimensional system of non-linear ordinary differential equations (ODEs), which is then reduced to three dimensions. The SEIR model simulation is undertaken to predict the number of Hepatitis B cases. The results of the simulation indicate that the number of Hepatitis B cases will increase and then decrease over several months. Simulation using case numbers from Makassar also found a basic reproduction number less than one, which means that Makassar is not an endemic area for Hepatitis B.
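
    A baseline SEIR system of the kind described integrates in a few lines; the paper's vaccination and migration terms are omitted here, and all parameter values are illustrative:

        import numpy as np
        from scipy.integrate import odeint

        def seir(y, t, beta, sigma, gamma):
            """SEIR right-hand side in population fractions."""
            S, E, I, R = y
            return [-beta * S * I,
                    beta * S * I - sigma * E,
                    sigma * E - gamma * I,
                    gamma * I]

        t = np.linspace(0.0, 365.0, 366)
        sol = odeint(seir, [0.99, 0.01, 0.0, 0.0], t, args=(0.3, 1 / 7.0, 1 / 14.0))
        peak_infected = sol[:, 2].max()
        r0 = 0.3 * 14.0   # beta/gamma; a value below one means no endemic state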

  20. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    AFRL-RY-WP-TR-2017-0074, Fully Adaptive Radar Modeling and Simulation Development, Kristine L. Bell and Anthony Kellems, Metron, Inc. Small Business Innovation Research (SBIR) Phase I report under contract FA8650-16-M-1774. Approved for public release; distribution unlimited (see additional restrictions).

  1. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  2. On the simulation and mitigation of anisoplanatic optical turbulence for long range imaging

    Science.gov (United States)

    Hardie, Russell C.; LeMaster, Daniel A.

    2017-05-01

    We describe a numerical wave propagation method for simulating long-range imaging of an extended scene under anisoplanatic conditions. Our approach computes an array of point spread functions (PSFs) for a 2D grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. To validate the simulation we compare simulated outputs with the theoretical anisoplanatic tilt correlation and differential tilt variance, in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. The simulation tool is also used here to quantitatively evaluate a recently proposed block-matching and Wiener filtering (BMWF) method for turbulence mitigation. In this method a block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged and processed with a Wiener filter for restoration. A novel aspect of the proposed BMWF method is that the PSF model used for restoration takes into account the level of geometric correction achieved during image registration. This way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. The BMWF method is relatively simple computationally and yet has excellent performance in comparison to state-of-the-art benchmark methods.
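
    The spatially varying weighted-sum step can be written down directly: blur the ideal image with each gridded PSF and blend the results with pixel-wise weights that interpolate between grid points. A sketch, assuming the weight maps are normalized to sum to one at every pixel:

        import numpy as np
        from scipy.signal import fftconvolve

        def anisoplanatic_blur(ideal, psfs, weights):
            """Spatially varying blur: weighted sum of convolutions with gridded PSFs.
            psfs: list of 2D PSFs on a grid over the object plane;
            weights: matching list of 2D maps, same shape as `ideal`."""
            out = np.zeros_like(ideal, dtype=float)
            for psf, w in zip(psfs, weights):
                out += w * fftconvolve(ideal, psf, mode="same")
            return out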

  3. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  4. Challenges for Modeling and Simulation

    National Research Council Canada - National Science Library

    Johnson, James

    2002-01-01

    This document deals with modeling and simulation. Its strengths are the ability to study processes that rarely or never occur, to evaluate a wide range of alternatives, and to generate new ideas, new concepts and innovative solutions...

  5. Ultrasound Imaging and its modeling

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2002-01-01

    Modern medical ultrasound scanners are used for imaging nearly all soft tissue structures in the body. The anatomy can be studied from gray-scale B-mode images, where the reflectivity and scattering strength of the tissues are displayed. The imaging is performed in real time with 20 to 100 images...

  6. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or, equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...
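
    The Gaussian Markov assumption on the wave particle velocity corresponds to an Ornstein-Uhlenbeck process, which has an exact one-step recursion; a sketch with illustrative standard deviation and correlation time (the Morison-type load evaluation is omitted):

        import numpy as np

        def gauss_markov_velocity(n, dt, sigma=1.0, tau=5.0, seed=0):
            """Zero-mean Gaussian Markov (Ornstein-Uhlenbeck) sample path."""
            rng = np.random.default_rng(seed)
            a = np.exp(-dt / tau)            # one-step autocorrelation
            u = np.zeros(n)
            for i in range(1, n):
                u[i] = a * u[i - 1] + sigma * np.sqrt(1 - a * a) * rng.standard_normal()
            return u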

  9. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES) M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacities increase, DES M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  10. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  11. Computational tools for clinical support: a multi-scale compliant model for haemodynamic simulations in an aortic dissection based on multi-modal imaging data.

    Science.gov (United States)

    Bonfanti, Mirko; Balabani, Stavroula; Greenwood, John P; Puppala, Sapna; Homer-Vanniasinkam, Shervanthi; Díaz-Zuccarini, Vanessa

    2017-11-01

    Aortic dissection (AD) is a vascular condition with high morbidity and mortality rates. Computational fluid dynamics (CFD) can provide insight into the progression of AD and aid clinical decisions; however, oversimplified modelling assumptions and high computational cost compromise the accuracy of the information and impede clinical translation. To overcome these limitations, a patient-specific CFD multi-scale approach coupled to Windkessel boundary conditions and accounting for wall compliance was developed and used to study a patient with AD. A new moving boundary algorithm was implemented to capture wall displacement and a rich in vivo clinical dataset was used to tune model parameters and for validation. Comparisons between in silico and in vivo data showed that this approach successfully captures flow and pressure waves for the patient-specific AD and is able to predict the pressure in the false lumen (FL), a critical variable for the clinical management of the condition. Results showed regions of low and oscillatory wall shear stress which, together with higher diastolic pressures predicted in the FL, may indicate risk of expansion. This study, at the interface of engineering and medicine, demonstrates a relatively simple and computationally efficient approach to account for arterial deformation and wave propagation phenomena in a three-dimensional model of AD, representing a step forward in the use of CFD as a potential tool for AD management and clinical support. © 2017 The Author(s).
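
    The Windkessel boundary conditions referred to here couple outlet flow to pressure through a resistance-compliance network. A minimal three-element version with forward-Euler time stepping is sketched below; parameter values and units are purely illustrative, and the paper's coupled multi-scale implementation is far more involved:

        import numpy as np

        def windkessel3(q, dt, Rp=0.05, Rd=1.0, C=1.5, pc0=80.0):
            """Three-element Windkessel driven by a flow waveform q(t).
            Rp: proximal (characteristic) resistance, Rd: distal resistance,
            C: compliance. Returns the inlet pressure waveform."""
            pc = pc0                             # pressure across the compliance
            p = np.empty(len(q))
            for i, qi in enumerate(q):
                p[i] = pc + Rp * qi              # inlet pressure adds proximal drop
                pc += dt * (qi - pc / Rd) / C    # forward-Euler update of the RC pair
            return p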

  12. A simulation model for football championships

    OpenAIRE

    Koning, Ruud H.; Koolhaas, Michael; Renes, Gusta

    2001-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like ‘which team had a lucky draw?’ or ‘what is the probability that two teams meet at some moment in the tournament?’. Inputs to the simulation/probability model are scoring intensities, which are estimated as a weighted average of goals scored. The model has been used in practice to write articles for the popular press, ...
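
    The core of such a model -- Poisson-distributed goals with team-specific scoring intensities, propagated through a knockout bracket by Monte Carlo -- can be sketched as follows. The tie handling and the power-of-two bracket layout are simplifying assumptions of mine, not the authors' specification:

        import numpy as np

        def match_winner(a, b, intensity, rng):
            """One match: goals are Poisson with team-specific intensities."""
            ga, gb = rng.poisson(intensity[a]), rng.poisson(intensity[b])
            if ga == gb:                      # crude stand-in for extra time
                return rng.choice([a, b])
            return a if ga > gb else b

        def win_probability(bracket, intensity, n_sims=10000, seed=0):
            """Monte Carlo estimate of each team's tournament win probability."""
            rng = np.random.default_rng(seed)
            wins = {t: 0 for t in bracket}
            for _ in range(n_sims):
                rnd = list(bracket)           # bracket length must be a power of two
                while len(rnd) > 1:
                    rnd = [match_winner(rnd[i], rnd[i + 1], intensity, rng)
                           for i in range(0, len(rnd), 2)]
                wins[rnd[0]] += 1
            return {t: w / n_sims for t, w in wins.items()}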

  13. Blood vessel modeling for interactive simulation of interventional neuroradiology procedures.

    Science.gov (United States)

    Kerrien, E; Yureidini, A; Dequidt, J; Duriez, C; Anxionnat, R; Cotin, S

    2017-01-01

    Endovascular interventions can benefit from interactive simulation in their training phase but also during pre-operative and intra-operative phases if simulation scenarios are based on patient data. A key feature in this context is the ability to extract, from patient images, models of blood vessels that impede neither the realism nor the performance of simulation. This paper addresses both the segmentation and reconstruction of the vasculature from 3D Rotational Angiography data, adapted to simulation: An original tracking algorithm is proposed to segment the vessel tree while filtering points extracted at the vessel surface in the vicinity of each point on the centerline; then an automatic procedure is described to reconstruct each local unstructured point set as a skeleton-based implicit surface (blobby model). The output of successively applying both algorithms is a new model of vasculature as a tree of local implicit models. The segmentation algorithm is compared with the Multiple Hypothesis Testing (MHT) algorithm (Friman et al., 2010) on patient data, showing its greater ability to track blood vessels. The reconstruction algorithm is evaluated on both synthetic and patient data and demonstrates its ability to fit points with subvoxel precision. Various tests are also reported where our model is used to simulate catheter navigation in interventional neuroradiology. Excellent realism and much lower computational costs are reported when compared to triangular mesh surface models. Copyright © 2016 Elsevier B.V. All rights reserved.
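
    A skeleton-based implicit ("blobby") surface of this kind is a sum of radial kernels centred on the vessel centerline, with the surface taken as a level set of the summed field. A generic Gaussian-kernel sketch (the paper's exact kernel and level value may differ):

        import numpy as np

        def blobby_field(points, centers, radii):
            """Evaluate a metaball-style field at query points.
            points: (N, 3) query positions; centers: (M, 3) skeleton samples;
            radii: (M,) local vessel radii. Surface: field(points) == 1."""
            d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / radii[None, :] ** 2).sum(axis=1)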

  14. Gaussian Mixture Model and Rjmcmc Based RS Image Segmentation

    Science.gov (United States)

    Shi, X.; Zhao, Q. H.

    2017-09-01

    For image segmentation methods based on the Gaussian Mixture Model (GMM), there are two problems: 1) the number of components is usually fixed, i.e., a fixed number of classes, and 2) GMM is sensitive to image noise. This paper proposes a remote sensing (RS) image segmentation method that combines GMM with reversible jump Markov Chain Monte Carlo (RJMCMC). In the proposed algorithm, a GMM models the distribution of pixel intensities in the RS image, and the number of components is treated as a random variable. A prior distribution is built for each parameter; to improve noise resistance, a Gibbs function models the prior distribution of the GMM weight coefficients. The posterior distribution is built according to Bayes' theorem, and RJMCMC is used to simulate the posterior distribution and estimate its parameters. Finally, an optimal segmentation of the RS image is obtained. Experimental results show that the proposed algorithm converges to the optimal number of classes and produces ideal segmentation results.
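
    The fixed-class GMM baseline that the paper improves on is easy to reproduce with scikit-learn; the RJMCMC trans-dimensional moves that let the number of components vary are beyond this sketch:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def gmm_segment(img, n_classes=3, seed=0):
            """Fixed-class baseline: cluster pixel intensities, label by posterior."""
            X = img.reshape(-1, 1).astype(float)
            gmm = GaussianMixture(n_components=n_classes, random_state=seed).fit(X)
            return gmm.predict(X).reshape(img.shape)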

  16. Simulation of ultrasound image data by a quadrature method

    International Nuclear Information System (INIS)

    Abbott, P.; Braun, M.

    1996-01-01

    Full text: Simulation is an important step in the process of developing reliable and accurate methods for medical image segmentation. Ultrasound data available from commercial scanners is most easily obtained in envelope-detected form. Although bandpass and quadrature data are available, it is not clear whether data in these forms would improve segmentation. Previous work (Bamber JC and Dickinson RJ, Phys Med Biol 25,3:463-479,1980) has simulated ultrasound data in bandpass (RF) form, requiring a high sampling rate and large data sets. We present a method for the simulation of medical ultrasound images which produces data in quadrature form. The quadrature form allows a lower sampling rate which depends only on the transducer bandwidth and is independent of the centre frequency. We consider a two-dimensional ultrasound data set I_b(x, z), where x and z denote lateral and axial spatial coordinates, and let I_b(x, z) be bandpass in the axial direction. We then use the quadrature representation I_b(x, z) = Re[I_c(x, z) exp(jk_0 z)], where I_c(x, z) = I_p(x, z) + jI_q(x, z), and I_p and I_q are (low-pass) quadrature components. We derive an expression for the power spectral density of I_c and from this deduce expressions for the magnitude and phase of the Fourier transforms of I_p and I_q. Taking the inverse Fourier transform then yields the simulated data sets I_p and I_q. The above quadrature simulation method has been implemented in a MATLAB-like environment. We have demonstrated it to be a simple method for the simulation of medical ultrasound images in quadrature form. The method is essentially the same as previous work; however, it produces quadrature data directly and provides a manageable data set whose size is independent of the transducer centre frequency.
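
    Going the other way -- recovering quadrature components from bandpass RF data -- is a useful companion to this method and takes two steps: form the analytic signal, then demodulate the centre frequency to DC. A sketch with scipy; variable names mirror the abstract's I_p and I_q:

        import numpy as np
        from scipy.signal import hilbert

        def to_quadrature(rf, fc, fs):
            """Demodulate axially bandpass RF lines to low-pass quadrature data."""
            analytic = hilbert(rf, axis=-1)                # analytic signal
            t = np.arange(rf.shape[-1]) / fs
            i_c = analytic * np.exp(-2j * np.pi * fc * t)  # shift centre freq to DC
            return i_c.real, i_c.imag                      # I_p, I_q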

  17. Simulating deformations of MR brain images for validation of atlas-based segmentation and registration algorithms.

    Science.gov (United States)

    Xue, Zhong; Shen, Dinggang; Karacali, Bilge; Stern, Joshua; Rottenberg, David; Davatzikos, Christos

    2006-11-15

    Simulated deformations and images can act as the gold standard for evaluating various template-based image segmentation and registration algorithms. Traditional deformable simulation methods, such as the use of analytic deformation fields or the displacement of landmarks followed by some form of interpolation, are often unable to construct rich (complex) and/or realistic deformations of anatomical organs. This paper presents new methods aiming to automatically simulate realistic inter- and intra-individual deformations. The paper first describes a statistical approach to capturing inter-individual variability of high-deformation fields from a number of examples (training samples). In this approach, Wavelet-Packet Transform (WPT) of the training deformations and their Jacobians, in conjunction with a Markov random field (MRF) spatial regularization, are used to capture both coarse and fine characteristics of the training deformations in a statistical fashion. Simulated deformations can then be constructed by randomly sampling the resultant statistical distribution in an unconstrained or a landmark-constrained fashion. The paper also describes a model for generating tissue atrophy or growth in order to simulate intra-individual brain deformations. Several sets of simulated deformation fields and respective images are generated, which can be used in the future for systematic and extensive validation studies of automated atlas-based segmentation and deformable registration methods. The code and simulated data are available through our Web site.
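
    A much simpler stand-in for such simulated deformations -- useful for quick sanity checks, though without the statistical realism of the WPT/MRF sampling described above -- is to warp images with a smoothed random displacement field; sigma and alpha below are illustrative:

        import numpy as np
        from scipy.ndimage import gaussian_filter, map_coordinates

        def random_deformation(img, sigma=8.0, alpha=15.0, seed=0):
            """Warp an image with a smooth random displacement field."""
            rng = np.random.default_rng(seed)
            dx = gaussian_filter(rng.standard_normal(img.shape), sigma) * alpha
            dy = gaussian_filter(rng.standard_normal(img.shape), sigma) * alpha
            yy, xx = np.meshgrid(np.arange(img.shape[0]),
                                 np.arange(img.shape[1]), indexing="ij")
            return map_coordinates(img, [yy + dy, xx + dx], order=1, mode="nearest")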

  18. The NPS Virtual Thermal Image Processing Model

    National Research Council Canada - National Science Library

    Lenter, Yucel

    2001-01-01

    ...). The MRTD is a standard performance measure for forward-looking infrared (FLIR) imaging systems. It takes into account thermal imaging system modeling concerns, such as modulation transfer functions...

  19. Simulation of photon and charge transport in X-ray imaging semiconductor sensors

    CERN Document Server

    Nilsson, H E; Hjelm, M; Bertilsson, K

    2002-01-01

    A fully stochastic model for the imaging properties of X-ray silicon pixel detectors is presented. Both integrating and photon counting configurations have been considered, as well as scintillator-coated structures. The model is based on three levels of Monte Carlo simulations; photon transport and absorption using MCNP, full band Monte Carlo simulation of charge transport and system level Monte Carlo simulation of the imaging performance of the detector system. In the case of scintillator-coated detectors, the light scattering in the detector layers has been simulated using a Monte Carlo method. The image resolution was found to be much lower in scintillator-coated systems due to large light spread in thick scintillator layers. A comparison between integrating and photon counting readout methods shows that the image resolution can be slightly enhanced using a photon-counting readout. In addition, the proposed model has been used to study charge-sharing effects on the energy resolution in photon counting dete...

  20. Dark Energy Studies with LSST Image Simulations, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, John Russell [Purdue Univ., West Lafayette, IN (United States)

    2016-07-26

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Modelling Deterministic Systems. N K Srinivasan graduated from Indian Institute of Science and obtained his Doctorate from Columbia University, New York. He has taught in several universities, and later did system analysis, wargaming and simulation for defence. His other areas of interest are reliability engineering.

  2. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel), and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of differential-algebraic equation (DAE) systems. Subsequently MatLab/Simulink has been applied for carrying out the simulations. To verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  3. Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility

    Science.gov (United States)

    Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.

    2017-12-01

    The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. Simulations are also used here for active-learning purposes: they help students understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction has been based on low-cost and open-source solutions.
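
    The DIC step -- tracking a surface patch between frames by normalized cross-correlation -- can be prototyped with scikit-image; patch and search sizes below are illustrative, and the point is assumed to lie far enough from the image border:

        import numpy as np
        from skimage.feature import match_template

        def track_patch(ref, cur, y, x, half=15, search=10):
            """Displacement (dy, dx) of the patch at (y, x) between two frames."""
            tpl = ref[y - half:y + half + 1, x - half:x + half + 1]
            win = cur[y - half - search:y + half + search + 1,
                      x - half - search:x + half + search + 1]
            ncc = match_template(win, tpl)           # (2*search+1)^2 correlation map
            dy, dx = np.unravel_index(np.argmax(ncc), ncc.shape)
            return dy - search, dx - search          # displacement in pixels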

  4. MULTIFOCUS IMAGE FUSION USING CLOUD MODEL

    OpenAIRE

    K. Kannan

    2014-01-01

    This paper proposes a multifocus image fusion algorithm based on the cloud model. First, each source image is divided into overlapping image blocks of size (2N+1) × (2N+1), and then the mean and entropy of the image pixels over this neighborhood window are calculated and compared in the cloud domain. The pixel with the higher magnitude of the calculated image features is selected to form the fused image. The results of multifocus image fusion using this algorithm hold favorable consistency in terms o...

  5. Modeling digital breast tomosynthesis imaging systems for optimization studies

    Science.gov (United States)

    Lau, Beverly Amy

    Digital breast tomosynthesis (DBT) is a new imaging modality for breast imaging. In tomosynthesis, multiple images of the compressed breast are acquired at different angles, and the projection view images are reconstructed to yield images of slices through the breast. One of the main problems to be addressed in the development of DBT is determining the optimal parameter settings to obtain images ideal for the detection of cancer. Since it would be unethical to irradiate women multiple times to explore potentially optimum geometries for tomosynthesis, it is ideal to use a computer simulation to generate projection images. Existing tomosynthesis models have modeled scatter and the detector without accounting for the oblique angles of incidence that tomosynthesis introduces. Moreover, these models frequently use geometry-specific physical factors measured from real systems, which severely limits the robustness of their algorithms for optimization. The goal of this dissertation was to design the framework for a computer simulation of tomosynthesis that would produce images that are sensitive to changes in acquisition parameters, so an optimization study would be feasible. A computer physics simulation of the tomosynthesis system was developed. The x-ray source was modeled as a polychromatic spectrum based on published spectral data, and the inverse-square law was applied. Scatter was applied using a convolution method with angle-dependent scatter point spread functions (sPSFs), followed by scaling using an angle-dependent scatter-to-primary ratio (SPR). Monte Carlo simulations were used to generate sPSFs for a 5-cm breast with a 1-cm air gap. Detector effects were included through geometric propagation of the image onto layers of the detector, which were blurred using depth-dependent detector point-response functions (PRFs). Depth-dependent PRFs were calculated every 5 microns through a 200-micron-thick CsI detector using Monte Carlo simulations. Electronic noise was added as Gaussian noise as a
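
    The scatter step described -- convolution with an angle-dependent sPSF followed by SPR scaling -- reduces to a few lines once the Monte Carlo-derived kernels are available. A sketch; function and argument names are mine, not the dissertation's:

        import numpy as np
        from scipy.signal import fftconvolve

        def add_scatter(primary, spsf, spr):
            """Add scatter to a primary projection: convolve with the scatter PSF,
            then scale so scatter/primary energy equals the target SPR."""
            scatter = fftconvolve(primary, spsf / spsf.sum(), mode="same")
            scatter *= spr * primary.sum() / scatter.sum()
            return primary + scatter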

  6. Simulation study of secondary electron images in scanning ion microscopy

    CERN Document Server

    Ohya, K

    2003-01-01

    The target atomic number (Z_2) dependence of the secondary electron yield is simulated by applying a Monte Carlo code to 17 metal species bombarded by Ga ions and electrons, in order to study the contrast difference between scanning ion microscopes (SIM) and scanning electron microscopes (SEM). In addition to the remarkable reversal of the Z_2 dependence between Ga ion and electron bombardment, a fine structure, which is correlated with the density of the conduction-band electrons in the metal, is calculated for both. The brightness changes of the secondary electron images in SIM and SEM are simulated using Au and Al surfaces adjacent to each other. The results indicate that the image contrast in SIM is much more sensitive to the material species and is clearer than that for SEM. The origin of the difference between SIM and SEM comes from the difference in the lateral distribution of secondary electrons excited within the escape depth.

  7. Model Driven Development of Simulation Models : Defining and Transforming Conceptual Models into Simulation Models by Using Metamodels and Model Transformations

    NARCIS (Netherlands)

    Küçükkeçeci Çetinkaya, D.

    2013-01-01

    Modeling and simulation (M&S) is an effective method for analyzing and designing systems and it is of interest to scientists and engineers from all disciplines. This thesis proposes the application of a model driven software development approach throughout the whole set of M&S activities and it

  8. Fast Simulation of Dynamic Ultrasound Images Using the GPU.

    Science.gov (United States)

    Storve, Sigurd; Torp, Hans

    2017-10-01

    Simulated ultrasound data is a valuable tool for development and validation of quantitative image analysis methods in echocardiography. Unfortunately, simulation time can become prohibitive for phantoms consisting of a large number of point scatterers. The COLE algorithm by Gao et al. is a fast convolution-based simulator that trades simulation accuracy for improved speed. We present highly efficient parallelized CPU and GPU implementations of the COLE algorithm with an emphasis on dynamic simulations involving moving point scatterers. We argue that it is crucial to minimize the amount of data transfers from the CPU to achieve good performance on the GPU. We achieve this by storing the complete trajectories of the dynamic point scatterers as spline curves in the GPU memory. This leads to good efficiency when simulating sequences consisting of a large number of frames, such as B-mode and tissue Doppler data for a full cardiac cycle. In addition, we propose a phase-based subsample delay technique that efficiently eliminates flickering artifacts seen in B-mode sequences when COLE is used without enough temporal oversampling. To assess the performance, we used a laptop computer and a desktop computer, each equipped with a multicore Intel CPU and an NVIDIA GPU. Running the simulator on a high-end TITAN X GPU, we observed two orders of magnitude speedup compared to the parallel CPU version, three orders of magnitude speedup compared to simulation times reported by Gao et al. in their paper on COLE, and a speedup of 27000 times compared to the multithreaded version of Field II, using numbers reported in a paper by Jensen. We hope that by releasing the simulator as an open-source project we will encourage its use and further development.
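
    The convolution idea behind COLE can be illustrated for a single RF line: deposit scatterer amplitudes at their round-trip delays, then convolve with the pulse. The nearest-sample placement below is exactly what causes the flickering that the paper's phase-based subsample delays remove; a deliberately simplified sketch, not the authors' GPU code:

        import numpy as np

        def cole_line(depths, amplitudes, pulse, fs, c=1540.0):
            """Convolution-style RF line from point scatterers (nearest sample)."""
            delays = 2 * np.asarray(depths) / c             # round-trip time of flight
            n = int(delays.max() * fs) + len(pulse) + 1
            line = np.zeros(n)
            idx = np.round(delays * fs).astype(int)         # nearest-sample placement
            np.add.at(line, idx, amplitudes)                # scatterers as impulses
            return np.convolve(line, pulse)[:n]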

  9. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L

    1996-01-01

    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogenous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  10. Dynamic modeling and simulation of wind turbines

    International Nuclear Information System (INIS)

    Ghafari Seadat, M.H.; Kheradmand Keysami, M.; Lari, H.R.

    2002-01-01

    Using wind energy to generate electricity in wind turbines is a good way to exploit renewable energies, and it can also help to protect the environment. The main objective of this paper is dynamic modeling of a wind turbine by the energy method and its computer-aided simulation. In this paper, the equations of motion are extracted for the wind turbine system, and the behavior of the system is made apparent by solving these equations. For the simulation, the turbine is considered to have a three-blade rotor facing the wind direction, an induction generator connected to the network, and constant rotational speed. Every part of the wind turbine has to be modeled for the simulation; the main parts are the blades, gearbox, shafts and generator

  11. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  12. The behaviour of adaptive bone-remodeling simulation models

    NARCIS (Netherlands)

    Weinans, H.; Huiskes, R.; Grootenboer, H.J.

    1992-01-01

    The process of adaptive bone remodeling can be described mathematically and simulated in a computer model, integrated with the finite element method. In the model discussed here, cortical and trabecular bone are described as continuous materials with variable density. The remodeling rule applied to

  13. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book, offering a modeling technique based on Lagrange's energy method, includes 125 worked examples. Using this technique enables one to model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.

  14. Equivalent drawbead model in finite element simulations

    NARCIS (Netherlands)

    Carleer, Bart D.; Carleer, B.D.; Meinders, Vincent T.; Huetink, Han; Lee, J.K.; Kinzel, G.L.; Wagoner, R.

    1996-01-01

    In 3D simulations of the deep drawing process the drawbead geometries are seldom included. Therefore equivalent drawbeads are used. In order to investigate the drawbead behaviour a 2D plane strain finite element model was used. For verification of this model experiments were performed. The analyses

  15. A simulation model for football championships

    NARCIS (Netherlands)

    Koning, RH; Koolhaas, M; Renes, G; Ridder, G

    2003-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like 'which team had a lucky draw?' or 'what is the probability that two teams meet at some moment in the tournament?' Input

  16. A simulation model for football championships

    NARCIS (Netherlands)

    Koning, Ruud H.; Koolhaas, Michael; Renes, Gusta

    2001-01-01

    In this paper we discuss a simulation/probability model that identifies the team that is most likely to win a tournament. The model can also be used to answer other questions like ‘which team had a lucky draw?’ or ‘what is the probability that two teams meet at some moment in the tournament?’. Input

  17. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  18. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  19. Landscape Modelling and Simulation Using Spatial Data

    Directory of Open Access Journals (Sweden)

    Amjed Naser Mohsin AL-Hameedawi

    2017-08-01

    Full Text Available In this paper a procedure is presented for generating a spatial landscape model suited to realistic simulation. The procedure is based on combining spatial data and field measurements with computer graphics produced using Blender software, after which a 3D simulation can be formed based on VIS ALL packages. The objective was to build a model using GIS, including inputs to the feature attribute data, and the effort concentrated on assembling an adequate spatial prototype, setting up a facilitation scheme and outlining the intended framework; the eventual result was used in simulation form. The procedure performed covers not only data gathering, fieldwork and model provision, but extends to a new method for producing the respective 3D simulation mapping, which allows decision makers as well as investors to achieve lasting acceptance of an independent navigation system for Geoscience applications.

  20. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Effects of specific surface area and porosity on cube counting fractal dimension, lacunarity, configurational entropy, and permeability of model porous networks: Random packing simulations and NMR micro-imaging study

    Science.gov (United States)

    Lee, Bum Han; Lee, Sung Keun

    2013-07-01

    Despite the importance of understanding and quantifying the microstructure of porous networks in diverse geologic settings, the effects of the specific surface area and porosity on the key structural parameters of the networks have not been fully understood. We performed cube-counting fractal dimension (Dcc) and lacunarity analyses of 3D porous networks of model sands and configurational entropy analysis of 2D cross sections of model sands using random packing simulations and nuclear magnetic resonance (NMR) micro-imaging. We established relationships among porosity, specific surface area, structural parameters (Dcc and lacunarity), and the corresponding macroscopic properties (configurational entropy and permeability). The Dcc of the 3D porous networks increases with increasing specific surface area at a constant porosity and with increasing porosity at a constant specific surface area. Predictive relationships correlating Dcc, specific surface area, and porosity were also obtained. The lacunarity at the minimum box size decreases with increasing porosity, and that at the intermediate box size (∼0.469 mm in the current model sands) was reproduced well with specific surface area. The maximum configurational entropy increases with increasing porosity, and the entropy length of the pores decreases with increasing specific surface area and was used to calculate the average connectivity among the pores. The correlation among porosity, specific surface area, and permeability is consistent with the prediction from the Kozeny-Carman equation. From the relationship between the permeability and the Dcc of pores, the permeability can be expressed as a function of the Dcc of pores and porosity. The current methods and these newly identified correlations among structural parameters and properties provide improved insights into the nature of porous media and have useful geophysical and hydrological implications for elasticity and shear viscosity of complex composites of rock
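
    For readers unfamiliar with the cube-counting estimator, computing Dcc amounts to counting, at several cube sizes, how many cubes contain any pore voxel and regressing the log-counts on the log-sizes. A minimal sketch of standard box counting on a toy volume (not the authors' exact procedure):

      import numpy as np

      def cube_counting_dimension(pore, sizes=(1, 2, 4, 8, 16)):
          """Estimate the cube-counting fractal dimension D_cc of a 3D binary pore network.

          pore: 3D boolean array, True inside pores."""
          counts = []
          for s in sizes:
              # trim so the volume tiles exactly into s**3 cubes
              nx, ny, nz = (d - d % s for d in pore.shape)
              v = pore[:nx, :ny, :nz].reshape(nx // s, s, ny // s, s, nz // s, s)
              occupied = v.any(axis=(1, 3, 5)).sum()   # cubes containing any pore voxel
              counts.append(occupied)
          # D_cc is minus the slope of log N(s) against log s
          slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
          return -slope

      rng = np.random.default_rng(1)
      sample = rng.random((64, 64, 64)) < 0.35     # toy porous volume, porosity ~0.35
      print(cube_counting_dimension(sample))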

  2. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together...... to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal...... and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...

  3. Safety Assessment of Advanced Imaging Sequences II: Simulations

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    The simulations yield the Mechanical Index (MI) and Ispta.3 as required by the FDA. The method is performed on four different imaging schemes and compared to measurements conducted using the SARUS experimental scanner. The sequences include focused emissions with an F-number of 2 with 64 elements that generate highly non-linear fields. The simulation time is between 0.67 ms and 2.8 ms per emission and imaging point, making it possible to simulate even complex emission sequences in less than 1 s for a single spatial position. The linear simulations yield a relative accuracy on MI between -12.1% and 52.3%, and for Ispta.3 between -38.6% and 62.6%, when using the impulse response of the probe estimated from an independent measurement. The accuracy is increased to between -22% and 24.5% for MI and between -33.2% and 27.0% for Ispta.3 when using the pressure response measured at a single point to scale the simulation. The spatial distribution of MI...

  4. STEM image simulation with hybrid CPU/GPU programming

    International Nuclear Information System (INIS)

    Yao, Y.; Ge, B.H.; Shen, X.; Wang, Y.G.; Yu, R.C.

    2016-01-01

    STEM image simulation is achieved via hybrid CPU/GPU programming under a parallel algorithm architecture to speed up calculation on a personal computer (PC). To utilize the calculation power of a PC fully, the simulation is performed using the GPU core and multiple CPU cores at the same time, which improves efficiency significantly. GaSb and an artificial GaSb/InAs interface with atom diffusion have been used to verify the computation. The results reveal some unintuitive phenomena about the contrast variation with the atom numbers.

  5. A queuing model for road traffic simulation

    International Nuclear Information System (INIS)

    Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.; Farhi, N.

    2015-01-01

    We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme
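
    The stationary occupancy distribution of an M/G/c/c state-dependent queue has a closed product form (here in the Jain-Smith road-traffic parametrization), which makes such models cheap to evaluate. The sketch below assumes a linear speed-density law and does not reproduce the paper's density-flow variant:

      import numpy as np
      from math import lgamma

      def mgcc_state_dependent(lam, L, v1, c, speed):
          """Stationary occupancy distribution of an M/G/c/c state-dependent queue.

          lam: arrival rate [veh/s]; L: section length [m]; v1: free speed [m/s];
          c: capacity [veh]; speed(n): mean speed with n vehicles present."""
          # log P_n (unnormalised): n*log(lam*L/v1) - log n! - sum_i log f(i), f(i)=speed(i)/v1
          logp = np.zeros(c + 1)
          acc = 0.0
          for n in range(1, c + 1):
              acc += np.log(speed(n) / v1)
              logp[n] = n * np.log(lam * L / v1) - lgamma(n + 1) - acc
          p = np.exp(logp - logp.max())
          return p / p.sum()

      c, v1, L = 100, 25.0, 500.0
      linear = lambda n: v1 * (1 - (n - 1) / c)    # assumed linear speed-density relation
      p = mgcc_state_dependent(lam=0.8, L=L, v1=v1, c=c, speed=linear)
      print("blocking probability:", p[-1], "mean occupancy:", (np.arange(c + 1) * p).sum())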

  6. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  7. Meteosat third generation imager: simulation of the flexible combined imager instrument chain

    Science.gov (United States)

    Just, Dieter; Gutiérrez, Rebeca; Roveda, Fausto; Steenbergen, Theo

    2014-10-01

    The Meteosat Third Generation (MTG) Programme is the next generation of European geostationary meteorological systems. The first MTG satellite, MTG-I1, which is scheduled for launch at the end of 2018, will host two imaging instruments: the Flexible Combined Imager (FCI) and the Lightning Imager. The FCI will provide continuation of the SEVIRI imager operations on the current Meteosat Second Generation satellites (MSG), but with an improved spatial, temporal and spectral resolution, not dissimilar to GOES-R (of NASA/NOAA). Unlike SEVIRI on the spinning MSG spacecraft, the FCI will be mounted on a 3-axis stabilised platform and a 2-axis tapered scan will provide a full coverage of the Earth in 10 minute repeat cycles. Alternatively, a rapid scanning mode can cover smaller areas, but with a better temporal resolution of up to 2.5 minutes. In order to assess some of the data acquisition and processing aspects which will apply to the FCI, a simplified end-to-end imaging chain prototype was set up. The simulation prototype consists of four functional blocks: a function for the generation of FCI-like reference images; an image acquisition simulation function for the FCI line-of-sight calculation and swath generation; a processing function that reverses the swath generation process by rectifying the swath data; and an evaluation function for assessing the quality of the processed data with respect to the reference images. This paper presents an overview of the FCI instrument chain prototype, covering instrument characteristics, reference image generation, image acquisition simulation, and processing aspects. In particular, it describes in detail the generation of reference images, highlighting innovative features but also limitations. This is followed by a description of the image acquisition simulation process and of the rectification and evaluation functions, the latter two being described in more detail in a separate paper. Finally, results

  8. 3D segmentation of scintigraphic images with validation on realistic GATE simulations

    International Nuclear Information System (INIS)

    Burg, Samuel

    2011-01-01

    The objective of this thesis was to propose a new 3D segmentation method for scintigraphic imaging. The first part of the work was to simulate 3D volumes with known ground truth in order to validate one segmentation method against others. Monte Carlo simulations were performed using the GATE software (Geant4 Application for Tomographic Emission). For this, we characterized and modeled the 'γ Imager' gamma camera (Biospace™) by comparing each measurement from a simulated acquisition to its real equivalent. The 'low level' segmentation tool that we developed is based on modeling the levels of the image by probabilistic mixtures. Parameter estimation is done by an SEM algorithm (Stochastic Expectation Maximization). The 3D volume segmentation is achieved by an ICM algorithm (Iterated Conditional Modes). We compared segmentation based on Gaussian and Poisson mixtures to segmentation by thresholding on the simulated volumes. This showed the relevance of the segmentations obtained using probabilistic mixtures, especially those obtained with Poisson mixtures, which were then used to segment real 18F-FDG PET images of the brain and to compute descriptive statistics of the different tissues. In order to obtain a 'high level' segmentation method and find anatomical structures (the necrotic or active part of a tumor, for example), we proposed a process based on the point-process formalism. A feasibility study yielded very encouraging results. (author)
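
    The 'low level' idea, fitting a mixture of Poisson distributions to the count levels and labelling each voxel by its most probable class, can be illustrated with a plain EM fit. This is a simplified stand-in for the SEM algorithm, and the ICM spatial regularization is omitted:

      import numpy as np

      def poisson_mixture_em(counts, n_iter=100):
          """Two-class Poisson mixture fitted with EM; labels are the posterior argmax."""
          x = counts.ravel().astype(float)
          lam = np.array([x.mean() * 0.5, x.mean() * 1.5])   # initial class means
          w = np.array([0.5, 0.5])                           # mixing weights
          for _ in range(n_iter):
              # E-step: log-responsibilities (the log x! term cancels between classes)
              logp = x[:, None] * np.log(lam) - lam + np.log(w)
              logp -= logp.max(axis=1, keepdims=True)
              r = np.exp(logp)
              r /= r.sum(axis=1, keepdims=True)
              # M-step: update mixing weights and Poisson means
              w = r.mean(axis=0)
              lam = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
          return lam, w, r.argmax(axis=1).reshape(counts.shape)

      rng = np.random.default_rng(2)
      truth = np.zeros((64, 64), int)
      truth[20:44, 20:44] = 1                                # 'hot' region
      img = rng.poisson(np.where(truth, 12.0, 4.0))          # toy scintigraphic counts
      lam, w, labels = poisson_mixture_em(img)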

  9. Polarimetric infrared imaging simulation of a synthetic sea surface with Mie scattering.

    Science.gov (United States)

    He, Si; Wang, Xia; Xia, Runqiu; Jin, Weiqi; Liang, Jian'an

    2018-03-01

    A novel method to simulate the polarimetric infrared imaging of a synthetic sea surface with atmospheric Mie scattering effects is presented. The infrared emission, multiple reflections, and infrared polarization of the sea surface and the Mie scattering of aerosols are all included for the first time. First, a new approach to retrieving the radiative characteristics of a wind-roughened sea surface is introduced: a two-scale method of sea surface realization and inverse ray tracing of the light transfer calculation are combined and executed simultaneously, decreasing the consumption of time and memory dramatically. Then the scattering process, in which infrared light emitted from the sea surface propagates through the aerosol particles, is simulated with a polarized-light Monte Carlo model. Transformations of the polarization state of the light are calculated with the Mie theory. Finally, polarimetric infrared images of the sea surface for different environmental conditions and detection parameters are generated based on the scattered light detected by the infrared imaging polarimeter. The results of simulation examples show that our polarimetric infrared imaging simulation can be applied to predict the infrared polarization characteristics of the sea surface, model the oceanic scene, and guide detection in the oceanic environment.

  10. Intraocular Telescopic System Design: Optical and Visual Simulation in a Human Eye Model

    OpenAIRE

    Zoulinakis, Georgios; Ferrer-Blasco, Teresa

    2017-01-01

    Purpose. To design an intraocular telescopic system (ITS) for magnifying retinal image and to simulate its optical and visual performance after implantation in a human eye model. Methods. Design and simulation were carried out with a ray-tracing and optical design software. Two different ITS were designed, and their visual performance was simulated using the Liou-Brennan eye model. The difference between the ITS was their lenses’ placement in the eye model and their powers. Ray tracing in bot...

  11. Simulation of Meteosat Third Generation-Lightning Imager through tropical rainfall measuring mission: Lightning Imaging Sensor data

    Science.gov (United States)

    Biron, Daniele; De Leonibus, Luigi; Laquale, Paolo; Labate, Demetrio; Zauli, Francesco; Melfi, Davide

    2008-08-01

    The Centro Nazionale di Meteorologia e Climatologia Aeronautica recently hosted a fellowship sponsored by Galileo Avionica with the intent to study and perform a simulation of the Meteosat Third Generation - Lightning Imager (MTG-LI) sensor behavior through Tropical Rainfall Measuring Mission - Lightning Imaging Sensor (TRMM-LIS) data. For the next generation of Earth observation geostationary satellites, major operating agencies are planning to include an optical imaging mission that continuously observes lightning pulses in the atmosphere; EUMETSAT has decided in recent years that one of the three candidate missions to be flown on MTG is LI, a Lightning Imager. The MTG-LI mission has no Meteosat Second Generation heritage, but users need to evaluate the possible real-time data output of the instrument in order to agree to including it in the MTG payload. The authors took the expected LI design from the MTG Mission Requirement Document and reprocessed a real lightning dataset, acquired from space by the TRMM-LIS instrument, to produce a simulated MTG-LI lightning dataset. The simulation is performed in several runs, varying the Minimum Detectable Energy and taking into account the processing steps from event detection to the final lightning information. The specific meteorological requirements are defined from the potential use in meteorology of the final lightning information for convection estimation and numerical cloud modeling. Study results show the range of instrument requirement relaxations which lead to minimal reduction in the final lightning information.

  12. X-ray strain tensor imaging: FEM simulation and experiments with a micro-CT.

    Science.gov (United States)

    Kim, Jae G; Park, So E; Lee, Soo Y

    2014-01-01

    In tissue elasticity imaging, measuring the strain tensor components is necessary to solve the inverse problem. However, it is impractical to measure all the tensor components in ultrasound or MRI elastography because of their anisotropic spatial resolution. The objective of this study is to compute 3D strain tensor maps from 3D CT images of a tissue-mimicking phantom. We took 3D micro-CT images of the phantom twice, applying two different mechanical compressions to it. Applying the 3D image correlation technique to the CT images under the different compressions, we computed 3D displacement vectors and strain tensors at every pixel. To evaluate the accuracy of the strain tensor maps, we made a 3D FEM model of the phantom and computed strain tensor maps through FEM simulation. The experimentally obtained strain tensor maps showed patterns similar to the FEM-simulated ones on visual inspection. The correlation between the strain tensor maps obtained from the experiment and the FEM simulation ranges from 0.03 to 0.93. Even though the strain tensor maps suffer from a high noise level, we expect that x-ray strain tensor imaging may find biomedical applications such as malignant tissue characterization and stress analysis inside tissues.
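
    Once a 3D displacement field has been estimated by image correlation of the two CT volumes, the strain tensor at each voxel is the symmetrized displacement gradient. A sketch of that final step only (the correlation itself is assumed already done; array shapes and the voxel size are illustrative):

      import numpy as np

      def strain_tensor(u, spacing=(1.0, 1.0, 1.0)):
          """Infinitesimal strain tensor from a 3D displacement field.

          u: array of shape (3, nz, ny, nx), displacement components in the same
          length units as spacing."""
          grads = np.stack([np.stack(np.gradient(u[i], *spacing), axis=0)
                            for i in range(3)], axis=0)   # grads[i, j] = du_i/dx_j
          # symmetric part: eps_ij = 0.5 * (du_i/dx_j + du_j/dx_i)
          return 0.5 * (grads + grads.transpose(1, 0, 2, 3, 4))   # (3, 3, nz, ny, nx)

      rng = np.random.default_rng(3)
      u = rng.normal(scale=0.01, size=(3, 32, 32, 32))     # toy displacement field
      eps = strain_tensor(u, spacing=(0.05, 0.05, 0.05))   # 50 um voxels, hypothetical
      print(eps[0, 0].mean())                              # mean normal strain eps_xx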

  13. Simulating the X-Ray Image Contrast to Set-Up Techniques with Desired Flaw Detectability

    Science.gov (United States)

    Koshti, Ajay M.

    2015-01-01

    The paper provides simulation data of previous work by the author in developing a model for estimating detectability of crack-like flaws in radiography. The methodology is being developed to help in implementation of NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing X-ray detector resolution for crack detection. Applicability of ASTM E 2737 resolution requirements to the model are also discussed. The paper describes a model for simulating the detector resolution. A computer calculator application, discussed here, also performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs in calculating x-ray flaw size parameter and image contrast for varying input parameters such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source sizes, and detector sensitivity and resolution are given as 3D surfaces. These results demonstrate effect of the input parameters on the flaw size parameter and the simulated image contrast of the crack. These simulations demonstrate utility of the flaw size parameter model in setting up x-ray techniques that provide desired flaw detectability in radiography. The method is applicable to film radiography, computed radiography, and digital radiography.

  14. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business process to change to meet new customer demands or market needs without causing a cascade effect of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT to collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  15. Elastic models application for thorax image registration

    International Nuclear Information System (INIS)

    Correa Prado, Lorena S; Diaz, E Andres Valdez; Romo, Raul

    2007-01-01

    This work consists of the implementation and evaluation of elastic alignment algorithms for biomedical images, which were taken at thorax level and simulated with the 4D NCAT digital phantom. Radial Basis Function (RBF) spatial transformations, a kind of spline which allows carrying out not only global rigid deformations but also local elastic ones, were applied using a point-matching method. The applied functions were Thin Plate Spline (TPS), Multiquadric (MQ), Gaussian and B-Spline, which were evaluated and compared by calculating the Target Registration Error and similarity measures between the registered images (the squared sum of intensity differences (SSD) and the correlation coefficient (CC)). In order to assess the user error incurred in the point-matching and segmentation tasks, two algorithms were also designed to calculate the Fiducial Localization Error. TPS and MQ were demonstrated to have better performance than the others. It was proved that RBFs represent an adequate model for approximating the deformable behaviour of the thorax. Validation algorithms showed the user error was not significant
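
    As an illustration of the point-matching idea, a TPS mapping can be fitted from landmarks in the moving image to landmarks in the fixed image and then evaluated at any point. The sketch below uses SciPy's thin-plate-spline RBF interpolator on made-up landmark coordinates and is not the authors' implementation:

      import numpy as np
      from scipy.interpolate import RBFInterpolator

      # matched landmark pairs (moving -> fixed), coordinates are hypothetical
      moving = np.array([[10., 10.], [10., 90.], [90., 10.], [90., 90.], [50., 50.]])
      fixed  = np.array([[12., 11.], [ 9., 93.], [88., 12.], [91., 88.], [55., 48.]])

      tps = RBFInterpolator(moving, fixed, kernel="thin_plate_spline")

      # where every pixel of a 100x100 grid maps to under the fitted TPS warp
      grid = np.stack(np.meshgrid(np.arange(100.), np.arange(100.), indexing="ij"), -1)
      warped = tps(grid.reshape(-1, 2)).reshape(100, 100, 2)

      # target registration error on a held-out landmark pair (hypothetical values)
      tre = np.linalg.norm(tps(np.array([[30., 70.]])) - np.array([[31., 69.]]))
      print("TRE:", tre)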

  16. Transmission scanning electron microscopy: Defect observations and image simulations.

    Science.gov (United States)

    Callahan, Patrick G; Stinville, Jean-Charles; Yao, Eric R; Echlin, McLean P; Titus, Michael S; De Graef, Marc; Gianola, Daniel S; Pollock, Tresa M

    2018-03-01

    The new capabilities of a FEG scanning electron microscope (SEM) equipped with a scanning transmission electron microscopy (STEM) detector for defect characterization have been studied in parallel with transmission electron microscopy (TEM) imaging. Stacking faults and dislocations have been characterized in strontium titanate, a polycrystalline nickel-base superalloy and a single-crystal cobalt-base material. Imaging modes that are similar to conventional TEM (CTEM) bright field (BF) and dark field (DF) and to STEM are explored, and some of the differences due to the different accelerating voltages are highlighted. Defect images have been simulated for the transmission scanning electron microscopy (TSEM) configuration using a scattering matrix formulation, and diffraction contrast in the SEM is discussed in comparison to TEM. Interference effects associated with conventional TEM, such as thickness fringes and bending contours, are significantly reduced in TSEM by using a convergent probe, similar to a STEM imaging modality, enabling individual defects to be imaged clearly even in high dislocation density regions. Beyond this, TSEM provides significant advantages for high throughput and dynamic in-situ characterization. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis of this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR, PWR and WWER-type APROS core models have been

  18. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excess inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported by Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  19. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    International Nuclear Information System (INIS)

    Bottigli, U.; Brunetti, A.; Golosio, B.; Oliva, P.; Stumbo, S.; Vincze, L.; Randaccio, P.; Bleuet, P.; Simionovici, A.; Somogyi, A.

    2004-01-01

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density is given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed
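
    The heart of such a code, computing the length of a photon step inside each voxel it crosses, can be written compactly in the Siddon style: collect the parametric positions where the ray crosses grid planes and difference them. An illustrative sketch, not the code described above:

      import numpy as np

      def voxel_path_lengths(p0, p1, shape, voxel=1.0):
          """Siddon-style path lengths of the segment p0->p1 through a regular voxel grid.

          Returns (voxel indices, path lengths) for the voxels the segment crosses."""
          p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
          d = p1 - p0
          # parametric values where the ray crosses each grid plane along each axis
          alphas = [(np.arange(n + 1) * voxel - o) / dd
                    for n, o, dd in zip(shape, p0, d) if dd != 0]
          a = np.unique(np.clip(np.concatenate([[0.0, 1.0]] + alphas), 0.0, 1.0))
          mid = (a[:-1] + a[1:]) / 2                 # midpoints identify each voxel
          pts = p0 + np.outer(mid, d)
          idx = np.floor(pts / voxel).astype(int)
          inside = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
          lengths = np.diff(a) * np.linalg.norm(d)
          return idx[inside], lengths[inside]

      idx, ln = voxel_path_lengths([0.2, 0.7, 0.0], [9.5, 8.3, 9.9], shape=(10, 10, 10))
      print(ln.sum(), np.linalg.norm([9.3, 7.6, 9.9]))   # in-grid length vs segment length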

  20. Vermont Yankee simulator BOP model upgrade

    International Nuclear Information System (INIS)

    Alejandro, R.; Udbinac, M.J.

    2006-01-01

    The Vermont Yankee simulator has undergone significant changes in the 20 years since the original order was placed. After the move from the original Unix to MS Windows environment, and upgrade to the latest version of SimPort, now called MASTER, the platform was set for an overhaul and replacement of major plant system models. Over a period of a few months, the VY simulator team, in partnership with WSC engineers, replaced outdated legacy models of the main steam, condenser, condensate, circulating water, feedwater and feedwater heaters, and main turbine and auxiliaries. The timing was ideal, as the plant was undergoing a power up-rate, so the opportunity was taken to replace the legacy models with industry-leading, true on-line object oriented graphical models. Due to the efficiency of design and ease of use of the MASTER tools, VY staff performed the majority of the modeling work themselves with great success, with only occasional assistance from WSC, in a relatively short time-period, despite having to maintain all of their 'regular' simulator maintenance responsibilities. This paper will provide a more detailed view of the VY simulator, including how it is used and how it has benefited from the enhancements and upgrades implemented during the project. (author)

  1. Monte Carlo simulation of grating-based neutron phase contrast imaging at CPHS

    International Nuclear Information System (INIS)

    Zhang Ran; Chen Zhiqiang; Huang Zhifeng; Xiao Yongshun; Wang Xuewu; Wie Jie; Loong, C.-K.

    2011-01-01

    Since the launching of the Compact Pulsed Hadron Source (CPHS) project of Tsinghua University in 2009, work has begun on the design and engineering of an imaging/radiography instrument for the neutron source provided by CPHS. The instrument will perform basic tasks such as transmission imaging and computerized tomography. Additionally, we include in the design the utilization of coded-aperture and grating-based phase contrast methodology, as well as the options of prompt gamma-ray analysis and neutron-energy selective imaging. Previously, we had implemented the hardware and data-analysis software for grating-based X-ray phase contrast imaging. Here, we investigate Geant4-based Monte Carlo simulations of neutron refraction phenomena and then model the grating-based neutron phase contrast imaging system according to the classic-optics-based method. The simulated experimental results of retrieving the phase-shift gradient information by a five-step phase-stepping approach indicate the feasibility of grating-based neutron phase contrast imaging as an option for the cold neutron imaging instrument at the CPHS.
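
    The five-step retrieval itself is a small computation: with one image per grating step over one period, a pixelwise FFT along the step axis yields transmission, differential phase and visibility. A generic sketch of the standard retrieval on synthetic data (not the CPHS pipeline):

      import numpy as np

      def phase_stepping_retrieval(stack):
          """Retrieve transmission, phase and visibility from a phase-stepping scan.

          stack: (N, ny, nx), one image per grating step over one period."""
          f = np.fft.fft(stack, axis=0)
          a0 = np.abs(f[0]) / stack.shape[0]      # mean intensity (transmission)
          a1 = 2 * np.abs(f[1]) / stack.shape[0]  # first-harmonic amplitude
          phi = np.angle(f[1])                    # phase of the stepping curve
          return a0, phi, a1 / a0                 # transmission, phase, visibility

      # five-step scan of a toy phase object
      N, ny, nx = 5, 32, 32
      true_phi = np.linspace(0, 1.0, nx)[None, :] * np.ones((ny, 1))
      k = np.arange(N).reshape(N, 1, 1)
      stack = 100 * (1 + 0.4 * np.cos(2 * np.pi * k / N + true_phi))
      a0, phi, vis = phase_stepping_retrieval(stack)
      # the object's phase-shift gradient signal is phi(sample) - phi(reference)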

  2. Design and assessment of a novel SPECT system for desktop open-gantry imaging of small animals: A simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Zeraatkar, Navid; Farahani, Mohammad Hossein [Research Center for Molecular and Cellular Imaging, Tehran University of Medical Sciences, Tehran 1419733141 (Iran, Islamic Republic of); Rahmim, Arman [Department of Radiology, Johns Hopkins University, Baltimore, Maryland 21287 and Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, Maryland 21218 (United States); Sarkar, Saeed [Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences, Tehran 1417613151, Iran and Research Center for Science and Technology in Medicine, Tehran University of Medical Sciences, Tehran 1419733141 (Iran, Islamic Republic of); Ay, Mohammad Reza, E-mail: mohammadreza-ay@sina.tums.ac.ir [Research Center for Molecular and Cellular Imaging, Tehran University of Medical Sciences, Tehran 1419733141, Iran and Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences, Tehran 1417613151 (Iran, Islamic Republic of)

    2016-05-15

    Purpose: Given increasing efforts in biomedical research utilizing molecular imaging methods, development of dedicated high-performance small-animal SPECT systems has been growing rapidly in the last decade. In the present work, we propose and assess an alternative concept for SPECT imaging enabling desktop open-gantry imaging of small animals. Methods: The system, PERSPECT, consists of an imaging desk, with a set of tilted detector and pinhole collimator placed beneath it. The object to be imaged is simply placed on the desk. Monte Carlo (MC) and analytical simulations were utilized to accurately model and evaluate the proposed concept and design. Furthermore, a dedicated image reconstruction algorithm, finite-aperture-based circular projections (FABCP), was developed and validated for the system, enabling more accurate modeling of the system and higher quality reconstructed images. Image quality was quantified as a function of different tilt angles in the acquisition and number of iterations in the reconstruction algorithm. Furthermore, more complex phantoms including Derenzo, Defrise, and mouse whole body were simulated and studied. Results: The sensitivity of the PERSPECT was 207 cps/MBq. It was quantitatively demonstrated that for a tilt angle of 30°, comparable image qualities were obtained in terms of normalized squared error, contrast, uniformity, noise, and spatial resolution measurements, the latter at ∼0.6 mm. Furthermore, quantitative analyses demonstrated that 3 iterations of FABCP image reconstruction (16 subsets/iteration) led to optimally reconstructed images. Conclusions: The PERSPECT, using a novel imaging protocol, can achieve comparable image quality performance in comparison with a conventional pinhole SPECT with the same configuration. The dedicated FABCP algorithm, which was developed for reconstruction of data from the PERSPECT system, can produce high quality images for small-animal imaging via accurate modeling of the system as

  3. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  4. Monte Carlo simulation of the iView GT portal imager dosimetry

    International Nuclear Information System (INIS)

    Juste, B.; Miro, R.; Diez, S.; Campayo, J.M.; Verdu, G.

    2010-01-01

    This work is mainly focused on developing a methodology to obtain portal dosimetry with an amorphous silicon electronic portal imaging device (EPID) by means of Monte Carlo simulations and experimental measurements. Accordingly, pixel intensity values of portal images have been compared with the dose measured with an ionization chamber and the dose obtained from Monte Carlo simulations. To that end, several images were acquired with the Elekta iView GT EPID using an attenuator phantom slab (10 cm thickness of solid water) and a 6 MV photon beam with different numbers of monitor units. The average pixel value in a region of interest (ROI) centered on the beam axis was extracted from each image and compared to dose measurements performed with the ionization chamber. These parameters were found to be linearly correlated with the number of monitor units (MU). Since MCNP5 simulations allow calculating the deposited dose in the ROI within the phosphor layer of the EPID model, we can compare the portal dose with the simulated transit dose in order to perform a treatment control.

  5. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  6. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance for re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models. Thus...... the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration....

  7. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object oriented methods. The result was a library of C++ classes, for use both when building component-based models and when

  8. Preliminary multidomain modelling and simulation study

    African Journals Online (AJOL)

    Preliminary multidomain modelling and simulation study of a horizontal axis wind turbine (HAWT) tower vibration. I. Iliyasu, I. Iliyasu, I. K. Tanimu and D. O. Obada, Department of Mechanical Engineering, Ahmadu Bello University, Zaria, Kaduna State, Nigeria.

  9. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  10. Thermohydraulic modeling and simulation of breeder reactors

    International Nuclear Information System (INIS)

    Agrawal, A.K.; Khatib-Rahbar, M.; Curtis, R.T.; Hetrick, D.L.; Girijashankar, P.V.

    1982-01-01

    This paper deals with the modeling and simulation of system-wide transients in LMFBRs. Unprotected events (i.e., the presumption of failure of the plant protection system) leading to core-melt are not considered in this paper. The existing computational capabilities in the area of protected transients in the US are noted. Various physical and numerical approximations that are made in these codes are discussed. Finally, the future direction in the area of model verification and improvements is discussed

  11. Image formation simulation for computer-aided inspection planning of machine vision systems

    Science.gov (United States)

    Irgenfried, Stephan; Bergmann, Stephan; Mohammadikaji, Mahsa; Beyerer, Jürgen; Dachsbacher, Carsten; Wörn, Heinz

    2017-06-01

    In this work, a simulation toolset for Computer Aided Inspection Planning (CAIP) of systems for automated optical inspection (AOI) is presented along with a versatile two-robot-setup for verification of simulation and system planning results. The toolset helps to narrow down the large design space of optical inspection systems in interaction with a system expert. The image formation taking place in optical inspection systems is simulated using GPU-based real time graphics and high quality off-line-rendering. The simulation pipeline allows a stepwise optimization of the system, from fast evaluation of surface patch visibility based on real time graphics up to evaluation of image processing results based on off-line global illumination calculation. A focus of this work is on the dependency of simulation quality on measuring, modeling and parameterizing the optical surface properties of the object to be inspected. The applicability to real world problems is demonstrated by taking the example of planning a 3D laser scanner application. Qualitative and quantitative comparison results of synthetic and real images are presented.

  12. Simulation of image formation in x-ray coded aperture microscopy with polycapillary optics.

    Science.gov (United States)

    Korecki, P; Roszczynialski, T P; Sowa, K M

    2015-04-06

    In x-ray coded aperture microscopy with polycapillary optics (XCAMPO), the microstructure of focusing polycapillary optics is used as a coded aperture and enables depth-resolved x-ray imaging at a resolution better than the focal spot dimensions. Improvements in the resolution and development of 3D encoding procedures require a simulation model that can predict the outcome of XCAMPO experiments. In this work we introduce a model of image formation in XCAMPO which enables calculation of XCAMPO datasets for arbitrary positions of the object relative to the focal plane as well as incorporation of optics imperfections. In the model, the exit surface of the optics is treated as a micro-structured x-ray source that illuminates a periodic object. This makes it possible to express the intensity of XCAMPO images as a convolution series and to perform simulations by means of fast Fourier transforms. For non-periodic objects, the model can be applied by enforcing artificial periodicity and setting the spatial period larger than the field-of-view. Simulations are verified by comparison with experimental data.
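
    The convolution-series formulation is what makes FFT evaluation possible: padding both the source pattern and the object map to a common period larger than the field of view enforces the artificial periodicity and turns the image computation into products in Fourier space. A toy sketch of that step (illustrative arrays, not the XCAMPO model itself):

      import numpy as np

      def periodic_convolution(source, obj, pad_to):
          """Convolve a coded source pattern with an object transmission map via FFTs.

          Zero-padding both arrays to pad_to (larger than the field of view)
          enforces the artificial periodicity mentioned above."""
          S = np.fft.fft2(source, s=pad_to)
          O = np.fft.fft2(obj, s=pad_to)
          return np.real(np.fft.ifft2(S * O))

      rng = np.random.default_rng(4)
      source = (rng.random((64, 64)) < 0.3).astype(float)   # toy coded source pattern
      obj = np.ones((64, 64))
      obj[24:40, 24:40] = 0.5                               # toy object transmission
      img = periodic_convolution(source, obj, pad_to=(192, 192))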

  13. A Fast Visible-Infrared Imaging Radiometer Suite Simulator for Cloudy Atmopheres

    Science.gov (United States)

    Liu, Chao; Yang, Ping; Nasiri, Shaima L.; Platnick, Steven; Meyer, Kerry G.; Wang, Chen Xi; Ding, Shouguo

    2015-01-01

    A fast instrument simulator is developed to simulate the observations made in cloudy atmospheres by the Visible Infrared Imaging Radiometer Suite (VIIRS). The correlated k-distribution (CKD) technique is used to compute the transmissivity of absorbing atmospheric gases. The bulk scattering properties of ice clouds used in this study are based on the ice model used for the MODIS Collection 6 ice cloud products. Two fast radiative transfer models based on pre-computed ice cloud look-up tables are used for the VIIRS solar and infrared channels. The accuracy and efficiency of the fast simulator are quantified by comparison with a combination of the rigorous line-by-line (LBLRTM) and discrete ordinate radiative transfer (DISORT) models. Relative errors in the simulated TOA reflectances for the solar channels are less than 2%, and the brightness temperature differences for the infrared channels are less than 0.2 K. The simulator is over three orders of magnitude faster than the benchmark LBLRTM+DISORT model. Furthermore, the cloudy atmosphere reflectances and brightness temperatures from the fast VIIRS simulator compare favorably with those from VIIRS observations.

  14. An Image-Based Finite Element Approach for Simulating Viscoelastic Response of Asphalt Mixture

    Directory of Open Access Journals (Sweden)

    Wenke Huang

    2016-01-01

    Full Text Available This paper presents an image-based micromechanical modeling approach to predict the viscoelastic behavior of asphalt mixture. An improved image analysis technique based on the OTSU thresholding operation was employed to reduce the beam hardening effect in X-ray CT images. We developed a voxel-based 3D digital reconstruction model of the asphalt mixture from the processed CT images. In this 3D model, the aggregate phase and air voids were considered as elastic materials while the asphalt mastic phase was considered as a linear viscoelastic material. The viscoelastic constitutive model of asphalt mastic was implemented in a finite element code using the ABAQUS user material subroutine (UMAT). An experimental procedure for determining the parameters of the viscoelastic constitutive model at a given temperature was proposed. To examine the capability of the model and the accuracy of the parameters, comparisons between the numerical predictions and the observed laboratory results of bending and compression tests were conducted. Finally, the verified digital sample of the asphalt mixture was used to predict its viscoelastic behavior under dynamic loading and creep-recovery loading. Simulation results showed that the presented image-based digital sample may be appropriate for predicting the mechanical behavior of asphalt mixture when all the mechanical properties for the different phases become available.
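
    The OTSU thresholding step mentioned above picks the grey level that maximizes the between-class variance of the image histogram. A minimal sketch of the standard estimator; the paper's beam-hardening handling around it is not reproduced:

      import numpy as np

      def otsu_threshold(img, nbins=256):
          """Otsu's threshold: maximize the between-class variance of the histogram."""
          hist, edges = np.histogram(img, bins=nbins)
          p = hist / hist.sum()
          w0 = np.cumsum(p)                       # class-0 probability per candidate cut
          mids = (edges[:-1] + edges[1:]) / 2
          m = np.cumsum(p * mids)                 # cumulative mean
          mt = m[-1]                              # global mean
          with np.errstate(divide="ignore", invalid="ignore"):
              between = (mt * w0 - m) ** 2 / (w0 * (1 - w0))
          return mids[np.nanargmax(between)]

      rng = np.random.default_rng(5)
      slice_ = np.concatenate([rng.normal(60, 10, 5000), rng.normal(140, 12, 5000)])
      t = otsu_threshold(slice_)
      mask = slice_ > t     # bright (aggregate-like) vs dark (mastic/void), illustrative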

  15. Optical computing for image bandwidth compression: analysis and simulation.

    Science.gov (United States)

    Hunt, B R

    1978-09-15

    Image bandwidth compression is dominated by digital methods for carrying out the required computations. This paper discusses the general problem of using optics to realize the computations in bandwidth compression. A common method of digital bandwidth compression, feedback differential pulse code modulation (DPCM), is reviewed, and the obstacles to making a direct optical analogy to feedback DPCM are discussed. Instead of a direct optical analogy to DPCM, an optical system which captures the essential features of DPCM without optical feedback is introduced. The essential features of this incoherent optical system are encoding of low-frequency information and generation of difference samples which can be coded with a small number of bits. A simulation of this optical system by means of digital image processing is presented, and performance data are also included.
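
    As a reminder of how feedback DPCM works: the encoder quantizes the difference between each sample and a prediction formed from previously decoded samples, so the decoder can track the encoder exactly. A textbook sketch along one scan line (parameter values are arbitrary):

      import numpy as np

      def dpcm_encode_decode(signal, nbits=3, step=4.0):
          """Feedback DPCM along one scan line with a previous-sample predictor."""
          qmax = 2 ** (nbits - 1) - 1
          recon = np.empty_like(signal, dtype=float)
          codes = np.empty(len(signal), dtype=int)
          pred = 0.0
          for i, x in enumerate(signal):
              q = int(np.clip(round((x - pred) / step), -qmax - 1, qmax))
              codes[i] = q                   # small-alphabet symbol to be transmitted
              recon[i] = pred + q * step     # what the decoder reconstructs
              pred = recon[i]                # prediction from decoded, not raw, samples
          return codes, recon

      line = 128 + 20 * np.sin(np.linspace(0, 6, 256))   # toy image scan line
      codes, recon = dpcm_encode_decode(line)
      print("max abs error:", np.abs(line - recon).max())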

  16. Simulated multipolarized MAPSAR images to distinguish agricultural crops

    Directory of Open Access Journals (Sweden)

    Wagner Fernando Silva

    2012-06-01

    Full Text Available Many researchers have shown the potential of Synthetic Aperture Radar (SAR) images for agricultural applications, particularly for monitoring regions with limitations in terms of acquiring cloud-free optical images. Recently, Brazil and Germany began a feasibility study on the construction of an orbital L-band SAR sensor referred to as MAPSAR (Multi-Application Purpose SAR). This sensor provides L-band images at three spatial resolutions and has polarimetric, interferometric and stereoscopic capabilities. Thus, studies are needed to evaluate the potential of future MAPSAR images. The objective of this study was to evaluate multipolarized MAPSAR images simulated by the airborne SAR-R99B sensor to distinguish coffee, cotton and pasture fields in Brazil. Discrimination among crops was evaluated through graphical and cluster analysis of mean backscatter values, considering single, dual and triple polarizations. The planting row direction of coffee influenced the backscatter, so coffee was divided into two classes: parallel and perpendicular to the sensor look direction. Single polarizations had poor ability to discriminate the crops; the overall accuracies were less than 59%, but the understanding of the microwave interaction with the crops could be explored. Combinations of two polarizations could differentiate various crop fields, highlighting the combination VV-HV, which reached 78% overall accuracy. The use of three polarizations resulted in 85.4% overall accuracy, indicating that the classes pasture and parallel coffee were fully discriminated from the other classes. These results confirmed the potential of multipolarized MAPSAR images to distinguish the studied crops and showed considerable improvement in the accuracy of the results when the number of polarizations was increased.

  17. Image Restoration for Fluorescence Planar Imaging with Diffusion Model

    Directory of Open Access Journals (Sweden)

    Xuanxuan Zhang

    2017-01-01

    Full Text Available Fluorescence planar imaging (FPI) fails to capture high-resolution images of deep fluorochromes due to photon diffusion. This paper presents an image restoration method to deal with this kind of blurring. The scheme is conceived from a reconstruction method in fluorescence molecular tomography (FMT) based on the diffusion model. A new unknown parameter is defined by introducing the first mean value theorem for definite integrals. The system matrix converting this unknown parameter to the blurry image is constructed from the elements of depth conversion matrices related to a chosen plane called the focal plane. Results of phantom and mouse experiments show that the proposed method is capable of reducing the blurring of FPI images caused by photon diffusion when the depth of the focal plane is chosen within a proper interval around the true depth of the fluorochrome. This method will be helpful for estimating the size of deep fluorochromes.

  18. Estimating classification images with generalized linear and additive models.

    Science.gov (United States)

    Knoblauch, Kenneth; Maloney, Laurence T

    2008-12-22

    Conventional approaches to modeling classification image data can be described in terms of a standard linear model (LM). We show how the problem can be characterized as a Generalized Linear Model (GLM) with a Bernoulli distribution. We demonstrate via simulation that this approach is more accurate in estimating the underlying template in the absence of internal noise. With increasing internal noise, however, the advantage of the GLM over the LM decreases and GLM is no more accurate than LM. We then introduce the Generalized Additive Model (GAM), an extension of GLM that can be used to estimate smooth classification images adaptively. We show that this approach is more robust to the presence of internal noise, and finally, we demonstrate that GAM is readily adapted to estimation of higher order (nonlinear) classification images and to testing their significance.
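The GLM formulation above lends itself to a compact sketch. The following minimal example, with an invented template and simulated-observer responses, fits a Bernoulli-family GLM with statsmodels and recovers the classification image as the fitted weights; everything in it is an illustrative assumption rather than the authors' code.

```python
# Estimate a classification image as the weights of a Bernoulli GLM.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_trials, n_pix = 4000, 64
template = np.zeros(n_pix)
template[24:40] = 1.0                                  # true linear template

noise = rng.normal(0.0, 1.0, (n_trials, n_pix))        # stimulus noise fields
internal = rng.normal(0.0, 0.5, n_trials)              # observer internal noise
responses = (noise @ template + internal > 0).astype(int)

design = sm.add_constant(noise)
fit = sm.GLM(responses, design, family=sm.families.Binomial()).fit()
class_image = fit.params[1:]                           # drop the intercept

print("correlation with true template:",
      np.corrcoef(class_image, template)[0, 1])
```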

  19. Modeling and interpretation of images*

    Directory of Open Access Journals (Sweden)

    Min Michiel

    2015-01-01

    Full Text Available Imaging protoplanetary disks is a challenging but rewarding task. It is challenging because of the glare of the central star outshining the weak signal from the disk at shorter wavelengths, and because of the limited spatial resolution at longer wavelengths. It is rewarding because an image contains a wealth of information on the structure of the disk and can directly probe features like gaps and spiral structure. Because it is so challenging, telescopes are often pushed to their limits to get a signal. Proper interpretation of these images therefore requires intimate knowledge of the instrumentation, the detection method, and the image processing steps. In this chapter I give some examples and stress some issues that are important when interpreting images of protoplanetary disks.

  20. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.

    Science.gov (United States)

    Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K

    2013-05-01

    Augmented reality (AR) is a technology which enables users to see the real world with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were acquired using a 64-channel multidetector scanner. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group), and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire administered after using the simulator. The group trained using the AR simulator was more proficient at IV injection in real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  1. Twitter's tweet method modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances with real numbers. These models were finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The paper also addresses the methods best suited to organized, targeted promotion on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company organization. The paper implements system dynamics concepts of Twitter marketing methods modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  2. Advances in NLTE modeling for integrated simulations

    Science.gov (United States)

    Scott, H. A.; Hansen, S. B.

    2010-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different atomic species for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly-excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with sufficient accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short time steps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  3. Advances in NLTE Modeling for Integrated Simulations

    International Nuclear Information System (INIS)

    Scott, H.A.; Hansen, S.B.

    2009-01-01

    The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, Δn = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

  4. SIMULATION MODELING OF IT PROJECTS BASED ON PETRI NETS

    Directory of Open Access Journals (Sweden)

    Александр Михайлович ВОЗНЫЙ

    2015-05-01

    Full Text Available An integrated simulation model of an IT project, based on a modified Petri net that combines the product model with the model of project tasks, is proposed. A substantive interpretation of the simulation model's components is presented, and the simulation process is described. Conclusions are drawn about the integration of the product model and the project task model.
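For readers unfamiliar with the formalism, the sketch below simulates token flow in a tiny Petri net; the net itself (two project tasks competing for one developer) is our illustrative assumption, not the model from the paper.

```python
# Minimal Petri net simulator: transitions fire while enabled, consuming
# tokens from input places and producing tokens in output places.
import random

places = {"task1_ready": 1, "task2_ready": 1, "developer": 1, "done": 0}
transitions = {
    "do_task1": ({"task1_ready": 1, "developer": 1}, {"developer": 1, "done": 1}),
    "do_task2": ({"task2_ready": 1, "developer": 1}, {"developer": 1, "done": 1}),
}

def enabled(pre):
    return all(places[p] >= n for p, n in pre.items())

while True:
    fireable = [t for t, (pre, _) in transitions.items() if enabled(pre)]
    if not fireable:
        break                                   # deadlock or all work done
    name = random.choice(fireable)              # non-deterministic choice
    pre, post = transitions[name]
    for p, n in pre.items():
        places[p] -= n                          # consume input tokens
    for p, n in post.items():
        places[p] += n                          # produce output tokens
    print("fired", name, "->", places)
```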

  5. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
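The Mann-Whitney check mentioned above is easy to reproduce in outline. The sketch below compares two sets of tally counts with scipy; the synthetic Poisson data stand in for real GATE output and are purely an assumption.

```python
# Compare tally distributions from two runs with a Mann-Whitney U test.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
tallies_sequential = rng.poisson(lam=1000, size=50)   # stand-in for GATE output
tallies_parallel = rng.poisson(lam=1000, size=50)

stat, p_value = mannwhitneyu(tallies_sequential, tallies_parallel,
                             alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")   # large p: no detectable difference
```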

  6. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. This book presents a study of basic locomotion forms such as walking and running, which are of particular interest due to the high demands on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are: modeling techniques for anthropomorphic bipedal walking systems; optimized walking motions for different objective functions; identification of objective functions from measurements; simulation and optimization approaches for humanoid robots; biologically inspired con...

  7. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups ranging from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular, the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory of granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented, in particular the most popular numerical approaches of computational fluid dynamics codes (i.e., Direct Numerical Simulation, Large Eddy Simulation, and the Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  8. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    The results of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems are described, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest-precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  9. Numerical model simulation of atmospheric coolant plumes

    International Nuclear Information System (INIS)

    Gaillard, P.

    1980-01-01

    The effect of humid atmospheric coolants on the atmosphere is simulated by means of a three-dimensional numerical model. The atmosphere is defined by its natural vertical profiles of horizontal velocity, temperature, pressure and relative humidity. Effluent discharge is characterised by its vertical velocity and the temperature of air saturated with water vapour. The subject of investigation is the area in the vicinity of the point of discharge, with due allowance for the wake effect of the tower and buildings and, where applicable, wind veer with altitude. The model equations express the conservation relationships for momentum, energy, total mass and water mass, for an incompressible fluid behaving in accordance with the Boussinesq assumptions. Condensation is represented by a simple thermodynamic model, and turbulent fluxes are simulated by introducing turbulent viscosity and diffusivity data based on in-situ and experimental water model measurements. The three-dimensional problem expressed in terms of the primitive variables (u, v, w, p) is governed by an elliptic equation system which is solved numerically by applying an explicit time-marching algorithm to predict the steady-flow velocity distribution, temperature, water vapour concentration and the liquid-water concentration defining the visible plume. Still-air conditions are simulated by a program solving the elliptic equations in an axisymmetric coordinate system. The calculated visible plumes are compared with plumes observed on site in order to validate the models. [fr]

  10. Marketing evaluation model of the territorial image

    Directory of Open Access Journals (Sweden)

    Bacherikova M. L.

    2017-08-01

    Full Text Available This article analyzes the existing models for assessing the image of a territory and concludes that a model is needed that assesses the territorial image while taking into account all the main target audiences. The study of territorial image models in the scientific literature was carried out by traditional (non-formalized) analysis of documents, drawing on scientific publications by Russian and foreign authors. The author suggests using the «ideal point» model to assess the image of the territory. The assessment should be carried out for all groups of consumers, using weight coefficients that reflect the importance of opinions and the number of respondents in each group.
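As a hedged numeric illustration of how an «ideal point» evaluation with group weights might look, the sketch below scores a territory against an ideal attribute profile for two audience groups; every attribute, score, and weight in it is invented.

```python
# Weighted "ideal point" image assessment across target audiences.
import numpy as np

attr_weights = np.array([0.4, 0.3, 0.3])      # attribute importance
ideal = np.array([9.0, 8.0, 9.0])             # the "ideal point" profile

group_scores = {"residents": np.array([7.0, 6.5, 8.0]),
                "investors": np.array([6.0, 7.5, 7.0])}
group_weights = {"residents": 0.6, "investors": 0.4}

# Each group's distance from the ideal, aggregated with group weights.
image_index = sum(
    weight * np.sqrt(np.sum(attr_weights * (group_scores[g] - ideal) ** 2))
    for g, weight in group_weights.items()
)
print(f"weighted distance from ideal image: {image_index:.2f} (lower is better)")
```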

  11. Modeling, simulation, and analysis of optical remote sensing systems

    Science.gov (United States)

    Kerekes, John Paul; Landgrebe, David A.

    1989-01-01

    Remote sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process, while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990s. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler, method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vectors and covariance matrices. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms, and an estimate is made of the resulting classification accuracy among the informational classes. These models are applied to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.
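One standard way (not necessarily the one used in this work) to turn class mean vectors and covariance matrices into a separability or accuracy estimate is the Bhattacharyya distance, which bounds the pairwise Bayes error. The two-band class statistics below are invented for illustration.

```python
# Pairwise class separability from spectral means and covariances.
import numpy as np

def bhattacharyya(m1, S1, m2, S2):
    S = 0.5 * (S1 + S2)
    dm = m2 - m1
    term1 = 0.125 * dm @ np.linalg.solve(S, dm)
    term2 = 0.5 * np.log(np.linalg.det(S)
                         / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2

m_veg = np.array([0.05, 0.40]);  S_veg = np.diag([0.001, 0.004])
m_soil = np.array([0.20, 0.25]); S_soil = np.diag([0.002, 0.002])

B = bhattacharyya(m_veg, S_veg, m_soil, S_soil)
# For equal priors, the Bayes error is bounded by 0.5 * exp(-B).
print(f"B = {B:.2f}, pairwise error bound = {0.5 * np.exp(-B):.4f}")
```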

  12. A Simulation Model for Extensor Tendon Repair

    Directory of Open Access Journals (Sweden)

    Elizabeth Aronstam

    2017-07-01

    Full Text Available Audience: This simulation model is designed for use by emergency medicine residents. Although we have instituted this at the PGY-2 level of our residency curriculum, it is appropriate for any level of emergency medicine residency training. It might also be adapted for a variety of other learners, such as practicing emergency physicians, orthopedic surgery residents, or hand surgery trainees. Introduction: Tendon injuries commonly present to the emergency department, so it is essential that emergency physicians be competent in evaluating such injuries. Indeed, extensor tendon repair is included as an ACGME Emergency Medicine Milestone (Milestone 13, Wound Management, Level 5: "Performs advanced wound repairs, such as tendon repairs…").1 However, emergency medicine residents may have limited opportunity to develop these skills due to a lack of patients, competition from other trainees, or preexisting referral patterns. Simulation may provide an alternative means to effectively teach these skills in such settings. Previously described tendon repair simulation models designed for surgical trainees have used rubber worms,4 licorice,5 feeding tubes, catheters,6,7 drinking straws,8 microfoam tape,9 sheep forelimbs10 and cadavers.11 These models all suffer a variety of limitations, including high cost, lack of ready availability, or lack of realism. Objectives: We sought to develop an extensor tendon repair simulation model for emergency medicine residents, designed to meet ACGME Emergency Medicine Milestone 13, Level 5. We wished this model to be simple, inexpensive, and realistic. Methods: The learner-responsible content/educational handout component of our innovation teaches residents about emergency department extensor tendon repair and includes: (1) relevant anatomy; (2) indications and contraindications for emergency department extensor tendon repair; (3) physical exam findings; (4) tendon suture techniques; and (5) aftercare. During

  13. Connections model for tomographic images reconstruction

    International Nuclear Information System (INIS)

    Rodrigues, R.G.S.; Pela, C.A.; Roque, S.F. A.C.

    1998-01-01

    This paper presents an artificial neural network with a topology suited to tomographic image reconstruction. The associated error function is derived and the learning algorithm is constructed. Simulation results are presented and demonstrate the existence of a generalized solution for nets with linear activation functions. (Author)

  14. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units for reuse by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general-purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers, which can in turn be used to build system models. Several applications are described: a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant, and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  15. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity requires sustained action that is often characterized by a degree of uncertainty or insecurity with respect to the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a realistic economic horizon. Often in such cases, the simulation technique is considered the only available alternative. Using simulation techniques to study real-world systems often requires laborious work. Performing a simulation experiment is a process that takes place in several stages.

  16. Simulation as a surgical teaching model.

    Science.gov (United States)

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos

    2018-01-01

    Teaching of surgery has been affected by many factors over the last years, such as the reduction of working hours, the optimization of operating room use, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on surgeons' training. Simulation as a teaching model minimizes such impact, and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow teaching to be individualized, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive or behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  17. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book provides basic support for a master's course in electromagnetism oriented to numerical simulation. Its main goal is to acquaint the reader with the boundary-value problems of partial differential equations that must be solved in order to perform computer simulation of electromagnetic processes. Moreover, it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely from the full Maxwell's equations to the particular cases of electrostatics, direct current, magnetostatics and eddy-current models. Apart from standard exercises related to analytical calculus, the book includes some oriented to real-life applications solved with the free simulation software MaxFEM.

  18. Abstract ID: 197 Monte Carlo simulations of X-ray grating interferometry based imaging systems.

    Science.gov (United States)

    Tessarini, Stefan; Fix, Michael K; Volken, Werner; Frei, Daniel; Stampanoni, Marco F M

    2018-01-01

    Over the last couple of years, the implementation of Monte Carlo (MC) methods for grating-based imaging techniques has attracted increasing interest. Several different approaches have been taken to include coherent effects in MC in order to simulate the radiation transport of the image-forming procedure. These include full MC using FLUKA [1], which, however, considers only monochromatic sources. Alternatively, ray-tracing-based MC [2] allows fast simulations but provides only qualitative results, i.e., this technique is not suitable for dose calculation in the imaged object. Finally, hybrid models [3] allow quantitative results in reasonable computation time, but only two-dimensional implementations are available. Thus, this work aims to develop a full MC framework for X-ray grating interferometry imaging systems using polychromatic sources, suitable for large-scale samples. For this purpose the EGSnrc C++ MC code system is extended to take Snell's law, the optical path length and the Huygens principle into account. Thereby the EGSnrc library was modified; e.g., the complex index of refraction has to be assigned to each region depending on the material. The framework is set up to be user-friendly and robust with respect to future updates of the EGSnrc package. These implementations have to be tested using dedicated academic situations. Next steps include validation by comparing measurements for different setups with the corresponding MC simulations. Furthermore, the newly developed implementation will be compared with other simulation approaches. This framework will then serve as a basis for dose calculation on CT data and has further potential to investigate the image formation process in grating-based imaging systems. Copyright © 2017.

  19. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to define the utility of Facebook's Personal Page marketing method. This tool, which Facebook provides, is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology for the proof of concept of the models and modelling processes. The following model has been developed for a social media marketing agent/company, oriented to the Facebook platform and tested in real circumstances. The model was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which is to bring in new customers, keep the interest of existing customers, and deliver traffic to its website.

  20. Modeling and simulation of photovoltaic solar panel

    International Nuclear Information System (INIS)

    Belarbi, M.; Haddouche, K.; Midoun, A.

    2006-01-01

    In this article, we present a new approach for estimating the model parameters of a photovoltaic solar panel as functions of irradiance and temperature. The parameters of the one-diode model are obtained from knowledge of three operating points: short circuit, open circuit, and maximum power. In the first step, the system of equations defined by the three operating points is solved to express all the model parameters in terms of the series resistance. Second, an iterative solution at the maximum power point using the Newton-Raphson method yields the series resistance value and, from it, the remaining model parameters. Once the panel model is identified, further equations take the irradiance and temperature effects into account. The simulation results show the convergence speed of the model parameters and the possibility of visualizing the electrical behaviour of the panel as a function of irradiance and temperature. Note that the algorithm is sensitive at the optimal operating point: a small variation in the optimal voltage leads to a very large variation in the identified parameter values. With the identified model, we can develop maximum power point tracking algorithms and simulate a solar water pumping system. (Author)
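To illustrate the kind of Newton-Raphson iteration involved, the sketch below solves the implicit one-diode equation I = Iph - I0·(exp((V + I·Rs)/a) - 1) - (V + I·Rs)/Rsh for the current at a given voltage. The parameter values (roughly a 36-cell module) are our assumptions, and the sketch solves for current rather than reproducing the paper's series-resistance extraction.

```python
# Newton-Raphson solution of the implicit one-diode PV equation.
import numpy as np

Iph, I0, Rs, Rsh = 5.0, 1e-9, 0.2, 200.0   # assumed module parameters
a = 1.3 * 36 * 0.02585                     # modified ideality: n * Ns * Vt

def current_at(V, I=5.0, tol=1e-10, max_iter=100):
    for _ in range(max_iter):
        e = np.exp((V + I * Rs) / a)
        f = Iph - I0 * (e - 1.0) - (V + I * Rs) / Rsh - I
        dfdI = -I0 * e * Rs / a - Rs / Rsh - 1.0   # df/dI
        step = f / dfdI
        I -= step
        if abs(step) < tol:
            break
    return I

for V in (0.0, 15.0, 26.0):        # sweep from short circuit toward Voc
    print(f"V = {V:5.1f} V  ->  I = {current_at(V):.3f} A")
```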

  1. Simulation of Image Performance Characteristics of the Landsat Data Continuity Mission (LDCM) Thermal Infrared Sensor (TIRS)

    Science.gov (United States)

    Schott, John; Gerace, Aaron; Brown, Scott; Gartley, Michael; Montanaro, Matthew; Reuter, Dennis C.

    2012-01-01

    The next Landsat satellite, which is scheduled for launch in early 2013, will carry two instruments: the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). Significant design changes over previous Landsat instruments have been made to these sensors to potentially enhance the quality of Landsat image data. TIRS, which is the focus of this study, is a dual-band instrument that uses a push-broom style architecture to collect data. To help understand the impact of design trades during instrument build, an effort was initiated to model TIRS imagery. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) tool was used to produce synthetic "on-orbit" TIRS data with detailed radiometric, geometric, and digital image characteristics. This work presents several studies that used DIRSIG simulated TIRS data to test the impact of engineering performance data on image quality in an effort to determine if the image data meet specifications or, in the event that they do not, to determine if the resulting image data are still acceptable.

  2. A simulation model for material accounting systems

    International Nuclear Information System (INIS)

    Coulter, C.A.; Thomas, K.E.

    1987-01-01

    A general-purpose model that was developed to simulate the operation of a chemical processing facility for nuclear materials has been extended to describe material measurement and accounting procedures as well. The model now provides descriptors for material balance areas, a large class of measurement instrument types and their associated measurement errors for various classes of materials, the measurement instruments themselves with their individual calibration schedules, and material balance closures. Delayed receipt of measurement results (as for off-line analytical chemistry assay), with interim use of a provisional measurement value, can be accurately represented. The simulation model can be used to estimate inventory difference variances for processing areas that do not operate at steady state, to evaluate the timeliness of measurement information, to determine process impacts of measurement requirements, and to evaluate the effectiveness of diversion-detection algorithms. Such information is usually difficult to obtain by other means. Use of the measurement simulation model is illustrated by applying it to estimate inventory difference variances for two material balance area structures of a fictitious nuclear material processing line

  3. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies

  4. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  5. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a realistic computational phantom. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female called Rad-HUMAN was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose for Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  6. Image based Monte Carlo Modeling for Computational Phantom

    Science.gov (United States)

    Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican

    2014-06-01

    The evaluation of the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a realistic computational phantom. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn). The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female called Rad-HUMAN was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose for Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection.

  7. Superresolving Black Hole Images with Full-Closure Sparse Modeling

    Science.gov (United States)

    Crowley, Chelsea; Akiyama, Kazunori; Fish, Vincent

    2018-01-01

    It is believed that almost all galaxies have black holes at their centers. Imaging a black hole is a primary objective for answering scientific questions relating to relativistic accretion and jet formation. The Event Horizon Telescope (EHT) is set to capture images of two nearby black holes: Sagittarius A*, at the center of the Milky Way roughly 26,000 light years away, and the black hole at the center of M87 (Virgo A), a large elliptical galaxy 50 million light years away. Sparse imaging techniques have shown great promise for reconstructing high-fidelity superresolved images of black holes from simulated data. Previous work has included the effects of atmospheric phase errors and thermal noise, but not systematic amplitude errors that arise due to miscalibration. We explore a full-closure imaging technique with sparse modeling that uses closure amplitudes and closure phases to improve the imaging process. This new technique can successfully handle data with systematic amplitude errors. Applying our technique to synthetic EHT data of M87, we find that full-closure sparse modeling can reconstruct images better than traditional methods and recover key structural information on the source, such as the shape and size of the predicted photon ring. These results suggest that our new approach will provide superior imaging performance for data from the EHT and other interferometric arrays.
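As a reminder of what the closure quantities are, the sketch below builds them from synthetic complex visibilities corrupted by station gains: the closure phase on a baseline triangle cancels station-based phase errors, and the closure amplitude on a four-station quadrangle cancels station-based amplitude errors. All values are invented; this is not the authors' pipeline.

```python
# Closure phase and closure amplitude from gain-corrupted visibilities.
import numpy as np

rng = np.random.default_rng(2)
baselines = [(1, 2), (2, 3), (1, 3), (1, 4), (2, 4), (3, 4)]
V = {bl: complex(rng.normal(), rng.normal()) for bl in baselines}  # true

gains = {s: rng.lognormal(0, 0.3) * np.exp(1j * rng.uniform(0, 2 * np.pi))
         for s in (1, 2, 3, 4)}                     # unknown station gains
obs = {(i, j): gains[i] * np.conj(gains[j]) * V[(i, j)] for (i, j) in V}

cphase = np.angle(obs[(1, 2)] * obs[(2, 3)] * np.conj(obs[(1, 3)]))
cphase_true = np.angle(V[(1, 2)] * V[(2, 3)] * np.conj(V[(1, 3)]))
print("closure phase immune to gains:", np.isclose(cphase, cphase_true))

camp = (abs(obs[(1, 2)]) * abs(obs[(3, 4)])
        / (abs(obs[(1, 3)]) * abs(obs[(2, 4)])))
camp_true = (abs(V[(1, 2)]) * abs(V[(3, 4)])
             / (abs(V[(1, 3)]) * abs(V[(2, 4)])))
print("closure amplitude immune to gains:", np.isclose(camp, camp_true))
```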

  8. Rooftop Rainwater Harvesting for Mombasa: Scenario Development with Image Classification and Water Resources Simulation

    Directory of Open Access Journals (Sweden)

    Robert O. Ojwang

    2017-05-01

    Full Text Available Mombasa faces severe water scarcity problems; the existing supply is unable to satisfy the demand. This article demonstrates the combination of satellite image analysis and modelling as tools for developing an urban rainwater harvesting policy. To develop a sustainable remedy policy, rooftop rainwater harvesting (RRWH) strategies were implemented in the water supply and demand model WEAP (Water Evaluation and Planning System). Roof areas were detected using supervised image classification. Future population growth, improved living standards, and climate change predictions until 2035 were combined with four management strategies. The image classification techniques were able to detect roof areas with acceptable accuracy. The simulated annual yield of RRWH ranged from 2.3 to 23 million cubic meters (MCM), depending on the extent of the roof area. Apart from the potential of RRWH, additional sources of water are required for full demand coverage.
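The order of magnitude of such yields follows from simple arithmetic: annual yield ≈ roof area × annual rainfall depth × runoff coefficient. The sketch below uses assumed numbers (the roof area, rainfall depth, and runoff coefficient are all our rough estimates) and lands near the upper end of the range quoted above.

```python
# Back-of-envelope rooftop rainwater harvesting yield.
roof_area_km2 = 25.0      # assumed total detected roof area
rainfall_m = 1.05         # assumed mean annual rainfall depth (~1050 mm)
runoff_coeff = 0.85       # typical value for hard roof surfaces

yield_m3 = roof_area_km2 * 1e6 * rainfall_m * runoff_coeff
print(f"potential harvest: {yield_m3 / 1e6:.1f} MCM/year")   # ~22 MCM/year
```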

  9. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of a system design and reducing the time-to-market. Most embedded control system designs involve multiple engineering disciplines and various domain-specific models, such as mechanical, control, software

  10. Simulation and modeling for the stand-off radiation detection system (SORDS) using GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Andrew S [Los Alamos National Laboratory; Wallace, Mark [Los Alamos National Laboratory; Galassi, Mark [Los Alamos National Laboratory; Mocko, Michal [Los Alamos National Laboratory; Palmer, David [Los Alamos National Laboratory; Schultz, Larry [Los Alamos National Laboratory; Tornga, Shawn [Los Alamos National Laboratory

    2009-01-01

    A Stand-Off Radiation Detection System (SORDS) is being developed through a joint effort by Raytheon, Los Alamos National Laboratory, Bubble Technology Industries, Radiation Monitoring Devices, and the Massachusetts Institute of Technology, for the Domestic Nuclear Detection Office (DNDO). The system is a mobile truck-based platform performing detection, imaging, and spectroscopic identification of gamma-ray sources. A Tri-Modal Imaging (TMI) approach combines active-mask coded aperture imaging, Compton imaging, and shadow imaging techniques. Monte Carlo simulation and modeling using the GEANT4 toolkit was used to generate realistic data for the development of imaging algorithms and associated software code.

  11. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross-selling are arriving. The key problem is what kind of information we need to achieve these goals, or in other words, how we model the customer. The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click-stream data, and demographics. The model includes a hierarchical profile of the customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating the customer's features.
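As a hedged illustration of one component named above, the stochastic model of time intervals between purchases, the sketch below draws a year of store visits from a gamma distribution and uses its mean as a naive next-visit forecast; the distribution family and parameters are our assumptions.

```python
# Generative sketch: gamma-distributed inter-purchase intervals.
import numpy as np

rng = np.random.default_rng(3)
shape, scale = 4.0, 2.5            # assumed gamma parameters (mean = 10 days)

intervals = rng.gamma(shape, scale, size=52)   # simulated gaps between visits
visit_days = np.cumsum(intervals)              # visit dates over the year

print(f"mean interval: {intervals.mean():.1f} days")
print(f"last visit on day {visit_days[-1]:.0f}; "
      f"next visit expected ~{shape * scale:.0f} days later")
```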

  12. Robust image modeling technique with a bioluminescence image segmentation application

    Science.gov (United States)

    Zhong, Jianghong; Wang, Ruiping; Tian, Jie

    2009-02-01

    A robust pattern classifier algorithm for the variable symmetric plane model, in which the driving noise is a mixture of a Gaussian and an outlier process, is developed, and the accuracy and high-speed performance of the pattern recognition algorithm are demonstrated. Bioluminescence tomography (BLT) has recently gained wide acceptance in the field of in vivo small-animal molecular imaging, so it is very important to acquire a high-precision region of interest from the bioluminescence image (BLI), since inaccuracy there degrades the quantitative analysis. An algorithm based on this model is developed to improve operation speed; it estimates the parameters and the original image intensity simultaneously from the noise-corrupted image produced by the BLT optical hardware system. The focus pixel value is obtained from the symmetric plane according to a realistic assumption about the noise sequence in the restored image. The neighborhood size is adaptive and small, and the classifier function is based on statistical features. If the classifier's conditions are satisfied, the focus pixel intensity is set to the largest value in the neighborhood; otherwise it is set to zero. Finally, pseudo-color is applied to the segmented bioluminescence image. The whole process has been implemented on our 2D BLT optical system platform and the model validated.

  13. Simulation modelling in agriculture: General considerations. | R.I. ...

    African Journals Online (AJOL)

    The computer does all the necessary arithmetic when the hypothesis is invoked to predict the future behaviour of the simulated system under given conditions. A general ... in the advisory service. Keywords: agriculture; botany; computer simulation; modelling; simulation model; simulation modelling; south africa; techniques ...

  14. Mathematical modeling in biomedical imaging

    CERN Document Server

    2009-01-01

    This volume introduces applied mathematicians to a fascinating research area. It is devoted to the exposition of promising analytical and numerical techniques for solving challenging biomedical imaging problems, which trigger the investigation of interesting issues in various branches of mathematics.

  15. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer program including a steady state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  16. Inferring the photometric and size evolution of galaxies from image simulations. I. Method

    Science.gov (United States)

    Carassou, Sébastien; de Lapparent, Valérie; Bertin, Emmanuel; Le Borgne, Damien

    2017-09-01

    Context. Current constraints on models of galaxy evolution rely on morphometric catalogs extracted from multi-band photometric surveys. However, these catalogs are altered by selection effects that are difficult to model, that correlate in non-trivial ways, and that can lead to contradictory predictions if not taken into account carefully. Aims: To address this issue, we have developed a new approach combining parametric Bayesian indirect likelihood (pBIL) techniques and empirical modeling with realistic image simulations that reproduce a large fraction of these selection effects. This allows us to perform a direct comparison between observed and simulated images and to infer robust constraints on model parameters. Methods: We use a semi-empirical forward model to generate a distribution of mock galaxies from a set of physical parameters. These galaxies are passed through an image simulator reproducing the instrumental characteristics of any survey and are then extracted in the same way as the observed data. The discrepancy between the simulated and observed data is quantified, and minimized with a custom sampling process based on adaptive Markov chain Monte Carlo methods. Results: Using synthetic data matching most of the properties of a Canada-France-Hawaii Telescope Legacy Survey Deep field, we demonstrate the robustness and internal consistency of our approach by inferring the parameters governing the size and luminosity functions and their evolution for different realistic populations of galaxies. We also compare the results of our approach with those obtained from the classical spectral energy distribution fitting and photometric redshift approach. Conclusions: Our pipeline efficiently infers the luminosity and size distribution and evolution parameters with a very limited number of observables (three photometric bands). When compared to SED fitting based on the same set of observables, our method yields results that are more accurate and free from

  17. Validation of a power-law noise model for simulating small-scale breast tissue

    Science.gov (United States)

    Reiser, I.; Edwards, A.; Nishikawa, R. M.

    2013-09-01

    We have validated a small-scale breast tissue model based on power-law noise. A set of 110 patient images served as truth. The statistical model parameters were determined by matching the radially averaged power-spectrum of the projected simulated tissue with that of the central tomosynthesis patient breast projections. Observer performance in a signal-known exactly detection task in simulated and actual breast backgrounds was compared. Observers included human readers, a pre-whitening observer model and a channelized Hotelling observer model. For all observers, good agreement between performance in the simulated and actual backgrounds was found, both in the tomosynthesis central projections and the reconstructed images. This tissue model can be used for breast x-ray imaging system optimization. The complete statistical description of the model is provided.
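For readers who want to experiment with this class of background, a standard way (consistent with, though not necessarily identical to, the model above) to synthesize power-law noise is to shape white Gaussian noise in the Fourier domain with an f^(-beta/2) amplitude filter; the exponent beta = 3 below is our assumption, not the paper's fitted value.

```python
# Generate a 2-D power-law (1/f^beta) noise background.
import numpy as np

def power_law_background(n=256, beta=3.0, seed=0):
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)
    f = np.hypot(*np.meshgrid(fx, fx))     # radial spatial frequency
    f[0, 0] = f[0, 1]                      # avoid division by zero at DC
    amplitude = f ** (-beta / 2.0)         # power spectrum ~ 1 / f^beta
    spectrum = amplitude * np.fft.fft2(rng.normal(size=(n, n)))
    return np.real(np.fft.ifft2(spectrum))

background = power_law_background()
print(background.shape, round(float(background.std()), 3))
```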

  18. Validation of a power-law noise model for simulating small-scale breast tissue

    International Nuclear Information System (INIS)

    Reiser, I; Edwards, A; Nishikawa, R M

    2013-01-01

    We have validated a small-scale breast tissue model based on power-law noise. A set of 110 patient images served as truth. The statistical model parameters were determined by matching the radially averaged power-spectrum of the projected simulated tissue with that of the central tomosynthesis patient breast projections. Observer performance in a signal-known exactly detection task in simulated and actual breast backgrounds was compared. Observers included human readers, a pre-whitening observer model and a channelized Hotelling observer model. For all observers, good agreement between performance in the simulated and actual backgrounds was found, both in the tomosynthesis central projections and the reconstructed images. This tissue model can be used for breast x-ray imaging system optimization. The complete statistical description of the model is provided. (paper)

  19. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel...

  20. A Placement Model for Flight Simulators.

    Science.gov (United States)

    1982-09-01

    simulator basing strategies. Captains David R. VanDenburg and Jon D. Veith developed a mathematical model to assist in the placement analysis of A-7... Institute for Defense Analysis, Arlington VA, August 1977. AD A049979. 23. Sugarman, Robert C., Steven L. Johnson, and William F. H. Ring. "B-1 Systems... USAF Cost and Planning Factors. AFR 173-13. Washington: Government Printing Office, 1 February 1982. 30. Van Denburg, Captain David R., USAF

  1. Guiding the Design of Radiation Imagers with Experimentally Benchmarked Geant4 Simulations for Electron-Tracking Compton Imaging

    Science.gov (United States)

    Coffer, Amy Beth

    -scattered electron-trajectories is with high-resolution Charge-Coupled Devices (CCDs). The proof-of-principle CCD-based ETCI experiment demonstrated the CCDs' ability to measure the Compton-scattered electron-tracks as a 2-dimensional image. Electron-track-imaging algorithms using the electron-track image are able to determine the 3-dimensional electron-track trajectory to within +/- 20 degrees. The work presented here covers the physics simulations developed alongside the proof-of-principle experiment. The development of accurate physics modeling for multiple-layer CCD-based ETCI systems allows for the accurate prediction of future ETCI system performance. The simulations also enable quick insights for system design, and they guide the development of electron-track reconstruction methods. The physics simulation efforts for this project looked closely at the accuracy of the Geant4 Monte Carlo methods for medium-energy electron transport. In older versions of Geant4 there were some discrepancies between the electron-tracking experimental measurements and the simulation results. It was determined that, when comparing the dynamics of electrons at very high resolution, Geant4 simulations must be fine-tuned with careful choices of physics production cuts and electron physics step sizes. One result of this work is a CCD Monte Carlo model that has been benchmarked against experimental findings and fully characterized for both photon and electron transport. The CCD physics model now matches experimental results to within 1 percent for scattered-electron energies below 500 keV. Following the improvements to the CCD simulations, the performance of a realistic two-layer CCD-stack system was characterized. The realistic CCD-stack model accounted for the effect of thin passive layers on the CCDs' front face and back contact. The photon interaction efficiency was calculated for the two-layer CCD-stack, and we found that there is a 90 percent probability of

  2. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    Full Text Available The dynamics of Interplanetary Coronal Mass Ejections (ICMEs) are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields); Solar physics, astrophysics, and astronomy (flares and mass ejections); Space plasma physics (numerical simulation studies)

  3. MODELING AND SIMULATION OF A HYDROCRACKING UNIT

    Directory of Open Access Journals (Sweden)

    HASSAN A. FARAG

    2016-06-01

    Full Text Available Hydrocracking is used in the petroleum industry to convert low-quality feedstocks into high-value transportation fuels such as gasoline, diesel, and jet fuel. The aim of the present work is to develop a rigorous steady-state two-dimensional mathematical model, comprising conservation equations of mass and energy, for simulating the operation of a hydrocracking unit. Both the catalyst bed and the quench zone are included in this integrated model. The model equations were solved numerically in both the axial and radial directions using Matlab software. The model was tested against real plant data from Egypt. The results indicate very good agreement between the model predictions and industrial values for temperature profiles, concentration profiles, and conversion in both radial and axial directions of the hydrocracking unit. Simulation of the conversion and temperature profiles in the quench zone was also included and showed low deviation from the actual values. For the concentration profiles, the percentage deviation was found to be 9.28% in the first reactor and 9.6% in the second reactor. The effects of several parameters, such as the pellet heat transfer coefficient, effective radial thermal conductivity, wall heat transfer coefficient, effective radial diffusivity, and cooling medium (quench zone), are included in this study. Variations of the wall heat transfer coefficient and of the effective radial diffusivity in the near-wall region gave no remarkable changes in the temperature profiles. On the other hand, even small variations of the effective radial thermal conductivity affected the simulated temperature profiles significantly, and this effect could not be compensated by variations of the other model parameters.

  4. The establishment of Digital Image Capture System(DICS) using conventional simulator

    International Nuclear Information System (INIS)

    Oh, Tae Sung; Park, Jong Il; Byun, Young Sik; Shin, Hyun Kyoh

    2004-01-01

    The simulator is used to determine the patient field and to ensure that the treatment field encompasses the required anatomy during normal patient movement, such as breathing. The latest simulators provide real-time display of still, fluoroscopic, and digitized images, but conventional simulators do not. The purpose of this study is to introduce a digital image capture system (DICS) using a conventional simulator and to present clinical cases using digitally captured still and fluoroscopic images. We connected a video signal cable to the video terminal on the back of the simulator monitor and connected the video jack to an A/D converter. After connecting the converter jack to a computer, we could acquire still images and record fluoroscopic images with an image capture program. The data created with this system can be used in patient treatment and modified for verification using image processing software (e.g. Photoshop, Paint Shop). DICS could be established with an easy and economical procedure. The DICS images were helpful for simulation and proved a powerful tool in the evaluation of department-specific patient positioning. Because commercial simulators based on digital capture are very expensive, it is not easy for most hospitals to acquire one. DICS using a conventional simulator makes it possible to obtain images equivalent to those of a high-cost digital simulator and to study many clinical cases in combination with other software programs.

  5. Reactive transport models and simulation with ALLIANCES

    International Nuclear Information System (INIS)

    Leterrier, N.; Deville, E.; Bary, B.; Trotignon, L.; Hedde, T.; Cochepin, B.; Stora, E.

    2009-01-01

    Many chemical processes influence the evolution of nuclear waste storage. As a result, simulations based only upon transport and hydraulic processes fail to adequately describe some industrial scenarios. We need to take into account complex chemical models (mass action laws, kinetics...) which are highly non-linear. In order to simulate the coupling of these chemical reactions with transport, we use a classical Sequential Iterative Approach (SIA), with a fixed-point algorithm, within the framework of the ALLIANCES platform. This approach allows us to use the various transport and chemical modules available in ALLIANCES, via an operator-splitting method based upon the structure of the chemical system. We present five different applications of reactive transport simulations in the context of nuclear waste storage: 1. A 2D simulation of the lixiviation by rain water of an underground polluted zone high in uranium oxide; 2. The degradation of the steel envelope of a package in contact with clay. Corrosion of the steel creates corrosion products and the altered package becomes a porous medium. We follow the degradation front through kinetic reactions and the coupling with transport; 3. The degradation of a cement-based material by the injection of an aqueous solution of zinc and sulphate ions. In addition to the reactive transport coupling, we take into account in this case the hydraulic feedback of the porosity variation on the Darcy velocity; 4. The decalcification of a concrete beam in an underground storage structure. In this case, in addition to the reactive transport simulation, we take into account the interaction between chemical degradation and the mechanical forces (cracks...), and the feedback of the structural changes on transport; 5. The degradation of the steel envelope of a package in contact with a clay material under a temperature gradient. In this case the reactive transport simulation is entirely directed by the temperature changes and
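
    The Sequential Iterative Approach mentioned in this record can be summarized generically. The snippet below is a minimal fixed-point operator-splitting loop under assumed interfaces: transport and react are hypothetical callables, and this is a schematic of the SIA idea, not the ALLIANCES API.

        import numpy as np

        def sia_step(c_n, transport, react, tol=1e-8, max_iter=50):
            """One coupled time step with a Sequential Iterative Approach (SIA).

            transport(c, source) -> concentrations after the transport operator
            with a chemical source term; react(c) -> source term produced by
            local reactions. The split is iterated to a fixed point.
            """
            source = np.zeros_like(c_n)
            c = transport(c_n, source)                # first sweep: transport only
            for _ in range(max_iter):
                source = react(c)                     # chemistry from latest iterate
                c_new = transport(c_n, source)        # re-transport with new source
                if np.linalg.norm(c_new - c) < tol * (1.0 + np.linalg.norm(c)):
                    return c_new                      # fixed point reached
                c = c_new
            return c  # return last iterate if the tolerance was not reached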

  6. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    The generator aims to approach reality in order to evaluate decisions and make them more assertive. To test the prototype, the modeling example used was a production system with 9 machines and 5 jobs in a job-shop configuration, with stochastic processing times and machine stops, measuring machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the goodness of the prototype, which saves the user the building of the simulation model.

  7. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  8. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  9. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Mithun Das Gupta

    2009-01-01

    Full Text Available We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  10. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Petrovic Nemanja

    2009-01-01

    Full Text Available Abstract We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  11. Simulation and measurement of total ionizing dose radiation induced image lag increase in pinned photodiode CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jing [School of Materials Science and Engineering, Xiangtan University, Hunan (China); State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Chen, Wei, E-mail: chenwei@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Wang, Zujun, E-mail: wangzujun@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Xue, Yuanyuan; Yao, Zhibin; He, Baoping; Ma, Wuying; Jin, Junshan; Sheng, Jiangkun; Dong, Guantao [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China)

    2017-06-01

    This paper presents an investigation of total ionizing dose (TID) induced image lag sources in pinned photodiode (PPD) CMOS image sensors, based on radiation experiments and TCAD simulation. The radiation experiments were carried out at a cobalt-60 gamma-ray source. The experimental results show that the image lag degradation becomes increasingly severe with increasing TID. Combined with the TCAD simulation results, we can confirm that the junction of the PPD and the transfer gate (TG) is an important region for the formation of image lag during irradiation. The simulations demonstrate that TID can generate a potential pocket leading to incomplete charge transfer.

  12. TMS modeling toolbox for realistic simulation.

    Science.gov (United States)

    Cho, Young Sun; Suh, Hyun Sang; Lee, Won Hee; Kim, Tae-Seong

    2010-01-01

    Transcranial magnetic stimulation (TMS) is a technique for brain stimulation using rapidly changing magnetic fields generated by coils. It has been established as an effective stimulation technique to treat patients suffering from damaged brain functions. Although TMS is known to be painless and noninvasive, it can also be harmful to the brain through incorrect focusing and excessive stimulation, which might result in seizure. Therefore, there is an ongoing research effort to elucidate and better understand the effect and mechanism of TMS. Lately, the boundary element method (BEM) and the finite element method (FEM) have been used to simulate the electromagnetic phenomena of TMS. However, there is a lack of general tools to generate TMS models, due to the difficulties of realistic modeling of the human head and TMS coils. In this study, we have developed a toolbox through which one can generate high-resolution FE TMS models. The toolbox allows the creation of FE head models with isotropic and anisotropic electrical conductivities for five different tissues of the head, together with coil models in 3D. The generated TMS model is importable into FE software packages such as ANSYS for further, efficient electromagnetic analysis. We present a set of demonstrative results of realistic simulation of TMS with our toolbox.

  13. Wedge Experiment Modeling and Simulation for Reactive Flow Model Calibration

    Science.gov (United States)

    Maestas, Joseph T.; Dorgan, Robert J.; Sutherland, Gerrit T.

    2017-06-01

    Wedge experiments are a typical method for generating pop-plot data (run-to-detonation distance versus input shock pressure), which are used to assess an explosive material's initiation behavior. Such data can be utilized to calibrate reactive flow models by running hydrocode simulations and successively tweaking model parameters until a match with experiment is achieved. Simulations are typically performed in 1D and use a flyer impact to achieve the prescribed shock loading pressure. In this effort, a wedge experiment performed at the Army Research Lab (ARL) was modeled using CTH (an SNL hydrocode) in 1D, 2D, and 3D space in order to determine whether there was any justification for using simplified models. A simulation was also performed using the BCAT code (a CTH companion tool) that assumes a plate-impact shock loading. Results from the simulations were compared to experimental data and show that the shock imparted into an explosive specimen is accurately captured with 2D and 3D simulations, but changes significantly in 1D space and with the BCAT tool. The difference in shock profile is shown to affect only the numerical predictions for large run distances. This is attributed to incorrectly capturing the energy fluence for detonation waves versus flat shock loading. Portions of this work were funded through the Joint Insensitive Munitions Technology Program.
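
    Pop-plot data of the kind described here are commonly summarized by a straight line in log-log space. A minimal fitting sketch, using made-up illustrative numbers rather than the ARL measurements:

        import numpy as np

        # Illustrative pop-plot data (NOT the ARL measurements): input shock
        # pressure in GPa and run-to-detonation distance in mm.
        pressure = np.array([3.0, 4.0, 5.0, 7.0, 10.0])
        run_dist = np.array([12.0, 7.5, 5.2, 3.1, 1.8])

        # Pop plots are commonly near-linear in log-log space:
        # log10(x*) = a + b*log10(P), with slope b < 0.
        b, a = np.polyfit(np.log10(pressure), np.log10(run_dist), 1)
        predict = lambda p: 10.0 ** (a + b * np.log10(p))
        print(f"log10(x*) = {a:.3f} + {b:.3f} * log10(P)")
        print(f"predicted run distance at 6 GPa: {predict(6.0):.2f} mm")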

  14. Integrating Visualizations into Modeling NEST Simulations.

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  15. Integrating Visualizations into Modeling NEST Simulations

    Directory of Open Access Journals (Sweden)

    Christian eNowke

    2015-12-01

    Full Text Available Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  16. Integrating Visualizations into Modeling NEST Simulations

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge to integrate these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus which requires to switch the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data to investigate these effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow, is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work. PMID:26733860

  17. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    , that can accurately and efficiently simulate wind turbine wakes. The linear k-ε eddy viscosity model (EVM) is a popular turbulence model in RANS; however, it underpredicts the velocity wake deficit and cannot predict the anisotropic Reynolds stresses in the wake. In the current work, nonlinear eddy viscosity models (NLEVMs) are applied to wind turbine wakes. NLEVMs can model anisotropic turbulence through a nonlinear stress-strain relation, and they can improve the velocity deficit by the use of a variable eddy viscosity coefficient that delays the wake recovery. Unfortunately, all tested NLEVMs show numerically unstable behavior for fine grids, which inhibits a grid dependency study for numerical verification. Therefore, a simpler EVM is proposed, labeled the k-ε-fp EVM, that has a linear stress-strain relation but still has a variable eddy viscosity coefficient. The k-ε-fp EVM is numerically...

  18. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    Science.gov (United States)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students having completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
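
    The fundamental functionality referred to above (an event list, a simulation clock, and state updates) fits in a few lines of code. Below is a minimal discrete-event core written in Python rather than Excel, with a single-server queue standing in for the retail supply chain of the paper:

        import heapq, random

        def mm1_queue(arrival_rate=1.0, service_rate=1.5, horizon=1000.0, seed=0):
            """Minimal discrete-event simulation of an M/M/1 queue."""
            rng = random.Random(seed)
            events = [(rng.expovariate(arrival_rate), "arrival")]  # event list
            clock, queue, busy, served, wait_sum = 0.0, [], False, 0, 0.0
            while events:
                clock, kind = heapq.heappop(events)   # advance clock to next event
                if clock > horizon:
                    break
                if kind == "arrival":
                    heapq.heappush(events, (clock + rng.expovariate(arrival_rate), "arrival"))
                    if busy:
                        queue.append(clock)           # wait for the server
                    else:
                        busy = True
                        heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
                else:                                 # departure: service complete
                    served += 1
                    if queue:
                        wait_sum += clock - queue.pop(0)
                        heapq.heappush(events, (clock + rng.expovariate(service_rate), "departure"))
                    else:
                        busy = False
            return served, wait_sum / max(served, 1)

        print(mm1_queue())  # (customers served, mean wait of queued customers)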

  19. Image contrast enhancement based on a local standard deviation model

    International Nuclear Information System (INIS)

    Chang, Dah-Chung; Wu, Wen-Rong

    1996-01-01

    The adaptive contrast enhancement (ACE) algorithm is a widely used image enhancement method which needs a contrast gain to adjust the high-frequency components of an image. In the literature, the gain is usually either inversely proportional to the local standard deviation (LSD) or a constant. Both choices cause problems in practical applications: noise over-enhancement and ringing artifacts. In this paper, a new gain is developed based on Hunt's Gaussian image model to prevent these two defects. The new gain is a nonlinear function of the LSD and has the desired characteristic of emphasizing the LSD regions in which details are concentrated. We have applied the new ACE algorithm to chest x-ray images, and the simulations show the effectiveness of the proposed algorithm.
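
    As an illustration of the idea, the sketch below applies an ACE-style correction whose gain is a nonlinear function of the LSD, rising for moderate LSD and falling off in flat or very busy regions. The specific gain curve and parameter values are assumptions for demonstration, not the function the authors derive from Hunt's model.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def ace(img, win=15, g_max=4.0, sigma0=10.0):
            """Adaptive contrast enhancement with an LSD-dependent gain.

            Local mean/std are computed over a win x win window; the gain peaks
            where LSD is near sigma0 (detail regions) and tends to 1 elsewhere,
            so flat noisy areas are not over-amplified and strong edges do not ring.
            """
            img = img.astype(float)
            mean = uniform_filter(img, win)
            var = uniform_filter(img * img, win) - mean * mean
            lsd = np.sqrt(np.clip(var, 0, None))
            # Nonlinear gain: 1 at lsd=0, g_max at lsd=sigma0, decays back to ~1.
            gain = 1.0 + (g_max - 1.0) * (lsd / sigma0) * np.exp(1.0 - lsd / sigma0)
            out = mean + gain * (img - mean)       # amplify high-frequency residual
            return np.clip(out, 0, 255)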

  20. Ekofisk chalk: core measurements, stochastic reconstruction, network modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Talukdar, Saifullah

    2002-07-01

    This dissertation deals with (1) experimental measurements of the petrophysical, reservoir engineering and morphological properties of Ekofisk chalk, (2) numerical simulation of core flood experiments to analyze and improve relative permeability data, (3) stochastic reconstruction of chalk samples from limited morphological information, (4) extraction of pore space parameters from the reconstructed samples, development of a network model using the pore space information, and computation of petrophysical and reservoir engineering properties from the network model, and (5) development of 2D and 3D idealized fractured reservoir models and verification of the applicability of several widely used conventional upscaling techniques in fractured reservoir simulation. Experiments were conducted on eight Ekofisk chalk samples; porosity, absolute permeability, formation factor, oil-water relative permeability, capillary pressure and resistivity index were measured at laboratory conditions. Mercury porosimetry data and backscatter scanning electron microscope images were also acquired for the samples. A numerical simulation technique involving history matching of the production profiles was employed to improve the relative permeability curves and to analyze hysteresis of the Ekofisk chalk samples. The technique was found to be a powerful tool for addressing the uncertainties in experimental measurements. Porosity and correlation statistics obtained from backscatter scanning electron microscope images were used to reconstruct microstructures of chalk and particulate media. The reconstruction technique involves a simulated annealing algorithm, which can be constrained by an arbitrary number of morphological parameters. This flexibility of the algorithm is exploited to successfully reconstruct particulate media and chalk samples using more than one correlation function. A technique based on conditional simulated annealing has been introduced for exact reproduction of vuggy
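
    The reconstruction step can be illustrated with a toy version of the constrained simulated annealing described above: binary pixel swaps that preserve porosity exactly while a short-range two-point correlation is matched to a reference image. This is a schematic stand-in for the multi-correlation-function algorithm of the dissertation, with arbitrary parameter choices.

        import numpy as np

        def anneal_reconstruct(target, n=64, steps=20000, t0=1.0, seed=0):
            """Reconstruct a binary microstructure by simulated annealing.

            target: reference binary image whose porosity and 1-pixel two-point
            correlation are matched. Swapping two unlike pixels preserves porosity.
            """
            rng = np.random.default_rng(seed)
            def s2(im):  # short-range two-point correlation along x and y
                return np.array([(im * np.roll(im, 1, 0)).mean(),
                                 (im * np.roll(im, 1, 1)).mean()])
            target_s2 = s2(target)
            im = (rng.uniform(size=(n, n)) < target.mean()).astype(float)
            cost = np.abs(s2(im) - target_s2).sum()
            for k in range(steps):
                t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
                i, j = rng.integers(0, n, 4).reshape(2, 2)  # two random pixels
                if im[tuple(i)] == im[tuple(j)]:
                    continue                                # swap must change the image
                im[tuple(i)], im[tuple(j)] = im[tuple(j)], im[tuple(i)]
                new_cost = np.abs(s2(im) - target_s2).sum()
                if new_cost > cost and rng.uniform() > np.exp((cost - new_cost) / t):
                    im[tuple(i)], im[tuple(j)] = im[tuple(j)], im[tuple(i)]  # undo
                else:
                    cost = new_cost                         # accept the swap
            return im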

  1. Development of computational small animal models and their applications in preclinical imaging and therapy research

    NARCIS (Netherlands)

    Xie, Tianwu; Zaidi, Habib

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal

  2. Best Practices for Crash Modeling and Simulation

    Science.gov (United States)

    Fasanella, Edwin L.; Jackson, Karen E.

    2002-01-01

    Aviation safety can be greatly enhanced by the expeditious use of computer simulations of crash impact. Unlike automotive impact testing, which is now routine, experimental crash tests of even small aircraft are expensive and complex due to the high cost of the aircraft and the myriad of crash impact conditions that must be considered. Ultimately, the goal is to utilize full-scale crash simulations of aircraft for design evaluation and certification. The objective of this publication is to describe "best practices" for modeling aircraft impact using explicit nonlinear dynamic finite element codes such as LS-DYNA, DYNA3D, and MSC.Dytran. Although "best practices" is somewhat relative, it is hoped that the authors' experience will help others to avoid some of the common pitfalls in modeling that are not documented in one single publication. In addition, a discussion of experimental data analysis, digital filtering, and test-analysis correlation is provided. Finally, some examples of aircraft crash simulations are described in several appendices following the main report.

  3. Systematic simulations of modified gravity: chameleon models

    Energy Technology Data Exchange (ETDEWEB)

    Brax, Philippe [Institut de Physique Theorique, CEA, IPhT, CNRS, URA 2306, F-91191Gif/Yvette Cedex (France); Davis, Anne-Christine [DAMTP, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom); Li, Baojiu [Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE (United Kingdom); Winther, Hans A. [Institute of Theoretical Astrophysics, University of Oslo, 0315 Oslo (Norway); Zhao, Gong-Bo, E-mail: philippe.brax@cea.fr, E-mail: a.c.davis@damtp.cam.ac.uk, E-mail: baojiu.li@durham.ac.uk, E-mail: h.a.winther@astro.uio.no, E-mail: gong-bo.zhao@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom)

    2013-04-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc^−1, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  4. Systematic simulations of modified gravity: chameleon models

    International Nuclear Information System (INIS)

    Brax, Philippe; Davis, Anne-Christine; Li, Baojiu; Winther, Hans A.; Zhao, Gong-Bo

    2013-01-01

    In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc^−1, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.

  5. Similarity measure and domain adaptation in multiple mixture model clustering: An application to image processing.

    Science.gov (United States)

    Leong, Siow Hoo; Ong, Seng Huat

    2017-01-01

    This paper considers three crucial issues in processing scaled-down images: the representation of partial images, the similarity measure, and domain adaptation. Two Gaussian mixture model based algorithms are proposed to effectively preserve image details and avoid image degradation. Multiple partial images are clustered separately through Gaussian mixture model clustering, with a scan-and-select procedure to enhance the inclusion of small image details. The local image features, represented by maximum likelihood estimates of the mixture components, are classified by using the modified Bayes factor (MBF) as a similarity measure. The detection of novel local features from the MBF suggests domain adaptation, that is, changing the number of components of the Gaussian mixture model. The performance of the proposed algorithms is evaluated with simulated data and real images, and they are shown to perform much better than existing Gaussian mixture model based algorithms in reproducing images with a higher structural similarity index.
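
    A rough sketch of the clustering-plus-similarity workflow follows, using scikit-learn's GaussianMixture. The BIC-based component selection and the log-likelihood difference used as a similarity score are simplified stand-ins for the paper's scan-and-select procedure and modified Bayes factor, not the authors' method.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def fit_patch(patch_pixels, max_k=4):
            """Fit a GMM to the pixel features of one partial image; the number of
            components is chosen by BIC (a stand-in for domain adaptation)."""
            fits = [GaussianMixture(k, random_state=0).fit(patch_pixels)
                    for k in range(1, max_k + 1)]
            return min(fits, key=lambda g: g.bic(patch_pixels))

        def log_bf_surrogate(gm_a, gm_b, data):
            """Crude similarity between two fitted mixtures on held-out data:
            difference of average log-likelihoods (a Bayes-factor surrogate)."""
            return gm_a.score(data) - gm_b.score(data)

        # Usage sketch with synthetic two-cluster data:
        rng = np.random.default_rng(0)
        x = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
        gm = fit_patch(x)
        gm1 = GaussianMixture(1, random_state=0).fit(x)
        print("components chosen:", gm.n_components)
        print("log-BF surrogate vs 1-component fit:", log_bf_surrogate(gm, gm1, x))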

  6. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model, incorporating a model of the human pilot (namely, the optimal control model), was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  7. The infrared image simulation of the tank under different movement states

    Science.gov (United States)

    Gao, Xiang; Mu, Cheng-po; Peng, Ming-song; Dong, Qing-xian; Zhang, Rui-heng

    2017-07-01

    The tank, as a vital ground weapon, plays an irreplaceable role in war. This article studies the infrared imaging of the tank. Firstly, a 3D model of the tank was established. Then an infrared radiation model of the target was constructed by analysing the infrared characteristics of the tank's different parts. Finally, the infrared radiation of the tank under different movement states was calculated and the corresponding infrared signatures were simulated, providing a reference for research on the infrared characteristics of battlefield targets.

  8. A Fourier dimensionality reduction model for big data interferometric imaging

    Science.gov (United States)

    Vijay Kartik, S.; Carrillo, Rafael E.; Thiran, Jean-Philippe; Wiaux, Yves

    2017-06-01

    Data dimensionality reduction in radio interferometry can provide savings of computational resources for image reconstruction through reduced memory footprints and lighter computations per iteration, which is important for the scalability of imaging methods to the big data setting of the next-generation telescopes. This article sheds new light on dimensionality reduction from the perspective of the compressed sensing theory and studies its interplay with imaging algorithms designed in the context of convex optimization. We propose a post-gridding linear data embedding to the space spanned by the left singular vectors of the measurement operator, providing a dimensionality reduction below image size. This embedding preserves the null space of the measurement operator and hence its sampling properties are also preserved in light of the compressed sensing theory. We show that this can be approximated by first computing the dirty image and then applying a weighted subsampled discrete Fourier transform to obtain the final reduced data vector. This Fourier dimensionality reduction model ensures a fast implementation of the full measurement operator, essential for any iterative image reconstruction method. The proposed reduction also preserves the independent and identically distributed Gaussian properties of the original measurement noise. For convex optimization-based imaging algorithms, this is key to justify the use of the standard ℓ2-norm as the data fidelity term. Our simulations confirm that this dimensionality reduction approach can be leveraged by convex optimization algorithms with no loss in imaging quality relative to reconstructing the image from the complete visibility data set. Reconstruction results in simulation settings with no direction dependent effects or calibration errors show promising performance of the proposed dimensionality reduction. Further tests on real data are planned as an extension of the current work. matlab code implementing the
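
    The reduction described above (dirty image followed by a weighted, subsampled Fourier transform) can be written schematically as follows. Here phi_adjoint, weights, and mask are assumed inputs standing in for the adjoint measurement operator, the singular-value weighting, and the mode-selection step; this is a sketch of the idea, not the authors' code.

        import numpy as np

        def fourier_reduce(vis, phi_adjoint, weights, mask, n):
            """Embed visibilities into a reduced data vector.

            vis         : complex visibility vector y.
            phi_adjoint : maps visibilities to the n x n dirty image (Phi^T y).
            weights     : per-coefficient weights approximating singular values,
                          with one entry per retained mode (mask.sum()).
            mask        : boolean n x n array selecting the retained Fourier modes.
            Returns the reduced vector  W * S * F * Phi^T y.
            """
            dirty = phi_adjoint(vis).reshape(n, n)  # Phi^T y: the dirty image
            coeffs = np.fft.fft2(dirty) / n         # F: unitary-normalized 2D DFT
            return weights * coeffs[mask]           # S then W: subsample, weight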

  9. FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization

    Science.gov (United States)

    Hirigoyen, Flavien; Crocherie, Axel; Vaillant, Jérôme M.; Cazaux, Yvon

    2008-02-01

    This paper presents a new FDTD-based optical simulation model dedicated to describing the optical performance of CMOS image sensors, taking diffraction effects into account. Following market trends and industrialization constraints, CMOS image sensors must be easily embedded into ever smaller packages, which are now equipped with auto-focus and, in the near term, zoom systems. Due to miniaturization, the ray-tracing models used to evaluate pixel optical performance are no longer accurate enough to describe the light propagation inside the sensor, because of diffraction effects. Thus we adopt a more fundamental description to take these diffraction effects into account: we chose Maxwell-Boltzmann based modeling to compute the propagation of light, and a software package with an FDTD-based (Finite Difference Time Domain) engine to solve this propagation. We present in this article the complete methodology of this modeling: on one hand, incoherent plane waves are propagated to approximate a product-use diffuse-like source; on the other hand, we use periodic conditions to limit the size of the simulated model and both memory and computation time. After presenting the correlation of the model with measurements, we illustrate its use in the optimization of a 1.75 μm pixel.
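
    To give a flavor of the FDTD engine such simulations rely on, here is a bare-bones 1D Yee update loop in normalized units. Real pixel simulations are 3D, include material models, and use the periodic lateral boundaries mentioned above, all of which this sketch omits.

        import numpy as np

        def fdtd_1d(n=400, steps=800, src=100):
            """Bare-bones 1D FDTD (Yee scheme) in free space, normalized units.

            E and H live on staggered grids: the magnetic update uses the spatial
            difference of E and vice versa. A Gaussian pulse is injected as a
            soft source; the grid ends act as perfect electric conductors.
            """
            ez = np.zeros(n)
            hy = np.zeros(n - 1)
            out = []
            for t in range(steps):
                hy += 0.5 * (ez[1:] - ez[:-1])               # Courant number 0.5
                ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])
                ez[src] += np.exp(-((t - 40) / 12.0) ** 2)   # soft Gaussian source
                out.append(ez.copy())
            return np.array(out)

        fields = fdtd_1d()
        print("peak |Ez| over run:", abs(fields).max())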

  10. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    Latest treatment planning systems depend strongly on CT images, so the tendency is for dosimetry procedures in nuclear medicine therapy to also be based on images, such as magnetic resonance imaging (MRI) or computed tomography (CT), to extract anatomical and histological information, as well as on functional imaging or activity maps such as PET or SPECT. This information, associated with radiation transport simulation software, is used to estimate the internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, which is an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo code MCNP to perform the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data to MCNP, as the SCMS is responsible for automatically constructing the anatomical data of the patient, as well as the radioactive source data. The SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. Thus, the new version has a number of improvements, such as an intuitive GUI and a menu for the selection of the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)

  11. Biomechanics trends in modeling and simulation

    CERN Document Server

    Ogden, Ray

    2017-01-01

    The book presents a state-of-the-art overview of biomechanical and mechanobiological modeling and simulation of soft biological tissues. Seven well-known scientists working in that particular field discuss topics such as biomolecules, networks and cells as well as failure, multi-scale, agent-based, bio-chemo-mechanical and finite element models appropriate for computational analysis. Applications include arteries, the heart, vascular stents and valve implants as well as adipose, brain, collagenous and engineered tissues. The mechanics of the whole cell and sub-cellular components as well as the extracellular matrix structure and mechanotransduction are described. In particular, the formation and remodeling of stress fibers, cytoskeletal contractility, cell adhesion and the mechanical regulation of fibroblast migration in healing myocardial infarcts are discussed. The essential ingredients of continuum mechanics are provided. Constitutive models of fiber-reinforced materials with an emphasis on arterial walls ...

  12. Simulations, evaluations and models. Vol. 1

    International Nuclear Information System (INIS)

    Brehmer, B.; Leplat, J.

    1992-01-01

    Papers presented at the Fourth MOHAWC (Models of Human Activities in Work Context) workshop. The general theme was simulations, evaluations and models. The emphasis was on time in relation to the modelling of human activities in modern high-tech work. Such work often requires people to control dynamic systems, and the behaviour and misbehaviour of these systems in time is a principal focus of work in, for example, a modern process plant. The papers report on microworlds and on their innovative uses, both in the form of experiments and in the form of a new kind of use, that of testing a program which performs diagnostic reasoning. They present new perspectives on the problem of time in process control, showing the importance of considering the time scales of dynamic tasks, both in individual decision making and in distributed decision making, and providing new formalisms, both for the representation of time and for reasoning involving time in diagnosis. (AB)

  13. Traffic flow dynamics data, models and simulation

    CERN Document Server

    Treiber, Martin

    2013-01-01

    This textbook provides a comprehensive and instructive coverage of vehicular traffic flow dynamics and modeling. It makes this fascinating interdisciplinary topic, which to date was only documented in parts by specialized monographs, accessible to a broad readership. Numerous figures and problems with solutions help the reader to quickly understand and practice the presented concepts. This book is targeted at students of physics and traffic engineering and, more generally, also at students and professionals in computer science, mathematics, and interdisciplinary topics. It also offers material for project work in programming and simulation at college and university level. The main part, after presenting different categories of traffic data, is devoted to a mathematical description of the dynamics of traffic flow, covering macroscopic models which describe traffic in terms of density, as well as microscopic many-particle models in which each particle corresponds to a vehicle and its driver. Focus chapters on ...

  14. Modelling and Simulation for Major Incidents

    Directory of Open Access Journals (Sweden)

    Eleonora Pacciani

    2015-11-01

    Full Text Available In recent years, there has been a rise in Major Incidents with a large impact on citizens' health and society. Without the possibility of conducting live experiments when it comes to physical and/or toxic trauma, only an accurate in silico reconstruction allows us to identify organizational solutions with the best possible chance of success, in correlation with the limitations on available resources (e.g. medical teams, first responders, treatments, transports, and hospital availability) and with the variability of the characteristics of the event (e.g. type of incident, severity of the event, and type of lesions). Utilizing modelling and simulation techniques, a simplified mathematical model of the physiological evolution of patients involved in physical and toxic trauma incident scenarios has been developed and implemented. The model formalizes the dynamics, operating standards, and practices of the medical response and of the main emergency services in the chain of emergency management during a Major Incident.

  15. Development of a virtual speaking simulator using Image Based Rendering.

    Science.gov (United States)

    Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I

    2002-01-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. There are two techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both methods have the weakness that they are unrealistic and cannot be controlled individually. To address these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and chroma-keying simultaneously. IBR enables the creation of realistic virtual environments in which the images are stitched panoramically from photos taken with a digital camera. The use of chroma-keying puts virtual audience members under individual control in the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.
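
    The chroma-key step that places controllable audience members into the IBR panorama amounts to masked compositing. A minimal hard-threshold sketch follows; the key color and tolerance value are arbitrary assumptions, and production chroma-keying typically uses softer mattes.

        import numpy as np

        def chroma_key(fg, bg, key=(0, 255, 0), tol=80.0):
            """Composite a foreground frame over a panoramic background.

            Pixels of fg within `tol` (Euclidean RGB distance) of the key color
            are replaced by bg. fg and bg are uint8 arrays of shape (H, W, 3).
            """
            dist = np.linalg.norm(fg.astype(float) - np.array(key, float), axis=-1)
            mask = (dist > tol)[..., None]          # True where the subject is
            return np.where(mask, fg, bg).astype(np.uint8)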

  16. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in an appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long-term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories. (author) (ml)

  17. Heinrich events modeled in transient glacial simulations

    Science.gov (United States)

    Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe

    2017-04-01

    Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice-sheet/climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM) and atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model-generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodical-synchronous coupling technique. To disentangle the effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface-water freshening reduces the Atlantic meridional overturning circulation and the ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.

  18. Modeling and Simulation of Grasping of Deformable Objects

    DEFF Research Database (Denmark)

    Fugl, Andreas Rune

    The purpose of this thesis is to address the modeling and simulation of deformable objects, as applied to robotic grasping and manipulation. The main contributions of this work are: an evaluation of 3D linear elasticity used for robot grasping, as implemented by a finite difference method supporting regular and adaptively refined grids; a stable and accurate nonlinear 2D beam model supporting large deformations and difficult boundary effects; a method for the estimation of material properties and pose from depth and colour images; a method for the learning of Peg-in-Hole actions; an outline for Laying-Down actions; as well as a thorough evaluation of the accuracy of models under large deformations.

  19. Simulation Model of Mobile Detection Systems

    International Nuclear Information System (INIS)

    Edmunds, T.; Faissol, D.; Yao, Y.

    2009-01-01

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted, or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity of the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from the target when detected. Patrol boats select the nearest in-bound boat for inspection and initiate an intercept course. Once within the operational range of the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of the background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, configured in this simulation with a target false positive probability of 0.001 and a false negative probability of 0.1. This test is utilized when the mobile detector maintains
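
    As an illustration of the first algorithm, the sketch below implements a k-sigma alarm on windowed counts; the window length, k, and count rates are illustrative values, not those of the study.

```python
import numpy as np

# Sketch of the k-sigma alarm described above: alarm when the counts summed
# over a sliding time window exceed the window's mean background plus k times
# its standard deviation. All numeric values here are illustrative.

def k_sigma_alarm(counts, bkg_mean, bkg_std, k=3.0, window=5):
    """Return the first index at which the windowed counts alarm, or None."""
    counts = np.asarray(counts, dtype=float)
    thresh = window * bkg_mean + k * np.sqrt(window) * bkg_std
    for i in range(window, len(counts) + 1):
        if counts[i - window:i].sum() > thresh:
            return i - 1
    return None

rng = np.random.default_rng(0)
bkg = rng.poisson(100, size=60)                         # background counts/s
source = np.r_[np.zeros(40, int), rng.poisson(40, 20)]  # source approaches at t=40
print(k_sigma_alarm(bkg + source, bkg_mean=100, bkg_std=10, k=3.0))
```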

  20. Simulation of arc models with the block modelling method

    NARCIS (Netherlands)

    Thomas, R.; Lahaye, D.J.P.; Vuik, C.; Van der Sluis, L.

    2015-01-01

    Simulation of current interruption is currently performed with non-ideal switching devices for large power systems. Nevertheless, for small networks, non-ideal switching devices can be substituted by arc models. However, this substitution has a negative impact on the computation time. At the same

  1. Modeling lignin polymerization. Part 1: simulation model of dehydrogenation polymers.

    NARCIS (Netherlands)

    F.R.D. van Parijs (Frederik); K. Morreel; J. Ralph; W. Boerjan; R.M.H. Merks (Roeland)

    2010-01-01

    Lignin is a heteropolymer that is thought to form in the cell wall by combinatorial radical coupling of monolignols. Here, we present a simulation model of in vitro lignin polymerization, based on the combinatorial coupling theory, which allows us to predict the reaction conditions

  2. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  3. Software to Enable Modeling & Simulation as a Service

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a Modeling and Simulation as a Service (M&SaaS) software service infrastructure to enable most modeling and simulation (M&S) activities to be...

  4. Aortic dissection simulation models for clinical support: fluid-structure interaction vs. rigid wall models.

    Science.gov (United States)

    Alimohammadi, Mona; Sherwood, Joseph M; Karimpour, Morad; Agu, Obiekezie; Balabani, Stavroula; Díaz-Zuccarini, Vanessa

    2015-04-15

    The management and prognosis of aortic dissection (AD) are often challenging, and the use of personalised computational models is being explored as a tool to improve clinical outcome. Including vessel wall motion in such simulations can provide more realistic and potentially more accurate results, but requires significant additional computational resources, as well as expertise. With clinical translation as the final aim, trade-offs between complexity, speed, and accuracy are inevitable. The present study explores whether modelling wall motion is worth the additional expense in the case of AD, by carrying out fluid-structure interaction (FSI) simulations based on a sample patient case. Patient-specific anatomical details were extracted from computed tomography images to provide the fluid domain, from which the vessel wall was extrapolated. Two-way fluid-structure interaction simulations were performed, with coupled Windkessel boundary conditions and hyperelastic wall properties. The blood was modelled using the Carreau-Yasuda viscosity model, and turbulence was accounted for via a shear stress transport model. A simulation without wall motion (rigid wall) was carried out for comparison purposes. The displacement of the vessel wall was comparable to reports from imaging studies in terms of intimal flap motion and contraction of the true lumen. Analysis of the haemodynamics around the proximal and distal false lumen in the FSI model showed complex flow structures caused by the expansion and contraction of the vessel wall. These flow patterns led to significantly different predictions of wall shear stress, particularly its oscillatory component, which were not captured by the rigid wall model. Through comparison with imaging data, the results of the present study indicate that the fluid-structure interaction methodology employed herein is appropriate for simulations of aortic dissection. Regions of high wall shear stress were not significantly altered by the wall motion.
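
    For readers unfamiliar with the coupled Windkessel boundary conditions mentioned above, the sketch below integrates the simplest two-element variant; the study used more elaborate, patient-tuned outlets, and R, C, and the inflow waveform here are placeholders.

```python
import numpy as np

# Two-element Windkessel sketch (C*dP/dt = Q(t) - P/R) integrated with forward
# Euler. The resistance R, compliance C, and inflow q are illustrative values,
# not the patient-specific parameters of the study.

def windkessel_pressure(q, dt, R=1.0, C=1.5, p0=80.0):
    """Return outlet pressure for a given inflow waveform q."""
    p = np.empty_like(q)
    p[0] = p0
    for n in range(len(q) - 1):
        p[n + 1] = p[n] + dt * (q[n] - p[n] / R) / C
    return p

t = np.linspace(0.0, 2.0, 2001)                         # two 1 s cardiac cycles
q = 400.0 * np.clip(np.sin(2 * np.pi * t), 0.0, None)   # pulsatile inflow
p = windkessel_pressure(q, dt=t[1] - t[0])
print(f"pressure range: {p.min():.1f} to {p.max():.1f} (arbitrary units)")
```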

  5. Model attraction in medical image object recognition

    Science.gov (United States)

    Tascini, Guido; Zingaretti, Primo

    1995-04-01

    This paper presents a new approach to image recognition based on a general attraction principle. Cognitive recognition is governed by a 'focus of attention' process that concentrates on the subset of visual data of task-relevant type only. Our model-based approach combines it with another process, 'focus of attraction', which concentrates on the transformations of visual data having relevance for the matching. The recognition process is characterized by an intentional evolution of the visual data. This chain of image transformations is viewed as driven by an attraction field that attempts to reduce the distance between the image point and the model point in the feature space. The field sources are determined during a learning phase, by supplying the system with a training set. The paper describes a medical interpretation case in the feature space, concerning human skin lesions. The samples of the training set, supplied by dermatologists, allow the system to learn models of lesions in terms of features such as a hue factor, an asymmetry factor, and an asperity factor. The comparison of the visual data with the model determines the trend of image transformations, allowing a better definition of the given image and its classification. The algorithms are implemented in the C language on a PC equipped with Matrox Image Series IM-1280 acquisition and processing boards. The work is now in progress.

  6. Modelling and simulation of railway cable systems

    Energy Technology Data Exchange (ETDEWEB)

    Teichelmann, G.; Schaub, M.; Simeon, B. [Technische Univ. Muenchen, Garching (Germany). Zentrum Mathematik M2]

    2005-12-15

    Mathematical models and numerical methods for the computation of both static equilibria and dynamic oscillations of railroad catenaries are derived and analyzed. These cable systems form a complex network of string and beam elements and lead to coupled partial differential equations in space and time where constraints and corresponding Lagrange multipliers express the interaction between carrier, contact wire, and pantograph head. For computing static equilibria, three different algorithms are presented and compared, while the dynamic case is treated by a finite element method in space, combined with stabilized time integration of the resulting differential algebraic system. Simulation examples based on reference data from industry illustrate the potential of such computational tools. (orig.)

  7. Petroleum reservoir data for testing simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, J.M.; Harrison, W.

    1980-09-01

    This report consists of reservoir pressure and production data for 25 petroleum reservoirs. Included are 5 data sets for single-phase (liquid) reservoirs, 1 data set for a single-phase (liquid) reservoir with pressure maintenance, 13 data sets for two-phase (liquid/gas) reservoirs and 6 for two-phase reservoirs with pressure maintenance. Also given are ancillary data for each reservoir that could be of value in the development and validation of simulation models. A bibliography is included that lists the publications from which the data were obtained.

  8. Image-based aircraft pose estimation: a comparison of simulations and real-world data

    Science.gov (United States)

    Breuers, Marcel G. J.; de Reus, Nico

    2001-10-01

    The problem of estimating aircraft pose information from monocular image data is considered using a Fourier descriptor based algorithm. The dependence of pose estimation accuracy on image resolution and aspect angle is investigated through simulations using sets of synthetic aircraft images. Further evaluation shows that good pose estimation accuracy can be obtained in real-world image sequences.
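
    A minimal sketch of the Fourier-descriptor idea such an algorithm relies on (the paper's exact normalization may differ): the silhouette contour is treated as a complex sequence whose normalized FFT magnitudes form features invariant to translation, scale, rotation, and starting point.

```python
import numpy as np

# Fourier-descriptor sketch; the unit-circle "silhouette" is a toy example.

def fourier_descriptors(x, y, n_coeffs=8):
    z = np.asarray(x) + 1j * np.asarray(y)    # contour as complex samples
    spectrum = np.fft.fft(z)
    spectrum[0] = 0.0                         # drop DC term: translation invariance
    mags = np.abs(spectrum)                   # magnitudes: rotation/start invariance
    mags /= mags[1]                           # first-harmonic norm: scale invariance
    return mags[1:n_coeffs + 1]

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
x, y = np.cos(theta), np.sin(theta)           # unit circle as a toy silhouette
print(fourier_descriptors(x, y))              # all energy in the first harmonic
```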

  9. Simulation model for port shunting yards

    Science.gov (United States)

    Rusca, A.; Popa, M.; Rosca, E.; Rosca, M.; Dragu, V.; Rusca, F.

    2016-08-01

    Sea ports are important nodes in the supply chain, joining two high-capacity transport modes: rail and maritime transport. The huge cargo flows transiting a port require high-capacity construction and installations such as berths, large-capacity cranes, and shunting yards. However, the specificity of port shunting yards raises several problems, such as: limited access, since these are terminus stations of the rail network; the input/output of large transit cargo flows relative to the infrequent departure/arrival of ships; and limited land availability for implementing solutions to serve these flows. It is necessary to identify technological solutions that lead to an answer to these problems. The paper proposes a simulation model, developed with the ARENA computer simulation software, suitable for shunting yards which serve sea ports with access to the rail network. It investigates the principal aspects of shunting yards and adequate measures to increase their transit capacity. The operation capacity of the shunting yard sub-system is assessed taking into consideration the required operating standards, and measures of performance of the railway station (e.g. waiting time for freight wagons, number of railway lines in the station, storage area, etc.) are computed. The conclusions and results drawn from the simulation help transport and logistics specialists to test proposals for improving port management.

  10. Time domain SAR raw data simulation using CST and image focusing of 3D objects

    Science.gov (United States)

    Saeed, Adnan; Hellwich, Olaf

    2017-10-01

    This paper presents the use of a general-purpose electromagnetic simulator, CST, to simulate realistic synthetic aperture radar (SAR) raw data of three-dimensional objects. The raw data are later focused in MATLAB using the range-Doppler algorithm. Within CST Microwave Studio, a replica of the TerraSAR-X chirp signal is incident upon a modeled corner reflector (CR) whose design and material properties are identical to those of the real one. After defining the mesh and other appropriate settings, the reflected wave is measured at several distant points along a line parallel to the viewing direction. This is analogous to an array antenna and is synthesized to create a long aperture for SAR processing. The time domain solver in CST is based on the solution of the differential form of Maxwell's equations. The data exported from CST are arranged into a 2-D matrix with range and azimuth axes. A Hilbert transform is applied to convert the real signal to complex data with phase information. Range compression, range cell migration correction (RCMC), and azimuth compression are applied in the time domain to obtain the final SAR image. This simulation can provide valuable information to clarify which real-world objects produce signatures suitable for high-accuracy identification in SAR images.
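
    The range-compression step can be sketched as a frequency-domain matched filter with the chirp replica; the chirp parameters and the single noiseless point target below are illustrative, not TerraSAR-X values.

```python
import numpy as np

# Matched-filter range compression sketch: correlate the raw echo with the
# transmitted chirp replica via FFTs. All parameters are illustrative.

fs, T, B = 100e6, 5e-6, 50e6                 # sample rate, pulse length, bandwidth
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)  # linear FM replica

n = 4096
raw = np.zeros(n, dtype=complex)
delay = 1000                                 # echo delay of a point target (samples)
raw[delay:delay + len(chirp)] = chirp        # noiseless single-target echo

H = np.conj(np.fft.fft(chirp, n))            # matched filter in frequency domain
compressed = np.fft.ifft(np.fft.fft(raw) * H)
print(int(np.argmax(np.abs(compressed))))    # peak lands at the target delay (1000)
```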

  11. Modeling VOC transport in simulated waste drums

    International Nuclear Information System (INIS)

    Liekhus, K.J.; Gresham, G.L.; Peterson, E.S.; Rae, C.; Hotz, N.J.; Connolly, M.J.

    1993-06-01

    A volatile organic compound (VOC) transport model has been developed to describe unsteady-state VOC permeation and diffusion within a waste drum. The model equations account for three primary mechanisms of VOC transport from a void volume within the drum: VOC permeation across a polymer boundary, VOC diffusion across an opening in a volume boundary, and VOC solubilization in a polymer boundary. A series of lab-scale experiments was performed in which the VOC concentration was measured in simulated waste drums under different conditions. A lab-scale simulated waste drum consisted of a sized-down 55-gal metal drum containing a modified rigid polyethylene drum liner. Four polyethylene bags were sealed inside a large polyethylene bag, supported by a wire cage, and placed inside the drum liner. The small bags were filled with a VOC-air gas mixture, and the VOC concentration was measured throughout the drum over a period of time. Test variables included the type of VOC-air gas mixture introduced into the small bags, the small bag closure type, and the presence or absence of a variable external heat source. Model results were calculated for those trials where the VOC permeability had been measured. Permeabilities of five VOCs [methylene chloride, 1,1,2-trichloro-1,2,2-trifluoroethane (Freon-113), 1,1,1-trichloroethane, carbon tetrachloride, and trichloroethylene] were measured across a polyethylene bag. Comparison of model and experimental results for VOC concentration as a function of time indicates that the model accurately accounts for the significant VOC transport mechanisms in a lab-scale waste drum.
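
    A reduced sketch of the permeation mechanism (one VOC, one polymer boundary, forward Euler integration); the geometry and permeability values are illustrative placeholders, not the measured ones.

```python
import numpy as np

# Two-compartment permeation sketch: a VOC crossing a single polymer boundary
# between an inner bag and the surrounding liner volume. Molar flux through
# the wall is P*A*(c_in - c_out)/L. All parameter values are illustrative.

def permeation(c_in0, V_in, V_out, P, A, L, dt=60.0, t_end=24 * 3600.0):
    c_in, c_out = c_in0, 0.0
    for _ in np.arange(0.0, t_end, dt):
        flux = P * A * (c_in - c_out) / L     # amount/s across the bag wall
        c_in -= flux * dt / V_in              # inner concentration falls
        c_out += flux * dt / V_out            # outer concentration rises
    return c_in, c_out

c_in, c_out = permeation(c_in0=1.0, V_in=0.01, V_out=0.05, P=1e-10, A=0.5, L=1e-4)
print(f"inner {c_in:.3f}, outer {c_out:.3f} (both approach 1/6 at equilibrium)")
```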

  12. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the customer and the core data suppliers. (author)

  13. Image simulations of kinked vortices for transmission electron microscopy

    DEFF Research Database (Denmark)

    Beleggia, Marco; Pozzi, G.; Tonomura, A.

    2010-01-01

    We present an improved model of kinked vortices in high-Tc superconductors suitable for the interpretation of Fresnel or holographic observations carried out with a transmission electron microscope. A kinked vortex is composed of two displaced half-vortices, perpendicular to the film plane and connected by a Josephson vortex (JV) segment lying in the plane. The model is motivated by observations of high-Tc superconducting films, where the Fresnel contrast associated with some vortices showed a dumbbell-like appearance. Here, we show that under suitable conditions the JV segment may reveal itself in Fresnel imaging or holographic phase mapping in a transmission electron microscope.

  14. A generalized framework unifying image registration and respiratory motion models and incorporating image reconstruction, for partial image data or full images

    Science.gov (United States)

    McClelland, Jamie R.; Modat, Marc; Arridge, Simon; Grimes, Helen; D'Souza, Derek; Thomas, David; O' Connell, Dylan; Low, Daniel A.; Kaza, Evangelia; Collins, David J.; Leach, Martin O.; Hawkes, David J.

    2017-06-01

    Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of ‘partial’ imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated.
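
    The classical two-step approach that the paper unifies can be sketched as a per-point linear least-squares fit of registration-derived motion against surrogate signals; the synthetic data and the linear model with signal plus temporal gradient below are illustrative assumptions.

```python
import numpy as np

# Correspondence-model fitting sketch: motion(t, p) = A(p)*s(t) + B(p)*ds(t),
# fitted by least squares per point. All data here are synthetic.

rng = np.random.default_rng(1)
n_frames, n_points = 50, 200
s = np.sin(np.linspace(0, 6 * np.pi, n_frames))   # surrogate (e.g. skin height)
ds = np.gradient(s)                               # second surrogate (its gradient)

true_A = rng.normal(size=n_points)                # per-point model coefficients
true_B = rng.normal(size=n_points)
motion = np.outer(s, true_A) + np.outer(ds, true_B)
motion += 0.01 * rng.normal(size=motion.shape)    # registration noise

S = np.column_stack([s, ds])                      # (n_frames, 2) design matrix
coeffs, *_ = np.linalg.lstsq(S, motion, rcond=None)
print(np.abs(coeffs[0] - true_A).max())           # coefficients recovered closely
```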

  15. Simulation of bonding effects in HRTEM images of light element materials

    Directory of Open Access Journals (Sweden)

    Simon Kurasch

    2011-07-01

    The accuracy of multislice high-resolution transmission electron microscopy (HRTEM) simulation can be improved by calculating the scattering potential using density functional theory (DFT). This approach accounts for the fact that electrons in the specimen are redistributed according to their local chemical environment. This influences the scattering process and alters the absolute and relative contrast in the final image. For light-element materials with well-defined geometry, such as graphene and hexagonal boron nitride monolayers, the DFT-based simulation scheme turned out to be necessary to prevent misinterpretation of weak signals, such as the identification of nitrogen substitutions in a graphene network. Furthermore, this implies that the HRTEM image does not only contain structural information (atom positions and atomic numbers); information on the electron charge distribution can be gained in addition. In order to produce meaningful results, the new input parameters need to be chosen carefully. Here we present details of the simulation process and discuss the influence of the main parameters on the final result. Furthermore, we apply the simulation scheme to three model systems: a single-atom boron substitution and a single-atom oxygen substitution in graphene, and an oxygen adatom on graphene.
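
    For orientation, one multislice iteration (transmit through a slice, then Fresnel-propagate to the next) looks as follows; the grid, wavelength, interaction constant, and the one-pixel toy potential are illustrative, and in the DFT-based scheme the projected potential would come from the self-consistent charge density rather than isolated-atom potentials.

```python
import numpy as np

# One multislice step sketch (anti-aliasing bandwidth limit omitted).

n, px = 256, 1e-11                            # grid size, pixel size (m)
lam, dz, sigma = 2.51e-12, 2e-10, 1e7         # ~200 kV wavelength, slice thickness,
                                              # interaction constant (all assumed)
fx = np.fft.fftfreq(n, d=px)
k2 = fx[None, :] ** 2 + fx[:, None] ** 2
propagator = np.exp(-1j * np.pi * lam * dz * k2)   # Fresnel propagator (k-space)

potential = np.zeros((n, n))                  # projected potential of one slice
potential[n // 2, n // 2] = 1e-9              # toy "atom" (V*m)
transmission = np.exp(1j * sigma * potential) # phase-object transmission function

psi = np.ones((n, n), dtype=complex)          # incident plane wave
psi = np.fft.ifft2(np.fft.fft2(transmission * psi) * propagator)
print(np.abs(psi).max())                      # exit-wave amplitude after one slice
```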

  16. Molecular models and simulations of layered materials

    International Nuclear Information System (INIS)

    Kalinichev, Andrey G.; Cygan, Randall Timothy; Heinz, Hendrik; Greathouse, Jeffery A.

    2008-01-01

    The micro- to nano-sized nature of layered materials, particularly characteristic of naturally occurring clay minerals, limits our ability to fully interrogate their atomic dispositions and crystal structures. The low symmetry, multicomponent compositions, defects, and disorder phenomena of clays and related phases necessitate the use of molecular models and modern simulation methods. Computational chemistry tools based on classical force fields and quantum-chemical methods of electronic structure calculations provide a practical approach to evaluate structure and dynamics of the materials on an atomic scale. Combined with classical energy minimization, molecular dynamics, and Monte Carlo techniques, quantum methods provide accurate models of layered materials such as clay minerals, layered double hydroxides, and clay-polymer nanocomposites

  17. VISION: Verifiable Fuel Cycle Simulation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Abdellatif M. Yacout; Gretchen E. Matthern; Steven J. Piet; David E. Shropshire

    2009-04-01

    The nuclear fuel cycle is a very complex system that includes considerable dynamic complexity as well as detail complexity. In the nuclear power realm, there are experts and considerable research and development in nuclear fuel development, separations technology, reactor physics and waste management. What is lacking is an overall understanding of the entire nuclear fuel cycle and how the deployment of new fuel cycle technologies affects the overall performance of the fuel cycle. The Advanced Fuel Cycle Initiative’s systems analysis group is developing a dynamic simulation model, VISION, to capture the relationships, timing and delays in and among the fuel cycle components to help develop an understanding of how the overall fuel cycle works and can transition as technologies are changed. This paper is an overview of the philosophy and development strategy behind VISION. The paper includes some descriptions of the model and some examples of how to use VISION.

  18. Modeling and visual simulation of Microalgae photobioreactor

    Science.gov (United States)

    Zhao, Ming; Hou, Dapeng; Hu, Dawei

    Microalgae are nutritious, highly photosynthetically efficient autotrophic organisms, widely distributed on land and in the sea. They can be used extensively in medicine, food, aerospace, biotechnology, environmental protection, and other fields. A photobioreactor is an important piece of equipment mainly used to cultivate microalgae massively and at high density. In this paper, based on a mathematical model of microalgae grown under different light intensities, a three-dimensional visualization model was built and implemented in 3ds Max, Virtools, and other three-dimensional software. Microalgae are photosynthetic organisms that can efficiently produce oxygen and absorb carbon dioxide. The goal of the visual simulation is to display these changes and their impact on oxygen and carbon dioxide intuitively. In this paper, different temperatures and light intensities were selected to control the photobioreactor, and the dynamic change of microalgal biomass, oxygen, and carbon dioxide was observed, with the aim of providing visualization support for microalgal and photobioreactor research.
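
    A hedged sketch of the kind of growth model such a visualization can be driven by: logistic biomass growth with Monod-type light limitation, with O2 production and CO2 uptake coupled to growth. All constants are invented placeholders, not parameters from the paper.

```python
import numpy as np

# Light-limited logistic growth sketch with gas exchange tied to growth.

def simulate(I, mu_max=0.1, K_I=50.0, X0=0.1, K=5.0, dt=0.1, t_end=120.0):
    mu = mu_max * I / (K_I + I)               # light-limited specific growth (1/h)
    X, o2, co2 = X0, 0.0, 100.0               # biomass (g/L), gas amounts (a.u.)
    for _ in np.arange(0.0, t_end, dt):
        growth = mu * X * (1.0 - X / K)       # logistic growth rate
        X += growth * dt
        o2 += 1.2 * growth * dt               # assumed O2 yield per biomass
        co2 -= 1.8 * growth * dt              # assumed CO2 uptake per biomass
    return X, o2, co2

for I in (20.0, 100.0, 300.0):                # three light intensities
    print(I, simulate(I))
```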

  19. A rainfall simulation model for agricultural development in Bangladesh

    Directory of Open Access Journals (Sweden)

    M. Sayedur Rahman

    2000-01-01

    A rainfall simulation model based on a first-order Markov chain has been developed to simulate the annual variation in rainfall amount that is observed in Bangladesh. The model has been tested in the Barind Tract of Bangladesh. Few significant differences were found between the actual and simulated seasonal, annual, and average monthly rainfall. The distribution of the number of successes is asymptotically normal. When actual and simulated daily rainfall data were used to drive a crop simulation model, there was no significant difference in the rice yield response. The results suggest that the rainfall simulation model performs adequately for many applications.
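
    The generator itself is easy to sketch: a two-state (wet/dry) first-order Markov chain with gamma-distributed wet-day amounts. The transition probabilities and gamma parameters below are illustrative, not the Barind Tract estimates.

```python
import numpy as np

# First-order Markov chain rainfall generator sketch.

rng = np.random.default_rng(42)
p_wet_given_dry, p_wet_given_wet = 0.25, 0.65

def simulate_year(n_days=365):
    wet, amounts = False, []
    for _ in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p                 # first-order Markov transition
        amounts.append(rng.gamma(0.8, 12.0) if wet else 0.0)  # wet-day amount (mm)
    return np.array(amounts)

rain = simulate_year()
print(f"annual total: {rain.sum():.0f} mm, wet days: {(rain > 0).sum()}")
```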

  20. Noise Modeling of SDO AIA Images

    Science.gov (United States)

    Kirk, M. S.; Young, C. A.

    2014-12-01

    All digital images are corrupted by noise. In most solar imaging, we have the luxury of high photon counts and low background contamination, which, when combined with careful calibration, minimize much of the impact noise has on the measurement. Outside high-intensity regions, such as in coronal holes, the noise component can become significant and complicate feature recognition and segmentation. We create a practical estimate of noise in the AIA images across the detector CCD. A Poisson-Gaussian model of noise is well suited to the digital imaging environment due to the statistical distributions of photons and the characteristics of the CCD. Using the dark and flat field calibration images, the level-1 AIA images, and readout noise measurements, we construct a maximum-a-posteriori estimate of the expected error in the AIA images. These estimates of noise not only provide a clearer view of solar features in AIA, but are also relevant to error characterizations of other solar images.
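
    The core of a Poisson-Gaussian error model is a per-pixel variance combining gain-scaled shot noise with Gaussian readout noise, as sketched below; the gain and read-noise values are placeholders, not AIA calibration constants.

```python
import numpy as np

# Poisson-Gaussian per-pixel noise estimate sketch.

def noise_sigma(dn, gain=18.0, read_noise_dn=1.2):
    """Expected 1-sigma error of a pixel value, in DN (gain in e-/DN)."""
    dn = np.asarray(dn, dtype=float)
    shot_var = np.clip(dn, 0.0, None) / gain  # Poisson variance mapped to DN^2
    return np.sqrt(shot_var + read_noise_dn**2)

image_dn = np.array([2.0, 50.0, 400.0, 3000.0])   # coronal hole -> bright loop
print(noise_sigma(image_dn))                      # noise grows as sqrt(signal)
```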

  1. Modeling lift operations with SAS® Simulation Studio

    Science.gov (United States)

    Kar, Leow Soo

    2016-10-01

    Lifts or elevators are an essential part of multistorey buildings, providing vertical transportation for their occupants. In large high-rise apartment buildings, the occupants are permanent, while in buildings like hospitals or office blocks, the occupants are temporary users of the building. They come in to work or to visit, and thus the population of such buildings is much higher than that of residential apartments. It is common these days for large office blocks or hospitals to have at least 8 to 10 lifts serving their population. In order to optimize the level of service performance, different transportation schemes are devised to control the lift operations. For example, one lift may be assigned to serve only the even floors and another only the odd floors, etc. In this paper, a basic lift system is modelled using SAS Simulation Studio to study the effect of factors such as the number of floors, the capacity of the lift car, the arrival and exit rates of passengers at each floor, and peak and off-peak periods on the system performance. The simulation is applied to a real lift operation in Sunway College's North Building to validate the model.

  2. Plasma simulation studies using multilevel physics models

    International Nuclear Information System (INIS)

    Park, W.; Belova, E.V.; Fu, G.Y.

    2000-01-01

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of delta f particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future

  3. Plasma simulation studies using multilevel physics models

    Energy Technology Data Exchange (ETDEWEB)

    Park, W.; Belova, E.V.; Fu, G.Y. [and others]

    2000-01-19

    The question of how to proceed toward ever more realistic plasma simulation studies using ever increasing computing power is addressed. The answer presented here is the M3D (Multilevel 3D) project, which has developed a code package with a hierarchy of physics levels that resolve increasingly complete subsets of phase-spaces and are thus increasingly more realistic. The rationale for the multilevel physics models is given. Each physics level is described and examples of its application are given. The existing physics levels are fluid models (3D configuration space), namely magnetohydrodynamic (MHD) and two-fluids; and hybrid models, namely gyrokinetic-energetic-particle/MHD (5D energetic particle phase-space), gyrokinetic-particle-ion/fluid-electron (5D ion phase-space), and full-kinetic-particle-ion/fluid-electron level (6D ion phase-space). Resolving electron phase-space (5D or 6D) remains a future project. Phase-space-fluid models are not used in favor of delta f particle models. A practical and accurate nonlinear fluid closure for noncollisional plasmas seems not likely in the near future.

  4. Coupling of Large Eddy Simulations with Meteorological Models to simulate Methane Leaks from Natural Gas Storage Facilities

    Science.gov (United States)

    Prasad, K.

    2017-12-01

    Atmospheric transport is usually performed with weather models, e.g., the Weather Research and Forecasting (WRF) model, which employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around the buildings and features comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that utilizes large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with models like WRF. FDS has the potential to evaluate the impact of complex topography on near-field dispersion and mixing that is difficult to simulate with a mesoscale atmospheric model. A methodology has been developed to couple the FDS model with WRF mesoscale transport models. The coupling is based on nudging the FDS flow field towards that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. This approach allows the FDS model to operate as a sub-grid-scale model within a WRF simulation. To test and validate the coupled FDS-WRF model, the methane leak from the Aliso Canyon underground storage facility was simulated. Large eddy simulations were performed over the complex topography of several natural gas storage facilities, including Aliso Canyon, Honor Rancho, and MacDonald Island, at 10 m horizontal and vertical resolution. The goals of these simulations included improving and validating transport models as well as testing leak hypotheses. Forward simulation results were compared with aircraft- and tower-based in-situ measurements as well as methane plumes observed using the NASA Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) and the next-generation instrument AVIRIS-NG. Comparison of simulation results with measurement data demonstrates the capability of the coupled FDS-WRF models to accurately simulate the transport and dispersion of methane plumes over urban domains. Simulated integrated methane enhancements will be presented and

  5. Validation of a simultaneous PET/MR system model for PET simulation using GATE

    International Nuclear Information System (INIS)

    Monnier, Florian; Fayad, Hadi; Bert, Julien; Schmidt, Holger; Visvikis, Dimitris

    2015-01-01

    Simultaneous PET/MR acquisition shows promise in a range of applications. Simulation using GATE is an essential tool that allows obtaining the ground truth for such acquisitions and therefore helps in the development and validation of innovative processing methods such as PET image reconstruction, attenuation correction, and motion correction. The purpose of this work is to validate the GATE simulation of the Siemens Biograph mMR PET/MR system. A model of the Siemens Biograph mMR was developed. This model includes the geometry and spatial positioning of the crystals inside the scanner and the characteristics of the detection process. The accuracy of the model was tested in a real physical phantom study by comparing GATE-simulated results to reconstructed PET images obtained from measurements on a Siemens Biograph mMR system. The same parameters, such as the acquisition time and phantom position inside the scanner, were fixed for our simulations. List-mode outputs were recovered in both cases and reconstructed using the OPL-EM algorithm. Several analyses were used to compare the two reconstructed images, such as profile comparison, signal-to-noise ratio, and activity contrast analysis. Finally, patient-acquired MR images were segmented and used for the simulation of corresponding PET images. The simulated and acquired sets of reconstructed phantom images showed close emission values in regions of interest, with relative differences lower than 5%. The scatter fractions agreed to within 3%. Close matching of profiles and contrast indices was obtained between simulated and corresponding acquired PET images. Our results indicate that the developed GATE model of the Biograph mMR is accurate in comparison to the real scanner performance and can be used for evaluating innovative processing methods for applications in clinical PET/MR protocols.

  6. Validation of a simultaneous PET/MR system model for PET simulation using GATE

    Energy Technology Data Exchange (ETDEWEB)

    Monnier, Florian; Fayad, Hadi; Bert, Julien [LaTIM UMR 1101, Brest (France)]; Schmidt, Holger [University Hospital of Tübingen, Tübingen (Germany)]; Visvikis, Dimitris [LaTIM UMR 1101, Brest (France)]

    2015-05-18

    Simultaneous PET/MR acquisition shows promise in a range of applications. Simulation using GATE is an essential tool that allows obtaining the ground truth for such acquisitions and therefore helps in the development and validation of innovative processing methods such as PET image reconstruction, attenuation correction, and motion correction. The purpose of this work is to validate the GATE simulation of the Siemens Biograph mMR PET/MR system. A model of the Siemens Biograph mMR was developed. This model includes the geometry and spatial positioning of the crystals inside the scanner and the characteristics of the detection process. The accuracy of the model was tested in a real physical phantom study by comparing GATE-simulated results to reconstructed PET images obtained from measurements on a Siemens Biograph mMR system. The same parameters, such as the acquisition time and phantom position inside the scanner, were fixed for our simulations. List-mode outputs were recovered in both cases and reconstructed using the OPL-EM algorithm. Several analyses were used to compare the two reconstructed images, such as profile comparison, signal-to-noise ratio, and activity contrast analysis. Finally, patient-acquired MR images were segmented and used for the simulation of corresponding PET images. The simulated and acquired sets of reconstructed phantom images showed close emission values in regions of interest, with relative differences lower than 5%. The scatter fractions agreed to within 3%. Close matching of profiles and contrast indices was obtained between simulated and corresponding acquired PET images. Our results indicate that the developed GATE model of the Biograph mMR is accurate in comparison to the real scanner performance and can be used for evaluating innovative processing methods for applications in clinical PET/MR protocols.

  7. Lung Cancer Pathological Image Analysis Using a Hidden Potts Model

    Directory of Open Access Journals (Sweden)

    Qianyun Li

    2017-06-01

    Nowadays, many biological data are acquired via images. In this article, we study pathological images scanned from 205 patients with lung cancer, with the goal of finding out the relationship between survival time and the spatial distribution of different types of cells, including lymphocyte, stroma, and tumor cells. Toward this goal, we model the spatial distribution of different types of cells using a modified Potts model, for which the parameters represent interactions between different types of cells, and estimate the parameters of the Potts model using the double Metropolis-Hastings algorithm. The double Metropolis-Hastings algorithm allows us to simulate samples approximately from a distribution with an intractable normalizing constant. Our numerical results indicate that the spatial interaction between the lymphocyte and tumor cells is significantly associated with the patient's survival time, and it can be used together with the cell count information to predict the survival of the patients.
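
    A toy sketch of the model class: cell-type labels on a grid, an energy summing assumed pairwise interaction parameters over neighboring cells, and ordinary Metropolis sampling of the label configuration. Estimating the interaction parameters themselves requires the double Metropolis-Hastings step described above, which is omitted here for brevity.

```python
import numpy as np

# Potts-type cell-interaction model with Metropolis sampling (parameters fixed
# and illustrative; real inference would use double Metropolis-Hastings).

rng = np.random.default_rng(0)
THETA = np.array([[0.0, 0.5, 1.0],            # pairwise interaction penalties for
                  [0.5, 0.0, 0.3],            # (lymphocyte, stroma, tumor) label
                  [1.0, 0.3, 0.0]])           # pairs; values are illustrative

def local_energy(grid, i, j, label):
    n, m = grid.shape
    e = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        e += THETA[label, grid[(i + di) % n, (j + dj) % m]]  # periodic neighbors
    return e

grid = rng.integers(0, 3, size=(32, 32))
for _ in range(50_000):                       # Metropolis updates
    i, j = rng.integers(0, 32, size=2)
    new = int(rng.integers(0, 3))
    dE = local_energy(grid, i, j, new) - local_energy(grid, i, j, grid[i, j])
    if dE <= 0 or rng.random() < np.exp(-dE):
        grid[i, j] = new
print(np.bincount(grid.ravel(), minlength=3)) # cell-type counts after sampling
```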

  8. Lung Cancer Pathological Image Analysis Using a Hidden Potts Model

    Science.gov (United States)

    Li, Qianyun; Yi, Faliu; Wang, Tao; Xiao, Guanghua; Liang, Faming

    2017-01-01

    Nowadays, many biological data are acquired via images. In this article, we study pathological images scanned from 205 patients with lung cancer, with the goal of finding out the relationship between survival time and the spatial distribution of different types of cells, including lymphocyte, stroma, and tumor cells. Toward this goal, we model the spatial distribution of different types of cells using a modified Potts model, for which the parameters represent interactions between different types of cells, and estimate the parameters of the Potts model using the double Metropolis-Hastings algorithm. The double Metropolis-Hastings algorithm allows us to simulate samples approximately from a distribution with an intractable normalizing constant. Our numerical results indicate that the spatial interaction between the lymphocyte and tumor cells is significantly associated with the patient's survival time, and it can be used together with the cell count information to predict the survival of the patients. PMID:28615918

  9. Modeling and numerical simulations of the influenced Sznajd model

    Science.gov (United States)

    Karan, Farshad Salimi Naneh; Srinivasan, Aravinda Ramakrishnan; Chakraborty, Subhadeep

    2017-08-01

    This paper investigates the effects of independent nonconformists, or influencers, on the behavioral dynamics of a population of agents interacting with each other based on the Sznajd model. The system is modeled on a complete graph using the master equation. The resulting equation has been solved numerically. The accuracy of the mathematical model and its corresponding assumptions has been validated by numerical simulations. Regions of initial magnetization have been found from which the system converges to one of two unique steady-state PDFs, depending on the distribution of influencers. The scaling property and entropy of the stationary system in the presence of varying levels of influence have been presented and discussed.

  10. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  11. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different level of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  12. Hybrid Reynolds-Averaged/Large Eddy Simulation of the Flow in a Model SCRamjet Cavity Flameholder

    Science.gov (United States)

    Baurle, R. A.

    2016-01-01

    Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. Experimental data available for this configuration include velocity statistics obtained from particle image velocimetry. Several turbulence models were used for the steady-state Reynolds-averaged simulations, which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged/large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken not only to assess the performance of the hybrid Reynolds-averaged/large eddy simulation modeling approach in a flowfield of interest to the scramjet research community, but also to begin to understand how this capability can best be used to augment standard Reynolds-averaged simulations. The numerical errors were quantified for the steady-state simulations, and at least qualitatively assessed for the scale-resolving simulations, prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results displayed a high degree of variability when comparing the flameholder fuel distributions obtained from each turbulence model. This prompted the consideration of applying the higher-fidelity scale-resolving simulations as a surrogate "truth" model to calibrate the Reynolds-averaged closures in a non-reacting setting prior to their use for the combusting simulations. In general, the Reynolds-averaged velocity profile predictions at the lowest fueling level matched the particle imaging measurements almost as well as was observed for the non-reacting condition. However, the velocity field predictions proved to be more sensitive to the flameholder fueling rate than was indicated in the measurements.

  13. Tecnomatix Plant Simulation modeling and programming by means of examples

    CERN Document Server

    Bangsow, Steffen

    2015-01-01

    This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It is aimed at all users of Plant Simulation who have more complex tasks to handle, while also offering an easy entry into the program. Particular attention is paid to introducing the simulation flow language SimTalk and its use in various areas of the simulation. The author demonstrates with over 200 examples how to combine the blocks for simulation models and how to use SimTalk for complex control and analysis tasks.

  14. Nonlinear distortion in wireless systems modeling and simulation with Matlab

    CERN Document Server

    Gharaibeh, Khaled M

    2011-01-01

    This book covers the principles of modeling and simulation of nonlinear distortion in wireless communication systems, with MATLAB simulations and techniques. In this book, the author describes the principles of modeling and simulation of nonlinear distortion in single- and multichannel wireless communication systems using both deterministic and stochastic signals. Models and simulation methods of nonlinear amplifiers explain in detail how to analyze and evaluate the performance of data communication links under nonlinear amplification. The book addresses the analysis of nonlinear systems

  15. FEM model based optimization of transducer geometry for photoacoustic imaging

    NARCIS (Netherlands)

    Xia, W.; Piras, D.; van Veldhoven, S.; Prins, C; van Leeuwen, Ton; Steenbergen, Wiendelt; Manohar, Srirang

    2012-01-01

    We optimize the design of an ultrasound transducer for photoacoustic breast imaging using FEM analysis. We arrive at a detector design which shows significant improvement in image quality when used in photoacoustic tomographic simulations.

  16. Simulation of deposited dose and application of image processing

    International Nuclear Information System (INIS)

    Dadi, A.; Fahli, A.

    1994-01-01

    In gamma radiation processing, photons from radioactive isotopes are absorbed in matter, where they lose part or all of the energy they possess. At every point P of the irradiated material, the absorbed dose (D) is the energy deposited in the volume (dV) (centred on P) per unit mass (dm) of this volume. The radiation effects at every point of the material depend directly on the locally deposited energy. For technical applications, it is very important to know how the dose is deposited in the irradiated material. Because of the random nature of each process that may occur in the material (photoelectric effect, Compton scattering, and pair production (e+, e-)), and because arbitrary geometries can be treated, we use Monte Carlo simulation in this work to describe the phenomenon and reproduce it fully in the computer during irradiation processing, for photons with energies from a few keV to several MeV in any element, compound, or mixture. The dose rate is then calculated at every point P(x,y,z), stored first as a real data file, then transformed to a byte data file, and finally displayed as a high-resolution, 16-colour digital image, allowing an analysis of the dose variation in the material. 2 figs., 2 refs. (author)
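
    A very reduced sketch of the idea (single effective attenuation coefficient, full local absorption, 1-D depth binning), far simpler than a real photon-transport code:

```python
import numpy as np

# Monte Carlo depth-dose sketch: photon first-interaction depths are sampled
# from an exponential free-path distribution in a 1-D slab and all energy is
# deposited locally. A real code samples photoelectric/Compton/pair events
# from cross sections and tracks secondaries; the single effective attenuation
# coefficient here is an assumption.

rng = np.random.default_rng(11)
mu = 0.1                                      # effective attenuation (1/cm)
depth, n_cells, n_photons = 30.0, 60, 100_000

free_paths = rng.exponential(1.0 / mu, size=n_photons)   # interaction depths (cm)
inside = free_paths < depth                               # photons stopping in slab
cells = (free_paths[inside] / depth * n_cells).astype(int)

dose = np.zeros(n_cells)
np.add.at(dose, cells, 1.0)                   # deposit one energy unit per photon
print(dose[:5] / n_photons)                   # energy fraction in the first cells
```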

  17. Modeling human response errors in synthetic flight simulator domain

    Science.gov (United States)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control-theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling to integrate the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. The models will be verified experimentally in a flight handling qualities simulation.

  18. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces an interaction which couples to the spins of other systems. Simulations from our model show that the time series exhibit the volatility clustering that is often observed in real financial markets. Furthermore, we also find non-zero cross-correlations between the volatilities from our model. Thus our model can simulate stock markets where the volatilities of stocks are mutually correlated.
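
    A single-market sketch in the spirit of Bornholdt-type Ising market models (an assumed simplification: the coupling of several such systems through an extra field term, which is the paper's contribution, is omitted here). Spins are traders' buy/sell positions, and the return is the change in magnetization; all constants are illustrative.

```python
import numpy as np

# Bornholdt-style single-market Ising sketch with heat-bath updates.

rng = np.random.default_rng(7)
n, steps = 500, 3000
beta, J, alpha = 2.0, 1.0, 4.0                # inverse temperature, coupling,
                                              # contrarian strength (assumed)
s = rng.choice([-1, 1], size=n)

mags = []
for _ in range(steps):
    m = s.mean()
    h = J * m - alpha * s * abs(m)            # herding vs. contrarian feedback
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    s = np.where(rng.random(n) < p_up, 1, -1) # heat-bath update of all spins
    mags.append(s.mean())

r = np.diff(mags)                             # log-return proxy
print(f"kurtosis proxy: {np.mean(r**4) / np.mean(r**2)**2:.1f}")  # >3 => fat tails
```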

  19. Image reconstructions from super-sampled data sets with resolution modeling in PET imaging.

    Science.gov (United States)

    Li, Yusheng; Matej, Samuel; Metzler, Scott D

    2014-12-01

    Spatial resolution in positron emission tomography (PET) is still a limiting factor in many imaging applications. To improve the spatial resolution for an existing scanner with fixed crystal sizes, mechanical movements such as scanner wobbling and object shifting have been considered for PET systems. Multiple acquisitions from different positions can provide complementary information and increased spatial sampling. The objective of this paper is to explore an efficient and useful reconstruction framework to reconstruct super-resolution images from super-sampled low-resolution data sets. The authors introduce a super-sampling data acquisition model based on the physical processes, with tomographic, downsampling, and shifting matrices as its building blocks. Based on the model, the authors extend the MLEM and Landweber algorithms to reconstruct images from super-sampled data sets. The authors also derive a backprojection-filtration-like (BPF-like) method for super-sampling reconstruction. Furthermore, they explore variant methods for super-sampling reconstructions: the separate super-sampling resolution-modeling reconstruction and the reconstruction without downsampling, to further improve image quality at the cost of more computation. The authors use simulated reconstructions of a resolution phantom to evaluate the three types of algorithms with different super-samplings at different count levels. The contrast recovery coefficient (CRC) versus background variability, as an image-quality metric, is calculated at each iteration for all reconstructions. The authors observe that all three algorithms can significantly and consistently achieve increased CRCs at fixed background variability and reduce background artifacts with super-sampled data sets at the same count levels. For the same super-sampled data sets, the MLEM method achieves better image quality than the Landweber method, which in turn achieves better image quality than the BPF-like method. The authors also demonstrate
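
    The MLEM update at the heart of such reconstructions can be sketched on a toy system, with the super-sampled acquisitions stacked into one system matrix; the random matrix below is a stand-in, not a scanner model.

```python
import numpy as np

# Toy MLEM sketch: x <- x * A^T(y / Ax) / (A^T 1), where A stacks the system
# matrices of the shifted acquisitions. Dimensions and data are illustrative.

rng = np.random.default_rng(3)
A = rng.random((40, 10))                      # stacked super-sampled system matrix
x_true = 10.0 * rng.random(10)
y = rng.poisson(A @ x_true).astype(float)     # noisy measured counts

x = np.ones(10)                               # uniform initial image
sens = A.T @ np.ones(len(y))                  # sensitivity image A^T 1
for _ in range(200):
    proj = np.maximum(A @ x, 1e-12)           # forward projection Ax
    x *= (A.T @ (y / proj)) / sens            # multiplicative MLEM update
print(np.abs(x - x_true).mean())              # error shrinks toward the noise floor
```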

  20. Nonparametric Mixture Models for Supervised Image Parcellation.

    Science.gov (United States)

    Sabuncu, Mert R; Yeo, B T Thomas; Van Leemput, Koen; Fischl, Bruce; Golland, Polina

    2009-09-01

    We present a nonparametric, probabilistic mixture model for the supervised parcellation of images. The proposed model yields segmentation algorithms conceptually similar to the recently developed label fusion methods, which register a new image with each training image separately. Segmentation is achieved via the fusion of transferred manual labels. We show that in our framework various settings of a model parameter yield algorithms that use image intensity information differently in determining the weight of a training subject during fusion. One particular setting computes a single, global weight per training subject, whereas another setting uses locally varying weights when fusing the training data. The proposed nonparametric parcellation approach capitalizes on recently developed fast and robust pairwise image alignment tools. The use of multiple registrations allows the algorithm to be robust to occasional registration failures. We report experiments on 39 volumetric brain MRI scans with expert manual labels for the white matter, cerebral cortex, ventricles and subcortical structures. The results demonstrate that the proposed nonparametric segmentation framework yields significantly better segmentation than state-of-the-art algorithms.

  1. Modeling and Simulation Techniques for Large-Scale Communications Modeling

    National Research Council Canada - National Science Library

    Webb, Steve

    1997-01-01

    .... Tests of random number generators were also developed and applied to CECOM models. It was found that synchronization of random number strings in simulations is easy to implement and can provide significant savings for making comparative studies. If synchronization is in place, then statistical experiment design can be used to provide information on the sensitivity of the output to input parameters. The report concludes with recommendations and an implementation plan.

  2. Phase contrast image simulations for electron holography of magnetic and electric fields.

    Science.gov (United States)

    Beleggia, Marco; Pozzi, Giulio

    2013-06-01

    The research on flux line lattices and pancake vortices in superconducting materials, carried out within a long and fruitful collaboration with Akira Tonomura and his group at the Hitachi Advanced Research Laboratory, led us to develop a mathematical framework, based on the reciprocal representation of the magnetic vector potential, that enables us to simulate realistic phase images of fluxons. The aim of this paper is to review the main ideas underpinning our computational framework and the results we have obtained throughout the collaboration. Furthermore, we outline how to generalize the approach to model other samples and structures of interest, in particular thin ferromagnetic films, ferromagnetic nanoparticles and p-n junctions.

  3. A hybrid approach to simulate multiple photon scattering in X-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Freud, N. [CNDRI, Laboratory of Nondestructive Testing using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, avenue Albert Einstein, 69621 Villeurbanne Cedex (France)]. E-mail: nicolas.freud@insa-lyon.fr; Letang, J.-M. [CNDRI, Laboratory of Nondestructive Testing using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, avenue Albert Einstein, 69621 Villeurbanne Cedex (France)]; Babot, D. [CNDRI, Laboratory of Nondestructive Testing using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, avenue Albert Einstein, 69621 Villeurbanne Cedex (France)]

    2005-01-01

    A hybrid simulation approach is proposed to compute the contribution of scattered radiation in X- or γ-ray imaging. This approach takes advantage of the complementarity between the deterministic and probabilistic simulation methods. The proposed hybrid method consists of two stages. Firstly, a set of scattering events occurring in the inspected object is determined by means of classical Monte Carlo simulation. Secondly, this set of scattering events is used as a starting point to compute the energy imparted to the detector, with a deterministic algorithm based on a 'forced detection' scheme. For each scattering event, the probability for the scattered photon to reach each pixel of the detector is calculated using well-known physical models (form factor and incoherent scattering function approximations, in the case of Rayleigh and Compton scattering respectively). The results of the proposed hybrid approach are compared to those obtained with the Monte Carlo method alone (Geant4 code) and found to be in excellent agreement. The convergence of the results when the number of scattering events increases is studied. The proposed hybrid approach makes it possible to simulate the contribution of each type (Compton or Rayleigh) and order of scattering, separately or together, with a single PC, within reasonable computation times (from minutes to hours, depending on the number of pixels of the detector). This constitutes a substantial benefit compared to classical simulation methods (Monte Carlo or deterministic approaches), which usually require a parallel computing architecture to obtain comparable results.

  4. Modelling and Simulation of Search Engine

    Science.gov (United States)

    Nasution, Mahyuddin K. M.

    2017-01-01

    The best tool currently used to access information is a search engine. Meanwhile, the information space has its own behaviour. Systematically, an information space needs to be described mathematically so that we can easily identify the characteristics associated with it. This paper reveals some characteristics of search engines based on a model of the document collection, and then estimates their impact on the feasibility of information. We state some characteristics of search engines in a lemma and a theorem about singletons and doubletons, and then compute statistical characteristics to simulate the behaviour of search engines, in this case Google and Yahoo. There are differences in the behaviour of the two search engines, although in theory both are based on the concept of a document collection.

  5. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem, these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
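
    To make the random walk concrete, the sketch below runs an unbiased (analog) version for a two-component repairable system; forced transitions and failure biasing would replace the natural rates with biased ones and carry likelihood-ratio weights, which is omitted here. The rates and mission time are illustrative.

```python
import numpy as np

# Analog Monte Carlo unreliability estimate for two redundant repairable
# components (failure rate lam, repair rate mu): each walk advances through
# exponential holding times until both components are down (system failure)
# or the mission time T ends.

rng = np.random.default_rng(5)
lam, mu, T = 1e-3, 1e-1, 1000.0

def walk():
    t, up = 0.0, 2                            # both components initially up
    while True:
        rate = up * lam + (2 - up) * mu       # total transition rate
        t += rng.exponential(1.0 / rate)
        if t >= T:
            return 0.0                        # mission survived
        if rng.random() < up * lam / rate:    # next event is a failure
            up -= 1
            if up == 0:
                return 1.0                    # system failure before T
        else:
            up += 1                           # next event is a repair

est = np.mean([walk() for _ in range(20_000)])
print(f"estimated unreliability: {est:.4f}")
```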

  6. Modeling and simulation technology readiness levels.

    Energy Technology Data Exchange (ETDEWEB)

    Clay, Robert L.; Shneider, Max S.; Marburger, S. J.; Trucano, Timothy Guy

    2006-01-01

    This report summarizes the results of an effort to establish a framework for assigning and communicating technology readiness levels (TRLs) for the modeling and simulation (ModSim) capabilities at Sandia National Laboratories. This effort was undertaken as a special assignment for the Weapon Simulation and Computing (WSC) program office led by Art Hale, and lasted from January to September 2006. This report summarizes the results, conclusions, and recommendations, and is intended to help guide the program office in their decisions about the future direction of this work. The work was broken out into several distinct phases, starting with establishing the scope and definition of the assignment. These are characterized in a set of key assertions provided in the body of this report. Fundamentally, the assignment involved establishing an intellectual framework for TRL assignments to Sandia's modeling and simulation capabilities, including the development and testing of a process to conduct the assignments. To that end, we proposed a methodology for both assigning and understanding the TRLs, and outlined some of the restrictions that need to be placed on this process and the expected use of the result. One of the first assumptions we overturned was the notion of a "static" TRL--rather we concluded that problem context was essential in any TRL assignment, and that leads to dynamic results (i.e., a ModSim tool's readiness level depends on how it is used, and by whom). While we leveraged the classic TRL results from NASA, DoD, and Sandia's NW program, we came up with a substantially revised version of the TRL definitions, maintaining consistency with the classic level definitions and the Predictive Capability Maturity Model (PCMM) approach. In fact, we substantially leveraged the foundation the PCMM team provided, and augmented that as needed. Given the modeling and simulation TRL definitions and our proposed assignment methodology, we

  7. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data from ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative predic

  8. Edge Detection on Images of Pseudoimpedance Section Supported by Context and Adaptive Transformation Model Images

    Directory of Open Access Journals (Sweden)

    Kawalec-Latała Ewa

    2014-03-01

    Most underground hydrocarbon storage facilities are located in depleted natural gas reservoirs. Seismic surveying is the most economical source of detailed subsurface information, and inversion of a seismic section to obtain a pseudoacoustic impedance section makes it possible to extract that information. The seismic wavelet parameters and noise limit the achievable resolution: unfavourable signal parameters, especially long signal duration, and the presence of noise decrease pseudoimpedance resolution. Approximating the distribution of acoustic pseudoimpedance from measured or modelled seismic data yields visualisations and images useful for identifying stratum homogeneity. In this paper, minimum entropy deconvolution is applied before inversion to improve the resolution of the geologic section image. The author proposes context-based and adaptive image transformations together with edge detection methods as a way to increase the effectiveness of correct interpretation of the simulated images. Edge detection algorithms using the Sobel, Prewitt, Roberts and Canny operators, as well as the Laplacian of Gaussian method, are emphasised. Wiener filtering of the transformed images improves interpretation of the rock section structure from the pseudoimpedance matrix at the acoustic pseudoimpedance value corresponding to a selected geologic stratum. The goal of the study is to develop applications of image transformation tools to detect inhomogeneities in salt deposits.
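
    The operators named above are available in standard scientific Python libraries; a minimal sketch on a synthetic two-layer 'section' follows. The test image and thresholds are assumptions, and the Canny detector (which needs scikit-image) is omitted.

```python
# Hedged sketch: edge detection on a synthetic pseudoimpedance section.
import numpy as np
from scipy import ndimage

# synthetic two-layer "section": impedance step with additive noise
section = np.vstack([np.full((32, 64), 1.0), np.full((32, 64), 2.5)])
section += 0.05 * np.random.default_rng(0).standard_normal(section.shape)

# Sobel gradient magnitude, thresholded to a binary edge map
gx = ndimage.sobel(section, axis=1)
gy = ndimage.sobel(section, axis=0)
sobel_edges = np.hypot(gx, gy) > 1.0

# the Prewitt operator is used the same way
prewitt_edges = np.hypot(ndimage.prewitt(section, 1),
                         ndimage.prewitt(section, 0)) > 1.0

# Laplacian of Gaussian: zero crossings of the LoG response mark edges
log = ndimage.gaussian_laplace(section, sigma=2.0)

print(sobel_edges.sum(), prewitt_edges.sum(), np.abs(log).max())
```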

  9. Discrete imaging models for three-dimensional optoacoustic tomography using radially symmetric expansion functions.

    Science.gov (United States)

    Wang, Kun; Schoonover, Robert W; Su, Richard; Oraevsky, Alexander; Anastasio, Mark A

    2014-05-01

    Optoacoustic tomography (OAT), also known as photoacoustic tomography, is an emerging computed biomedical imaging modality that exploits optical contrast and ultrasonic detection principles. Iterative image reconstruction algorithms that are based on discrete imaging models are actively being developed for OAT due to their ability to improve image quality by incorporating accurate models of the imaging physics, instrument response, and measurement noise. In this work, we investigate the use of discrete imaging models based on Kaiser-Bessel window functions for iterative image reconstruction in OAT. A closed-form expression for the pressure produced by a Kaiser-Bessel function is calculated, which facilitates accurate computation of the system matrix. Computer-simulation and experimental studies are employed to demonstrate the potential advantages of Kaiser-Bessel function-based iterative image reconstruction in OAT.
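
    For reference, here is a sketch of a radially symmetric Kaiser-Bessel expansion function ('blob') of the kind the paper builds its system matrix from; the order m, support radius a and shape parameter alpha are illustrative choices, not the paper's values.

```python
# Hedged sketch: a radially symmetric Kaiser-Bessel expansion function.
import numpy as np
from scipy.special import iv  # modified Bessel function of the first kind

def kaiser_bessel(r, a=1.0, alpha=10.4, m=2):
    """Value of the KB window at radial distance r (zero outside r > a)."""
    r = np.asarray(r, dtype=float)
    z = np.sqrt(np.clip(1.0 - (r / a) ** 2, 0.0, None))
    return np.where(r <= a, z**m * iv(m, alpha * z) / iv(m, alpha), 0.0)

# An image is then represented as a sum of shifted KB 'blobs':
#   f(x) ~ sum_k c_k * kaiser_bessel(|x - x_k|)
r = np.linspace(0, 1.2, 7)
print(kaiser_bessel(r))
```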

  10. Fast Monte Carlo-simulator with full collimator and detector response modelling for SPECT

    International Nuclear Information System (INIS)

    Sohlberg, A.O.; Kajaste, M.T.

    2012-01-01

    Monte Carlo (MC)-simulations have proved to be a valuable tool in studying single photon emission computed tomography (SPECT)-reconstruction algorithms. Despite their popularity, the use of Monte Carlo-simulations is still often limited by their large computation demand. This is especially true in situations where full collimator and detector modelling with septal penetration, scatter and X-ray fluorescence needs to be included. This paper presents a rapid and simple MC-simulator, which can effectively reduce the computation times. The simulator was built on the convolution-based forced detection principle, which can markedly lower the number of simulated photons. Full collimator and detector response look-up tables are pre-simulated and then later used in the actual MC-simulations to model the system response. The developed simulator was validated by comparing it against 123I point source measurements made with a clinical gamma camera system and against 99mTc software phantom simulations made with the SIMIND MC-package. The results showed good agreement between the new simulator, measurements and the SIMIND-package. The new simulator provided near noise-free projection data in approximately 1.5 min per projection with 99mTc, which was less than one-tenth of SIMIND's time. The developed MC-simulator can markedly decrease the simulation time without sacrificing image quality. (author)
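
    The convolution-based forced detection principle can be caricatured in a few lines: each activity plane at a given collimator distance is convolved with a distance-dependent response and accumulated into the projection. In the sketch below a Gaussian whose width grows linearly with distance stands in for the pre-simulated collimator/detector look-up tables described in the paper; sigma0 and slope are invented parameters.

```python
# Hedged sketch of convolution-based forced detection (CFD) projection.
import numpy as np
from scipy.ndimage import gaussian_filter

def cfd_project(activity, sigma0=1.0, slope=0.05):
    """activity: 3-D array (depth, y, x); returns one 2-D projection."""
    proj = np.zeros(activity.shape[1:])
    for d, plane in enumerate(activity):
        sigma = sigma0 + slope * d          # PSF broadens with distance
        proj += gaussian_filter(plane, sigma)
    return proj

act = np.zeros((64, 64, 64))
act[32, 32, 32] = 1.0                       # point source mid-volume
print(cfd_project(act).max())
```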

  11. Monte Carlo Simulations of Background Spectra in Integral Imager Detectors

    Science.gov (United States)

    Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.

    1998-01-01

    Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PICsIT (CsI) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.

  12. Using Computational Simulations to Confront Students' Mental Models

    Science.gov (United States)

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  13. Evaluation of 3D modality-independent elastography for breast imaging: a simulation study

    International Nuclear Information System (INIS)

    Ou, J J; Ong, R E; Yankeelov, T E; Miga, M I

    2008-01-01

    This paper reports on the development and preliminary testing of a three-dimensional implementation of an inverse problem technique for extracting soft-tissue elasticity information via non-rigid model-based image registration. The modality-independent elastography (MIE) algorithm adjusts the elastic properties of a biomechanical model to achieve maximal similarity between images acquired under different states of static loading. A series of simulation experiments with clinical image sets of human breasts were performed to test the ability of the method to identify and characterize a radiographically occult stiff lesion. Because boundary conditions are a critical input to the algorithm, a comparison of three methods for semi-automated surface point correspondence was conducted in the context of systematic and randomized noise processes. The results illustrate that 3D MIE was able to successfully reconstruct elasticity images using data obtained from both magnetic resonance and x-ray computed tomography systems. The lesion was localized correctly in all cases and its relative elasticity found to be reasonably close to the true values (3.5% with the use of spatial priors and 11.6% without). In addition, the inaccuracies of surface registration performed with thin-plate spline interpolation did not exceed empiric thresholds of unacceptable boundary condition error.

  14. A Geostationary Earth Orbit Satellite Model Using Easy Java Simulation

    Science.gov (United States)

    Wee, Loo Kang; Goh, Giam Hwee

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic…
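
    The underlying constant-angular-velocity physics is easy to check numerically: equating gravitational and centripetal acceleration, GM/r^2 = w^2 r, gives the geostationary radius. A short sketch follows, using standard constants rather than anything taken from the article.

```python
# Quick numerical check of the geostationary-orbit condition GM/r^2 = w^2 r.
import math

GM = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
T = 86164.0905               # sidereal day, s
w = 2 * math.pi / T          # constant angular velocity, rad/s

r = (GM / w**2) ** (1.0 / 3.0)
print(f"orbital radius = {r/1e3:,.0f} km")                # ~42,164 km
print(f"altitude       = {(r - 6.378e6)/1e3:,.0f} km")    # ~35,786 km
```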

  15. Simulating Deformations of MR Brain Images for Validation of Atlas-based Segmentation and Registration Algorithms

    OpenAIRE

    Xue, Zhong; Shen, Dinggang; Karacali, Bilge; Stern, Joshua; Rottenberg, David; Davatzikos, Christos

    2006-01-01

    Simulated deformations and images can act as the gold standard for evaluating various template-based image segmentation and registration algorithms. Traditional deformable simulation methods, such as the use of analytic deformation fields or the displacement of landmarks followed by some form of interpolation, are often unable to construct rich (complex) and/or realistic deformations of anatomical organs. This paper presents new methods aiming to automatically simulate realistic inter- and in...

  16. A Monte Carlo simulation study for the gamma-ray/neutron dual-particle imager using rotational modulation collimator (RMC).

    Science.gov (United States)

    Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun

    2017-12-22

    The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotating modulation collimators (RMC) and pulse shape discrimination (PSD)-capable scintillators, for possible applications in radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation maximization (MLEM) method, based on analytical modeling of the source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio (SNR), showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators. © 2017 IOP Publishing Ltd.
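
    The MLEM update used for the reconstruction is compact enough to sketch. The tiny random system matrix below stands in for the analytical RMC source-detector model; only the multiplicative update rule itself reflects the method named above.

```python
# Hedged sketch of the MLEM image-reconstruction update.
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((40, 25))            # system matrix: 40 measurements, 25 pixels
x_true = np.zeros(25); x_true[12] = 5.0
y = rng.poisson(A @ x_true)         # Poisson-noisy modulation measurements

x = np.ones(25)                     # flat initial image
sens = A.sum(axis=0)                # sensitivity (back-projection of ones)
for _ in range(50):
    ratio = y / np.clip(A @ x, 1e-12, None)
    x *= (A.T @ ratio) / sens       # multiplicative MLEM update

print("peak at pixel", x.argmax())  # should recover pixel 12
```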

  17. Deterministic simulation of first-order scattering in virtual X-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Freud, N. E-mail: nicolas.freud@insa-lyon.fr; Duvauchelle, P.; Pistrui-Maximean, S.A.; Letang, J.-M.; Babot, D

    2004-07-01

    A deterministic algorithm is proposed to compute the contribution of first-order Compton- and Rayleigh-scattered radiation in X-ray imaging. This algorithm has been implemented in a simulation code named virtual X-ray imaging. The physical models chosen to account for photon scattering are the well-known form factor and incoherent scattering function approximations, which are recalled in this paper and whose limits of validity are briefly discussed. The proposed algorithm, based on a voxel discretization of the inspected object, is presented in detail, as well as its results in simple configurations, which are shown to converge when the sampling steps are chosen sufficiently small. Simple criteria for choosing correct sampling steps (voxel and pixel size) are established. The order of magnitude of the computation time necessary to simulate first-order scattering images amounts to hours with a PC architecture and can even be decreased down to minutes if only a profile is computed (along a linear detector). Finally, the results obtained with the proposed algorithm are compared to the ones given by the Monte Carlo code Geant4 and found to be in excellent agreement, which constitutes a validation of our algorithm. The advantages and drawbacks of the proposed deterministic method versus the Monte Carlo method are briefly discussed.

  18. Image processing strategies based on saliency segmentation for object recognition under simulated prosthetic vision.

    Science.gov (United States)

    Li, Heng; Su, Xiaofan; Wang, Jing; Kan, Han; Han, Tingting; Zeng, Yajie; Chai, Xinyu

    2018-01-01

    Current retinal prostheses can only generate low-resolution visual percepts, constituted of a limited number of phosphenes elicited by an electrode array, with uncontrollable color and restricted grayscale. With this level of visual perception, prosthetic recipients can complete only simple visual tasks; more complex tasks like face identification/object recognition are extremely difficult. Therefore, it is necessary to investigate and apply image processing strategies for optimizing the visual perception of the recipients. This study focuses on recognition of the object of interest employing simulated prosthetic vision. We used a saliency segmentation method based on a biologically plausible graph-based visual saliency model and a grabCut-based self-adaptive-iterative optimization framework to automatically extract foreground objects. Based on this, two image processing strategies, Addition of Separate Pixelization and Background Pixel Shrink, were further utilized to enhance the extracted foreground objects. i) Psychophysical experiments verified that under simulated prosthetic vision, both strategies had marked advantages over Direct Pixelization in terms of recognition accuracy and efficiency. ii) We also found that recognition performance under the two strategies was tied to the segmentation results and was affected positively by paired, interrelated objects in the scene. The use of the saliency segmentation method and image processing strategies can automatically extract and enhance foreground objects, and significantly improve object recognition performance for recipients implanted with a high-density array. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Simulating SOFIA's image jitter performance and how the results compare to in-flight measurements

    Science.gov (United States)

    Graf, Friederike; Reinacher, Andreas; Spohr, Daniel; Jakob, Holger; Fasoulas, Stefanos

    2017-09-01

    SOFIA, the Stratospheric Observatory for Infrared Astronomy, is an airborne telescope that has been in full operation since 2014. It has already successfully conducted over 400 flights and can be equipped with eight different science instruments which range from the visible to the far-infrared wavelength regime. In order to reach SOFIA's scientific goals, the telescope has to provide a stable platform meeting the ambitious image jitter requirement of less than 0.4" rms. Such a steady operating environment is especially important for slit spectrometers like EXES (Echelon-Cross-Echelle Spectrograph), which aim to keep the star within the area of a very thin slit during integration. Currently, image motion is mainly caused by deformation and excitation of the telescope structure over a wide range of frequencies. These disturbances are counteracted by the so-called Flexible Body Compensation system, which uses a set of accelerometers to estimate the resulting image motion. To better study optimization possibilities for SOFIA's control system, a simulation tool has been developed which not only implements system identification data and analytically derived models, but also allows implementation and verification with sensor data from in-flight measurements. Results of the simulation as well as in-flight measurements will be presented and improvement strategies will be discussed.

  20. An image-based reaction field method for electrostatic interactions in molecular dynamics simulations of aqueous solutions.

    Science.gov (United States)

    Lin, Yuchun; Baumketner, Andrij; Deng, Shaozhong; Xu, Zhenli; Jacobs, Donald; Cai, Wei

    2009-10-21

    In this paper, a new solvation model is proposed for simulations of biomolecules in aqueous solutions that combines the strengths of explicit and implicit solvent representations. Solute molecules are placed in a spherical cavity filled with explicit water, thus providing microscopic detail where it is most needed. Solvent outside of the cavity is modeled as a dielectric continuum whose effect on the solute is treated through the reaction field corrections. With this explicit/implicit model, the electrostatic potential represents a solute molecule in an infinite bath of solvent, thus avoiding unphysical interactions between periodic images of the solute commonly used in the lattice-sum explicit solvent simulations. For improved computational efficiency, our model employs an accurate and efficient multiple-image charge method to compute reaction fields together with the fast multipole method for the direct Coulomb interactions. To minimize the surface effects, periodic boundary conditions are employed for nonelectrostatic interactions. The proposed model is applied to study liquid water. The effect of model parameters, which include the size of the cavity, the number of image charges used to compute reaction field, and the thickness of the buffer layer, is investigated in comparison with the particle-mesh Ewald simulations as a reference. An optimal set of parameters is obtained that allows for a faithful representation of many structural, dielectric, and dynamic properties of the simulated water, while maintaining manageable computational cost. With controlled and adjustable accuracy of the multiple-image charge representation of the reaction field, it is concluded that the employed model achieves convergence with only one image charge in the case of pure water. Future applications to pKa calculations, conformational sampling of solvated biomolecules and electrolyte solutions are briefly discussed.
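
    As a minimal sketch of the image-charge idea that the multiple-image method above refines, the classical single-image (Friedman-type) approximation for a spherical dielectric boundary is shown below; the cavity radius, dielectric constant and Gaussian units are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch: single image charge for a spherical dielectric cavity.
import numpy as np

def friedman_image(q, s, a=10.0, eps=80.0):
    """Charge q at distance s from the centre of a cavity of radius a,
    dielectric constant eps outside. Returns (image charge, image distance)."""
    gamma = (eps - 1.0) / (eps + 1.0)
    return -q * gamma * a / s, a * a / s      # image at the inverse point

def reaction_potential(r_obs, q, r_src, a=10.0, eps=80.0):
    """Reaction-field potential at r_obs inside the cavity (Gaussian units)."""
    s = np.linalg.norm(r_src)
    q_im, d_im = friedman_image(q, s, a, eps)
    r_im = r_src / s * d_im                   # image on the same radial line
    return q_im / np.linalg.norm(r_obs - r_im)

# a unit charge 4 A off-centre in a 10 A cavity; potential back at the charge
print(reaction_potential(np.array([4.0, 0, 0]), 1.0, np.array([4.0, 0, 0])))
```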

  1. Using a simulation assistant in modeling manufacturing systems

    Science.gov (United States)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and are now ported to microcomputers. Graphic and animation capabilities were added to many of these languages to assist the users build models and evaluate the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system will be modeled using the simulation assistant and the advantages and disadvantages discussed.

  2. Development of a Generic Didactic Model for Simulator Training

    National Research Council Canada - National Science Library

    Emmerik, M

    1997-01-01

    .... The development of such a model is motivated by the need to control training and instruction factors in research on simulator fidelity, the need to assess the benefit of training simulators, e.g...

  3. Near-field three-dimensional coherent imaging: Theory and simulations

    Science.gov (United States)

    Silverstein, Seth D.; Zheng, Yibin

    2004-04-01

    This work presents a rigorous mathematical derivation of an effective approximate solution to the three-dimensional inverse scattering/imaging problem that is applicable for all imaging zones ranging from the near to the far field. Simulation results for the point spread function illustrate the range and cross-range resolution as a function of the optical f number. The model system operates in a synthetic aperture type mode, where the coherent signals are transmitted, and the scattered signals are subsequently received at individual transmitters and receivers. Potential applications of this technology include: Medical ultrasound, foliage penetrating synthetic aperture radar, ground penetrating radar for land mine detection, and electromagnetic millimeter-wave scanning for concealed weapon detection.

  4. Identification of a Common Binding Mode for Imaging Agents to Amyloid Fibrils from Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Skeby, Katrine Kirkeby; Sørensen, Jesper; Schiøtt, Birgit

    2013-01-01

    Amyloid diseases are characterized by the misfolding and deposition of proteins in the body in the form of insoluble amyloid fibrils. Alzheimer’s disease and type 2 diabetes mellitus are two examples of amyloid diseases which are closely related both with respect to the atomic structures of the a... experimentally due to the insoluble nature of amyloid fibrils. This study uses molecular dynamics simulations to investigate the interactions between 13 aromatic amyloid imaging agents, entailing 4 different organic scaffolds, and a model of an amyloid fibril. Clustering analysis combined with free energy... binding modes for imaging agents is proposed to originate from subtle differences in amino acid composition of the surface grooves on an amyloid fibril, resulting in fine tuning of the binding affinities for a specific amyloid fibril.

  5. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    Science.gov (United States)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  9. Growth Simulation and Discrimination of Botrytis cinerea, Rhizopus stolonifer and Colletotrichum acutatum Using Hyperspectral Reflectance Imaging.

    Science.gov (United States)

    Sun, Ye; Gu, Xinzhe; Wang, Zhenjie; Huang, Yangmin; Wei, Yingying; Zhang, Miaomiao; Tu, Kang; Pan, Leiqing

    2015-01-01

    This research aimed to develop a rapid and nondestructive method to model the growth and discrimination of spoilage fungi, such as Botrytis cinerea, Rhizopus stolonifer and Colletotrichum acutatum, based on a hyperspectral imaging system (HIS). A hyperspectral imaging system was used to measure the spectral response of fungi inoculated on potato dextrose agar plates and stored at 28°C and 85% RH. The fungi were analyzed every 12 h over two days during growth, and optimal simulation models were built based on HIS parameters. The results showed that the coefficients of determination (R2) of the simulation models for the testing datasets were 0.7223 to 0.9914, and the sum square error (SSE) and root mean square error (RMSE) were in the ranges of 2.03-53.40×10^-4 and 0.011-0.756, respectively. The correlation coefficients between the HIS parameters and colony-forming units of fungi were high, ranging from 0.887 to 0.957. In addition, fungal species were discriminated by partial least squares discriminant analysis (PLSDA), with a classification accuracy of 97.5% for the test dataset at 36 h. The application of this method to real food was addressed through the analysis of Botrytis cinerea, Rhizopus stolonifer and Colletotrichum acutatum inoculated in peaches, demonstrating that the HIS technique was effective for simulating fungal infection in real food. This paper supplies a new technique and useful information for further study into modeling the growth of fungi and detecting fruit spoilage caused by fungi based on HIS.

  10. Modeling and Simulation in Healthcare Future Directions

    Science.gov (United States)

    2010-07-13

    [Slide-deck extraction; the recoverable content frames simulation as the third leg of the information age: computers acquire and analyze information, the Internet distributes and communicates it, and simulation is used to predict, plan and train (attributed to Satava, 1999).]

  11. Four Models of In Situ Simulation

    DEFF Research Database (Denmark)

    Musaeus, Peter; Krogh, Kristian; Paltved, Charlotte

    2014-01-01

    Introduction: In situ simulation is characterized by being situated in the clinical environment as opposed to the simulation laboratory, but it bears a family resemblance to other types of on-the-job training. We explore a typology of in situ simulation and suggest that there are four fruitful approaches: (1) in situ simulation informed by reported critical incidents and adverse events from the emergency departments (ED) in which team training is about to be conducted, used to write scenarios; (2) in situ simulation through ethnographic studies at the ED; (3) using... to team intervention and philosophies informing what good situated learning research is. This study generates system knowledge that might inform scenario development for in situ simulation.

  12. Small animal positron emission tomography with gas detectors. Simulations, prototyping, and quantitative image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Vernekohl, Don

    2014-04-15

    plain surfaces, predicted by simulations, was observed. Third, as the production of photon converters is time consuming and expensive, it was investigated whether or not thin gas detectors with single-lead-layer-converters would be an alternative to the HIDAC converter design. Following simulations, those concepts potentially offer impressive coincidence sensitivities up to 24% for plain lead foils and up to 40% for perforated lead foils. Fourth, compared to other PET scanner systems, the HIDAC concept suffers from missing energy information. Consequently, a substantial amount of scatter events can be found within the measured data. On the basis of image reconstruction and correction techniques the influence of random and scatter events and their characteristics on several simulated phantoms were presented. It was validated with the HIDAC simulator that the applied correction technique results in perfectly corrected images. Moreover, it was shown that the simulator is a credible tool to provide quantitatively improved images. Fifth, a new model for the non-collinearity of the positronium annihilation was developed, since it was observed that the model implemented in the GATE simulator does not correspond to the measured observation. The input parameter of the new model was trimmed to match to a point source measurement. The influence of both models on the spatial resolution was studied with three different reconstruction methods. Furthermore, it was demonstrated that the reduction of converter depth, proposed for increased sensitivity, also has an advantage on the spatial resolution and that a reduction of the FOV from 17 cm to 4 cm (with only 2 detector heads) results in a remarkable sensitivity increase of 150% and a substantial increase in spatial resolution. The presented simulations for the spatial resolution analysis used an intrinsic detector resolution of 0.125 × 0.125 × 3.2 mm³ and were able to reach fair resolutions down to 0.9-0.5 mm, which is an

  13. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  14. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    2013-11-01

    Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks.
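
    In the spirit of the review's agent-based illustration, a deliberately minimal sketch follows in which each agent's risk of alcohol abuse depends on its socioeconomic status (SES) and on the overall prevalence among peers; every parameter is invented for illustration, not taken from the reviewed studies.

```python
# Hedged sketch: minimal agent-based model of an SES gradient in alcohol abuse.
import random

random.seed(0)
N, STEPS = 1000, 50
ses = [random.random() for _ in range(N)]      # 0 = lowest, 1 = highest SES
abuse = [False] * N

for _ in range(STEPS):
    peer_rate = sum(abuse) / N                 # mean-field 'neighbourhood'
    for i in range(N):
        risk = 0.02 + 0.03 * (1 - ses[i]) + 0.05 * peer_rate  # social gradient
        if abuse[i]:
            abuse[i] = random.random() > 0.10  # 10% chance of recovery per step
        else:
            abuse[i] = random.random() < risk

low = [i for i in range(N) if ses[i] < 0.5]
high = [i for i in range(N) if ses[i] >= 0.5]
print("prevalence low-SES:", sum(abuse[i] for i in low) / len(low),
      "high-SES:", sum(abuse[i] for i in high) / len(high))
```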

  15. A simulation environment for validating ultrasonic blood flow and vessel wall imaging based on fluid-structure interaction simulations: ultrasonic assessment of arterial distension and wall shear rate.

    Science.gov (United States)

    Swillens, Abigail; Degroote, Joris; Vierendeels, Jan; Lovstakken, Lasse; Segers, Patrick

    2010-08-01

    Ultrasound (US) is a commonly used vascular imaging tool when screening for patients at high cardiovascular risk. However, current blood flow and vessel wall imaging methods are hampered by several limitations. When optimizing and developing new ultrasound modalities, proper validation is required before clinical implementation. Therefore, the authors present a simulation environment integrating ultrasound and fluid-structure interaction (FSI) simulations, allowing construction of synthetic ultrasound images based on physiologically realistic behavior of an artery. To demonstrate the potential of the model for vascular ultrasound research, the authors studied clinically relevant imaging modalities of arterial function related to both vessel wall deformation and arterial hemodynamics: Arterial distension (related to arterial stiffness) and wall shear rate (related to the development of atherosclerosis) imaging. An in-house code ("TANGO") was developed to strongly couple the flow solver FLUENT and structural solver ABAQUS using an interface quasi-Newton technique. FIELD II was used to model realistic transducer and scan settings. The input to the FSI-US model is a scatterer phantom on which the US waves reflect, with the scatterer displacement derived from the FSI flow and displacement fields. The authors applied the simulation tool to a 3D straight tube, representative of the common carotid artery (length: 5 cm; and inner and outer radius: 3 and 4 mm). A mass flow inlet boundary condition, based on flow measured in a healthy subject, was applied. A downstream pressure condition, based on a noninvasively measured pressure waveform, was chosen and scaled to simulate three different degrees of arterial distension (1%, 4%, and 9%). The RF data from the FSI-US coupling were further processed for arterial wall and flow imaging. Using an available wall tracking algorithm, arterial distensibility was assessed. Using an autocorrelation estimator, blood velocity and shear
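
    The autocorrelation estimator mentioned for blood velocity can be sketched in a few lines: the mean phase of the lag-one autocorrelation of the slow-time IQ signal maps to axial velocity. The carrier frequency, PRF and sound speed below are typical values, not the study's settings.

```python
# Hedged sketch: lag-one autocorrelation (Kasai) velocity estimator.
import numpy as np

C, F0, PRF = 1540.0, 5e6, 4e3          # sound speed m/s, carrier Hz, PRF Hz
V_TRUE = 0.3                           # axial blood velocity, m/s

n = np.arange(32)                      # 32 slow-time samples
fd = 2 * V_TRUE * F0 / C               # Doppler shift
iq = np.exp(2j * np.pi * fd * n / PRF) # noise-free synthetic IQ signal

# lag-one autocorrelation over slow time, then phase -> velocity
r1 = np.sum(iq[1:] * np.conj(iq[:-1]))
v_est = C * PRF * np.angle(r1) / (4 * np.pi * F0)
print(f"estimated velocity: {v_est:.3f} m/s")   # ~0.30
```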

  16. Semantic Image Segmentation with Contextual Hierarchical Models.

    Science.gov (United States)

    Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2016-05-01

    Semantic segmentation is the problem of assigning an object label to each pixel. It unifies the image segmentation and object recognition problems. The importance of using contextual information in semantic segmentation frameworks has been widely realized in the field. We propose a contextual framework, called the contextual hierarchical model (CHM), which learns contextual information in a hierarchical framework for semantic segmentation. At each level of the hierarchy, a classifier is trained based on downsampled input images and the outputs of previous levels. Our model then incorporates the resulting multi-resolution contextual information into a classifier to segment the input image at the original resolution. This training strategy allows for optimization of a joint posterior probability at multiple resolutions through the hierarchy. The contextual hierarchical model is based purely on the input image patches and does not make use of any fragments or shape examples. Hence, it is applicable to a variety of problems such as object segmentation and edge detection. We demonstrate that CHM performs on par with the state of the art on the Stanford background and Weizmann horse datasets. It also outperforms state-of-the-art edge detection methods on the NYU depth dataset and achieves state-of-the-art results on the Berkeley segmentation dataset (BSDS 500).

  17. Modelization and simulation of capillary barriers

    International Nuclear Information System (INIS)

    Lisbona Cortes, F.; Aguilar Villa, G.; Clavero Gracia, C.; Gracia Lozano, J.L.

    1998-01-01

    Among the various underground transport phenomena, transport driven by water flow is of particular relevance. Water flows in infiltration and percolation processes are responsible for the transport of hazardous wastes towards phreatic layers. From the industrial and geological standpoints, there is great interest in the design of natural devices to prevent such flows from transporting polluting substances. This interest increases when the devices are used to isolate radioactive waste repositories, whose lifetime must exceed several hundred years. The so-called natural devices are those based on the superimposition of materials with different hydraulic properties. In particular, flow retention in this kind of stratified medium, under unsaturated conditions, is basically due to the capillary barrier effect, which results from placing a low-conductivity material over another with a high hydraulic conductivity. Covers designed around this effect must also allow drainage of the upper layer. The lower cost of these covers with respect to other kinds of protection systems, and the stability in time of their components, make them very attractive. However, a prior investigation to determine their effectiveness is required. In this report we present the computer code BCSIM, useful for straightforward simulations of unsaturated flows in a capillary barrier configuration with drainage, and intended to serve as a tool for designing efficient covers. The model, the numerical algorithm and several implementation aspects are described. Results obtained in several simulations, confirming the effectiveness of capillary barriers as a technique for building safety covers for hazardous waste repositories, are presented. (Author)

  18. Powertrain modeling and simulation for off-road vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Ouellette, S. [McGill Univ., Montreal, PQ (Canada)

    2010-07-01

    Standard forward-facing automotive powertrain modeling and simulation methodology did not perform equally well for all vehicles in all applications, including the 2010 Winter Olympics, the 2009 World Alpine Ski Championships, Summit Station in Greenland, the McGill Formula Hybrid, the Unicell QuickSider, and lunar mobility. This presentation provided a standard automotive powertrain modeling and simulation flow chart as well as an example. It also provided a flow chart for location-based powertrain modeling and simulation and discussed its implementation. It was found that in certain applications, vehicle-environment interactions cannot be neglected if good model fidelity is to be achieved. Powertrain modeling and simulation of off-road vehicles therefore demands a new approach. It was concluded that the proposed location-based methodology could improve the results for off-road vehicles. tabs., figs.

  19. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    Science.gov (United States)

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diverse interfaces, are tightly coupled, and are closely bound to specific simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address the problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) the model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751

  20. Simulation of specular microscopy images of corneal endothelium, a tool for control of measurement errors.

    Science.gov (United States)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2011-05-01

    We aimed to develop simulation software capable of producing images of the corneal endothelium nearly identical to images captured by clinical specular microscopy, with defined morphometrical characteristics. We further planned to demonstrate the usefulness of the simulator by analysing measurement errors associated with a trained operator using a commercially available semi-automatic algorithm for analysis of simulated images. Software was developed that allows creation of unique images of the corneal endothelium expressing morphology nearly identical to that seen in images from corneal specular microscopy. Several hundred unique images of the corneal endothelium were generated with randomization, spanning the physiological range of endothelial cell density. As an example of the usefulness of the simulator for analysis of measurement errors in corneal specular microscopy, a total of 12 of the generated images were randomly selected such that the endothelial cell density expressed was evenly distributed over the physiological range. The images were transferred to a personal computer. The imagenet-640 software was used to analyse endothelial cell size variation, percentage of hexagonal endothelial cells, and endothelial cell density. The simulator developed allows randomized generation of corneal specular microscopy images with a preset expected average and variation of cell structure. Calculated morphometric information for each cell is stored in the simulator. The image quality can secondarily be varied with a toolbox of filters to approximate a large spectrum of clinically captured images. As an example of the use of the simulator, measurement errors associated with one trained operator using the imagenet-640 software, focusing on endothelial cell density, were examined. The functional dependence between morphometric information estimated with the imagenet-640 software algorithm and real morphometric information as provided

  1. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    guidance acceleration and seeker sensitivity. For the purpose of this investigation the aircraft is equipped with conventional pyrotechnic decoy flares and the missile has no counter-countermeasure means (security restrictions on open publication). This complete simulation is used to calculate the missile miss distance when the missile is launched from different locations around the aircraft. The miss distance data are then graphically presented, showing miss distance (aircraft vulnerability) as a function of launch direction and range. The aircraft vulnerability graph accounts for aircraft and missile characteristics, but does not account for missile deployment doctrine. A Bayesian network is constructed to fuse the doctrinal rules with the aircraft vulnerability data. The Bayesian network then provides the capability to evaluate the combined risk of missile launch and aircraft vulnerability. It is shown in this paper that it is indeed possible to predict aircraft vulnerability to missile attack through a comprehensive and holistic modelling process. By using the appropriate real-world models, this approach is used to evaluate the effectiveness of specific countermeasure techniques against specific missile threats. The use of a Bayesian network provides the means to fuse simulated performance data with more abstract doctrinal rules to provide a realistic assessment of the aircraft vulnerability.

  2. Modeling space-based multispectral imaging systems with DIRSIG

    Science.gov (United States)

    Brown, Scott D.; Sanders, Niek J.; Goodenough, Adam A.; Gartley, Michael

    2011-06-01

    The Landsat Data Continuity Mission (LDCM) focuses on a next-generation, global-coverage imaging system to replace the aging Landsat 5 and Landsat 7 systems. The major difference in the new system is the migration from the multi-spectral whiskbroom design employed by the previous generation of sensors to a modular focal plane, multi-spectral pushbroom architecture. Further complicating the design shift is that the reflective and thermal acquisition capabilities are split across two instruments spatially separated on the satellite bus. One of the focuses of the science and engineering teams prior to launch is the ability to provide seamless data continuity with the historic Landsat data archive. Specifically, they must address the challenges of registering and calibrating data from the new system so that long-term science studies are minimally impacted by the change in system design. In order to provide the science and engineering teams with simulated pre-launch data, an effort was undertaken to create a robust end-to-end model of the LDCM system. The modeling environment is intended to be flexible and incorporate measured data from the actual system components as they were completed and integrated. The output of the modeling environment needs to include not only radiometrically robust imagery, but also the meta-data necessary to exercise the processing pipeline. This paper describes how the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model has been utilized to model space-based, multi-spectral imaging (MSI) systems in support of systems engineering trade studies. A mechanism to incorporate measured focal plane projections through the forward optics is described. A hierarchal description of the satellite system is presented including the details of how a multiple instrument platform is described and modeled, including the hierarchical management of temporally correlated jitter that allows engineers to explore impacts of different jitter sources on instrument

  3. Model calibration for building energy efficiency simulation

    International Nuclear Information System (INIS)

    Mustafaraj, Giorgio; Marini, Dashamir; Costa, Andrea; Keane, Marcus

    2014-01-01

    Highlights: • Developing a 3D model relating building architecture, occupancy and HVAC operation. • Two calibration stages developed, with the final model providing accurate results. • Using an onsite weather station to generate the weather data file in EnergyPlus. • Predicting the thermal behaviour of underfloor heating, heat pump and natural ventilation. • Monthly energy-saving opportunities of 20-27% related to the heat pump were identified. - Abstract: This research work deals with an Environmental Research Institute (ERI) building where an underfloor heating system and natural ventilation are the main systems used to maintain comfort conditions throughout 80% of the building area. Firstly, this work involved developing a 3D model relating to building architecture, occupancy and HVAC operation. Secondly, a two-level calibration methodology was applied in order to ensure accuracy and reduce the likelihood of errors. To further improve the accuracy of calibration, a historical weather data file for the year 2011 was created from the on-site local weather station of the ERI building. After applying the second level of the calibration process, the values of Mean Bias Error (MBE) and Coefficient of Variation of the Root Mean Squared Error (CV(RMSE)) on an hourly basis for heat pump electricity consumption varied within the following ranges: MBE from −5.6% to 7.5% and CV(RMSE) from 7.3% to 25.1%. Finally, the building was simulated with EnergyPlus to identify further energy-saving possibilities for the water-to-water heat pump supplying the underfloor heating system. It was found that electricity consumption savings from the heat pump can vary between 20% and 27% on a monthly basis
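
    The two calibration statistics quoted above are straightforward to compute; a small sketch using the common ASHRAE Guideline 14 definitions follows (the sample data are invented).

```python
# Hedged sketch: calibration statistics MBE and CV(RMSE).
import numpy as np

measured = np.array([12.1, 11.4, 13.0, 12.7, 11.9])   # e.g. hourly kWh
simulated = np.array([11.8, 12.0, 12.4, 13.1, 11.5])

diff = measured - simulated
mbe = 100.0 * diff.sum() / measured.sum()                      # bias, %
cv_rmse = 100.0 * np.sqrt((diff**2).mean()) / measured.mean()  # spread, %
print(f"MBE = {mbe:.1f}%  CV(RMSE) = {cv_rmse:.1f}%")
```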

  4. Sunspot Modeling: From Simplified Models to Radiative MHD Simulations

    Directory of Open Access Journals (Sweden)

    Rolf Schlichenmaier

    2011-09-01

    We review our current understanding of sunspots from the scales of their fine structure to their large-scale (global) structure, including the processes of their formation and decay. Recently, sunspot models have undergone a dramatic change. In the past, several aspects of sunspot structure were addressed by static MHD models with parametrized energy transport. Models of sunspot fine structure relied heavily on strong assumptions about flow and field geometry (e.g., flux tubes, "gaps", convective rolls), which were motivated in part by the observed filamentary structure of penumbrae or by the necessity of explaining the substantial energy transport required to maintain the penumbral brightness. However, none of these models could self-consistently explain all aspects of penumbral structure (energy transport, filamentation, Evershed flow). In recent years, 3D radiative MHD simulations have advanced dramatically to the point at which models of complete sunspots with sufficient resolution to capture sunspot fine structure are feasible. Here, overturning convection is the central element responsible for energy transport, the filamentation leading to fine structure, and the driving of strong outflows. On larger scales, these models are also beginning to address the subsurface structure of sunspots as well as sunspot formation. With this shift in modeling capabilities and the recent advances in high-resolution observations, future research will be guided by comparing observation and theory.

  5. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based both on the authors' experience and on their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution of partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six elements contributing to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies specified application requirements.
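
    A PCMM assessment is essentially a small scoring matrix: one maturity level per element. The sketch below illustrates that structure only; the element names follow the report, but the numeric levels (0-3) assigned in the example are invented, and a real assessment uses the report's attribute tables rather than a single integer:

      # Element names follow the PCMM report; everything else is illustrative.
      PCMM_ELEMENTS = [
          "representation and geometric fidelity",
          "physics and material model fidelity",
          "code verification",
          "solution verification",
          "model validation",
          "uncertainty quantification and sensitivity analysis",
      ]

      def pcmm_summary(scores):
          """scores: dict mapping element name -> assessed maturity level (0-3)."""
          for element in PCMM_ELEMENTS:
              level = scores.get(element, 0)
              assert 0 <= level <= 3, "four maturity levels: 0 through 3"
              print(f"{element:55s} level {level}")
          # the weakest element often dominates the credibility of the whole effort
          print("minimum:", min(scores.get(e, 0) for e in PCMM_ELEMENTS))

      pcmm_summary({"code verification": 2, "model validation": 1})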

  6. Modeling and simulation of the USAVRE network and radiology operations

    Science.gov (United States)

    Martinez, Ralph; Bradford, Daniel Q.; Hatch, Jay; Sochan, John; Chimiak, William J.

    1998-07-01

    The U.S. Army Medical Command, led by the Brooke Army Medical Center, has embarked on a visionary project. The U.S. Army Virtual Radiology Environment (USAVRE) is a CONUS-based network that connects all the Army's major medical centers and Regional Medical Commands (RMC). The purpose of the USAVRE is to improve the quality, access, and cost of radiology services in the Army via the use of state-of-the-art medical imaging, computer, and networking technologies. The USAVRE contains multimedia viewing workstations and database archive systems based on a distributed computing environment using Common Object Request Broker Architecture (CORBA) middleware protocols. The underlying telecommunications network is an ATM-based backbone network that connects the RMC regional networks and PACS networks at medical centers and RMC clinics. This project is a collaborative effort between Army, university, and industry centers with expertise in teleradiology and Global PACS applications. This paper describes a model and simulation of the USAVRE for performance evaluation purposes. As a first step, we present the results of a Technology Assessment and Requirements Analysis (TARA) -- an analysis of the workload, equipment, and staffing in Army radiology departments. Using the TARA data and other workload information, we have developed a very detailed analysis of the workload and workflow patterns of our Medical Treatment Facilities. We are embarking on modeling and simulation strategies that will form the foundation for the VRE network. The workload analysis is performed for each radiology modality in an RMC site. The workload consists of the number of examinations per modality, the type of images per exam, the number of images per exam, and the size of the images. The frequencies of store-and-forward cases, second readings, and interactive consultation cases are also determined. These parameters are translated into the model described below. The model for the USAVRE is hierarchical in nature
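
    Workload parameters of the kind listed above translate directly into storage and bandwidth requirements. A toy sketch in that spirit; the modality mix, exam counts, and image sizes below are invented placeholders, not TARA figures:

      # Hypothetical workload model: (exams/day, images/exam, MB/image)
      MODALITIES = {
          "CR": (120, 2, 8.0),
          "CT": (40, 150, 0.5),
          "MR": (25, 200, 0.13),
          "US": (60, 30, 0.3),
      }

      def daily_volume_gb(modalities):
          """Total image data generated per day, in gigabytes."""
          total_mb = sum(n_exam * n_img * mb
                         for n_exam, n_img, mb in modalities.values())
          return total_mb / 1024.0

      print(f"{daily_volume_gb(MODALITIES):.1f} GB/day")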

  7. Statistical model for OCT image denoising

    KAUST Repository

    Li, Muxingzi

    2017-08-01

    Optical coherence tomography (OCT) is a non-invasive technique with a large array of applications in clinical imaging and biological tissue visualization. However, the presence of speckle noise affects the analysis of OCT images and their diagnostic utility. In this article, we introduce a new OCT denoising algorithm. The proposed method is founded on a numerical optimization framework based on a maximum a posteriori (MAP) estimate of the noise-free OCT image. It combines a novel speckle noise model, derived from local statistics of empirical spectral-domain OCT (SD-OCT) data, with a Huber variant of total variation regularization for edge preservation. The proposed approach yields satisfactory results in terms of speckle noise reduction as well as edge preservation, at reduced computational cost.
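
    To make the ingredients concrete, the sketch below minimizes a MAP-style objective with a Huber-regularized (anisotropic) total variation term by plain gradient descent. It is a generic illustration under simplifying assumptions -- in particular, a quadratic data term stands in for the paper's speckle likelihood, and all parameter values are arbitrary:

      import numpy as np

      def huber_deriv(t, delta):
          """Derivative of the Huber penalty: quadratic core, linear tails."""
          return np.clip(t, -delta, delta)

      def denoise_huber_tv(noisy, lam=0.2, delta=0.05, step=0.2, n_iter=300):
          """Minimize 0.5*||x - y||^2 + lam * sum Huber(grad x) by gradient descent."""
          x = noisy.astype(float).copy()
          for _ in range(n_iter):
              # forward differences with replicated boundary (last diff is zero)
              dx = np.diff(x, axis=1, append=x[:, -1:])
              dy = np.diff(x, axis=0, append=x[-1:, :])
              px, py = huber_deriv(dx, delta), huber_deriv(dy, delta)
              # adjoint of the forward difference: grad_k = p_{k-1} - p_k
              gx = np.roll(px, 1, axis=1) - px
              gx[:, 0] = -px[:, 0]
              gy = np.roll(py, 1, axis=0) - py
              gy[0, :] = -py[0, :]
              x -= step * ((x - noisy) + lam * (gx + gy))
          return x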

  8. Multiple Constraints Based Robust Matching of Poor-Texture Close-Range Images for Monitoring a Simulated Landslide

    Directory of Open Access Journals (Sweden)

    Gang Qiao

    2016-05-01

    constraints, followed by a refinement step with a similarity constraint and robust checking. A series of temporal Single-Lens Reflex (SLR) and High-Speed Camera (HSC) stereo images captured during the simulated landslide experiment performed on the campus of Tongji University, Shanghai, was employed to illustrate the proposed method, and dense and reliable image matching results were obtained. Finally, a series of temporal Digital Surface Models (DSM) of the landslide process was constructed using close-range photogrammetry techniques, followed by a discussion of the landslide volume changes and surface elevation changes during the simulation experiment.

  9. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility and robustness of medical image computing methods must be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for the prediction of organ changes and the risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images such as radiographic images, dual-energy CT images, MR images, diffusion tensor images and microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  10. Beyond Modeling: All-Atom Olfactory Receptor Model Simulations

    Directory of Open Access Journals (Sweden)

    Peter C Lai

    2012-05-01

    Olfactory receptors (ORs) are a type of GTP-binding protein-coupled receptor (GPCR). These receptors are responsible for mediating the sense of smell through their interaction with odor ligands. OR-odorant interactions mark the first step in the process that leads to olfaction. Computational studies of model OR structures can validate experimental functional studies as well as generate focused and novel hypotheses for further bench investigation by providing a view of these interactions at the molecular level. Here we have shown the specific advantages of simulating the dynamic environment that is associated with OR-odorant interactions. We present a rigorous methodology that ranges from the creation of a computationally derived model of an olfactory receptor to simulating the interactions between an OR and an odorant molecule. Given the ubiquitous occurrence of GPCRs in the membranes of cells, we anticipate that the methodology developed for ORs will serve as a model for the computational structural biology of all GPCRs.

  11. Methods for modeling and quantification in functional imaging by positron emissions tomography and magnetic resonance imaging

    International Nuclear Information System (INIS)

    Costes, Nicolas

    2017-01-01

    This report presents experience and research in the field of in vivo medical imaging by positron emission tomography (PET) and magnetic resonance imaging (MRI). In particular, advances in terms of reconstruction, quantification and modeling in PET are described. The validation of processing and analysis methods is supported by the creation of data through simulation of the PET imaging process. The recent advent of combined clinical PET/MRI cameras, allowing simultaneous acquisition of molecular/metabolic PET information and functional/structural MRI information, opens the door to unique methodological innovations exploiting the spatial alignment and simultaneity of the PET and MRI signals. It will lead to an increase in accuracy and sensitivity in the measurement of biological phenomena. In this context, the projects developed address new methodological issues related to quantification and to the respective contributions of MRI and PET information for a reciprocal improvement of the signals of the two modalities. They open perspectives for combined analysis of the two imaging techniques, allowing optimal use of synchronous anatomical, molecular and functional information for brain imaging. These innovative concepts, as well as the data correction and analysis methods, can easily be translated to other areas of investigation using combined PET/MRI. (author) [fr
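
    As one concrete example of the kind of quantification model used in PET, the sketch below integrates a standard one-tissue compartment model, dCt/dt = K1*Cp(t) - k2*Ct(t), with a forward-Euler step. This is a textbook building block, not a method from this report; the input function and rate constants are invented:

      import numpy as np

      def one_tissue_ct(t, cp, K1=0.1, k2=0.05):
          """Tissue curve of a one-tissue compartment model.

          t  -- time grid [min]; cp -- plasma input function sampled on t
          K1 [mL/min/mL] and k2 [1/min] are illustrative values.
          """
          ct = np.zeros_like(t, dtype=float)
          for i in range(1, len(t)):
              dt = t[i] - t[i - 1]
              ct[i] = ct[i - 1] + dt * (K1 * cp[i - 1] - k2 * ct[i - 1])
          return ct

      t = np.linspace(0.0, 60.0, 601)   # 60 min at 0.1 min steps
      cp = np.exp(-0.1 * t) * t         # toy plasma input function
      ct = one_tissue_ct(t, cp)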

  12. Monte Carlo simulations of elemental imaging using the neutron-associated particle technique.

    Science.gov (United States)

    Abel, Michael R; Nie, Linda H

    2018-02-06

    The purpose of this study is to develop and employ a Monte Carlo (MC) simulation model of associated particle neutron elemental imaging (APNEI) in order to determine the three-dimensional (3D) imaging resolution of such a system by examining relevant physical and technological parameters, and thereby to begin exploring the range of clinical applicability of APNEI in fields such as medical diagnostics, intervention, and etiological research. The presented APNEI model was defined in MCNP by a Gaussian-distributed, isotropic surface source emitting deuterium + deuterium (DD) neutrons; iron as the target element; nine iron-containing voxels (1 cm³ volume each) arranged in a 3-by-3 array as the interrogated volume of interest; and, finally, high-purity germanium (HPGe) gamma-ray detectors anterior and posterior to the 9-voxel array. The MCNP F8 pulse-height tally was employed in conjunction with the PTRAC particle-tracking function not only to determine the signal acquired from iron inelastic-scatter gamma rays but also to quantify each of the nine target voxels' contributions to the overall iron signal -- each detected iron inelastic-scatter gamma ray being traced to the source neutron that incited its emission. With the spatial, vector, and timing information of the series of events for each relevant neutron history as collected by PTRAC, realistic grayscale images of the distribution of iron concentration in the 9-voxel array were simulated in both the projective and depth dimensions. With an overall 225 ps timing resolution, 6.25 mm² imaging-plate pixels assumed to have well-localized scintillation, and a Gaussian-distributed DD neutron source spot with a diameter of 2 mm, projective and depth resolutions were determined. The imaging resolution offered by APNEI for target elements such as iron lends itself to potential applications in disease diagnosis and treatment planning (high resolution) as well as ordnance and contraband detection (low resolution). However
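
    The depth coordinate in an associated-particle system comes from neutron time of flight, so the quoted 225 ps timing resolution implies a rough lower bound on depth blur. A back-of-the-envelope sketch, assuming nonrelativistic kinematics for a 2.45 MeV DD neutron:

      import math

      MEV_TO_J = 1.602176634e-13
      M_NEUTRON = 1.67492749804e-27  # kg
      
      def neutron_speed(energy_mev):
          """Nonrelativistic neutron speed; adequate at DD energies (v/c ~ 0.07)."""
          return math.sqrt(2.0 * energy_mev * MEV_TO_J / M_NEUTRON)

      v = neutron_speed(2.45)   # DD fusion neutron, ~2.45 MeV
      dt = 225e-12              # overall timing resolution [s]
      print(f"v = {v:.3e} m/s, depth blur ~ v*dt = {v * dt * 1e3:.1f} mm")
      # -> roughly 5 mm of depth uncertainty from timing alone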

  13. Federated Modelling and Simulation for Critical Infrastructure Protection

    NARCIS (Netherlands)

    Rome, E.; Langeslag, P.J.H.; Usov, A.

    2014-01-01

    Modelling and simulation is an important tool for Critical Infrastructure (CI) dependency analysis, for testing methods for risk reduction, as well as for the evaluation of past failures. Moreover, interaction of such simulations with external threat models, e.g., a river flood model, or economic

  14. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    The purpose of this study is to provide an IDEF method-based integrated framework for business process simulation models that reduces model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected with a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is intended to support both the requirement-collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of the three descriptive IDEF methods and the features of relational database technology. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve simulation project processes through the use of IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to the generation of other analytical models by separating the logic from the data.
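
    The core idea -- generating executable simulation models from a relational data model -- can be illustrated in a few lines. The sketch below is a toy stand-in, not the paper's framework: it uses sqlite3 for the data model and the SimPy process-based simulator, and every table, column, and parameter name is invented:

      import sqlite3
      import simpy  # any process-based simulator would do

      # A toy "common data model": process steps as rows in a relational table.
      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE steps (seq INTEGER, name TEXT, minutes REAL)")
      db.executemany("INSERT INTO steps VALUES (?, ?, ?)",
                     [(1, "deposition", 45.0), (2, "lithography", 30.0),
                      (3, "etch", 25.0)])

      def lot(env, steps, lot_id):
          """One wafer lot flowing through the steps read from the database."""
          for _, name, minutes in steps:
              yield env.timeout(minutes)
              print(f"t={env.now:6.1f}  lot {lot_id} finished {name}")

      steps = db.execute("SELECT seq, name, minutes FROM steps ORDER BY seq").fetchall()
      env = simpy.Environment()
      for i in range(2):  # release two lots at time zero
          env.process(lot(env, steps, i))
      env.run()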

  15. Simulation models in population breast cancer screening : A systematic review

    NARCIS (Netherlands)

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for

  16. Simulation Modeling of a Facility Layout in Operations Management Classes

    Science.gov (United States)

    Yazici, Hulya Julie

    2006-01-01

    Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…

  17. The Random Walk Drainage Simulation Model as a Teaching Exercise

    Science.gov (United States)

    High, Colin; Richards, Paul

    1972-01-01

    Practical instructions for using the random walk drainage network simulation model as a teaching exercise are given and the results discussed. A source of directional bias in the resulting simulated drainage patterns is identified and given an interpretation in terms of the model. Three points of educational value concerning the model are…
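
    For readers unfamiliar with the exercise, the classic Leopold-Langbein random-walk idea can be reproduced in a few lines. The sketch below is a minimal classroom variant under assumed rules (one walker per top-row cell, always stepping one row downslope); the grid size and plotting threshold are arbitrary:

      import random

      # Cells visited by many independent walks mark the trunk "streams".
      ROWS, COLS = 20, 30
      visits = [[0] * COLS for _ in range(ROWS)]

      random.seed(1)
      for start_col in range(COLS):
          r, c = 0, start_col
          while r < ROWS:
              visits[r][c] += 1
              c = min(max(c + random.choice((-1, 0, 1)), 0), COLS - 1)
              r += 1  # always move one row downslope

      for row in visits:
          print("".join("#" if v >= 3 else "." for v in row))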

  18. Maneuver simulation model of an experimental hovercraft