WorldWideScience

Sample records for models imaging simulation

  1. Geometry model construction in infrared image theory simulation of buildings

    Institute of Scientific and Technical Information of China (English)

    谢鸣; 李玉秀; 徐辉; 谈和平

    2004-01-01

    Geometric model construction is the basis of infrared image theory simulation. Taking the construction of the geometric model of one building in Harbin as an example, this paper analyzes the theoretical grounds for simplification and the principles of geometric model construction for buildings. It then discusses particular treatment methods for calculating the radiation transfer coefficient in the geometric model using the Monte Carlo method.
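
    Radiation transfer coefficients of this kind are commonly estimated by Monte Carlo ray sampling between surfaces. The sketch below estimates the view factor from one rectangular facade patch to a parallel facing patch by firing cosine-weighted rays and counting hits; the geometry and sample count are invented for illustration, and this is not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def view_factor_parallel_rects(w_a, h_a, w_b, h_b, gap, n_rays=200_000):
    """Monte Carlo estimate of the view factor from rectangle A (in the z=0 plane)
    to rectangle B (in the z=gap plane), both centred on the z axis."""
    # Uniform emission points on A.
    x0 = rng.uniform(-w_a / 2, w_a / 2, n_rays)
    y0 = rng.uniform(-h_a / 2, h_a / 2, n_rays)
    # Cosine-weighted directions over the hemisphere (diffuse emission).
    phi = rng.uniform(0.0, 2.0 * np.pi, n_rays)
    sin_theta = np.sqrt(rng.uniform(0.0, 1.0, n_rays))
    cos_theta = np.sqrt(1.0 - sin_theta ** 2)
    dx, dy, dz = sin_theta * np.cos(phi), sin_theta * np.sin(phi), cos_theta
    # Intersection of each ray with the plane of B.
    t = gap / dz
    xh, yh = x0 + t * dx, y0 + t * dy
    hits = (np.abs(xh) <= w_b / 2) & (np.abs(yh) <= h_b / 2)
    return hits.mean()

# Two facing 2 m x 3 m patches separated by 5 m (illustrative geometry only).
print(f"F_AB ~ {view_factor_parallel_rects(2.0, 3.0, 2.0, 3.0, 5.0):.4f}")
```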

  2. Towards integrated modelling: full image simulations for WEAVE

    Science.gov (United States)

    Dalton, Gavin; Ham, Sun Jeong; Trager, Scott; Abrams, Don Carlos; Bonifacio, Piercarlo; Aguerri, J. A. L.; Middleton, Kevin; Benn, Chris; Rogers, Kevin; Stuik, Remko; Carrasco, Esperanza; Vallenari, Antonella; Jin, Shoko; Lewis, Jim

    2016-08-01

    We present an integrated end-to-end simulation of the spectral images that will be obtained by the WEAVE spectrograph, which aims to include full modelling of all effects from the top of the atmosphere to the detector. These data are based on input spectra from a combination of library spectra and synthetic models, and will be used to provide inputs for an end-to-end test of the full WEAVE data pipeline and archive systems, prior to first light of the instrument.

  3. A real-time infrared imaging simulation method with physical effects modeling of infrared sensors

    Science.gov (United States)

    Li, Ni; Huai, Wenqing; Wang, Shaodan; Ren, Lei

    2016-09-01

    Infrared imaging simulation technology can provide infrared data sources for the development, improvement and evaluation of infrared imaging systems under different environmental, status and weather conditions, and it is reusable and more economical than physical experiments. A real-time infrared imaging simulation process is established to reproduce a complete physical imaging process. Emphasis is placed on the modeling of infrared sensors, covering physical effects in both the spatial domain and the frequency domain. An improved image convolution method based on GPU parallel processing is proposed to enhance real-time simulation capability while preserving simulation accuracy. Finally, the effectiveness of these methods is validated by simulation analysis and comparison of results.
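
    The sensor-effects stage described above amounts to convolving each rendered frame with the sensor point spread function (PSF). The paper accelerates this on a GPU; the sketch below is a minimal CPU stand-in using FFT-based convolution, with an assumed Gaussian PSF and frame size chosen purely for illustration.

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(size=15, sigma=2.0):
    """Illustrative separable Gaussian PSF (stand-in for a measured sensor PSF)."""
    ax = np.arange(size) - size // 2
    g = np.exp(-0.5 * (ax / sigma) ** 2)
    psf = np.outer(g, g)
    return psf / psf.sum()

def apply_sensor_blur(frame, psf):
    """Blur one simulated IR radiance frame with the sensor PSF (spatial-domain effect)."""
    return fftconvolve(frame, psf, mode="same")

# Toy usage: a synthetic 512x512 radiance frame with a small hot target.
frame = np.zeros((512, 512))
frame[250:260, 250:260] = 1000.0          # hot target, arbitrary units
blurred = apply_sensor_blur(frame, gaussian_psf())
print(blurred.max())
```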

  4. MR imaging of model drug distribution in simulated vitreous

    Directory of Open Access Journals (Sweden)

    Stein Sandra

    2015-09-01

    The in vitro and in vivo characterization of intravitreal injections plays an important role in developing innovative therapy approaches. Using the established vitreous model (VM) and eye movement system (EyeMoS), the distribution of contrast agents with different molecular weights was studied in vitro. The impact of simulated age-related vitreal liquefaction (VL) on drug distribution in the VM was examined with injection either through the gel phase or through the liquid phase. For comparison, the distribution was studied ex vivo in the porcine vitreous. The studies were performed in a magnetic resonance (MR) scanner. As expected, the diffusion velocity and the visual distribution of the injected substances decreased with increasing molecular weight. Similar drug distribution was observed in the VM and in the porcine eye. VL causes enhanced convective flow and faster distribution in the VM. Confirming the importance of the injection technique as VL progresses, injection through the gel phase caused faster distribution into peripheral regions of the VM than injection through the liquefied phase. The VM and MR scanner in combination present a new approach for the in vitro characterization of drug release and distribution of intravitreal dosage forms.

  5. Fast simulation of ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Nikolov, Svetoslav

    2000-01-01

    Realistic B-mode and flow images can be simulated with scattering maps based on optical, CT, or MR images or parametric flow models. The image simulation often includes using 200,000 to 1 million point scatterers. One image line typically takes 1800 seconds to compute on a state-of-the-art PC, an...

  6. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Given the high value of specialized low-cost training for healthcare professionals, there is growing interest in this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run on either notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator is based on ray casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results, with a performance of up to 55 fps achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
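
    The scatter model summarized above combines a multiplicative noise field with convolution by a point spread function. The following is a minimal sketch of that general idea, not the authors' implementation: an echogenicity map (here random, in the paper derived from CT) is multiplied by random scatterer noise, convolved with an assumed Gaussian-modulated axial PSF, envelope-detected and log-compressed.

```python
import numpy as np
from scipy.signal import fftconvolve, hilbert

rng = np.random.default_rng(0)

# Echogenicity map (stand-in for a map derived from CT Hounsfield units).
echogenicity = rng.uniform(0.2, 1.0, size=(256, 256))

# Multiplicative scatterer noise.
scatterers = echogenicity * rng.normal(0.0, 1.0, size=echogenicity.shape)

# Assumed PSF: Gaussian-modulated cosine axially (rows), Gaussian laterally (columns).
z = np.arange(-16, 17)
axial = np.cos(2 * np.pi * z / 6.0) * np.exp(-0.5 * (z / 4.0) ** 2)
lateral = np.exp(-0.5 * (z / 8.0) ** 2)
psf = np.outer(axial, lateral)

rf = fftconvolve(scatterers, psf, mode="same")           # simulated RF image
envelope = np.abs(hilbert(rf, axis=0))                    # axial envelope detection
bmode = 20 * np.log10(envelope / envelope.max() + 1e-6)   # log compression (dB)
print(bmode.min(), bmode.max())
```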

  7. Image Simulation with Shapelets

    CERN Document Server

    Massey, Richard J.; Refregier, Alexandre R.; Conselice, Christopher J.; Bacon, David J.

    2004-01-01

    We present a method to manufacture simulated deep sky images, with realistic galaxy morphologies and telescope characteristics. For this purpose, we first use the shapelets formalism (Refregier 2003) to parametrize the shapes of all galaxies in the Hubble Deep Field. We consider the distribution of real galaxy morphologies in shapelet parameter space, then resample this distribution to generate new galaxies. The simulated objects include realistic spiral arms, bars, discs, arbitrary radial profiles and even dust lanes or knots. We apply standard morphology diagnostics to demonstrate that our artificial images closely mimic real data in terms of galaxy size, concentration and asymmetry statistics, etc. Observational effects, including Point-Spread Function, noise, pixellisation, and astrometric distortions are also modelled. Sample images are made available on the world wide web. These simulations are useful to develop and test precision image analysis techniques, including photometry, astrometry, and shape me...

  8. Stochastic simulation by image quilting of process-based geological models

    Science.gov (United States)

    Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef

    2017-09-01

    Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.

  9. Stochastic simulation by image quilting of process-based geological models

    DEFF Research Database (Denmark)

    Hoffimann, Júlio; Scheidt, Celine; Barfod, Adrian

    2017-01-01

    In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new...

  10. Computer simulation of magnetic resonance angiography imaging: model description and validation.

    Directory of Open Access Journals (Sweden)

    Artur Klepaczko

    With the development of medical imaging modalities and image processing algorithms, there arises a need for methods of their comprehensive quantitative evaluation. In particular, this concerns the algorithms for vessel tracking and segmentation in magnetic resonance angiography images. The problem can be approached by using synthetic images, where the true geometry of vessels is known. This paper presents a framework for computer modeling of MRA imaging and the results of its validation. The new model incorporates blood flow simulation within the MR signal computation kernel. The proposed solution is unique, especially with respect to the interface between the flow and image formation processes. Furthermore, it utilizes the concept of particle tracing. The particles reflect the flow of the fluid they are immersed in and are assigned magnetization vectors whose temporal evolution is controlled by MR physics. Such an approach ensures flexibility, as the designed simulator is able to reconstruct flow profiles of any type. The proposed model is validated in a series of experiments with physical and digital flow phantoms. The synthesized 3D images contain various features (including artifacts) characteristic of the time-of-flight protocol and exhibit remarkable correlation with the data acquired in a real MR scanner. The obtained results support the primary goal of the conducted research, i.e. establishing a reference technique for the quantified validation of MR angiography image processing algorithms.

  11. Momentum transfer Monte Carlo model for the simulation of laser speckle contrast imaging (Conference Presentation)

    Science.gov (United States)

    Regan, Caitlin; Hayakawa, Carole K.; Choi, Bernard

    2016-03-01

    Laser speckle imaging (LSI) enables measurement of relative blood flow in microvasculature and perfusion in tissues. To determine the impact of tissue optical properties and perfusion dynamics on speckle contrast, we developed a computational simulation of laser speckle contrast imaging. We used a discrete absorption-weighted Monte Carlo simulation to model the transport of light in tissue. We simulated optical excitation of a uniform flat light source and tracked the momentum transfer of photons as they propagated through a simulated tissue geometry. With knowledge of the probability distribution of momentum transfer occurring in various layers of the tissue, we calculated the expected laser speckle contrast arising with coherent excitation using both reflectance and transmission geometries. We simulated light transport in a single homogeneous tissue while independently varying either absorption (0.001-100 mm^-1), reduced scattering (0.1-10 mm^-1), or anisotropy (0.05-0.99) over a range of values relevant to blood and commonly imaged tissues. We observed that contrast decreased by 49% with an increase in optical scattering, and increased by 130% with absorption (exposure time = 1 ms). We also explored how speckle contrast was affected by the depth (0-1 mm) and flow speed (0-10 mm/s) of a dynamic vascular inclusion. This model of speckle contrast is important for increasing our understanding of how parameters such as perfusion dynamics, vessel depth, and tissue optical properties affect laser speckle imaging.
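
    A common way to connect exposure time and flow-induced decorrelation to the measured contrast, complementary to the Monte Carlo approach above, is the widely used closed-form model K^2 = beta * (exp(-2x) - 1 + 2x) / (2x^2) with x = T/tau_c for a negative-exponential field correlation. The sketch below simply evaluates that expression; the coherence factor and correlation times are placeholder values, and this is not the authors' momentum-transfer model.

```python
import numpy as np

def speckle_contrast(exposure_time, tau_c, beta=1.0):
    """Closed-form speckle contrast for a field correlation g1(t) = exp(-t / tau_c);
    beta is the coherence (instrumentation) factor."""
    x = exposure_time / tau_c
    return np.sqrt(beta * (np.exp(-2 * x) - 1 + 2 * x) / (2 * x ** 2))

T = 1e-3                                   # 1 ms exposure, as in the abstract
for tau_c in (1e-5, 1e-4, 1e-3):           # shorter tau_c corresponds to faster flow
    print(f"tau_c = {tau_c:.0e} s -> K = {speckle_contrast(T, tau_c):.3f}")
```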

  12. Image Processing Strategies Based on a Visual Saliency Model for Object Recognition Under Simulated Prosthetic Vision.

    Science.gov (United States)

    Wang, Jing; Li, Heng; Fu, Weizhen; Chen, Yao; Li, Liming; Lyu, Qing; Han, Tingting; Chai, Xinyu

    2016-01-01

    Retinal prostheses have the potential to restore partial vision. Object recognition in scenes of daily life is one of the essential tasks for implant wearers. Because wearers are still limited by the low-resolution visual percepts provided by retinal prostheses, it is important to investigate and apply image processing methods that convey more useful visual information to them. We proposed two image processing strategies based on Itti's visual saliency map, region of interest (ROI) extraction, and image segmentation. Itti's saliency model generated a saliency map from the original image, in which salient regions were grouped into an ROI by fuzzy c-means clustering. GrabCut then generated a proto-object from the ROI-labeled image, which was recombined with the background and enhanced in two ways: 8-4 separated pixelization (8-4 SP) and background edge extraction (BEE). Results showed that both 8-4 SP and BEE had significantly higher recognition accuracy than direct pixelization (DP). Each saliency-based image processing strategy depended on the performance of image segmentation. Under good and perfect segmentation conditions, BEE and 8-4 SP obtained noticeably higher recognition accuracy than DP, and under the bad segmentation condition, only BEE boosted performance. The application of saliency-based image processing strategies was verified to be beneficial to object recognition in daily scenes under simulated prosthetic vision. These strategies are expected to aid the development of the image processing module for future retinal prostheses and thus provide more benefit to patients.

  13. Mixed reality orthognathic surgical simulation by entity model manipulation and 3D-image display

    Science.gov (United States)

    Shimonagayoshi, Tatsunari; Aoki, Yoshimitsu; Fushima, Kenji; Kobayashi, Masaru

    2005-12-01

    In orthognathic surgery, 3D surgical planning that considers the balance between the front and back positions and the symmetry of the jawbone, as well as the dental occlusion of the teeth, is essential. In this study, a support system for orthognathic surgery has been developed to visualize the changes in the mandible and the occlusal condition and to determine the optimum position in mandibular osteotomy. By integrating the operating portion (an entity tooth model manipulated to determine the optimum occlusal position) with the 3D-CT skeletal images (the 3D image display portion) displayed simultaneously in real time, the mandibular position and posture that improve skeletal morphology and occlusal condition can be determined. The realistic operation of the entity model combined with the virtual 3D image display enabled the construction of a surgical simulation system based on augmented reality.

  14. Investigation on reduced thermal models for simulating infrared images in fusion devices

    Science.gov (United States)

    Gerardin, J.; Aumeunier, M.-H.; Firdaouss, M.; Gardarein, J.-L.; Rigollet, F.

    2016-09-01

    In fusion facilities, the in-vessel wall receives high heat flux densities of up to 20 MW/m2. The monitoring of in-vessel components is usually ensured by infrared (IR) thermography, but with all-metallic walls, disturbance phenomena such as reflections may lead to inaccurate temperature estimates, potentially endangering machine safety. A fully predictive photonic simulation is therefore used to assess the IR measurements accurately. This paper investigates reduced thermal models (semi-infinite wall, thermal quadrupoles) for predicting the surface temperature from the particle loads on components for a given plasma scenario. The results are compared with a reference 3D finite element method (Ansys Mechanical) and used as input for simulating IR images. The performance of the reduced thermal models is analysed by comparing the resulting IR images.
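
    As an illustration of the simplest reduced model mentioned above, the surface temperature rise of a semi-infinite wall under a constant heat flux q applied from t = 0 follows the textbook solution dT(t) = 2*q*sqrt(t / (pi*k*rho*c)). The sketch below evaluates it; the material properties are rough tungsten-like values chosen for illustration, not the paper's component data.

```python
import numpy as np

def surface_temperature_rise(q, t, k, rho, cp):
    """Semi-infinite solid under constant surface heat flux q [W/m^2] from t = 0:
    dT(t) = 2*q*sqrt(t / (pi*k*rho*cp))."""
    return 2.0 * q * np.sqrt(t / (np.pi * k * rho * cp))

# Rough tungsten-like properties (illustrative values only).
k, rho, cp = 170.0, 19300.0, 134.0       # W/(m K), kg/m^3, J/(kg K)
q = 10e6                                  # 10 MW/m^2 surface load
for t in (0.1, 1.0, 5.0):                 # seconds
    print(f"t = {t:4.1f} s -> dT = {surface_temperature_rise(q, t, k, rho, cp):7.1f} K")
```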

  15. Three-dimensional fuse deposition modeling of tissue-simulating phantom for biomedical optical imaging

    Science.gov (United States)

    Dong, Erbao; Zhao, Zuhua; Wang, Minjie; Xie, Yanjun; Li, Shidi; Shao, Pengfei; Cheng, Liuquan; Xu, Ronald X.

    2015-12-01

    Biomedical optical devices are widely used for clinical detection of various tissue anomalies. However, optical measurements have limited accuracy and traceability, partially owing to the lack of effective calibration methods that simulate the actual tissue conditions. To facilitate standardized calibration and performance evaluation of medical optical devices, we develop a three-dimensional fuse deposition modeling (FDM) technique for freeform fabrication of tissue-simulating phantoms. The FDM system uses transparent gel wax as the base material, titanium dioxide (TiO2) powder as the scattering ingredient, and graphite powder as the absorption ingredient. The ingredients are preheated, mixed, and deposited at the designated ratios layer-by-layer to simulate tissue structural and optical heterogeneities. By printing the sections of human brain model based on magnetic resonance images, we demonstrate the capability for simulating tissue structural heterogeneities. By measuring optical properties of multilayered phantoms and comparing with numerical simulation, we demonstrate the feasibility for simulating tissue optical properties. By creating a rat head phantom with embedded vasculature, we demonstrate the potential for mimicking physiologic processes of a living system.

  16. The PLATO Simulator: Modelling of High-Precision High-Cadence Space-Based Imaging

    CERN Document Server

    Marcos-Arenal, P; De Ridder, J; Aerts, C; Huygen, R; Samadi, R; Green, J; Piotto, G; Salmon, S; Catala, C; Rauer, H

    2014-01-01

    Many aspects of the design trade-off of a space-based instrument and its performance can best be tackled through simulations of the expected observations. The complex interplay of various noise sources in the course of the observations makes such simulations an indispensable part of the assessment and design study of any space-based mission. We present a formalism to model and simulate photometric time series of CCD images by including models of the CCD and its electronics, the telescope optics, the stellar field, the jitter movements of the spacecraft, and all important natural noise sources. This formalism has been implemented in a versatile end-to-end simulation software tool, called PLATO Simulator, specifically designed for the PLATO space mission to be operated from L2, but easily adaptable to similar types of missions. We provide a detailed description of several noise sources and discuss their properties, in connection with the optical design, the allowable level of jitter, the quantum efficiency of th...
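
    The kind of end-to-end noise budget described above can be illustrated with a much simpler toy model: a photometric time series in which the stellar flux is corrupted by photon (Poisson) noise and Gaussian readout noise. The flux level, read noise and number of exposures below are arbitrary placeholders, not PLATO parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

n_exposures = 1000
true_flux = 50_000.0        # mean detected photoelectrons per exposure (placeholder)
read_noise = 10.0           # electrons RMS per exposure (placeholder)

# Photon noise: Poisson-distributed counts around the true flux.
counts = rng.poisson(true_flux, size=n_exposures).astype(float)
# Readout noise: additive Gaussian contribution from the CCD electronics.
counts += rng.normal(0.0, read_noise, size=n_exposures)

relative_scatter = counts.std() / counts.mean()
print(f"relative scatter = {relative_scatter*1e6:.0f} ppm "
      f"(photon-noise limit ~ {1e6/np.sqrt(true_flux):.0f} ppm)")
```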

  17. Shadow effects in simulated ultrasound images derived from computed tomography images using a focused beam tracing model

    DEFF Research Database (Denmark)

    Pham, An Hoai; Lundgren, Bo; Stage, Bjarne

    2012-01-01

    Simulation of ultrasound images based on computed tomography (CT) data has previously been performed with different approaches. Shadow effects are normally pronounced in ultrasound images, so they should be included in the simulation. In this study, a method to capture the shadow effects has been...

  18. An imaging-based computational model for simulating angiogenesis and tumour oxygenation dynamics

    Science.gov (United States)

    Adhikarla, Vikram; Jeraj, Robert

    2016-05-01

    Tumour growth, angiogenesis and oxygenation vary substantially among tumours and significantly impact their treatment outcome. Imaging provides a unique means of investigating these tumour-specific characteristics. Here we propose a computational model to simulate tumour-specific oxygenation changes based on the molecular imaging data. Tumour oxygenation in the model is reflected by the perfused vessel density. Tumour growth depends on its doubling time (Td) and the imaged proliferation. Perfused vessel density recruitment rate depends on the perfused vessel density around the tumour (sMVDtissue) and the maximum VEGF concentration for complete vessel dysfunctionality (VEGFmax). The model parameters were benchmarked to reproduce the dynamics of tumour oxygenation over its entire lifecycle, which is the most challenging test. Tumour oxygenation dynamics were quantified using the peak pO2 (pO2peak) and the time to peak pO2 (tpeak). Sensitivity of tumour oxygenation to model parameters was assessed by changing each parameter by 20%. tpeak was found to be more sensitive to tumour cell line related doubling time (~30%) as compared to tissue vasculature density (~10%). On the other hand, pO2peak was found to be similarly influenced by the above tumour- and vasculature-associated parameters (~30-40%). Interestingly, both pO2peak and tpeak were only marginally affected by VEGFmax (~5%). The development of a poorly oxygenated (hypoxic) core with tumour growth increased VEGF accumulation, thus disrupting the vessel perfusion as well as further increasing hypoxia with time. The model, with its benchmarked parameters, is applied to hypoxia imaging data obtained using a [64Cu]Cu-ATSM PET scan of a mouse tumour and the temporal development of the vasculature and hypoxia maps are shown. The work underscores the importance of using tumour-specific input for analysing tumour evolution. An extended model incorporating therapeutic effects can serve as a powerful tool for analysing

  19. Cortical imaging on a head template: a simulation study using a resistor mesh model (RMM).

    Science.gov (United States)

    Chauveau, Nicolas; Franceries, Xavier; Aubry, Florent; Celsis, Pierre; Rigaud, Bernard

    2008-09-01

    The T1 head template model used in Statistical Parametric Mapping Version 2000 (SPM2) was segmented into five layers (scalp, skull, CSF, grey and white matter) and implemented in 2 mm voxels. We designed a resistor mesh model (RMM), based on the finite volume method (FVM), to simulate the electrical properties of this head model along the three axes for each voxel. Then, we introduced four dipoles of high eccentricity (about 0.8) in this RMM, separately and simultaneously, to compute the potentials for two sets of conductivities. We used the direct cortical imaging technique (CIT) to recover the simulated dipoles, using 60 or 107 electrodes and with or without addition of Gaussian white noise (GWN). The use of realistic conductivities gave better CIT results than standard conductivities, lowering the blurring effect on scalp potentials and displaying more accurate position areas when CIT was applied to single dipoles. Simultaneous dipoles were less accurately localized, but good qualitative and stable quantitative results were obtained up to 5% noise level for 107 electrodes and up to 10% noise level for 60 electrodes, showing that a compromise must be found to optimize both the number of electrodes and the noise level. With the RMM defined in 2 mm voxels, the standard 128-electrode cap and 5% noise appears to be the upper limit providing reliable source positions when direct CIT is used. The admittance matrix defining the RMM is easy to modify so as to adapt to different conductivities. The next step will be the adaptation of individual real head T2 images to the RMM template and the introduction of anisotropy using diffusion imaging (DI).

  20. Signal and image processing systems performance evaluation, simulation, and modeling; Proceedings of the Meeting, Orlando, FL, Apr. 4, 5, 1991

    Science.gov (United States)

    Nasr, Hatem N.; Bazakos, Michael E.

    The various aspects of the evaluation and modeling problems in algorithms, sensors, and systems are addressed. Consideration is given to a generic modular imaging IR signal processor, real-time architecture based on the image-processing module family, application of the Proto Ware simulation testbed to the design and evaluation of advanced avionics, development of a fire-and-forget imaging infrared seeker missile simulation, an adaptive morphological filter for image processing, laboratory development of a nonlinear optical tracking filter, a dynamic end-to-end model testbed for IR detection algorithms, wind tunnel model aircraft attitude and motion analysis, an information-theoretic approach to optimal quantization, parametric analysis of target/decoy performance, neural networks for automated target recognition parameters adaptation, performance evaluation of a texture-based segmentation algorithm, evaluation of image tracker algorithms, and multisensor fusion methodologies. (No individual items are abstracted in this volume)

  1. Modeling and simulations of three-dimensional laser imaging based on space-variant structure

    Science.gov (United States)

    Cao, Jie; Hao, Qun; Peng, Yuxin; Cheng, Yang; Mu, Jiaxing; Wang, Peng; Yu, Haoyong

    2016-04-01

    A three-dimensional (3D) laser imaging system based on time of flight and inspired by the structure of the human retina is proposed. The system obtains 3D images with space-variant resolution. We establish mathematical models of the system and carry out a simulative comparison between the space-variant structure (SVS) and the space-invariant structure (SIS). The system based on SVS offers significant improvements over a traditional system based on SIS in the following respects: (1) the SVS system uses fewer pixels than the SIS system for the same field of view (FOV) and resolution, making it more suitable for situations that require high speed and large-volume data processing; (2) the SVS system uses the photodiode array more efficiently than the SIS system; (3) the 3D image based on SVS has the properties of rotation and scaling invariance; (4) the SVS system has higher echo power in the outer rings of a large photodiode array, which is more effective for detecting targets with low reflectance.

  2. Comparison Between Linear and Nonlinear Models of Mixed Pixels in Remote Sensing Satellite Images Based on Cierniewski Surface BRDF Model by Means of Monte Carlo Ray Tracing Simulation

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-04-01

    A comparative study of linear and nonlinear mixed-pixel models, in which pixels of remote sensing satellite images are composed of several ground cover materials mixed together, is conducted for remote sensing satellite image analysis. The mixed-pixel models are based on Cierniewski's ground surface reflectance model. The comparison is carried out by means of Monte Carlo ray tracing (MCRT) simulations. Through the simulation study, the difference between the linear and nonlinear mixed-pixel models is clarified, and the simulation model is validated.
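
    For reference, the linear mixed-pixel model writes the pixel reflectance as an abundance-weighted sum of endmember spectra, while a simple nonlinear (bilinear) variant adds pairwise interaction terms. The sketch below contrasts the two model forms with made-up endmember spectra and abundances; it illustrates the general idea only, not the Cierniewski BRDF or the MCRT simulation used in the paper.

```python
import numpy as np

n_bands = 6
# Hypothetical endmember spectra (rows: soil, vegetation, water), illustrative values.
endmembers = np.array([
    [0.30, 0.32, 0.35, 0.38, 0.40, 0.42],   # soil
    [0.05, 0.08, 0.06, 0.45, 0.48, 0.50],   # vegetation
    [0.08, 0.07, 0.06, 0.04, 0.03, 0.02],   # water
])
abundances = np.array([0.5, 0.3, 0.2])       # fractional abundances, sum to 1

# Linear mixing: r = sum_i a_i * e_i
r_linear = abundances @ endmembers

# Bilinear (nonlinear) mixing: add pairwise interaction terms gamma * a_i * a_j * (e_i * e_j)
gamma = 0.5                                   # interaction strength (assumed)
r_nonlinear = r_linear.copy()
for i in range(len(abundances)):
    for j in range(i + 1, len(abundances)):
        r_nonlinear += gamma * abundances[i] * abundances[j] * endmembers[i] * endmembers[j]

print("linear   :", np.round(r_linear, 3))
print("bilinear :", np.round(r_nonlinear, 3))
```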

  3. Modelling and Numerical Simulations of In-Air Reverberation Images for Fault Detection in Medical Ultrasonic Transducers: A Feasibility Study

    Directory of Open Access Journals (Sweden)

    W. Kochański

    2015-01-01

    A simplified two-dimensional finite element model which simulates the in-air reverberation image produced by medical ultrasonic transducers has been developed. The model simulates a linear array consisting of 128 PZT-5A crystals, a tungsten-epoxy backing layer, an Araldite matching layer, and a Perspex lens layer. The thickness of the crystal layer is chosen to simulate pulses centered at 4 MHz. The model is used to investigate whether changes in the electromechanical properties of the individual transducer layers (backing layer, crystal layer, matching layer, and lens layer) have an effect on the simulated in-air reverberation image generated. Changes in the electromechanical properties are designed to simulate typical medical transducer faults such as crystal drop-out, lens delamination, and deterioration in piezoelectric efficiency. The simulations demonstrate that fault-related changes in transducer behaviour can be observed in the simulated in-air reverberation image pattern. This exploratory approach may help to provide insight into deterioration in transducer performance and help with early detection of faults.

  4. Simulation model of mammographic calcifications based on the American College of Radiology Breast Imaging Reporting and Data System, or BIRADS.

    Science.gov (United States)

    Kallergi, M; Gavrielides, M A; He, L; Berman, C G; Kim, J J; Clark, R A

    1998-10-01

    The authors developed and evaluated a method for the simulation of calcification clusters based on the guidelines of the Breast Imaging Reporting and Data System of the American College of Radiology. They aimed to reproduce accurately the relative and absolute size, shape, location, number, and intensity of real calcifications associated with both benign and malignant disease. Thirty calcification clusters were simulated by using the proposed model and were superimposed on real, negative mammograms digitized at 30 microns and 16 bits per pixel. The accuracy of the simulation was evaluated by three radiologists in a blinded study. No statistically significant difference was observed in the observers' evaluation of the simulated clusters and the real clusters. The observers' classification of the cluster types seemed to be a good approximation of the intended types from the simulation design. This model can provide simulated calcification clusters with well-defined morphologic, distributional, and contrast characteristics for a variety of applications in digital mammography.

  5. Simulation of Sentinel-3 images by four stream surface atmosphere radiative transfer modeling in the optical and thermal domains

    NARCIS (Netherlands)

    Verhoef, W.; Bach, H.

    2012-01-01

    Simulation of future satellite images can be applied in order to validate the general mission concept and to test the performance of advanced multi-sensor algorithms for the retrieval of surface parameters. This paper describes the radiative transfer modeling part of a so-called Land Scene Generator

  7. Wax Modeling and Image Analysis for Classroom-Scale Lava Flow Simulations.

    Science.gov (United States)

    Rader, E. L.; Clarke, A. B.; Vanderkluysen, L.

    2016-12-01

    The use of polyethylene glycol wax (PEG 600) as an analog for lava allows for a visual representation of the complex physical processes occurring in natural lava flows, including cooling, breakouts, and crust and lobe formation. We used a series of cameras positioned around a tank filled with chilled water as a lab bench to observe and quantify lava flow morphology and motion. A peristaltic pump connected to a vent at the base of the tank delivered dyed wax, simulating effusive eruptions similar to those of Kilauea in Hawai`i. By varying eruptive conditions such as wax temperature and eruption rate, students can observe how the crust forms on wax flows, how different textures result, and how a flow field evolves with time. Recorded footage of the same 'eruption' can then be quantitatively analyzed using free software like ImageJ and Tracker to quantify time series of spreading rate, change in height, and appearance of different surface morphologies. Additional dye colors can be added periodically to further illustrate how lava is transported from the vent to the periphery of a flow field (e.g., through a tube system). Data collected from this activity can be compared to footage of active lava flows in Hawai`i and with numerical models of lava flow propagation, followed by discussions of applying these data and concepts to predicting the behavior of lava in hazard management situations and to interpreting paleomagnetic, petrologic, and mapping data from older eruptions.

  8. Development of a simplified simulation model for performance characterization of a pixellated CdZnTe multimodality imaging system

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, P; Santos, A [Departamento de IngenierIa Electronica, Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Darambara, D G [Joint Department of Physics, Royal Marsden NHS Foundation Trust and The Institute of Cancer Research, Fulham Road, London SW3 6JJ (United Kingdom)], E-mail: pguerra@die.um.es

    2008-02-21

    Current requirements of molecular imaging lead to the complete integration of complementary modalities in a single hybrid imaging system to correlate function and structure. Among the various existing detector technologies which can be implemented to integrate nuclear modalities (PET and/or single-photon emission computed tomography) with x-rays (CT) and most probably with MR, pixellated wide-bandgap room-temperature semiconductor detectors, such as CdZnTe and/or CdTe, are promising candidates. This paper deals with the development of a simplified simulation model for pixellated semiconductor radiation detectors, as a first step towards the performance characterization of a multimodality imaging system based on CdZnTe. In particular, this work presents a simple computational model, based on a 1D approximate solution of the Shockley-Ramo theorem, and its integration into the Geant4 application for tomographic emission (GATE) platform in order to improve the accuracy of simulations of pixellated detectors in different configurations with simultaneous cathode and anode pixel readout. The model presented here is successfully validated against an existing detailed finite element simulator, the multi-geometry simulation code, with respect to the charge induced at the anode, taking into consideration interpixel charge sharing and crosstalk, and to the detector charge induction efficiency. As a final point, the model provides estimated energy spectra and time resolution for 57Co and 18F sources obtained with the GATE code after the incorporation of the proposed model.
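
    A common 1D simplification in this context is the Hecht relation for the charge induction efficiency of a planar detector as a function of interaction depth, given the electron and hole mobility-lifetime products and the applied field. The sketch below evaluates it with order-of-magnitude CdZnTe-like values chosen for illustration; it is not the GATE-integrated model described in the abstract.

```python
import numpy as np

def hecht_cie(z, thickness, bias, mu_tau_e, mu_tau_h):
    """Charge induction efficiency of a planar detector (Hecht relation).

    z: interaction depth measured from the cathode [m]
    mu_tau_e, mu_tau_h: mobility-lifetime products [m^2/V]
    Assumes a uniform electric field bias/thickness."""
    efield = bias / thickness
    lam_e = mu_tau_e * efield      # electron drift length [m]
    lam_h = mu_tau_h * efield      # hole drift length [m]
    term_e = (lam_e / thickness) * (1.0 - np.exp(-(thickness - z) / lam_e))
    term_h = (lam_h / thickness) * (1.0 - np.exp(-z / lam_h))
    return term_e + term_h

# Illustrative CdZnTe-like parameters (order-of-magnitude placeholders).
L, bias = 5e-3, 500.0                        # 5 mm detector, 500 V bias
mu_tau_e, mu_tau_h = 1e-7, 1e-9              # m^2/V (~1e-3 and 1e-5 cm^2/V)

for z in np.linspace(0.0, L, 6):
    print(f"z = {z*1e3:4.1f} mm -> CIE = {hecht_cie(z, L, bias, mu_tau_e, mu_tau_h):.3f}")
```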

  9. Realistic simulation of reduced-dose CT with noise modeling and sinogram synthesis using DICOM CT images

    Energy Technology Data Exchange (ETDEWEB)

    Won Kim, Chang [Interdisciplinary Program of Bioengineering Major Seoul National University College of Engineering, San 56-1, Silim-dong, Gwanak-gu, Seoul 152-742, South Korea and Institute of Radiation Medicine, Seoul National University College of Medicine, 28, Yongon-dong, Chongno-gu, Seoul 110-744 (Korea, Republic of); Kim, Jong Hyo, E-mail: kimjhyo@snu.ac.kr [Department of Radiology, Institute of Radiation Medicine, Seoul National University College of Medicine, 28, Yongon-dong, Chongno-gu, Seoul, 110-744 (Korea, Republic of); Department of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon, Gyeonggi-do, 443-270 (Korea, Republic of); Advanced Institutes of Convergence Technology, Seoul National University, Suwon, Gyeonggi-do, 443-270 (Korea, Republic of)

    2014-01-15

    Purpose: Reducing the patient dose while maintaining the diagnostic image quality during CT exams is the subject of a growing number of studies, in which simulations of reduced-dose CT with patient data have been used as an effective technique when exploring the potential of various dose reduction techniques. Difficulties in accessing raw sinogram data, however, have restricted the use of this technique to a limited number of institutions. Here, we present a novel reduced-dose CT simulation technique which provides realistic low-dose images without the requirement of raw sinogram data. Methods: Two key characteristics of CT systems, the noise equivalent quanta (NEQ) and the algorithmic modulation transfer function (MTF), were measured for various combinations of object attenuation and tube currents by analyzing the noise power spectrum (NPS) of CT images obtained with a set of phantoms. Those measurements were used to develop a comprehensive CT noise model covering the reduced x-ray photon flux, object attenuation, system noise, and bow-tie filter, which was then employed to generate a simulated noise sinogram for the reduced-dose condition with the use of a synthetic sinogram generated from a reference CT image. The simulated noise sinogram was filtered with the algorithmic MTF and back-projected to create a noise CT image, which was then added to the reference CT image, finally providing a simulated reduced-dose CT image. The simulation performance was evaluated in terms of the degree of NPS similarity, the noise magnitude, the bow-tie filter effect, and the streak noise pattern at photon starvation sites with the set of phantom images. Results: The simulation results showed good agreement with actual low-dose CT images in terms of their visual appearance and in a quantitative evaluation test. The magnitude and shape of the NPS curves of the simulated low-dose images agreed well with those of real low-dose images, showing discrepancies of less than ±3.2% in
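
    The abstract above simulates reduced dose by synthesizing a noise sinogram matched to measured NEQ and MTF. As a much cruder illustration of the same idea, the sketch below scales photon counts in a synthetic sinogram to mimic a lower tube current and reconstructs the noisier image with filtered back-projection; the count levels and attenuation scaling are arbitrary, and none of the NEQ, bow-tie, or MTF modelling from the paper is included.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

rng = np.random.default_rng(0)

phantom = shepp_logan_phantom()                      # stand-in for a reference CT image
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sino = radon(phantom, theta=theta, circle=True)      # synthetic sinogram (line integrals)

def simulate_dose(sino, photons_per_ray):
    """Mimic a given tube-current level by Poisson-resampling transmitted counts."""
    mu = sino / sino.max() * 4.0                     # rescale to plausible attenuation values
    counts = rng.poisson(photons_per_ray * np.exp(-mu))
    counts = np.maximum(counts, 1)                   # avoid log(0) at photon starvation
    return -np.log(counts / photons_per_ray) * sino.max() / 4.0

full_dose = iradon(simulate_dose(sino, 1e5), theta=theta, circle=True, filter_name="ramp")
low_dose = iradon(simulate_dose(sino, 1e4), theta=theta, circle=True, filter_name="ramp")
print("noise std, full vs low dose:",
      np.std(full_dose - phantom), np.std(low_dose - phantom))
```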

  10. Effect of crystalline lens surfaces and refractive index on image quality by model simulation analysis

    Institute of Scientific and Technical Information of China (English)

    Meimei Kong; Zhishan Gao; Lei Chen; Xinhua Li

    2008-01-01

    The surfaces and refractive index of the crystalline lens play an important role in the optical performance of the human eye. On the basis of two eye models which are widely applied at present, the effect of the lens surfaces and its refractive index distribution on optical imaging is analyzed with the optical design software ZEMAX (Zemax Development Co., San Diego, USA). The result shows that good image quality can be provided by aspheric lens surfaces and/or a gradient-index (GRIN) distribution. This has great potential in the design of intraocular lenses (IOLs). Eye models with an intraocular implantation are presented.

  11. Modelling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Casetti, E.; Vogt, W.G.; Mickle, M.H.

    1984-01-01

    This conference includes papers on the uses of supercomputers, multiprocessors, artificial intelligence and expert systems in various energy applications. Topics considered include knowledge-based expert systems for power engineering, a solar air conditioning laboratory computer system, multivariable control systems, the impact of power system disturbances on computer systems, simulating shared-memory parallel computers, real-time image processing with multiprocessors, and network modeling and simulation of greenhouse solar systems.

  12. Image simulation and a model of noise power spectra across a range of mammographic beam qualities

    Energy Technology Data Exchange (ETDEWEB)

    Mackenzie, Alistair, E-mail: alistairmackenzie@nhs.net; Dance, David R.; Young, Kenneth C. [National Coordinating Centre for the Physics of Mammography, Royal Surrey County Hospital, Guildford GU2 7XX, United Kingdom and Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Diaz, Oliver [Centre for Vision, Speech and Signal Processing, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH, United Kingdom and Computer Vision and Robotics Research Institute, University of Girona, Girona 17071 (Spain)

    2014-12-15

    Purpose: The aim of this work is to create a model to predict the noise power spectra (NPS) for a range of mammographic radiographic factors. The noise model was necessary to degrade images acquired on one system to match the image quality of different systems for a range of beam qualities. Methods: Five detectors and x-ray systems [Hologic Selenia (ASEh), Carestream computed radiography CR900 (CRc), GE Essential (CSI), Carestream NIP (NIPc), and Siemens Inspiration (ASEs)] were characterized for this study. The signal transfer property was measured as the pixel value against absorbed energy per unit area (E) at a reference beam quality of 28 kV, Mo/Mo or 29 kV, W/Rh with 45 mm polymethyl methacrylate (PMMA) at the tube head. The contributions of the three noise sources (electronic, quantum, and structure) to the NPS were calculated by fitting a quadratic at each spatial frequency of the NPS against E. A quantum noise correction factor, which was dependent on beam quality, was quantified using a set of images acquired over a range of radiographic factors with different thicknesses of PMMA. The noise model was tested for images acquired at 26 kV, Mo/Mo with 20 mm PMMA and 34 kV, Mo/Rh with 70 mm PMMA for three detectors (ASEh, CRc, and CSI) over a range of exposures. The NPS were modeled with and without the noise correction factor and compared with the measured NPS. A previous method for adapting an image to appear as if acquired on a different system was modified to allow the reference beam quality to be different from the beam quality of the image. The method was validated by adapting the ASEh flat field images with two thicknesses of PMMA (20 and 70 mm) to appear with the imaging characteristics of the CSI and CRc systems. Results: The quantum noise correction factor rises with higher beam qualities, except for CR systems at high spatial frequencies, where a flat response was found against mean photon energy. This is due to the dominance of secondary quantum noise
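
    The separation of electronic, quantum and structure noise described above relies on the NPS at each spatial frequency varying as a quadratic in the absorbed energy E: a constant electronic term, a linear quantum term and a quadratic structure term. The sketch below shows that per-frequency fit on synthetic data; the coefficients and E values are invented for illustration and are not the measured values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic exposure levels (absorbed energy per unit area, arbitrary units).
E = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
freqs = np.linspace(0.1, 5.0, 50)            # spatial frequencies [mm^-1]

# Invented "true" per-frequency coefficients: electronic, quantum, structure.
elec = 1e-6 * np.ones_like(freqs)
quant = 5e-6 * np.exp(-freqs / 3.0)
struct = 2e-7 * np.exp(-freqs / 1.5)

# Simulated NPS measurements: NPS(f, E) = elec + quant*E + struct*E^2, with 2% scatter.
nps = elec[None, :] + quant[None, :] * E[:, None] + struct[None, :] * E[:, None] ** 2
nps *= rng.normal(1.0, 0.02, size=nps.shape)

# Per-frequency quadratic fit in E recovers the three noise components.
fitted = np.array([np.polyfit(E, nps[:, i], 2) for i in range(len(freqs))])
struct_fit, quant_fit, elec_fit = fitted[:, 0], fitted[:, 1], fitted[:, 2]
print("max relative error (quantum term):",
      np.max(np.abs(quant_fit - quant) / quant))
```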

  13. Modelling of AlAs/GaAs interfacial structures using high-angle annular dark field (HAADF) image simulations.

    Science.gov (United States)

    Robb, Paul D; Finnie, Michael; Craven, Alan J

    2012-07-01

    High angle annular dark field (HAADF) image simulations were performed on a series of AlAs/GaAs interfacial models using the frozen-phonon multislice method. Three general types of models were considered: perfect, vicinal/sawtooth, and diffusion. These were chosen to demonstrate how HAADF image measurements are influenced by different interfacial structures in the technologically important III-V semiconductor system. For each model, interfacial sharpness was calculated as a function of depth and compared to aberration-corrected HAADF experiments on two types of AlAs/GaAs interfaces. The results show that the sharpness measured from HAADF imaging changes in a complicated manner with thickness for complex interfacial structures. For vicinal structures, it was revealed that the type of material the probe first projects through has a significant effect on the measured sharpness. An increase in the vicinal angle was also shown to generate a wider interface in the random step model. The Moison diffusion model produced an increase in the interface width with depth which closely matched the experimental results for the AlAs-on-GaAs interface. In contrast, the interface width decreased as a function of depth in the linear diffusion model. Only in the case of the perfect model was it possible to ascertain the underlying structure directly from HAADF image analysis.

  14. Segmented medical images based simulations of Cardiac electrical activity and electrocardiogram: a model comparison

    OpenAIRE

    Pierre, Charles; Rousseau, Olivier; Bourgault, Yves

    2009-01-01

    The purpose of this work is to compare the action potential and electrocardiogram computed with the monodomain and bidomain models, using a patient-based two-dimensional geometry of the heart-torso. The pipeline from CT scans to image segmentation with an in-house level set method, then to mesh generation, is detailed in the article. Our segmentation technique is based on a new iterative Chan-Vese method. The bidomain model and its approximation, called the "adapted" monodomain model, are nex...

  15. The ASTROID Simulator Software Package: Realistic Modelling of High-Precision High-Cadence Space-Based Imaging

    CERN Document Server

    Marcos-Arenal, P; De Ridder, J; Huygen, R; Aerts, C

    2014-01-01

    The preparation of a space-mission that carries out any kind of imaging to detect high-precision low-amplitude variability of its targets requires a robust model for the expected performance of its instruments. This model cannot be derived from simple addition of noise properties due to the complex interaction between the various noise sources. While it is not feasible to build and test a prototype of the imaging device on-ground, realistic numerical simulations in the form of an end-to-end simulator can be used to model the noise propagation in the observations. These simulations not only allow studying the performance of the instrument, its noise source response and its data quality, but also the instrument design verification for different types of configurations, the observing strategy and the scientific feasibility of an observing proposal. In this way, a complete description and assessment of the objectives to expect from the mission can be derived. We present a high-precision simulation software packag...

  16. Single molecule dynamics in a virtual cell: a three-dimensional model that produces simulated fluorescence video-imaging data.

    Science.gov (United States)

    Mashanov, Gregory I

    2014-09-06

    The analysis of single molecule imaging experiments is complicated by the stochastic nature of single molecule events, by instrument noise, and by the limited information which can be gathered about any individual molecule observed. Consequently, it is important to cross-check experimental results using a model simulating single molecule dynamics (e.g. movements and binding events) in a virtual cell-like environment. The output of such a model should match the real data format, allowing researchers to compare simulated results with real experiments. The proposed model exploits the advantages of 'object-oriented' computing: first, the ability to create and manipulate a number of classes, each containing an arbitrary number of single molecule objects. These classes may include objects moving within the 'cytoplasm', objects moving at the 'plasma membrane', and static objects located inside the 'body'. The objects of a given class can interact with each other and/or with the objects of other classes according to their physical and chemical properties. Each model run generates a sequence of images, each containing summed images of all fluorescent objects emitting light under given illumination conditions, with realistic levels of noise and emission fluctuations. The model accurately reproduces reported single molecule experiments and predicts the outcome of future experiments.
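
    The object-oriented structure described above can be sketched very compactly: a class of diffusing molecule objects, stepped with Brownian increments and rendered into image frames as Gaussian spots with Poisson noise. The class name, diffusion coefficient, pixel size and PSF width below are illustrative placeholders, not parameters of the published model.

```python
import numpy as np

rng = np.random.default_rng(7)

class Molecule:
    """A single fluorophore diffusing in a 2D 'cytoplasm'."""
    def __init__(self, x, y, diffusion_um2_s):
        self.pos = np.array([x, y], dtype=float)      # position in micrometres
        self.d = diffusion_um2_s

    def step(self, dt):
        # Brownian displacement: std = sqrt(2*D*dt) per axis.
        self.pos += rng.normal(0.0, np.sqrt(2.0 * self.d * dt), size=2)

def render_frame(molecules, size_px=64, pixel_um=0.1, psf_sigma_px=1.5,
                 photons=200, background=10):
    """Sum Gaussian spots for all molecules and add Poisson shot noise."""
    yy, xx = np.mgrid[0:size_px, 0:size_px]
    frame = np.full((size_px, size_px), float(background))
    for m in molecules:
        cx, cy = m.pos / pixel_um
        frame += photons * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2)
                                  / (2.0 * psf_sigma_px ** 2))
    return rng.poisson(frame)

molecules = [Molecule(rng.uniform(1, 5), rng.uniform(1, 5), 0.1) for _ in range(20)]
movie = []
for _ in range(100):                      # 100 frames, 10 ms per frame
    for m in molecules:
        m.step(dt=0.01)
    movie.append(render_frame(molecules))
print(np.array(movie).shape)              # (100, 64, 64) simulated image stack
```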

  17. Simulations of the flipping images and microparameters of molecular orientations in liquids according to the molecule string model

    Institute of Scientific and Technical Information of China (English)

    Wang Li-Na; Zhao Xing-Yu; Zhang Li-Li; Huang Yi-Neng

    2012-01-01

    The relaxation dynamics of liquids is one of the fundamental problems in liquid physics, and it is also one of the key issues in understanding the glass transition mechanism. Studying the molecular orientation flipping images and the relevant microparameters of liquids will undoubtedly shed light on understanding and calculating the relaxation dynamics. In this paper, we first give five microparameters to describe the individual molecular string (MS) relaxation based on the dynamical Hamiltonian of the MS model, then simulate the images of the individual MS ensemble, and at the same time calculate the parameters of the equilibrium state. The results show that the main molecular orientation flipping image in liquids (including supercooled liquids) is similar to a random walk. In addition, two pairs of the parameters are equal, and one parameter can be ignored compared with the other. This conclusion will effectively reduce the difficulty of calculating the individual MS relaxation based on the single-molecule orientation flipping rate of the general Glauber type, as well as the computer simulation time of interacting MS relaxation. Moreover, the conclusion is of reference significance for solving and simulating the multi-state MS model.

  18. GX_Simulator: An Interactive Idl Widget Tool For Visualization And Simulation Of Imaging Spectroscopy Models And Data

    Science.gov (United States)

    Nita, Gelu M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A. A.; Kontar, E. P.

    2011-05-01

    An interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and/or X-ray spectra is presented. The object-based architecture of this application provides full interaction with local 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non thermal particle distribution models. By default, the application integrates IDL callable DLL and Shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. We illustrate the tool capacity and generality by a real-time computation of microwave and X-ray images from realistic magnetic structures obtained from nonlinear force-free field extrapolations. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.

  19. Simulating secondary waterflooding in heterogeneous rocks with variable wettability using an image-based, multiscale pore network model

    Science.gov (United States)

    Bultreys, Tom; Van Hoorebeke, Luc; Cnudde, Veerle

    2016-09-01

    The two-phase flow properties of natural rocks depend strongly on their pore structure and wettability, both of which are often heterogeneous throughout the rock. To better understand and predict these properties, image-based models are being developed. The resulting simulations are however problematic in several important classes of rocks with broad pore-size distributions. We present a new multiscale pore network model to simulate secondary waterflooding in these rocks, which may undergo wettability alteration after primary drainage. This novel approach makes it possible to include the effect of microporosity on the imbibition sequence without the need to describe each individual micropore. Instead, we show that fluid transport through unresolved pores can be taken into account in an upscaled fashion, by the inclusion of symbolic links between macropores, resulting in strongly decreased computational demands. Rules to describe the behavior of these links in the quasistatic invasion sequence are derived from percolation theory. The model is validated by comparison to a fully detailed network representation, which takes each separate micropore into account. Strongly and weakly water- and oil-wet simulations show good results, as do mixed-wettability scenarios with different pore-scale wettability distributions. We also show simulations on a network extracted from a micro-CT scan of Estaillades limestone, which yields good agreement with water-wet and mixed-wet experimental results.

  20. Medical image analysis methods in MR/CT-imaged acute-subacute ischemic stroke lesion: Segmentation, prediction and insights into dynamic evolution simulation models. A critical appraisal

    Science.gov (United States)

    Rekik, Islem; Allassonnière, Stéphanie; Carpenter, Trevor K.; Wardlaw, Joanna M.

    2012-01-01

    Over the last 15 years, basic thresholding techniques in combination with standard statistical correlation-based data analysis tools have been widely used to investigate different aspects of evolution of acute or subacute to late stage ischemic stroke in both human and animal data. Yet, a wave of biology-dependent and imaging-dependent issues is still untackled pointing towards the key question: “how does an ischemic stroke evolve?” Paving the way for potential answers to this question, both magnetic resonance (MRI) and CT (computed tomography) images have been used to visualize the lesion extent, either with or without spatial distinction between dead and salvageable tissue. Combining diffusion and perfusion imaging modalities may provide the possibility of predicting further tissue recovery or eventual necrosis. Going beyond these basic thresholding techniques, in this critical appraisal, we explore different semi-automatic or fully automatic 2D/3D medical image analysis methods and mathematical models applied to human, animal (rats/rodents) and/or synthetic ischemic stroke to tackle one of the following three problems: (1) segmentation of infarcted and/or salvageable (also called penumbral) tissue, (2) prediction of final ischemic tissue fate (death or recovery) and (3) dynamic simulation of the lesion core and/or penumbra evolution. To highlight the key features in the reviewed segmentation and prediction methods, we propose a common categorization pattern. We also emphasize some key aspects of the methods such as the imaging modalities required to build and test the presented approach, the number of patients/animals or synthetic samples, the use of external user interaction and the methods of assessment (clinical or imaging-based). Furthermore, we investigate how any key difficulties, posed by the evolution of stroke such as swelling or reperfusion, were detected (or not) by each method. In the absence of any imaging-based macroscopic dynamic model

  1. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    Science.gov (United States)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
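
    The time-activity curves mentioned above come from a compartmental pharmacokinetic model, i.e. a small system of linear ODEs describing tracer exchange between organs. The sketch below integrates a generic three-compartment system of that kind; the compartment structure and rate constants are illustrative assumptions, not the MAG3 parameter values used in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic three-compartment model: plasma -> kidneys -> bladder,
# with rate constants in 1/min (illustrative values only).
k_plasma_kidney = 0.25
k_kidney_bladder = 0.15

def dadt(t, a):
    plasma, kidney, bladder = a
    d_plasma = -k_plasma_kidney * plasma
    d_kidney = k_plasma_kidney * plasma - k_kidney_bladder * kidney
    d_bladder = k_kidney_bladder * kidney
    return [d_plasma, d_kidney, d_bladder]   # total activity is conserved

t_span = (0.0, 30.0)                          # 30-minute renography acquisition
t_eval = np.linspace(*t_span, 61)             # one sample every 30 s
sol = solve_ivp(dadt, t_span, [1.0, 0.0, 0.0], t_eval=t_eval)

# Each row is a time-activity curve that could be assigned to a phantom structure.
for name, curve in zip(("plasma", "kidney", "bladder"), sol.y):
    print(f"{name:8s} peak fraction = {curve.max():.2f} at t = {t_eval[curve.argmax()]:.1f} min")
```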

  2. Medical Image Registration and Surgery Simulation

    DEFF Research Database (Denmark)

    Bro-Nielsen, Morten

    1996-01-01

    This thesis explores the application of physical models in medical image registration and surgery simulation. The continuum models of elasticity and viscous fluids are described in detail, and this knowledge is used as a basis for most of the methods described here. Real-time deformable models for surgery simulation: real-time deformable models, using finite element models of linear elasticity, have been developed for surgery simulation; the time consumption of the finite element method is reduced dramatically by the use of condensation techniques, explicit inversion of the stiffness matrix … and the use of selective matrix-vector multiplication. Fluid medical image registration: a new and faster algorithm for non-rigid registration using viscous fluid models is presented; this algorithm replaces the core part of the original algorithm with multi-resolution convolution using a new filter, which …

  3. Terapixel imaging of cosmological simulations

    CERN Document Server

    Feng, Yu; Di Matteo, Tiziana; Khandai, Nishikanta; Sargent, Randy; Nourbakhsh, Illah; Dille, Paul; Bartley, Chris; Springel, Volker; Jana, Anirban; Gardner, Jeffrey

    2011-01-01

    The increasing size of cosmological simulations has led to the need for new visualization techniques. We focus on Smoothed Particle Hydrodynamics (SPH) simulations run with the GADGET code and describe methods for visually accessing the entire simulation at full resolution. The simulation snapshots are rastered and processed on supercomputers into images that are ready to be accessed through a web interface (GigaPan). This allows any scientist with a web browser to interactively explore simulation datasets in both the spatial and temporal dimensions, datasets which in their native format can be hundreds of terabytes in size or more. We present two examples: the first a static terapixel image of the MassiveBlack simulation, a P-GADGET SPH simulation with 65 billion particles, and the second an interactively zoomable animation of a different simulation with more than one thousand frames, each a gigapixel in size. Both are available for public access through the GigaPan web interface. We also make our imaging so…

  4. Simulation of photoacoustic imaging of microcracks in silicon wafers using a structure-changeable multilayered thermal diffusion model.

    Science.gov (United States)

    Nakata, Toshihiko; Kitamori, Takehiko; Sawada, Tsuguo

    2007-03-01

    The detection characteristics for photoacoustic imaging of microcracks in silicon wafers were theoretically and quantitatively investigated using a numerical simulation. The simulation is based on a one-dimensional multilayered thermal diffusion model coupled with the thermal-wave impedance of each layer; the layer structures are constructed along the wafer surface and vary according to the scanning position of the point heat source. As the modulation frequency was reduced, the spatial resolution of the temperature amplitude profile at the cracks decreased, showing good agreement with the experimentally obtained photoacoustic amplitude images. At a modulation frequency of 200 kHz, for cracks with narrow air gaps of up to 20 nm, which is much smaller than both the beam spot size of 1.5 µm and the thermal diffusion length of 12 µm, the temperature amplitude is twice that of regions without cracks, and the temperature contrast increases with increasing modulation frequency. These calculation results suggest the effectiveness of using a high modulation frequency, making it possible to detect microcracks on the order of 10 nm.

  5. SimVascular 2.0: an Integrated Open Source Pipeline for Image-Based Cardiovascular Modeling and Simulation

    Science.gov (United States)

    Lan, Hongzhi; Merkow, Jameson; Updegrove, Adam; Schiavazzi, Daniele; Wilson, Nathan; Shadden, Shawn; Marsden, Alison

    2015-11-01

    SimVascular (www.simvascular.org) is currently the only fully open source software package that provides a complete pipeline from medical-image-based modeling to patient-specific blood flow simulation and analysis. It was initially released in 2007 and has contributed to numerous advances in fundamental hemodynamics research, surgical planning, and medical device design. However, early versions had several major barriers preventing wider adoption by new users, large-scale application in clinical and research studies, and educational access. In recent years, SimVascular 2.0 has made significant progress by integrating open source alternatives for the expensive commercial libraries previously required for anatomic modeling, mesh generation and the linear solver. In addition, it has simplified the cross-platform compilation process, improved the graphical user interface and launched a comprehensive documentation website. Many enhancements and new features have been incorporated throughout the pipeline, such as 3-D segmentation, Boolean operations on discrete triangulated surfaces, and multi-scale coupling for closed-loop boundary conditions. In this presentation we briefly overview the modeling/simulation pipeline and the advances of the new SimVascular 2.0.

  6. Numerical simulation of imaging laser radar system

    Science.gov (United States)

    Han, Shaokun; Lu, Bo; Jiang, Ming; Liu, Xunliang

    2008-03-01

    Rational and effective design is the key to imaging laser radar system research, and the design must fully consider the interrelationships between the various system parameters in order to choose a suitable laser, detector and other components. Mathematical modeling and computer simulation are therefore effective tools for imaging laser radar system design. Starting from the range equation and statistical detection methods, this paper builds mathematical models of the laser radar system covering range coverage, detection probability, false-alarm rate and SNR. In setting up these models, the influence of the laser, the atmosphere, the detector and other factors on performance is fully considered, so that the models respond accurately to the real situation. On this basis, simulation software was designed using C# and Matlab.
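
    As an orientation to the kind of models the record describes, the sketch below combines a simplified lidar range equation with a Gaussian threshold-detection model to relate received power, SNR, detection probability and false-alarm rate. The equation form and every parameter value are illustrative assumptions, not taken from the paper.

```python
import math

def received_power(p_t, reflectivity, aperture_d, range_m, t_atm, eta_sys):
    """Simplified range equation for a diffuse (Lambertian) target that fills
    the beam: P_r = P_t * eta * (rho/pi) * (pi*D^2/4) / R^2 * T_atm^2."""
    rx_area = math.pi * aperture_d ** 2 / 4.0
    return p_t * eta_sys * (reflectivity / math.pi) * rx_area / range_m ** 2 * t_atm ** 2

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def detection_stats(snr, threshold):
    """Single-pulse detection and false-alarm probabilities for a threshold
    detector in additive Gaussian noise (noise standard deviation set to 1)."""
    return q_func(threshold - snr), q_func(threshold)

p_r = received_power(p_t=1e3, reflectivity=0.3, aperture_d=0.1,
                     range_m=2000.0, t_atm=0.8, eta_sys=0.5)
snr = p_r / 1e-9                      # assumed 1 nW noise-equivalent power
p_d, p_fa = detection_stats(snr, threshold=5.0)
print(f"P_r = {p_r:.2e} W, SNR = {snr:.1f}, Pd = {p_d:.3f}, Pfa = {p_fa:.1e}")
```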

  7. Radar Image Simulation: Validation of the Point Scattering Model. Volume 1

    Science.gov (United States)

    1977-09-01

    … specific receiver (of a specific radar), it will be found that a typical receiver consists of a local oscillator, mixer, preamplifier and post-amplifier … circle centered on the northwest corner of the power house of the Pickwick Landing Dam test site. This means that the reference scene simulations …

  8. Simulation of ultrasound backscatter images from fish

    DEFF Research Database (Denmark)

    Pham, An Hoai; Stage, Bjarne; Hemmsen, Martin Christian

    2011-01-01

    The objective of this work is to investigate ultrasound (US) backscatter in the MHz range from fish in order to develop a realistic and reliable simulation model. The long-term objective of the work is to develop the signal processing needed for fish species differentiation using US. In in-vitro experiments … is 10 MHz and the Full Width at Half Maximum (FWHM) at the focus point is 0.54 mm in the lateral direction. The transducer model in Field II was calibrated using a wire phantom to validate the simulated point spread function. The inputs to the simulation were the CT image data of the fish converted …

  9. Human eye cataract microstructure modeling and its effect on simulated retinal imaging

    Science.gov (United States)

    Fan, Wen-Shuang; Chang, Chung-Hao; Horng, Chi-Ting; Yao, Hsin-Yu; Sun, Han-Ying; Huang, Shu-Fang; Wang, Hsiang-Chen

    2017-02-01

    We designed a crystalline lens microstructure representing cataract lesions and calculated the aberrations of the eye by ray-trace modeling to identify the corresponding spherical aberration, coma and trefoil aberration values under different degrees of pathological change. The relationship between microstructure and aberration was then discussed using these values. The calculations showed that, as the number of microstructure layers increased, the influence on spherical aberration was the greatest. In addition, the influence of a relatively compact microstructure on spherical aberration and coma aberration was small, but its influence on trefoil aberration was large.

  10. End-to-End Image Simulator for Optical Imaging Systems: Equations and Simulation Examples

    Directory of Open Access Journals (Sweden)

    Peter Coppo

    2013-01-01

    Full Text Available The theoretical description of a simplified end-to-end software tool for simulation of data produced by optical instruments, starting from either synthetic or airborne hyperspectral data, is described, and some simulation examples of hyperspectral and panchromatic images for existing and future design instruments are also reported. High spatial/spectral resolution images with low intrinsic noise and the sensor/mission specifications are used as inputs for the simulations. The examples reported in this paper show the capabilities of the tool for simulating target detection scenarios, data quality assessment with respect to classification performance and class discrimination, impact of optical design on image quality, and 3D modelling of optical performance. The simulator is conceived as a tool (during phase 0/A) for the specification and early development of new Earth observation optical instruments, whose compliance with user requirements is achieved through a process of cost/performance trade-off. The Selex Galileo simulator, as compared with other existing image simulators for phase C/D projects of space-borne instruments, implements all modules necessary for a complete panchromatic and hyperspectral image simulation, and it allows excellent flexibility and expandability for new integrated functions because of the adopted IDL-ENVI software environment.

  11. Simulating Kinect Infrared and Depth Images.

    Science.gov (United States)

    Landau, Michael J; Choo, Benjamin Y; Beling, Peter A

    2016-12-01

    With the emergence of the Microsoft Kinect sensor, many developer communities and research groups have found countless uses and have already published a wide variety of papers that utilize the raw depth images for their specific goals. New methods and applications that use the device generally require an appropriately large ensemble of data sets with accompanying ground truth for testing purposes, as well as accurate models that account for the various systematic and stochastic contributors to Kinect errors. Current error models, however, overlook the intermediate infrared (IR) images that directly contribute to noisy depth estimates. We, therefore, propose a high fidelity Kinect IR and depth image predictor and simulator that models the physics of the transmitter/receiver system, unique IR dot pattern, disparity/depth processing technology, and random intensity speckle and IR noise in the detectors. The model accounts for important characteristics of Kinect's stereo triangulation system, including depth shadowing, IR dot splitting, spreading, and occlusions, correlation-based disparity estimation between windows of measured and reference IR images, and subpixel refinement. Results show that the simulator accurately produces axial depth error from imaged flat surfaces with various tilt angles, as well as the bias and standard lateral error of an object's horizontal and vertical edge.
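
    The record models the disparity/depth processing of the Kinect's stereo triangulation system. As a pared-down orientation, the sketch below triangulates depth from disparity, perturbs the disparity with correlation noise and quantises it to subpixel steps, which reproduces the familiar growth of axial depth error with distance. The focal length, baseline, quantisation step and noise level are nominal values assumed for illustration, not the calibrated parameters of the paper.

```python
import numpy as np

FOCAL_PX = 580.0        # IR camera focal length in pixels (assumed)
BASELINE_M = 0.075      # projector/camera baseline in metres (assumed)
DISP_STEP = 1.0 / 8.0   # subpixel quantisation of the disparity estimate (assumed)

def simulate_depth(true_depth_m, disp_noise_px=0.07, rng=np.random.default_rng(0)):
    """Project depth to disparity, add matching noise, quantise, re-triangulate."""
    true_disp = FOCAL_PX * BASELINE_M / true_depth_m
    meas_disp = true_disp + rng.normal(0.0, disp_noise_px)
    meas_disp = np.round(meas_disp / DISP_STEP) * DISP_STEP
    return FOCAL_PX * BASELINE_M / meas_disp

for z in (1.0, 2.0, 4.0):
    samples = np.array([simulate_depth(z) for _ in range(2000)])
    print(f"true {z} m -> std {samples.std() * 1000:.1f} mm")   # error grows ~ z^2
```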

  12. Modelling capillary trapping using finite-volume simulation of two-phase flow directly on micro-CT images

    Science.gov (United States)

    Raeini, Ali Q.; Bijeljic, Branko; Blunt, Martin J.

    2015-09-01

    We study capillary trapping in porous media using direct pore-scale simulation of two-phase flow on micro-CT images of a Berea sandstone and a sandpack. The trapped non-wetting phase saturations are predicted by solving the full Navier-Stokes equations using a volume-of-fluid based finite-volume framework to simulate primary drainage followed by water injection. Using these simulations, we analyse the effects of initial non-wetting-phase saturation, capillary number and flow direction on the residual saturation. The predictions from our numerical method are in agreement with published experimental measurements of capillary trapping curves. This shows that our direct simulation method can be used to elucidate the effect of pore structure and flow pattern of capillary trapping and provides a platform to study the physics of multiphase flow at the pore scale.

  13. Modeling and Simulation Research of Infrared Image Noise

    Institute of Scientific and Technical Information of China (English)

    唐麟; 刘琳; 苏君红

    2014-01-01

    Noise is a major cause of image degradation and reduced video quality, and simulating infrared imaging noise and its impact is the core of establishing an image degradation model. The various noise signals considered in infrared imaging noise analysis are ultimately reflected in the thermal imager's output as a variety of imaging noises. In this paper, the main noise signals that can arise in the thermal imaging process are mapped to random noise and fixed-pattern noise in the infrared image, and a noise model of the infrared image is established; noise characteristic parameters extracted from the output video of an actual thermal imaging system are used as the input of the infrared image noise simulation system. The simulated static image noise of the infrared imaging system is close to the actual infrared image noise, which lays a foundation for modeling the image degradation of thermal imagers.
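
    The two noise families named in the record, temporally random noise and fixed-pattern noise, can be illustrated with the small sketch below: the fixed pattern is drawn once and added to every frame, while the random component is re-drawn per frame. The noise amplitudes and the column-stripe structure of the fixed pattern are assumptions for illustration, not parameters extracted from a real imager.

```python
import numpy as np

def make_ir_noise_model(shape, sigma_temporal=2.0, sigma_fpn_col=1.0,
                        sigma_fpn_pix=0.5, seed=0):
    """Return a frame generator that adds fixed-pattern noise (constant over
    time: per-column stripes plus a static per-pixel offset) and temporally
    random noise (re-drawn every frame) to a clean radiometric frame."""
    rng = np.random.default_rng(seed)
    rows, cols = shape
    fpn = (rng.normal(0.0, sigma_fpn_col, size=(1, cols))        # column stripes
           + rng.normal(0.0, sigma_fpn_pix, size=(rows, cols)))  # pixel offsets

    def noisy(clean_frame):
        temporal = rng.normal(0.0, sigma_temporal, size=shape)
        return clean_frame + fpn + temporal
    return noisy

noisy = make_ir_noise_model((240, 320))
flat = np.full((240, 320), 100.0)
frame_a, frame_b = noisy(flat), noisy(flat)   # FPN repeats, temporal noise does not
```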

  14. Simulation modeling and arena

    CERN Document Server

    Rossetti, Manuel D

    2015-01-01

    Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problems sets, and software applications With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition als

  15. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis. This boo…

  16. Simulation of at-sensor radiance over land for proposed thermal channels of Imager payload onboard INSAT-3D satellite using MODTRAN model

    Indian Academy of Sciences (India)

    M R Pandya; D B Shah; H J Trivedi; S Panigrahy

    2011-02-01

    INSAT-3D is the new-generation Indian satellite designed for improved Earth observations through two payloads – Imager and Sounder. A study was conducted with the aim of simulating the satellite-level signal over land in the infrared channels of the Imager payload using the radiative transfer model MODTRAN. Satellite-level at-sensor radiance corresponding to all four infrared channels of the INSAT-3D Imager payload was obtained using MODTRAN, and the sensitivity of the at-sensor radiance was inferred as a function of the input parameters, namely surface temperature, emissivity, view angle and atmospheric water vapour, which is helpful in understanding the signal simulation scheme needed for retrieving a very critical parameter, namely land surface temperature.
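
    For orientation, a single-band thermal at-sensor radiance can be written as the surface-leaving term attenuated by the atmosphere plus the upwelling path radiance; the sketch below implements that standard form with a Planck blackbody function. The channel wavelength, transmittance, path radiance and downwelling radiance used here are illustrative numbers, not values produced by the MODTRAN runs of the paper.

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann (SI)

def planck_radiance(wavelength_um, temp_k):
    """Blackbody spectral radiance in W m^-2 sr^-1 um^-1."""
    lam = wavelength_um * 1e-6
    b = 2 * H * C**2 / (lam**5 * (math.exp(H * C / (lam * KB * temp_k)) - 1.0))
    return b * 1e-6   # per metre of wavelength -> per micrometre

def at_sensor_radiance(lst_k, emissivity, transmittance, path_radiance,
                       downwelling, wavelength_um):
    """L = tau * [eps * B(T_s) + (1 - eps) * L_down] + L_path (nadir view)."""
    surface = (emissivity * planck_radiance(wavelength_um, lst_k)
               + (1.0 - emissivity) * downwelling)
    return transmittance * surface + path_radiance

# Illustrative values for a ~10.8 um atmospheric window channel:
print(at_sensor_radiance(lst_k=300.0, emissivity=0.97, transmittance=0.85,
                         path_radiance=1.2, downwelling=6.0, wavelength_um=10.8))
```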

  17. Three dimensional image-based simulation of ultrasonic wave propagation in polycrystalline metal using phase-field modeling.

    Science.gov (United States)

    Nakahata, K; Sugahara, H; Barth, M; Köhler, B; Schubert, F

    2016-04-01

    When modeling ultrasonic wave propagation in metals, it is important to introduce mesoscopic crystalline structures because the anisotropy of the crystal structure and the heterogeneity of grains disturb ultrasonic waves. In this paper, a three-dimensional (3D) polycrystalline structure generated by multiphase-field modeling was introduced to ultrasonic simulation for nondestructive testing. 3D finite-element simulations of ultrasonic waves were validated and compared with visualization results obtained from laser Doppler vibrometer measurements. The simulation results and measurements showed good agreement with respect to the velocity and front shape of the pressure wave, as well as multiple scattering due to grains. This paper discussed the applicability of a transversely isotropic approach to ultrasonic wave propagation in a polycrystalline metal with columnar structures.

  18. Image simulation for automatic license plate recognition

    Science.gov (United States)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
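
    The second step of the framework, modeling the imaging transformations and distortions applied to a clean synthetic plate, can be sketched as a short chain of operations. The particular distortions and parameter values below are placeholders; in the record the distortion parameters are estimated from measurements of real plate images.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, rotate

def degrade_plate(plate, angle_deg=3.0, blur_sigma=1.2, gain=0.9, offset=10.0,
                  noise_sigma=4.0, rng=np.random.default_rng(1)):
    """Apply a small pose error, optical/motion blur, an illumination/contrast
    change and sensor noise to a rendered license plate image (grey, 0-255)."""
    img = rotate(plate, angle_deg, reshape=False, mode="nearest")
    img = gaussian_filter(img, blur_sigma)
    img = gain * img + offset
    img = img + rng.normal(0.0, noise_sigma, img.shape)
    return np.clip(img, 0.0, 255.0)

synthetic_plate = np.full((60, 240), 255.0)   # stand-in for a rendered plate layout
training_sample = degrade_plate(synthetic_plate)
```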

  19. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or model’s general inadequacy to its target should be blamed in the case of the model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling and validation, which shares the common epistemology with experimentation, to simulation. To explain reasons of their intermittent entanglement I propose a weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation generally applicable in practice and based on differences in epistemic strategies and scopes

  20. Theory Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Shlachter, Jack [Los Alamos National Laboratory

    2012-08-23

    Los Alamos has a long history in theory, modeling and simulation. We focus on multidisciplinary teams that tackle complex problems. Theory, modeling and simulation are tools to solve problems just like an NMR spectrometer, a gas chromatograph or an electron microscope. Problems should be used to define the theoretical tools needed and not the other way around. Best results occur when theory and experiments are working together in a team.

  1. Simulation modeling of carcinogenesis.

    Science.gov (United States)

    Ellwein, L B; Cohen, S M

    1992-03-01

    A discrete-time simulation model of carcinogenesis is described mathematically using recursive relationships between time-varying model variables. The dynamics of cellular behavior is represented within a biological framework that encompasses two irreversible and heritable genetic changes. Empirical data and biological supposition dealing with both control and experimental animal groups are used together to establish values for model input variables. The estimation of these variables is integral to the simulation process as described in step-by-step detail. Hepatocarcinogenesis in male F344 rats provides the basis for seven modeling scenarios which illustrate the complexity of relationships among cell proliferation, genotoxicity, and tumor risk.

  2. Three-dimensional modeling and simulation of asphalt concrete mixtures based on X-ray CT microstructure images

    Directory of Open Access Journals (Sweden)

    Hainian Wang

    2014-02-01

    Full Text Available X-ray CT (computed tomography) was used to scan an asphalt mixture specimen to obtain high-resolution continuous cross-section images and the meso-structure. Following the theory of three-dimensional (3D) reconstruction, the 3D reconstruction algorithm was investigated in this paper. The key to the reconstruction technique is the acquisition of the voxel positions and the relationship between pixel elements and nodes. A three-dimensional numerical model of the asphalt mixture specimen was created by a self-developed program. A splitting test was conducted to predict the stress distributions of the asphalt mixture and verify the rationality of the 3D model.

  3. Modeling and simulation of luminescence detection platforms.

    Science.gov (United States)

    Salama, Khaled; Eltoukhy, Helmy; Hassibi, Arjang; El-Gamal, Abbas

    2004-06-15

    Motivated by the design of an integrated CMOS-based detection platform, a simulation model for CCD and CMOS imager-based luminescence detection systems is developed. The model comprises four parts. The first portion models the process of photon flux generation from luminescence probes using ATP-based and luciferase label-based assay kinetics. An optics simulator is then used to compute the incident photon flux on the imaging plane for a given photon flux and system geometry. Subsequently, the output image is computed using a detailed imaging sensor model that accounts for photodetector spectral response, dark current, conversion gain, and various noise sources. Finally, signal processing algorithms are applied to the image to enhance detection reliability and hence increase the overall system throughput. To validate the model, simulation results are compared to experimental results obtained from a CCD-based system that was built to emulate the integrated CMOS-based platform.
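
    The record's imaging-sensor model maps the incident photon flux to an output image through quantum efficiency, dark current, conversion gain and several noise sources. The toy signal chain below illustrates that mapping; every numeric parameter is an assumption for illustration rather than a property of the CCD or CMOS imagers characterised in the paper.

```python
import numpy as np

def luminescence_image(photon_flux, exposure_s, qe=0.3, dark_e_per_s=25.0,
                       read_noise_e=10.0, full_well_e=30000, conv_gain=0.05,
                       rng=np.random.default_rng(2)):
    """Photon flux (photons/pixel/s) -> photoelectrons with shot noise and dark
    current -> read noise -> full-well clipping -> conversion gain (output DN)."""
    mean_electrons = photon_flux * exposure_s * qe + dark_e_per_s * exposure_s
    electrons = rng.poisson(mean_electrons).astype(float)      # shot + dark shot noise
    electrons += rng.normal(0.0, read_noise_e, size=electrons.shape)
    electrons = np.clip(electrons, 0.0, full_well_e)
    return electrons * conv_gain

flux = np.full((64, 64), 200.0)        # photons/pixel/s reaching the imaging plane
frame = luminescence_image(flux, exposure_s=5.0)
```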

  4. Simulations of optical microscope images

    Science.gov (United States)

    Germer, Thomas A.; Marx, Egon

    2006-03-01

    The resolution of an optical microscope is limited by the optical wavelengths used. However, there is no fundamental limit to the sensitivity of a microscope to small differences in any of a feature's dimensions. That is, those limits are determined by such things as the sensitivity of the detector array, the quality of the optical system, and the stability of the light source. The potential for using this nearly unbounded sensitivity has sparked interest in extending optical microscopy to the characterization of sub-wavelength structures created by photolithography and using that characterization for process control. In this paper, an analysis of the imaging of a semiconductor grating structure with an optical microscope will be presented. The analysis includes the effects of partial coherence in the illumination system, aberrations of both the illumination and the collection optics, non-uniformities in the illumination, and polarization. It can thus model just about any illumination configuration imaginable, including Koehler illumination, focused (confocal) illumination, or dark-field illumination. By propagating Jones matrices throughout the system, polarization control at the back focal planes of both illumination and collection can be investigated. Given a detailed characterization of the microscope (including aberrations), images can be calculated and compared to real data, allowing details of the grating structure to be determined, in a manner similar to that found in scatterometry.
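
    The analysis in the record treats partially coherent illumination; the classical way to compute such an image is Abbe's source-point summation, in which coherent images formed under each illumination direction of the extended source are added in intensity. The scalar sketch below does exactly that for a thin specimen, ignoring the aberrations, polarization and illumination non-uniformity that the paper does include; all parameters are generic.

```python
import numpy as np

def partially_coherent_image(field, wavelength, pixel, na_obj, na_ill, n_src=9):
    """Abbe-style partially coherent image of a thin specimen.
    'field' is the complex amplitude transmission sampled on a regular grid."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    pupil = (np.hypot(FX, FY) <= na_obj / wavelength).astype(float)  # objective cutoff

    # Discrete illumination directions filling the condenser aperture.
    s = np.linspace(-1.0, 1.0, n_src)
    SX, SY = np.meshgrid(s, s)
    keep = np.hypot(SX, SY) <= 1.0
    src_fx = SX[keep] * na_ill / wavelength
    src_fy = SY[keep] * na_ill / wavelength

    x = (np.arange(nx) - nx / 2) * pixel
    y = (np.arange(ny) - ny / 2) * pixel
    X, Y = np.meshgrid(x, y)

    image = np.zeros((ny, nx))
    for sfx, sfy in zip(src_fx, src_fy):
        tilt = np.exp(2j * np.pi * (sfx * X + sfy * Y))      # oblique plane wave
        filtered = np.fft.ifft2(np.fft.fft2(field * tilt) * pupil)
        image += np.abs(filtered) ** 2                        # incoherent sum over source
    return image / keep.sum()
```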

  5. Medical image archive node simulation and architecture

    Science.gov (United States)

    Chiang, Ted T.; Tang, Yau-Kuo

    1996-05-01

    It is a well known fact that managed care and new treatment technologies are revolutionizing the health care provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, 'filmless' radiology (and other imagery) systems, which generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data in a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computed Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as the diversified user access requirements, implementation of the MIADS will be a complex procedure. One of the key challenges to implementing a MIADS is to select a cost-effective, scalable system architecture to meet the ingest/retrieval performance requirements. We have performed an in-depth system engineering study, and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results. In particular, we emphasize system scalability and upgradability issues. Furthermore, we discuss our simulation results in detail. The simulations study the ingest/retrieval performance requirements based on different system configurations and architectures for variables such as workload, tape …

  6. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but the views have not succeeded in capturing the diversity of validation methods. The wide variety of models with regard to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. … of models has been somewhat narrow-minded, reducing the notion of validation to the establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation, leading …

  7. Coherent Scattering Imaging Monte Carlo Simulation

    Science.gov (United States)

    Hassan, Laila Abdulgalil Rafik

    Conventional mammography has poor contrast between healthy and cancerous tissues due to the small difference in attenuation properties. Coherent scatter potentially provides more information because interference of coherently scattered radiation depends on the average intermolecular spacing and can be used to characterize tissue types. However, typical coherent scatter analysis techniques are not compatible with rapid low-dose screening techniques. Coherent scatter slot scan imaging is a novel imaging technique which provides new information with higher contrast. In this work a simulation of coherent scatter was performed for slot scan imaging to assess its performance and provide system optimization. In coherent scatter imaging, the coherent scatter is exploited using a conventional slot scan mammography system with anti-scatter grids tilted at the characteristic angle of cancerous tissues. A Monte Carlo simulation was used to simulate the coherent scatter imaging. System optimization was performed across several parameters, including source voltage, tilt angle, grid distances, grid ratio, and shielding geometry. The contrast increased as the grid tilt angle increased beyond the characteristic angle for the modeled carcinoma. A grid tilt angle of 16 degrees yielded the highest contrast and signal-to-noise ratio (SNR). Also, contrast increased as the source voltage increased. Increasing the grid ratio improved contrast at the expense of decreasing SNR. A grid ratio of 10:1 was sufficient to give good contrast without reducing the intensity to the noise level. The optimal source-to-sample distance was determined to be such that the source is located at the focal distance of the grid. A carcinoma lump of 0.5 x 0.5 x 0.5 cm3 in size was detectable, which is reasonable considering the high noise due to the use of a relatively small number of incident photons for computational reasons. A further study is needed to examine the effect of breast density and breast thickness …

  8. The sensitivity of training image and integration of airborne 3D electromagnetic data in multiple-point geostatistical simulation and the impact on groundwater modeling

    Science.gov (United States)

    Jensen, K. H.; He, X.; Sonnenborg, T. O.; Jørgensen, F.

    2016-12-01

    Multiple-point geostatistical simulation (MPS) of geological structure has become popular in groundwater modeling in recent years. The method derives multiple-point structural information from a training image (TI) and as such is superior to the traditional two-point geostatistical approach. Its application in 3D simulations has been constrained by the difficulty of constructing a 3D TI. High-resolution 3D electromagnetic data can be used for defining a TI, but the data can also be used as secondary data for soft conditioning. An alternative approach for deriving a TI is to use the object-based unconditional simulation program TiGenerator. In this study we present different MPS simulations of the geological structure for a site in Denmark based on different scenarios regarding the TI and soft conditioning. The generated geostatistical realizations are used for developing groundwater models based on MODFLOW, and each of these models is calibrated against hydraulic head measurements using the inversion code PEST. Based on the calibrated flow models, the particle tracking code MODPATH is used to simulate probabilistic capture zones for abstraction wells. Comparing simulations of groundwater flow and probabilistic capture zones, comparable results are obtained whether the TI is derived directly from high-resolution geophysical data or generated by the TiGenerator, even for the probabilistic capture zones, which are highly sensitive to the geological structure. The study further suggests that soft conditioning in MPS is an effective way of integrating secondary data such as 3D airborne electromagnetic data (SkyTEM), leading to improved estimation of the geological structure, as evidenced by the resulting hydraulic parameter values. However, care should be taken when the same data source is used for defining the TI and for soft conditioning, as this may lead to a reduction in the uncertainty estimation.

  9. Visual acuity estimation from simulated images

    Science.gov (United States)

    Duncan, William J.

    Simulated images can provide insight into the performance of optical systems, especially those with complicated features. Many modern solutions for presbyopia and cataracts feature sophisticated power geometries or diffractive elements. Some intraocular lenses (IOLs) achieve multifocality through the use of a diffractive surface, and multifocal contact lenses have a radially varying power profile. These types of elements induce simultaneous vision and affect vision much differently than a monofocal ophthalmic appliance. With the myriad multifocal ophthalmics available on the market, it is difficult to compare or assess performance in ways that affect wearers of such appliances. Here we present software and algorithmic metrics that can be used to qualitatively and quantitatively compare ophthalmic element performance, with specific examples of bifocal intraocular lenses (IOLs) and multifocal contact lenses. We anticipate that this study, its methods and its results will serve as a starting point for more complex models of vision and visual acuity in settings where modeling is advantageous. Generating simulated images of real-scene scenarios is useful for patients in assessing vision quality with a certain appliance. Visual acuity estimation can serve as an important tool for the manufacturing and design of ophthalmic appliances.

  10. Integrating MODIS images in a water budget model for dynamic functioning and drought simulation of a Mediterranean forest in Tunisia

    Directory of Open Access Journals (Sweden)

    H. Chakroun

    2012-05-01

    Full Text Available The use of remote sensing at different spatio-temporal resolutions has become common during the last decades, since sensors offer many inputs to water budget estimation. Various water balance models use the LAI as a parameter for accounting for water interception, evapotranspiration, runoff and available ground water. The objective of the present work is to improve vegetation stress monitoring at the regional scale for a natural forested ecosystem. LAI-MODIS and spatialized vegetation, soil and climatic data have been integrated in a water budget model that simulates evapotranspiration and soil water content at a daily step. We first explore LAI-MODIS in the specific context of a Mediterranean natural ecosystem. Results showed that despite the coarse resolution of the LAI-MODIS product (1 km), it was possible to discriminate evergreen and coniferous vegetation, and that LAI values are influenced by the water-holding capacity of the underlying soil. The dynamics of vegetation have been integrated into the water budget model through weekly varying LAI-MODIS. The simulation results were analysed in terms of actual evapotranspiration, deficit of soil water to field capacity, and a vegetation stress index based on actual and potential evapotranspiration. Comparing the dynamic LAI variation afforded by MODIS to a hypothetical LAI held constant over the whole year corresponds to a 30% increase in fAPAR. A sensitivity analysis of the simulation outputs to this fAPAR variation reveals that the increases of both the deficit of soil water to field capacity and the stress index are respectively 18% and 27% (in terms of RMSE, these variations are respectively 1258 mm yr−1 and 11 days yr−1). These results are consistent with previous studies conducted at the local scale showing that an LAI increase is accompanied by an increase in stress conditions in Mediterranean natural ecosystems. In this study, we also showed that spatial modelling of drought conditions based on water budget simulations is an adequate tool for …
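
    The vegetation stress index referred to in the record is built from actual and potential evapotranspiration; a commonly used form, written here for orientation together with a stress-day count consistent with the days yr−1 figure quoted above (the paper's exact definition may differ), is:

```latex
% A common form of the evapotranspiration-based stress index (the paper's
% exact definition may differ), with a stress-day count matching the
% "days/yr" units quoted in the abstract:
\[
  I_s(t) = 1 - \frac{\mathrm{AET}(t)}{\mathrm{PET}(t)}, \qquad 0 \le I_s(t) \le 1,
\]
\[
  N_\mathrm{stress} = \sum_{t=1}^{365} \mathbf{1}\left[\, I_s(t) > I_\mathrm{crit} \,\right],
\]
% where AET and PET are the daily actual and potential evapotranspiration
% and I_crit is a stress threshold.
```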

  11. Research on hyperspectral dynamic scene and image sequence simulation

    Science.gov (United States)

    Sun, Dandan; Liu, Fang; Gao, Jiaobo; Sun, Kefeng; Hu, Yu; Li, Yu; Xie, Junhu; Zhang, Lei

    2016-10-01

    This paper presents a simulation method for hyperspectral dynamic scenes and image sequences, intended for hyperspectral equipment evaluation and target detection algorithm development. Because of its high spectral resolution, strong band continuity, anti-interference capability and other advantages, hyperspectral imaging technology has developed rapidly in recent years and is widely used in many areas such as optoelectronic target detection, military defense and remote sensing systems. Digital imaging simulation, as a crucial part of hardware-in-the-loop simulation, can be applied to testing and evaluating hyperspectral imaging equipment with lower development cost and a shorter development period. Meanwhile, visual simulation can produce a large amount of original image data under various conditions for hyperspectral image feature extraction and classification algorithms. Based on a radiation physics model and material characteristic parameters, this paper proposes a digital scene generation method. By building multiple sensor models for different bands and bandwidths, hyperspectral scenes in the visible, MWIR and LWIR bands, with spectral resolutions of 0.01 μm, 0.05 μm and 0.1 μm, have been simulated. The final dynamic scenes are highly realistic and run in real time, at frame rates up to 100 Hz. By saving all the scene grey data from the same viewpoint, an image sequence is obtained. The analysis results show that, whether in the infrared bands or the visible band, the greyscale variations of the simulated hyperspectral images are consistent with the theoretical analysis.

  12. Calibration and Validation of a Detailed Architectural Canopy Model Reconstruction for the Simulation of Synthetic Hemispherical Images and Airborne LiDAR Data

    Directory of Open Access Journals (Sweden)

    Magnus Bremer

    2017-02-01

    Full Text Available Canopy density measures such as the Leaf Area Index (LAI) have become standardized mapping products derived from airborne and terrestrial Light Detection And Ranging (aLiDAR and tLiDAR, respectively) data. A specific application of LiDAR point clouds is their integration into radiative transfer models (RTM) of varying complexity. Using, e.g., ray tracing, this allows flexible simulations of sub-canopy light conditions and the simulation of various sensors such as virtual hemispherical images or waveform LiDAR on a virtual forest plot. However, the direct use of LiDAR data in RTMs shows some limitations in the handling of noise, the derivation of surface areas per LiDAR point and the discrimination of solid and porous canopy elements. In order to address these issues, a strategy upgrading tLiDAR and Digital Hemispherical Photographs (DHP) into plausible 3D architectural canopy models is suggested. The presented reconstruction workflow creates an almost unbiased virtual 3D representation of branch and leaf surface distributions, minimizing systematic errors due to the object–sensor relationship. The models are calibrated and validated using DHPs. Using the 3D models for simulations, their capabilities for the description of leaf density distributions and the simulation of aLiDAR and DHP signatures are shown. At an experimental test site, the suitability of the models for systematically simulating and evaluating aLiDAR-based LAI predictions under various scan settings is proven. This strategy makes it possible to show the importance of laser point sampling density, as well as the diversity of scan angles and their quantitative effect on error margins.
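
    LAI estimates from hemispherical photographs or LiDAR returns are commonly obtained by inverting a Beer–Lambert gap-fraction relation; the standard form is given below for orientation (the paper's own estimator and symbols may differ):

```latex
% Beer-Lambert gap-fraction relation commonly used to retrieve LAI from
% hemispherical photographs or LiDAR returns:
\[
  P(\theta) = \exp\!\left(-\frac{G(\theta)\,\Omega\,\mathrm{LAI}}{\cos\theta}\right)
  \quad\Longrightarrow\quad
  \mathrm{LAI} = -\frac{\cos\theta\,\ln P(\theta)}{G(\theta)\,\Omega},
\]
% where P(theta) is the gap fraction at zenith angle theta, G(theta) the leaf
% projection function and Omega the clumping index.
```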

  13. microlith : Image Simulation for Biological Phase Microscopy

    CERN Document Server

    Mehta, Shalin B

    2013-01-01

    Accurate simulation of image formation remains under-exploited for biological phase microscopy methods that employ partially coherent illumination, despite being important for the design of imaging systems and the reconstruction algorithms. We present an open-source MATLAB toolbox, microlith (https://code.google.com/p/microlith), that provides accurate simulation of the 3D image of a thin specimen under any partially coherent imaging system, including coherent or incoherent systems. We demonstrate the accuracy of the microlith toolbox by comparing simulated images and experimental images of a phase-only Siemens star test target using dark field and differential interference contrast microscopes. The comparison leads to intriguing insights about the sensitivity of the dark-field microscope to sub-resolution features and effects of specimen birefringence on differential interference contrast.

  14. Remote Ultra-low Light Imaging (RULLI) For Space Situational Awareness (SSA): Modeling And Simulation Results For Passive And Active SSA

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C [Los Alamos National Laboratory; Shirey, Robert L [Los Alamos National Laboratory; Roggemann, Michael C [MICH TECH UNIV; Gudimetla, Rao [AFRL

    2008-01-01

    Remote Ultra-Low Light Imaging (RULLI) detectors are photon-limited detectors developed at Los Alamos National Laboratory. RULLI detectors provide a very high degree of temporal resolution for the arrival times of detected photo-events, but saturate at a photo-detection rate of about 10^6 photo-events per second. Rather than recording a conventional image, such as that output by a charge-coupled device (CCD) camera, the RULLI detector outputs a data stream consisting of the two-dimensional location and time of arrival of each detected photo-electron. Hence, there is no need to select a specific exposure time to accumulate photo-events prior to the data collection with a RULLI detector; this quantity can be optimized in post-processing. RULLI detectors have lower peak quantum efficiency (from as low as 5% to perhaps as much as 40% with modern photocathode technology) than back-illuminated CCDs (80% or higher). As a result of these factors, and the associated analyses of signal and noise, we have found that RULLI detectors can play two key new roles in SSA: passive imaging of exceedingly dim objects, and three-dimensional imaging of objects illuminated with an appropriate pulsed laser. In this paper we describe the RULLI detection model, compare it to a conventional CCD detection model, and present analytic and simulation results to show the limits of performance of RULLI detectors used for SSA applications at the AMOS field site.

  15. Monte Carlo simulations in small animal PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)

    2007-10-01

    This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated to small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited for modeling the microPET FOCUS system and for implementing realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The use of a microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting rate performance, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using ¹⁸F⁻ and [¹⁸F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results have shown that it is possible to simulate small animal PET acquisitions under realistic conditions, and they are expected to be useful for improving the quantitative analysis in PET mouse body studies.

  16. Using super-resolution technique to elucidate the effects of imaging resolution on transport properties resulting from pore-scale modelling simulations

    Science.gov (United States)

    Karsanina, Marina; Gerke, Kirill; Khirevich, Siarhei; Sizonenko, Timofey; Korost, Dmitry

    2017-04-01

    Permeability is one of the fundamental properties of porous media and is required for large-scale Darcian fluid flow and mass transport models. Whilst permeability can be directly measured at a range of scales, there are increasing opportunities to evaluate permeability from pore-scale simulations. It is well known that the single-phase flow properties of digital rocks depend on the resolution of the 3D pore image. Such studies are usually performed by coarsening X-ray microtomography scans. Recently we have proposed a novel approach to fuse multi-scale porous media images using stochastic reconstruction techniques based on directional correlation functions. Here we apply this slightly modified approach, a stochastic super-resolution method, to create 3D pore images of different spatial resolution. Contrary to coarsening techniques, this approach preserves porosity values and allows the incorporation of fine-scale data coming from such imaging techniques as SEM or FIB-SEM. We compute the absolute permeability of the same porous media samples under different resolutions, using lattice-Boltzmann and finite-difference methods to model Stokes flow, in order to elucidate the effects of image resolution on the resulting permeability values, and we compare the stochastic super-resolution technique against the conventional coarsening image-processing technique. References: 1) Karsanina, M. V., Gerke, K. M., Skvortsova, E. B. and Mallants, D. (2015). Universal spatial correlation functions for describing and reconstructing soil microstructure. PLoS ONE 10(5), e0126515. 2) Gerke, K. M. and Karsanina, M. V. (2015). Improving stochastic reconstructions by weighting correlation functions in an objective function. EPL (Europhysics Letters), 111(5), 56002. 3) Gerke, K. M., Karsanina, M. V., Vasilyev, R. V. and Mallants, D. (2014). Improving pattern reconstruction using directional correlation functions. EPL (Europhysics Letters), 106(6), 66002. 4) Gerke, K. M., Karsanina, M. V. and Mallants, D. (2015). Universal …

  17. Regional deposition of particles in an image-based airway model: large-eddy simulation and left-right lung ventilation asymmetry.

    Science.gov (United States)

    Lambert, Andrew R; O'Shaughnessy, Patrick; Tawhai, Merryn H; Hoffman, Eric A; Lin, Ching-Long

    2011-01-01

    Regional deposition and ventilation of particles by generation, lobe and lung during steady inhalation in a computed tomography (CT) based human airway model are investigated numerically. The airway model consists of a seven-generation human airway tree, with oral cavity, pharynx and larynx. The turbulent flow in the upper respiratory tract is simulated by large-eddy simulation. The flow boundary conditions at the peripheral airways are derived from CT images at two lung volumes to produce physiologically-realistic regional ventilation. Particles with diameter equal to or greater than 2.5 microns are selected for study because smaller particles tend to penetrate to the more distal parts of the lung. The current generational particle deposition efficiencies agree well with existing measurement data. Generational deposition efficiencies exhibit similar dependence on particle Stokes number regardless of generation, whereas deposition and ventilation efficiencies vary by lobe and lung, depending on airway morphology and airflow ventilation. In particular, regardless of particle size, the left lung receives a greater proportion of the particle bolus as compared to the right lung in spite of greater flow ventilation to the right lung. This observation is supported by the left-right lung asymmetry of particle ventilation observed in medical imaging. It is found that the particle-laden turbulent laryngeal jet flow, coupled with the unique geometrical features of the airway, causes a disproportionate amount of particles to enter the left lung.
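
    Generational deposition efficiency in the record is characterized by the particle Stokes number; the standard impaction form is written below for orientation, with generic symbols (the paper's exact definition may differ slightly):

```latex
% Standard impaction form of the particle Stokes number that governs the
% generational deposition efficiency discussed in the record:
\[
  \mathrm{Stk} = \frac{\rho_p\, d_p^{2}\, C_c\, U}{18\,\mu\, D},
\]
% where rho_p and d_p are the particle density and diameter, C_c the Cunningham
% slip correction, U the mean airway flow velocity, mu the air viscosity and
% D the airway diameter; deposition by impaction rises with Stk.
```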

  18. Hyperspectral imaging simulation of object under sea-sky background

    Science.gov (United States)

    Wang, Biao; Lin, Jia-xuan; Gao, Wei; Yue, Hui

    2016-10-01

    Remote sensing image simulation plays an important role in spaceborne/airborne payload demonstration and algorithm development. Hyperspectral imaging is valuable in marine monitoring and search and rescue. To meet the demand for spectral imaging of objects in complex sea scenes, a physics-based simulation method for spectral images of objects under a sea-sky background is proposed. By developing an imaging simulation model that accounts for the object, the background, atmospheric conditions and the sensor, it is possible to examine the influence of changes in wind speed, atmospheric conditions and other environmental factors on spectral image quality in complex sea scenes. First, the sea scattering model is established based on the Phillips sea spectrum model, rough-surface scattering theory and the volume scattering characteristics of water. The measured bidirectional reflectance distribution function (BRDF) data of objects are fitted to a statistical model. MODTRAN software is used to obtain the solar illumination on the sea, the sky brightness, the atmospheric transmittance from sea to sensor and the atmospheric backscattered radiance, and a Monte Carlo ray tracing method is used to calculate the composite scattering of the sea-surface object and the spectral image. Finally, the object spectral image is obtained through spatial transformation, radiometric degradation and the addition of noise. The model connects the spectral image with the environmental parameters, the object parameters and the sensor parameters, providing a tool for payload demonstration and algorithm development.

  19. Clutter discrimination algorithm simulation in pulse laser radar imaging

    Science.gov (United States)

    Zhang, Yan-mei; Li, Huan; Guo, Hai-chao; Su, Xuan; Zhu, Fule

    2015-10-01

    Pulse laser radar imaging performance is greatly influenced by different kinds of clutter, and various algorithms have been developed to mitigate it. However, estimating the performance of a new algorithm is difficult. Here, a simulation model for evaluating clutter discrimination algorithms is presented. The model consists of laser pulse emission, clutter jamming, laser pulse reception and target image production. Additionally, a hardware platform was set up to gather clutter data reflected by the ground and trees; the logged data serve as the clutter-jamming input to the simulation model. The hardware platform includes a laser diode, a laser detector and a high-sample-rate data logging circuit. The laser diode transmits short laser pulses (40 ns FWHM) at a 12.5 kHz pulse rate and a 905 nm wavelength. An analog-to-digital converter chip integrated in the sampling circuit works at 250 megasamples per second. The simulation model and the hardware platform together form a clutter discrimination algorithm simulation system. Using this system, after analysing the logged clutter data, a new compound pulse detection algorithm was developed. This new algorithm combines a matched filter with constant fraction discrimination (CFD). First, the laser echo pulse signal is processed by the matched filter; CFD is then applied to the filter output. Finally, clutter jamming from the ground and trees is discriminated and the target image is produced. Laser radar images were simulated using the CFD algorithm, the matched filter algorithm and the new algorithm respectively. The simulation results demonstrate that the new algorithm achieves the best target imaging performance in mitigating clutter reflected by the ground and trees.
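
    The compound detector described in the record chains a matched filter with constant fraction discrimination (CFD). The sketch below shows the two stages on a toy waveform; the pulse shape matches the 40 ns FWHM / 250 MS/s figures quoted in the record, while the fraction, delay, arming threshold and clutter level are illustrative assumptions.

```python
import numpy as np

def matched_filter(signal, template):
    """Correlate the received waveform with the known transmit pulse shape."""
    kernel = template[::-1] - template.mean()
    return np.convolve(signal - signal.mean(), kernel, mode="same")

def cfd_crossings(pulse, fraction=0.3, delay=4, arm_level=0.0):
    """Constant fraction discrimination: zero crossings of (delayed - fraction*pulse)
    give amplitude-independent timing; only crossings above an arming level count."""
    shaped = np.roll(pulse, delay) - fraction * pulse
    idx = np.where((shaped[:-1] < 0.0) & (shaped[1:] >= 0.0))[0]
    return idx[pulse[idx] > arm_level]

fs = 250e6                                   # 250 MS/s, as in the record
t = np.arange(512) / fs
sigma = 40e-9 / 2.355                        # 40 ns FWHM Gaussian pulse
template = np.exp(-0.5 * ((t[:64] - t[32]) / sigma) ** 2)
rx = np.random.default_rng(0).normal(0.0, 0.2, t.size)   # clutter-like noise
rx[300:364] += 0.8 * template                             # target echo near sample 332
filtered = matched_filter(rx, template)
print(cfd_crossings(filtered, arm_level=0.5 * filtered.max()))
```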

  20. Calibrated Ultra Fast Image Simulations for the Dark Energy Survey

    CERN Document Server

    Bruderer, Claudio; Refregier, Alexandre; Amara, Adam; Berge, Joel; Gamper, Lukas

    2015-01-01

    Weak lensing by large-scale structure is a powerful technique to probe the dark components of the universe. To understand the measurement process of weak lensing and the associated systematic effects, image simulations are becoming increasingly important. For this purpose we present a first implementation of the Monte Carlo Control Loops (MCCL; Refregier & Amara 2014), a coherent framework for studying systematic effects in weak lensing. It allows us to model and calibrate the shear measurement process using image simulations from the Ultra Fast Image Generator (UFig; Berge et al. 2013). We apply this framework to a subset of the data taken during the Science Verification period (SV) of the Dark Energy Survey (DES). We calibrate the UFig simulations to be statistically consistent with DES images. We then perform tolerance analyses by perturbing the simulation parameters and study their impact on the shear measurement at the one-point level. This allows us to determine the relative im…

  1. Terahertz/mm wave imaging simulation software

    Science.gov (United States)

    Fetterman, M. R.; Dougherty, J.; Kiser, W. L., Jr.

    2006-10-01

    We have developed a mm wave/terahertz imaging simulation package from COTS graphic software and custom MATLAB code. In this scheme, a commercial ray-tracing package was used to simulate the emission and reflections of radiation from scenes incorporating highly realistic imagery. Accurate material properties were assigned to objects in the scenes, with values obtained from the literature, and from our own terahertz spectroscopy measurements. The images were then post-processed with custom Matlab code to include the blur introduced by the imaging system and noise levels arising from system electronics and detector noise. The Matlab code was also used to simulate the effect of fog, an important aspect for mm wave imaging systems. Several types of image scenes were evaluated, including bar targets, contrast detail targets, a person in a portal screening situation, and a sailboat on the open ocean. The images produced by this simulation are currently being used as guidance for a 94 GHz passive mm wave imaging system, but have broad applicability for frequencies extending into the terahertz region.
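
    The post-processing step named in the record (custom MATLAB code that adds system blur, receiver noise and, optionally, fog effects to ray-traced radiometric scenes) can be sketched in a few lines. The beam width, noise-equivalent temperature difference and fog parameters below are placeholder values used only to illustrate the processing chain, not measured parameters of the 94 GHz system.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mmwave_postprocess(scene_tb_k, beam_fwhm_px=6.0, nedt_k=0.5,
                       fog_transmission=1.0, fog_tb_k=280.0,
                       rng=np.random.default_rng(3)):
    """Blur an ideal brightness-temperature scene with the imaging beam, mix in
    emission from an attenuating fog layer, and add receiver noise (NEDT)."""
    blurred = gaussian_filter(scene_tb_k, beam_fwhm_px / 2.355)      # FWHM -> sigma
    foggy = fog_transmission * blurred + (1.0 - fog_transmission) * fog_tb_k
    return foggy + rng.normal(0.0, nedt_k, size=foggy.shape)

scene = np.full((128, 128), 290.0)        # ambient background
scene[48:80, 48:80] = 200.0               # colder (sky-reflecting) metal target
observed = mmwave_postprocess(scene, fog_transmission=0.8)
```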

  2. Accurate study of FosPeg® distribution in a mouse model using fluorescence imaging technique and fluorescence white monte carlo simulations

    DEFF Research Database (Denmark)

    Xie, Haiyan; Liu, Haichun; Svenmarker, Pontus

    2010-01-01

    Fluorescence imaging is used for quantitative in vivo assessment of drug concentration. Light attenuation in tissue is compensated for through Monte-Carlo simulations. The intrinsic fluorescence intensity, directly proportional to the drug concentration, could be obtained....

  3. Simulating Galaxies and Active Galactic Nuclei in the LSST Image Simulation Effort

    NARCIS (Netherlands)

    Pizagno, James; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A.; Chang, C.; Gibson, R. R.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, S. K.; Lorenz, S.; Marshall, S.; Shmakova, S. M.; Sylvestri, N.; Todd, N.; Young, M.

    2011-01-01

    We present an extragalactic source catalog, which includes galaxies and Active Galactic Nuclei, that is used for the Large Survey Synoptic Telescope Imaging Simulation effort. The galaxies are taken from the De Lucia et. al. (2006) semi-analytic modeling (SAM) of the Millennium Simulation. The LSST

  4. Wake modeling and simulation

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Madsen Aagaard, Helge; Larsen, Torben J.;

    , have the potential to include also mutual wake interaction phenomenons. The basic conjecture behind the dynamic wake meandering (DWM) model is that wake transportation in the atmospheric boundary layer is driven by the large scale lateral- and vertical turbulence components. Based on this conjecture...... and trailed vorticity, has been approached by a simple semi-empirical model essentially based on an eddy viscosity philosophy. Contrary to previous attempts to model wake loading, the DWM approach opens for a unifying description in the sense that turbine power- and load aspects can be treated simultaneously...... methodology has been implemented in the aeroelastic code HAWC2, and example simulations of wake situations, from the small Tjæreborg wind farm, have been performed showing satisfactory agreement between predictions and measurements...

  5. Seismic Imaging of Sandbox Models

    Science.gov (United States)

    Buddensiek, M. L.; Krawczyk, C. M.; Kukowski, N.; Oncken, O.

    2009-04-01

    Analog sandbox simulations have been applied to study structural geological processes to provide qualitative and quantitative insights into the evolution of mountain belts and basins. These sandbox simulations provide either two-dimensional and dynamic or pseudo-three-dimensional and static information. To extend the dynamic simulations to three dimensions, we combine the analog sandbox simulation techniques with seismic physical modeling of these sandbox models. The long-term objective of this approach is to image seismic and seismological events of static and actively deforming 3D analog models. To achieve this objective, a small-scale seismic apparatus, composed of a water tank, a PC control unit including piezo-electric transducers, and a positioning system, was built for laboratory use. For the models, we use granular material such as sand and glass beads, so that the simulations can evolve dynamically. The granular models are required to be completely water saturated so that the sources and receivers are directly and well coupled to the propagating medium. Ultrasonic source frequencies (˜500 kHz) corresponding to wavelengths ˜5 times the grain diameter are necessary to be able to resolve small scale structures. In three experiments of different two-layer models, we show that (1) interfaces of layers of granular materials can be resolved depending on the interface preparation more than on the material itself. Secondly, we show that the dilation between the sand grains caused by a string that has been pulled through the grains, simulating a shear zone, causes a reflection that can be detected in the seismic data. In the third model, we perform a seismic reflection survey across a model that contains both the prepared interface and a shear zone, and apply 2D-seismic reflection processing to improve the resolution. Especially for more complex models, the clarity and penetration depth need to be improved to study the evolution of geological structures in dynamic

  6. 3D Rapid Prototyping for Otolaryngology—Head and Neck Surgery: Applications in Image-Guidance, Surgical Simulation and Patient-Specific Modeling

    Science.gov (United States)

    Chan, Harley H. L.; Siewerdsen, Jeffrey H.; Vescan, Allan; Daly, Michael J.; Prisman, Eitan; Irish, Jonathan C.

    2015-01-01

    The aim of this study was to demonstrate the role of advanced fabrication technology across a broad spectrum of head and neck surgical procedures, including applications in endoscopic sinus surgery, skull base surgery, and maxillofacial reconstruction. The initial case studies demonstrated three applications of rapid prototyping technology in head and neck surgery: i) a mono-material paranasal sinus phantom for endoscopy training, ii) a multi-material skull base simulator and iii) 3D patient-specific mandible templates. Digital processing of these phantoms is based on real patient or cadaveric 3D images such as CT or MRI data. Three endoscopic sinus surgeons examined the realism of the endoscopist training phantom. One experienced endoscopic skull base surgeon conducted advanced sinus procedures on the high-fidelity multi-material skull base simulator. Ten patients participated in a prospective clinical study examining patient-specific modeling for mandibular reconstructive surgery. Qualitative feedback to assess the realism of the endoscopy training phantom and high-fidelity multi-material phantom was acquired. Conformance comparisons using assessments from the blinded reconstructive surgeons measured the geometric performance between intra-operative and pre-operative reconstruction mandible plates. Both the endoscopy training phantom and the high-fidelity multi-material phantom received positive feedback on the realistic structure of the phantom models. Results suggested that further improvement of the soft tissue structure of the phantom models is necessary. In the patient-specific mandible template study, the pre-operative plates were judged by two blinded surgeons as providing optimal conformance in 7 out of 10 cases. No statistical differences were found in plate fabrication time and conformance, with pre-operative plating providing the advantage of reducing time spent in the operating room. The applicability of common model design and fabrication techniques

  7. 3D Rapid Prototyping for Otolaryngology-Head and Neck Surgery: Applications in Image-Guidance, Surgical Simulation and Patient-Specific Modeling.

    Directory of Open Access Journals (Sweden)

    Harley H L Chan

    Full Text Available The aim of this study was to demonstrate the role of advanced fabrication technology across a broad spectrum of head and neck surgical procedures, including applications in endoscopic sinus surgery, skull base surgery, and maxillofacial reconstruction. The initial case studies demonstrated three applications of rapid prototyping technology in head and neck surgery: i) a mono-material paranasal sinus phantom for endoscopy training, ii) a multi-material skull base simulator and iii) 3D patient-specific mandible templates. Digital processing of these phantoms is based on real patient or cadaveric 3D images such as CT or MRI data. Three endoscopic sinus surgeons examined the realism of the endoscopist training phantom. One experienced endoscopic skull base surgeon conducted advanced sinus procedures on the high-fidelity multi-material skull base simulator. Ten patients participated in a prospective clinical study examining patient-specific modeling for mandibular reconstructive surgery. Qualitative feedback to assess the realism of the endoscopy training phantom and high-fidelity multi-material phantom was acquired. Conformance comparisons using assessments from the blinded reconstructive surgeons measured the geometric performance between intra-operative and pre-operative reconstruction mandible plates. Both the endoscopy training phantom and the high-fidelity multi-material phantom received positive feedback on the realistic structure of the phantom models. Results suggested that further improvement of the soft tissue structure of the phantom models is necessary. In the patient-specific mandible template study, the pre-operative plates were judged by two blinded surgeons as providing optimal conformance in 7 out of 10 cases. No statistical differences were found in plate fabrication time and conformance, with pre-operative plating providing the advantage of reducing time spent in the operating room. The applicability of common model design and

  8. Simulating realistic imaging conditions for in situ liquid microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Welch, David A., E-mail: dawelch@ucdavis.edu [Department of Chemical Engineering and Materials Science, University of California, Davis, CA (United States); Faller, Roland [Department of Chemical Engineering and Materials Science, University of California, Davis, CA (United States); Evans, James E. [Pacific Northwest National Laboratory, Environmental Molecular Sciences Laboratory, Richland, WA (United States); Browning, Nigel D. [Pacific Northwest National Laboratory, Fundamental Computational Sciences Directorate, Richland, WA (United States)

    2013-12-15

    In situ transmission electron microscopy enables the imaging of biological cells, macromolecular protein complexes, nanoparticles, and other systems in a near-native environment. In order to improve interpretation of image contrast features and also predict ideal imaging conditions ahead of time, new virtual electron microscopic techniques are needed. A technique for virtual fluid-stage high-angle annular dark-field scanning transmission electron microscopy with the multislice method is presented that enables the virtual imaging of model fluid-stage systems composed of millions of atoms. The virtual technique is exemplified by simulating images of PbS nanoparticles under different imaging conditions and the results agree with previous experimental findings. General insight is obtained on the influence of the effects of fluid path length, membrane thickness, nanoparticle position, defocus and other microscope parameters on attainable image quality. - Highlights: • Image simulation has been performed to understand in situ electron microscopy experiments. • Experimentally observed resolution of in situ grown PbS nanoparticles has been virtually reproduced. • General relationships between image resolution and in situ holder design, defocus, and particle size have been determined. • The presented image simulation technique can predict the obtainable resolution of future experiments.

  9. Delay modeling in logic simulation

    Energy Technology Data Exchange (ETDEWEB)

    Acken, J. M.; Goldstein, L. H.

    1980-01-01

    As digital integrated circuit size and complexity increases, the need for accurate and efficient computer simulation increases. Logic simulators such as SALOGS (SAndia LOGic Simulator), which utilize transition states in addition to the normal stable states, provide more accurate analysis than is possible with traditional logic simulators. Furthermore, the computational complexity of this analysis is far lower than that of circuit simulation such as SPICE. An eight-value logic simulation environment allows the use of accurate delay models that incorporate both element response and transition times. Thus, timing simulation with an accuracy approaching that of circuit simulation can be accomplished with an efficiency comparable to that of logic simulation. 4 figures.

  10. Simulation Analysis of Cylindrical Panoramic Image Mosaic

    Directory of Open Access Journals (Sweden)

    ZHU Ningning

    2017-04-01

    Full Text Available With the rise of virtual reality (VR) technology, panoramic images are being used more and more widely. They are usually obtained by multi-camera stitching that relies on homography matrices and image transformations; however, this approach destroys the collinearity condition and makes 3D reconstruction and related work difficult. This paper proposes a new method for cylindrical panoramic image mosaicking, which sets the number of cameras, the imaging focal length, and the imaging position and attitude, simulates the mapping process of the multiple cameras, and constructs a cylindrical imaging equation from 3D points to the 2D image based on the photogrammetric collinearity equations. This cylindrical imaging equation can be used not only for panoramic stitching but also for precision analysis. Test results show that: ①the method can be used for panoramic stitching with multiple cameras and inclined imaging; ②the accuracy of panoramic stitching is affected by three kinds of parameter errors, namely focal length, displacement and rotation angle, of which the focal-length error can be corrected by image resampling, the displacement error is closely related to object distance, and the rotation-angle error is affected mainly by the number of cameras.
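
    A minimal sketch of a cylindrical imaging mapping of the kind described above, with an assumed geometry (camera at the cylinder axis, focal length expressed in pixels); the paper's exact collinearity-based formulation may differ.

```python
import numpy as np

def cylindrical_project(points, focal_px, center_col, center_row):
    """Map (N, 3) points X, Y, Z (camera at the origin, Z up) onto a cylindrical panorama."""
    X, Y, Z = points[:, 0], points[:, 1], points[:, 2]
    azimuth = np.arctan2(Y, X)                          # angle around the cylinder axis
    horizontal_range = np.hypot(X, Y)
    col = center_col + focal_px * azimuth               # column proportional to azimuth
    row = center_row - focal_px * Z / horizontal_range  # perspective division in the vertical
    return np.column_stack([col, row])

pts = np.array([[10.0, 2.0, 1.5],
                [ 5.0, -1.0, 0.3]])
print(cylindrical_project(pts, focal_px=800.0, center_col=2048.0, center_row=512.0))
```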

  11. Using Realistic MHD Simulations for Modeling and Interpretation of Quiet-Sun Observations with the Solar Dynamics Observatory Helioseismic and Magnetic Imager

    CERN Document Server

    Kitiashvili, Irina N; Lagg, Andreas

    2014-01-01

    The solar atmosphere is extremely dynamic, and many important phenomena develop on small scales that are unresolved in observations with the Helioseismic and Magnetic Imager (HMI) instrument on the Solar Dynamics Observatory (SDO). For correct calibration and interpretation, it is very important to investigate the effects of small-scale structures and dynamics on the HMI observables, such as Doppler shift, continuum intensity, spectral line depth, and width. We use 3D radiative hydrodynamics simulations of the upper turbulent convective layer and the atmosphere of the Sun, and a spectro-polarimetric radiative transfer code to study observational characteristics of the Fe I 6173A line observed by HMI in quiet-Sun regions. We use the modeling results to investigate the sensitivity of the line Doppler shift to plasma velocity, and also sensitivities of the line parameters to plasma temperature and density, and determine effective line formation heights for observations of solar regions located at different dista...

  12. Simulations of astronomical imaging phased arrays.

    Science.gov (United States)

    Saklatvala, George; Withington, Stafford; Hobson, Michael P

    2008-04-01

    We describe a theoretical procedure for analyzing astronomical phased arrays with overlapping beams and apply the procedure to simulate a simple example. We demonstrate the effect of overlapping beams on the number of degrees of freedom of the array and on the ability of the array to recover a source. We show that the best images are obtained using overlapping beams, contrary to common practice, and show how the dynamic range of a phased array directly affects the image quality.

  13. A virtual imaging platform for multi-modality medical image simulation.

    Science.gov (United States)

    Glatard, Tristan; Lartizien, Carole; Gibaud, Bernard; da Silva, Rafael Ferreira; Forestier, Germain; Cervenansky, Frédéric; Alessandrini, Martino; Benoit-Cattin, Hugues; Bernard, Olivier; Camarasu-Pop, Sorina; Cerezo, Nadia; Clarysse, Patrick; Gaignard, Alban; Hugonnard, Patrick; Liebgott, Hervé; Marache, Simon; Marion, Adrien; Montagnat, Johan; Tabary, Joachim; Friboulet, Denis

    2013-01-01

    This paper presents the Virtual Imaging Platform (VIP), a platform accessible at http://vip.creatis.insa-lyon.fr to facilitate the sharing of object models and medical image simulators, and to provide access to distributed computing and storage resources. A complete overview is presented, describing the ontologies designed to share models in a common repository, the workflow template used to integrate simulators, and the tools and strategies used to exploit computing and storage resources. Simulation results obtained in four image modalities and with different models show that VIP is versatile and robust enough to support large simulations. The platform currently has 200 registered users who consumed 33 years of CPU time in 2011.

  14. Simulating Realistic Imaging Conditions For In-Situ Liquid Microscopy

    Science.gov (United States)

    Welch, David A.; Faller, Roland; Evans, James E.; Browning, Nigel D.

    2013-01-01

    In situ transmission electron microscopy enables the imaging of biological cells, macromolecular protein complexes, nanoparticles, and other systems in a near-native environment. In order to improve interpretation of image contrast features and also predict ideal imaging conditions ahead of time, new virtual electron microscopic techniques are needed. A technique for virtual fluid-stage high-angle annular dark-field scanning transmission electron microscopy with the multislice method is presented that enables the virtual imaging of model fluid-stage systems composed of millions of atoms. The virtual technique is exemplified by simulating images of PbS nanoparticles under different imaging conditions and the results agree with previous experimental findings. General insight is obtained on the influence of the effects of fluid path length, membrane thickness, nanoparticle position, defocus and other microscope parameters on attainable image quality. PMID:23872040

  15. Simulating realistic imaging conditions for in situ liquid microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Welch, David A.; Faller, Roland; Evans, James E.; Browning, Nigel D.

    2013-12-01

    In situ transmission electron microscopy enables the imaging of biological cells, macromolecular protein complexes, nanoparticles, and other systems in a near-native environment. In order to improve interpretation of image contrast features and also predict ideal imaging conditions ahead of time, new virtual electron microscopic techniques are needed. A technique for virtual fluid-stage high-angle annular dark-field scanning transmission electron microscopy with the multislice method is presented that enables the virtual imaging of model fluid-stage systems composed of millions of atoms. The virtual technique is exemplified by simulating images of PbS nanoparticles under different imaging conditions and the results agree with previous experimental findings. General insight is obtained on the influence of the effects of fluid path length, membrane thickness, nanoparticle position, defocus and other microscope parameters on attainable image quality.

  16. [The model of adaptive primary image processing].

    Science.gov (United States)

    Dudkin, K N; Mironov, S V; Dudkin, A K; Chikhman, V N

    1998-07-01

    A computer model of adaptive segmentation of the 2D visual objects was developed. Primary image descriptions are realised via spatial frequency filters and feature detectors performing as self-organised mechanisms. Simulation of the control processes related to attention, lateral, frequency-selective and cross-orientation inhibition, determines the adaptive image processing.
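
    A minimal sketch of the first stage described above, namely spatial-frequency-selective filtering, here implemented as a simple difference-of-Gaussians band-pass bank; the scales and the filter form are assumptions of this sketch, not the authors' detectors.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def band_pass_bank(image, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Return one band-pass channel per neighbouring pair of Gaussian scales (DoG filters)."""
    blurred = [gaussian_filter(image, s) for s in sigmas]
    return [fine - coarse for fine, coarse in zip(blurred[:-1], blurred[1:])]

rng = np.random.default_rng(2)
img = rng.random((64, 64))        # stand-in for a 2D visual object
channels = band_pass_bank(img)    # three spatial-frequency channels for further processing
```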

  17. Monte-Carlo simulations and image reconstruction for novel imaging scenarios in emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gillam, John E. [The University of Sydney, Faculty of Health Sciences and The Brain and Mind Centre, Camperdown (Australia); Rafecas, Magdalena, E-mail: rafecas@imt.uni-luebeck.de [University of Lubeck, Institute of Medical Engineering, Ratzeburger Allee 160, 23538 Lübeck (Germany)

    2016-02-11

    Emission imaging incorporates both the development of dedicated devices for data acquisition as well as algorithms for recovering images from that data. Emission tomography is an indirect approach to imaging. The effect of device modification on the final image can be understood through both the way in which data are gathered, using simulation, and the way in which the image is formed from that data, or image reconstruction. When developing novel devices, systems and imaging tasks, accurate simulation and image reconstruction allow performance to be estimated, and in some cases optimized, using computational methods before or during the process of physical construction. However, there are a vast range of approaches, algorithms and pre-existing computational tools that can be exploited and the choices made will affect the accuracy of the in silico results and quality of the reconstructed images. On the one hand, should important physical effects be neglected in either the simulation or reconstruction steps, specific enhancements provided by novel devices may not be represented in the results. On the other hand, over-modeling of device characteristics in either step leads to large computational overheads that can confound timely results. Here, a range of simulation methodologies and toolkits are discussed, as well as reconstruction algorithms that may be employed in emission imaging. The relative advantages and disadvantages of a range of options are highlighted using specific examples from current research scenarios.
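
    As one concrete example of a reconstruction algorithm commonly employed in emission tomography, the following sketch runs a toy MLEM loop with a random system matrix; the record surveys many options and does not prescribe this particular algorithm or system model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_pix, n_bins = 16, 32
A = rng.random((n_bins, n_pix))          # toy system matrix: P(count in bin | emission in pixel)
x_true = rng.random(n_pix)               # unknown activity distribution
y = rng.poisson(A @ x_true * 50.0)       # noisy measured projections

x = np.ones(n_pix)                       # uniform initial image
sensitivity = A.sum(axis=0)              # per-pixel sensitivity
for _ in range(50):                      # multiplicative MLEM update
    projection = A @ x
    ratio = np.where(projection > 0, y / projection, 0.0)
    x = x / sensitivity * (A.T @ ratio)
```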

  18. Simulation of computed radiography with imaging plate detectors

    Energy Technology Data Exchange (ETDEWEB)

    Tisseur, D.; Costin, M. [CEA LIST, CEA Saclay 91191 Gif sur Yvette Cedex (France); Mathy, F. [CEA-LETI, Campus Minatec, F-38054, Grenoble (France); Schumm, A. [EDF R and D, 1 avenue du général de gaulle 92141 Clamart (France)

    2014-02-18

    Computed radiography (CR) using phosphor imaging plate detectors is playing an increasing role in radiographic testing. CR uses similar equipment to conventional radiography except that the classical X-ray film is replaced by a digital detector, called an image plate (IP), which is made of a photostimulable layer and which is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, decrease of source energies and thus reduction of the radioprotection area, besides being a solution without effluents. This paper presents a model for the simulation of radiography with image plate detectors in CIVA together with examples of validation of the model. The study consists of a cross-comparison between experimental and simulation results obtained on a step wedge with a classical X-ray tube. Results are presented in particular for the wire Image Quality Indicator (IQI) and the duplex IQI.

  19. Preparations, models, and simulations.

    Science.gov (United States)

    Rheinberger, Hans-Jörg

    2015-01-01

    This paper proposes an outline for a typology of the different forms that scientific objects can take in the life sciences. The first section discusses preparations (or specimens)--a form of scientific object that accompanied the development of modern biology in different guises from the seventeenth century to the present: as anatomical-morphological specimens, as microscopic cuts, and as biochemical preparations. In the second section, the characteristics of models in biology are discussed. They became prominent from the end of the nineteenth century onwards. Some remarks on the role of simulations--characterising the life sciences of the turn from the twentieth to the twenty-first century--conclude the paper.

  20. Simulation of High Quality Ultrasound Imaging

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Kortbek, Jacob; Nikolov, Svetoslav Ivanov

    2010-01-01

    This paper investigates whether the influence of physical transducers on image quality can be simulated with sufficient accuracy to reveal system performance. The influence is investigated in a comparative study between Synthetic Aperture Sequential Beamforming (SASB) and Dynamic Receive Focus

  1. Targets IMage Energy Regional (TIMER) Model, Technical Documentation

    NARCIS (Netherlands)

    Vries B de; Vuuren D van; Elzen M den; Janssen M; MNV

    2002-01-01

    The Targets IMage Energy Regional simulation model, TIMER, is described in detail. This model was developed and used in close connection with the Integrated Model to Assess the Global Environment (IMAGE) 2.2. The system-dynamics TIMER model simulates the global energy system at an intermediate level

  2. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease. The book will be of interest to researchers, PhD students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  3. Modelling and simulation of pixelated photon counting X-ray detectors for imaging; Modellierung und Simulation physikalischer Eigenschaften photonenzaehlender Roentgenpixeldetektoren fuer die Bildgebung

    Energy Technology Data Exchange (ETDEWEB)

    Durst, Juergen

    2008-07-22

    First of all the physics processes generating the energy deposition in the sensor volume are investigated. The spatial resolution limits of photon interactions and the range of secondary electrons are discussed. The signatures in the energy deposition spectrum in pixelated detectors with direct conversion layers are described. The energy deposition for single events can be generated by the Monte-Carlo-Simulation package ROSI. The basic interactions of photons with matter are evaluated, resulting in the ability to use ROSI as a basis for the simulation of photon counting pixel detectors with direct conversion. In the context of this thesis a detector class is developed to simulate the response of hybrid photon counting pixel detectors using high-Z sensor materials like Cadmium Telluride (CdTe) or Gallium Arsenide (GaAs) in addition to silicon. To enable the realisation of such a simulation, the relevant physics processes and properties have to be implemented: processes in the sensor layer (provided by EGS4/LSCAT in ROSI), generation of charge carriers as electron hole pairs, diffusion and repulsion of charge carriers during drift and lifetime. Furthermore, several noise contributions of the electronics can be taken into account. The result is a detector class which allows the simulation of photon counting detectors. In this thesis the multiplicity framework is developed, including a formula to calculate or measure the zero frequency detective quantum efficiency (DQE). To enable the measurement of the multiplicity of detected events a cluster analysis program was developed. Random and systematic errors introduced by the cluster analysis are discussed. It is also shown that the cluster analysis method can be used to determine the averaged multiplicity with high accuracy. The method is applied to experimental data. As an example using the implemented detector class, the discriminator threshold dependency of the DQE and modulation transfer function is investigated in

  4. USING REALISTIC MHD SIMULATIONS FOR THE MODELING AND INTERPRETATION OF QUIET-SUN OBSERVATIONS WITH THE SOLAR DYNAMICS OBSERVATORY HELIOSEISMIC AND MAGNETIC IMAGER

    Energy Technology Data Exchange (ETDEWEB)

    Kitiashvili, I. N. [NASA Ames Research Center, Moffett Field, Mountain View, CA 94035 (United States); Couvidat, S. [Stanford University, Stanford, CA 94305 (United States); Lagg, A. [Max Planck Institute for Solar System Research, Göttingen, D-37077 (Germany)

    2015-07-20

    The solar atmosphere is extremely dynamic, and many important phenomena develop on small scales that are unresolved in observations with the Helioseismic and Magnetic Imager (HMI) instrument on the Solar Dynamics Observatory. For correct calibration and interpretation of the observations, it is very important to investigate the effects of small-scale structures and dynamics on the HMI observables, such as Doppler shift, continuum intensity, spectral line depth, and width. We use 3D radiative hydrodynamics simulations of the upper turbulent convective layer and the atmosphere of the Sun, and a spectro-polarimetric radiative transfer code to study observational characteristics of the Fe i 6173 Å line observed by HMI in quiet-Sun regions. We use the modeling results to investigate the sensitivity of the line Doppler shift to plasma velocity, and also sensitivities of the line parameters to plasma temperature and density, and determine effective line formation heights for observations of solar regions located at different distances from the disk center. These estimates are important for the interpretation of helioseismology measurements. In addition, we consider various center-to-limb effects, such as convective blueshift, variations of helioseismic travel-times, and the “concave” Sun effect, and show that the simulations can qualitatively reproduce the observed phenomena, indicating that these effects are related to a complex interaction of the solar dynamics and radiative transfer.

  5. Retinal Image Simulation of Subjective Refraction Techniques.

    Science.gov (United States)

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient's response-guided refraction) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that virtually implements various subjective-refraction techniques--including the Jackson Cross-Cylinder test (JCC)--all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software's usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain deeper insight into and improve existing refraction techniques, and it can be used for simulated training.

  6. Quantifying the effect of tissue deformation on diffusion-weighted MRI: a mathematical model and an efficient simulation framework applied to cardiac diffusion imaging

    Science.gov (United States)

    Mekkaoui, Imen; Moulin, Kevin; Croisille, Pierre; Pousin, Jerome; Viallon, Magalie

    2016-08-01

    Cardiac motion presents a major challenge in diffusion weighted MRI, often leading to large signal losses that necessitate repeated measurements. The diffusion process in the myocardium is difficult to investigate because of the unqualified sensitivity of diffusion measurements to cardiac motion. A rigorous mathematical formalism is introduced to quantify the effect of tissue motion in diffusion imaging. The presented mathematical model, based on the Bloch-Torrey equations, takes into account deformations according to the laws of continuum mechanics. Approximating this mathematical model by using finite elements method, numerical simulations can predict the sensitivity of the diffusion signal to cardiac motion. Different diffusion encoding schemes are considered and the diffusion weighted MR signals, computed numerically, are compared to available results in literature. Our numerical model can identify the existence of two time points in the cardiac cycle, at which the diffusion is unaffected by myocardial strain and cardiac motion. Of course, these time points depend on the type of diffusion encoding scheme. Our numerical results also show that the motion sensitivity of the diffusion sequence can be reduced by using either spin echo technique with acceleration motion compensation diffusion gradients or stimulated echo acquisition mode with unipolar and bipolar diffusion gradients.
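
    For reference, a standard (non-deforming) form of the Bloch-Torrey equation for the complex transverse magnetization under a diffusion-encoding gradient is written below; the cited model extends this formulation to a moving medium via continuum mechanics, and that extension is not reproduced here.

```latex
% Bloch-Torrey equation for the complex transverse magnetization M_{xy}
% under a diffusion-encoding gradient G(t) (static-tissue form)
\frac{\partial M_{xy}(\mathbf{x},t)}{\partial t}
  = -\, i\, \gamma\, \bigl(\mathbf{G}(t)\cdot\mathbf{x}\bigr)\, M_{xy}(\mathbf{x},t)
  \; + \; \nabla \cdot \bigl( \mathbf{D}\, \nabla M_{xy}(\mathbf{x},t) \bigr)
  \; - \; \frac{M_{xy}(\mathbf{x},t)}{T_2}
```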

  7. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
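
    A minimal sketch of the two ingredients named above, the X-ray attenuation law along a ray and a contrast-to-noise ratio estimate for a small defect; the material values are hypothetical and, unlike the deterministic code described here, the noise is taken as simple Poisson statistics.

```python
import numpy as np

def transmitted_counts(n0, mu_per_cm, path_cm):
    """Expected detected counts after the listed material layers (Beer-Lambert attenuation law)."""
    return n0 * np.exp(-np.sum(np.asarray(mu_per_cm) * np.asarray(path_cm)))

n0 = 1.0e5                                                  # photons per pixel without object
background = transmitted_counts(n0, [0.6], [2.0])           # 2 cm of a steel-like material
defect = transmitted_counts(n0, [0.6, 0.0], [1.9, 0.1])     # same ray with a 1 mm void inside

cnr = abs(defect - background) / np.sqrt(background)        # Poisson noise: sigma = sqrt(counts)
print(f"background {background:.0f}, defect {defect:.0f}, CNR {cnr:.1f}")
```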

  8. GALSIM: The modular galaxy image simulation toolkit

    Science.gov (United States)

    Rowe, B. T. P.; Jarvis, M.; Mandelbaum, R.; Bernstein, G. M.; Bosch, J.; Simet, M.; Meyers, J. E.; Kacprzak, T.; Nakajima, R.; Zuntz, J.; Miyatake, H.; Dietrich, J. P.; Armstrong, R.; Melchior, P.; Gill, M. S. S.

    2015-04-01

    GALSIM is a collaborative, open-source project aimed at providing an image simulation tool of enduring benefit to the astronomical community. It provides a software library for generating images of astronomical objects such as stars and galaxies in a variety of ways, efficiently handling image transformations and operations such as convolution and rendering at high precision. We describe the GALSIM software and its capabilities, including necessary theoretical background. We demonstrate that the performance of GALSIM meets the stringent requirements of high precision image analysis applications such as weak gravitational lensing, for current datasets and for the Stage IV dark energy surveys of the Large Synoptic Survey Telescope, ESA's Euclid mission, and NASA's WFIRST-AFTA mission. The GALSIM project repository is public and includes the full code history, all open and closed issues, installation instructions, documentation, and wiki pages (including a Frequently Asked Questions section). The GALSIM repository can be found at https://github.com/GalSim-developers/GalSim.
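
    A short usage sketch based on GalSim's documented Python interface (profile, shear, PSF convolution, rendering, noise); the parameter values are arbitrary and not taken from the paper.

```python
import galsim

gal = galsim.Sersic(n=1.5, half_light_radius=1.2, flux=1.0e4)   # a galaxy profile
gal = gal.shear(g1=0.02, g2=0.0)                                # apply a weak-lensing shear
psf = galsim.Moffat(beta=3.0, fwhm=0.9)                         # an atmospheric-like PSF
obj = galsim.Convolve([gal, psf])                               # convolution handled by GalSim
image = obj.drawImage(nx=64, ny=64, scale=0.2)                  # render at 0.2 arcsec/pixel
image.addNoise(galsim.PoissonNoise(sky_level=100.0))            # simple Poisson noise
```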

  9. Featured Image: Modeling Supernova Remnants

    Science.gov (United States)

    Kohler, Susanna

    2016-05-01

    This image shows a computer simulation of the hydrodynamics within a supernova remnant. The mixing between the outer layers (where color represents the log of density) is caused by turbulence from the Rayleigh-Taylor instability, an effect that arises when the expanding core gas of the supernova is accelerated into denser shell gas. The past standard for supernova-evolution simulations was to perform them in one dimension and then, in post-processing, manually smooth out regions that undergo Rayleigh-Taylor turbulence (an intrinsically multidimensional effect). But in a recent study, Paul Duffell (University of California, Berkeley) has explored how a 1D model could be used to reproduce the multidimensional dynamics that occur in turbulence from this instability. For more information, check out the paper below!CitationPaul C. Duffell 2016 ApJ 821 76. doi:10.3847/0004-637X/821/2/76

  10. Notes on modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Redondo, Antonio [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  11. Evaluation of color error and noise on simulated images

    Science.gov (United States)

    Mornet, Clémence; Vaillant, Jérôme; Decroux, Thomas; Hérault, Didier; Schanen, Isabelle

    2010-01-01

    The evaluation of CMOS sensor performance in terms of color accuracy and noise is a big challenge for camera phone manufacturers. In this paper, we present a tool developed with Matlab at STMicroelectronics which allows quality parameters to be evaluated on simulated images. These images are computed based on measured or predicted Quantum Efficiency (QE) curves and a noise model. By setting the parameters of integration time and illumination, the tool optimizes the color correction matrix (CCM) and calculates the color error, color saturation and signal-to-noise ratio (SNR). After this color correction optimization step, a Graphical User Interface (GUI) has been designed to display a simulated image at a chosen illumination level, with all the characteristics of a real image taken by the sensor with the previous color correction. Simulated images can be a synthetic Macbeth ColorChecker, for which the reflectance of each patch is known, a multi-spectral image described by the reflectance spectrum of each pixel, or an image taken at a high light level. A validation of the results has been performed with sensors under development at ST. Finally, we present two applications: one based on the trade-off between color saturation and noise when optimizing the CCM, and the other based on demosaicking SNR trade-offs.
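
    A minimal sketch of the two quantities discussed above, applying a 3x3 color correction matrix to raw sensor RGB and propagating the noise through it to obtain an SNR; the matrix and noise figures are hypothetical, not STMicroelectronics values.

```python
import numpy as np

ccm = np.array([[ 1.8, -0.6, -0.2],
                [-0.4,  1.7, -0.3],
                [-0.1, -0.7,  1.8]])        # rows sum to 1 so grey is preserved

raw_rgb = np.array([120.0, 95.0, 60.0])     # mean raw signal on a grey patch [DN]
raw_noise = np.sqrt(raw_rgb + 4.0 ** 2)     # shot noise plus 4 DN read noise, per channel

corrected = ccm @ raw_rgb
# Assuming independent channel noise, the CCM propagates the variances:
corrected_noise = np.sqrt((ccm ** 2) @ (raw_noise ** 2))
snr_db = 20 * np.log10(corrected / corrected_noise)
print(corrected, snr_db)
```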

  12. [Vernier Anode Design and Image Simulation].

    Science.gov (United States)

    Zhao, Ai-rong; Ni, Qi-liang; Song, Ke-fei

    2015-12-01

    MCP-based position-sensitive-anode photon-counting imaging detectors excel at detecting extremely faint light. Such a detector comprises a micro-channel plate (MCP), a position-sensitive anode and readout electronics, and its performance is mainly determined by the position-sensitive anode. As a charge-division anode, the Vernier anode uses cyclically varying electrode areas in place of the linearly varying electrodes of the wedge-and-strip anode, giving better resolution and a greater electrode dynamic range. The simulation and design of a Vernier anode based on the Vernier decoding principle are presented here. First, we introduce the decoding and design principle of a nine-electrode Vernier anode in vector form and obtain the design parameters, namely the pitch, amplitude and coarse wavelength of the electrodes. Second, we analyze the effect of each design parameter on the imaging of the detector. We simulate the electron cloud, the Vernier anode and the detector imaging using LabVIEW software, obtain the relationship between the pitch and the coarse wavelength of the anode, and also obtain the corresponding electron cloud for the design parameters. Based on the simulation results and practical machining constraints, a nine-electrode Vernier anode was designed and fabricated with a pitch of 891 µm, an insulation width of 25 µm, an amplitude of 50 µm and 5 coarse pixels.

  13. Evaluating uncertainty in simulation models

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
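
    For reference, the quadratic loss mentioned above and the first-order variance-based importance measure that arises from it for an input variable x_i can be written as below; the notation is assumed and may differ from the report's.

```latex
L = \bigl( y - \tilde{y} \bigr)^{2},
\qquad
S_i = \frac{\operatorname{Var}\bigl( \mathbb{E}[\, y \mid x_i \,] \bigr)}{\operatorname{Var}(y)}
```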

  14. Global observations of glyoxal columns from OMI/Aura and GOME-2/Metop-A sensors and comparison with multi-year simulations by the IMAGES model

    Science.gov (United States)

    Lerot, Christophe; Stavrakou, Trissevgeni; Hendrick, François; De Smedt, Isabelle; Müller, Jean-François; Volkamer, Rainer; Van Roozendael, Michel

    2015-04-01

    successively performed in Beijing and Xianghe, China, since 2008. Also, comparisons of the satellite data sets with simulations by the IMAGES chemistry transport model show generally good correlation. Sensitivity tests on the VOC emissions used in the model will also be discussed. Lerot, C., Stavrakou, T., De Smedt, I., Müller, J.-F., and Van Roozendael, M.: Glyoxal vertical columns from GOME-2 backscattered light measurements and comparisons with a global model, Atmos. Chem. Phys., 10, 12059-12072, doi:10.5194/acp-10-12059-2010, 2010.

  15. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...... in case of such faults. The design of the controller is described and its performance assessed by simulations. The control strategies are explained and the behaviour of the turbine discussed....

  16. Simulation Model of a Transient

    DEFF Research Database (Denmark)

    Jauch, Clemens; Sørensen, Poul; Bak-Jensen, Birgitte

    2005-01-01

    This paper describes the simulation model of a controller that enables an active-stall wind turbine to ride through transient faults. The simulated wind turbine is connected to a simple model of a power system. Certain fault scenarios are specified and the turbine shall be able to sustain operation...

  17. Investigation of Simulating Radar Images Concerning the Multipath Scattering Effect

    Institute of Scientific and Technical Information of China (English)

    Yang Chun-hua; Zhu Guo-qiang

    2004-01-01

    In a composite system consisting of a target and a rough surface, the electromagnetic scattering mechanism, especially multipath scattering, is investigated. Using a physical-optics double-bounce algorithm, a multipath scattering model of the system is established. Driven by a wideband radar signal and based on a fractal rough surface, the artificial echo of the target is obtained by means of the established multipath scattering model. By simulating one-dimensional imaging of the target with this artificial echo, two kinds of range profiles are obtained: one originates from the target and the other from the multipath scattering effect.

  18. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    Science.gov (United States)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer simulation of ladar models to predict the performance of a ladar system. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, and surveys computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. The paper notes the limited scale, non-unified design and narrow applications of domestic research in imaging ladar system simulation, most of which achieves only simple function simulation based on ladar ranging equations. A laser imaging radar simulation with an open, modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated function inputs and outputs and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was carried out with the toolbox. The simulation results show that the models and parameter settings of the Matlab toolbox can reproduce the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection, the open structure allows the toolbox to be modified for specialized requests, and the modularization gives simulations flexibility.

  19. Biocomputing: numerical simulation of glioblastoma growth using diffusion tensor imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bondiau, Pierre-Yves [Institut National de Recherche en Informatique et Automatique, 2004 Route des Lucioles, 06 902 Sophia Antipolis (France); Clatz, Olivier [Institut National de Recherche en Informatique et Automatique, 2004 Route des Lucioles, 06 902 Sophia Antipolis (France); Sermesant, Maxime [Institut National de Recherche en Informatique et Automatique, 2004 Route des Lucioles, 06 902 Sophia Antipolis (France); Marcy, Pierre-Yves [Departement de Radiotherapie, Centre Antoine Lacassagne, 33 av de Valombrose, 06189 Nice (France); Delingette, Herve [Institut National de Recherche en Informatique et Automatique, 2004 Route des Lucioles, 06 902 Sophia Antipolis (France); Frenay, Marc [Departement d' Oncologie Medicale, Centre Antoine Lacassagne, 33 av de Valombrose, 06189 Nice (France); Ayache, Nicholas [Institut National de Recherche en Informatique et Automatique, 2004 Route des Lucioles, 06 902 Sophia Antipolis (France)

    2008-02-21

    Glioblastoma multiforme (GBM) is one of the most aggressive tumors of the central nervous system. It can be represented by two components: a proliferative component with a mass effect on brain structures and an invasive component. GBM has a distinct pattern of spread showing preferential growth along the white-fiber direction for the invasive component. By using the architecture of white matter fibers, we propose a new model to simulate the growth of GBM. This architecture is estimated by diffusion tensor imaging in order to determine the preferred direction for the diffusion component. It is then coupled with a mechanical component. To set up our growth model, we make a brain atlas including brain structures with a distinct response to tumor aggressiveness, white fiber diffusion tensor information and elasticity. In this atlas, we introduce a virtual GBM with a mechanical component coupled with a diffusion component. These two components are complementary, and can be tuned independently. Then, we tune the parameter set of our model using a patient MRI. We have compared the simulated growth (initialized with the patient MRI) with the observed growth six months later. The average and the odds ratio of the image difference between observed and simulated images are computed. Displacements of reference points are compared to those simulated by the model. The results of our simulation have shown a good correlation with tumor growth, as observed in the patient's MRI. Different degrees of tumor aggressiveness can also be simulated by tuning additional parameters. This work has demonstrated that modeling the complex behavior of brain tumors is feasible and allows for further validation of this new conceptual approach.
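
    A minimal 2D sketch of the diffusion-plus-proliferation idea described above: tumor cell density spreads anisotropically along an assumed fiber direction encoded in a diffusion tensor, with logistic growth. It is purely illustrative and omits the atlas, the mechanical component and the DTI-derived tensors of the actual model.

```python
import numpy as np

n, dt, dx = 64, 0.1, 1.0
rho = 0.05                                   # proliferation rate
dxx, dyy, dxy = 1.0, 0.2, 0.0                # diffusion tensor favouring the x ("fiber") direction

u = np.zeros((n, n))
u[n // 2, n // 2] = 1.0                      # initial tumor seed

def step(u):
    """One explicit finite-difference step of anisotropic diffusion plus logistic growth."""
    uxx = (np.roll(u, -1, 1) - 2 * u + np.roll(u, 1, 1)) / dx ** 2
    uyy = (np.roll(u, -1, 0) - 2 * u + np.roll(u, 1, 0)) / dx ** 2
    uxy = (np.roll(np.roll(u, -1, 0), -1, 1) - np.roll(np.roll(u, -1, 0), 1, 1)
           - np.roll(np.roll(u, 1, 0), -1, 1) + np.roll(np.roll(u, 1, 0), 1, 1)) / (4 * dx ** 2)
    growth = rho * u * (1.0 - u)
    return u + dt * (dxx * uxx + dyy * uyy + 2 * dxy * uxy + growth)

for _ in range(200):
    u = step(u)                              # density spreads faster along x than along y
```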

  20. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to make a status of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop and dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); advance status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); methods of disturbances and sensitivity analysis of nuclear data in reactor physics, application to VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  1. IVOA Recommendation: Simulation Data Model

    CERN Document Server

    Lemson, Gerard; Cervino, Miguel; Gheller, Claudio; Gray, Norman; LePetit, Franck; Louys, Mireille; Ooghe, Benjamin; Wagner, Rick; Wozniak, Herve

    2014-01-01

    In this document and the accompanying documents we describe a data model (Simulation Data Model) describing numerical computer simulations of astrophysical systems. The primary goal of this standard is to support discovery of simulations by describing those aspects of them that scientists might wish to query on, i.e. it is a model for meta-data describing simulations. This document does not propose a protocol for using this model. IVOA protocols are being developed and are supposed to use the model, either in its original form or in a form derived from the model proposed here, but more suited to the particular protocol. The SimDM has been developed in the IVOA Theory Interest Group with assistance of representatives of relevant working groups, in particular DM and Semantics.

  2. Numerical Simulations of MREIT Conductivity Imaging for Brain Tumor Detection

    Directory of Open Access Journals (Sweden)

    Zi Jun Meng

    2013-01-01

    Full Text Available Magnetic resonance electrical impedance tomography (MREIT) is a new modality capable of imaging the electrical properties of human body using MRI phase information in conjunction with external current injection. Recent in vivo animal and human MREIT studies have revealed unique conductivity contrasts related to different physiological and pathological conditions of tissues or organs. When performing in vivo brain imaging, small imaging currents must be injected so as not to stimulate peripheral nerves in the skin, while delivery of imaging currents to the brain is relatively small due to the skull’s low conductivity. As a result, injected imaging currents may induce small phase signals and the overall low phase SNR in brain tissues. In this study, we present numerical simulation results of the use of head MREIT for brain tumor detection. We used a realistic three-dimensional head model to compute signal levels produced as a consequence of a predicted doubling of conductivity occurring within simulated tumorous brain tissues. We determined the feasibility of measuring these changes in a time acceptable to human subjects by adding realistic noise levels measured from a candidate 3 T system. We also reconstructed conductivity contrast images, showing that such conductivity differences can be both detected and imaged.

  3. Numerical simulations of MREIT conductivity imaging for brain tumor detection.

    Science.gov (United States)

    Meng, Zi Jun; Sajib, Saurav Z K; Chauhan, Munish; Sadleir, Rosalind J; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2013-01-01

    Magnetic resonance electrical impedance tomography (MREIT) is a new modality capable of imaging the electrical properties of human body using MRI phase information in conjunction with external current injection. Recent in vivo animal and human MREIT studies have revealed unique conductivity contrasts related to different physiological and pathological conditions of tissues or organs. When performing in vivo brain imaging, small imaging currents must be injected so as not to stimulate peripheral nerves in the skin, while delivery of imaging currents to the brain is relatively small due to the skull's low conductivity. As a result, injected imaging currents may induce small phase signals and the overall low phase SNR in brain tissues. In this study, we present numerical simulation results of the use of head MREIT for brain tumor detection. We used a realistic three-dimensional head model to compute signal levels produced as a consequence of a predicted doubling of conductivity occurring within simulated tumorous brain tissues. We determined the feasibility of measuring these changes in a time acceptable to human subjects by adding realistic noise levels measured from a candidate 3 T system. We also reconstructed conductivity contrast images, showing that such conductivity differences can be both detected and imaged.

  4. Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere

    Science.gov (United States)

    Fok, Mei-Ching H.

    2011-01-01

    Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves for the 3D distribution in space, energy and pitch angle of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been carried on past and current satellite missions, and the global morphology of energetic ions has been revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distribution during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINAMA mission.

  5. Simulation of electro-optical imaging system based on OpenGL

    Science.gov (United States)

    Zhu, Yong; Fu, Qiang; Duan, Jin; Jing, Wen-bo

    2013-08-01

    With the development of electro-optical imaging system technology and of simulation technology, and with the demand for optimizing theoretical models of new electro-optical imaging systems, more and more research institutes, colleges and universities are working on the simulation of electro-optical imaging systems, with increasingly good results. Simulation saves system design and development costs, and experiments that are complex or hard to reproduce can be carried out repeatedly. Taking into account the requirements of complex-environment construction, the fidelity required of an imaging simulation system, and the performance of the electro-optical imaging system, such a system is modeled here. The modeling has two aspects: scene characteristic modeling and electro-optical system modeling. Scene characteristic modeling constructs dynamic scenes in different kinds of complex environments using powerful OpenGL three-dimensional model visualization technology. Electro-optical system modeling consists of the optical system and the imaging detector. The simulation model of the electro-optical imaging system is established from an analysis of electro-optical imaging theory, combining a modular design concept with general interface technology. Different imaging effects are obtained by modifying the model's parameters. The experimental results show that the simulated images basically reflect the performance of the imaging system, so they can be used as an information source for imaging-system performance analysis. The approach provides a simple and feasible method for analyzing imaging system performance and has significant practical value.

  6. The land-use projections and resulting emissions in the IPCC SRES scenarios as simulated by the IMAGE 2.2 model

    NARCIS (Netherlands)

    Strengers, B.; Leemans, R.; Eickhout, B.; Vries, de B.; Bouwman, L.

    2004-01-01

    The Intergovernmental Panel on Climate Change (IPCC) developed a new series of emission scenarios (SRES). Six global models were used to develop SRES but most focused primarily on energy and industry related emissions. Land-use emissions were only covered by three models, where IMAGE included the mo

  7. Modeling and Simulation with INS.

    Science.gov (United States)

    Roberts, Stephen D.; And Others

    INS, the Integrated Network Simulation language, puts simulation modeling into a network framework and automatically performs such programming activities as placing the problem into a next event structure, coding events, collecting statistics, monitoring status, and formatting reports. To do this, INS provides a set of symbols (nodes and branches)…

  8. Simulation modeling of estuarine ecosystems

    Science.gov (United States)

    Johnson, R. W.

    1980-01-01

    A simulation model has been developed of Galveston Bay, Texas ecosystem. Secondary productivity measured by harvestable species (such as shrimp and fish) is evaluated in terms of man-related and controllable factors, such as quantity and quality of inlet fresh-water and pollutants. This simulation model used information from an existing physical parameters model as well as pertinent biological measurements obtained by conventional sampling techniques. Predicted results from the model compared favorably with those from comparable investigations. In addition, this paper will discuss remotely sensed and conventional measurements in the framework of prospective models that may be used to study estuarine processes and ecosystem productivity.

  9. Modeling and Simulating Environmental Effects

    OpenAIRE

    Guest, Peter S.; Murphree, Tom; Frederickson, Paul A.; Guest, Arlene A.

    2012-01-01

    MOVES Research & Education Systems Seminar: Presentation; Session 4: Collaborative NWDC/NPS M&S Research; Moderator: Curtis Blais; Modeling and Simulating Environmental Effects; speakers: Peter Guest, Paul Frederickson & Tom Murphree Environmental Effects Group

  10. TREAT Modeling and Simulation Strategy

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  11. Retrievals of formaldehyde from ground-based FTIR and MAX-DOAS observations at the Jungfraujoch station and comparisons with GEOS-Chem and IMAGES model simulations

    Science.gov (United States)

    Franco, B.; Hendrick, F.; Van Roozendael, M.; Müller, J.-F.; Stavrakou, T.; Marais, E. A.; Bovy, B.; Bader, W.; Fayt, C.; Hermans, C.; Lejeune, B.; Pinardi, G.; Servais, C.; Mahieu, E.

    2015-04-01

    As a ubiquitous product of the oxidation of many volatile organic compounds (VOCs), formaldehyde (HCHO) plays a key role as a short-lived and reactive intermediate in the atmospheric photo-oxidation pathways leading to the formation of tropospheric ozone and secondary organic aerosols. In this study, HCHO profiles have been successfully retrieved from ground-based Fourier transform infrared (FTIR) solar spectra and UV-visible Multi-AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) scans recorded during the July 2010-December 2012 time period at the Jungfraujoch station (Swiss Alps, 46.5° N, 8.0° E, 3580 m a.s.l.). Analysis of the retrieved products has revealed different vertical sensitivity between both remote sensing techniques. Furthermore, HCHO amounts simulated by two state-of-the-art chemical transport models (CTMs), GEOS-Chem and IMAGES v2, have been compared to FTIR total columns and MAX-DOAS 3.6-8 km partial columns, accounting for the respective vertical resolution of each ground-based instrument. Using the CTM outputs as the intermediate, FTIR and MAX-DOAS retrievals have shown consistent seasonal modulations of HCHO throughout the investigated period, characterized by a summertime maximum and a wintertime minimum. Such comparisons have also highlighted that FTIR and MAX-DOAS provide complementary products for the HCHO retrieval above the Jungfraujoch station. Finally, tests have revealed that the updated IR parameters from the HITRAN 2012 database have a cumulative effect and significantly decrease the retrieved HCHO columns with respect to the use of the HITRAN 2008 compilation.

  12. Retrievals of formaldehyde from ground-based FTIR and MAX-DOAS observations at the Jungfraujoch station and comparisons with GEOS-Chem and IMAGES model simulations

    Directory of Open Access Journals (Sweden)

    B. Franco

    2015-04-01

    Full Text Available As a ubiquitous product of the oxidation of many volatile organic compounds (VOCs), formaldehyde (HCHO) plays a key role as a short-lived and reactive intermediate in the atmospheric photo-oxidation pathways leading to the formation of tropospheric ozone and secondary organic aerosols. In this study, HCHO profiles have been successfully retrieved from ground-based Fourier transform infrared (FTIR) solar spectra and UV-visible Multi-AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) scans recorded during the July 2010–December 2012 time period at the Jungfraujoch station (Swiss Alps, 46.5° N, 8.0° E, 3580 m a.s.l.). Analysis of the retrieved products has revealed different vertical sensitivity between both remote sensing techniques. Furthermore, HCHO amounts simulated by two state-of-the-art chemical transport models (CTMs), GEOS-Chem and IMAGES v2, have been compared to FTIR total columns and MAX-DOAS 3.6–8 km partial columns, accounting for the respective vertical resolution of each ground-based instrument. Using the CTM outputs as the intermediate, FTIR and MAX-DOAS retrievals have shown consistent seasonal modulations of HCHO throughout the investigated period, characterized by a summertime maximum and a wintertime minimum. Such comparisons have also highlighted that FTIR and MAX-DOAS provide complementary products for the HCHO retrieval above the Jungfraujoch station. Finally, tests have revealed that the updated IR parameters from the HITRAN 2012 database have a cumulative effect and significantly decrease the retrieved HCHO columns with respect to the use of the HITRAN 2008 compilation.

  13. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  14. Model reduction for circuit simulation

    CERN Document Server

    Hinze, Michael; Maten, E Jan W Ter

    2011-01-01

    Simulation based on mathematical models plays a major role in computer aided design of integrated circuits (ICs). Decreasing structure sizes, increasing packing densities and driving frequencies require the use of refined mathematical models, and to take into account secondary, parasitic effects. This leads to very high dimensional problems which nowadays require simulation times too large for the short time-to-market demands in industry. Modern Model Order Reduction (MOR) techniques present a way out of this dilemma in providing surrogate models which keep the main characteristics of the devi

  15. ERIS: the exoplanet high-resolution image simulator for CHARIS

    Science.gov (United States)

    Limbach, Mary Anne; Groff, Tyler D.; Kasdin, N. J.; Brandt, Timothy; Mede, Kyle; Loomis, Craig; Hayashi, Masahiko; Takato, Naruhisa

    2014-07-01

    ERIS is an image simulator for CHARIS, the high-contrast exoplanet integral field spectrograph (IFS) being built at Princeton University for the Subaru telescope. We present here the software design and implementation of the ERIS code. ERIS simulates CHARIS FITS images and data cubes that are used for developing the data reduction pipeline and verifying the expected CHARIS performance. Components of the software include detailed models of the light source (such as a star or exoplanet), atmosphere, telescope, adaptive optics systems (AO188 and SCExAO), CHARIS IFS and the Hawaii2-RG infrared detector. Code includes novel details such as the phase errors at the lenslet array, optical wavefront error maps and pinholes for reducing crosstalk, just to list a few. The details of the code as well as several simulated images are presented in this paper. This IFS simulator is critical for the CHARIS data analysis pipeline development, minimizing troubleshooting in the lab and on-sky and the characterization of crosstalk.

  16. Piecewise Linear Model-Based Image Enhancement

    Directory of Open Access Journals (Sweden)

    Fabrizio Russo

    2004-09-01

    Full Text Available A novel technique for the sharpening of noisy images is presented. The proposed enhancement system adopts a simple piecewise linear (PWL function in order to sharpen the image edges and to reduce the noise. Such effects can easily be controlled by varying two parameters only. The noise sensitivity of the operator is further decreased by means of an additional filtering step, which resorts to a nonlinear model too. Results of computer simulations show that the proposed sharpening system is simple and effective. The application of the method to contrast enhancement of color images is also discussed.
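
    To make the idea concrete, the sketch below (Python) applies a piecewise-linear correction to a Laplacian edge signal: differences below a lower threshold are treated as noise and suppressed, mid-range differences are amplified, and large differences are clipped. The thresholds and gain are illustrative assumptions, not the parameters of the operator proposed in the paper.

        import numpy as np

        def pwl(d, t1, t2):
            """Piecewise-linear mapping of edge differences: zero below t1 (noise),
            linear ramp between t1 and t2 (edges), saturated above t2."""
            mag = np.abs(d)
            out = np.zeros_like(d, dtype=float)
            ramp = (mag >= t1) & (mag < t2)
            out[ramp] = np.sign(d[ramp]) * (mag[ramp] - t1) * t2 / (t2 - t1)
            out[mag >= t2] = np.sign(d[mag >= t2]) * t2
            return out

        def pwl_sharpen(img, t1=5.0, t2=40.0, gain=0.5):
            """Unsharp-masking-style sharpening driven by the PWL-corrected Laplacian."""
            img = img.astype(float)
            lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                   np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
            return np.clip(img - gain * pwl(lap, t1, t2), 0.0, 255.0)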

  17. A VRLA battery simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Pascoe, P.E.; Anbuky, A.H. [Invensys Energy Systems NZ Limited, Christchurch (New Zealand)

    2004-05-01

    A valve regulated lead acid (VRLA) battery simulation model is an invaluable tool for the standby power system engineer. The obvious use for such a model is to allow the assessment of battery performance. This may involve determining the influence of cells suffering from state of health (SOH) degradation on the performance of the entire string, or the running of test scenarios to ascertain the most suitable battery size for the application. In addition, it enables the engineer to assess the performance of the overall power system. This includes, for example, running test scenarios to determine the benefits of various load shedding schemes. It also allows the assessment of other power system components, either for determining their requirements and/or vulnerabilities. Finally, a VRLA battery simulation model is vital as a stand alone tool for educational purposes. Despite the fundamentals of the VRLA battery having been established for over 100 years, its operating behaviour is often poorly understood. An accurate simulation model enables the engineer to gain a better understanding of VRLA battery behaviour. A system level multipurpose VRLA battery simulation model is presented. It allows an arbitrary battery (capacity, SOH, number of cells and number of strings) to be simulated under arbitrary operating conditions (discharge rate, ambient temperature, end voltage, charge rate and initial state of charge). The model accurately reflects the VRLA battery discharge and recharge behaviour. This includes the complex start of discharge region known as the coup de fouet. (author)

  18. Modelling and Simulation: An Overview

    NARCIS (Netherlands)

    M.J. McAleer (Michael); F. Chan (Felix); L. Oxley (Les)

    2013-01-01

    textabstractThe papers in this special issue of Mathematics and Computers in Simulation cover the following topics: improving judgmental adjustment of model-based forecasts, whether forecast updates are progressive, on a constrained mixture vector autoregressive model, whether all estimators are bor

  19. General introduction to simulation models

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Boklund, Anette

    2012-01-01

    Monte Carlo simulation can be defined as a representation of real life systems to gain insight into their functions and to investigate the effects of alternative conditions or actions on the modeled system. Models are a simplification of a system. Most often, it is best to use experiments and field...

  20. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

    of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic- Equation system. Being able to operate...

  1. Image quality simulation and verification of x-ray volume imaging systems

    Science.gov (United States)

    Kroon, Han; Schoumans, Nicole; Snoeren, Ruud

    2006-03-01

    Nowadays, 2D X-ray systems are used more and more for 3-dimensional rotational X-ray imaging (3D-RX) or volume imaging, such as 3D rotational angiography. However, it is not evident that the application of settings for optimal 2D images also guarantee optimal conditions for 3D-RX reconstruction results. In particular the search for a good compromise between patient dose and IQ may lead to different results in case of 3D imaging. For this purpose we developed an additional 3D-RX module for our full-scale image quality & patient dose (IQ&PD) simulation model, with specific calculations of patient dose under rotational conditions, and contrast, sharpness and noise of 3D images. The complete X-ray system from X-ray tube up to and including the display device is modelled in separate blocks for each distinguishable component or process. The model acts as a tool for X-ray system design, image quality optimisation and patient dose reduction. The model supports the decomposition of system level requirements, and takes inherently care of the prerequisite mutual coherence between component requirements. The short calculation times enable comprehensive multi-parameter optimisation studies. The 3D-RX IQ&PD performance is validated by comparing calculation results with actual measurements performed on volume images acquired with a state-of-the-art 3D-RX system. The measurements include RXDI dose index, signal and contrast based on Hounsfield units (H and ΔH), modulation transfer function (MTF), noise variance (σ2) and contrast-to-noise ratio (CNR). Further we developed a new 3D contrast-delta (3D-CΔ) phantom with details of varying size and contrast medium material and concentration. Simulation and measurement results show a significant correlation.
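
    Two of the verification metrics listed above are easy to state programmatically. The sketch below (Python, with hypothetical region masks supplied by the caller) computes the noise variance of a uniform region and the contrast-to-noise ratio of a detail against its background, the quantities compared between simulated and measured 3D-RX volumes.

        import numpy as np

        def noise_variance(img, uniform_mask):
            """Noise variance sigma^2 estimated in a uniform region of the volume."""
            return img[uniform_mask].var()

        def contrast_to_noise_ratio(img, detail_mask, background_mask):
            """CNR = |mean(detail) - mean(background)| / std(background)."""
            detail = img[detail_mask]
            background = img[background_mask]
            return np.abs(detail.mean() - background.mean()) / background.std()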

  2. Monte Carlo simulation of secondary electron images for real sample structures in scanning electron microscopy.

    Science.gov (United States)

    Zhang, P; Wang, H Y; Li, Y G; Mao, S F; Ding, Z J

    2012-01-01

    Monte Carlo simulation methods for the study of electron beam interaction with solids have been mostly concerned with specimens of simple geometry. In this article, we propose a simulation algorithm for treating arbitrarily complex structures in a real sample. The method is based on finite-element triangular-mesh modeling of the sample geometry and a space subdivision for accelerating the simulation. Simulation of the secondary electron image in scanning electron microscopy has been performed for gold particles on a carbon substrate. Comparison of the simulation result with an experimental image confirms that this method is effective for modeling the complex morphology of a real sample.

  3. Vehicle dynamics modeling and simulation

    CERN Document Server

    Schramm, Dieter; Bardini, Roberto

    2014-01-01

    The authors examine in detail the fundamentals and mathematical descriptions of the dynamics of automobiles. In this context different levels of complexity will be presented, starting with basic single-track models up to complex three-dimensional multi-body models. A particular focus is on the process of establishing mathematical models on the basis of real cars and the validation of simulation results. The methods presented are explained in detail by means of selected application scenarios.

  4. Polarimetric ISAR: Simulation and image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, David H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-03-21

    In polarimetric ISAR the illumination platform, typically airborne, carries a pair of antennas that are directed toward a fixed point on the surface as the platform moves. During platform motion, the antennas maintain their gaze on the point, creating an effective aperture for imaging any targets near that point. The interaction between the transmitted fields and targets (e.g. ships) is complicated since the targets are typically many wavelengths in size. Calculation of the field scattered from the target typically requires solving Maxwell’s equations on a large three-dimensional numerical grid. This is prohibitive to use in any real-world imaging algorithm, so the scattering process is typically simplified by assuming the target consists of a cloud of independent, non-interacting, scattering points (centers). Imaging algorithms based on this scattering model perform well in many applications. Since polarimetric radar is not very common, the scattering model is often derived for a scalar field (single polarization) where the individual scatterers are assumed to be small spheres. However, when polarization is important, we must generalize the model to explicitly account for the vector nature of the electromagnetic fields and its interaction with objects. In this note, we present a scattering model that explicitly includes the vector nature of the fields but retains the assumption that the individual scatterers are small. The response of the scatterers is described by electric and magnetic dipole moments induced by the incident fields. We show that the received voltages in the antennas are linearly related to the transmitting currents through a scattering impedance matrix that depends on the overall geometry of the problem and the nature of the scatterers.
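
    A minimal numerical sketch of a non-interacting point-scatterer sum is given below (Python). It is a simplified stand-in for the dipole-moment formulation of the note: each scatterer contributes a 2x2 polarimetric scattering matrix delayed by the round-trip phase from transmitter to scatterer to receiver, and the positions, frequencies and matrices in the example are hypothetical inputs.

        import numpy as np

        def polarimetric_returns(freqs_hz, tx_pos, rx_pos, scatterers, c=3.0e8):
            """Sum of independent point-scatterer responses. `scatterers` is a list of
            (position, S) pairs, where S is a 2x2 polarimetric scattering matrix and
            positions are 3-element numpy arrays."""
            out = np.zeros((len(freqs_hz), 2, 2), dtype=complex)
            for i, f in enumerate(freqs_hz):
                k = 2.0 * np.pi * f / c                        # wavenumber
                for pos, S in scatterers:
                    r = (np.linalg.norm(np.asarray(pos) - np.asarray(tx_pos)) +
                         np.linalg.norm(np.asarray(pos) - np.asarray(rx_pos)))
                    out[i] += S * np.exp(-1j * k * r)          # round-trip phase delay
            return out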

  5. High Fidelity Imaging Algorithm for the Undique Imaging Monte Carlo Simulator

    Directory of Open Access Journals (Sweden)

    Tremblay Grégoire

    2016-01-01

    Full Text Available The Undique imaging Monte Carlo simulator (Undique hereafter) was developed to reproduce the behavior of 3D imaging devices. This paper describes its high fidelity imaging algorithm.

  6. Arabidopsis Growth Simulation Using Image Processing Technology

    Directory of Open Access Journals (Sweden)

    Junmei Zhang

    2014-01-01

    Full Text Available This paper aims to provide a method to represent the virtual Arabidopsis plant at each growth stage. It includes simulating the shape and providing growth parameters. The shape is described with elliptic Fourier descriptors. First, the plant is segmented from the background using chromatic coordinates. With the segmentation result, the outer boundary series are obtained by using a boundary tracking algorithm. The elliptic Fourier analysis is then carried out to extract the coefficients of the contour. The coefficients require less storage than the original contour points and can be used to simulate the shape of the plant. The growth parameters include the total area and the number of leaves of the plant. The total area is obtained from the number of plant pixels and the image calibration result. The number of leaves is derived by detecting the apex of each leaf. It is achieved by using a wavelet transform to identify the local maxima of the distance signal between the contour points and the region centroid. Experimental results show that this method can record the growth stages of the Arabidopsis plant with fewer data and provide a visual platform for plant growth research.
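
    A minimal sketch of the contour-description step is given below (Python). It scores pixels by a green chromatic coordinate and encodes a traced boundary with a truncated Fourier series of the complex contour, a close relative of the elliptic Fourier descriptors used in the paper; the harmonic count is an illustrative assumption.

        import numpy as np

        def green_chromatic(rgb):
            """Green chromatic coordinate g = G / (R + G + B); plant pixels score high."""
            return rgb[..., 1] / (rgb.sum(axis=2) + 1e-9)

        def fourier_descriptor(contour_xy, n_harmonics=20):
            """Truncated Fourier series of a closed contour written as complex points."""
            z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
            coeffs = np.fft.fft(z) / len(z)
            kept = np.zeros_like(coeffs)
            kept[:n_harmonics + 1] = coeffs[:n_harmonics + 1]   # low positive harmonics
            kept[-n_harmonics:] = coeffs[-n_harmonics:]         # low negative harmonics
            return kept

        def reconstruct_contour(coeffs):
            """Rebuild the smoothed contour from the retained harmonics."""
            z = np.fft.ifft(coeffs * len(coeffs))
            return np.stack([z.real, z.imag], axis=1)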

  7. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.
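
    As a small illustration of the kind of sample-generation algorithm discussed in the report, the sketch below (Python) draws realizations of a zero-mean stationary Gaussian process by the spectral-representation method, i.e. a sum of cosines with random phases; the target power spectral density in the example is an arbitrary assumption, not one of the report's models.

        import numpy as np

        def spectral_samples(n_samples, n_points, dt, psd, rng=np.random.default_rng()):
            """Spectral-representation sampling of a stationary Gaussian process."""
            freqs = np.fft.rfftfreq(n_points, dt)[1:]            # drop the DC term
            domega = 2.0 * np.pi * (freqs[1] - freqs[0])
            amp = np.sqrt(2.0 * psd(2.0 * np.pi * freqs) * domega)
            t = np.arange(n_points) * dt
            phases = 2.0 * np.pi * rng.random((n_samples, len(freqs)))
            arg = 2.0 * np.pi * np.outer(freqs, t)               # (n_freq, n_points)
            return (amp[None, :, None] *
                    np.cos(arg[None, :, :] + phases[:, :, None])).sum(axis=1)

        # example: five realizations with a Gaussian-shaped spectrum
        x = spectral_samples(5, 512, 0.01, lambda w: np.exp(-(w / 50.0) ** 2))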

  8. Compressive Imaging with Iterative Forward Models

    CERN Document Server

    Liu, Hsiou-Yuan; Liu, Dehong; Mansour, Hassan; Boufounos, Petros T

    2016-01-01

    We propose a new compressive imaging method for reconstructing 2D or 3D objects from their scattered wave-field measurements. Our method relies on a novel, nonlinear measurement model that can account for the multiple scattering phenomenon, which makes the method preferable in applications where linear measurement models are inaccurate. We construct the measurement model by expanding the scattered wave-field with an accelerated-gradient method, which is guaranteed to converge and is suitable for large-scale problems. We provide explicit formulas for computing the gradient of our measurement model with respect to the unknown image, which enables image formation with a sparsity-driven numerical optimization algorithm. We validate the method both analytically and with numerical simulations.

  9. Modeling and simulation of the human eye

    Science.gov (United States)

    Duran, R.; Ventura, L.; Nonato, L.; Bruno, O.

    2007-02-01

    The computational modeling of the human eye has been widely studied across different sectors of the scientific and technological community. One of the main reasons for this increasing interest is the possibility of reproducing the optical properties of the eye by means of computational simulations, making possible the development of efficient devices to treat and correct vision problems. This work explores a still little-investigated aspect of visual-system modeling, proposing a computational framework that makes it possible to use real data in the modeling and simulation of the human visual system. This new approach enables the individual investigation of the optical system, assisting in the construction of new techniques used to infer vital data in medical investigations. Using corneal topography to collect real data from patients, a computational model of the cornea is constructed and a set of simulations is built to verify the correctness of the system and to investigate the effect of corneal abnormalities on retinal image formation, including Placido discs, the point spread function, the wavefront, and the projection of a real image and its visualization on the retina.

  10. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

    Computational Modeling of Simulation Tests (U), June 1980. G. Leigh, W. Chown, B. Harrison. Eric H. Wang Civil Engineering Research Facility, University of New Mexico, Albuquerque.

  11. SAR IMAGING SIMULATION OF HORIZONTAL FULLY TWO-DIMENSIONAL INTERNAL WAVES

    Institute of Scientific and Technical Information of China (English)

    SHEN Hui; HE Yi-Jun

    2006-01-01

    Based on the research of Lynett and Liu, a new horizontally fully two-dimensional internal wave propagation model with rotation effect was derived, which can be used to simulate the characteristics of internal waves in a horizontal, fully two-dimensional plane. By combining it with the imaging mechanism of Synthetic Aperture Radar (SAR), a simulation procedure was further obtained, which can translate the propagation characteristics of oceanic internal waves into SAR images. In order to evaluate the validity of the proposed simulation procedure, case studies were performed in the South China Sea and the results of the simulation procedure were analyzed in detail. Very good consistency was found between the simulation results and satellite images. The proposed simulation procedure provides a possible foundation for the quantitative interpretation of internal waves from fully two-dimensional satellite images.

  12. SIMULATION OF COLLECTIVE RISK MODEL

    Directory of Open Access Journals (Sweden)

    Viera Pacáková

    2007-12-01

    Full Text Available The article provides brief theoretical definitions of the basic terms and methods of modeling and simulation of insurance risks in non-life insurance by means of mathematical and statistical methods using statistical software. While the risk assessment of an insurance company in connection with its solvency is a rather complex and comprehensive problem, its solution starts with statistical modeling of the number and amount of individual claims. Successful solution of these fundamental problems enables the solution of crucial insurance problems such as modeling and simulation of collective risk, premium and reinsurance premium calculation, estimation of the probability of ruin, etc. The article also presents some essential ideas underlying Monte Carlo methods and their applications to the modeling of insurance risk. The problem to be solved is to find the probability distribution of the collective risk in a non-life insurance portfolio. Simulation of the compound distribution function of the aggregate claim amount can be carried out if the distribution functions of the claim number process and the claim size are assumed given. Monte Carlo simulation is a suitable method to confirm the results of other methods and for the treatment of catastrophic claims, when small collectives are studied. Analysis of insurance risks using risk theory is an important part of the Solvency II project. Risk theory is the analysis of the stochastic features of the non-life insurance process. The field of application of risk theory has grown rapidly. There is a need to develop the theory into a form suitable for practical purposes and to demonstrate its application. Modern computer simulation techniques open up a wide field of practical applications for risk theory concepts, without requiring restrictive assumptions and sophisticated mathematics. This article presents some comparisons of traditional actuarial methods and simulation methods for the collective risk model.
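
    The core of such a simulation, estimating the distribution of the aggregate claim amount when the claim-count and claim-size distributions are given, fits in a few lines. The sketch below (Python) uses Poisson claim counts and lognormal claim sizes purely as illustrative assumptions.

        import numpy as np

        def aggregate_claims(n_sims, lam, severity_sampler, rng=np.random.default_rng()):
            """Monte Carlo for the collective risk model S = X_1 + ... + X_N,
            with N ~ Poisson(lam) and X_i i.i.d. drawn by severity_sampler."""
            counts = rng.poisson(lam, size=n_sims)
            return np.array([severity_sampler(n).sum() for n in counts])

        # example: estimate the mean and a high quantile of the aggregate loss
        rng = np.random.default_rng(1)
        S = aggregate_claims(100_000, lam=50,
                             severity_sampler=lambda n: rng.lognormal(8.0, 1.2, size=n))
        print("mean:", S.mean(), " 99.5% quantile:", np.quantile(S, 0.995))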

  13. Wave optics approach for incoherent imaging simulation through distributed turbulence

    Science.gov (United States)

    Underwood, Thomas A.; Voelz, David G.

    2013-09-01

    An approach is presented for numerically simulating incoherent imaging using coherent wave optics propagation methods. The approach employs averaging of irradiance from uncorrelated coherent waves to produce incoherent results. Novel aspects of the method include 1) the exploitation of a spatial windowing feature in the wave optics numerical propagator to limit the angular spread of the light and 2) a simple propagation scaling concept to avoid aliased field components after the focusing element. Classical linear systems theory is commonly used to simulate incoherent imaging when it is possible to incorporate aberrations and/or propagation medium characteristics into an optical transfer function (OTF). However, the technique presented here is useful for investigating situations such as "instantaneous" short-exposure imaging through distributed turbulence and phenomena like anisoplanatism that are not easily modeled with the typical linear systems theory. The relationships between simulation variables such as spatial sampling, source and aperture support, and intermediate focal plane are discussed and the requirement or benefits of choosing these in certain ways are demonstrated.
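
    The averaging idea can be demonstrated in a few lines without the turbulence screens or the windowed propagator. In the sketch below (Python), each realization convolves the source amplitude, multiplied by an independent random phase per source pixel, with a coherent amplitude PSF; the mean irradiance converges to the incoherent image, i.e. the source intensity convolved with the squared magnitude of the PSF. The source and PSF are placeholders supplied by the caller, not the paper's imaging geometry.

        import numpy as np

        def coherent_realization(src_field, coh_psf):
            """One coherent image field: FFT-based convolution with the amplitude PSF
            (coh_psf is assumed centered and the same shape as the source)."""
            H = np.fft.fft2(np.fft.ifftshift(coh_psf))
            return np.fft.ifft2(np.fft.fft2(src_field) * H)

        def incoherent_image(src_intensity, coh_psf, n_realizations=200,
                             rng=np.random.default_rng()):
            """Average the irradiance of many uncorrelated coherent realizations."""
            amp = np.sqrt(src_intensity)
            acc = np.zeros_like(amp, dtype=float)
            for _ in range(n_realizations):
                phase = np.exp(1j * 2.0 * np.pi * rng.random(amp.shape))
                acc += np.abs(coherent_realization(amp * phase, coh_psf)) ** 2
            return acc / n_realizations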

  14. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing including experimental verification as being carried out as part of a Ph.D. project being written resp. supervised by the authors. The work covers dynamic performance of both water-tube boilers and fire tube boilers. A detailed dynamic model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able to operate a boiler plant dynamically means that the boiler designs must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...

  15. Intelligent Mobility Modeling and Simulation

    Science.gov (United States)

    2015-03-04

    Briefing: Intelligent Mobility Modeling and Simulation, Dr. P. Jayakumar and S. Arepally, U.S. Army Tank Automotive Research, Development and Engineering Center. Topics include the mobility-autonomy-latency relationship, the machine-human partnership, the development of shared control, and modeling the sensory/motor performance of a human driver or teleoperator (ACT-R, cog.cs.drexel.edu/act-r/index.html).

  16. Learning generative models of natural images.

    Science.gov (United States)

    Wu, Jiann-Ming; Lin, Zheng-Han

    2002-04-01

    This work proposes an unsupervised learning process for the analysis of natural images. The derivation is based on a generative model, a stochastic coin-flip process directly operating on many disjoint multivariate Gaussian distributions. Following the maximum likelihood principle and using the Potts encoding, the goodness-of-fit of the generative model to a large number of patches randomly sampled from natural images is quantitatively expressed by an objective function subject to a set of constraints. By further combining the objective function with the minimal wiring criterion, we obtain a mixed-integer linear program. A hybrid of mean-field annealing and gradient descent is applied to the mathematical framework and produces three sets of interactive dynamics for the learning process. Numerical simulations show that the learning process is effective for the extraction of orientation, localization and bandpass features, and that the generative model can form an ensemble sparse code for natural images.

  17. Modeling of a Single Multimode Fiber Imaging System

    CERN Document Server

    Liu, Chen; Liu, Deming; Su, Lei

    2016-01-01

    We present a detailed theoretical analysis on image transmission via a single multimode fiber (MMF). A single MMF imaging model is developed to study the light wave propagation from the light source to the camera, by using free-space Fourier optics theory and mode-coupling theory. A mathematical expression is obtained for the complete single MMF imaging system, which is further validated by image-transmission simulations. Our model is believed to be the first theoretical model to describe the complete MMF imaging system based on the transmission of individual modes. Therefore, this model is robust and capable of analyzing MMF image transmission under specific mode-coupling conditions. We use our model to study bending-induced image blur in single-MMF image transmission, and the result has found a good agreement with that of existing experimental studies. These should provide important insights into future MMF imaging system developments.

  18. Visual analysis of the computer simulation for both imaging and non-imaging optical systems

    Science.gov (United States)

    Barladian, B. K.; Potemin, I. S.; Zhdanov, D. D.; Voloboy, A. G.; Shapiro, L. S.; Valiev, I. V.; Birukov, E. D.

    2016-10-01

    Typical results of the optic simulation are images generated on the virtual sensors of various kinds. As a rule, these images represent two-dimensional distribution of the light values in Cartesian coordinates (luminance, illuminance) or in polar coordinates (luminous intensity). Using the virtual sensors allows making the calculation and design of different kinds of illumination devices, providing stray light analysis, synthesizing of photorealistic images of three-dimensional scenes under the complex illumination generated with optical systems, etc. Based on rich experience in the development and practical using of computer systems of virtual prototyping and photorealistic visualization the authors formulated a number of basic requirements for the visualization and analysis of the results of light simulations represented as two-dimensional distribution of luminance, illuminance and luminous intensity values. The requirements include the tone mapping operators, pseudo color imaging, visualization of the spherical panorama, regression analysis, the analysis of the image sections and regions, analysis of pixel values, the image data export, etc. All those requirements were successfully satisfied in designed software component for visual analysis of the light simulation results. The module "LumiVue" is an integral part of "Lumicept" modeling system and the corresponding plug-in of computer-aided design and support for CATIA product. A number of visual examples of analysis of calculated two-dimensional distribution of luminous intensity, illuminance and luminance illustrate the article. The examples are results of simulation and design of lighting optical systems, secondary optics for LEDs, stray light analysis, virtual prototyping and photorealistic rendering.

  19. Image quantization: statistics and modeling

    Science.gov (United States)

    Whiting, Bruce R.; Muka, Edward

    1998-07-01

    A method for analyzing the effects of quantization, developed for temporal one-dimensional signals, is extended to two- dimensional radiographic images. By calculating the probability density function for the second order statistics (the differences between nearest neighbor pixels) and utilizing its Fourier transform (the characteristic function), the effect of quantization on image statistics can be studied by the use of standard communication theory. The approach is demonstrated by characterizing the noise properties of a storage phosphor computed radiography system and the image statistics of a simple radiographic object (cylinder) and by comparing the model to experimental measurements. The role of quantization noise and the onset of contouring in image degradation are explained.
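
    The second-order statistic used in the paper, the distribution of nearest-neighbor pixel differences, and its characteristic function can be estimated directly from an image, as in the short sketch below (Python; the bit depth is an illustrative assumption).

        import numpy as np

        def neighbor_difference_statistics(img, n_bits=8):
            """Histogram of horizontal nearest-neighbor differences and its
            (sampled) characteristic function via the FFT."""
            d = np.diff(img.astype(float), axis=1).ravel()
            levels = np.arange(-(2 ** n_bits) + 1, 2 ** n_bits)
            edges = np.append(levels, levels[-1] + 1) - 0.5
            pdf, _ = np.histogram(d, bins=edges, density=True)
            characteristic = np.fft.fft(pdf)
            return levels, pdf, characteristic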

  20. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler performance has been developed. Outputs from the simulations are shrinking and swelling of water level in the drum during for example a start-up of the boiler, these figures combined with the requirements with respect to allowable water level fluctuations in the drum defines the requirements with respect to drum size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs together with requirements with respect to steam space load have been utilized to define...

  1. Modeling and Simulation of Nanoindentation

    Science.gov (United States)

    Huang, Sixie; Zhou, Caizhi

    2017-08-01

    Nanoindentation is a hardness test method applied to small volumes of material which can provide some unique effects and spark many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.

  2. Multiscale Stochastic Simulation and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    James Glimm; Xiaolin Li

    2006-01-10

    Acceleration-driven instabilities of fluid mixing layers include the classical cases of Rayleigh-Taylor instability, driven by a steady acceleration, and Richtmyer-Meshkov instability, driven by an impulsive acceleration. Our program starts with high resolution methods of numerical simulation of two (or more) distinct fluids, continues with an analytical study of these solutions, and the derivation of averaged equations. A striking achievement has been the systematic agreement we obtained between simulation and experiment by using a high resolution numerical method and improved physical modeling, with surface tension. Our study is accompanied by analysis using stochastic modeling and averaged equations for the multiphase problem. We have quantified the error and uncertainty using statistical modeling methods.

  3. Featured Image: Experimental Simulation of Melting Meteoroids

    Science.gov (United States)

    Kohler, Susanna

    2017-03-01

    Ever wonder what experimental astronomy looks like? Some days, it looks like this piece of rock in a wind tunnel (click for a better look!). In this photo, a piece of argillite (a terrestrial rock) is exposed to conditions in a plasma wind tunnel as a team of scientists led by Stefan Loehle (Stuttgart University) simulates what happens to a meteoroid as it hurtles through Earth's atmosphere. With these experiments, the scientists hope to better understand meteoroid ablation, the process by which meteoroids are heated, melt, and evaporate as they pass through our atmosphere, so that we can learn more from the meteorite fragments that make it to the ground. In the scientists' experiment, the rock samples were exposed to plasma flow until they disintegrated, and this process was simultaneously studied via photography, video, high-speed imaging, thermography, and Echelle emission spectroscopy. To find out what the team learned from these experiments, you can check out the original article below. Citation: Stefan Loehle et al 2017 ApJ 837 112. doi:10.3847/1538-4357/aa5cb5

  4. An online interactive simulation system for medical imaging education.

    Science.gov (United States)

    Dikshit, Aditya; Wu, Dawei; Wu, Chunyan; Zhao, Weizhao

    2005-09-01

    This report presents a recently developed web-based medical imaging simulation system for teaching students or other trainees who plan to work in the medical imaging field. The increased importance of computer and information technology widely applied to different imaging techniques in clinics and medical research necessitates a comprehensive medical imaging education program. A complete tutorial of simulations introducing popular imaging modalities, such as X-ray, MRI, CT, ultrasound and PET, forms an essential component of such an education. Internet technologies provide a vehicle to carry medical imaging education online. There exist a number of internet-based medical imaging hyper-books or online documentations. However, there are few providing interactive computational simulations. We focus on delivering knowledge of the physical principles and engineering implementation of medical imaging techniques through an interactive website environment. The online medical imaging simulation system presented in this report outlines basic principles underlying different imaging techniques and image processing algorithms and offers trainees an interactive virtual laboratory. For education purposes, this system aims to provide general understanding of each imaging modality with comprehensive explanations, ample illustrations and copious references as its thrust, rather than complex physics or detailed math. This report specifically describes the development of the tutorial for commonly used medical imaging modalities. An internet-accessible interface is used to simulate various imaging algorithms with user-adjustable parameters. The tutorial is under the MATLAB Web Server environment. Macromedia Director MX is used to develop interactive animations integrating theory with graphic-oriented simulations. HTML and JavaScript are used to enable a user to explore these modules online in a web browser. Numerous multiple choice questions, links and references for advanced study are

  5. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  6. Animal models for simulating weightlessness

    Science.gov (United States)

    Morey-Holton, E.; Wronski, T. J.

    1982-01-01

    NASA has developed a rat model to simulate on earth some aspects of the weightlessness alterations experienced in space, i.e., unloading and fluid shifts. Comparison of data collected from space flight and from the head-down rat suspension model suggests that this model system reproduces many of the physiological alterations induced by space flight. Data from various versions of the rat model are virtually identical for the same parameters; thus, modifications of the model for acute, chronic, or metabolic studies do not alter the results as long as the critical components of the model are maintained, i.e., a cephalad shift of fluids and/or unloading of the rear limbs.

  7. The Application of the Technology of 3D Satellite Cloud Imaging in Virtual Reality Simulation

    Directory of Open Access Journals (Sweden)

    Xiao-fang Xie

    2007-05-01

    Full Text Available Using satellite cloud images to simulate clouds is one of the new visual simulation technologies in Virtual Reality (VR). Taking the original data of satellite cloud images as the source, this paper describes in detail the technology of 3D satellite cloud imaging through coordinate transformation and projection, the creation of a DEM (Digital Elevation Model) of the cloud image, and 3D simulation. A Mercator projection was introduced to create a cloud image DEM, while solutions for geodetic problems were introduced to calculate distances, and the outer-trajectory science of rockets was introduced to obtain the elevation of clouds. For demonstration, we report on a computer program to simulate the 3D satellite cloud images.
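
    Two of the building blocks named above, the Mercator projection used to lay the cloud image onto a plane and a cloud-top elevation estimate, are sketched below (Python). The constant-lapse-rate height formula is a simplified stand-in, not the rocket outer-trajectory method the authors use, and its constants are illustrative assumptions.

        import numpy as np

        def mercator_xy(lat_deg, lon_deg, lon0_deg=0.0, radius_km=6371.0):
            """Forward Mercator projection on a spherical Earth (km on the plane)."""
            lam = np.radians(np.asarray(lon_deg) - lon0_deg)
            phi = np.radians(np.asarray(lat_deg))
            return radius_km * lam, radius_km * np.log(np.tan(np.pi / 4.0 + phi / 2.0))

        def cloud_top_height_km(brightness_temp_k, surface_temp_k=288.0,
                                lapse_rate_k_per_km=6.5):
            """Rough cloud-top height from IR brightness temperature, assuming a
            constant atmospheric lapse rate."""
            return np.clip((surface_temp_k - brightness_temp_k) / lapse_rate_k_per_km,
                           0.0, None)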

  8. Featured Image: The Simulated Collapse of a Core

    Science.gov (United States)

    Kohler, Susanna

    2016-11-01

    This stunning snapshot (click for a closer look!) is from a simulation of a core-collapse supernova. Despite having been studied for many decades, the mechanism driving the explosions of core-collapse supernovae is still an area of active research. Extremely complex simulations such as this one represent best efforts to include as many realistic physical processes as is currently computationally feasible. In this study led by Luke Roberts (a NASA Einstein Postdoctoral Fellow at Caltech at the time), a core-collapse supernova is modeled long-term in fully 3D simulations that include the effects of general relativity, radiation hydrodynamics, and even neutrino physics. The authors use these simulations to examine the evolution of a supernova after its core bounce. To read more about the team's findings (and see more awesome images from their simulations), check out the paper below! Citation: Luke F. Roberts et al 2016 ApJ 831 98. doi:10.3847/0004-637X/831/1/98

  9. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

    In this paper, an integrated simulation optimization model for the inventory system is developed. An effective algorithm is developed to evaluate and analyze the back-end stored simulation results. This paper proposes simulation tool SIMIN (Inventory Simulation) to simulate inventory models. SIMIN is a tool which simulates and compares the results of different inventory models. To overcome various practical restrictive assumptions, SIMIN provides values for a number of performance measurement...

  10. Sources of image degradation in fundamental and harmonic ultrasound imaging using nonlinear, full-wave simulations.

    Science.gov (United States)

    Pinton, Gianmarco F; Trahey, Gregg E; Dahl, Jeremy J

    2011-04-01

    A full-wave equation that describes nonlinear propagation in a heterogeneous attenuating medium is solved numerically with finite differences in the time domain (FDTD). This numerical method is used to simulate propagation of a diagnostic ultrasound pulse through a measured representation of the human abdomen with heterogeneities in speed of sound, attenuation, density, and nonlinearity. Conventional delay-and-sum beamforming is used to generate point spread functions (PSF) that display the effects of these heterogeneities. For the particular imaging configuration that is modeled, these PSFs reveal that the primary source of degradation in fundamental imaging is reverberation from near-field structures. Reverberation clutter in the harmonic PSF is 26 dB higher than in the fundamental PSF. An artificial medium with uniform velocity but unchanged impedance characteristics indicates that for the fundamental PSF, the primary source of degradation is phase aberration. An ultrasound image is created in silico using the same physical and algorithmic process used in an ultrasound scanner: a series of pulses are transmitted through heterogeneous scattering tissue and the received echoes are used in a delay-and-sum beamforming algorithm to generate images. These beamformed images are compared with images obtained from convolution of the PSF with a scatterer field to demonstrate that a very large portion of the PSF must be used to accurately represent the clutter observed in conventional imaging. © 2011 IEEE
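
    The beamforming step referred to above is conventional delay-and-sum; a compact version is sketched below (Python) for a plane-wave transmit fired at t = 0, with the sound speed, sampling rate and array geometry passed in by the caller. The defaults are generic values, not the imaging configuration of the paper.

        import numpy as np

        def delay_and_sum(rf, element_x, x_axis, z_axis, c=1540.0, fs=40e6):
            """Delay-and-sum beamforming of channel data rf[sample, element];
            element_x holds the element x-positions (numpy array, meters)."""
            t = np.arange(rf.shape[0]) / fs
            image = np.zeros((len(z_axis), len(x_axis)))
            for iz, z in enumerate(z_axis):
                for ix, x in enumerate(x_axis):
                    # two-way delay: plane-wave transmit to depth z, then back to each element
                    tau = z / c + np.sqrt(z ** 2 + (x - element_x) ** 2) / c
                    vals = [np.interp(tau[k], t, rf[:, k]) for k in range(rf.shape[1])]
                    image[iz, ix] = sum(vals)
            return image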

  11. Background Simulation and Correction Algorithm in Spot Weld Image Processing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    One of the chief tasks in inspecting spot weld quality by X-ray is to obtain an ideal and uniform digital image. This paper introduces three image background simulation algorithms and compares their background correction effects. It may be safely said that the Kalman filter method is simple and fast for general images, while the FFT method adapts well for background simulation.

  12. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (MS), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which MS may be developed, accepted, and used in support of NASA activities. As the MS disciplines employed and application areas involved are broad, the common aspects of MS across all NASA activities are addressed. The discipline-specific details of a given MS should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with MS-influenced decisions by ensuring the complete communication of the credibility of MS results.

  13. Simulation study on radiative imaging of combustion flame in furnace

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Radiative imaging of combustion flame in furnace of power plant plays an increasingly important role in combustion diagnosis. This paper presents a new method for calculating the radiative imaging of three-dimensional (3D) combustion flame based on Monte Carlo method and optical lens imaging. Numerical simulation case was used in this study. Radiative images were calculated and images obtained can not only present the energy distribution on the charge-coupled device (CCD) camera target plane but also reflect the energy distribution condition in the simulation furnace. Finally the relationships between volume elements and energy shares were also discussed.

  14. Safety Assessment of Advanced Imaging Sequences II: Simulations

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    An automatic approach for simulating the emitted pressure, intensity, and MI of advanced ultrasound imaging sequences is presented. It is based on a linear simulation of pressure fields using Field II, and it is hypothesized that linear simulation can attain the needed accuracy for predicting...

  15. Model for Simulation Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik

    1976-01-01

    A method that produces realistic simulations of atmospheric turbulence is developed and analyzed. The procedure makes use of a generalized spectral analysis, often called a proper orthogonal decomposition or the Karhunen-Loève expansion. A set of criteria, emphasizing a realistic appearance, a correct spectral shape, and non-Gaussian statistics, is selected in order to evaluate the model turbulence. An actual turbulence record is analyzed in detail, providing both a standard for comparison and input statistics for the generalized spectral analysis, which in turn produces a set of orthonormal functions. The method is unique in modeling the three velocity components simultaneously, and it is found that important cross-statistical features are reasonably well-behaved. It is concluded that the model provides a practical, operational simulator of atmospheric turbulence.
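
    A bare-bones version of the decomposition-and-synthesis idea is sketched below (Python): snapshots of a velocity field are decomposed into orthonormal modes by an SVD, and new realizations are synthesized by redrawing the modal coefficients. Gaussian coefficients are used here for brevity, whereas the paper specifically targets non-Gaussian statistics, so this is only a simplified stand-in.

        import numpy as np

        def pod_modes(snapshots, n_modes):
            """POD / Karhunen-Loeve decomposition of snapshots (rows = time samples,
            columns = spatial points or stacked velocity components)."""
            mean = snapshots.mean(axis=0)
            u, s, vt = np.linalg.svd(snapshots - mean, full_matrices=False)
            modes = vt[:n_modes]                      # orthonormal spatial modes
            coeffs = u[:, :n_modes] * s[:n_modes]     # modal coefficients per snapshot
            return mean, modes, coeffs

        def synthesize(mean, modes, coeffs, rng=np.random.default_rng()):
            """One synthetic realization with independently redrawn coefficients."""
            new_coeffs = rng.normal(0.0, coeffs.std(axis=0))
            return mean + new_coeffs @ modes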

  16. Multi-scale imaging and elastic simulation of carbonates

    Science.gov (United States)

    Faisal, Titly Farhana; Awedalkarim, Ahmed; Jouini, Mohamed Soufiane; Jouiad, Mustapha; Chevalier, Sylvie; Sassi, Mohamed

    2016-05-01

    Digital Rock Physics (DRP) is an emerging technology that can be used to generate high-quality, fast and cost-effective special core analysis (SCAL) properties compared to conventional experimental and modeling techniques. The primary workflow of DRP consists of three elements: 1) image the rock sample using high resolution 3D scanning techniques (e.g. micro-CT, FIB/SEM), 2) process and digitize the images by segmenting the pore and matrix phases, and 3) simulate the desired physical properties of the rocks, such as elastic moduli and velocities of wave propagation. A Finite Element Method based algorithm developed by Garboczi and Day [1], which discretizes the basic Hooke's Law equation of linear elasticity and solves it numerically using a fast conjugate gradient solver, is used for mechanical and elastic property simulations. This elastic algorithm works directly on the digital images by treating each pixel as an element. The images are assumed to have periodic, constant-strain boundary conditions. The bulk and shear moduli of the different phases are required inputs. For standard 1.5" diameter cores, however, the micro-CT scanning resolution (around 40 μm) does not resolve smaller micro- and nano-pores. This results in an unresolved "microporous" phase, whose moduli are uncertain. Knackstedt et al. [2] assigned effective elastic moduli to the microporous phase based on self-consistent theory (which gives a good estimation of velocities for well-cemented granular media). Jouini et al. [3] segmented the core plug CT scan image into three phases and assumed that the microporous phase is represented by a sub-extracted micro-plug (itself scanned using micro-CT). Currently, the elastic numerical simulations based on CT images alone largely overpredict the bulk, shear and Young's moduli when compared to laboratory acoustic tests of the same rocks. For greater accuracy of numerical simulation prediction, better estimates of moduli inputs

  17. Distributed Object Medical Imaging Model

    CERN Document Server

    Noor, Ahmad Shukri Mohd

    2009-01-01

    Digital medical informatics and images are commonly used in hospitals today. Because of the interrelatedness of the radiology department and other departments, especially the intensive care unit and emergency department, the transmission and sharing of medical images has become a critical issue. Our research group has developed a Java-based Distributed Object Medical Imaging Model (DOMIM) to facilitate the rapid development and deployment of medical imaging applications in a distributed environment that can be shared and used by related departments and mobile physicians. DOMIM is a unique suite of multimedia telemedicine applications developed for use by medically related organizations. The applications support real-time patient data, image files, audio and video diagnosis annotation exchanges. The DOMIM enables joint collaboration between radiologists and physicians while they are at distant geographical locations. The DOMIM environment consists of heterogeneous, autonomous, and legacy resources. The Common...

  18. The method of infrared image simulation based on the measured image

    Science.gov (United States)

    Lou, Shuli; Liu, Liang; Ren, Jiancun

    2015-10-01

    The development of infrared imaging guidance technology has promoted research on infrared imaging simulation, and the key to infrared imaging simulation is the generation of the IR image. The generation of IR images is valuable in both military and economic terms. In order to solve the problems of credibility and economy in infrared scene generation, a method of infrared scene generation based on measured images is proposed. Through research on the optical properties of the ship target and the sea background, ship-target images in various poses are extracted from recorded images using digital image processing techniques. The ship-target image is zoomed in and out to simulate the relative motion between the viewpoint and the target according to the field of view and the distance between the target and the sensor. The gray scale of the ship-target image is adjusted to simulate the change in radiation of the ship target according to the distance between the viewpoint and the target and the atmospheric transmission. Frames of recorded infrared images without the target are interpolated to simulate the high frame rate of the missile. The processed ship-target images and sea-background infrared images are synthesized to obtain infrared scenes for different viewpoints. Experiments proved that this method is flexible and applicable, and that the fidelity and reliability of the synthesized infrared images can be guaranteed.
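
    As a toy illustration of the zooming and gray-level adjustment steps described above, the sketch below rescales an extracted target chip to its apparent size at a given range, attenuates it by an atmospheric transmittance factor, and pastes it onto a background frame. All parameter names are illustrative and the geometry and radiometry are greatly simplified compared with the record's method.

        import numpy as np
        from scipy.ndimage import zoom

        def place_target(background, target_chip, distance_m, focal_px, target_size_m,
                         atm_transmittance, top_left_xy):
            """Scale a target chip by range, attenuate its gray levels, and insert it
            into a measured sea-background frame (no blending, for brevity)."""
            apparent_px = max(1, int(round(focal_px * target_size_m / distance_m)))
            scale = apparent_px / max(target_chip.shape)
            chip = zoom(target_chip.astype(float), scale) * atm_transmittance
            frame = background.astype(float).copy()
            x, y = top_left_xy
            h, w = chip.shape
            frame[y:y + h, x:x + w] = chip
            return frame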

  19. Noise simulation system for determining imaging conditions in digital radiography

    Science.gov (United States)

    Tanaka, R.; Ichikawa, K.; Matsubara, K.; Kawashima, H.

    2012-03-01

    Reduction of exposure dose and improvement in image quality can be expected to result from advances in the performance of imaging detectors. We propose a computerized method for determining optimized imaging conditions by use of simulated images. This study was performed to develop a prototype system for simulating image noise and to ensure consistency between the resulting images and actual images. An RQA5 X-ray spectrum was used for determination of the input-output characteristics of a flat-panel detector (FPD). The number of quanta incident on the detector per pixel (counts/pixel) was calculated according to the pixel size of the detector and the quantum number for RQA5 defined in IEC 62220-1. The relationship among tube current-time product (mAs), exposure dose (C/kg) at the detector surface, the number of incident quanta (counts/pixel), and pixel values measured on the images was established, and a conversion function was then created. The images obtained by the FPD were converted into a map of incident quantum numbers and input into a random-value generator to simulate image noise. In addition, a graphical user interface was developed to observe images while changing the image noise and exposure dose levels, which have a trade-off relationship. Simulated images produced at different noise levels were compared with actual images obtained with the FPD system. The results indicated that image noise was simulated properly in both objective and subjective evaluations. The present system could allow us to determine the necessary dose from image quality requirements and also to estimate image quality from any exposure dose.
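
    The core of such a noise simulator is to express each pixel as an expected number of incident quanta and draw a Poisson random value from it. The sketch below shows this step only; the single hypothetical gain factor stands in for the measured conversion function between pixel value and counts described above.

        import numpy as np

        def simulate_quantum_noise(pixel_values, counts_per_pixel_value, dose_scale=1.0, rng=None):
            """Convert pixel values to an incident-quanta map, apply Poisson (quantum)
            noise at the requested relative dose level, and convert back to pixel values."""
            rng = np.random.default_rng() if rng is None else rng
            expected_counts = pixel_values * counts_per_pixel_value * dose_scale
            noisy_counts = rng.poisson(expected_counts)
            return noisy_counts / (counts_per_pixel_value * dose_scale)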

  20. Image deblurring applied to infrared tongue position imaging: Initial simulation results

    Science.gov (United States)

    Poots, J. Kent

    2013-10-01

    This paper describes development work for a new biomedical application of image deblurring. Optical imaging is not currently used to assess tongue position during speech, nor is optical imaging the modality of choice for imaging tissue of moderate thickness. Tongue position assessment is important during rehabilitation. Optical imaging of biological tissue provides good contrast, but incident light is scattered, seriously restricting clinical usefulness. The paper describes simulation results for scattering correction and suggests possible directions for future work. Images are represented by sparse matrices.

  1. GF-7 Imaging Simulation and Dsm Accuracy Estimate

    Science.gov (United States)

    Yue, Q.; Tang, X.; Gao, X.

    2017-05-01

    The GF-7 satellite is a two-line-array stereo imaging satellite for surveying and mapping which will be launched in 2018. Its resolution is about 0.8 m at the subastral point, corresponding to a swath width of about 20 km, and the viewing angles of its forward and backward cameras are 5 and 26 degrees. This paper proposes an imaging simulation method for GF-7 stereo images. WorldView-2 stereo images were used as the basic data for the simulation. That is, we did not use a DSM and DOM as basic data (an "ortho-to-stereo" method) but used a "stereo-to-stereo" method, which better reflects the differences in geometry and radiation at different looking angles. The drawback is that geometric error is introduced by two factors: the difference in looking angles between the basic image and the simulated image, and inaccurate or missing ground reference data. We generated a DSM from the WorldView-2 stereo images. The WorldView-2 DSM was used not only as the reference DSM to estimate the accuracy of the DSM generated from the simulated GF-7 stereo images, but also as the "ground truth" to establish the relationship between WorldView-2 image points and simulated image points. Static MTF was simulated on the instantaneous focal plane "image" by filtering. SNR was simulated in the electronic sense, that is, the digital value of a WorldView-2 image point was converted to radiance and used as the radiance seen by the simulated GF-7 camera. This radiance is converted to an electron count n according to the physical parameters of the GF-7 camera. The noise electron count n1 is a random number between -√n and √n. The overall electron count accumulated by the TDI CCD is summed and converted to the digital value of the simulated GF-7 image. Sinusoidal curves with different amplitudes, frequencies and initial phases were used as attitude curves. Geometric installation errors of the CCD tiles were also simulated, considering rotation and translation factors. An accuracy estimate was made for the DSM generated
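
    The electronic SNR model described above can be sketched for a single pixel as follows. The gain and ADC values are placeholders, not GF-7 parameters; only the ±√n noise term and the TDI accumulation follow the description in the record.

        import numpy as np

        def simulate_pixel_dn(radiance, gain_e_per_radiance, tdi_stages, adc_e_per_dn, rng=None):
            """Radiance -> electrons per stage -> noisy accumulated electrons -> digital number."""
            rng = np.random.default_rng() if rng is None else rng
            n = radiance * gain_e_per_radiance                 # electrons per TDI stage
            total_e = 0.0
            for _ in range(tdi_stages):                        # TDI CCD accumulates the stages
                noise = rng.uniform(-np.sqrt(n), np.sqrt(n))   # noise electrons in [-sqrt(n), sqrt(n)]
                total_e += n + noise
            return total_e / adc_e_per_dn                      # convert electrons to a digital value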

  2. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  3. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  4. A study on quality and availability of COCTS images of HY- 1 satellite by simulation

    Institute of Scientific and Technical Information of China (English)

    李淑菁; 毛天明; 潘德炉

    2002-01-01

    HY-1 is China's first ocean color satellite, which will be launched as a piggyback payload on the FY-1 satellite using a Long March rocket. The satellite carries two sensors: one is the Chinese ocean color and temperature scanner (COCTS), the other is the CCD coastal zone imager (CZI). The COCTS is considered the main sensor and plays a key role. In order to understand the characteristics of the future ocean color images to be observed, a simulation and evaluation study of the quality and availability of COCTS images has been done. First, the simulation models are introduced briefly, and typical simulated cases of radiance images at visible bands are presented, in which the radiance distribution is based on geographic location, the satellite orbital parameters and the sensor properties. A method to evaluate image quality and availability is developed using a characteristic of the image called the complex signal-to-noise ratio (CSNR). Meanwhile, a series of CSNR images are generated from the simulated radiance components for different cases, which can be used to evaluate the quality and availability of COCTS images before HY-1 is placed in orbit. Finally, the quality and availability of COCTS images are quantitatively analyzed with the simulated CSNR data. The results will be beneficial to the scientists who are in charge of the COCTS mission and to those who plan to use data from the COCTS.

  5. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-01-25

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials.

  6. Edge detection based on Hodgkin-Huxley neuron model simulation.

    Science.gov (United States)

    Yedjour, Hayat; Meftah, Boudjelal; Lézoray, Olivier; Benyettou, Abdelkader

    2017-04-03

    In this paper, we propose a spiking neural network model for edge detection in images. The proposed model is biologically inspired by the mechanisms employed by natural vision systems, more specifically by the orientation-selective function of simple cells in the human primary visual cortex. Several aspects of this model are studied according to three characteristics: a feedforward spiking neural structure; a conductance-based Hodgkin-Huxley neuron model; and Gabor receptive field structures. A visualized map is generated using the firing rates of the neurons, representing the orientation map of the visual cortex area. We have simulated the proposed model on different images and obtained successful computer simulation results. For comparison, we have chosen five methods for edge detection. We finally evaluate and compare the performance of our model for contour detection using a public dataset of natural images with associated contour ground truths. Experimental results show the ability and high performance of the proposed network model.

  7. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    ... on the boiler) have been defined. Furthermore, a number of constraints related to minimum and maximum boiler load gradient, minimum boiler size, shrinking and swelling, and steam space load have been defined. For defining the constraints related to the required boiler volume, a dynamic model for simulating the boiler ... size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is an output of the model. The steam outputs, together with requirements with respect to steam space load, have been utilized to define ... of the boiler is (with an acceptable accuracy) proportional to the volume of the boiler. For the dynamic operation capability, a cost function penalizing limited dynamic operation capability (and vice versa) has been defined. The main idea is that, by means of the parameters in this function, it is possible to fit its...

  8. Simulating Longitudinal Brain MRIs with Known Volume Changes and Realistic Variations in Image Intensity.

    Science.gov (United States)

    Khanal, Bishesh; Ayache, Nicholas; Pennec, Xavier

    2017-01-01

    This paper presents a simulator tool that can simulate large databases of visually realistic longitudinal MRIs with known volume changes. The simulator is based on a previously proposed biophysical model of brain deformation due to atrophy in AD. In this work, we propose a novel way of reproducing realistic intensity variation in longitudinal brain MRIs, inspired by an approach used for the generation of synthetic cardiac sequence images. This approach combines a deformation field obtained from the biophysical model with a deformation field obtained by a non-rigid registration of two images. The combined deformation field is then used to simulate a new image with a specified atrophy from the first image, but with the intensity characteristics of the second image. This makes it possible to generate the realistic variations present in real longitudinal time-series of images, such as the independence of noise between two acquisitions and the potential presence of variable acquisition artifacts. Various options available in the simulator software are briefly explained in this paper. In addition, the software is released as an open-source repository. The availability of the software allows researchers to produce tailored databases of images with ground-truth volume changes; we believe this will help in developing more robust brain morphometry tools. Additionally, we believe that the scientific community can use the software to further experiment with the proposed model, and to add more complex models of brain deformation and atrophy generation.

  9. Comprehensive simulation of SEM images taking into account local and global electromagnetic fields

    Science.gov (United States)

    Babin, Sergey; Borisov, Sergey S.; Ito, Hiroyuki; Ivanchikov, Andrei; Matison, Dmitri; Militsin, Vladimir; Suzuki, Makoto

    2010-06-01

    We report the development of a simulation tool with unique capabilities to comprehensively model an SEM signal. This includes electron scattering, charging, and detector settings, as well as modeling of the local and global electromagnetic fields and the electron trajectories in these fields. Experimental and simulated results were compared for SEM imaging of carbon nanofibers embedded into bulk material in the presence of significant charging, as well as for samples with a potential applied to metal electrodes. The effect of the potentials applied to the electrodes on the secondary emission was studied, and the resulting SEM images were simulated. The image contrast depends strongly on the sign and the value of the potential. SEM imaging of nanofibers embedded into silicon dioxide resulted in a considerable change in the apparent dimensions of the fibers, as well as tone reversal, when the beam voltage was varied. The results of the simulations are in agreement with experimental results.

  10. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can cause perturbation of a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) compared to perturbation at non-singular points (placebo control points). Such differences diminish as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. This model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity and patient age. This is the first biological-physical model of acupuncture which can predict and guide clinical acupuncture research.
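
    The analogy drawn above is to classical simulated annealing. For reference, a generic sketch of the algorithm is given below; the energy function, neighbour move and cooling schedule are placeholders, and nothing in the code comes from the record itself.

        import math
        import random

        def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.95, steps=1000):
            """Generic simulated annealing loop: accept better states always and worse
            states with a Boltzmann probability that shrinks as the temperature cools."""
            x, e = x0, energy(x0)
            t = t0
            for _ in range(steps):
                cand = neighbor(x)
                e_cand = energy(cand)
                if e_cand < e or random.random() < math.exp((e - e_cand) / max(t, 1e-12)):
                    x, e = cand, e_cand
                t *= cooling        # gradual loss of excitation, the analogue of cooling
            return x, e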

  11. Fractal model for simulation of frost formation and growth

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    A planar fractal model for the simulation of frost formation and growth was proposed based on the diffusion-limited aggregation (DLA) model, and the computational simulation was carried out in this paper. By changing the number of program circulations and the ratio of random particles generated, simulation figures were obtained under different conditions. A microscope was used to observe the shape and structure of the frost layer, and a high-resolution digital camera was used to record the pattern of the frost layer at different times. Comparing the simulation figures with the experimental images, we find that the simulation results agree well with the experimental images in shape and that the fractal dimension of the simulation figures is nearly equal to that of the experimental images. The results indicate that it is reasonable to represent the frost layer growth time by the number of program circulations and to simulate the variation of frost layer density during growth by reducing the random particle generation probability. The feasibility of using the suggested model to simulate the process of frost formation and growth was thus justified. The insufficiencies of this fractal model and their causes are also discussed.
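
    The diffusion-limited aggregation process the record builds on can be sketched in a few lines: particles random-walk on a grid and stick when they touch the growing aggregate, with a sticking probability playing the role of the random particle generation probability discussed above. This is a generic illustration, not the authors' program.

        import numpy as np

        def dla_frost(grid_size=201, particles=2000, stick_prob=1.0, rng=None):
            """Minimal 2-D DLA: particles launched at random positions random-walk until
            they land next to the aggregate (seeded at the centre) and stick."""
            rng = np.random.default_rng() if rng is None else rng
            grid = np.zeros((grid_size, grid_size), dtype=bool)
            c = grid_size // 2
            grid[c, c] = True                                   # seed "frost nucleus"
            steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])
            for _ in range(particles):
                x, y = rng.integers(0, grid_size, size=2)
                for _ in range(10 * grid_size ** 2):            # walk budget per particle
                    dx, dy = steps[rng.integers(4)]
                    x, y = (x + dx) % grid_size, (y + dy) % grid_size
                    touching = (grid[(x + 1) % grid_size, y] or grid[(x - 1) % grid_size, y]
                                or grid[x, (y + 1) % grid_size] or grid[x, (y - 1) % grid_size])
                    if touching and rng.random() < stick_prob:  # lower stick_prob -> denser layer
                        grid[x, y] = True
                        break
            return grid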

  12. Uterine Contraction Modeling and Simulation

    Science.gov (United States)

    Liu, Miao; Belfore, Lee A.; Shen, Yuzhong; Scerbo, Mark W.

    2010-01-01

    Building a training system for medical personnel to properly interpret fetal heart rate tracing requires developing accurate models that can relate various signal patterns to certain pathologies. In addition to modeling the fetal heart rate signal itself, the change of uterine pressure that bears strong relation to fetal heart rate and provides indications of maternal and fetal status should also be considered. In this work, we have developed a group of parametric models to simulate uterine contractions during labor and delivery. Through analysis of real patient records, we propose to model uterine contraction signals by three major components: regular contractions, impulsive noise caused by fetal movements, and low amplitude noise invoked by maternal breathing and measuring apparatus. The regular contractions are modeled by an asymmetric generalized Gaussian function and least squares estimation is used to compute the parameter values of the asymmetric generalized Gaussian function based on uterine contractions of real patients. Regular contractions are detected based on thresholding and derivative analysis of uterine contractions. Impulsive noise caused by fetal movements and low amplitude noise by maternal breathing and measuring apparatus are modeled by rational polynomial functions and Perlin noise, respectively. Experiment results show the synthesized uterine contractions can mimic the real uterine contractions realistically, demonstrating the effectiveness of the proposed algorithm.
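
    The regular-contraction component described above can be illustrated with an asymmetric generalized Gaussian, i.e. a bell curve with different width and shape parameters on the rising and falling sides. The parameter names below are illustrative; in the record they are estimated from real patient tracings by least squares.

        import numpy as np

        def asymmetric_generalized_gaussian(t, peak_time, amplitude,
                                            alpha_rise, beta_rise, alpha_fall, beta_fall):
            """Skewed bell shape of a single contraction: a generalized Gaussian whose
            width (alpha) and shape (beta) differ before and after the peak."""
            t = np.asarray(t, dtype=float)
            rise = np.exp(-(np.abs(t - peak_time) / alpha_rise) ** beta_rise)
            fall = np.exp(-(np.abs(t - peak_time) / alpha_fall) ** beta_fall)
            return amplitude * np.where(t < peak_time, rise, fall)

        # Hypothetical example: a contraction of amplitude 50 peaking at t = 60 s
        t = np.arange(0.0, 120.0, 0.25)
        contraction = asymmetric_generalized_gaussian(t, 60.0, 50.0, 20.0, 2.0, 30.0, 2.5)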

  13. Computer simulation of the effects of a distributed array antenna on synthetic aperture radar images

    Science.gov (United States)

    Estes, J. M.

    1985-01-01

    The ARL:UT orbital SAR simulation has been upgraded to use three-dimensional antenna gain patterns. This report describes the modifications and presents quantitative image analyses of a simulation using antenna patterns generated from the modeling of a distributed array antenna.

  14. Scatterer reconstruction and parametrization of homogeneous tissue for ultrasound image simulation.

    Science.gov (United States)

    Mattausch, Oliver; Goksel, Orcun

    2015-01-01

    Numerical simulation of ultrasound images can facilitate the training of sonographers. A realistic appearance of simulated ultrasonic speckle is essential for a plausible ultrasound simulation. An efficient and realistic model for ultrasonic speckle is the convolution of the ultrasound point-spread function with a parametrized distribution of point scatterers. Nevertheless, for a given arbitrary tissue, scatterer distributions that would generate a realistic image are not known a priori, and currently there is no principled method to extract such scatterer patterns for given target tissues to be simulated. In this paper we propose to solve the inverse problem, in which an underlying scatterer map for a given sample ultrasound image is estimated. From such scatterer maps, it is also shown that a parametrized distribution model can be built, from which other instances of the same tissue can be simulated by feeding them into a standard speckle generation method. This enables us to synthesize images of different tissue types from actual ultrasound images, to be used in simulations with arbitrary view angles and transducer settings. We show in numerical and physical tissue-mimicking phantoms, and in actual physical tissue, that the appearance of the synthesized images closely matches the real images.
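
    The forward speckle model mentioned above (point-spread function convolved with a scatterer map) is easy to sketch; the inverse problem the paper actually solves is the hard part and is not shown here. Envelope detection and log compression below are common simplifications, not the authors' exact pipeline.

        import numpy as np
        from scipy.signal import fftconvolve

        def simulate_speckle(scatterer_map, psf):
            """Convolution speckle model: RF image = PSF * scatterer amplitudes;
            the displayed B-mode image is its log-compressed envelope."""
            rf = fftconvolve(scatterer_map, psf, mode="same")
            envelope = np.abs(rf)
            return 20.0 * np.log10(envelope / envelope.max() + 1e-6)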

  15. SIMULATION STUDY OF IMAGING OF UNDERWATER BOTTOM TOPOGRAPHY BY SYNTHETIC APERTURE RADAR

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Interaction between the current and underwater bottom topography modulates the roughness of the sea surface, which in turn yields variation of the radar scattering echo. Using this mechanism, this paper presents a simulation model for Synthetic Aperture Radar (SAR) imaging of underwater bottom topography. Numerical simulation experiments were carried out using the Princeton Ocean Model (POM) and an analytical SAR imaging theory for the Mischief sea area. It is concluded that the SAR image shows the topography more clearly when the water depth is shallow or the gradient of the underwater bottom topography is high.

  16. Application of Geostatistical Simulation to Enhance Satellite Image Products

    Science.gov (United States)

    Hlavka, Christine A.; Dungan, Jennifer L.; Thirulanambi, Rajkumar; Roy, David

    2004-01-01

    With the deployment of Earth Observing System (EOS) satellites that provide daily, global imagery, there is increasing interest in defining the limitations of the data and derived products due to their coarse spatial resolution. Much of the detail, i.e. small fragments and notches in boundaries, is lost with coarse resolution imagery such as the EOS MODerate-Resolution Imaging Spectroradiometer (MODIS) data. Higher spatial resolution data such as the EOS Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Landsat and airborne sensor imagery provide more detailed information but are less frequently available. There is, however, both theoretical and analytical evidence that burn scars and other fragmented types of land cover form self-similar or self-affine patterns, that is, patterns that look similar when viewed at widely differing spatial scales. Therefore small features of the patterns should be predictable, at least in a statistical sense, from knowledge about the large features. Recent developments in fractal modeling for characterizing the spatial distribution of undiscovered petroleum deposits are thus applicable to generating simulations of finer resolution satellite image products. We will present example EOS products, analysis to investigate self-similarity, and simulation results.

  17. Applications of Joint Tactical Simulation Modeling

    Science.gov (United States)

    1997-12-01

    Naval Postgraduate School thesis, Monterey, California: "Applications of Joint Tactical Simulation Modeling" by Steve VanLandingham, Lieutenant, United States Navy, December 1997. Approved for public release; distribution is unlimited. [Only the report documentation page was recovered for this record; no abstract is available.]

  18. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p... ... and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity. © IWA Publishing 2013.

  19. SWEEPOP a simulation model for Target Simulation Mode minesweeping

    NARCIS (Netherlands)

    Keus, H.E.; Beckers, A.L.D.; Cleophas, P.L.H.

    2005-01-01

    SWEEPOP is a flexible model that simulates the physical interaction between objects in a maritime underwater environment. The model was built to analyse the deployment and the performance of a Target Simulation Mode (TSM) minesweeping system for the Royal Netherlands Navy (RNLN) and to support its p

  20. Coded source imaging simulation with visible light

    Science.gov (United States)

    Wang, Sheng; Zou, Yubin; Zhang, Xueshuang; Lu, Yuanrong; Guo, Zhiyu

    2011-09-01

    A coded source can increase the neutron flux while maintaining a high L/D ratio, which may benefit a neutron imaging system with a low-yield neutron source. Visible light CSI experiments were carried out to test the physical design and the reconstruction algorithm. We used a non-mosaic Modified Uniformly Redundant Array (MURA) mask to project the shadow of black/white samples on a screen. A cooled-CCD camera was used to record the image on the screen. Different mask sizes and amplification factors were tested. Correlation, Wiener filter deconvolution and the Richardson-Lucy maximum likelihood iteration algorithm were employed to reconstruct the object image from the original projection. The results show that CSI can benefit low-flux neutron imaging with high background noise.
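
    A common reconstruction step for coded-source or coded-aperture data is cross-correlation of the recorded projection with a balanced decoding array derived from the mask; the sketch below shows that step for a binary mask. The 2A - 1 decoder is the usual choice for URA/MURA-type masks, but the details depend on the specific mask design and are not taken from the record.

        import numpy as np

        def correlation_decode(projection, mask):
            """Circular cross-correlation of the projection with the balanced decoder
            (open -> +1, opaque -> -1); projection and mask are assumed the same size."""
            decoder = 2.0 * mask - 1.0
            rec = np.fft.ifft2(np.fft.fft2(projection) * np.conj(np.fft.fft2(decoder)))
            return np.real(rec)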

  1. Fast and Automatic Ultrasound Simulation from CT Images

    OpenAIRE

    Weijian Cong; Jian Yang; Yue Liu; Yongtian Wang

    2013-01-01

    Ultrasound is currently widely used in clinical diagnosis because of its fast and safe imaging principles. However, the anatomical structures present in an ultrasound image are not as clear as in CT or MRI, so physicians usually need advanced clinical knowledge and experience to distinguish diseased tissues. Fast simulation of ultrasound provides a cost-effective way to train and to correlate ultrasound with the anatomic structures. In this paper, a novel method is proposed for fast simulation of...

  2. Simulation and evaluation of the quality and availability of typical GLI ocean image

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The ADEOS-II satellite will be launched in the near future. It carries many remote sensing instruments on the same platform. Among them, the Global Imager (GLI) is considered a main sensor which will play a key role. In order to understand the characteristics of future GLI ocean color images, a simulation and evaluation of the quality and availability of a typical GLI ocean image has been done. In the paper, we first introduce the simulation models briefly and simulate typical cases of radiance images at visible channels, in which the radiance distribution is based on geographic location, the satellite orbital parameters and the sensor properties. A method to evaluate image quality and availability, the complex signal-to-noise ratio (CSNR), is developed according to the characteristics of the image. Meanwhile, a series of CSNR images are generated from the simulated radiance components for different cases, which can be used to evaluate the quality and availability of GLI images before ADEOS-II is placed in orbit. Finally, the quality and availability of GLI images are quantitatively analyzed using the simulated CSNR data. The results will be beneficial to the people who are in charge of the GLI mission or who plan to use the data from GLI.

  3. Adaptive image ray-tracing for astrophysical simulations

    CERN Document Server

    Parkin, E R

    2010-01-01

    A technique is presented for producing synthetic images from numerical simulations whereby the image resolution is adapted around prominent features. In so doing, adaptive image ray-tracing (AIR) improves the efficiency of a calculation by focusing computational effort where it is needed most. The results of test calculations show that a speed-up by a factor of ≳4, and a commensurate reduction in the number of pixels required in the final image, can be achieved compared to an equivalent calculation with a fixed resolution image.

  4. Synthetic SAR Image Generation using Sensor, Terrain and Target Models

    DEFF Research Database (Denmark)

    Kusk, Anders; Abulaitijiang, Adili; Dall, Jørgen

    2016-01-01

    A tool to generate synthetic SAR images of objects set on a clutter background is described. The purpose is to generate images for training Automatic Target Recognition and Identification algorithms. The tool employs a commercial electromagnetic simulation program to calculate radar cross sections of the object using a CAD model. The raw measurements are input to a SAR system and terrain model, which models thermal noise, terrain clutter, and SAR focusing to produce synthetic SAR images. Examples of SAR images at 0.3 m and 0.1 m resolution, and a comparison with real SAR imagery from the MSTAR dataset...

  5. Sources of Image Degradation in Fundamental and Harmonic Ultrasound Imaging: A Nonlinear, Full-Wave, Simulation Study

    Science.gov (United States)

    Pinton, Gianmarco F.; Trahey, Gregg E.; Dahl, Jeremy J.

    2011-01-01

    A full-wave equation that describes nonlinear propagation in a heterogeneous attenuating medium is solved numerically with finite differences in the time domain (FDTD). This numerical method is used to simulate propagation of a diagnostic ultrasound pulse through a measured representation of the human abdomen with heterogeneities in speed of sound, attenuation, density, and nonlinearity. Conventional delay-and-sum beamforming is used to generate point spread functions (PSF) that display the effects of these heterogeneities. For the particular imaging configuration that is modeled, these PSFs reveal that the primary source of degradation in fundamental imaging is due to reverberation from near-field structures. Compared to fundamental imaging, reverberation clutter in harmonic imaging is 27.1 dB lower. Simulated tissue with uniform velocity but unchanged impedance characteristics indicates that for fundamental imaging, the primary source of degradation is phase aberration. PMID:21507753

  6. Erratum: Sources of Image Degradation in Fundamental and Harmonic Ultrasound Imaging: A Nonlinear, Full-Wave, Simulation Study

    Science.gov (United States)

    Pinton, Gianmarco F.; Trahey, Gregg E.; Dahl, Jeremy J.

    2015-01-01

    A full-wave equation that describes nonlinear propagation in a heterogeneous attenuating medium is solved numerically with finite differences in the time domain. This numerical method is used to simulate propagation of a diagnostic ultrasound pulse through a measured representation of the human abdomen with heterogeneities in speed of sound, attenuation, density, and nonlinearity. Conventional delay-and-sum beamforming is used to generate point spread functions (PSFs) that display the effects of these heterogeneities. For the particular imaging configuration that is modeled, these PSFs reveal that the primary source of degradation in fundamental imaging is due to reverberation from near-field structures. Compared with fundamental imaging, reverberation clutter in harmonic imaging is 27.1 dB lower. Simulated tissue with uniform velocity but unchanged impedance characteristics indicates that for harmonic imaging, the primary source of degradation is phase aberration. PMID:21693410

  7. SEMICONDUCTOR DEVICES: Two-dimensional pixel image lag simulation and optimization in a 4-T CMOS image sensor

    Science.gov (United States)

    Junting, Yu; Binqiao, Li; Pingping, Yu; Jiangtao, Xu; Cun, Mou

    2010-09-01

    Pixel image lag in a 4-T CMOS image sensor is analyzed and simulated in a two-dimensional model. Strategies for reducing image lag are discussed, covering transfer gate channel threshold voltage doping adjustment, PPD N-type doping dose/implant tilt adjustment and transfer gate operation voltage adjustment for signal electron transfer. With the computer analysis tool ISE-TCAD, simulation results show that minimum image lag can be obtained at a pinned photodiode n-type doping dose of 7.0 × 10¹² cm⁻², an implant tilt of -2°, a transfer gate channel doping dose of 3.0 × 10¹² cm⁻² and an operation voltage of 3.4 V. The conclusions of this theoretical analysis can be a guideline for pixel design to improve the performance of 4-T CMOS image sensors.

  8. Three-dimensional medical images and its application for surgical simulation of plastic and reconstructive surgery

    Energy Technology Data Exchange (ETDEWEB)

    Kaneko, Tsuyoshi; Kobayashi, Masahiro; Nakajima, Hideo; Fujino, Toyomi (Keio Univ., Tokyo (Japan). School of Medicine)

    1992-12-01

    The authors' three surgical simulation systems are presented. First, a computer graphics surgical simulation system has been developed which builds a three-dimensional skull image from CT scans and makes arbitrary osteotomy, mobilization of bone segments and prediction of the post-operative appearance possible. The second system is solid modeling of the skull using laser-curable resin, and it is concluded that the life-sized skull model is useful not only for surgical simulation of major craniofacial surgery but also for educational purposes. The third is solid modeling of the ear using non-contact 3-D shape measurement with a slit laser scanner. A mirror-image, life-sized wax model is made from the normal side of the ear, and the autologous cartilage framework is assembled to match the wax model; thus a precise three-dimensional reconstruction of the auricle is made possible. (author).

  9. Techniques and Simulation Models in Risk Management

    OpenAIRE

    Mirela GHEORGHE

    2012-01-01

    In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then continues in the setting of the practical reality, thus providing simulation models for a broad range of inherent risks specific to any organization and simulation of those models, using the informatics instrument @Risk (Palisade). The reason behind this research lies in the need for simulation models that will allow the person in charge with decision taking i...

  10. Satellite image time series simulation for environmental monitoring

    Science.gov (United States)

    Guo, Tao

    2014-11-01

    The performance of environmental monitoring depends heavily on the availability of consecutive observation data, and there is an increasing demand in the remote sensing community for satellite image data of sufficient resolution with respect to both spatial and temporal requirements, which appear to be conflicting and involve hard-to-tune tradeoffs. Multiple constellations could be a solution if cost were not a concern, and it is therefore interesting but very challenging to develop a method which can simultaneously improve both spatial and temporal detail. There have been research efforts to deal with the problem from various aspects. One type of approach is to enhance the spatial resolution using techniques such as super resolution and pan-sharpening, which can produce good visual effects but mostly cannot preserve spectral signatures, and thus lose analytical value. Another type is to fill temporal frequency gaps by time interpolation, which does not actually add informative content at all. In this paper we present a novel method to generate satellite images with higher spatial and temporal detail, which further enables satellite image time series simulation. Our method starts with a pair of high/low resolution data sets, and a spatial registration is then done by introducing an LDA model to map high and low resolution pixels correspondingly. Afterwards, temporal change information is captured through a comparison of low resolution time series data, and the temporal change is then projected onto the high resolution data plane and assigned to each high resolution pixel, referring to the predefined temporal change patterns of each type of ground object, to generate a simulated high resolution image. A preliminary experiment shows that our method can simulate high resolution data with good accuracy. We consider the contribution of our method to be enabling timely monitoring of temporal changes through analysis of low resolution image time series only, and usage of

  11. MEGACELL: A nanocrystal model construction software for HRTEM multislice simulation

    Energy Technology Data Exchange (ETDEWEB)

    Stroppa, Daniel G., E-mail: dstroppa@lnls.br [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); Mechanical Engineering School, University of Campinas, 13083-860 Campinas, SP (Brazil); Righetto, Ricardo D. [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); School of Electrical and Computer Engineering, University of Campinas, 13083-852 Campinas, SP (Brazil); Montoro, Luciano A. [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); Ramirez, Antonio J. [Brazilian Synchrotron Light Laboratory, 13083-970 Campinas, SP (Brazil); Mechanical Engineering School, University of Campinas, 13083-860 Campinas, SP (Brazil)

    2011-07-15

    Image simulation has an invaluable importance for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to its non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process including the simulation of a nanocrystal model and its comparison with experimental images. However, most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone axis orientations. This paper presents an open source citation-ware tool named MEGACELL, which was developed to assist in the construction of nanocrystal models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve the atomic positions either as a plain text file or as an output compatible with the EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool's features, some construction examples and its application to scientific studies are presented. These studies show MEGACELL to be a handy tool, which allows easier construction of complex nanocrystal models and improves the quantitative information extraction from HRTEM images. Highlights: ► A software tool to support HRTEM image simulation of nanocrystals at actual size. ► MEGACELL allows the construction of complex nanocrystal models for multislice image simulation. ► Some examples of improved nanocrystalline system characterization are presented, including the analysis of 3D morphology and growth behavior.
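
    The construction idea described above (a lattice clipped by a convex polyhedron) can be sketched as follows. This is an illustrative stand-in, not MEGACELL itself: a simple cubic lattice with an optional basis is filled and only atoms satisfying every half-space constraint n·r ≤ d are kept.

        import numpy as np

        def build_nanocrystal(lattice_constant, half_spaces, basis=((0.0, 0.0, 0.0),), extent=10):
            """Return atom positions of a finite crystal bounded by the given facets."""
            idx = np.arange(-extent, extent + 1)
            cells = np.array(np.meshgrid(idx, idx, idx)).reshape(3, -1).T * lattice_constant
            atoms = np.concatenate([cells + np.asarray(b) * lattice_constant for b in basis])
            keep = np.ones(len(atoms), dtype=bool)
            for normal, dist in half_spaces:                    # each facet: keep n . r <= d
                keep &= atoms @ np.asarray(normal, dtype=float) <= dist
            return atoms[keep]

        # Hypothetical example: a cube truncated by two {111}-like facets
        faces = [((1, 0, 0), 20), ((-1, 0, 0), 20), ((0, 1, 0), 20), ((0, -1, 0), 20),
                 ((0, 0, 1), 20), ((0, 0, -1), 20), ((1, 1, 1), 30), ((-1, -1, -1), 30)]
        positions = build_nanocrystal(4.0, faces)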

  12. A framework of modeling detector systems for computed tomography simulations

    Science.gov (United States)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at a patient dose as low as possible. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measured results from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
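
    In a cascaded linear-systems model, the mean and variance of the quantum signal are propagated through a chain of gain (or selection) stages, and electronic noise is added at the end. The sketch below uses the standard stage relations; the stage values in the example are placeholders, not the parameters of the system studied in the record.

        import numpy as np

        def gain_stage(mean_in, var_in, gain, gain_variance):
            """One stage: mean_out = g * mean_in, var_out = g^2 * var_in + var(g) * mean_in."""
            return gain * mean_in, gain ** 2 * var_in + gain_variance * mean_in

        def detector_chain(q0, stages, electronic_noise_var=0.0):
            """Propagate a Poisson quantum fluence q0 (mean = variance) through
            (gain, gain_variance) stages and add electronic noise at the output."""
            mean, var = q0, q0
            for g, vg in stages:
                mean, var = gain_stage(mean, var, g, vg)
            return mean, var + electronic_noise_var

        # Hypothetical example: absorption (binomial), conversion to optical photons,
        # optical coupling (binomial), then 50 e- RMS electronic noise.
        m, v = detector_chain(1e4, [(0.7, 0.7 * 0.3), (1500.0, 300.0), (0.6, 0.6 * 0.4)], 50.0 ** 2)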

  13. Comparison of image deconvolution algorithms on simulated and laboratory infrared images

    Energy Technology Data Exchange (ETDEWEB)

    Proctor, D. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    We compare Maximum Likelihood, Maximum Entropy, Accelerated Lucy-Richardson, Weighted Goodness of Fit, and Pixon reconstructions of simple scenes as a function of signal-to-noise ratio for simulated images with randomly generated noise. Reconstruction results of infrared images taken with the TAISIR (Temperature and Imaging System InfraRed) are also discussed.
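
    Of the algorithms named above, the Lucy-Richardson method is the simplest to sketch: it is maximum-likelihood deconvolution for Poisson noise, implemented as a multiplicative fixed-point iteration. The version below is the plain, unaccelerated form and is only illustrative.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(image, psf, iterations=30):
            """Multiplicative update: estimate <- estimate * (PSF^T * (image / (PSF * estimate)))."""
            estimate = np.full_like(image, image.mean(), dtype=float)
            psf_flipped = psf[::-1, ::-1]
            for _ in range(iterations):
                blurred = fftconvolve(estimate, psf, mode="same") + 1e-12
                estimate *= fftconvolve(image / blurred, psf_flipped, mode="same")
            return estimate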

  14. Simulation of scanning transmission electron microscope images on desktop computers

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, C., E-mail: christian.dwyer@mcem.monash.edu.au [Monash Centre for Electron Microscopy, Department of Materials Engineering, Monash University, Victoria 3800 (Australia)

    2010-02-15

    Two independent strategies are presented for reducing the computation time of multislice simulations of scanning transmission electron microscope (STEM) images: (1) optimal probe sampling, and (2) the use of desktop graphics processing units. The first strategy is applicable to STEM images generated by elastic and/or inelastic scattering, and requires minimal effort for its implementation. Used together, these two strategies can reduce typical computation times from days to hours, allowing practical simulation of STEM images of general atomic structures on a desktop computer.

  15. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present ... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process ... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both.

  16. Simulations of multi-contrast x-ray imaging using near-field speckles

    Energy Technology Data Exchange (ETDEWEB)

    Zdora, Marie-Christine [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom and Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Thibault, Pierre [Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Herzen, Julia; Pfeiffer, Franz [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Zanette, Irene [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE (United Kingdom); Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany)

    2016-01-28

    X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes limitations inherent in conventional absorption x-ray imaging, i.e. poor contrast for features with similar density. Speckle-based imaging yields a wealth of information with a simple setup tolerant to polychromatic and divergent beams, and simple data acquisition and analysis procedures. Here, we present a simulation software used to model the image formation with the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help for better understanding and optimising the technique itself.

  17. SAR imaging simulation for an inhomogeneous undulated lunar surface based on triangulated irregular network

    Institute of Scientific and Technical Information of China (English)

    FA WenZhe; XU Feng; JIN YaQiu

    2009-01-01

    Based on the statistics of the lunar cratered terrain, e.g., the population, dimensions and shapes of craters, the terrain features of the cratered lunar surface are numerically generated. According to the inhomogeneous distribution of the lunar surface slope, a triangulated irregular network (TIN) is employed to build the digital elevation model of the lunar surface. The Kirchhoff approximation of surface scattering is then applied to the simulation of lunar surface scattering. The synthetic aperture radar (SAR) image of the comprehensive cratered lunar surface is numerically generated using the back projection (BP) algorithm of SAR imaging. Using the digital elevation and Clementine UVVIS data at the Apollo 15 landing site as the ground truth, an SAR image at the Apollo 15 landing site is simulated. The image simulation is verified using a real SAR image and echo statistics.

  18. SIMULATION OF SHIP GENERATED TURBULENT AND VORTICAL WAKE IMAGING BY SAR

    Institute of Scientific and Technical Information of China (English)

    Wang Aiming; Zhu Minhui

    2004-01-01

    Synthetic Aperture Radar (SAR) imaging of ocean surface features is studied. The simulation of the turbulent and vortical features generated by a moving ship, and SAR imaging of these wakes, is carried out. The turbulent wake, which damps the ocean surface capillary waves, may be partially responsible for the suppression of surface waves near the ship track. The vortex pair, which generates a change in the lateral flow field behind the ship, may be partially responsible for an enhancement of the waves near the edges of the smooth area. These hydrodynamic phenomena, as well as the changes of radar backscatter generated by the turbulence and vortex, are simulated. An SAR imaging model is then applied to these ocean surface features to produce SAR images. Comparison of two ships' simulated SAR images shows that the wake features differ for different ship parameters.

  19. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation.

    Science.gov (United States)

    Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A

    2012-01-01

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images into voxel anthropomorphic models for the simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom, and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of (60)Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for the axial, coronal and sagittal projections, respectively.

  20. Use of Airborne Hyperspectral Data in the Simulation of Satellite Images

    Science.gov (United States)

    de Miguel, Eduardo; Jimenez, Marcos; Ruiz, Elena; Salido, Elena; Gutierrez de la Camara, Oscar

    2016-08-01

    The simulation of future images is part of the development phase of most Earth Observation missions. This simulation frequently uses images acquired from airborne instruments as a starting point. These instruments provide the required flexibility in acquisition parameters (time, date, illumination and observation geometry, ...) and high spectral and spatial resolution, well above the target values (as required by simulation tools). However, there are a number of important problems hampering the use of airborne imagery. One of these problems is that observation zenith angles (OZA) are far from those that the missions to be simulated would use. We examine this problem by evaluating the difference in ground reflectance estimated from airborne images for different observation/illumination geometries. Next, we analyze a solution for simulation purposes, in which a Bidirectional Reflectance Distribution Function (BRDF) model is attached to an image of the isotropic surface reflectance. The results obtained confirm the need for reflectance anisotropy correction when using airborne images to create a reflectance map for simulation purposes. However, this correction should not be used without providing the corresponding estimation of the BRDF, in the form of model parameters, to the simulation teams.

  1. Distributed Object Medical Imaging Model

    Directory of Open Access Journals (Sweden)

    Ahmad Shukri Mohd Noor

    2009-09-01

    Full Text Available Digital medical informatics and images are commonly used in hospitals today. Because of the interrelatedness of the radiology department and other departments, especially the intensive care unit and emergency department, the transmission and sharing of medical images has become a critical issue. Our research group has developed a Java-based Distributed Object Medical Imaging Model (DOMIM) to facilitate the rapid development and deployment of medical imaging applications in a distributed environment that can be shared and used by related departments and mobile physicians. DOMIM is a unique suite of multimedia telemedicine applications developed for use by medical related organizations. The applications support real-time exchange of patients' data, image files, and audio and video diagnosis annotations. DOMIM enables joint collaboration between radiologists and physicians while they are at distant geographical locations. The DOMIM environment consists of heterogeneous, autonomous, and legacy resources. The Common Object Request Broker Architecture (CORBA), Java Database Connectivity (JDBC), and the Java language provide the capability to combine the DOMIM resources into an integrated, interoperable, and scalable system. The underlying technology, including IDL, ORB, Event Service, IIOP, JDBC/ODBC, legacy system wrapping and Java implementation, is explored. This paper explores a distributed collaborative CORBA/JDBC based framework that will enhance medical information management requirements and development. It encompasses a new paradigm for the delivery of health services that requires process reengineering, cultural changes, as well as organizational changes.

  2. Recalling of Images using Hopfield Neural Network Model

    CERN Document Server

    Ramya, C; Shreedhara, Dr K S

    2011-01-01

    In the present paper, an effort has been made to store and recall images with a Hopfield neural network model of auto-associative memory. Images are stored by calculating a corresponding weight matrix. Thereafter, starting from an arbitrary configuration, the memory will settle on exactly that stored image which is nearest to the starting configuration in terms of Hamming distance. Thus, given an incomplete or corrupted version of a stored image, the network is able to recall the corresponding original image. The storing of the objects has been performed according to the Hopfield algorithm explained below. Once the net has completely learnt this set of input patterns, a set of testing patterns containing degraded images is given to the net. The Hopfield net then tends to recall the closest matching pattern for the given degraded image. The simulated results show that the Hopfield model is the best for storing and recalling images.
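
    A minimal sketch of Hebbian storage and asynchronous recall in a Hopfield network follows, assuming bipolar (-1/+1) image vectors; it is not the authors' code and omits details such as convergence checks.

        import numpy as np

        def train_hopfield(patterns):
            """Hebbian weight matrix: sum of outer products of the stored patterns, zero diagonal."""
            p = np.asarray(patterns, dtype=float)
            w = p.T @ p
            np.fill_diagonal(w, 0.0)
            return w / len(p)

        def recall(w, probe, sweeps=20, rng=None):
            """Asynchronous updates from a degraded probe until the state settles on the
            stored pattern nearest in Hamming distance."""
            rng = np.random.default_rng() if rng is None else rng
            state = np.asarray(probe, dtype=float).copy()
            for _ in range(sweeps):
                for i in rng.permutation(len(state)):
                    state[i] = 1.0 if w[i] @ state >= 0.0 else -1.0
            return state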

  3. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo

    2014-01-01

    ...... on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measures the pressure field for three imaging schemes: a fixed focus, single emission scheme, a duplex vector flow scheme, and finally a vector flow imaging scheme. The hydrophone is connected to a receive channel in SARUS, which automatically measures the emitted pressure for the complete imaging sequence. MI can be predicted with an accuracy of 16.4 to 38 %. The accuracy for the intensity is from -17.6 to 9.7 %, although the measured fields are highly non-linear (several MPa) and linear simulation is used. Linear simulation can, thus, be used to accurately predict intensity levels for any advanced imaging sequence and is an efficient tool in predicting the energy distribution......

  4. An Object-Oriented Simulator for 3D Digital Breast Tomosynthesis Imaging System

    Directory of Open Access Journals (Sweden)

    Saeed Seyyedi

    2013-01-01

    Full Text Available Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast to detect breast cancer. Projections obtained with an X-ray source moving over a limited angular interval are used to reconstruct the 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for 3D digital breast tomosynthesis (DBT) imaging systems using the C++ programming language. The simulator is capable of applying different iterative and compressed sensing based reconstruction methods to 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total variation regularized reconstruction (ART+TV), are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating their performance using mean structural similarity (MSSIM) values.
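
    The ART update mentioned above follows the classical Kaczmarz scheme: each measurement row defines a hyperplane and the current estimate is relaxed toward it in turn. A minimal sketch of the general method (an assumption about the standard algorithm, not the simulator's C++ code) is shown below.

```python
import numpy as np

def art_sweep(A, b, x, relax=0.5):
    """One ART (Kaczmarz) sweep: project x toward each measurement hyperplane in turn."""
    for i in range(A.shape[0]):
        ai = A[i]
        denom = ai @ ai
        if denom > 0:
            x += relax * (b[i] - ai @ x) / denom * ai
    return x

def art(A, b, n_sweeps=10):
    """Run several ART sweeps starting from a zero image."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        x = art_sweep(A, b, x)
    return x

# toy 2-pixel "phantom" observed through 3 ray sums
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_x = np.array([2.0, 3.0])
b = A @ true_x
print(art(A, b))   # converges toward [2, 3]
```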

  5. Simulation of high quality ultrasound imaging

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Nikolov, Svetoslav; Jensen, Jørgen Arendt

    In this paper the influence of using an idealized transducer model (ITM) and a realistic transducer model (RTM) is investigated in a comparative study between Synthetic Aperture Sequential Beamformation (SASB) and Dynamic Receive Focus (DRF)....

  6. Modeling and simulation of integrated luminescence detection platforms

    Science.gov (United States)

    Salama, Khaled; Eltoukhy, Helmy; Hassibi, Arjang; El Gamal, Abbas

    2003-07-01

    We developed a simulation model of an integrated CMOS-based imaging platform for use with bioluminescent DNA microarrays. We formulate the complete kinetic model of ATP-based assays and luciferase label-based assays. The model first calculates the number of photons generated per unit time, i.e., the photon flux, based upon the kinetics of the light generation process of luminescence probes. The photon flux, coupled with the system geometry, is then used to calculate the number of photons incident on the photodetector plane. Subsequently, the characteristics of the imaging array, including the photodetector spectral response, its dark current density, and the sensor conversion gain, are incorporated. The model also takes into account different noise sources, including shot noise, reset noise, readout noise and fixed pattern noise. Finally, signal processing algorithms are applied to the image to enhance detection reliability and hence increase the overall system throughput. We will present simulations and preliminary experimental results.
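
    The signal-chain portion of such a model (photon flux to photoelectrons to digital number, with shot, dark, read and fixed-pattern noise) can be sketched as below. All parameter values are illustrative assumptions, not those of the reported platform.

```python
import numpy as np

def simulate_pixel_signal(photon_flux, t_int, qe=0.4, dark_current_e=50.0,
                          read_noise_e=10.0, prnu=0.01, conv_gain=0.05,
                          full_well=30000, rng=None):
    """Convert a photon flux (photons/s per pixel) into a digital number.

    Includes photon shot noise, dark-current shot noise, read/reset noise and a
    simple gain-type fixed pattern noise (PRNU). Values are illustrative only."""
    rng = rng or np.random.default_rng()
    mean_photons = photon_flux * t_int
    photons = rng.poisson(mean_photons)                   # photon shot noise
    electrons = rng.binomial(photons, qe)                 # quantum efficiency
    electrons += rng.poisson(dark_current_e * t_int)      # dark-current shot noise
    electrons = electrons * (1.0 + prnu * rng.standard_normal())  # fixed-pattern (gain) noise
    electrons += read_noise_e * rng.standard_normal()     # read and reset noise lumped together
    electrons = np.clip(electrons, 0, full_well)
    return electrons * conv_gain                          # conversion gain -> digital number

print(simulate_pixel_signal(photon_flux=2000.0, t_int=1.0))
```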

  7. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  8. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  9. Computer simulation of the micropulse imaging lidar

    Science.gov (United States)

    Dai, Yongjiang; Zhao, Hongwei; Zhao, Yu; Wang, Xiaoou

    2000-10-01

    In this paper, a design method for the Micro Pulse Lidar (MPL), namely a computer simulation of the MPL, is introduced. Some of the MPL parameters related to atmospheric scattering and their effects on the performance of the lidar are discussed. The design software for a lidar with a diode-pumped solid-state laser is programmed in MATLAB. This software consists of six modules: transmitter, atmosphere, target, receiver, processor and display system. The method can be extended to other kinds of lidar.
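
    The transmitter/atmosphere/target/receiver chain described above is typically tied together by the single-scattering lidar range equation. A minimal sketch of that equation is given below; the parameter values are illustrative assumptions and do not reproduce the paper's MATLAB modules.

```python
import numpy as np

def mpl_return_power(P0, R, beta, alpha, A_rx, eta=0.8, overlap=1.0, tau=10e-9):
    """Single-scattering lidar equation for the received power from range R.

    P0      transmitted peak power [W]
    beta    volume backscatter coefficient [1/(m*sr)]
    alpha   extinction coefficient [1/m] (assumed constant along the path)
    A_rx    receiver telescope area [m^2]
    eta     overall system efficiency, overlap: geometric overlap factor
    tau     pulse width [s]
    """
    c = 3.0e8
    transmission = np.exp(-2.0 * alpha * R)        # two-way atmospheric attenuation
    return P0 * (c * tau / 2.0) * beta * (A_rx / R**2) * eta * overlap * transmission

# received power falls off roughly as 1/R^2 times the two-way transmission
ranges = np.linspace(100.0, 5000.0, 5)
print(mpl_return_power(P0=10.0, R=ranges, beta=1e-6, alpha=1e-4, A_rx=0.03))
```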

  10. Simulation and Modeling Application in Agricultural Mechanization

    Directory of Open Access Journals (Sweden)

    R. M. Hudzari

    2012-01-01

    Full Text Available This experiment was conducted to determine the equations relating the Hue digital values of the oil palm fruit surface to the maturity stage of the fruit in the plantation. The FFB images were zoomed and captured using a Nikon digital camera, and the Hue was calculated using the highest-frequency value of the R, G, and B color components from histogram analysis software. A new procedure for monitoring the image pixel values of the oil palm fruit surface color during real-time growth to maturity was developed. The estimated harvesting day was predicted based on the developed model of the relationship between Hue values and mesocarp oil content. The simulation model is regressed and predicts the day of harvesting, or the number of days before harvest, of the FFB. The results of the mesocarp oil content experiment can be used for real-time oil content determination with the MPOB color meter. The graph used to determine the day of harvesting the FFB is presented in this research. The oil was found to start developing in the mesocarp 65 days before the fruit reaches the ripe maturity stage of 75% oil to dry mesocarp.

  11. Safety Assessment of Advanced Imaging Sequences II: Simulations

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    An automatic approach for simulating the emitted pressure, intensity, and MI of advanced ultrasound imaging sequences is presented. It is based on a linear simulation of pressure fields using Field II, and it is hypothesized that linear simulation can attain the needed accuracy for predicting...... and Ita.3 closely matches that for the measurement, and simulations can therefore be used to select the region for measuring the intensities, resulting in a significant reduction in measurement time. It can validate emission sequences by showing symmetry of emitted pressure fields, focal position...

  12. LISP: a laser imaging simulation package for developing and testing laser vision systems

    Science.gov (United States)

    Wu, Kung C.

    1993-01-01

    The difficulties commonly encountered in developing laser imaging technologies are: (1) the high cost of the laser system, and (2) the time and cost involved in modeling and maneuvering a physical environment for the desired scenes. In contrast to real imaging systems, computer-generated laser images provide researchers with fast, accurate, cost-effective data for testing and developing algorithms. The laser imaging simulation package (LISP) described in this paper provides an interactive solid modeler that allows users to construct an artificial environment using various solid modeling techniques. Two fast ray tracing algorithms were developed and are discussed in this paper for generating near-realistic laser data of any desired scene. These computer-generated laser data facilitate the development of laser imaging algorithms. Thus, LISP provides not only an ideal testbed for developing and testing algorithms, but also an opportunity to explore the limitations of laser imaging applications.

  13. Structured building model reduction toward parallel simulation

    Energy Technology Data Exchange (ETDEWEB)

    Dobbs, Justin R. [Cornell University; Hencey, Brondon M. [Cornell University

    2013-08-26

    Building energy model reduction exchanges accuracy for improved simulation speed by reducing the number of dynamical equations. Parallel computing aims to improve simulation times without loss of accuracy but is poorly utilized by contemporary simulators and is inherently limited by inter-processor communication. This paper bridges these disparate techniques to implement efficient parallel building thermal simulation. We begin with a survey of three structured reduction approaches that compares their performance to a leading unstructured method. We then use structured model reduction to find thermal clusters in the building energy model and allocate processing resources. Experimental results demonstrate faster simulation and low error without any interprocessor communication.

  14. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  15. An introduction to enterprise modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ostic, J.K.; Cannon, C.E. [Los Alamos National Lab., NM (United States). Technology Modeling and Analysis Group

    1996-09-01

    As part of an ongoing effort to continuously improve productivity, quality, and efficiency of both industry and Department of Energy enterprises, Los Alamos National Laboratory is investigating various manufacturing and business enterprise simulation methods. A number of enterprise simulation software models are being developed to enable engineering analysis of enterprise activities. In this document the authors define the scope of enterprise modeling and simulation efforts, and review recent work in enterprise simulation at Los Alamos National Laboratory as well as at other industrial, academic, and research institutions. References of enterprise modeling and simulation methods and a glossary of enterprise-related terms are provided.

  16. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  18. Dual Energy Method for Breast Imaging: A Simulation Study

    Directory of Open Access Journals (Sweden)

    V. Koukou

    2015-01-01

    Full Text Available Dual energy methods can suppress the contrast between adipose and glandular tissues in the breast and therefore enhance the visibility of calcifications. In this study, a dual energy method based on analytical modeling was developed for the detection of the minimum microcalcification thickness. To this aim, a modified radiographic X-ray unit was considered, in order to overcome the limited kVp range of the mammographic units used in previous DE studies, combined with a high resolution CMOS sensor (pixel size of 22.5 μm) for improved resolution. Various filter materials were examined based on their K-absorption edge. Hydroxyapatite (HAp) was used to simulate microcalcifications. The contrast-to-noise ratio (CNRtc) of the subtracted images was calculated for both monoenergetic and polyenergetic X-ray beams. The optimum monoenergetic pair was 23/58 keV for the low and high energy, respectively, resulting in a minimum detectable microcalcification thickness of 100 μm. In the polyenergetic X-ray study, the optimal spectral combination was 40/70 kVp filtered with 100 μm cadmium and 1000 μm copper, respectively. In this case, the minimum detectable microcalcification thickness was 150 μm. The proposed dual energy method provides improved microcalcification detectability in breast imaging with mean glandular dose values within acceptable levels.
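
    At its core, a dual-energy subtraction of this kind is a weighted log-subtraction of the low- and high-energy images, with the weight chosen to cancel the background tissue contrast so that calcifications remain. The sketch below illustrates the idea on synthetic monoenergetic intensities; the attenuation coefficients and thicknesses are assumed, illustrative values rather than the study's calibrated ones.

```python
import numpy as np

def tissue_cancel_weight(mu_t_low, mu_t_high):
    """Weight that suppresses a tissue with attenuation mu_t at the two energies."""
    return mu_t_high / mu_t_low

def log_subtract(I_low, I_high, w):
    """Weighted log subtraction: the background tissue term cancels for the right w."""
    return np.log(I_high) - w * np.log(I_low)

# synthetic 1D example: varying tissue thickness plus a thin calcification
mu_tissue = {"low": 0.8, "high": 0.5}      # illustrative attenuation coefficients [1/cm]
mu_calc = {"low": 3.0, "high": 1.2}
t_tissue = np.linspace(4.0, 5.0, 100)      # slowly varying tissue thickness [cm]
t_calc = np.zeros(100)
t_calc[45:55] = 0.02                       # 200 um calcification

I_low = np.exp(-(mu_tissue["low"] * t_tissue + mu_calc["low"] * t_calc))
I_high = np.exp(-(mu_tissue["high"] * t_tissue + mu_calc["high"] * t_calc))

w = tissue_cancel_weight(mu_tissue["low"], mu_tissue["high"])
dei = log_subtract(I_low, I_high, w)       # tissue variation cancels, calcification remains
print("calcification signal:", dei[50] - dei[0])
```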

  19. MEGACELL: a nanocrystal model construction software for HRTEM multislice simulation.

    Science.gov (United States)

    Stroppa, Daniel G; Righetto, Ricardo D; Montoro, Luciano A; Ramirez, Antonio J

    2011-07-01

    Image simulation is invaluable for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to its non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process including the simulation of a nanocrystal model and its comparison with experimental images. However, most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone axis orientations. This paper presents an open source citation-ware tool named MEGACELL, which was developed to assist in the construction of nanocrystal models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve their atomic positions either as a plain text file or as an output compatible with the EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool's features, some construction examples and its application to scientific studies are presented. These studies show MEGACELL to be a handy tool, which allows easier construction of complex nanocrystal models and improves quantitative information extraction from HRTEM images. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Bio-inspired color image enhancement model

    Science.gov (United States)

    Zheng, Yufeng

    2009-05-01

    Human beings can perceive natural scenes very well under various illumination conditions. This is partly due to the contrast enhancement of center/surround networks and opponent analysis on the human retina. In this paper, we propose an image enhancement model to simulate the color processes in the human retina. Specifically, there are two center/surround layers, bipolar/horizontal and ganglion/amacrine, and four color opponents, red (R), green (G), blue (B), and yellow (Y). The central cell (bipolar or ganglion) takes the surrounding information from one or several horizontal or amacrine cells, and bipolar and ganglion cells both have ON and OFF sub-types. For example, a +R/-G bipolar (red-center-ON/green-surround-OFF) will be excited if only the center is illuminated, inhibited if only the surroundings (bipolars) are illuminated, or stay neutral if both center and surroundings are illuminated. Likewise, the other two color opponents with ON-center/OFF-surround, +G/-R and +B/-Y, follow the same rules. The yellow (Y) channel can be obtained by averaging the red and green channels. On the other hand, OFF-center/ON-surround bipolars (i.e., -R/+G and -G/+R, but no -B/+Y) are inhibited when the center is illuminated. An ON-bipolar (or OFF-bipolar) only transfers signals to an ON-ganglion (or OFF-ganglion), where amacrines provide the surrounding information. Ganglion cells have strong spatiotemporal responses to moving objects. In our proposed enhancement model, the surrounding information is obtained using a weighted average of the neighborhood; excitation or inhibition is implemented as a pixel intensity increase or decrease according to a linear or nonlinear response; and center/surround excitations are decided by comparing their intensities. A difference of Gaussian (DOG) model is used to simulate the ganglion differential response. Experimental results using natural scenery pictures proved that the proposed image enhancement model by simulating the two-layer center
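
    A generic difference-of-Gaussians (DOG) center/surround response of the kind used for the ganglion layer can be sketched with standard image filters. The sigmas and gain below are illustrative assumptions, not the parameters of the proposed model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_enhance(image, sigma_center=1.0, sigma_surround=3.0, gain=1.5):
    """Center/surround (ON-center) enhancement with a difference of Gaussians.

    The center is a narrow Gaussian average and the surround a wider one; their
    difference approximates a ganglion-cell differential response, which is then
    added back to the image to boost local contrast."""
    img = image.astype(float)
    center = gaussian_filter(img, sigma_center)
    surround = gaussian_filter(img, sigma_surround)
    response = center - surround                      # ON-center/OFF-surround signal
    enhanced = img + gain * response
    return np.clip(enhanced, 0.0, 255.0)

# usage on a synthetic gradient with a step edge
x = np.tile(np.linspace(0, 255, 256), (64, 1))
x[:, 128:] = np.clip(x[:, 128:] + 20, 0, 255)
out = dog_enhance(x)
print(out.min(), out.max())
```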

  1. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. · Introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...

  2. Nonsmooth Modeling and Simulation for Switched Circuits

    CERN Document Server

    Acary, Vincent; Brogliato, Bernard

    2011-01-01

    "Nonsmooth Modeling and Simulation for Switched Circuits" concerns the modeling and the numerical simulation of switched circuits with the nonsmooth dynamical systems (NSDS) approach, using piecewise-linear and multivalued models of electronic devices like diodes, transistors, switches. Numerous examples (ranging from introductory academic circuits to various types of power converters) are analyzed and many simulation results obtained with the INRIA open-source SICONOS software package are presented. Comparisons with SPICE and hybrid methods demonstrate the power of the NSDS approach

  3. Juno model rheometry and simulation

    Science.gov (United States)

    Sampl, Manfred; Macher, Wolfgang; Oswald, Thomas; Plettemeier, Dirk; Rucker, Helmut O.; Kurth, William S.

    2016-10-01

    The experiment Waves aboard the Juno spacecraft, which will arrive at its target planet Jupiter in 2016, was devised to study the plasma and radio waves of the Jovian magnetosphere. We analyzed the Waves antennas, which consist of two nonparallel monopoles operated as a dipole. For this investigation we applied two independent methods: the experimental technique, rheometry, which is based on a downscaled model of the spacecraft to measure the antenna properties in an electrolytic tank and numerical simulations, based on commercial computer codes, from which the quantities of interest (antenna impedances and effective length vectors) are calculated. In this article we focus on the results for the low-frequency range up to about 4 MHz, where the antenna system is in the quasi-static regime. Our findings show that there is a significant deviation of the effective length vectors from the physical monopole directions, caused by the presence of the conducting spacecraft body. The effective axes of the antenna monopoles are offset from the mechanical axes by more than 30°, and effective lengths show a reduction to about 60% of the antenna rod lengths. The antennas' mutual capacitances are small compared to the self-capacitances, and the latter are almost the same for the two monopoles. The overall performance of the antennas in dipole configuration is very stable throughout the frequency range up to about 4-5 MHz and therefore can be regarded as the upper frequency bound below which the presented quasi-static results are applicable.

  4. Network Modeling and Simulation A Practical Perspective

    CERN Document Server

    Guizani, Mohsen; Khan, Bilal

    2010-01-01

    Network Modeling and Simulation is a practical guide to using modeling and simulation to solve real-life problems. The authors give a comprehensive exposition of the core concepts in modeling and simulation, and then systematically address the many practical considerations faced by developers in modeling complex large-scale systems. The authors provide examples from computer and telecommunication networks and use these to illustrate the process of mapping generic simulation concepts to domain-specific problems in different industries and disciplines. Key features: Provides the tools and strate

  5. Autocorrelation and regularization in digital images. II - Simple image models

    Science.gov (United States)

    Jupp, David L. B.; Strahler, Alan H.; Woodcock, Curtis E.

    1989-01-01

    The variogram function used in geostatistical analysis is a useful statistic in the analysis of remotely sensed images. Using the results derived by Jupp et al. (1988), the basic second-order, or covariance, properties of scenes modeled by simple disks of varying size and spacing after imaging into disk-shaped pixels are analyzed to explore the relationship between image variograms and discrete-object scene structure. The models provide insight into the nature of real images of the earth's surface and the tools for a complete analysis of the more complex case of three-dimensional illuminated discrete-object images.
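
    The empirical image variogram analyzed here is the semivariance of pixel values as a function of lag. A minimal sketch for horizontal lags only (illustrative, not the paper's analytical disk models) is:

```python
import numpy as np

def image_variogram(img, max_lag=20):
    """Empirical semivariogram along the x (column) direction.

    gamma(h) = 0.5 * mean[ (z(x+h) - z(x))^2 ] over all pixel pairs at lag h."""
    img = img.astype(float)
    gammas = []
    for h in range(1, max_lag + 1):
        diff = img[:, h:] - img[:, :-h]
        gammas.append(0.5 * np.mean(diff ** 2))
    return np.arange(1, max_lag + 1), np.array(gammas)

# usage on a random test image; a disk scene would show gamma rising to a sill
rng = np.random.default_rng(1)
scene = rng.random((128, 128))
lags, gamma = image_variogram(scene, max_lag=10)
print(np.round(gamma, 3))
```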

  6. Fast and automatic ultrasound simulation from CT images.

    Science.gov (United States)

    Cong, Weijian; Yang, Jian; Liu, Yue; Wang, Yongtian

    2013-01-01

    Ultrasound is currently widely used in clinical diagnosis because of its fast and safe imaging principles. As the anatomical structures present in an ultrasound image are not as clear as in CT or MRI, physicians usually need advanced clinical knowledge and experience to distinguish diseased tissues. Fast simulation of ultrasound provides a cost-effective way to train this skill and to correlate ultrasound with the underlying anatomic structures. In this paper, a novel method is proposed for the fast simulation of ultrasound from a CT image. A multiscale method is developed to enhance tubular structures so as to simulate the blood flow. The acoustic response of common tissues is generated by weighted integration of adjacent regions on the ultrasound propagation path in the CT image, from which parameters including attenuation, reflection, scattering, and noise are estimated simultaneously. The thin-plate spline interpolation method is employed to transform the simulation image between polar and rectangular coordinate systems. The Kaiser window function is utilized to produce the integration and radial blurring effects of multiple transducer elements. Experimental results show that the developed method is very fast and effective, allowing realistic ultrasound images to be generated quickly. Given that the developed method is fully automatic, it can be utilized for ultrasound-guided navigation in clinical practice and for training purposes.

  7. Fast and Automatic Ultrasound Simulation from CT Images

    Directory of Open Access Journals (Sweden)

    Weijian Cong

    2013-01-01

    Full Text Available Ultrasound is currently widely used in clinical diagnosis because of its fast and safe imaging principles. As the anatomical structures present in an ultrasound image are not as clear as in CT or MRI, physicians usually need advanced clinical knowledge and experience to distinguish diseased tissues. Fast simulation of ultrasound provides a cost-effective way to train this skill and to correlate ultrasound with the underlying anatomic structures. In this paper, a novel method is proposed for the fast simulation of ultrasound from a CT image. A multiscale method is developed to enhance tubular structures so as to simulate the blood flow. The acoustic response of common tissues is generated by weighted integration of adjacent regions on the ultrasound propagation path in the CT image, from which parameters including attenuation, reflection, scattering, and noise are estimated simultaneously. The thin-plate spline interpolation method is employed to transform the simulation image between polar and rectangular coordinate systems. The Kaiser window function is utilized to produce the integration and radial blurring effects of multiple transducer elements. Experimental results show that the developed method is very fast and effective, allowing realistic ultrasound images to be generated quickly. Given that the developed method is fully automatic, it can be utilized for ultrasound-guided navigation in clinical practice and for training purposes.

  8. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging.

    Science.gov (United States)

    Solomon, Justin; Samei, Ehsan

    2014-11-07

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R²), and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operating characteristic (ROC) analysis. The average R² of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in the performance evaluation and optimization of novel CT systems.
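
    The insertion step of such a framework amounts to blending a parametric 3D lesion model into a CT volume at a chosen location. The sketch below uses a simple spherically symmetric contrast profile as a stand-in for the paper's fitted lesion models; sizes, contrasts and HU values are assumptions for illustration only.

```python
import numpy as np

def make_lesion(shape=(32, 32, 32), radius=8.0, contrast=60.0, edge_sigma=2.0):
    """Simple 3D lesion model: uniform core with a smooth sigmoid edge profile.

    Returns an HU-offset volume; a real framework would fit size, shape and
    contrast profile to measured lesions."""
    zz, yy, xx = np.indices(shape)
    c = (np.array(shape) - 1) / 2.0
    r = np.sqrt((zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2)
    profile = 1.0 / (1.0 + np.exp((r - radius) / edge_sigma))   # smooth edge
    return contrast * profile

def insert_lesion(ct_volume, lesion, center):
    """Additively blend the lesion HU offset into a CT volume at 'center'."""
    out = ct_volume.astype(float).copy()
    sz = np.array(lesion.shape)
    start = np.array(center) - sz // 2
    sl = tuple(slice(s, s + d) for s, d in zip(start, sz))
    out[sl] += lesion
    return out

ct = np.full((64, 64, 64), -50.0)            # illustrative soft-tissue background (HU)
hybrid = insert_lesion(ct, make_lesion(), center=(32, 32, 32))
print(hybrid.max())
```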

  9. Laser bistatic two-dimensional scattering imaging simulation of lambert cone

    Science.gov (United States)

    Gong, Yanjun; Zhu, Chongyue; Wang, Mingjun; Gong, Lei

    2015-11-01

    This paper deals with the laser bistatic two-dimensional scattering imaging simulation of a Lambert cone. Two-dimensional imaging is also called planar imaging. It can reflect the shape of the target and its material properties, and it therefore has important significance for target recognition. The expression for the bistatic laser scattering intensity of a Lambert cone is obtained based on the laser radar equation. The scattering intensity of a surface element on the target can then be obtained. The intensity is related to the local angle of incidence, the local angle of scattering, and the area of the infinitesimal element on the cone. From the incident direction of the laser, the scattering direction and the normal of the infinitesimal area, the local incidence angle and scattering angle can be calculated. Through surface integration and the introduction of a rectangular function, we can obtain the intensity of each imaging unit on the imaging surface, and thereby the Lambert cone bistatic laser two-dimensional scattering imaging simulation model. We analyze the effects of resolution, incident direction, observation direction and target size on the imaging. From the results, we can see that the bistatic laser scattering imaging simulation results for the Lambert cone are correct.

  10. Application of Simulated Three Dimensional CT Image in Orthognathic Surgery

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Don; Park, Chang Seo [Dept. of Dental Radiology, College of Dentistry, Yensei University, Seoul (Korea, Republic of); Yoo, Sun Kook; Lee, Kyoung Sang [Dept. of Medical Engineering, College of Medicine, Yensei University, Seoul (Korea, Republic of)

    1998-08-15

    In orthodontics and orthognathic surgery, the cephalogram has been routine practice in the diagnosis and treatment evaluation of craniofacial deformity. However, its inherent distortion of actual lengths and angles, caused by projecting a three-dimensional object onto a two-dimensional plane, may introduce errors into quantitative analyses of shape and size. It is therefore desirable that a three-dimensional object be diagnosed and evaluated three-dimensionally, and three-dimensional CT imaging is best suited for three-dimensional analysis. Clinical progress requires evaluating the result of treatment and comparing the pre- and postoperative states. Ideally, a patient who was diagnosed and planned using three-dimensional computed tomography before surgery would also be evaluated by three-dimensional computed tomography after surgery. However, because there are as yet no standardized three-dimensional normal values, and because three-dimensional computed tomography requires expensive equipment, is costly, and involves considerable radiation exposure, limitations remain to be solved before its routine application. If a postoperative three-dimensional image could be constructed from pre- and postoperative lateral and postero-anterior cephalograms together with a preoperative three-dimensional computed tomogram, the pre- and postoperative images could be compared and evaluated three-dimensionally without postoperative three-dimensional computed tomography, which would also contribute to standardizing three-dimensional normal values. This study introduces a new method in which a computer-simulated three-dimensional image is constructed from a preoperative three-dimensional computed tomogram and pre- and postoperative lateral and postero-anterior cephalograms. For validation of the new method, in four dry skull cases in which the position of the mandible was displaced and in four orthognathic surgery patients, the computer-simulated three-dimensional image and the actual postoperative three-dimensional image were compared. The results were as follows. 1. In four cases of

  11. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
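
    The second-order SOFI computation itself is compact: for each pixel, the cumulant of its intensity fluctuations over the image sequence is formed (higher orders and cross-cumulants follow the same pattern). A minimal, zero-time-lag sketch, not the released software, is shown below.

```python
import numpy as np

def sofi2(stack):
    """Second-order, zero-time-lag SOFI image from an image sequence.

    stack: array of shape (T, H, W). The 2nd-order auto-cumulant at zero lag
    is the temporal variance of each pixel's fluctuation."""
    fluct = stack - stack.mean(axis=0, keepdims=True)   # remove the mean image
    return np.mean(fluct ** 2, axis=0)                  # per-pixel 2nd-order cumulant

# usage on a synthetic sequence with two blinking emitters on a noisy background
rng = np.random.default_rng(0)
T, H, W = 500, 32, 32
stack = 5.0 + rng.poisson(2.0, (T, H, W)).astype(float)
blink = rng.random(T) < 0.3                              # on/off switching
stack[:, 16, 10] += 40.0 * blink
stack[:, 16, 20] += 40.0 * blink
sofi_img = sofi2(stack)
print(sofi_img[16, 10], sofi_img.mean())                 # emitters stand out strongly
```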

  12. Advanced seismic imaging technology. Data acquisition (computer simulation of elastic wave propagation); Koseido imaging gijutsu. Data shutoku gijutsu (danseiha simulation)

    Energy Technology Data Exchange (ETDEWEB)

    Tsuru, T. [Tech. Research Center, Japan National Oil Corp., Tokyo (Japan)

    1995-11-10

    Development of software was examined for the purpose of producing basic data for an advanced seismic imaging technology by obtaining seismic exploration data from a complicated underground structural model through numerical simulation. The results in fiscal 1994 were as follows. The dimension-splitting finite difference method is superior in the stability and accuracy of the numerical calculation and is capable of computing by splitting the problem into one-dimensional differences. Attenuation terms due to the absorbing effect of the medium were added using a Maxwell viscoelastic model, and a function was added which is capable of dealing with multiple sources and with group installation of geophones. The pseudospectral method is a kind of difference method for numerically solving a partial differential equation; it divides the underground structural model into a lattice and calculates the field at the lattice points. The spatial derivatives are computed via Fourier series without difference approximation; therefore, the number of lattice points may be reduced to two per shortest wavelength; namely, the lattice interval may be coarsened to reduce calculation time. An improvement was made to the parallel calculation part of the two-dimensional analysis program developed in the preceding fiscal year, enabling a reduction in calculation time. 4 figs.
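
    The key ingredient of the pseudospectral approach described above, differentiating in space via the Fourier transform instead of finite differences, can be sketched in a few lines. The 1D, periodic-domain example below is illustrative only and is not the reported software.

```python
import numpy as np

def spectral_derivative(f, dx):
    """Spatial derivative of a periodic 1D field using the FFT (pseudospectral)."""
    n = f.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)          # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

# compare against the analytic derivative of sin(x) on a coarse grid
n, L = 32, 2.0 * np.pi
x = np.arange(n) * (L / n)
f = np.sin(x)
df = spectral_derivative(f, dx=L / n)
print(np.max(np.abs(df - np.cos(x))))   # error near machine precision
```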

  13. Numerical simulations of volume holographic imaging system resolution characteristics

    Science.gov (United States)

    Sun, Yajun; Jiang, Zhuqing; Liu, Shaojie; Tao, Shiquan

    2009-05-01

    Because of the Bragg selectivity of volume holographic gratings, a VHI system can optically segment the object space. In this paper, the properties of point-source diffraction imaging in terms of the point-spread function (PSF) are investigated, and the characteristics of depth and lateral resolution in a VHI system are numerically simulated. The results show that the observed diffracted field changes markedly with displacement in the z direction, and is nearly unchanged with displacement in the x and y directions. The dependence of the diffracted imaging field on the z-displacement provides a way to obtain 3-D images with VHI.

  14. VHDL simulation with access to transistor models

    Science.gov (United States)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  15. A framework for simulating ultrasound imaging based on first order nonlinear pressure–velocity relations

    DEFF Research Database (Denmark)

    Du, Yigang; Fan, Rui; Li, Yong

    2016-01-01

    An ultrasound imaging framework modeled with the first order nonlinear pressure–velocity relations (NPVR) based simulation and implemented by a half-time staggered solution and pseudospectral method is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound propagation and reflections in a heterogeneous medium with different sound speeds and densities. It can be initialized with arbitrary focus, excitation and apodization for multiple individual channels in both 2D and 3D spatial fields. The simulated channel data can be generated using this framework, and an ultrasound image can be obtained by beamforming the simulated channel data. Various results simulated by different algorithms are illustrated for comparison. The root mean square (RMS) errors for each compared pulse are calculated. The linear propagation is validated by an angular spectrum approach (ASA...

  16. Task-based image quality assessment in radiation therapy: initial characterization and demonstration with CT simulation images

    Science.gov (United States)

    Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua

    2017-03-01

    In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al.1 In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. A fully simulated framework was created that utilizes new learning-based stochastic object models (SOMs) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOM, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure of merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation therapy-based IQ measures leads to different imaging parameters than those obtained by use of physically-based measures.

  17. "Observing and Analyzing" Images From a Simulated High Redshift Universe

    CERN Document Server

    Morgan, Robert J; Scannapieco, Evan; Thacker, Robert J

    2015-01-01

    We investigate the high-redshift evolution of the restframe UV-luminosity function (LF) of galaxies via hydrodynamical cosmological simulations, coupled with an emulated observational astronomy pipeline that provides a direct comparison with observations. We do this by creating mock images and synthetic galaxy catalogs of approximately 100 square arcminute fields from the numerical model at redshifts ~ 4.5 to 10.4. We include the effects of dust extinction and the point spread function (PSF) for the Hubble WFC3 camera for comparison with space observations. We also include the expected zodiacal background to predict its effect on space observations, including future missions such as the James Webb Space Telescope (JWST). When our model catalogs are fitted to Schechter function parameters, we predict that the faint-end slope alpha of the LF evolves as alpha = -1.16 - 0.12 z over the redshift range z ~ 4.5 to 7.7, in excellent agreement with observations from e.g., Hathi et al. (2010). However, for redshifts z ...

  18. Theory of compressive modeling and simulation

    Science.gov (United States)

    Szu, Harold; Cha, Jae; Espinola, Richard L.; Krapels, Keith

    2013-05-01

    Modeling and Simulation (M&S) has been evolving along two general directions: (i) a data-rich approach suffering from the curse of dimensionality and (ii) an equation-rich approach suffering from limited computing power and turnaround time. We suggest a third approach. We call it (iii) compressive M&S (CM&S), because the basic Minimum Free-Helmholtz Energy (MFE) facilitating CM&S can reproduce and generalize the Candes, Romberg, Tao & Donoho (CRT&D) Compressive Sensing (CS) paradigm as a linear Lagrange Constraint Neural Network (LCNN) algorithm. CM&S based on MFE can generalize LCNN to 2nd order as a nonlinear augmented LCNN. For example, during the sunset, we can avoid a reddish bias of sunlight illumination due to long-range Rayleigh scattering over the horizon. With CM&S we can use, instead of a day camera, a night vision camera. We decomposed the long wave infrared (LWIR) band with filters into 2 vector components (8~10μm and 10~12μm) and used LCNN to find, pixel by pixel, the map of Emissive-Equivalent Planck Radiation Sources (EPRS). Then, we up-shifted consistently, according to the de-mixed sources map, to the sub-micron RGB color image. Moreover, the night vision imaging can also be down-shifted to Passive Millimeter Wave (PMMW) imaging, suffering less blur owing to dusty smoke scattering and enjoying the apparent smoothness of the surface reflectivity of man-made objects under the Rayleigh resolution. One loses three orders of magnitude in the spatial Rayleigh resolution, but gains two orders of magnitude in the reflectivity, and gains another two orders in the propagation without obscuring smog. Since CM&S can generate missing data and hard-to-get dynamic transients, CM&S can reduce unnecessary measurements and their associated cost and computing in the sense of super-saving CS: measuring one and getting one's neighborhood free.

  19. Modeling and Simulation of Low Voltage Arcs

    NARCIS (Netherlands)

    Ghezzi, L.; Balestrero, A.

    2010-01-01

    Modeling and Simulation of Low Voltage Arcs is an attempt to improve the physical understanding, mathematical modeling and numerical simulation of the electric arcs that are found during current interruptions in low voltage circuit breakers. An empirical description is gained by refined electrical

  1. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single mat...

  2. Virtual X-ray imaging techniques in an immersive casting simulation environment

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ning [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of)]. E-mail: lining@ewha.ac.kr; Kim, Sung-Hee [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of); Suh, Ji-Hyun [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of); Cho, Sang-Hyun [Center for e-Design, Korea Institute of Industrial Technology, 7-47, Songdo-Dong, Yeonsu-Ku, Inchon (Korea, Republic of); Choi, Jung-Gil [Center for e-Design, Korea Institute of Industrial Technology, 7-47, Songdo-Dong, Yeonsu-Ku, Inchon (Korea, Republic of); Kim, Myoung-Hee [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of) and Center for Computer Graphics and Virtual Reality, Ewha Womans University, 400, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of)]. E-mail: mhkim@ewha.ac.kr

    2007-08-15

    A computer code was developed to simulate radiographs of complex casting products in a CAVE™-like environment. The simulation is based on deterministic algorithms and ray tracing techniques. The aim of this study is to examine CAD/CAE/CAM models at the design stage, to optimize the design, and to inspect predicted defective regions with high speed, good accuracy and small numerical expense. The present work discusses the algorithms for the radiography simulation of the CAD/CAM model and proposes algorithmic solutions adapted from the ray-box intersection algorithm and the octree data structure specifically for radiographic simulation of the CAE model. The stereoscopic visualization of the full-size product in the immersive casting simulation environment, as well as the virtual X-ray images of the castings, provides an effective tool for the design and evaluation of foundry processes by engineers and metallurgists.

  3. Impact Analysis of Climate Change on Snow over a Complex Mountainous Region Using Weather Research and Forecast Model (WRF) Simulation and Moderate Resolution Imaging Spectroradiometer Data (MODIS-Terra) Fractional Snow Cover Products

    Directory of Open Access Journals (Sweden)

    Xiaoduo Pan

    2017-07-01

    Full Text Available Climate change has a complex effect on snow at the regional scale. The change in snow patterns under climate change remains unknown for certain regions. Here, we used high spatiotemporal resolution snow-related variables simulated by a weather research and forecasting model (WRF), including snowfall, snow water equivalent and snow depth, along with fractional snow cover (FSC) data extracted from Moderate Resolution Imaging Spectroradiometer data (MODIS-Terra), to evaluate the effects of climate change on snow over the Heihe River Basin (HRB), a typical inland river basin in arid northwestern China, from 2000 to 2013. We utilized Empirical Orthogonal Function (EOF) analysis and Mann-Kendall/Theil-Sen trend analysis to evaluate the results. The results are as follows: (1) FSC, snow water equivalent, and snow depth across the entire HRB region decreased, especially at elevations over 4500 m; however, snowfall increased at mid-altitude ranges in the upstream area of the HRB. (2) Total snowfall also increased in the upstream area of the HRB; however, the number of snowfall days decreased. Therefore, the number of extreme snow events in the upstream area of the HRB may have increased. (3) Snowfall over the downstream area of the HRB decreased. Thus, ground stations, WRF simulations and remote sensing products can be used to effectively explore the effect of climate change on snow at the watershed scale.

  4. Simulating multiple diffraction in imaging systems using a path integration method.

    Science.gov (United States)

    Mout, Marco; Wick, Michael; Bociort, Florian; Petschulat, Jörg; Urbach, Paul

    2016-05-10

    We present a method for simulating multiple diffraction in imaging systems based on the Huygens-Fresnel principle. The method accounts for the effects of both aberrations and diffraction and is entirely performed using Monte Carlo ray tracing. We compare the results of this method to those of reference simulations for field propagation through optical systems and for the calculation of point spread functions. The method can accurately model a wide variety of optical systems beyond the exit pupil approximation.

  5. Evaluation of Three-Dimensional Printed Materials for Simulation by Computed Tomography and Ultrasound Imaging.

    Science.gov (United States)

    Mooney, James J; Sarwani, Nabeel; Coleman, Melissa L; Fotos, Joseph S

    2017-06-01

    The use of three-dimensional (3D) printing allows for the creation of custom models for clinical care, education, and simulation. Medical imaging, given the significant role it plays in both clinical diagnostics and procedures, remains an important area for such education and simulation. Unfortunately, the materials appropriate for use in simulation involving radiographic or ultrasound imaging remain poorly understood. Therefore, our study was intended to explore the characteristics of readily available 3D printing materials when visualized by computed tomography (CT) and ultrasound. Seven 3D printing materials were examined in standard shapes (cube, cylinder, triangular prism) with a selection of printing methods ("open," "whole," and "solid" forms). For CT imaging, these objects were suspended in a gelatin matrix molded to match a standard human CT phantom. For ultrasound imaging, the objects were placed in acrylic forms filled with a gelatin matrix. All images were examined using OsiriX software. Computed tomography imaging revealed marked variation in the materials' Hounsfield units as well as patterning and artifact. The Hounsfield unit variations revealed a number of materials suitable for simulating various human tissues. Ultrasound imaging showed echogenicity in all materials, with some variability in shadowing and posterior wall visualization. We were able to demonstrate the potential utility of 3D printing in the creation of CT and ultrasound simulation models. The similar appearance of materials via ultrasound supports their broad utility for select tissue types, whereas the more variable appearance via CT suggests greater potential for simulating differing tissues, though requiring multiple printer technologies to do so.

  6. Ultrasound Imaging and its modeling

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2002-01-01

    Modern medical ultrasound scanners are used for imaging nearly all soft tissue structures in the body. The anatomy can be studied from gray-scale B-mode images, where the reflectivity and scattering strength of the tissues are displayed. The imaging is performed in real time with 20 to 100 images...

  7. Digital imaging and remote sensing image generator (DIRSIG) as applied to NVESD sensor performance modeling

    Science.gov (United States)

    Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.

    2016-05-01

    The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.

  8. Whole-building Hygrothermal Simulation Model

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2003-01-01

    An existing integrated simulation tool for dynamic thermal simulation of building was extended with a transient model for moisture release and uptake in building materials. Validation of the new model was begun with comparison against measurements in an outdoor test cell furnished with single...... materials. Almost quasi-steady, cyclic experiments were used to compare the indoor humidity variation and the numerical results of the integrated simulation tool with the new moisture model. Except for the case with chipboard as furnishing, the predictions of indoor humidity with the detailed model were...

  9. Simulation model of metallurgical production management

    Directory of Open Access Journals (Sweden)

    P. Šnapka

    2013-07-01

    Full Text Available This article is focused on the problems of intensifying the metallurgical production process. The aim is to explain a simulation model which presents a metallurgical production management system adequate to new requirements. Knowledge of the dynamic behavior and features of the metallurgical production system and its management is needed to create this model. The characteristics which determine the dynamics of the metallurgical production process are described. The simulation model is structured as functional blocks and their linkages with regard to the organizational and temporal hierarchy of their actions. The creation of the presented simulation model is based on theoretical findings from regulation, hierarchical systems and optimization.

  10. Numerical simulation research on multi-electrodes resistivity imaging survey array

    Institute of Scientific and Technical Information of China (English)

    Jianjun NIU; Xiaopei ZHANG; Lizhi DU

    2008-01-01

    Multi-electrode Resistivity Imaging Survey (MRIS) is an array method of electrical surveying. In practice, how to choose a reasonable array is the key to obtaining reliable survey results. Based on four MRIS methods, namely Wenner, Schlumberger, pole-pole and dipole-dipole, the authors established a model, studied the results of forward numerical simulation and inverse modeling, and analyzed the differences among the different forms of detection devices.

  11. A Good Image Model Eases Restoration

    Science.gov (United States)

    2002-02-06

    ...algorithms, and various classical as well as unexpected new applications of the BV (bounded variation) image model, first introduced into image processing by Rudin, Osher, and Fatemi in 1992 (Physica D, 60:259-268).
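
    A standard way to use the BV image model in restoration is the Rudin-Osher-Fatemi (ROF) total-variation functional. The sketch below minimizes a smoothed version of it by gradient descent; it is illustrative only, with assumed parameter values, and is not the algorithms of the cited report.

```python
import numpy as np

def tv_denoise(f, lam=1.0, eps=0.1, step=0.01, n_iter=500):
    """Gradient descent on the smoothed ROF energy
        E(u) = sum sqrt(|grad u|^2 + eps^2) + (lam/2) * sum (u - f)^2."""
    u = f.astype(float).copy()
    for _ in range(n_iter):
        ux = np.roll(u, -1, axis=1) - u               # forward differences
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        px, py = ux / mag, uy / mag
        # divergence of (px, py) with backward differences
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u += step * (div - lam * (u - f))             # descend the energy
    return u

# usage: denoise a noisy square while keeping its edges relatively sharp
rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
print("noisy error:", np.abs(noisy - clean).mean(),
      "denoised error:", np.abs(tv_denoise(noisy) - clean).mean())
```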

  12. Gaussian Mixture Model and Rjmcmc Based RS Image Segmentation

    Science.gov (United States)

    Shi, X.; Zhao, Q. H.

    2017-09-01

    For image segmentation methods based on the Gaussian Mixture Model (GMM), there are some problems: 1) the number of components is usually fixed, i.e., a fixed number of classes, and 2) GMM is sensitive to image noise. This paper proposes an RS image segmentation method that combines GMM with reversible jump Markov Chain Monte Carlo (RJMCMC). In the proposed algorithm, GMM is designed to model the distribution of pixel intensities in the RS image. The number of components is assumed to be a random variable, and a prior distribution is built for each parameter. In order to improve noise resistance, a Gibbs function is used to model the prior distribution of the GMM weight coefficients. The posterior distribution is built according to Bayes' theorem, and RJMCMC is used to simulate the posterior distribution and estimate its parameters. Finally, an optimal segmentation is obtained for the RS image. Experimental results show that the proposed algorithm can converge to the optimal number of classes and obtain ideal segmentation results.
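
    For reference, the fixed-K baseline of such a segmentation (a GMM fitted to pixel intensities with EM, each pixel labeled by its most probable component) is sketched below. The trans-dimensional RJMCMC moves and the Gibbs prior on the weights, which are the paper's contribution, are not reproduced here.

```python
import numpy as np

def gmm_em(x, k=3, n_iter=50):
    """Fit a 1D Gaussian mixture to pixel intensities x with EM; return params and labels."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread initial means over the data range
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each pixel
        d = x[:, None] - mu[None, :]
        log_p = -0.5 * (d**2 / var + np.log(2 * np.pi * var)) + np.log(w)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means and variances
        nk = r.sum(axis=0) + 1e-12
        w = nk / x.size
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk + 1e-6
    return mu, var, w, np.argmax(r, axis=1)

# usage: segment a synthetic 2-class image corrupted by noise
rng = np.random.default_rng(1)
img = np.where(rng.random((64, 64)) < 0.5, 50.0, 120.0) + 10.0 * rng.standard_normal((64, 64))
mu, var, w, labels = gmm_em(img.ravel(), k=2)
segmentation = labels.reshape(img.shape)
print(np.round(mu, 1))   # component means near 50 and 120
```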

  13. Modelling of classical ghost images obtained using scattered light

    Science.gov (United States)

    Crosby, S.; Castelletto, S.; Aruldoss, C.; Scholten, R. E.; Roberts, A.

    2007-08-01

    The images obtained in ghost imaging with pseudo-thermal light sources are highly dependent on the spatial coherence properties of the incident light. Pseudo-thermal light is often created by reducing the coherence length of a coherent source by passing it through a turbid mixture of scattering spheres. We describe a model for simulating ghost images obtained with such partially coherent light, using a wave-transport model to calculate the influence of the scattering on initially coherent light. The model is able to predict important properties of the pseudo-thermal source, such as the coherence length and the amplitude of the residual unscattered component of the light which influence the resolution and visibility of the final ghost image. We show that the residual ballistic component introduces an additional background in the reconstructed image, and the spatial resolution obtainable depends on the size of the scattering spheres.

  14. Simulation modeling for the health care manager.

    Science.gov (United States)

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
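
    As a hedged illustration of the Monte Carlo approach mentioned above, the sketch below simulates patient wait times at a single registration desk; the single-server queue and the arrival and service rates are invented for illustration and are not taken from the article.

```python
# Monte Carlo estimate of the mean patient wait at one registration desk,
# with exponential interarrival and service times (rates per hour are
# illustrative assumptions).
import numpy as np

def mean_wait(n_patients=10_000, arrival_rate=4.0, service_rate=5.0, seed=1):
    rng = np.random.default_rng(seed)
    arrivals = np.cumsum(rng.exponential(1.0 / arrival_rate, n_patients))
    services = rng.exponential(1.0 / service_rate, n_patients)
    waits = np.zeros(n_patients)
    finish = 0.0
    for i in range(n_patients):
        start = max(arrivals[i], finish)   # wait if the desk is still busy
        waits[i] = start - arrivals[i]
        finish = start + services[i]
    return waits.mean()

print(f"mean wait ~ {mean_wait() * 60:.1f} minutes")
```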

  15. A simulation study on image reconstruction in magnetic particle imaging with field-free-line encoding

    CERN Document Server

    Murase, Kenya

    2016-01-01

    The purpose of this study was to present image reconstruction methods for magnetic particle imaging (MPI) with a field-free-line (FFL) encoding scheme and to propose the use of the maximum likelihood-expectation maximization (ML-EM) algorithm for improving the image quality of MPI. The feasibility of these methods was investigated by computer simulations, in which the projection data were generated by summing up the Fourier harmonics obtained from the MPI signals based on the Langevin function. Images were reconstructed from the generated projection data using the filtered backprojection (FBP) method and the ML-EM algorithm. The effects of the gradient of selection magnetic field (SMF), the strength of drive magnetic field (DMF), the diameter of magnetic nanoparticles (MNPs), and the number of projection data on the image quality of the reconstructed images were investigated. The spatial resolution of the reconstructed images became better with increasing gradient of SMF and with increasing diameter of MNPs u...
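
    The core ML-EM update referred to above is sketched below on a toy linear system; the MPI-specific forward model (Langevin-based harmonics and FFL projections) and the FBP comparison are not reproduced, and the system matrix is random for illustration only.

```python
# ML-EM iteration  x_{k+1} = x_k * A^T( y / (A x_k) ) / (A^T 1)
# applied to a toy projection operator A.
import numpy as np

def ml_em(A, y, n_iter=100):
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])               # sensitivity term A^T 1
    for _ in range(n_iter):
        ratio = y / np.clip(A @ x, 1e-12, None)
        x *= (A.T @ ratio) / np.clip(sens, 1e-12, None)
    return x

rng = np.random.default_rng(0)
A = rng.random((40, 20))                           # toy system matrix
x_true = rng.random(20)
y = rng.poisson(A @ x_true * 50) / 50.0            # noisy projection data
print(np.round(ml_em(A, y)[:5], 2), np.round(x_true[:5], 2))
```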

  16. Warehouse Simulation Through Model Configuration

    NARCIS (Netherlands)

    Verriet, J.H.; Hamberg, R.; Caarls, J.; Wijngaarden, B. van

    2013-01-01

    The pre-build development of warehouse systems leads from a specific customer request to a specific customer quotation. This involves a process of configuring a warehouse system using a sequence of steps that contain increasingly more details. Simulation is a helpful tool in analyzing warehouse desi

  17. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other on the power balance equation. The basis of these two models is given and the modeling process is introduced...

  18. Simulation of laser bistatic two-dimensional scattering imaging about lambertian cylinders

    Science.gov (United States)

    Gong, Yanjun; Li, Lang; Wang, Mingjun; Gong, Lei

    2016-10-01

    This paper deals with the simulation of laser bi-static scattering imaging of Lambertian cylinders. Two-dimensional imaging of a target can reflect the shape of the target and the material properties of its surface, and is therefore important for target recognition. Simulation results of laser bi-static two-dimensional scattering imaging of several cylinders are given; the cylinder surfaces have diffuse Lambertian reflectance. The scattering direction in bi-static imaging is arbitrary, while in backward two-dimensional scattering imaging the scattering direction is opposite to the incident direction of the laser, so backward imaging is a special case of bi-static imaging. The scattering intensity of a surface element on the target is obtained from the laser radar equation and is related to the local angle of incidence, the local angle of scattering and the area of the surface element. The local angle of incidence is calculated from the incident laser direction and the normal of the surface element, and the local angle of scattering from the scattering direction and the same normal. Through surface integration and the introduction of a rectangular function, the intensity of each imaging unit on the image plane is obtained, giving a mathematical model of bi-static laser two-dimensional scattering imaging of a Lambertian cylinder. The results show that the simulated laser bi-static scattering from a Lambertian cylinder is correct.
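
    The per-facet geometry described above reduces to dot products between the incident direction, the scattering direction and the facet normal; a minimal sketch under that assumption is given below, with radar-equation constants omitted and the facet chosen arbitrarily.

```python
# Lambertian facet contribution ~ cos(theta_i) * cos(theta_s) * dA, with the
# local angles obtained from the facet normal and the two directions.
import numpy as np

def lambertian_facet_intensity(normal, d_inc, d_scat, dA=1.0):
    n = normal / np.linalg.norm(normal)
    cos_i = np.dot(n, -d_inc / np.linalg.norm(d_inc))    # local incidence angle
    cos_s = np.dot(n, d_scat / np.linalg.norm(d_scat))   # local scattering angle
    if cos_i <= 0.0 or cos_s <= 0.0:                     # facet not lit or not visible
        return 0.0
    return cos_i * cos_s * dA

# facet on a unit-radius cylinder (axis along z) at azimuth 30 degrees
phi = np.deg2rad(30.0)
normal = np.array([np.cos(phi), np.sin(phi), 0.0])
d_inc = np.array([-1.0, 0.0, 0.0])     # laser travelling along -x
d_scat = np.array([0.0, 1.0, 0.0])     # bistatic receiver along +y
print(lambertian_facet_intensity(normal, d_inc, d_scat))
```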

  19. Synthetically Focused Imaging Techniques in Simulated Austenitic Steel Welds Using AN Ultrasonic Phased Array

    Science.gov (United States)

    Connolly, G. D.; Lowe, M. J. S.; Rokhlin, S. I.; Temple, J. A. G.

    2010-02-01

    In austenitic steel welds employed in safety-critical applications, detection of defects that may propagate during service or may have occurred during welding is particularly important. In this study, synthetically focused imaging techniques are applied to the echoes received by phased arrays in order to reconstruct images of the interior of a simulated austenitic steel weld, with application to sizing and location of simplified defects. Using a ray-tracing approach through a previously developed weld model, we briefly describe and then apply three focusing techniques. Results generated via both ray-tracing theory and finite element simulations will be shown.
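
    A minimal delay-and-sum (total focusing method) sketch is given below for a linear array in a homogeneous medium with a single point scatterer; the weld model in the record above uses ray tracing through the anisotropic weld instead of the straight-ray delays assumed here, and all array and material values are illustrative.

```python
# Delay-and-sum imaging of full matrix capture (FMC) data: for every pixel,
# sum each transmit-receive trace at the pixel's round-trip time of flight.
import numpy as np

def tfm_image(fmc, t, elems_x, grid_x, grid_z, c):
    """fmc[i, j, :] is the trace for transmitter i and receiver j."""
    n_el = len(elems_x)
    image = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            d = np.hypot(elems_x - x, z)                  # element-to-pixel distances
            tof = (d[:, None] + d[None, :]) / c           # transmit + receive times
            idx = np.clip(np.searchsorted(t, tof), 0, len(t) - 1)
            image[iz, ix] = abs(fmc[np.arange(n_el)[:, None],
                                    np.arange(n_el)[None, :], idx].sum())
    return image

elems_x = np.linspace(-5e-3, 5e-3, 8)                     # 8-element array (m)
c, t = 5900.0, np.arange(0.0, 20e-6, 1e-8)                # steel velocity, time axis
scat = np.array([1e-3, 10e-3])                            # point scatterer (x, z)
fmc = np.zeros((8, 8, len(t)))
for i in range(8):
    for j in range(8):
        tof = (np.hypot(elems_x[i] - scat[0], scat[1]) +
               np.hypot(elems_x[j] - scat[0], scat[1])) / c
        fmc[i, j, np.searchsorted(t, tof)] = 1.0          # idealized impulse echoes
img = tfm_image(fmc, t, elems_x, np.linspace(-5e-3, 5e-3, 41),
                np.linspace(5e-3, 15e-3, 41), c)
print(np.unravel_index(img.argmax(), img.shape))          # peak near the scatterer
```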

  20. Simulation of High Density Pedestrian Flow: Microscopic Model

    CERN Document Server

    Dridi, Mohamed H

    2015-01-01

    In recent years modelling crowd and evacuation dynamics has become very important, with increasingly large numbers of people gathering around the world for many reasons and events. The fact that our global population grows dramatically every year and that current public transport systems can move large numbers of people heightens the risk of crowd panic or crush. Pedestrian models are based on macroscopic or microscopic behaviour. In this paper, we are interested in developing models that can be used for evacuation control strategies. This model is based on microscopic pedestrian simulation models, and its evolution and design require a lot of information and data. The people stream is simulated based on mathematical models derived from empirical data about pedestrian flows. This model is developed from image databases, so-called empirical data, taken from a video camera or obtained using human detectors. We consider the individuals as autonomous particles interacting through socia...

  1. Quantum simulation of the t- J model

    Science.gov (United States)

    Yamaguchi, Fumiko; Yamamoto, Yoshihisa

    2002-12-01

    Computer simulation of a many-particle quantum system is bound to reach the inevitable limits of its ability as the system size increases. The primary reason for this is that the memory size used in a classical simulator grows polynomially whereas the Hilbert space of the quantum system does so exponentially. Replacing the classical simulator by a quantum simulator would be an effective method of surmounting this obstacle. The prevailing techniques for simulating quantum systems on a quantum computer have been developed for purposes of computing numerical algorithms designed to obtain approximate physical quantities of interest. The method suggested here requires no numerical algorithms; it is a direct isomorphic translation between a quantum simulator and the quantum system to be simulated. In the quantum simulator, physical parameters of the system, which are the fixed parameters of the simulated quantum system, are under the control of the experimenter. A method of simulating a model for high-temperature superconducting oxides, the t- J model, by optical control, as an example of such a quantum simulation, is presented.

  2. CAUSA - An Environment For Modeling And Simulation

    Science.gov (United States)

    Dilger, Werner; Moeller, Juergen

    1989-03-01

    CAUSA is an environment for modeling and simulation of dynamic systems on a quantitative level. The environment provides a conceptual framework including primitives like objects, processes and causal dependencies which allow the modeling of a broad class of complex systems. The facility of simulation allows the quantitative and qualitative inspection and empirical investigation of the behavior of the modeled system. CAUSA is implemented in Knowledge-Craft and runs on a Symbolics 3640.

  3. Comparative Analysis of Dayside Reconnection Models in Global Magnetosphere Simulations

    CERN Document Server

    Komar, C M; Cassak, P A

    2015-01-01

    We test and compare a number of existing models predicting the location of magnetic reconnection at Earth's dayside magnetopause for various solar wind conditions. We employ robust image processing techniques to determine the locations where each model predicts reconnection to occur. The predictions are then compared to the magnetic separators, the magnetic field lines separating different magnetic topologies. The predictions are tested in distinct high-resolution simulations with interplanetary magnetic field (IMF) clock angles ranging from 30 to 165 degrees in global magnetohydrodynamic simulations using the three-dimensional Block-Adaptive Tree Solarwind Roe-type Upwind Scheme (BATS-R-US) code with a uniform resistivity, although the described techniques can be generally applied to any self-consistent magnetosphere code. Additional simulations are carried out to test location model dependence on IMF strength and dipole tilt. We find that most of the models match large portions of the magnetic separators wh...

  4. Modeling and Simulation of Matrix Converter

    DEFF Research Database (Denmark)

    Liu, Fu-rong; Klumpner, Christian; Blaabjerg, Frede

    2005-01-01

    This paper discusses the modeling and simulation of the matrix converter. Two models of the matrix converter are presented: one is based on indirect space vector modulation and the other on the power balance equation. The basis of these two models is given and the modeling process is introduced...... in detail. The results of simulations developed for different research purposes reveal that different models may be suitable for different purposes, so the model should be chosen carefully. Some details and tricks in modeling are also introduced which give a reference for further research....

  5. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has appeared to be a very advantageous technique for studying resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method which combines the merits of traditional methods such as IDEF0 and Petri nets. A four-layer, one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers, such as hybrid production control modeling and human resource dispatch modeling, is expounded. To validate the modeling method, a case study of an automotive product line in a motor manufacturing company has been carried out.

  6. GalSim: The modular galaxy image simulation toolkit

    CERN Document Server

    Rowe, Barnaby; Mandelbaum, Rachel; Bernstein, Gary M; Bosch, James; Simet, Melanie; Meyers, Joshua E; Kacprzak, Tomasz; Nakajima, Reiko; Zuntz, Joe; Miyatake, Hironao; Dietrich, Joerg P; Armstrong, Robert; Melchior, Peter; Gill, Mandeep S S

    2014-01-01

    GALSIM is a collaborative, open-source project aimed at providing an image simulation tool of enduring benefit to the astronomical community. It provides a software library for generating images of astronomical objects such as stars and galaxies in a variety of ways, efficiently handling image transformations and operations such as convolution and rendering at high precision. We describe the GALSIM software and its capabilities, including necessary theoretical background. We demonstrate that the performance of GALSIM meets the stringent requirements of high precision image analysis applications such as weak gravitational lensing, for current datasets and for the Stage IV dark energy surveys of the Large Synoptic Survey Telescope, ESA's Euclid mission, and NASA's WFIRST-AFTA mission. The GALSIM project repository is public and includes the full code history, all open and closed issues, installation instructions, documentation, and wiki pages (including a Frequently Asked Questions section). The GALSIM reposito...
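
    A hedged usage sketch, based on the public GalSim documentation, is given below; the profile choices and parameter values are arbitrary and should be checked against the installed GalSim version.

```python
# Draw a toy sheared galaxy convolved with an atmospheric PSF using GalSim.
import galsim

gal = galsim.Exponential(half_light_radius=0.5, flux=1.0e4)  # galaxy (arcsec, counts)
gal = gal.shear(g1=0.02, g2=0.0)                             # weak-lensing shear
psf = galsim.Kolmogorov(fwhm=0.7)                            # atmospheric PSF
final = galsim.Convolve([gal, psf])
image = final.drawImage(nx=64, ny=64, scale=0.2)             # 0.2 arcsec pixels
image.addNoise(galsim.GaussianNoise(sigma=30.0))             # simple sky noise
print(image.array.shape, image.array.sum())
```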

  7. Restoration of polarimetric SAR images using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper; Skriver, Henning

    2001-01-01

    Filtering synthetic aperture radar (SAR) images ideally results in better estimates of the parameters characterizing the distributed targets in the images while preserving the structures of the nondistributed targets. However, these objectives are normally conflicting, often leading to a filtering...... approach favoring one of the objectives. An algorithm for estimating the radar cross-section (RCS) for intensity SAR images has previously been proposed in the literature based on Markov random fields and the stochastic optimization method simulated annealing. A new version of the algorithm is presented...... are obtained while at the same time preserving most of the structures in the image. The algorithm is evaluated using multilook polarimetric L-band data from the Danish airborne EMISAR system, and the impact of the algorithm on the unsupervised H-α classification is demonstrated......
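
    A generic simulated-annealing sketch (Metropolis acceptance with geometric cooling) is given below for a smoothness-regularised denoising problem; the energy used in the record above couples a speckle likelihood with a Markov-random-field prior, which is not reproduced here.

```python
# Simulated annealing on a 1D denoising energy: data term + smoothness term.
import numpy as np

def anneal(y, lam=4.0, t0=1.0, cooling=0.995, n_sweeps=400, seed=0):
    rng = np.random.default_rng(seed)
    x, T = y.copy(), t0
    for _ in range(n_sweeps):
        for i in rng.permutation(len(x)):
            prop = x[i] + rng.normal(scale=0.1)              # proposed new value
            dE = (prop - y[i]) ** 2 - (x[i] - y[i]) ** 2     # change in data term
            for j in (i - 1, i + 1):                         # change in smoothness term
                if 0 <= j < len(x):
                    dE += lam * ((prop - x[j]) ** 2 - (x[i] - x[j]) ** 2)
            if dE < 0 or rng.random() < np.exp(-dE / T):     # Metropolis acceptance
                x[i] = prop
        T *= cooling                                         # geometric cooling schedule
    return x

rng = np.random.default_rng(1)
truth = np.repeat([0.0, 1.0, 0.3], 40)
noisy = truth + rng.normal(scale=0.2, size=truth.size)
print(np.abs(anneal(noisy) - truth).mean(), np.abs(noisy - truth).mean())
```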

  8. Simulation study comparing high-purity germanium and cadmium zinc telluride detectors for breast imaging

    Science.gov (United States)

    Campbell, D. L.; Peterson, T. E.

    2014-11-01

    We conducted simulations to compare the potential imaging performance for breast cancer detection with High-Purity Germanium (HPGe) and Cadmium Zinc Telluride (CZT) systems with 1% and 3.8% energy resolution at 140 keV, respectively. Using the Monte Carlo N-Particle (MCNP5) simulation package, we modelled both 5 mm-thick CZT and 10 mm-thick HPGe detectors with the same parallel-hole collimator for the imaging of a breast/torso phantom. Simulated energy spectra were generated, and planar images were created for various energy windows around the 140 keV photopeak. Relative sensitivity and scatter and the torso fractions were calculated along with tumour contrast and signal-to-noise ratios (SNR). Simulations showed that utilizing a ±1.25% energy window with an HPGe system better suppressed torso background and small-angle scattered photons than a comparable CZT system using a -5%/+10% energy window. Both systems provided statistically similar contrast and SNR, with HPGe providing higher relative sensitivity. Lowering the counts of HPGe images to match CZT count density still yielded equivalent contrast between HPGe and CZT. Thus, an HPGe system may provide equivalent breast imaging capability at lower injected radioactivity levels when acquiring for equal imaging time.

  9. On the simulation and mitigation of anisoplanatic optical turbulence for long range imaging

    Science.gov (United States)

    Hardie, Russell C.; LeMaster, Daniel A.

    2017-05-01

    We describe a numerical wave propagation method for simulating long range imaging of an extended scene under anisoplanatic conditions. Our approach computes an array of point spread functions (PSFs) for a 2D grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. To validate the simulation we compare simulated outputs with the theoretical anisoplanatic tilt correlation and differential tilt variance, in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. The simulation tool is also used here to quantitatively evaluate a recently proposed block-matching and Wiener filtering (BMWF) method for turbulence mitigation. In this method, a block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged and processed with a Wiener filter for restoration. A novel aspect of the proposed BMWF method is that the PSF model used for restoration takes into account the level of geometric correction achieved during image registration. This way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. The BMWF method is relatively simple computationally, and yet has excellent performance in comparison to state-of-the-art benchmark methods.
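
    As a minimal illustration of the degradation step described above, the sketch below blends two blurred copies of an ideal image with spatial weights, approximating a spatially varying PSF; the Gaussian PSFs and the left-to-right weight map are placeholders, not wave-optics turbulence PSFs.

```python
# Spatially varying blur as a weighted sum of per-PSF convolutions.
import numpy as np
from scipy.ndimage import gaussian_filter

def spatially_varying_blur(img, sigmas=(1.0, 3.0)):
    """Blend a weak and a strong blur, varying left to right across the frame."""
    blurred = [gaussian_filter(img, s) for s in sigmas]
    w = np.linspace(0.0, 1.0, img.shape[1])[None, :]   # per-column weight map
    return (1.0 - w) * blurred[0] + w * blurred[1]

rng = np.random.default_rng(0)
ideal = rng.random((128, 128))
print(spatially_varying_blur(ideal).shape)
```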

  10. Modeling and simulation of imaging plane irradiance for near space infrared detecting

    Institute of Scientific and Technical Information of China (English)

    黄晨; 梁小虎; 王建军; 高昕

    2014-01-01

    Effective detection of low-altitude penetration targets, which fly at low altitude for a short time, has long been a research focus and a difficult problem. Taking 2.7 μm and 4.3 μm as the center wavelengths of the detection system, an image plane irradiance model for near space infrared detection was established. The SBIRS space-based infrared system and a near space detection system were simulated. By simulating a typical target trajectory and considering atmospheric effects, the image plane irradiance of the infrared radiation and the modulation contrast between target and background during the boost and post-boost phases were obtained for both the SBIRS early-warning satellite and the near space platform. Simulation results show that, for low-altitude penetration targets, the near space platform provides higher signal contrast and longer detection time than SBIRS. In the post-boost phase, detection with a 4.3 μm center wavelength performs better than with 2.7 μm.

  11. Multiscale Model Approach for Magnetization Dynamics Simulations

    CERN Document Server

    De Lucia, Andrea; Tretiakov, Oleg A; Kläui, Mathias

    2016-01-01

    Simulations of magnetization dynamics in a multiscale environment enable rapid evaluation of the Landau-Lifshitz-Gilbert equation in a mesoscopic sample with nanoscopic accuracy in areas where such accuracy is required. We have developed a multiscale magnetization dynamics simulation approach that can be applied to large systems with spin structures that vary locally on small length scales. To implement this, the conventional micromagnetic simulation framework has been expanded to include a multiscale solving routine. The software selectively simulates different regions of a ferromagnetic sample according to the spin structures located within in order to employ a suitable discretization and use either a micromagnetic or an atomistic model. To demonstrate the validity of the multiscale approach, we simulate the spin wave transmission across the regions simulated with the two different models and different discretizations. We find that the interface between the regions is fully transparent for spin waves with f...

  12. Imaging simulated secondary caries lesions with cross polarization OCT

    Science.gov (United States)

    Stahl, Jonathan; Kang, Hobin; Fried, Daniel

    2010-02-01

    The clinical diagnosis of secondary caries has been found to account for the replacement of the majority of intra-coronal restorations. Current methods to diagnose the presence of these lesions at early stages are considered insufficient due to their low sensitivity. Polarization-sensitive optical coherence tomography (PS-OCT) imaging studies have confirmed its effectiveness for imaging carious subsurface lesions in enamel and dentin. The objective of this study was to determine if PS-OCT can be used to nondestructively image demineralization through resin restorations on extracted teeth with both simulated and natural lesions. Simulated secondary caries lesions were created by exposing cavity preparations made in extracted human teeth to a demineralizing solution for 48 hours and subsequently restoring with resin. Negative control restorations were also prepared on each tooth. Optical changes in demineralized versus control preparations beneath restorations were measured as a function of depth using PS-OCT. PS-OCT images indicated that a significant increase in reflectivity and depth occurred in the simulated lesions compared with the control preparations. This study suggests that PS-OCT is well-suited to nondestructively detect early caries lesions in enamel beneath composite restorations.

  13. Simulations of Keratoconus Patient Vision with Optical Eye Modeling

    Science.gov (United States)

    Tan, Bo; Chen, Ying-Ling; Lewis, J. W. L.; Shi, Lei; Wang, Ming

    2007-11-01

    Keratoconus (KC) is an eye condition that involves progressive corneal thinning. Pushed by the intraocular pressure, the weakened cornea bulges outward and creates an irregular surface shape. The result is degraded vision that is difficult to correct with regular eye glasses or contact lens. In this study we use the optical lens design software, ZeMax, and patient data including cornea topography and refraction prescription to construct KC eye models. The variation of KC ``cone height'' on the cornea is used to simulate KC progression. The consequent patients' night vision and Snellen letter chart vision at 20 feet are simulated using these anatomically accurate 3-dimensional models. 100 million rays are traced for each image simulation. Animated results illustrate the change of KC visual acuity with the progression of disease. This simulation technique provides a comprehensive tool for medical training and patient consultation/education.

  14. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose...... of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  15. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  16. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose...... of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given....

  17. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  18. Simulation modeling and analysis with Arena

    Energy Technology Data Exchange (ETDEWEB)

    Tayfur Altiok; Benjamin Melamed [Rutgers University, NJ (United States). Department of Industrial and Systems Engineering

    2007-06-15

    The textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of a popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Chapter 13.3.3 is on coal loading operations on barges/tugboats.

  19. Simulation of the imaging quality of ground-based telescopes affected by atmospheric disturbances

    Science.gov (United States)

    Ren, Yubin; Kou, Songfeng; Gu, Bozhong

    2014-08-01

    A ground-based telescope imaging model is developed in this paper, and the relationship between atmospheric disturbances and ground-based telescope image quality is studied. Simulation of the wave-front distortions caused by atmospheric turbulence has long been an important method in the study of the propagation of light through the atmosphere. The phase of the starlight wave-front changes over time, but over a sufficiently short exposure the atmospheric disturbance can be considered "frozen". In accordance with Kolmogorov turbulence theory, the atmospheric disturbance is simulated by generating turbulence-distorted phase screens with the fast Fourier transform (FFT). A Geiger-mode avalanche photodiode array (APD array) model is used for atmospheric wave-front detection, and the image is obtained by a photon-counting inversion method after the target starlight passes through the phase screens and the ground-based telescope. The imaging model established in this paper can accurately capture the relationship between telescope image quality and single-layer or multilayer atmospheric disturbances, and it is of great significance for wave-front detection and optical correction in a Multi-conjugate Adaptive Optics (MCAO) system.
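
    A minimal FFT-based Kolmogorov phase-screen sketch is given below: white noise is filtered with the square root of the Kolmogorov spectrum and inverse-transformed. The grid, Fried parameter and overall normalization are illustrative assumptions, not values from the paper.

```python
# Generate one realization of a Kolmogorov-like phase screen by FFT filtering.
import numpy as np

def kolmogorov_phase_screen(n=256, pixel_scale=0.02, r0=0.1, seed=0):
    """n x n screen (radians); pixel_scale and r0 in metres. Scaling is approximate."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, d=pixel_scale)
    fxx, fyy = np.meshgrid(fx, fx)
    f = np.hypot(fxx, fyy)
    f[0, 0] = 1.0 / (n * pixel_scale)                  # avoid the singularity at f = 0
    psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    screen = np.fft.ifft2(noise * np.sqrt(psd)) * n / pixel_scale
    return screen.real

print(kolmogorov_phase_screen().std())                 # rms phase of one realization
```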

  20. Nanoindentation shape effect: experiments, simulations and modelling

    Energy Technology Data Exchange (ETDEWEB)

    Calabri, L [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Pugno, N [Department of Structural Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin (Italy); Rota, A [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Marchetto, D [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Valeri, S [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy)

    2007-10-03

    AFM nanoindentation is nowadays commonly used for the study of mechanical properties of materials at the nanoscale. The investigation of surface hardness of a material using AFM means that the probe has to be able to indent the surface, but also to image it. Usually standard indenters are not sharp enough to obtain high-resolution images, but on the other hand measuring the hardness behaviour of a material with a non-standard sharp indenter gives only comparative results affected by a significant deviation from the commonly used hardness scales. In this paper we try to understand how the shape of the indenter affects the hardness measurement, in order to find a relationship between the measured hardness of a material and the corner angle of a pyramidal indenter. To achieve this we performed a full experimental campaign, indenting the same material with three focused ion beam (FIB) nanofabricated probes with a highly altered corner angle. We then compared the results obtained experimentally with those obtained by numerical simulations, using the finite element method (FEM), and by theoretical models, using a general scaling law for nanoindentation available for indenters with a variable size and shape. The comparison between these three approaches (experimental, numerical and theoretical approaches) reveals a good agreement and allowed us to find a theoretical relationship which links the measured hardness value with the shape of the indenter. The same theoretical approach has also been used to fit the hardness experimental results considering the indentation size effect. In this case we compare the measured data, changing the applied load.

  1. Medical image segmentation by MDP model

    Science.gov (United States)

    Lu, Yisu; Chen, Wufan

    2011-11-01

    The MDP (Dirichlet Process Mixtures) model is applied to segment medical images in this paper. Segmentation can be performed automatically without initializing the number of segmentation classes. The MDP segmentation algorithm is used to segment natural images and MR (Magnetic Resonance) images. To demonstrate its accuracy, comparison experiments against the EM (Expectation Maximization), K-means and MRF (Markov Random Field) image segmentation algorithms were carried out on medical MR images. All the methods are also analyzed quantitatively using DSC (Dice Similarity Coefficients). The experimental results show that the DSC of the MDP segmentation algorithm exceeds 90% for all slices, which shows that the proposed method is robust and accurate.
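
    The Dice Similarity Coefficient used above for the quantitative comparison is DSC = 2|A ∩ B| / (|A| + |B|); a small sketch for binary masks follows, with toy rectangles standing in for real segmentations.

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

seg = np.zeros((64, 64), bool); seg[16:48, 16:48] = True
ref = np.zeros((64, 64), bool); ref[20:48, 16:48] = True
print(round(dice(seg, ref), 3))
```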

  2. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when...... conducting simulation experiments....

  3. Modeling and simulation for RF system design

    CERN Document Server

    Frevert, Ronny; Jancke, Roland; Knöchel, Uwe; Schwarz, Peter; Kakerow, Ralf; Darianian, Mohsen

    2005-01-01

    Focusing on RF specific modeling and simulation methods, and system and circuit level descriptions, this work contains application-oriented training material. Accompanied by a CD- ROM, it combines the presentation of a mixed-signal design flow, an introduction into VHDL-AMS and Verilog-A, and the application of commercially available simulators.

  4. Research of nonlinear simulation on sweep voltage of streak tube imaging lidar

    Science.gov (United States)

    Zhai, Qian; Han, Shao-kun; Zhai, Yu; Lei, Jie-yu; Yao, Jian-feng

    2016-10-01

    In order to study the influence of nonlinear sweep voltage on the range accuracy of a streak tube imaging lidar, a nonlinear range model of the streak tube is proposed. The parallel-plate deflection system is studied, and the mathematical relation between the sweep voltage and the position of the image point on the screen is obtained from the motion of the photoelectrons. The mathematical model of the sweep voltage is then established from its operating principle. The streak image is simulated for a selected staircase target, and the range image of the target is reconstructed by an extremum method. Comparing the reconstruction with the actual target gives the range error caused by the nonlinear sweep voltage, as well as the curve of this error as a function of target range. The range accuracy of the system is also analyzed by varying the parameter related to the sweep time.

  5. Dark Energy Studies with LSST Image Simulations, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, John Russell [Purdue Univ., West Lafayette, IN (United States)

    2016-07-26

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  6. STEM image simulation with hybrid CPU/GPU programming.

    Science.gov (United States)

    Yao, Y; Ge, B H; Shen, X; Wang, Y G; Yu, R C

    2016-07-01

    STEM image simulation is achieved via hybrid CPU/GPU programming under parallel algorithm architecture to speed up calculation on a personal computer (PC). To utilize the calculation power of a PC fully, the simulation is performed using the GPU core and multi-CPU cores at the same time to significantly improve efficiency. GaSb and an artificial GaSb/InAs interface with atom diffusion have been used to verify the computation. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Blood flow in the cerebral venous system: modeling and simulation.

    Science.gov (United States)

    Miraucourt, Olivia; Salmon, Stéphanie; Szopos, Marcela; Thiriet, Marc

    2017-04-01

    The development of a software platform incorporating all aspects, from medical imaging data, through three-dimensional reconstruction and suitable meshing, up to simulation of blood flow in patient-specific geometries, is a crucial challenge in biomedical engineering. In the present study, a fully three-dimensional blood flow simulation is carried out through a complete rigid macrovascular circuit, namely the intracranial venous network, instead of a reduced order simulation and partial vascular network. The biomechanical modeling step is carefully analyzed and leads to the description of the flow governed by the dimensionless Navier-Stokes equations for an incompressible viscous fluid. The equations are then numerically solved with a free finite element software using five meshes of a realistic geometry obtained from medical images to prove the feasibility of the pipeline. Some features of the intracranial venous circuit in the supine position such as asymmetric behavior in merging regions are discussed.

  8. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  9. 3-D imaging of seismic data from a physical model of a salt structure

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, P. M. (Peter M.); Huang, L. (Lianjie); House, L. S. (Leigh S.); Wiley, R. (Robert)

    2001-01-01

    Seismic data from a physical model of the SEG/EAGE salt structure were imaged to evaluate the quality of imaging of a complex structure and benchmark imaging codes. The physical model was constructed at the University of Houston. Two simulated marine surveys were collected from it: a conventional towed streamer survey, and a vertical receiver cable survey.

  10. A sand wave simulation model

    NARCIS (Netherlands)

    Nemeth, A.A.; Hulscher, S.J.M.H.; Damme, van R.M.J.

    2003-01-01

    Sand waves form a prominent regular pattern in the offshore seabeds of sandy shallow seas. A two dimensional vertical (2DV) flow and morphological numerical model describing the behaviour of these sand waves has been developed. The model contains the 2DV shallow water equations, with a free water su

  11. Modelling Reactive and Proactive Behaviour in Simulation

    CERN Document Server

    Majid, Mazlina Abdul; Aickelin, Uwe

    2010-01-01

    This research investigated the behaviour of traditional discrete-event and combined discrete-event/agent-based simulation models when modelling human reactive and proactive behaviour in human-centric complex systems. A department store was chosen as the human-centric complex case study, where the operation of a fitting room in the WomensWear department was investigated. We looked at ways to determine the efficiency of new management policies for the fitting room operation by simulating the reactive and proactive behaviour of staff towards customers. Once the simulation models had been developed and verified, we carried out a validation experiment in the form of a sensitivity analysis. Subsequently, we performed a statistical analysis in which the mixed reactive and proactive behaviour experimental results were compared with reactive experimental results from previously published works. Generally, this case study discovered that simple proactive individual behaviou...

  12. Challenges in SysML Model Simulation

    Directory of Open Access Journals (Sweden)

    Mara Nikolaidou

    2016-07-01

    Full Text Available Systems Modeling Language (SysML is a standard proposed by the OMG for systems-of-systems (SoS modeling and engineering. To this end, it provides the means to depict SoS components and their behavior in a hierarchical, multi-layer fashion, facilitating alternative engineering activities, such as system design. To explore the performance of SysML, simulation is one of the preferred methods. There are many efforts targeting simulation code generation from SysML models. Numerous simulation methodologies and tools are employed, while different SysML diagrams are utilized. Nevertheless, this process is not standardized, although most of current approaches tend to follow the same steps, even if they employ different tools. The scope of this paper is to provide a comprehensive understanding of the similarities and differences of existing approaches and identify current challenges in fully automating SysML models simulation process.

  13. SIMULATION MODELING SLOW SPATIALLY HETEROGENEOUS COAGULATION

    Directory of Open Access Journals (Sweden)

    P. A. Zdorovtsev

    2013-01-01

    Full Text Available A new model of spatially inhomogeneous coagulation, i.e. formation of larger clusters by joint interaction of smaller ones, is under study. The results of simulation are compared with known analytical and numerical solutions.

  14. Simulation study of secondary electron images in scanning ion microscopy

    CERN Document Server

    Ohya, K

    2003-01-01

    The target atomic number, Z_2, dependence of secondary electron yield is simulated by applying a Monte Carlo code for 17 species of metals bombarded by Ga ions and electrons in order to study the contrast difference between scanning ion microscopes (SIM) and scanning electron microscopes (SEM). In addition to the remarkable reversal of the Z_2 dependence between the Ga ion and electron bombardment, a fine structure, which is correlated to the density of the conduction band electrons in the metal, is calculated for both. The brightness changes of the secondary electron images in SIM and SEM are simulated using Au and Al surfaces adjacent to each other. The results indicate that the image contrast in SIM is much more sensitive to the material species and is clearer than that for SEM. The origin of the difference between SIM and SEM comes from the difference in the lateral distribution of secondary electrons excited within the escape depth.

  15. Simulated Extragalactic Observations with a Cryogenic Imaging Spectrophotometer

    CERN Document Server

    Mazin, B A; Mazin, Ben A.; Brunner, Robert J.

    2000-01-01

    In this paper we explore the application of cryogenic imaging spectrophotometers. Prototypes of this new class of detector, such as superconducting tunnel junctions (STJs) and transition edge sensors (TESs), currently deliver low resolution imaging spectrophotometry with high quantum efficiency (70-100%) and no read noise over a wide bandpass in the visible to near-infrared. In order to demonstrate their utility and the differences in observing strategy needed to maximize their scientific return, we present simulated observations of a deep extragalactic field. Using a simple analytic technique, we can estimate both the galaxy redshift and spectral type more accurately than is possible with current broadband techniques. From our simulated observations and a subsequent discussion of the expected migration path for this new technology, we illustrate the power and promise of these devices.

  16. Theory, modeling, and simulation annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    This report briefly discusses research on the following topics: development of electronic structure methods; modeling molecular processes in clusters; modeling molecular processes in solution; modeling molecular processes in separations chemistry; modeling interfacial molecular processes; modeling molecular processes in the atmosphere; methods for periodic calculations on solids; chemistry and physics of minerals; graphical user interfaces for computational chemistry codes; visualization and analysis of molecular simulations; integrated computational chemistry environment; and benchmark computations.

  17. GPU accelerated Monte-Carlo simulation of SEM images for metrology

    Science.gov (United States)

    Verduin, T.; Lokhorst, S. R.; Hagen, C. W.

    2016-03-01

    In this work we address the computation times of numerical studies in dimensional metrology. In particular, full Monte-Carlo simulation programs for scanning electron microscopy (SEM) image acquisition are known to be notoriously slow. Our quest in reducing the computation time of SEM image simulation has led us to investigate the use of graphics processing units (GPUs) for metrology. We have succeeded in creating a full Monte-Carlo simulation program for SEM images, which runs entirely on a GPU. The physical scattering models of this GPU simulator are identical to a previous CPU-based simulator, which includes the dielectric function model for inelastic scattering and also refinements for low-voltage SEM applications. As a case study for the performance, we considered the simulated exposure of a complex feature: an isolated silicon line with rough sidewalls located on a flat silicon substrate. The surface of the rough feature is decomposed into 408 012 triangles. We have used an exposure dose of 6 mC/cm2, which corresponds to 6 553 600 primary electrons on average (Poisson distributed). We repeat the simulation for various primary electron energies, 300 eV, 500 eV, 800 eV, 1 keV, 3 keV and 5 keV. At first we run the simulation on a GeForce GTX480 from NVIDIA. The very same simulation is duplicated on our CPU-based program, for which we have used an Intel Xeon X5650. Apart from statistics in the simulation, no difference is found between the CPU and GPU simulated results. The GTX480 generates the images (depending on the primary electron energy) 350 to 425 times faster than a single-threaded Intel X5650 CPU. Although this is a tremendous speedup, we actually have not reached the maximum throughput because of the limited amount of available memory on the GTX480. Nevertheless, the speedup enables the fast acquisition of simulated SEM images for metrology. We now have the potential to investigate case studies in CD-SEM metrology, which otherwise would take unreasonable

  18. Application of Chebyshev Polynomial to simulated modeling

    Institute of Scientific and Technical Information of China (English)

    CHI Hai-hong; LI Dian-pu

    2006-01-01

    The Chebyshev polynomial is widely used in many fields, usually as a function approximation in numerical calculation. In this paper, a Chebyshev polynomial expression of the propeller properties across four quadrants is given first; then the Chebyshev polynomial expression is transformed into an ordinary polynomial as needed for the simulation of propeller dynamics. On this basis, the dynamical models of the propeller across four quadrants are given. The simulation results show the efficiency of the mathematical model.
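
    The conversion from a Chebyshev series to an ordinary power-series polynomial mentioned above can be illustrated with NumPy's Chebyshev utilities; the coefficients below are arbitrary and are not real propeller data.

```python
# Rewrite a Chebyshev series c0*T0 + c1*T1 + ... as an ordinary polynomial.
import numpy as np
from numpy.polynomial import chebyshev as C
from numpy.polynomial import polynomial as P

cheb_coeffs = np.array([0.5, -0.2, 0.1, 0.05])
power_coeffs = C.cheb2poly(cheb_coeffs)        # coefficients in ascending powers of x

x = np.linspace(-1.0, 1.0, 5)
print(np.allclose(C.chebval(x, cheb_coeffs), P.polyval(x, power_coeffs)))  # True
```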

  19. Collisionless Electrostatic Shock Modeling and Simulation

    Science.gov (United States)

    2016-10-21

    Briefing charts by Daniel W. Crews (Air Force Research Laboratory), dated 30 September 2016 - 21 October 2016, on collisionless electrostatic shock modeling and simulation. Overview topics: motivation and background; what is a collisionless shock wave; features of the collisionless shock; the shock simulation. Approved for public release: distribution unlimited (PA#16490).

  20. Comparative Analysis of Reconstructed Image Quality in a Simulated Chromotomographic Imager

    Science.gov (United States)

    2014-03-01

    Figure captions describe projections of a segment of an Air Force bar chart in which three wavelengths are present, and a shift-and-add reconstruction in which the displacement of each image is a function of wavelength. Simulation settings used in the experiment: oversampling - none; aberration type - geometric; pupil sampling - 32x32; image sampling - 32x32.

  1. Modeling of magnetic particle suspensions for simulations

    CERN Document Server

    Satoh, Akira

    2017-01-01

    The main objective of the book is to highlight the modeling of magnetic particles with different shapes and magnetic properties, to provide graduate students and young researchers information on the theoretical aspects and actual techniques for the treatment of magnetic particles in particle-based simulations. In simulation, we focus on the Monte Carlo, molecular dynamics, Brownian dynamics, lattice Boltzmann and stochastic rotation dynamics (multi-particle collision dynamics) methods. The latter two simulation methods can simulate both the particle motion and the ambient flow field simultaneously. In general, specialized knowledge can only be obtained in an effective manner under the supervision of an expert. The present book is written to play such a role for readers who wish to develop the skill of modeling magnetic particles and develop a computer simulation program using their own ability. This book is therefore a self-learning book for graduate students and young researchers. Armed with this knowledge,...

  2. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle...... velocity can be approximated by a Gaussian Markov process. Known approximate results for the first-passage density or equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  3. Modelling and Simulation of Wave Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1985-01-01

    A simple model of the wave load on slender members of offshore structures is described. The wave elevation of the sea state is modelled by a stationary Gaussian process. A new procedure to simulate realizations of the wave loads is developed. The simulation method assumes that the wave particle...... velocity can be approximated by a Gaussian Markov process. Known approximate results for the first passage density or equivalently, the distribution of the extremes of wave loads are presented and compared with rather precise simulation results. It is demonstrated that the approximate results...

  4. Modeling and simulation of multiport RF switch

    Energy Technology Data Exchange (ETDEWEB)

    Vijay, J [Student, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India); Saha, Ivan [Scientist, Indian Space Research Organisation (ISRO) (India); Uma, G [Lecturer, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India); Umapathy, M [Assistant Professor, Department of Instrumentation and Control Engineering, National Institute of Technology, Tiruchirappalli-620015 (India)

    2006-04-01

    This paper describes the modeling and simulation of a 'Multi Port RF Switch' in which the latching mechanism is realized with two hot-arm electrothermal actuators and the switching action is realized with electrostatic actuators. It can act as a single-pole single-throw as well as a single-pole multi-throw switch. The proposed structure is modeled analytically and the required parameters are simulated using MATLAB. The analytical simulation results are validated by finite element analysis of the same structure in the COVENTORWARE software.

  5. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  6. Traffic Modeling in WCDMA System Level Simulations

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Traffic modeling is a crucial element in WCDMA system level simulations. A clear understanding of the nature of traffic in the WCDMA system and subsequent selection of an appropriate random traffic model are critical to the success of the modeling enterprise. The resulting performance will evidently be a function of how well the design is adapted to the traffic, channel and user mobility models, and of how accurate these models are. In this article, our attention is focused on modeling voice and WWW data traffic with the SBBP model and the Victor model, respectively.

  7. Monte Carlo simulations of landmine detection using neutron backscattering imaging

    Energy Technology Data Exchange (ETDEWEB)

    Datema, Cor P. E-mail: c.datema@iri.tudelft.nl; Bom, Victor R.; Eijk, Carel W.E. van

    2003-11-01

    Neutron backscattering is a technique that has successfully been applied to the detection of non-metallic landmines. Most of the effort in this field has concentrated on single detectors that are scanned across the soil. Here, two new approaches are presented in which a two-dimensional image of the hydrogen distribution in the soil is made. The first method uses an array of position-sensitive {sup 3}He-tubes that is placed in close proximity of the soil. The second method is based on coded aperture imaging. Here, thermal neutrons from the soil are projected onto a detector which is typically placed one to several meters above the soil. Both methods use a pulsed D/D neutron source. The Monte Carlo simulation package GEANT 4 was used to investigate the performance of both imaging systems.

  8. Seismic imaging and evaluation of channels modeled by boolean approach

    Energy Technology Data Exchange (ETDEWEB)

    Spinola, M.; Aggio, A. [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas

    1999-07-01

    The seismic method attempts to image the subsurface architecture and has been able to contribute significantly to detecting areal and vertical changes in rock properties. This work presents a seismic imaging study of channel objects generated using the boolean technique. Three channels having different thicknesses were simulated, using the same width, sinuosity and direction. A velocity model was constructed in order to provide seismic contrasts between the interior of the channels and the embedding rock. To examine the seismic response for different channel thicknesses, 3D ray tracing with a normal incidence point survey was performed. The three channels were resolved, and the way the seismic data could image them was studied. (author)

  9. Effects of specific surface area and porosity on cube counting fractal dimension, lacunarity, configurational entropy, and permeability of model porous networks: Random packing simulations and NMR micro-imaging study

    Science.gov (United States)

    Lee, Bum Han; Lee, Sung Keun

    2013-07-01

    Despite the importance of understanding and quantifying the microstructure of porous networks in diverse geologic settings, the effects of the specific surface area and porosity on the key structural parameters of the networks have not been fully understood. We performed cube-counting fractal dimension (Dcc) and lacunarity analyses of 3D porous networks of model sands and configurational entropy analysis of 2D cross sections of model sands using random packing simulations and nuclear magnetic resonance (NMR) micro-imaging. We established relationships among porosity, specific surface area, structural parameters (Dcc and lacunarity), and the corresponding macroscopic properties (configurational entropy and permeability). The Dcc of the 3D porous networks increases with increasing specific surface area at a constant porosity and with increasing porosity at a constant specific surface area. Predictive relationships correlating Dcc, specific surface area, and porosity were also obtained. The lacunarity at the minimum box size decreases with increasing porosity, and that at the intermediate box size (∼0.469 mm in the current model sands) was reproduced well with specific surface area. The maximum configurational entropy increases with increasing porosity, and the entropy length of the pores decreases with increasing specific surface area and was used to calculate the average connectivity among the pores. The correlation among porosity, specific surface area, and permeability is consistent with the prediction from the Kozeny-Carman equation. From the relationship between the permeability and the Dcc of pores, the permeability can be expressed as a function of the Dcc of pores and porosity. The current methods and these newly identified correlations among structural parameters and properties provide improved insights into the nature of porous media and have useful geophysical and hydrological implications for elasticity and shear viscosity of complex composites of rock
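
    The cube-counting dimension Dcc used above is the three-dimensional analogue of box counting: the pore network is covered with cubes of decreasing edge length, and Dcc is the slope of log(occupied cube count) versus log(1/edge length). A minimal sketch of this estimator, assuming a cubic binary voxel array rather than the authors' NMR or random-packing data, is:

        import numpy as np

        def cube_counting_dimension(pore_mask, box_sizes=(2, 4, 8, 16, 32)):
            """Estimate the cube-counting fractal dimension Dcc of a 3D binary
            pore network (True = pore voxel) stored in a cubic array."""
            n = pore_mask.shape[0]
            counts = []
            for b in box_sizes:
                occupied = 0
                for i in range(0, n, b):
                    for j in range(0, n, b):
                        for k in range(0, n, b):
                            if pore_mask[i:i + b, j:j + b, k:k + b].any():
                                occupied += 1
                counts.append(occupied)
            # Dcc is the slope of log(occupied count) versus log(1 / box size)
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)),
                                  np.log(counts), 1)
            return slope

        # A dense random medium fills space, so Dcc comes out close to 3
        rng = np.random.default_rng(0)
        volume = rng.random((64, 64, 64)) < 0.35
        print("Dcc =", round(cube_counting_dimension(volume), 2))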

  10. Simulation of Meteosat Third Generation-Lightning Imager through tropical rainfall measuring mission: Lightning Imaging Sensor data

    Science.gov (United States)

    Biron, Daniele; De Leonibus, Luigi; Laquale, Paolo; Labate, Demetrio; Zauli, Francesco; Melfi, Davide

    2008-08-01

    The Centro Nazionale di Meteorologia e Climatologia Aeronautica recently hosted a fellowship sponsored by Galileo Avionica, with the intent to study and perform a simulation of the Meteosat Third Generation - Lightning Imager (MTG-LI) sensor behavior through Tropical Rainfall Measuring Mission - Lightning Imaging Sensor (TRMM-LIS) data. For the next generation of geostationary Earth observation satellites, the major operating agencies are planning to include an optical imaging mission that continuously observes lightning pulses in the atmosphere; EUMETSAT has decided in recent years that one of the three candidate missions to be flown on MTG is LI, a Lightning Imager. The MTG-LI mission has no Meteosat Second Generation heritage, but users need to evaluate the possible real-time data output of the instrument before agreeing to include it in the MTG payload. The authors took the expected LI design from the MTG Mission Requirement Document and reprocessed a real lightning dataset, acquired from space by the TRMM-LIS instrument, to produce a simulated MTG-LI lightning dataset. The simulation was performed in several runs, varying the Minimum Detectable Energy and taking into account the processing steps from event detection to the final lightning information. A definition of the specific meteorological requirements is given based on the potential uses of the final lightning information in meteorology, for convection estimation and numerical cloud modeling. The study results show the range of instrument requirement relaxations that leads to minimal reduction in the final lightning information.

  11. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    Science.gov (United States)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

    To estimate surface errors qualitatively with the Ronchi test, experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the intersection point coordinates of the ronchigram fringes with the x-axis. In that case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function, described with cubic splines, to the sagitta function of the ideal surface in our simulations. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because such rulings reproduce experimental ronchigram fringe profiles better. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  12. SOFT MODELLING AND SIMULATION IN STRATEGY

    Directory of Open Access Journals (Sweden)

    Luciano Rossoni

    2006-06-01

    Full Text Available A certain resistance exists on the part of the managers responsible for strategy to using modeling and simulation techniques and tools. Many find them excessively complicated, while others see them as too rigid and mathematical for use in strategy work in uncertain and turbulent environments. However, some interpretative approaches exist that meet, in part, the needs of these decision makers. The objective of this work is to present, in a clear and simple form, some of the most powerful interpretative (soft) approaches, methodologies and tools for modeling and simulation in the area of business strategy. We first define what models and simulation are and discuss some aspects of modeling and simulation in the strategy area. We then review some soft modeling approaches, which see the modeling process as much more than a mechanical exercise, since, as Simon observed, human beings are only boundedly rational and their decisions are influenced by a series of subjective issues related to the environment in which they are embedded. Keywords: strategy, modeling and simulation, soft systems methodology, cognitive map, systems dynamics.

  13. Modeling and Simulation of Hydraulic Engine Mounts

    Institute of Scientific and Technical Information of China (English)

    DUAN Shanzhong; Marshall McNea

    2012-01-01

    Hydraulic engine mounts are widely used in automotive powertrains for vibration isolation. A lumped mechanical parameter model is a traditional approach to model and simulate such mounts. This paper presents a dynamic model of a passive hydraulic engine mount with a double chamber, an inertia track, a decoupler, and a plunger. The model is developed based on the analogy between electrical systems and mechanical-hydraulic systems, and it is established to capture both the low- and high-frequency dynamic behavior of the hydraulic mount. The model will be further used to find the approximate pulse responses of the mounts in terms of the force transmission and the top chamber pressure. The closed-form solution from the simplified linear model may provide some insight into the highly nonlinear behavior of the mounts. Based on the model, computer simulations have been carried out to study the dynamic performance of the hydraulic mount.
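
    To make the lumped-parameter idea concrete, the sketch below integrates a simplified single-inertia-track mount model (no decoupler or plunger) under a sinusoidal base displacement and reports the transmitted force. The equations follow the usual chamber-continuity and track-momentum balances of such lumped models, but all parameter values and the reduced model structure are illustrative assumptions, not the paper's.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative (not measured) parameters for a simplified hydraulic mount
        k_r, b_r = 2.5e5, 120.0      # rubber stiffness [N/m] and damping [N*s/m]
        A_p      = 3.0e-3            # effective pumping area of the rubber [m^2]
        C1, C2   = 3.0e-11, 2.0e-9   # upper/lower chamber compliances [m^3/Pa]
        I_i, R_i = 1.5e6, 2.0e7      # inertia-track inertance and resistance

        def excitation(t, amp=1e-3, freq=10.0):
            """Sinusoidal base displacement and velocity."""
            w = 2.0 * np.pi * freq
            return amp * np.sin(w * t), amp * w * np.cos(w * t)

        def rhs(t, y):
            p1, p2, q = y                      # chamber pressures and track flow
            _, xdot = excitation(t)
            dp1 = (A_p * xdot - q) / C1        # upper-chamber continuity
            dp2 = q / C2                       # lower-chamber continuity
            dq = (p1 - p2 - R_i * q) / I_i     # momentum balance in the inertia track
            return [dp1, dp2, dq]

        sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0, 0.0], max_step=1e-4)
        x, xdot = excitation(sol.t)
        force = k_r * x + b_r * xdot + A_p * sol.y[0]   # force transmitted to the base
        print("peak transmitted force:", round(float(force.max()), 1), "N")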

  14. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel), and two models for the water/steam zone (the boiling) and the steam, respectively. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently, MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  15. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus; Condra, Thomas Joseph;

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water/steam side has been formulated. The model has been formulated as a number of sub-models that are merged into an overall model for the complete boiler. Sub-models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel), and two models for the water/steam zone (the boiling) and the steam, respectively. The dynamic model has been developed as a number of Differential-Algebraic-Equation (DAE) systems. Subsequently, MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full-scale boiler plant.

  16. Model observers in medical imaging research.

    Science.gov (United States)

    He, Xin; Park, Subok

    2013-10-04

    Model observers play an important role in the optimization and assessment of imaging devices. In this review paper, we first discuss the basic concepts of model observers, which include the mathematical foundations and psychophysical considerations in designing both optimal observers for optimizing imaging systems and anthropomorphic observers for modeling human observers. Second, we survey a few state-of-the-art computational techniques for estimating model observers and the principles of implementing these techniques. Finally, we review a few applications of model observers in medical imaging research.

  17. Simulation of a Geiger-Mode Imaging LADAR System for Performance Assessment

    Directory of Open Access Journals (Sweden)

    Yong Joon Kwon

    2013-07-01

    Full Text Available As LADAR systems applications gradually become more diverse, new types of systems are being developed. When developing new systems, simulation studies are an essential prerequisite. A simulator enables performance predictions and optimal system parameters at the design level, as well as providing sample data for developing and validating application algorithms. The purpose of the study is to propose a method for simulating a Geiger-mode imaging LADAR system. We develop simulation software to assess system performance and generate sample data for the applications. The simulation is based on three aspects of modeling—the geometry, radiometry and detection. The geometric model computes the ranges to the reflection points of the laser pulses. The radiometric model generates the return signals, including the noises. The detection model determines the flight times of the laser pulses based on the nature of the Geiger-mode detector. We generated sample data using the simulator with the system parameters and analyzed the detection performance by comparing the simulated points to the reference points. The proportion of the outliers in the simulated points reached 25.53%, indicating the need for efficient outlier elimination algorithms. In addition, the false alarm rate and dropout rate of the designed system were computed as 1.76% and 1.06%, respectively.
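
    The detection model described above can be illustrated with a per-shot Monte Carlo of a Geiger-mode detector: photon and dark counts arrive with Poisson statistics in each range-gate bin, and only the first avalanche is recorded. The sketch below uses made-up link-budget numbers, not the designed system parameters, so its dropout and outlier figures are illustrative only.

        import numpy as np

        rng = np.random.default_rng(1)
        C = 3.0e8            # speed of light [m/s]
        BIN = 1.0e-9         # range-gate bin width [s], i.e. 0.15 m per bin

        def geiger_shot(true_range_m, n_signal=3.0, noise_rate=2.0e5,
                        n_bins=2000, pde=0.3):
            """Measured range for one laser shot, or None for a dropout.

            n_signal   : mean signal photons arriving in the return bin
            noise_rate : background plus dark count rate [counts/s]
            pde        : photon detection efficiency of the Geiger-mode APD
            The detector arms at bin 0 and fires on the first avalanche only.
            """
            signal_bin = int(2.0 * true_range_m / C / BIN)
            mean = np.full(n_bins, noise_rate * BIN * pde)   # background in every bin
            if signal_bin < n_bins:
                mean[signal_bin] += n_signal * pde           # signal photons in one bin
            p_fire = 1.0 - np.exp(-mean)                     # P(at least one avalanche)
            fired = rng.random(n_bins) < p_fire
            if not fired.any():
                return None                                  # dropout
            first = int(np.argmax(fired))                    # first avalanche wins
            return first * BIN * C / 2.0

        shots = [geiger_shot(150.0) for _ in range(10_000)]
        ranges = np.array([r for r in shots if r is not None])
        print("dropout rate:", round(1.0 - ranges.size / len(shots), 3))
        print("outlier fraction (|error| > 1 m):",
              round(float(np.mean(np.abs(ranges - 150.0) > 1.0)), 3))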

  18. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large-scale computer simulations are presented, focusing on universality of the ac response in the extreme disorder limit. Finally, some important unsolved problems relating to hopping models for ac conduction are listed.

  19. Finite-Difference Time-Domain Simulation for Three-dimensional Polarized Light Imaging

    CERN Document Server

    Menzel, Miriam; De Raedt, Hans; Michielsen, Kristel

    2016-01-01

    Three-dimensional Polarized Light Imaging (3D-PLI) is a promising technique to reconstruct the nerve fiber architecture of human post-mortem brains from birefringence measurements of histological brain sections with micrometer resolution. To better understand how the reconstructed fiber orientations are related to the underlying fiber structure, numerical simulations are employed. Here, we present two complementary simulation approaches that reproduce the entire 3D-PLI analysis: First, we give a short review on a simulation approach that uses the Jones matrix calculus to model the birefringent myelin sheaths. Afterwards, we introduce a more sophisticated simulation tool: a 3D Maxwell solver based on a Finite-Difference Time-Domain algorithm that simulates the propagation of the electromagnetic light wave through the brain tissue. We demonstrate that the Maxwell solver is a valuable tool to better understand the interaction of polarized light with brain tissue and to enhance the accuracy of the fiber orientati...

  20. Simulation of the modulation transfer function dependent on the partial Fourier fraction in dynamic contrast enhancement magnetic resonance imaging.

    Science.gov (United States)

    Takatsu, Yasuo; Ueyama, Tsuyoshi; Miyati, Tosiaki; Yamamura, Kenichirou

    2016-12-01

    The image characteristics in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) depend on the partial Fourier fraction and the contrast medium concentration. These characteristics were assessed and the modulation transfer function (MTF) was calculated by computer simulation. A digital phantom was created from signal intensity data acquired at different contrast medium concentrations on a breast model. The frequency images [created by fast Fourier transform (FFT)] were divided into 512 parts and rearranged to form a new image. The inverse FFT of this image yielded the MTF. From the reference data, three linear models (low, medium, and high) and three exponential models (slow, medium, and rapid) of the signal intensity were created. Smaller partial Fourier fractions, and higher gradients in the linear models, corresponded to faster MTF decline. The MTF decreased more gradually in the exponential models than in the linear models. The MTF, which reflects the image characteristics in DCE-MRI, was degraded more as the partial Fourier fraction decreased.
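
    A minimal 1D illustration of why a smaller partial Fourier fraction degrades the MTF is given below: an ideal edge is sampled with an asymmetric, zero-filled partial k-space mask and the MTF is taken from the resulting line spread function. This is a simplified stand-in for the paper's procedure (no breast phantom, contrast models, or homodyne reconstruction).

        import numpy as np

        def edge_mtf(fraction=0.625, n=512):
            """MTF of a zero-filled partial-Fourier acquisition, estimated from a
            1D edge phantom (no homodyne correction, unlike a real scanner)."""
            edge = np.zeros(n)
            edge[n // 2:] = 1.0                         # ideal edge phantom
            kspace = np.fft.fftshift(np.fft.fft(edge))
            mask = np.zeros(n)
            keep = int(round(fraction * n))
            mask[n - keep:] = 1.0                       # asymmetric sampling, zero-filled
            img = np.abs(np.fft.ifft(np.fft.ifftshift(kspace * mask)))
            lsf = np.diff(img)                          # line spread from the edge spread
            mtf = np.abs(np.fft.rfft(lsf))
            return mtf / mtf[0]

        for frac in (1.0, 0.75, 0.625):
            m = edge_mtf(frac)
            print(f"fraction {frac}: MTF at half-Nyquist = {m[len(m) // 2]:.3f}")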

  1. Model Observers in Medical Imaging Research

    OpenAIRE

    He, Xin; Park, Subok

    2013-01-01

    Model observers play an important role in the optimization and assessment of imaging devices. In this review paper, we first discuss the basic concepts of model observers, which include the mathematical foundations and psychophysical considerations in designing both optimal observers for optimizing imaging systems and anthropomorphic observers for modeling human observers. Second, we survey a few state-of-the-art computational techniques for estimating model observers and the principles of im...

  2. Simulation of AIMS measurements using rigorous mask 3D modeling

    Science.gov (United States)

    Chou, Chih-Shiang; Huang, Hsu-Ting; Chu, Fu-Sheng; Chu, Yuan-Chih; Huang, Wen-Chun; Liu, Ru-Gun; Gau, Tsai-Sheng

    2015-03-01

    The aerial image measurement system (AIMS™) has been widely used for wafer-level inspection of mask defects. Reported inspection flows include die-to-die (D2D) and die-to-database (D2DB) methods. For patterns that do not repeat in another die, only the D2DB approach is applicable. The D2DB method requires accurate simulation of AIMS measurements for a mask pattern. An optical vectorial model is needed to depict the mask diffraction effect in this simulation. To accurately simulate the imaging results, a rigorous electromagnetic field (EMF) model is essential to correctly take account of the EMF scattering induced by the mask topography, usually called the mask 3D effect. In this study, the mask 3D model we use is rigorous coupled-wave analysis (RCWA), which calculates the diffraction fields from a single plane wave incidence. A hybrid Hopkins-Abbe method with RCWA is used to calculate the EMF diffraction at the desired accuracy level while keeping the computation time practical. We will compare the speed of the hybrid Hopkins-Abbe method to that of the rigorous Abbe method. The matching between simulation and experiment is more challenging for AIMS than for CD-SEM because its measurements provide full intensity information. Parameters in the mask 3D model, such as film stack thickness or film optical properties, are optimized during the fitting process. We will report the fitting results of AIMS images for two-dimensional structures with various pitches. By accurately simulating the AIMS measurements, this approach provides a necessary tool to perform mask inspection using the D2DB approach and to accurately predict mask defects.
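
    The Abbe imaging loop that such simulators build on can be sketched in one dimension: each source point tilts the illumination, which shifts the projection pupil over the mask spectrum, and the intensities of all source points are summed. The sketch below uses a thin-mask (Kirchhoff) spectrum purely for illustration; the paper's point is precisely that a rigorous RCWA mask 3D model, combined in a hybrid Hopkins-Abbe scheme, should replace that approximation. All optical parameters are assumed values.

        import numpy as np

        def abbe_image_1d(mask, wavelength=193e-9, pitch=720e-9, na=1.35,
                          sigma=0.8, n_src=11):
            """Aerial image of a 1D periodic mask via the Abbe source-point loop.

            Thin-mask (Kirchhoff) approximation: the source point shifts the
            projection pupil over the mask spectrum; a rigorous simulator would
            instead feed RCWA diffraction orders, computed per incidence angle,
            into this same loop.
            """
            n = mask.size
            freqs = np.fft.fftfreq(n, d=pitch / n)       # spatial frequencies [1/m]
            spectrum = np.fft.fft(mask) / n
            cutoff = na / wavelength                     # pupil radius in frequency
            image = np.zeros(n)
            for s in np.linspace(-sigma, sigma, n_src):  # 1D partially coherent source
                pupil = (np.abs(freqs + s * cutoff) <= cutoff).astype(float)
                field = np.fft.ifft(spectrum * pupil) * n
                image += np.abs(field) ** 2
            return image / n_src

        # 1:1 line/space pattern across one 720 nm pitch
        mask = np.ones(256)
        mask[64:192] = 0.0                               # opaque line, clear background
        aerial = abbe_image_1d(mask)
        contrast = (aerial.max() - aerial.min()) / (aerial.max() + aerial.min())
        print("aerial image contrast =", round(float(contrast), 3))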

  3. Thermal-structural Modeling and Simulation for Scan Mirror on Imager in Solar Radiation

    Institute of Scientific and Technical Information of China (English)

    游思梁; 陈桂林; 王淦泉

    2011-01-01

    Solar-radiation disturbance affects the optical payloads onboard three-axis stabilized satellite platforms. It leads to thermal distortion, which significantly affects the imaging quality of the optical system. An approximate modeling technique for the complex external heat flux was applied, together with a multi-software-coupled finite element simulation. Taking the FY-4 imager as the application background, the thermal effect of solar radiation on the imager's scan mirror was studied and compared under different thermal design conditions, and an optimal scheme for the thermal design of the scan mirror was obtained. The results show that the thermal distortion is closely related to the thermal coating on the side and the back of the scan mirror. It is difficult to resolve the conflict between temperature fluctuation and thermal distortion using the thermal design method alone; it is necessary to improve the material of the scan mirror or adjust the opto-mechanical structure.

  4. Modeling and simulating of unloading welding transformer

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The simulation model of an unloading welding transformer was established on the basis of MATLAB software, and the modeling principle is described in detail in the paper. The model is made up of three sub-models, i.e. a linear inductor sub-model, a non-linear inductor sub-model and a current-controlled series connection sub-model, and these sub-models are joined together by means of segmented linearization. The simulation results showed that, under the conditions of a high converter frequency and a large cross section of the welding transformer's magnet core, the non-linear inductor sub-model can be substituted by a linear inductor sub-model, and that the leakage reactance in the welding transformer is one of the main causes of over-current and over-voltage in the inverter. The simulation results demonstrate that the over-voltage produced by the leakage reactance is nearly twice the input voltage supplied to the transformer, and that the duration of the over-voltage depends on the time constant τ1. As τ1 decreases, the amplitude of the over-current increases and its duration becomes shorter; conversely, as τ1 increases, the amplitude of the over-current decreases and its duration becomes longer. The model has played an important role in the development of the inverter resistance welding machine.

  5. Revolutions in energy through modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tatro, M.; Woodard, J.

    1998-08-01

    The development and application of energy technologies for all aspects from generation to storage have improved dramatically with the advent of advanced computational tools, particularly modeling and simulation. Modeling and simulation are not new to energy technology development, and have been used extensively ever since the first commercial computers were available. However, recent advances in computing power and access have broadened their extent of use, and, through the increased fidelity (i.e., accuracy) of the models made possible by greatly enhanced computing power, the growing reliance on modeling and simulation has shifted the balance point between modeling and experimentation. The complex nature of energy technologies has motivated researchers to use these tools to better understand performance, reliability and cost issues related to energy. The tools originated in sciences such as the strength of materials (nuclear reactor containment vessels); physics, heat transfer and fluid flow (oil production); chemistry, physics, and electronics (photovoltaics); and geosciences and fluid flow (oil exploration and reservoir storage). Other tools include mathematics, such as statistics, for assessing project risks. This paper describes a few advancements made possible by these tools and explores the benefits and costs of their use, particularly as they relate to the acceleration of energy technology development. The computational complexity ranges from basic spreadsheets to complex numerical simulations using hardware ranging from personal computers (PCs) to Cray computers. In all cases, the benefits of using modeling and simulation relate to lower risks, accelerated technology development, or lower cost projects.

  6. Inventory Reduction Using Business Process Reengineering and Simulation Modeling.

    Science.gov (United States)

    1996-12-01

    … center is analyzed using simulation modeling and business process reengineering (BPR) concepts. The two simulation models were designed and evaluated by … reengineering and simulation modeling offer powerful tools to aid the manager in reducing cycle time and inventory levels.

  7. Rigorous simulation of OCT image formation using Maxwell's equations in three dimensions (Conference Presentation)

    Science.gov (United States)

    Munro, Peter R. T.; Curatolo, Andrea; Sampson, David D.

    2016-03-01

    Existing models of image formation in optical coherence tomography are based upon the extended Huygens-Fresnel formalism. These models all, to varying degrees, rely on scatterer ensemble averages rather than deterministic scattering distributions. Whilst the former is sometimes preferable, there are a growing number of applications where the ability to predict image formation based upon deterministic refractive index distributions is of great interest, including, for example, image formation in turbid tissue. A rigorous model based upon three-dimensional solutions of Maxwell's equations offers a number of tantalising opportunities. For example, it can shed light on features near or below the resolution of an OCT system and on the impact of phenomena usually described as diffraction, interference and scattering, but which more generally result from light scattering satisfying Maxwell's equations. A rigorous model allows inverse scattering methods to be developed that do not require the first-order Born approximation. Finally, a rigorous model can provide gold-standard verification of the myriad quantitative techniques currently being developed throughout the field. We have developed the first such model of image formation based upon three-dimensional solutions of Maxwell's equations, which has vastly different properties from models based on two-dimensional solutions. Although we present simulated B-scans, this model is equally applicable to C-scans. This has been made possible by advances in computational techniques and in the computational resources routinely available. We will present the main features of our model, comparisons of measured and simulated image formation for phantoms, and discuss the future of rigorous modelling in optical coherence tomography research and application.

  8. Simulation and modeling of turbulent flows

    CERN Document Server

    Gatski, Thomas B; Lumley, John L

    1996-01-01

    This book provides students and researchers in fluid engineering with an up-to-date overview of turbulent flow research in the areas of simulation and modeling. A key element of the book is the systematic, rational development of turbulence closure models and related aspects of modern turbulent flow theory and prediction. Starting with a review of the spectral dynamics of homogeneous and inhomogeneous turbulent flows, succeeding chapters deal with numerical simulation techniques, renormalization group methods and turbulent closure modeling. Each chapter is authored by recognized leaders in their respective fields, and each provides a thorough and cohesive treatment of the subject.

  9. Modeling & Simulation Executive Agent Panel

    Science.gov (United States)

    2007-11-02

    Performing organization: Office of the Oceanographer of the Navy. "… acquisition, and training communities." MSEA role: facilitator in the project startup phase, catalyst during development, and certifier in the … Acoustic models covered: Parabolic Equation 5.0, ASTRAL 5.0, ASPM 4.3, Gaussian Ray Bundle 1.0, High Freq Env Acoustic (HFEVA) 1.0, COLOSSUS II 1.0, Low Freq Bottom Loss.

  10. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    … and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including the price of the plant as well as a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts …

  11. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

    … and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance in the end causes limited freedom with respect to dynamic operation of the plant. By means of an objective function including the price of the plant as well as a quantification of the value of dynamic operation of the plant, an optimization is carried out. The dynamic model of the boiler plant is applied to define parts …

  12. Simulering af dagslys i digitale modeller (Simulation of Daylight in Digital Models)

    DEFF Research Database (Denmark)

    Villaume, René Domine; Ørstrup, Finn Rude

    2004-01-01

    Through various simulations of daylight, the project investigates the quality of visualizations of complex lighting conditions in digital models in connection with communicating architecture via the web. In a digital 3D model of Utzon Associates' Paustian House, natural daylight is simulated with different rendering methods such as "shaded render", "raytracing", "Final Gather" and "Global Illumination".

  13. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance with, or in contrast to, observations collected from experimental models. These discrepancies could reflect inc...

  14. Molecular simulation and modeling of complex I.

    Science.gov (United States)

    Hummer, Gerhard; Wikström, Mårten

    2016-07-01

    Molecular modeling and molecular dynamics simulations play an important role in the functional characterization of complex I. With its large size and complicated function, linking quinone reduction to proton pumping across a membrane, complex I poses unique modeling challenges. Nonetheless, simulations have already helped in the identification of possible proton transfer pathways. Simulations have also shed light on the coupling between electron and proton transfer, thus pointing the way in the search for the mechanistic principles underlying the proton pump. In addition to reviewing what has already been achieved in complex I modeling, we aim here to identify pressing issues and to provide guidance for future research to harness the power of modeling in the functional characterization of complex I. This article is part of a Special Issue entitled Respiratory complex I, edited by Volker Zickermann and Ulrich Brandt. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  16. Local histograms and image occlusion models

    CERN Document Server

    Massar, Melody L; Fickus, Matthew; Kovacevic, Jelena

    2011-01-01

    The local histogram transform of an image is a data cube that consists of the histograms of the pixel values that lie within a fixed neighborhood of any given pixel location. Such transforms are useful in image processing applications such as classification and segmentation, especially when dealing with textures that can be distinguished by the distributions of their pixel intensities and colors. We, in particular, use them to identify and delineate biological tissues found in histology images obtained via digital microscopy. In this paper, we introduce a mathematical formalism that rigorously justifies the use of local histograms for such purposes. We begin by discussing how local histograms can be computed as systems of convolutions. We then introduce probabilistic image models that can emulate textures one routinely encounters in histology images. These models are rooted in the concept of image occlusion. A simple model may, for example, generate textures by randomly speckling opaque blobs of one color on ...
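
    The observation that local histograms can be computed as systems of convolutions has a direct implementation: for each intensity bin, convolve the bin's indicator image with a box kernel. A small sketch of that computation (the bin count and window size are arbitrary choices, not values from the paper) is:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_histograms(image, n_bins=8, window=15):
            """Local histogram transform computed as a stack of convolutions.

            For each intensity bin, the indicator image (1 where the pixel falls
            in the bin, 0 elsewhere) is averaged over a window x window
            neighbourhood, which is exactly a convolution with a box kernel.
            Stacking the results gives, at every pixel, the histogram of its
            neighbourhood.
            """
            image = np.asarray(image, dtype=float)
            edges = np.linspace(image.min(), image.max() + 1e-9, n_bins + 1)
            cube = np.empty(image.shape + (n_bins,))
            for b in range(n_bins):
                indicator = ((image >= edges[b]) & (image < edges[b + 1])).astype(float)
                cube[..., b] = uniform_filter(indicator, size=window)   # box convolution
            return cube   # cube[i, j] sums to ~1: the local histogram at pixel (i, j)

        rng = np.random.default_rng(0)
        texture = rng.normal(size=(128, 128))
        hist_cube = local_histograms(texture)
        print(hist_cube.shape, hist_cube[64, 64].sum())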

  17. Investigating Output Accuracy for a Discrete Event Simulation Model and an Agent Based Simulation Model

    CERN Document Server

    Majid, Mazlina Abdul; Siebers, Peer-Olaf

    2010-01-01

    In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is the best one for modelling human reactive behaviour in the retail sector. In order to study the output accuracy in both models, we have carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment was carried out using a large UK department store as a case study. We had to determine an efficient implementation of management policy in the store's fitting room using DES and ABS. Overall, we have found that both simulation models were a good representation of the real system when modelling human reactive behaviour.

  18. Modeling and interpretation of images*

    Directory of Open Access Journals (Sweden)

    Min Michiel

    2015-01-01

    Full Text Available Imaging protoplanetary disks is a challenging but rewarding task. It is challenging because of the glare of the central star outshining the weak signal from the disk at shorter wavelengths, and because of the limited spatial resolution at longer wavelengths. It is rewarding because such images contain a wealth of information on the structure of the disks and can (directly) probe things like gaps and spiral structure. Because it is so challenging, telescopes are often pushed to their limits to get a signal. Proper interpretation of these images therefore requires intimate knowledge of the instrumentation, the detection method, and the image processing steps. In this chapter I will give some examples and stress some issues that are important when interpreting images from protoplanetary disks.

  19. Power electronics system modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Jih-Sheng

    1994-12-31

    This paper introduces control-system-design software packages, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, whether discrete or continuous, linear or nonlinear, are modeled with mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed either as computer algorithms or as digital circuits. After describing the component models and control methods, computer programs are then developed for complete system simulation. Simulation results are mainly used for studying system performance, such as input and output current harmonics, torque ripple, and speed response. Key computer programs and simulation results are demonstrated for educational purposes.
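
    As an example of the kind of component-level model such simulations combine, the sketch below runs a bang-bang hysteresis current controller on a single-phase R-L load fed by a two-level inverter, using explicit Euler integration. The circuit values and hysteresis band are assumptions for illustration, not taken from the paper.

        import numpy as np

        # Illustrative single-phase R-L load driven by a two-level inverter with
        # hysteresis current control (all values are assumed, not from the paper).
        V_DC, R, L = 300.0, 2.0, 5e-3          # DC-link voltage [V], load R [ohm], L [H]
        BAND = 0.5                              # hysteresis band [A]
        DT, T_END = 1e-6, 0.04                  # time step and simulated horizon [s]

        def reference(t):
            return 10.0 * np.sin(2 * np.pi * 50.0 * t)   # 10 A, 50 Hz current reference

        t = np.arange(0.0, T_END, DT)
        i_load = np.zeros_like(t)
        v_out = V_DC
        for k in range(1, t.size):
            err = reference(t[k - 1]) - i_load[k - 1]
            if err > BAND:                      # current too low -> apply +Vdc
                v_out = V_DC
            elif err < -BAND:                   # current too high -> apply -Vdc
                v_out = -V_DC
            # explicit Euler step of L di/dt = v - R i
            i_load[k] = i_load[k - 1] + DT * (v_out - R * i_load[k - 1]) / L

        ripple = np.max(np.abs(i_load[t > 0.02] - reference(t[t > 0.02])))
        print("worst-case tracking error after start-up:", round(float(ripple), 2), "A")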

  20. Simulation of Gravity Currents Using VOF Model

    Institute of Scientific and Technical Information of China (English)

    邹建锋; 黄钰期; 应新亚; 任安禄

    2002-01-01

    Two-dimensional gravity currents with three phases, including air, are numerically simulated in this article with the Volume of Fluid (VOF) multiphase flow model. The necessity of accounting for turbulence at high Reynolds numbers is demonstrated quantitatively with an LES (Large Eddy Simulation) turbulence model. The gravity currents are simulated for h ≠ H as well as h = H, where h is the depth of the gravity current before the release and H is the depth of the intruded fluid. An uprising swell occurs when a current flows horizontally into another, lighter one for h ≠ H. Under what conditions this uprising of the swell occurs and how long it takes are the questions considered in this article. All the simulated results are in reasonable agreement with the available experimental results.

  1. Simulation of Astronomical Images from Optical Survey Telescopes using a Comprehensive Photon Monte Carlo Approach

    CERN Document Server

    Peterson, J R; Kahn, S M; Rasmussen, A P; Peng, E; Ahmad, Z; Bankert, J; Chang, C; Claver, C; Gilmore, D K; Grace, E; Hannel, M; Hodge, M; Lorenz, S; Lupu, A; Meert, A; Nagarajan, S; Todd, N; Winans, A; Young, M

    2015-01-01

    We present a comprehensive methodology for the simulation of astronomical images from optical survey telescopes. We use a photon Monte Carlo approach to construct images by sampling photons from models of astronomical source populations, and then simulating those photons through the system as they interact with the atmosphere, telescope, and camera. We demonstrate that all physical effects for optical light that determine the shapes, locations, and brightnesses of individual stars and galaxies can be accurately represented in this formalism. By using large scale grid computing, modern processors, and an efficient implementation that can produce 400,000 photons/second, we demonstrate that even very large optical surveys can be now be simulated. We demonstrate that we are able to: 1) construct kilometer scale phase screens necessary for wide-field telescopes, 2) reproduce atmospheric point-spread-function moments using a fast novel hybrid geometric/Fourier technique for non-diffraction limited telescopes, 3) ac...
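
    The photon-sampling idea can be illustrated at a toy scale: draw a Poisson number of photons for a source, scatter each photon according to a point-spread function, and bin the photons onto a pixel grid. The sketch below collapses the atmosphere, optics, and detector into a single Gaussian PSF, which is a drastic simplification of the phase-screen and ray-trace machinery the paper describes; all numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(42)

        def render_star(flux_photons, fwhm_arcsec=0.7, pixel_scale=0.2, n_pix=64):
            """Photon-by-photon rendering of a single star (highly simplified).

            Each detected photon is drawn from a Gaussian atmospheric/optical PSF
            and binned into the pixel grid; a real survey simulator would instead
            trace each photon through phase screens, optics, and the detector.
            """
            n_photons = rng.poisson(flux_photons)                 # shot noise on the source
            sigma_pix = fwhm_arcsec / 2.355 / pixel_scale
            x = rng.normal(n_pix / 2, sigma_pix, n_photons)
            y = rng.normal(n_pix / 2, sigma_pix, n_photons)
            image, _, _ = np.histogram2d(x, y, bins=n_pix,
                                         range=[[0, n_pix], [0, n_pix]])
            return image

        img = render_star(5e4)
        print("detected photons:", int(img.sum()), " peak pixel:", int(img.max()))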

  2. Model Convolution: A Computational Approach to Digital Image Interpretation

    Science.gov (United States)

    Gardner, Melissa K.; Sprague, Brian L.; Pearson, Chad G.; Cosgrove, Benjamin D.; Bicek, Andrew D.; Bloom, Kerry; Salmon, E. D.

    2010-01-01

    Digital fluorescence microscopy is commonly used to track individual proteins and their dynamics in living cells. However, extracting molecule-specific information from fluorescence images is often limited by the noise and blur intrinsic to the cell and the imaging system. Here we discuss a method called “model-convolution,” which uses experimentally measured noise and blur to simulate the process of imaging fluorescent proteins whose spatial distribution cannot be resolved. We then compare model-convolution to the more standard approach of experimental deconvolution. In some circumstances, standard experimental deconvolution approaches fail to yield the correct underlying fluorophore distribution. In these situations, model-convolution removes the uncertainty associated with deconvolution and therefore allows direct statistical comparison of experimental and theoretical data. Thus, if there are structural constraints on molecular organization, the model-convolution method better utilizes information gathered via fluorescence microscopy, and naturally integrates experiment and theory. PMID:20461132
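
    A minimal model-convolution forward simulation looks like the sketch below: a hypothesised fluorophore distribution is blurred with the measured PSF (approximated here as a Gaussian), scaled and offset by the camera response, and corrupted with shot and read noise so it can be compared statistically with experimental images. The PSF shape and camera numbers are placeholders, not measured values.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def model_convolution(true_density, psf_sigma_px, gain, background, read_noise):
            """Forward-simulate a fluorescence image from a model fluorophore density.

            The model distribution is blurred with the (measured) PSF, scaled by
            the camera gain, offset by the background level, and corrupted with
            Poisson shot noise plus Gaussian read noise.
            """
            blurred = gaussian_filter(true_density, psf_sigma_px)
            expected = gain * blurred + background
            rng = np.random.default_rng(7)
            return rng.poisson(expected) + rng.normal(0.0, read_noise, expected.shape)

        # Example: a model of two point-like fluorophore clusters 4 pixels apart
        model = np.zeros((64, 64))
        model[32, 30] = model[32, 34] = 200.0
        simulated = model_convolution(model, psf_sigma_px=1.8, gain=1.0,
                                      background=10.0, read_noise=2.0)
        print("simulated image max:", simulated.max())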

  3. Development of NASA's Models and Simulations Standard

    Science.gov (United States)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  4. Simulated annealing spectral clustering algorithm for image segmentation

    Institute of Scientific and Technical Information of China (English)

    Yifang Yang; and Yuping Wang

    2014-01-01

    The similarity measure is crucial to the performance of spectral clustering. The Gaussian kernel function based on the Euclidean distance is usually adopted as the similarity measure. However, the Euclidean distance measure cannot fully reveal the complex distribution of the data, and the result of spectral clustering is very sensitive to the scaling parameter. To solve these problems, a new manifold distance measure and a novel simulated annealing spectral clustering (SASC) algorithm based on the manifold distance measure are proposed. The simulated annealing based on genetic algorithm (SAGA), characterized by its rapid convergence to the global optimum, is used to cluster the sample points in the spectral mapping space. The proposed algorithm can not only reflect local and global consistency better, but also reduce the sensitivity of spectral clustering to the kernel parameter, which improves the algorithm's clustering performance. To efficiently apply the algorithm to image segmentation, the Nyström method is used to reduce the computational complexity. Experimental results show that, compared with traditional clustering algorithms and popular spectral clustering algorithms, the proposed algorithm can achieve better clustering performance on several synthetic datasets, texture images and real images.

  5. Simulation of the performance and image quality characteristics of the Landsat OLI and TIRS sensors using DIRSIG

    Science.gov (United States)

    Schott, John R.; Gerace, Aaron; Montanaro, Matthew

    2012-09-01

    The Digital Imaging and Remote Sensing (DIRS) Image Generation (DIRSIG) model has been significantly upgraded to support the Landsat Data Continuity Mission (LDCM). The DIRSIG improvements simulate the characteristics of the LDCM Thermal Infrared Sensor (TIRS) and the Operational Land Imager (OLI) in support of the NASA and USGS image quality assessment programs. These improvements allow for simulation of spacecraft orbits using standard NORAD two-line element (TLE) orbital descriptors. Sensor improvements include individual detector element lines of sight, relative spectral response (RSR), bias, gain, non-linear response, and noise. Using DIRSIG's existing source-target-sensor radiative transfer, atmospheric propagation, scene simulation, and thermal models, simulated Landsat 8 imagery was generated. These tools were developed to enable assessment of design trades during instrument development and build, and evaluation of expected performance during instrument test, as test data are used to refine the modeled instrument performance. Current efforts are aimed at refining predicted performance models, simulating on-orbit calibration maneuvers, and generating data to test data processing and analysis algorithms. Initial studies are aimed at assessing the impact of RSR variation on banding and striping in both OLI and TIRS and the use of side slither (90° yaw) as a possible method to characterize and potentially compensate for non-linearity effects. Ongoing work aimed at simulating targets to support image-based registration of the OLI and TIRS instruments is also presented. In general, the use of advanced simulation and modeling tools to support instrument design trades, image quality prediction, on-orbit image quality assessment and operational trades is reviewed. The overall effort is designed to provide simulated imagery incorporating all aspects of the instrument acquisition physics and scene phenomenology in support of instrument developers, operators, and
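
    The per-detector sensor-effects chain mentioned above (relative spectral response, gain, bias, noise) reduces, for a single detector element and band, to an RSR-weighted spectral integration followed by a radiometric response model. The sketch below shows that reduction with an invented Gaussian RSR and made-up calibration numbers; it is not the OLI/TIRS calibration or DIRSIG's implementation.

        import numpy as np

        rng = np.random.default_rng(3)

        def apply_sensor_model(spectral_radiance, wavelengths, rsr, gain, bias,
                               noise_sigma):
            """Convert at-aperture spectral radiance to a noisy digital count.

            The band radiance is the RSR-weighted integral of the spectral
            radiance; gain, bias, and additive Gaussian noise stand in for the
            detector response (illustrative values only).
            """
            band_radiance = np.trapz(spectral_radiance * rsr, wavelengths) / \
                            np.trapz(rsr, wavelengths)
            counts = gain * band_radiance + bias
            return counts + rng.normal(0.0, noise_sigma)

        wl = np.linspace(0.84, 0.89, 51)                     # NIR band, micrometres
        rsr = np.exp(-0.5 * ((wl - 0.865) / 0.012) ** 2)     # Gaussian stand-in RSR
        radiance = 80.0 + 5.0 * (wl - 0.84) / 0.05           # toy spectral radiance
        print("simulated count:", apply_sensor_model(radiance, wl, rsr,
                                                     gain=50.0, bias=100.0,
                                                     noise_sigma=1.5))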

  6. Dense and sparse aggregations in complex motion: Video coupled with simulation modeling

    Science.gov (United States)

    In censuses of aggregations composed of highly mobile animals, the link between image processing technology and simulation modeling remains relatively unexplored despite demonstrated ecological needs for abundance and density assessments. We introduce a framework that connects video censusing with ...

  7. Technical Note: Detective quantum efficiency simulation of a-Se imaging detectors using ARTEMIS.

    Science.gov (United States)

    Fang, Yuan; Ito, Takaaki; Nariyuki, Fumito; Kuwabara, Takao; Badano, Aldo; Karim, Karim S

    2017-08-01

    This work studies the detective quantum efficiency (DQE) of a-Se-based solid state x-ray detectors for medical imaging applications using ARTEMIS, a Monte Carlo simulation tool for modeling x-ray photon, electron and charged carrier transport in semiconductors in the presence of an applied electric field. ARTEMIS is used to model the signal formation process in a-Se. The simulation model includes x-ray photon and high-energy electron interactions, and detailed electron-hole pair transport with applied detector bias, taking into account drift, diffusion, Coulomb interactions, recombination and trapping. For experimental validation, the DQE performance of prototype a-Se detectors was measured following IEC Testing Standard 62220-1-3. Comparison of simulated and experimental DQE results shows reasonable agreement for RQA beam qualities: the percentage difference between simulated and experimental DQE is within 5% for spatial frequencies above 0.25 cycles/mm using a uniform applied electric field for the RQA beam qualities (RQA5, RQA7 and RQA9). Results include two different prototype detectors with thicknesses of 240 μm and 1 mm. ARTEMIS can be used to model the DQE of a-Se detectors as a function of x-ray energy, detector thickness, and spatial frequency. The ARTEMIS model can be used to improve understanding of the physics of x-ray interactions in a-Se and in optimization studies for the development of novel medical imaging applications. © 2017 American Association of Physicists in Medicine.
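
    For reference, the frequency-dependent DQE that both the measurements and such simulations estimate is commonly written (in IEC 62220-1-style analyses; the symbols below are the conventional ones, not ARTEMIS-specific quantities) as

        \mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}_{\mathrm{out}}^{2}(f)}{\mathrm{SNR}_{\mathrm{in}}^{2}(f)}
                        \;=\; \frac{\bar{d}^{\,2}\,\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NPS}(f)},

    where q̄ is the incident x-ray fluence, d̄ the mean detector signal it produces, MTF(f) the presampled modulation transfer function, and NPS(f) the output noise power spectrum.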

  8. Modelling and Simulation of Crude Oil Dispersion

    Directory of Open Access Journals (Sweden)

    Abdulfatai JIMOH

    2006-01-01

    Full Text Available This research work was carried out to develop a model equation for the dispersion of crude oil in water. Seven different crude oils (Bonny Light, Antan Terminal, Bonny Medium, Qua Iboe Light, Brass Light Mbede, Forcados Blend and Heavy H) were used as the subject crude oils. The developed model equation in this project is given as... It was developed starting from the equation for the oil dispersion rate in water, which is given as... The developed equation was then simulated with the aid of MathCAD 2000 Professional software. The experimental and model results obtained from the simulation of the model equation were plotted on the same axes against the time of dispersion. The model results revealed close fits between the experimental and model results, because the correlation coefficients and the r-squared values calculated using a spreadsheet program were both found to be unity (1.00).

  9. Optical simulation for imaging reconnaissance and intelligence sensors OSIRIS: High fidelity sensor simulation test bed; Modified user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Abernathy, M.F.; Puccetti, M.G.

    1988-01-04

    The OSIRIS program is an imaging optical simulation program which has been developed to predict the output of space-borne sensor systems. The simulation is radiometrically precise and includes highly realistic laser, atmosphere, and earth background models, as well as detailed models of optical components. This system was developed by Rockwell Power Services for the Los Alamos National Laboratory. It is based upon LARC (the Los Alamos Radiometry Code, also by Rockwell), and uses a command structure and 3D coordinate system similar to those of LARC. At present OSIRIS runs on the Cray I computer under the CTSS operating system, and is stored in the OSIRIS root directory on LANL CTSS mass storage.

  10. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  11. Incorporation of RAM techniques into simulation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, S.C. Jr.; Haire, M.J.; Schryver, J.C.

    1995-07-01

    This work concludes that reliability, availability, and maintainability (RAM) analytical techniques can be incorporated into computer network simulation modeling to yield an important new analytical tool. This paper describes the incorporation of failure and repair information into network simulation to build a stochastic computer model that represents the RAM performance of two vehicles being developed for the US Army: the Advanced Field Artillery System (AFAS) and the Future Armored Resupply Vehicle (FARV). The AFAS is the US Army's next-generation self-propelled cannon artillery system. The FARV is a resupply vehicle for the AFAS. Both vehicles utilize automation technologies to improve the operational performance of the vehicles and reduce manpower. The network simulation model used in this work is task based. The model programmed in this application represents a typical battle mission and the failures and repairs that occur during that battle. Each task that the FARV performs--upload, travel to the AFAS, refuel, perform tactical/survivability moves, return to logistic resupply, etc.--is modeled. Such a model reproduces operational phenomena (e.g., failures and repairs) that are likely to occur in actual performance. Simulation tasks are modeled as discrete chronological steps; after the completion of each task, decisions are programmed that determine the next path to be followed. The result is a complex logic diagram or network. The network simulation model is developed within a hierarchy of vehicle systems, subsystems, and equipment and includes failure management subnetworks. RAM information and other performance measures are collected which have an impact on design requirements. Design changes are evaluated through "what if" questions, sensitivity studies, and battle scenario changes.

  12. Testing turbulent closure models with convection simulations

    CERN Document Server

    Snellman, J E; Mantere, M J; Rheinhardt, M; Dintrans, B

    2012-01-01

    Aims: To compare simple analytical closure models of turbulent Boussinesq convection for stellar applications with direct three-dimensional simulations both in homogeneous and inhomogeneous (bounded) setups. Methods: We use simple analytical closure models to compute the fluxes of angular momentum and heat as a function of rotation rate measured by the Taylor number. We also investigate cases with varying angles between the angular velocity and gravity vectors, corresponding to locating the computational domain at different latitudes ranging from the pole to the equator of the star. We perform three-dimensional numerical simulations in the same parameter regimes for comparison. The free parameters appearing in the closure models are calibrated by two fit methods using simulation data. Unique determination of the closure parameters is possible only in the non-rotating case and when the system is placed at the pole. In the other cases the fit procedures yield somewhat differing results. The quality of the closu...

  13. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping to find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  14. DInSAR fringes simulation of sandbox models

    Science.gov (United States)

    Derron, Marc-Henri; Carrea, Dario; Michoud, Clément; Jaboyedoff, Michel

    2015-04-01

    Interpreting satellite DInSAR patterns of slope movements can be difficult because of unwrapping problems, loss of coherence, or radar imaging geometry limitations (layover, shadowing …). We investigate the potential of simulating interferometric fringes as a tool to help understand real DInSAR images. Various types of gravitational slope deformation (sliding, toppling …) have been produced in a sandbox in the lab. These experiments were monitored with a Konica Minolta Vivid 9i micro-lidar to obtain successive Digital Elevation Models (DEMs) of the surface. A pair of DEMs is then used to simulate the DInSAR fringe pattern, with the possibility to vary the wavelength, the angle between the line of sight and the ground displacement, the look angle, the baseline, etc. The DInSAR fringes simulated here are idealized: they are not affected by any noise, decoherence, layover or shadow effects, and radar image deformations are computed in ancillary files. However, it appears that even these ideal wrapped fringe patterns rapidly become very complex when the deformation is strong. This kind of tool is therefore of interest for better constraining ground surface deformations from the resulting InSAR fringes (from lab models or real landslide data). It also makes it possible to test how the acquisition geometry impacts the InSAR result depending on the type of slope movement considered.
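
    The core of such a fringe simulation is a projection of the DEM-to-DEM displacement onto the radar line of sight and a wrap of the resulting two-way phase. The sketch below does exactly that for a vertical-only displacement field and a fixed look vector; the wavelength, geometry, and sandbox example are assumed values, and all real-data nuisances (noise, decorrelation, layover, shadow) are ignored, as in the idealized fringes described above.

        import numpy as np

        WAVELENGTH = 0.056          # C-band radar wavelength [m]

        def simulate_fringes(dem_before, dem_after, look_vector=(0.35, 0.0, -0.94)):
            """Idealised DInSAR fringe pattern from two DEMs of a lab model.

            The surface displacement between the two DEMs is projected onto the
            radar line of sight; each half-wavelength of LOS motion produces one
            full fringe. Noise, decorrelation, layover and shadow are ignored.
            """
            look = np.asarray(look_vector, dtype=float)
            look /= np.linalg.norm(look)
            dz = dem_after - dem_before                          # vertical displacement
            los_displacement = dz * look[2]                      # projection onto LOS
            phase = 4.0 * np.pi * los_displacement / WAVELENGTH  # two-way path difference
            return np.angle(np.exp(1j * phase))                  # wrapped to (-pi, pi]

        # Example: a 1 m-wide sandbox with a 3 cm subsidence bowl in the centre
        x, y = np.meshgrid(np.linspace(-0.5, 0.5, 200), np.linspace(-0.5, 0.5, 200))
        before = np.zeros_like(x)
        after = before - 0.03 * np.exp(-(x**2 + y**2) / 0.02)
        fringes = simulate_fringes(before, after)
        print("fringes across the bowl:", round(0.03 * 0.94 / (WAVELENGTH / 2), 1))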

  15. Simulated multipolarized MAPSAR images to distinguish agricultural crops

    Directory of Open Access Journals (Sweden)

    Wagner Fernando Silva

    2012-06-01

    Full Text Available Many researchers have shown the potential of Synthetic Aperture Radar (SAR images for agricultural applications, particularly for monitoring regions with limitations in terms of acquiring cloud free optical images. Recently, Brazil and Germany began a feasibility study on the construction of an orbital L-band SAR sensor referred to as MAPSAR (Multi-Application Purpose SAR. This sensor provides L-band images in three spatial resolutions and polarimetric, interferometric and stereoscopic capabilities. Thus, studies are needed to evaluate the potential of future MAPSAR images. The objective of this study was to evaluate multipolarized MAPSAR images simulated by the airborne SAR-R99B sensor to distinguish coffee, cotton and pasture fields in Brazil. Discrimination among crops was evaluated through graphical and cluster analysis of mean backscatter values, considering single, dual and triple polarizations. Planting row direction of coffee influenced the backscatter and was divided into two classes: parallel and perpendicular to the sensor look direction. Single polarizations had poor ability to discriminate the crops. The overall accuracies were less than 59 %, but the understanding of the microwave interaction with the crops could be explored. Combinations of two polarizations could differentiate various fields of crops, highlighting the combination VV-HV that reached 78 % overall accuracy. The use of three polarizations resulted in 85.4 % overall accuracy, indicating that the classes pasture and parallel coffee were fully discriminated from the other classes. These results confirmed the potential of multipolarized MAPSAR images to distinguish the studied crops and showed considerable improvement in the accuracy of the results when the number of polarizations was increased.

  16. Image segmentation with a unified graphical model.

    Science.gov (United States)

    Zhang, Lei; Ji, Qiang

    2010-08-01

    We propose a unified graphical model that can represent both the causal and noncausal relationships among random variables and apply it to the image segmentation problem. Specifically, we first propose to employ Conditional Random Field (CRF) to model the spatial relationships among image superpixel regions and their measurements. We then introduce a multilayer Bayesian Network (BN) to model the causal dependencies that naturally exist among different image entities, including image regions, edges, and vertices. The CRF model and the BN model are then systematically and seamlessly combined through the theories of Factor Graph to form a unified probabilistic graphical model that captures the complex relationships among different image entities. Using the unified graphical model, image segmentation can be performed through a principled probabilistic inference. Experimental results on the Weizmann horse data set, on the VOC2006 cow data set, and on the MSRC2 multiclass data set demonstrate that our approach achieves favorable results compared to state-of-the-art approaches as well as those that use either the BN model or CRF model alone.

  17. A Fast Visible-Infrared Imaging Radiometer Suite Simulator for Cloudy Atmospheres

    Science.gov (United States)

    Liu, Chao; Yang, Ping; Nasiri, Shaima L.; Platnick, Steven; Meyer, Kerry G.; Wang, Chen Xi; Ding, Shouguo

    2015-01-01

    A fast instrument simulator is developed to simulate the observations made in cloudy atmospheres by the Visible Infrared Imaging Radiometer Suite (VIIRS). The correlated k-distribution (CKD) technique is used to compute the transmissivity of absorbing atmospheric gases. The bulk scattering properties of ice clouds used in this study are based on the ice model used for the MODIS Collection 6 ice cloud products. Two fast radiative transfer models based on pre-computed ice cloud look-up tables are used for the VIIRS solar and infrared channels. The accuracy and efficiency of the fast simulator are quantified in comparison with a combination of the rigorous line-by-line (LBLRTM) and discrete ordinate radiative transfer (DISORT) models. Relative errors are less than 2% for the simulated TOA reflectances of the solar channels, and the brightness temperature differences for the infrared channels are less than 0.2 K. The simulator is over three orders of magnitude faster than the benchmark LBLRTM+DISORT model. Furthermore, the cloudy atmosphere reflectances and brightness temperatures from the fast VIIRS simulator compare favorably with those from VIIRS observations.

  18. Image-based numerical simulation of hemodynamics in an intracranial aneurysm

    Science.gov (United States)

    Le, Trung; Ge, Liang; Sotiropoulos, Fotis; Kallmes, David; Cloft, Harry; Lewis, Debra; Dai, Daying; Ding, Yonghong; Kadirvel, Ramanathan

    2007-11-01

    Image-based numerical simulations of hemodynamics in an intracranial aneurysm are carried out. The numerical solver, based on the CURVIB (curvilinear grid/immersed boundary) approach developed in Ge and Sotiropoulos, JCP 2007, is used to simulate the blood flow. A curvilinear grid system that gradually follows the curved geometry of the artery wall and consists of approximately 5M grid nodes is constructed as the background grid system, and the boundaries of the investigated artery and aneurysm are treated as immersed boundaries. The surface geometry of the aneurysm wall is reconstructed from an angiography study of an aneurysm formed on the common carotid artery (CCA) of a rabbit and discretized with triangular meshes. At the inlet a physiological flow waveform is specified and direct numerical simulations are used to simulate the blood flow. Very rich vortical dynamics is observed within the aneurysm area: a ring-like vortex sheds from the proximal side of the aneurysm, develops and impinges onto the distal side as the flow evolves, and breaks up into smaller vortices later in the cardiac cycle. This work was supported in part by the University of Minnesota Supercomputing Institute.

  19. Modeling and simulation with operator scaling

    CERN Document Server

    Cohen, Serge; Rosinski, Jan

    2009-01-01

    Self-similar processes are useful in modeling diverse phenomena that exhibit scaling properties. Operator scaling allows a different scale factor in each coordinate. This paper develops practical methods for modeling and simulating stochastic processes with operator scaling. A simulation method for operator stable Levy processes is developed, based on a series representation, along with a Gaussian approximation of the small jumps. Several examples are given to illustrate practical applications. A classification of operator stable Levy processes in two dimensions is provided according to their exponents and symmetry groups. We conclude with some remarks and extensions to general operator self-similar processes.

  20. Hemispherical sky simulator for daylighting model studies

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, S.

    1981-07-01

    The design of a 24-foot-diameter hemispherical sky simulator recently completed at LBL is described. The goal was to produce a facility in which large models could be tested; which was suitable for research, teaching, and design; which could provide a uniform sky, an overcast sky, and several clear-sky luminance distributions, as well as accommodating an artificial sun. Initial operating experience with the facility is described, the sky simulator capabilities are reviewed, and its strengths and weaknesses relative to outdoor modeling tests are discussed.

  1. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast

    Science.gov (United States)

    Chen, L.; Boone, J. M.; Abbey, C. K.; Hargreaves, J.; Bateni, C.; Lindfors, K. K.; Yang, K.; Nosratieh, A.; Hernandez, A.; Gazi, P.

    2015-04-01

    The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Three radiologists and six physicists each executed a two-alternative forced choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin-section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33, 0.71, 1.5 and 2.9 mm were representative of breast CT, while the average 43 mm slice thickness served to simulate projection images of the breast. The percent correct of the human observers' responses was evaluated in the 2AFC experiments. Radiologists' lesion detection performance differed significantly from that of the physicist observers; however, trends in performance were similar. Human observers demonstrate significantly better mass-lesion detection performance on thin-section CT images of the breast, compared to thick-section simulated projection images of the breast.
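    The slice-averaging step described above is straightforward to reproduce; the sketch below (an illustration under assumed array shapes, not the study's code) averages groups of native 0.33 mm sections along the depth axis to emulate thicker slices:

```python
# Average native thin CT sections into thicker slabs (illustrative sketch).
import numpy as np

def average_sections(volume, native_mm=0.33, target_mm=2.9):
    """volume: (n_sections, ny, nx) array of thin sections along axis 0."""
    n = max(1, int(round(target_mm / native_mm)))   # sections per slab
    n_slabs = volume.shape[0] // n
    trimmed = volume[: n_slabs * n]
    return trimmed.reshape(n_slabs, n, *volume.shape[1:]).mean(axis=1)

if __name__ == "__main__":
    vol = np.random.rand(132, 64, 64)                # toy volume of 0.33 mm sections
    print(average_sections(vol, 0.33, 2.9).shape)    # ~2.9 mm slices
    print(average_sections(vol, 0.33, 43.0).shape)   # ~43 mm projection-like slab
```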

  2. Wind Shear Target Echo Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Xiaoyang Liu

    2015-01-01

    Full Text Available Wind shear is a dangerous atmospheric phenomenon in aviation, defined as a sudden change in the speed or direction of the wind. In order to analyze the influence of wind shear on aircraft performance, this paper proposes a mathematical model of point-target rain echo and weather-target signal echo based on the Doppler effect. A wind field model is developed, and the antenna model is also studied using Bessel functions. The spectrum distribution of symmetric and asymmetric wind fields is investigated using the proposed mathematical model. The simulation results are consistent with the radial velocity component and also confirm the correctness of the established antenna model.

  3. Nuclear reactor core modelling in multifunctional simulators

    Energy Technology Data Exchange (ETDEWEB)

    Puska, E.K. [VTT Energy, Nuclear Energy, Espoo (Finland)

    1999-06-01

    The thesis concentrates on the development of nuclear reactor core models for the APROS multifunctional simulation environment and the use of the core models in various kinds of applications. The work was started in 1986 as a part of the development of the entire APROS simulation system. The aim was to create core models that would serve in a reliable manner in an interactive, modular and multifunctional simulator/plant analyser environment. One-dimensional and three-dimensional core neutronics models have been developed. Both models have two energy groups and six delayed neutron groups. The three-dimensional finite difference type core model is able to describe both BWR- and PWR-type cores with quadratic fuel assemblies and VVER-type cores with hexagonal fuel assemblies. The one- and three-dimensional core neutronics models can be connected with the homogeneous, the five-equation or the six-equation thermal hydraulic models of APROS. The key feature of APROS is that the same physical models can be used in various applications. The nuclear reactor core models of APROS have been built in such a manner that the same models can be used in simulator and plant analyser applications, as well as in safety analysis. In the APROS environment the user can select the number of flow channels in the three-dimensional reactor core and either the homogeneous, the five- or the six-equation thermal hydraulic model for these channels. The thermal hydraulic model and the number of flow channels have a decisive effect on the calculation time of the three-dimensional core model and thus, at present, these particular selections make the major difference between a safety analysis core model and a training simulator core model. The emphasis of this thesis is on the three-dimensional core model and its capability to analyse symmetric and asymmetric events in the core. The factors affecting the calculation times of various three-dimensional BWR-, PWR- and VVER-type APROS core models have also been analysed.

  4. Battery thermal models for hybrid vehicle simulations

    Science.gov (United States)

    Pesaran, Ahmad A.

    This paper summarizes battery thermal modeling capabilities for: (1) an advanced vehicle simulator (ADVISOR); and (2) battery module and pack thermal design. The National Renewable Energy Laboratory's (NREL's) ADVISOR is developed in the Matlab/Simulink environment. There are several battery models in ADVISOR for various chemistry types. Each one of these models requires a thermal model to predict the temperature change that could affect battery performance parameters, such as resistance, capacity and state of charge. A lumped-capacitance battery thermal model in the Matlab/Simulink environment was developed that included the ADVISOR battery performance models. For thermal evaluation and design of battery modules and packs, NREL has been using various computer aided engineering tools including commercial finite element analysis software. This paper will discuss the thermal ADVISOR battery model and its results, along with the results of finite element modeling that were presented at the workshop on "Development of Advanced Battery Engineering Models" in August 2001.
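    For readers unfamiliar with the lumped-capacitance idea, the sketch below shows the basic energy balance it relies on (a generic illustration with assumed parameter values, not NREL's ADVISOR code):

```python
# Lumped-capacitance battery thermal model: a single energy balance per module.
# dT/dt = (Q_gen - h*A*(T - T_air)) / (m * cp), integrated with forward Euler.

def simulate_module_temperature(q_gen_w, t_air_c=25.0, t0_c=25.0,
                                mass_kg=10.0, cp_j_per_kgk=900.0,
                                h_w_per_m2k=15.0, area_m2=0.5, dt_s=1.0):
    """Return the module temperature trace for a list of heat-generation samples (W)."""
    temps = [t0_c]
    for q in q_gen_w:
        t = temps[-1]
        q_conv = h_w_per_m2k * area_m2 * (t - t_air_c)   # convective loss to ambient air
        temps.append(t + dt_s * (q - q_conv) / (mass_kg * cp_j_per_kgk))
    return temps

if __name__ == "__main__":
    heat = [50.0] * 1800 + [5.0] * 1800                  # toy drive-cycle heat profile (W)
    trace = simulate_module_temperature(heat)
    print(round(trace[-1], 2), "deg C after one hour")
```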

  5. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.

    Science.gov (United States)

    Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K

    2013-05-01

    Augmented reality (AR) is a technology which enables users to see the real world, with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were scanned using a 64-channel multidetector. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group) and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire which was administered after using the simulator. The group that was trained using an AR simulator was more proficient at IV injection technique using real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education.

  6. Mathematical modeling in biomedical imaging

    CERN Document Server

    2009-01-01

    This volume gives an introduction to a fascinating research area to applied mathematicians. It is devoted to providing the exposition of promising analytical and numerical techniques for solving challenging biomedical imaging problems, which trigger the investigation of interesting issues in various branches of mathematics.

  7. Modeling and interpretation of images

    NARCIS (Netherlands)

    M. Min

    2014-01-01

    Imaging protoplanetary disks is a challenging but rewarding task. It is challenging because of the glare of the central star outshining the weak signal from the disk at shorter wavelengths and because of the limited spatial resolution at longer wavelengths. It is rewarding because it contains a wealth of information.

  8. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  9. Numerical simulation research on sodium laser beacon imaging through atmospheric turbulence

    Science.gov (United States)

    Liu, Xiangyuan; Qian, Xianmei; Zhang, Suimeng; Zhao, Minfu; Cui, Chaolong; Huang, Honghua

    2016-01-01

    Based on the relative intensity distribution of the Sodium Laser Beacon (SLB) and an analysis of the on-axis imaging of incoherent light, and considering the effects of atmospheric turbulence and of the telescope receiving diameter on short-exposure SLB images on the focal plane, images of an extended-source SLB are simulated under three atmospheric turbulence models. Results indicate that the sharpness and peak Strehl ratio of SLB images increase, while the sharpness radius decreases, as the atmospheric turbulence strength decreases. Moreover, reducing the telescope diameter from 3.0 m to 1.5 m decreases the sharpness and peak Strehl ratio and increases the sharpness radius.

  10. Fault-Tolerant Robot Programming through Simulation with Realistic Sensor Models

    OpenAIRE

    Axel Waggershauser; Thomas Braeunl; Andreas Koestler

    2008-01-01

    We introduce a simulation system for mobile robots that allows a realistic interaction of multiple robots in a common environment. The simulated robots are closely modeled after robots from the EyeBot family and have an identical application programmer interface. The simulation supports driving commands at two levels of abstraction as well as numerous sensors such as shaft encoders, infrared distance sensors, and compass. Simulation of on-board digital cameras via synthetic images allows the ...

  11. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  12. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel...

  13. Rapid quantitative pharmacodynamic imaging by a novel method: theory, simulation testing and proof of principle

    Directory of Open Access Journals (Sweden)

    Kevin J. Black

    2013-08-01

    Full Text Available Pharmacological challenge imaging has mapped, but rarely quantified, the sensitivity of a biological system to a given drug. We describe a novel method called rapid quantitative pharmacodynamic imaging. This method combines pharmacokinetic-pharmacodynamic modeling, repeated small doses of a challenge drug over a short time scale, and functional imaging to rapidly provide quantitative estimates of drug sensitivity including EC50 (the concentration of drug that produces half the maximum possible effect). We first test the method with simulated data, assuming a typical sigmoidal dose-response curve and assuming imperfect imaging that includes artifactual baseline signal drift and random error. With these few assumptions, rapid quantitative pharmacodynamic imaging reliably estimates EC50 from the simulated data, except when noise overwhelms the drug effect or when the effect occurs only at high doses. In preliminary fMRI studies of primate brain using a dopamine agonist, the observed noise level is modest compared with observed drug effects, and a quantitative EC50 can be obtained from some regional time-signal curves. Taken together, these results suggest that research and clinical applications for rapid quantitative pharmacodynamic imaging are realistic.
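    The core estimation step lends itself to a compact illustration. The sketch below is not the authors' pipeline; the dose-response form, drift term, noise level and parameter values are all assumptions. It fits a sigmoidal Emax model with a linear baseline drift to noisy response data and recovers EC50:

```python
# Fit a sigmoidal Emax dose-response model (with baseline drift) to noisy data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Measurement schedule: repeated small doses -> rising concentration (assumed units).
conc = np.linspace(0.0, 10.0, 25)
idx = np.arange(conc.size)

def model(x, e0, emax, ec50, drift):
    """x stacks (concentration, measurement index); sigmoidal Emax plus linear drift."""
    c, i = x
    return e0 + drift * i + emax * c / (ec50 + c)

# Simulate noisy observations with a known EC50 of 2.0.
true = dict(e0=1.0, emax=5.0, ec50=2.0, drift=0.02)
signal = model((conc, idx), **true) + rng.normal(0.0, 0.15, conc.size)

# Recover the parameters; p0 is a rough initial guess.
popt, _ = curve_fit(model, (conc, idx), signal, p0=[0.0, 1.0, 1.0, 0.0])
print("estimated EC50:", round(popt[2], 2))
```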

  14. A fast convolution-based methodology to simulate 2-D/3-D cardiac ultrasound images.

    Science.gov (United States)

    Gao, Hang; Choi, Hon Fai; Claus, Piet; Boonen, Steven; Jaecques, Siegfried; Van Lenthe, G Harry; Van der Perre, Georges; Lauriks, Walter; D'hooge, Jan

    2009-02-01

    This paper describes a fast convolution-based methodology for simulating ultrasound images in a 2-D/3-D sector format as typically used in cardiac ultrasound. The conventional convolution model is based on the assumption of a space-invariant point spread function (PSF) and typically results in linear images. These characteristics are not representative for cardiac data sets. The spatial impulse response method (IRM) has excellent accuracy in the linear domain; however, calculation time can become an issue when scatterer numbers become significant and when 3-D volumetric data sets need to be computed. As a solution to these problems, the current manuscript proposes a new convolution-based methodology in which the data sets are produced by reducing the conventional 2-D/3-D convolution model to multiple 1-D convolutions (one for each image line). As an example, simulated 2-D/3-D phantom images are presented along with their gray scale histogram statistics. In addition, the computation time is recorded and contrasted to a commonly used implementation of IRM (Field II). It is shown that COLE can produce anatomically plausible images with local Rayleigh statistics but at improved calculation time (1200 times faster than the reference method).
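    The line-wise reduction is easy to picture with a toy example. The following sketch is a simplified illustration, not the COLE implementation; the pulse shape, sampling parameters and random scatterer lines are assumptions. It convolves each axial scatterer line with a 1-D pulse and detects the envelope to form one image line:

```python
# Per-line 1-D convolution ultrasound simulation (simplified illustration).
import numpy as np
from scipy.signal import hilbert

FS, F0 = 50e6, 3.5e6                       # sampling rate and centre frequency (assumed)

def make_pulse(cycles=3):
    """Gaussian-windowed cosine pulse standing in for the axial point spread function."""
    t = np.arange(-cycles / F0, cycles / F0, 1.0 / FS)
    return np.cos(2 * np.pi * F0 * t) * np.exp(-(t * F0 / cycles * 2) ** 2)

def simulate_line(scatterer_amps, pulse):
    rf = np.convolve(scatterer_amps, pulse, mode="same")   # one 1-D convolution per image line
    return np.abs(hilbert(rf))                             # envelope detection

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pulse = make_pulse()
    n_lines, n_samples = 64, 2048
    image = np.stack([simulate_line(rng.normal(0, 1, n_samples), pulse)
                      for _ in range(n_lines)])
    log_img = 20 * np.log10(image / image.max() + 1e-6)     # log compression for display
    print(log_img.shape)
```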

  15. Background and Imaging Simulations for the Hard X-Ray Camera of the MIRAX Mission

    CERN Document Server

    Castro, Manuel; Penacchioni, Ana; D'Amico, Flavio; Sacahui, Rodrigo

    2016-01-01

    We report the results of detailed Monte Carlo simulations of the performance expected both at balloon altitudes and at the probable satellite orbit of a hard X-ray coded-aperture camera being developed for the MIRAX mission. Based on a thorough mass model of the instrument and detailed specifications of the spectra and angular dependence of the various relevant radiation fields at both the stratospheric and orbital environments, we have used the well-known package GEANT4 to simulate the instrumental background of the camera. We also show simulated images of source fields to be observed and calculated the detailed sensitivity of the instrument in both situations. The results reported here are especially important to researchers in this field considering that we provide important information, not easily found in the literature, on how to prepare input files and calculate crucial instrumental parameters to perform GEANT4 simulations for high-energy astrophysics space experiments.

  16. Background and imaging simulations for the hard X-ray camera of the MIRAX mission

    Science.gov (United States)

    Castro, M.; Braga, J.; Penacchioni, A.; D'Amico, F.; Sacahui, R.

    2016-07-01

    We report the results of detailed Monte Carlo simulations of the performance expected both at balloon altitudes and at the probable satellite orbit of a hard X-ray coded-aperture camera being developed for the Monitor e Imageador de RAios X (MIRAX) mission. Based on a thorough mass model of the instrument and detailed specifications of the spectra and angular dependence of the various relevant radiation fields at both the stratospheric and orbital environments, we have used the well-known package GEANT4 to simulate the instrumental background of the camera. We also show simulated images of source fields to be observed and calculated the detailed sensitivity of the instrument in both situations. The results reported here are especially important to researchers in this field considering that we provide important information, not easily found in the literature, on how to prepare input files and calculate crucial instrumental parameters to perform GEANT4 simulations for high-energy astrophysics space experiments.

  17. Distributed three-dimensional simulation of B-mode ultrasound imaging using a first-order k-space method.

    Science.gov (United States)

    Daoud, Mohammad I; Lacefield, James C

    2009-09-07

    Computational modeling is an important tool in ultrasound imaging research, but realistic three-dimensional (3D) simulations can exceed the capabilities of serial computers. This paper uses a 3D simulator based on a k-space method that incorporates relaxation absorption and nonreflecting boundary conditions. The simulator, which runs on computer clusters, computes the propagation of a single wavefront. In this paper, an allocation algorithm is introduced to assign each scan line to a group of nodes and use multiple groups to compute independent lines concurrently. The computational complexity required for realistic simulations is analyzed using example calculations of ultrasonic propagation and attenuation in the 30-50 MHz band. Parallel efficiency for B-mode imaging simulations is evaluated for various numbers of scan lines and cluster nodes. An aperture-projection technique is introduced to simulate imaging with a focused transducer using reduced computation grids. This technique is employed to synthesize B-mode images that show realistic 3D refraction artifacts. Parallel computing using 20 nodes to compute groups of ten scan lines concurrently reduced the execution time for each image to 18.6 h, compared to a serial execution time of 357.5 h. The results demonstrate that fully 3D imaging simulations are practical using contemporary computing technology.

  18. A simulation method of aircraft plumes for real-time imaging

    Science.gov (United States)

    Li, Ni; Lv, Zhenhua; Huai, Wenqin; Gong, Guanghong

    2016-07-01

    Real-time infrared simulation technology can provide a large number of infrared images under different conditions to support the design, test and evaluation of systems with infrared imaging equipment at very low cost. By combining heat transfer, infrared physics, fluid mechanics and computer graphics, a real-time infrared simulation method based on the method of characteristics is proposed to predict the infrared signature of aircraft plumes, aiming for a good balance between simulation precision and computational efficiency. The temperature and pressure distributions in the under-expanded state can be rapidly solved for dynamically changing flight conditions and engine working states. A modified Curtis-Godson (C-G) spectral band model, which combines the plume streamlines with the conventional C-G spectral band model, was implemented to calculate the non-uniformly distributed radiation parameters inside the plume field. The simulation results were analyzed and compared with CFD++ results, which validates the credibility and efficiency of the proposed simulation method.

  19. Characterization of tissue-simulating polymers for photoacoustic vascular imaging

    Science.gov (United States)

    Vogt, William C.; Jia, Congxian; Garra, Brian S.; Pfefer, T. Joshua

    2014-05-01

    Photoacoustic tomography (PAT) is a maturing imaging technique which combines optical excitation and acoustic detection to enable deep tissue sensing for biomedical applications. Optical absorption provides biochemical specificity and high optical contrast while ultrasonic detection provides high spatial resolution and penetration depth. These characteristics make PAT highly suitable as an approach for vascular imaging. However, standard testing methods are needed in order to characterize and compare the performance of these systems. Tissue-mimicking phantoms are commonly used as standard test samples for imaging system development and evaluation due to their repeatable fabrication and tunable properties. The multi-domain mechanism behind PAT necessitates development of phantoms that accurately mimic both acoustic and optical properties of tissues. While a wide variety of materials have been used in the literature, from gelatin and agar hydrogels to silicone, published data indicates that poly(vinyl chloride) plastisol (PVCP) is a promising candidate material for simulating tissue optical and acoustic properties while also providing superior longevity and stability. Critical acoustic properties of PVCP phantoms, including sound velocity and attenuation, were measured using acoustic transmission measurements at multiple frequencies relevant to typical PAT systems. Optical absorption and scattering coefficients of PVCP gels with and without biologically relevant absorbers and scatterers were measured over wavelengths from 500 to 1100 nm. A custom PAT system was developed to assess image contrast in PVCP phantoms containing fluid channels filled with absorbing dye. PVCP demonstrates strong potential as the basis of high-fidelity polymer phantoms for developing and evaluating PAT systems for vascular imaging applications.

  20. GRMHD simulations of visibility amplitude variability for Event Horizon Telescope images of Sgr A*

    CERN Document Server

    Medeiros, Lia; Ozel, Feryal; Psaltis, Dimitrios; Kim, Junhan; Marrone, Daniel P; Sadowski, Aleksander

    2016-01-01

    Synthesis imaging of the black hole in the center of the Milky Way, Sgr A*, with the Event Horizon Telescope (EHT) rests on the assumption of a stationary image. We explore the limitations of this assumption using high-cadence GRMHD simulations of Sgr A*. We employ analytic models that capture the basic characteristics of the images to understand the origin of the variability in the simulated visibility amplitudes. We find that, in all simulations, the visibility amplitudes for baselines oriented perpendicular to the spin axis of the black hole typically decrease smoothly over baseline lengths that are comparable to those of the EHT. On the other hand, the visibility amplitudes for baselines oriented parallel to the spin axis show significant structure with one or more minima. This suggests that fitting EHT observations with geometric models will lead to reasonably accurate determination of the orientation of the black-hole on the plane of the sky. However, in the disk-dominated models, the locations and dept...
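    The relationship between an image and its visibility amplitudes that underlies this analysis can be illustrated compactly. The sketch below is a generic illustration, not the paper's GRMHD pipeline; the toy ring image and the baseline sampling are assumptions. It Fourier-transforms a model image and reads off amplitudes along two orthogonal baseline orientations:

```python
# Visibility amplitudes of a model image along two baseline orientations.
import numpy as np

def visibility_amplitudes(image):
    """Return |FFT| of the image with the zero-frequency component centred."""
    vis = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(image)))
    return np.abs(vis)

if __name__ == "__main__":
    n = 256
    x, y = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2)
    r = np.hypot(x, y)
    # Toy asymmetric ring as a stand-in for a black-hole image.
    image = np.exp(-((r - 30) / 6) ** 2) * (1.0 + 0.5 * x / (r + 1e-9))
    amp = visibility_amplitudes(image)
    centre = n // 2
    parallel = amp[centre, centre:]        # baselines along one axis
    perpendicular = amp[centre:, centre]   # baselines along the orthogonal axis
    print(parallel[:5], perpendicular[:5])
```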

  1. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  2. A Monte Carlo Simulation Framework for Testing Cosmological Models

    Directory of Open Access Journals (Sweden)

    Heymann Y.

    2014-10-01

    Full Text Available We tested alternative cosmologies using Monte Carlo simulations based on the sampling method of the zCosmos galactic survey. The survey encompasses a collection of observable galaxies with respective redshifts that have been obtained for a given spectroscopic area of the sky. Using a cosmological model, we can convert the redshifts into light-travel times and, by slicing the survey into small redshift buckets, compute a curve of galactic density over time. Because foreground galaxies obstruct the images of more distant galaxies, we simulated the theoretical galactic density curve using an average galactic radius. By comparing the galactic density curves of the simulations with that of the survey, we could assess the cosmologies. We applied the test to the expanding-universe cosmology of de Sitter and to a dichotomous cosmology.
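    A stripped-down version of the bucketing step reads as follows (an illustration only, not the authors' code; the de Sitter light-travel-time relation T = ln(1+z)/H0, the Hubble constant and the mock redshift distribution are assumptions):

```python
# Galactic density versus light-travel time from a mock redshift catalogue.
import numpy as np

H0 = 70.0 / 3.086e19          # Hubble constant in s^-1 (70 km/s/Mpc, assumed)
GYR = 3.156e16                # seconds per Gyr

def light_travel_time_de_sitter(z):
    """de Sitter relation 1 + z = exp(H0 * T)  =>  T = ln(1 + z) / H0, returned in Gyr."""
    return np.log1p(z) / H0 / GYR

def density_curve(redshifts, n_buckets=20):
    """Count galaxies per light-travel-time bucket (a proxy for the galactic density curve)."""
    t = light_travel_time_de_sitter(np.asarray(redshifts))
    counts, edges = np.histogram(t, bins=n_buckets)
    widths = np.diff(edges)
    return 0.5 * (edges[:-1] + edges[1:]), counts / widths

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    z_mock = rng.uniform(0.05, 1.2, 10000)      # toy redshift sample
    centres, density = density_curve(z_mock)
    print(centres[:3], density[:3])
```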

  3. Computer simulation of orthognathic surgery with video imaging

    Science.gov (United States)

    Sader, Robert; Zeilhofer, Hans-Florian U.; Horch, Hans-Henning

    1994-04-01

    Patients with extreme jaw imbalance must often undergo operative corrections. The goal of therapy is to harmonize the stomatognathic system and to achieve an aesthetic correction of the facial profile. A new procedure is presented which supports the maxillo-facial surgeon in planning the operation and which also shows the patient the result of the treatment through video images. Once an x-ray has been digitized, it is possible to produce individualized cephalometric analyses. Using a ceph on screen, all current orthognathic operations can be simulated, whereby the bony segments are moved according to given parameters and a new soft tissue profile is calculated. The profile of the patient is fed into the computer via a video system and correlated to the ceph. Using the simulated operation, the computer calculates a new video image of the patient which presents the expected postoperative appearance. In studies of patients treated between 1987 and 1991, 76 out of 121 patients could be evaluated. The deviation in profile change varied between 0.0 and 1.6 mm. A side effect of the practical application was an increase in patient compliance.

  4. EXACT SIMULATION OF A BOOLEAN MODEL

    Directory of Open Access Journals (Sweden)

    Christian Lantuéjoul

    2013-06-01

    Full Text Available A Boolean model is a union of independent objects (compact random subsets) located at Poisson points. Two algorithms are proposed for simulating a Boolean model in a bounded domain. The first one applies only to stationary models. It generates the objects prior to their Poisson locations. Two examples illustrate its applicability. The second algorithm applies to stationary and non-stationary models. It generates the Poisson points prior to the objects. Its practical difficulties of implementation are discussed. Both algorithms are based on importance sampling techniques, and the generated objects are weighted.
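    A minimal version of the second strategy (Poisson points first, then objects) can be sketched as follows (illustrative only, not the paper's algorithms; discs with uniform radii in an extended window stand in for the general objects, and no importance sampling is used):

```python
# Simulate a Boolean model of random discs in a bounded window (illustrative sketch).
import numpy as np

def boolean_model_mask(intensity, r_max, window=1.0, grid=512, seed=0):
    """Rasterize the union of discs whose centres form a Poisson point process.

    The window is extended by r_max on each side so that discs centred outside the
    target window but overlapping it are not missed (simple edge correction).
    """
    rng = np.random.default_rng(seed)
    ext = window + 2 * r_max
    n = rng.poisson(intensity * ext * ext)                 # Poisson number of germs
    centres = rng.uniform(-r_max, window + r_max, (n, 2))  # germ locations
    radii = rng.uniform(0.2 * r_max, r_max, n)             # grain (disc) radii
    xs = np.linspace(0.0, window, grid)
    xx, yy = np.meshgrid(xs, xs)
    mask = np.zeros((grid, grid), dtype=bool)
    for (cx, cy), r in zip(centres, radii):
        mask |= (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2  # union of the objects
    return mask

if __name__ == "__main__":
    m = boolean_model_mask(intensity=80.0, r_max=0.08)
    print("covered fraction:", m.mean())
```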

  5. Biological transportation networks: Modeling and simulation

    KAUST Repository

    Albi, Giacomo

    2015-09-15

    We present a model for biological network formation originally introduced by Cai and Hu [Adaptation and optimization of biological transport networks, Phys. Rev. Lett. 111 (2013) 138701]. The modeling of fluid transportation (e.g., leaf venation and angiogenesis) and ion transportation networks (e.g., neural networks) is explained in detail and basic analytical features like the gradient flow structure of the fluid transportation network model and the impact of the model parameters on the geometry and topology of network formation are analyzed. We also present a numerical finite-element based discretization scheme and discuss sample cases of network formation simulations.

  6. Modeling and Simulation of Nuclear Fuel Materials

    Energy Technology Data Exchange (ETDEWEB)

    Devanathan, Ramaswami; Van Brutzel, Laurent; Chartier, Alan; Gueneau, Christine; Mattsson, Ann E.; Tikare, Veena; Bartel, Timothy; Besmann, T. M.; Stan, Marius; Van Uffelen, Paul

    2010-10-01

    We review the state of modeling and simulation of nuclear fuels with emphasis on the most widely used nuclear fuel, UO2. The hierarchical scheme presented represents a science-based approach to modeling nuclear fuels by progressively passing information in several stages from ab initio to continuum levels. Such an approach is essential to overcome the challenges posed by radioactive materials handling, experimental limitations in modeling extreme conditions and accident scenarios, and the small time and distance scales of fundamental defect processes. When used in conjunction with experimental validation, this multiscale modeling scheme can provide valuable guidance to development of fuel for advanced reactors to meet rising global energy demand.

  7. Simulation modeling of health care policy.

    Science.gov (United States)

    Glied, Sherry; Tilipman, Nicholas

    2010-01-01

    Simulation modeling of health reform is a standard part of policy development and, in the United States, a required element in enacting health reform legislation. Modelers use three types of basic structures to build models of the health system: microsimulation, individual choice, and cell-based. These frameworks are filled in with data on baseline characteristics of the system and parameters describing individual behavior. Available data on baseline characteristics are imprecise, and estimates of key empirical parameters vary widely. A comparison of estimated and realized consequences of several health reform proposals suggests that models provided reasonably accurate estimates, with confidence bounds of approximately 30%.

  8. Source coding model for repeated snapshot imaging

    CERN Document Server

    Li, Junhui; Yang, Dongyue; Wu, Guohua; Yin, Longfei; Guo, Hong

    2016-01-01

    Imaging based on successive repeated snapshot measurements is modeled as a source coding process in information theory. The number of measurements necessary to maintain a certain error rate is described by the rate-distortion function of the source coding. A quantitative formula relating the error rate to the number of measurements is derived, based on the information capacity of the imaging system. A second-order fluctuation correlation imaging (SFCI) experiment with pseudo-thermal light verifies this formula, which paves the way for introducing information theory into the study of ghost imaging (GI), both conventional and computational.

  9. Modeling and simulation of epidemic spread

    DEFF Research Database (Denmark)

    Shatnawi, Maad; Lazarova-Molnar, Sanja; Zaki, Nazar

    2013-01-01

    and control such epidemics. This paper presents an overview of the epidemic spread modeling and simulation, and summarizes the main technical challenges in this field. It further investigates the most relevant recent approaches carried out towards this perspective and provides a comparison and classification...

  10. Object Oriented Modelling and Dynamical Simulation

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1998-01-01

    This report with appendix describes the work done in a master project at DTU. The goal of the project was to develop a concept for simulation of dynamical systems based on object-oriented methods. The result was a library of C++ classes, for use both when building component-based models and when...

  11. Modeling and Simulating Virtual Anatomical Humans

    NARCIS (Netherlands)

    Madehkhaksar, Forough; Luo, Zhiping; Pronost, Nicolas; Egges, Arjan

    2014-01-01

    This chapter presents human musculoskeletal modeling and simulation as a challenging field that lies between biomechanics and computer animation. One of the main goals of computer animation research is to develop algorithms and systems that produce plausible motion. On the other hand, the main chall

  12. Modeling and Simulation in Healthcare Future Directions

    Science.gov (United States)

    2010-07-13

    (Only presentation-slide fragments are available for this record.) Recoverable points: quantify performance (competency-based); simulate before practice using digital libraries; classic education and examination versus a revolution in training; actor patients cost roughly $250,000 to $400,000 per year, compared with digital libraries or synthetic tissue models, where the trade-off is subscription versus up-front costs.

  13. Simulation Versus Models: Which One and When?

    Science.gov (United States)

    Dorn, William S.

    1975-01-01

    Describes two types of computer-based experiments: simulation (which assumes no student knowledge of the workings of the computer program) is recommended for experiments aimed at inductive reasoning; and modeling (which assumes student understanding of the computer program) is recommended for deductive processes. (MLH)

  14. Love Kills: Simulations in Penna Ageing Model

    Science.gov (United States)

    Stauffer, Dietrich; Cebrat, Stanisław; Penna, T. J. P.; Sousa, A. O.

    The standard Penna ageing model with sexual reproduction is enlarged by adding additional bit-strings for love: Marriage happens only if the male love strings are sufficiently different from the female ones. We simulate at what level of required difference the population dies out.

  15. Inverse modeling for Large-Eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.

    1998-01-01

    Approximate higher order polynomial inversion of the top-hat filter is developed with which the turbulent stress tensor in Large-Eddy Simulation can be consistently represented using the filtered field. Generalized (mixed) similarity models are proposed which improved the agreement with the kinetic

  16. Microdata Simulation Modeling After Twenty Years.

    Science.gov (United States)

    Haveman, Robert H.

    1986-01-01

    This article describes the method and the development of microdata simulation modeling over the past two decades. After tracing a brief history of this evaluation method, its problems and prospects are assessed. The effects of this research method on the development of the social sciences are examined. (JAZ)

  17. Simulation Modeling on the Macintosh using STELLA.

    Science.gov (United States)

    Costanza, Robert

    1987-01-01

    Describes a new software package for the Apple Macintosh computer which can be used to create elaborate simulation models in a fraction of the time usually required without using a programming language. Illustrates the use of the software which relates to water usage. (TW)

  18. Simulation Modeling of Radio Direction Finding Results

    Directory of Open Access Journals (Sweden)

    K. Pelikan

    1994-12-01

    Full Text Available It is sometimes difficult to determine analytically the error probabilities of direction finding results when evaluating algorithms of practical interest. Probabilistic simulation models are described in this paper that can be used to study the error performance of new direction finding systems or of geographical modifications to existing configurations.

  19. A Prison/Parole System Simulation Model,

    Science.gov (United States)

    parole system on future prison and parole populations. A simulation model is presented, viewing a prison/parole system as a feedback process for criminal offenders. Transitions among the states in which an offender might be located (imprisoned, paroled, and discharged) are assumed to be in accordance with a discrete-time semi-Markov process. Projected prison and parole populations for sample data and applications of the model are discussed. (Author)

  20. Comparing a phased combination of acoustical radiosity and the image source method with other simulation tools

    DEFF Research Database (Denmark)

    Marbjerg, Gerd Høy; Brunskog, Jonas; Jeong, Cheol-Ho

    2015-01-01

    A phased combination of acoustical radiosity and the image source method (PARISM) has been developed in order to be able to model both specular and diffuse reflections with angle-dependent and complex-valued acoustical descriptions of the surfaces. It is of great interest to model both specular and diffuse reflections when simulating the acoustics of small rooms with non-diffuse sound fields, since scattering from the walls adds to the diffuseness in the room. This room type is often seen in classrooms and offices, which are often small rectangular rooms with most of the absorption placed on the ceiling. Here, PARISM is used for comparisons with other simulation tools and with measurements. An empty, rectangular room with a suspended absorbing ceiling is used for the comparisons. It was found that including the phase information in simulations increases the spatial standard deviation, even if only...

  1. Modeling gated neutron images of THD capsules

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Douglas Carl [Los Alamos National Laboratory; Grim, Gary P [Los Alamos National Laboratory; Tregillis, Ian L [Los Alamos National Laboratory; Wilke, Mark D [Los Alamos National Laboratory; Morgan, George L [Los Alamos National Laboratory; Loomis, Eric N [Los Alamos National Laboratory; Wilde, Carl H [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Fatherley, Valerie E [Los Alamos National Laboratory; Clark, David D [Los Alamos National Laboratory; Schmitt, Mark J [Los Alamos National Laboratory; Merrill, Frank E [Los Alamos National Laboratory; Wang, Tai - Sen F [Los Alamos National Laboratory; Danly, Christopher R [Los Alamos National Laboratory; Batha, Steven H [Los Alamos National Laboratory; Patel, M [LLNL; Sepke, S [LLNL; Hatarik, R [LLNL; Fittinghoff, D [LLNL; Bower, D [LLNL; Marinak, M [LLNL; Munro, D [LLNL; Moran, M [LLNL; Hilko, R [NSTEC; Frank, M [LLNL; Buckles, R [NSTEC

    2010-01-01

    Time gating a neutron detector 28 m from a NIF implosion can produce images at different energies. The brighter image near 14 MeV reflects the size and symmetry of the capsule 'hot spot'. Scattered neutrons, ~9.5-13 MeV, reflect the size and symmetry of the colder, denser fuel, but with only ~1-7% of the neutrons. The gated detector records both the scattered neutron image and, to a good approximation, an attenuated copy of the primary image left by scintillator decay. By modeling the imaging system, the energy band for the scattered neutron image (10-12 MeV) can be chosen, trading off the decayed primary image and the decrease of scattered image brightness with energy. Modeling light decay from EJ399, BC422, BCF99-55, Xylene, DPAC-30, and Liquid A leads to a preference for BCF99-55 for the first NIF detector, but DPAC-30 and Liquid A would be preferred if incorporated into a system. Measurement of the delayed light from the NIF scintillator using implosions at the Omega laser shows BCF99-55 to be a good choice for down-scattered imaging at 28 m.

  2. Hybrid Information Retrieval Model For Web Images

    CERN Document Server

    Bassil, Youssef

    2012-01-01

    The Big Bang of the Internet in the early 90's dramatically increased the number of images being distributed and shared over the web. As a result, image information retrieval systems were developed to index and retrieve image files spread over the Internet. Most of these systems are keyword-based and search for images based on their textual metadata; they are thus imprecise, since describing an image in human language is inherently vague. Besides, there exist content-based image retrieval systems which search for images based on their visual information. However, content-based systems are still immature and not that effective, as they suffer from low retrieval recall/precision rates. This paper proposes a new hybrid image information retrieval model for indexing and retrieving web images published in HTML documents. The distinguishing mark of the proposed model is that it is based on both graphical content and textual metadata. The graphical content is denoted by color features and color histogram of ...

  3. Image Deblurring Using Sparse-Land Model

    Directory of Open Access Journals (Sweden)

    Harini. B

    2012-06-01

    Full Text Available - In this paper, the problem of deblurring and poor image resolution is addressed, and the task is extended to removing blur and noise from the image to form a restored image as the output. Haar wavelet transform based Wiener filtering is used in this algorithm. Soft thresholding and the Parallel Coordinate Descent (PCD) iterative shrinkage algorithm are used for noise removal and deblurring. The Sequential Subspace Optimization (SESOP) method, or line search method, provides a speed-up to this process. The Sparse-land model is an emerging and powerful method to describe signals based on the sparsity and redundancy of their representations, and sparse representation of signals has drawn considerable interest in recent years. Sparse coding is a key principle that underlies the wavelet representation of images. Coefficients obtained after removal of noise and blur are not truly sparse, so a Minimum Mean Squared Error (MMSE) estimator estimates the sparse vectors. In this paper, we explain the effort of seeking a common wavelet sparse coding of images from the same object category, which leads to an active basis model called the Sparse-land model, where the images share the same set of selected wavelet elements, forming a linear basis for restoring the blurred image.

  4. A Jones matrix formalism for simulating 3D Polarised Light Imaging of brain tissue

    CERN Document Server

    Menzel, Miriam; De Raedt, Hans; Reckfort, Julia; Amunts, Katrin; Axer, Markus

    2015-01-01

    The neuroimaging technique 3D Polarised Light Imaging (3D-PLI) provides a high-resolution reconstruction of nerve fibres in human post-mortem brains. The orientations of the fibres are derived from birefringence measurements of histological brain sections assuming that the nerve fibres - consisting of an axon and a surrounding myelin sheath - are uniaxial birefringent and that the measured optic axis is oriented in direction of the nerve fibres (macroscopic model). Although experimental studies support this assumption, the molecular structure of the myelin sheath suggests that the birefringence of a nerve fibre can be described more precisely by multiple optic axes oriented radially around the fibre axis (microscopic model). In this paper, we compare the use of the macroscopic and the microscopic model for simulating 3D-PLI by means of the Jones matrix formalism. The simulations show that the macroscopic model ensures a reliable estimation of the fibre orientations as long as the polarimeter does not resolve ...

  5. Twitter's tweet method modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper seeks to propose the concept of Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The following models have been developed for a Twitter marketing agent/company and tested in real circumstances and with real numbers. These models were finalized through a number of revisions and iterations of the design, develop, simulate, test and evaluate cycle. The paper also addresses the methods that best suit organized promotion through targeting on the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are authenticated by the management of the company organization. It implements system dynamics concepts of Twitter marketing method modelling and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, in order to maximize the profit of the company/agent.

  6. Modelling of microcracks image treated with fluorescent dye

    Science.gov (United States)

    Glebov, Victor; Lashmanov, Oleg U.

    2015-06-01

    The main reasons for catastrophes and accidents are a high level of equipment wear and violations of production technology. Methods of nondestructive testing are designed to detect defects in a timely manner and to prevent the breakdown of aggregates. These methods allow determining compliance of object parameters with technical requirements without destroying the object. This work discusses dye penetrant inspection, or liquid penetrant inspection (DPI or LPI), methods and a computer model of a microcrack image treated with fluorescent dye. Usually cracks in an image look like broken extended lines of small width (about 1 to 10 pixels) with ragged edges. The inspection method used allows detection of microcracks with a depth of about 10 or more micrometers. During the work, a mathematical model of an image of randomly located microcracks treated with fluorescent dye was created in the MATLAB environment. Background noise and distortions introduced by the optical system are considered in the model. The factors that influence the image are listed below: 1. Background noise, caused by bright light from external sources, which reduces contrast at object edges. 2. Noise on the image sensor: digital noise manifests itself in the form of randomly located points that differ in brightness and color. 3. Distortions caused by aberrations of the optical system: after passing through a real optical system, the homocentricity of the bundle of rays is violated, or homocentricity remains but the rays intersect at a point that does not coincide with the point of the ideal image. The stronger the influence of the above-listed factors, the worse the image quality and, therefore, the more difficult the analysis of the image for inspection of the item. The mathematical model is created using the following algorithm: at the beginning, the number of cracks to be modeled is entered from the keyboard. Then a point with a random position is chosen on the matrix whose size is
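    A toy version of such a crack-image model is sketched below (an illustration under assumed parameters, not the MATLAB model described above); it draws randomly placed broken line segments, adds a background gradient, blurs the result as a proxy for optical aberrations, and adds sensor noise:

```python
# Toy model of a fluorescent-dye microcrack image: broken lines + background + blur + noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def crack_image(n_cracks=5, size=256, seed=0):
    rng = np.random.default_rng(seed)
    img = np.zeros((size, size))
    for _ in range(n_cracks):
        x, y = rng.uniform(20, size - 20, 2)          # random starting point
        angle = rng.uniform(0, np.pi)                 # overall crack orientation
        for _ in range(rng.integers(80, 200)):        # grow a ragged, broken line
            angle += rng.normal(0, 0.15)              # direction jitter -> ragged edges
            x += np.cos(angle); y += np.sin(angle)
            if not (0 <= int(x) < size and 0 <= int(y) < size):
                break
            if rng.random() > 0.15:                   # ~15% gaps make the line "broken"
                img[int(y), int(x)] = 1.0
    yy = np.linspace(0, 1, size)[:, None]
    background = 0.2 + 0.1 * yy                       # external-light background gradient
    blurred = gaussian_filter(img + background, sigma=1.0)    # optical blur
    noisy = blurred + rng.normal(0, 0.02, (size, size))       # digital sensor noise
    return np.clip(noisy, 0, 1)

if __name__ == "__main__":
    print(crack_image().shape)
```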

  7. Misaligned Image Integration With Local Linear Model.

    Science.gov (United States)

    Baba, Tatsuya; Matsuoka, Ryo; Shirai, Keiichiro; Okuda, Masahiro

    2016-05-01

    We present a new image integration technique for a flash and long-exposure image pair to capture a dark scene without incurring blurring or noisy artifacts. Most existing methods require well-aligned images for the integration, which is often a burdensome restriction in practical use. We address this issue by locally transferring the colors of the flash images using a small fraction of the corresponding pixels in the long-exposure images. We formulate the image integration as a convex optimization problem with the local linear model. The proposed method makes it possible to integrate the color of the long-exposure image with the detail of the flash image without causing any harmful effects to its contrast, where we do not need perfect alignment between the images by virtue of our new integration principle. We show that our method successfully outperforms the state of the art in the image integration and reference-based color transfer for challenging misaligned data sets.

  8. Inferring the photometric and size evolution of galaxies from image simulations. I. Method

    Science.gov (United States)

    Carassou, Sébastien; de Lapparent, Valérie; Bertin, Emmanuel; Le Borgne, Damien

    2017-09-01

    Context. Current constraints on models of galaxy evolution rely on morphometric catalogs extracted from multi-band photometric surveys. However, these catalogs are altered by selection effects that are difficult to model, that correlate in non trivial ways, and that can lead to contradictory predictions if not taken into account carefully. Aims: To address this issue, we have developed a new approach combining parametric Bayesian indirect likelihood (pBIL) techniques and empirical modeling with realistic image simulations that reproduce a large fraction of these selection effects. This allows us to perform a direct comparison between observed and simulated images and to infer robust constraints on model parameters. Methods: We use a semi-empirical forward model to generate a distribution of mock galaxies from a set of physical parameters. These galaxies are passed through an image simulator reproducing the instrumental characteristics of any survey and are then extracted in the same way as the observed data. The discrepancy between the simulated and observed data is quantified, and minimized with a custom sampling process based on adaptive Markov chain Monte Carlo methods. Results: Using synthetic data matching most of the properties of a Canada-France-Hawaii Telescope Legacy Survey Deep field, we demonstrate the robustness and internal consistency of our approach by inferring the parameters governing the size and luminosity functions and their evolutions for different realistic populations of galaxies. We also compare the results of our approach with those obtained from the classical spectral energy distribution fitting and photometric redshift approach. Conclusions: Our pipeline infers efficiently the luminosity and size distribution and evolution parameters with a very limited number of observables (three photometric bands). When compared to SED fitting based on the same set of observables, our method yields results that are more accurate and free from

  9. [Simulation of image multi-spectrum using field measured endmember spectrum].

    Science.gov (United States)

    Zhang, Ting; Ding, Jian-li; Wang, Fei

    2010-11-01

    The spectral characteristics of landscapes are fundamental to remote sensing applications and play an important role in quantitative remote sensing analysis. However, in spectrum-based remote sensing applications, differences in measurement scale and instrument resolution produce serious discrepancies in the spectral curves and reflectances obtained for the same landscape, which makes quantitative retrieval and spectral information extraction difficult. This paper first describes the imaging simulation principles for optical images and proposes a method that uses field-measured endmember spectra of high spectral resolution to simulate the spectra of multispectral images with lower spectral resolution. Taking the delta oasis of the Weigan and Kuqa rivers, located in the northern Tarim Basin, as the study area, and choosing vegetation and soil as the study objects, the simulation from field-measured endmember spectra to multispectral bands was carried out using the spectral response functions of AVNIR-2, and statistical analysis showed a strong correlation between the simulated multispectral data and the AVNIR-2 pixel spectra. Finally, a linear model was set up to accomplish the quantitative transformation from the endmember scale to the pixel scale. The results of this study are of practical significance for quantitative remote sensing applications.
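    The band-simulation step amounts to weighting the field-measured spectrum by each band's spectral response function. The sketch below is illustrative only; the Gaussian response functions, band centres and toy spectrum are assumptions, not the actual AVNIR-2 responses:

```python
# Simulate broad-band reflectances from a high-resolution endmember spectrum.
import numpy as np

def band_reflectance(wavelengths_nm, reflectance, centres_nm, fwhm_nm):
    """Weight the spectrum by a Gaussian spectral response per band and normalize."""
    bands = []
    for c, w in zip(centres_nm, fwhm_nm):
        sigma = w / 2.355                                   # FWHM -> standard deviation
        srf = np.exp(-0.5 * ((wavelengths_nm - c) / sigma) ** 2)
        bands.append(np.trapz(srf * reflectance, wavelengths_nm) /
                     np.trapz(srf, wavelengths_nm))
    return np.array(bands)

if __name__ == "__main__":
    wl = np.arange(400.0, 1000.0, 1.0)                        # field spectrometer grid (nm)
    refl = 0.05 + 0.4 / (1.0 + np.exp(-(wl - 720.0) / 15.0))  # toy vegetation-like spectrum
    centres = [460.0, 560.0, 650.0, 825.0]                    # assumed band centres (nm)
    fwhm = [80.0, 80.0, 80.0, 110.0]                          # assumed band widths (nm)
    print(band_reflectance(wl, refl, centres, fwhm))
```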

  10. Toward a real-time simulation of ultrasound image sequences based on a 3-D set of moving scatterers.

    Science.gov (United States)

    Marion, Adrien; Vray, Didier

    2009-10-01

    Data simulation is an important research tool to evaluate algorithms. Two types of methods are currently used to simulate medical ultrasound data: those based on acoustic models and those based on convolution models. The simulation of ultrasound data sequences is very time-consuming. In addition, many applications require accounting for the out-of-plane motion induced by the 3-D displacement of scatterers. The purpose of this paper is to propose a model adapted to a fast simulation of ultrasonic data sequences with 3-D moving scatterers. Our approach is based on the convolution model. The scatterers are moved in a 3-D continuous medium between each pair of images and then projected onto the imaging plane before being convolved. This paper discusses the practical implementation of the convolution that can be performed directly or after a grid approximation. The grid approximation convolution is obviously faster than the direct convolution but generates errors resulting from the approximation to the grid's nodes. We provide the analytical expression of these errors and then define 2 intensity-based criteria to quantify them as a function of the spatial sampling. The simulation of an image requires less than 2 s with oversampling, thus reducing these errors. The simulation model is validated with first- and second-order statistics. The positions of the scatterers at each imaging time can be provided by a displacement model. An example applied to flow imaging is proposed. Several cases are used to show that this displacement model provides realistic data. It is validated with speckle tracking, a well-known motion estimator in ultrasound imaging.
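
    A minimal sketch of the grid-approximation variant of the convolution model is given below: 3-D scatterers near the imaging plane are projected and binned to a grid, then convolved with a separable point spread function to form a simulated RF image. The PSF parameters, slice thickness and scatterer density are illustrative values, not those used in the paper.

        import numpy as np
        from scipy.signal import fftconvolve

        rng = np.random.default_rng(1)
        n_scat = 20000
        pos = rng.uniform(0.0, 1.0, size=(n_scat, 3))     # lateral x, elevation y, axial z (cm)
        amp = rng.normal(0.0, 1.0, n_scat)                # scatterer reflectivity

        # Keep scatterers within +/- 0.05 cm of the imaging plane y = 0.5 (slice thickness).
        in_plane = np.abs(pos[:, 1] - 0.5) < 0.05
        x, z, a = pos[in_plane, 0], pos[in_plane, 2], amp[in_plane]

        nx, nz = 128, 512                                 # grid approximation of the scatter map
        scatter_map = np.zeros((nz, nx))
        ix = np.clip((x * nx).astype(int), 0, nx - 1)
        iz = np.clip((z * nz).astype(int), 0, nz - 1)
        np.add.at(scatter_map, (iz, ix), a)

        # Axial PSF: Gaussian envelope modulated by the carrier; lateral PSF: Gaussian.
        zk = np.arange(-32, 33) / nz
        xk = np.arange(-8, 9) / nx
        psf = (np.exp(-0.5 * (zk[:, None] / 0.01) ** 2) * np.cos(2 * np.pi * 300 * zk[:, None])
               * np.exp(-0.5 * (xk[None, :] / 0.02) ** 2))

        rf = fftconvolve(scatter_map, psf, mode="same")   # simulated RF image
        envelope = np.abs(rf)                             # crude envelope (no Hilbert transform)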

  11. Simulation and modeling for the stand-off radiation detection system (SORDS) using GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Andrew S [Los Alamos National Laboratory; Wallace, Mark [Los Alamos National Laboratory; Galassi, Mark [Los Alamos National Laboratory; Mocko, Michal [Los Alamos National Laboratory; Palmer, David [Los Alamos National Laboratory; Schultz, Larry [Los Alamos National Laboratory; Tornga, Shawn [Los Alamos National Laboratory

    2009-01-01

    A Stand-Off Radiation Detection System (SORDS) is being developed through a joint effort by Raytheon, Los Alamos National Laboratory, Bubble Technology Industries, Radiation Monitoring Devices, and the Massachusetts Institute of Technology, for the Domestic Nuclear Detection Office (DNDO). The system is a mobile truck-based platform performing detection, imaging, and spectroscopic identification of gamma-ray sources. A Tri-Modal Imaging (TMI) approach combines active-mask coded aperture imaging, Compton imaging, and shadow imaging techniques. Monte Carlo simulation and modeling using the GEANT4 toolkit was used to generate realistic data for the development of imaging algorithms and associated software code.

  12. SAR Automatic Target Recognition Based on Numerical Scattering Simulation and Model-based Matching

    Directory of Open Access Journals (Sweden)

    Zhou Yu

    2015-12-01

    Full Text Available This study proposes a model-based Synthetic Aperture Radar (SAR) automatic target recognition algorithm. Scattering is computed offline using the laboratory-developed Bidirectional Analytic Ray Tracing software and the same system parameter settings as the Moving and Stationary Target Acquisition and Recognition (MSTAR) datasets. SAR images are then created from the simulated electromagnetic scattering data. Shape features are extracted from the measured and simulated images, and matches are then searched for. The algorithm is verified using three types of targets from MSTAR data and simulated SAR images, and it is shown that the proposed approach is fast, easy to implement and highly accurate.

  13. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    The results of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems are described, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like that? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.

  14. Modelling and simulation of affinity membrane adsorption.

    Science.gov (United States)

    Boi, Cristiana; Dimartino, Simone; Sarti, Giulio C

    2007-08-24

    A mathematical model for the adsorption of biomolecules on affinity membranes is presented. The model considers convection, diffusion and adsorption kinetics on the membrane module as well as the influence of dead-end volumes and lag times; an analysis of the flow distribution in the whole system is also included. The parameters used in the simulations were obtained from equilibrium and dynamic experimental data measured for the adsorption of human IgG on A2P-Sartoepoxy affinity membranes. The identification of a bi-Langmuir kinetic mechanism for the experimental system investigated was paramount for a correct process description, and the simulated breakthrough curves were in good agreement with the experimental data. The proposed model provides new insight into the phenomena involved in adsorption on affinity membranes and is a valuable tool to assess the use of membrane adsorbers in large-scale processes.
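
    As a hedged illustration of the bi-Langmuir kinetics mentioned above, the sketch below integrates adsorption on two independent site classes in a single well-mixed compartment; the full model in the paper additionally includes convection, diffusion, dead-end volumes and the module flow distribution. All rate constants, capacities and operating conditions are illustrative, not the fitted values for IgG on A2P-Sartoepoxy membranes.

        import numpy as np
        from scipy.integrate import solve_ivp

        qmax1, qmax2 = 20.0, 5.0     # site capacities (mg per mL membrane)
        ka1, ka2 = 0.5, 2.0          # adsorption rate constants (mL/mg/s)
        kd1, kd2 = 0.01, 0.002       # desorption rate constants (1/s)
        c_in, tau = 1.0, 30.0        # feed concentration (mg/mL) and residence time (s)

        def rhs(t, y):
            c, q1, q2 = y
            r1 = ka1 * c * (qmax1 - q1) - kd1 * q1   # site-1 net adsorption rate
            r2 = ka2 * c * (qmax2 - q2) - kd2 * q2   # site-2 net adsorption rate
            dc = (c_in - c) / tau - r1 - r2          # mass balance in the mixed volume
            return [dc, r1, r2]

        sol = solve_ivp(rhs, (0.0, 600.0), [0.0, 0.0, 0.0], max_step=1.0)
        breakthrough = sol.y[0] / c_in               # outlet concentration relative to feed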

  15. Multiphase reacting flows modelling and simulation

    CERN Document Server

    Marchisio, Daniele L

    2007-01-01

    The papers in this book describe the most widely applicable modeling approaches and are organized in six groups covering from fundamentals to relevant applications. In the first part, some fundamentals of multiphase turbulent reacting flows are covered. In particular the introduction focuses on basic notions of turbulence theory in single-phase and multi-phase systems as well as on the interaction between turbulence and chemistry. In the second part, models for the physical and chemical processes involved are discussed. Among other things, particular emphasis is given to turbulence modeling strategies for multiphase flows based on the kinetic theory for granular flows. Next, the different numerical methods based on Lagrangian and/or Eulerian schemes are presented. In particular the most popular numerical approaches of computational fluid dynamics codes are described (i.e., Direct Numerical Simulation, Large Eddy Simulation, and Reynolds-Averaged Navier-Stokes approach). The book will cover particle-based meth...

  16. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. The study of basic locomotion forms such as walking and running presented in this book is of particular interest due to the high demands on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are: modeling techniques for anthropomorphic bipedal walking systems; optimized walking motions for different objective functions; identification of objective functions from measurements; simulation and optimization approaches for humanoid robots; biologically inspired con...

  17. A Superbubble Feedback Model for Galaxy Simulations

    CERN Document Server

    Keller, B W; Benincasa, S M; Couchman, H M P

    2014-01-01

    We present a new stellar feedback model that reproduces superbubbles. Superbubbles from clustered young stars evolve quite differently to individual supernovae and are substantially more efficient at generating gas motions. The essential new components of the model are thermal conduction, sub-grid evaporation and a sub-grid multi-phase treatment for cases where the simulation mass resolution is insufficient to model the early stages of the superbubble. The multi-phase stage is short compared to superbubble lifetimes. Thermal conduction physically regulates the hot gas mass without requiring a free parameter. Accurately following the hot component naturally avoids overcooling. Prior approaches tend to heat too much mass, leaving the hot ISM below $10^6$ K and susceptible to rapid cooling unless ad-hoc fixes were used. The hot phase also allows feedback energy to correctly accumulate from multiple, clustered sources, including stellar winds and supernovae. We employ high-resolution simulations of a single star ...

  18. Advancing Material Models for Automotive Forming Simulations

    Science.gov (United States)

    Vegter, H.; An, Y.; ten Horn, C. H. L. J.; Atzema, E. H.; Roelofsen, M. E.

    2005-08-01

    Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulations are not capable of describing the plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM-code PAMSTAMP 2G. This applies to the strain hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high strength steels in the automotive industry requires a description of the plastic material behaviour of multiphase steels. The simplest variant is dual phase steel consisting of a ferritic and a martensitic phase. Multiphase materials also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Methods to predict the plastic behaviour of single-phase materials use a simple dislocation interaction model based on the formed cell structures only. At Corus, a new method is proposed to predict the plastic behaviour of multiphase materials; it has to take into account the hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional micro-structural information, such as the morphology and size of hard phase particles or grains, is necessary to derive the strain hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations through large variations in blank size and processing conditions. The tests are a powerful tool in optimising forming simulations.

  19. Modelling and simulation of thermal power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eborn, J.

    1998-02-01

    Mathematical modelling and simulation are important tools when dealing with engineering systems that today are becoming increasingly more complex. Integrated production and recycling of materials are trends that give rise to heterogeneous systems, which are difficult to handle within one area of expertise. Model libraries are an excellent way to package engineering knowledge of systems and units to be reused by those who are not experts in modelling. Many commercial packages provide good model libraries, but they are usually domain-specific and closed. Heterogeneous, multi-domain systems require open model libraries written in general purpose modelling languages. This thesis describes a model database for thermal power plants written in the object-oriented modelling language OMOLA. The models are based on first principles. Subunits describe volumes with pressure and enthalpy dynamics and flows of heat or different media. The subunits are used to build basic units such as pumps, valves and heat exchangers which can be used to build system models. Several applications are described: a heat recovery steam generator, equipment for juice blending, steam generation in a sulphuric acid plant and a condensing steam plate heat exchanger. Model libraries for industrial use must be validated against measured data. The thesis describes how parameter estimation methods can be used for model validation. Results from a case-study on parameter optimization of a non-linear drum boiler model show how the technique can be used. 32 refs, 21 figs

  20. Dynamics modeling and simulation of flexible airships

    Science.gov (United States)

    Li, Yuwen

    The resurgence of airships has created a need for dynamics models and simulation capabilities of these lighter-than-air vehicles. The focus of this thesis is a theoretical framework that integrates the flight dynamics, structural dynamics, aerostatics and aerodynamics of flexible airships. The study begins with a dynamics model based on a rigid-body assumption. A comprehensive computation of aerodynamic effects is presented, where the aerodynamic forces and moments are categorized into various terms based on different physical effects. A series of prediction approaches for different aerodynamic effects are unified and applied to airships. The numerical results of aerodynamic derivatives and the simulated responses to control surface deflection inputs are verified by comparing to existing wind-tunnel and flight test data. With the validated aerodynamics and rigid-body modeling, the equations of motion of an elastic airship are derived by the Lagrangian formulation. The airship is modeled as a free-free Euler-Bernoulli beam and the bending deformations are represented by shape functions chosen as the free-free normal modes. In order to capture the coupling between the aerodynamic forces and the structural elasticity, local velocity on the deformed vehicle is used in the computation of aerodynamic forces. Finally, with the inertial, gravity, aerostatic and control forces incorporated, the dynamics model of a flexible airship is represented by a single set of nonlinear ordinary differential equations. The proposed model is implemented as a dynamics simulation program to analyze the dynamics characteristics of the Skyship-500 airship. Simulation results are presented to demonstrate the influence of structural deformation on the aerodynamic forces and the dynamics behavior of the airship. The nonlinear equations of motion are linearized numerically for the purpose of frequency domain analysis and for aeroelastic stability analysis. The results from the latter for the

  1. The Accuratre Signal Model and Imaging Processing in Geosynchronous SAR

    Science.gov (United States)

    Hu, Cheng

    accurate slant range model and SAR operation principle, the accurate signal model, which is the foundation of high-accuracy imaging, will be derived analytically for GEO SAR. Because of the long synthetic aperture time, the linear trajectory model is no longer valid in GEO SAR. Therefore, based on the accurate signal model and using a high-order Taylor expansion technique, a novel series reversion method is proposed to obtain an accurate two-dimensional (2-D) point target frequency spectrum (PTFS) expression. Using the accurate 2-D PTFS, phase compensation, range migration correction and azimuth focusing can be implemented, and high-quality 2-D imaging results are then obtained. Finally, simulations are carried out to verify the correctness of the signal model derivation and of the proposed imaging algorithm.

  2. Image Optimization in Single Photon Emission Computed Tomography by Hardware Modifications with Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Bahreyni Toossi

    2010-06-01

    Full Text Available Introduction: In Single Photon Emission Computed Tomography (SPECT), the projection data used for image reconstruction are distorted by several factors, including attenuation and scattering of gamma rays, collimator structure, data acquisition method, organ motion, and washout of radiopharmaceuticals. All these make reconstruction of a quantitative SPECT image very difficult. Simulation of a SPECT system is a convenient method to assess the impact of these factors on image quality. Materials and Methods: The SIMIND Monte Carlo program was employed to simulate a Siemens E.CAM SPECT system. Verification of the simulation was performed by comparing the performance parameters of the system. The verified system was used for SPECT simulations of homogeneous and inhomogeneous voxelized phantoms in conjunction with hardware modifications. The resulting data were compared with those obtained from the simulated system without any modifications. Image quality was assessed by comparing the Structural SIMilarity index (SSIM), contrast, and resolution of the images. Results: The energy spectra acquired from both the simulated and real SPECT systems demonstrated similar energy peak regions. The resulting full-widths-at-half-maximum were 13.92 keV for the simulation and 13.58 keV for the experimental data, corresponding to energy resolutions of 9.95% and 9.61%, and with calculated sensitivities of 85.39 and 85.11 cps/MBq, respectively. Better performance parameters were obtained with a hardware-modified system constructed using a 0.944 cm thick NaI(Tl) crystal covered by a layer of 0.24 cm aluminum, a slab of 4.5 cm Pyrex as a backscattering medium, and a parallel hole collimator of Pb-Sb alloy with 2.405 cm thickness. Conclusion: The modeling of a Siemens E.CAM SPECT system was performed with the SIMIND Monte Carlo code. Results obtained with the code are in good agreement with experimental results. The findings demonstrate that the proposed hardware modifications

  3. A Fourier dimensionality reduction model for big data interferometric imaging

    Science.gov (United States)

    Vijay Kartik, S.; Carrillo, Rafael E.; Thiran, Jean-Philippe; Wiaux, Yves

    2017-06-01

    Data dimensionality reduction in radio interferometry can provide savings of computational resources for image reconstruction through reduced memory footprints and lighter computations per iteration, which is important for the scalability of imaging methods to the big data setting of the next-generation telescopes. This article sheds new light on dimensionality reduction from the perspective of the compressed sensing theory and studies its interplay with imaging algorithms designed in the context of convex optimization. We propose a post-gridding linear data embedding to the space spanned by the left singular vectors of the measurement operator, providing a dimensionality reduction below image size. This embedding preserves the null space of the measurement operator and hence its sampling properties are also preserved in light of the compressed sensing theory. We show that this can be approximated by first computing the dirty image and then applying a weighted subsampled discrete Fourier transform to obtain the final reduced data vector. This Fourier dimensionality reduction model ensures a fast implementation of the full measurement operator, essential for any iterative image reconstruction method. The proposed reduction also preserves the independent and identically distributed Gaussian properties of the original measurement noise. For convex optimization-based imaging algorithms, this is key to justify the use of the standard ℓ2-norm as the data fidelity term. Our simulations confirm that this dimensionality reduction approach can be leveraged by convex optimization algorithms with no loss in imaging quality relative to reconstructing the image from the complete visibility data set. Reconstruction results in simulation settings with no direction dependent effects or calibration errors show promising performance of the proposed dimensionality reduction. Further tests on real data are planned as an extension of the current work. matlab code implementing the
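
    The two-step reduction described above (dirty image, then weighted subsampled Fourier transform) can be sketched with a toy masked-FFT measurement operator standing in for a gridded interferometric operator. For this idealized 0/1 mask the singular-value weights are unity, so the reduced vector simply collects the sampled Fourier cells of the dirty image; the geometry, noise level and image are illustrative assumptions.

        import numpy as np

        n = 128
        rng = np.random.default_rng(2)
        mask = rng.random((n, n)) < 0.3                    # sampled Fourier cells (uv coverage)
        x_true = np.zeros((n, n))
        x_true[40:60, 50:90] = 1.0                         # toy sky

        vis = np.fft.fft2(x_true)[mask]                    # "observed" gridded visibilities
        vis = vis + 0.1 * (rng.normal(size=vis.shape) + 1j * rng.normal(size=vis.shape))

        # Step 1: dirty image = adjoint of the masked Fourier measurement operator.
        grid = np.zeros((n, n), dtype=complex)
        grid[mask] = vis
        dirty = np.fft.ifft2(grid)

        # Step 2: weighted, subsampled FFT of the dirty image; with a realistic gridding
        # operator the weights would be the inverse singular values rather than one.
        reduced = np.fft.fft2(dirty)[mask]                 # reduced data vector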

  4. Simulation-based joint estimation of body deformation and elasticity parameters for medical image analysis.

    Science.gov (United States)

    Lee, Huai-Ping; Foskey, Mark; Niethammer, Marc; Krajcevski, Pavel; Lin, Ming

    2012-11-01

    Estimation of tissue stiffness is an important means of noninvasive cancer detection. Existing elasticity reconstruction methods usually depend on a dense displacement field (inferred from ultrasound or MR images) and known external forces. Many imaging modalities, however, cannot provide details within an organ and therefore cannot provide such a displacement field. Furthermore, force exertion and measurement can be difficult for some internal organs, making boundary forces another missing parameter. We propose a general method for estimating elasticity and boundary forces automatically using an iterative optimization framework, given the desired (target) output surface. During the optimization, the input model is deformed by the simulator, and an objective function based on the distance between the deformed surface and the target surface is minimized numerically. The optimization framework does not depend on a particular simulation method and is therefore suitable for different physical models. We show a positive correlation between clinical prostate cancer stage (a clinical measure of severity) and the recovered elasticity of the organ. Since the surface correspondence is established, our method also provides a non-rigid image registration, where the quality of the deformation fields is guaranteed, as they are computed using a physics-based simulation.
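
    A minimal sketch of the optimization loop described above is given below, with a toy quadratic "simulator" standing in for the physics-based finite-element solver; the surface descriptor, the parameterization (one stiffness and one boundary force) and the target values are illustrative assumptions.

        import numpy as np
        from scipy.optimize import minimize

        target_surface = np.array([1.2, 0.96, 1.1])          # toy target surface descriptor

        def simulate_deformation(params):
            """Stand-in for the soft-tissue simulator: returns a deformed-surface descriptor."""
            stiffness, force = params
            r = force / stiffness
            return np.array([1.0 + r, 1.0 - 0.2 * r, 1.0 + 0.5 * r])

        def objective(params):
            deformed = simulate_deformation(params)
            return np.sum((deformed - target_surface) ** 2)  # surface-to-surface distance

        result = minimize(objective, x0=np.array([1.0, 0.5]), method="Nelder-Mead")
        estimated_stiffness, estimated_force = result.x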

  5. Mathematical models and numerical simulation in electromagnetism

    CERN Document Server

    Bermúdez, Alfredo; Salgado, Pilar

    2014-01-01

    The book provides basic support for a master's course in electromagnetism oriented to numerical simulation. The main goal of the book is for the reader to know the boundary-value problems of partial differential equations that should be solved in order to perform computer simulation of electromagnetic processes. Moreover, it includes a part devoted to electric circuit theory based on ordinary differential equations. The book is mainly oriented to electrical engineering applications, going from the general to the specific, namely, from the full Maxwell's equations to the particular cases of electrostatics, direct current, magnetostatics and eddy current models. Apart from standard exercises related to analytical calculus, the book includes some others oriented to real-life applications solved with the free simulation software MaxFEM.

  6. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity requires a sustained course of action that is often characterized by a degree of uncertainty or insecurity regarding the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods and that covers all the issues relevant to management decision analysis over a realistic economic horizon. Often in such cases, it is considered that the simulation technique is the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  7. Deep Drawing Simulations With Different Polycrystalline Models

    Science.gov (United States)

    Duchêne, Laurent; de Montleau, Pierre; Bouvier, Salima; Habraken, Anne Marie

    2004-06-01

    The goal of this research is to study the anisotropic material behavior during forming processes, represented by both complex yield loci and kinematic-isotropic hardening models. A first part of this paper describes the main concepts of the `Stress-strain interpolation' model that has been implemented in the non-linear finite element code Lagamine. This model consists of a local description of the yield locus based on the texture of the material through the full constraints Taylor's model. The texture evolution due to plastic deformations is computed throughout the FEM simulations. This `local yield locus' approach was initially linked to the classical isotropic Swift hardening law. Recently, a more complex hardening model was implemented: the physically-based microstructural model of Teodosiu. It takes into account intergranular heterogeneity due to the evolution of dislocation structures, that affects isotropic and kinematic hardening. The influence of the hardening model is compared to the influence of the texture evolution thanks to deep drawing simulations.

  8. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we try to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them. It uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, is oriented to the Facebook platform, and has been tested in real circumstances. It is finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are authenticated by the management of the company organization. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which means bringing in new customers, keeping the interest of existing customers and delivering traffic to its website.

  9. Towards Better Coupling of Hydrological Simulation Models

    Science.gov (United States)

    Penton, D.; Stenson, M.; Leighton, B.; Bridgart, R.

    2012-12-01

    Standards for model interoperability and scientific workflow software provide techniques and tools for coupling hydrological simulation models. However, model builders are yet to realize the benefits of these and continue to write ad hoc implementations and scripts. Three case studies demonstrate different approaches to coupling models, the first using tight interfaces (OpenMI), the second using a scientific workflow system (Trident) and the third using a tailored execution engine (Delft Flood Early Warning System - Delft-FEWS). No approach was objectively better than any other approach. The foremost standard for coupling hydrological models is the Open Modeling Interface (OpenMI), which defines interfaces for models to interact. An implementation of the OpenMI standard involves defining interchange terms and writing a .NET/Java wrapper around the model. An execution wrapper such as OatC.GUI or Pipistrelle executes the models. The team built two OpenMI implementations for eWater Source river system models. Once built, it was easy to swap river system models. The team encountered technical challenges with versions of the .Net framework (3.5 calling 4.0) and with the performance of the execution wrappers when running daily simulations. By design, the OpenMI interfaces are general, leaving significant decisions around the semantics of the interfaces to the implementer. Increasingly, scientific workflow tools such as Kepler, Taverna and Trident are able to replace custom scripts. These tools aim to improve the provenance and reproducibility of processing tasks. In particular, Taverna and the myExperiment website have had success making many bioinformatics workflows reusable and sharable. The team constructed Trident activities for hydrological software including IQQM, REALM and eWater Source. They built an activity generator for model builders to build activities for particular river systems. The models were linked at a simulation level, without any daily time

  10. A superbubble feedback model for galaxy simulations

    Science.gov (United States)

    Keller, B. W.; Wadsley, J.; Benincasa, S. M.; Couchman, H. M. P.

    2014-08-01

    We present a new stellar feedback model that reproduces superbubbles. Superbubbles from clustered young stars evolve quite differently to individual supernovae and are substantially more efficient at generating gas motions. The essential new components of the model are thermal conduction, subgrid evaporation and a subgrid multiphase treatment for cases where the simulation mass resolution is insufficient to model the early stages of the superbubble. The multiphase stage is short compared to superbubble lifetimes. Thermal conduction physically regulates the hot gas mass without requiring a free parameter. Accurately following the hot component naturally avoids overcooling. Prior approaches tend to heat too much mass, leaving the hot interstellar medium (ISM) below 10⁶ K and susceptible to rapid cooling unless ad hoc fixes were used. The hot phase also allows feedback energy to correctly accumulate from multiple, clustered sources, including stellar winds and supernovae. We employ high-resolution simulations of a single star cluster to show the model is insensitive to numerical resolution, unresolved ISM structure and suppression of conduction by magnetic fields. We also simulate a Milky Way analogue and a dwarf galaxy. Both galaxies show regulated star formation and produce strong outflows.

  11. Infrared decoy and obscurant modelling and simulation for ship protection

    Science.gov (United States)

    Butters, Brian; Nicholls, Edgar; Walmsley, Roy; Ayling, Richard

    2011-11-01

    Imaging seekers used in modern Anti Ship Missiles (ASMs) use a variety of counter countermeasure (CCM) techniques including guard gates and aspect ratio assessment in order to counter the use of IR decoys. In order to improve the performance of EO/IR countermeasures it is necessary to accurately configure and place the decoys using a launcher that is trainable in azimuth and elevation. Control of the launcher, decoy firing times and burst sequences requires the development of algorithms based on multi-dimensional solvers. The modelling and simulation used to derive the launcher algorithms is described including the countermeasure, threat, launcher and ship models. The launcher model incorporates realistic azimuth and elevation rates with limits on azimuth and elevation arcs of fire. A Navier Stokes based model of the IR decoy includes thermal buoyancy, cooling of the IR smoke and its extinction properties. All of these factors affect the developing size, shape and radiance of the decoy. The hot smoke also influences the performance of any co-located chaff or other obscurant material. Typical simulations are described against generic imaging ASM seekers using shape discrimination or a guard gate.

  12. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for the isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  13. Simulation modelling of fynbos ecosystems: Systems analysis and conceptual models

    CSIR Research Space (South Africa)

    Kruger, FJ

    1985-03-01

    Full Text Available This report outlines progress with the development of computer based dynamic simulation models for ecosystems in the fynbos biome. The models are planned to run on a portable desktop computer with 500 kbytes of memory, extended BASIC language...

  14. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of the system design and reducing the time-to-market. In most embedded control system designs, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  15. Study on 3D simulation of wave fields in acoustic reflection image logging

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Borehole acoustic reflection imaging logging is a newly developed acoustic logging method that has attracted much interest. The converted and reflected waves used for imaging are usually mixed with borehole guided waves and are therefore difficult to identify clearly. To improve the downhole tool design and develop more sophisticated data processing and interpretation algorithms, studies on precise numerical modeling of the wave fields in acoustic reflection imaging logging are necessary and critical. This paper develops a parallelized 3D finite difference (3DFD) scheme with a non-uniform staggered grid and PML absorbing boundaries to simulate the acoustic wave fields in isotropic and anisotropic formations. Applications of this scheme to typical cases of isotropic and anisotropic formations, and comparison with results from published analytical solutions, have demonstrated the validity and efficiency of the scheme. Higher accuracy and lower computational cost (3.5 times faster than conventional schemes) have been achieved for modeling such complex wave fields with a 60 dB dynamic range at higher frequency (10 kHz). This simulation program provides a quantitative analytical means for studying the acoustic reflection imaging tool and for developing the data processing and interpretation methods.
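
    As a heavily simplified illustration of the staggered-grid finite-difference idea used here, the sketch below advances a 1-D velocity-stress (pressure) scheme with a Ricker-like source; the paper's parallelized 3-D scheme, non-uniform grid and PML absorbing boundaries are not reproduced. Medium and source parameters are illustrative.

        import numpy as np

        nx, dx, dt, nt = 400, 0.05, 1.0e-5, 2000     # grid points, spacing (m), time step (s), steps
        rho = np.full(nx, 1000.0)                     # density (kg/m^3)
        K = np.full(nx, 2.25e9)                       # bulk modulus (Pa), water-like (c = 1500 m/s)
        v = np.zeros(nx + 1)                          # particle velocity on the half grid
        p = np.zeros(nx)                              # pressure on the full grid
        src = nx // 2

        for it in range(nt):
            # Velocity update from the pressure gradient (staggered by half a cell).
            v[1:-1] -= dt / (rho[:-1] * dx) * (p[1:] - p[:-1])
            # Pressure update from the velocity divergence.
            p -= dt * K / dx * (v[1:] - v[:-1])
            # Ricker-like source around 10 kHz injected as a pressure term.
            tau = (it * dt - 2.5e-4) * 2.0 * np.pi * 1.0e4
            p[src] += (1.0 - 0.5 * tau ** 2) * np.exp(-0.25 * tau ** 2) * dt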

  16. Analytical modeling of printed metasurface cavities for computational imaging

    Science.gov (United States)

    F. Imani, Mohammadreza; Sleasman, Timothy; Gollub, Jonah N.; Smith, David R.

    2016-10-01

    We derive simple analytical expressions to model the electromagnetic response of an electrically large printed cavity. The analytical model is then used to develop printed cavities for microwave imaging purposes. The proposed cavity is excited by a cylindrical source and has boundaries formed by subwavelength metallic cylinders (vias) placed at subwavelength distances apart. Given their small size, the electric currents induced on the vias are assumed to have no angular dependence. Applying this approximation simplifies the electromagnetic problem to a matrix equation which can be solved to directly compute the electric current induced on each via. Once the induced currents are known, the electromagnetic field inside the cavity can be computed for every location. We verify the analytical model by comparing its prediction to full-wave simulations. To utilize this cavity in imaging settings, we perforate one side of the printed cavity with radiative slots such that they act as the physical layer of a computational imaging system. An analytical approximation for the slots is also developed, enabling us to obtain estimates of the cavity performance in imaging scenarios. This ability allows us to make informed decisions on the design of the printed metasurface cavity. The utility of the proposed model is further highlighted by demonstrating high-quality experimental imaging; performance metrics, which are consistent between theory and experiment, are also estimated.
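
    The matrix equation for the via currents can be illustrated with a 2-D method-of-moments-style sketch: each electrically small via carries an angle-independent line current, and enforcing a vanishing total field at every via surface yields a dense linear system. The ring geometry, the thin-wire self term and the dropped constant prefactors (which cancel between the matrix and the excitation) are illustrative simplifications, not the paper's exact formulation.

        import numpy as np
        from scipy.special import hankel2

        k = 2 * np.pi / 0.03           # free-space wavenumber for a 3 cm wavelength
        a = 2e-4                       # via radius (m), electrically small
        n_vias = 120
        theta = 2 * np.pi * np.arange(n_vias) / n_vias
        vias = 0.05 * np.column_stack([np.cos(theta), np.sin(theta)])   # 5 cm radius cavity

        # Coupling matrix: off-diagonal terms couple via m to via n; the diagonal uses
        # the thin-wire self term evaluated at the via radius.
        d = np.linalg.norm(vias[:, None, :] - vias[None, :, :], axis=-1)
        Z = hankel2(0, k * np.where(d > 0, d, a))

        # Incident field of a line source at the cavity centre, sampled at the vias.
        e_inc = hankel2(0, k * np.linalg.norm(vias, axis=1))
        currents = np.linalg.solve(Z, -e_inc)       # induced current on each via

        def total_field(points):
            """Incident plus scattered field at observation points inside the cavity."""
            r_src = np.linalg.norm(points, axis=1)
            r_via = np.linalg.norm(points[:, None, :] - vias[None, :, :], axis=-1)
            return hankel2(0, k * r_src) + hankel2(0, k * r_via) @ currents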

  17. eShopper modeling and simulation

    Science.gov (United States)

    Petrushin, Valery A.

    2001-03-01

    The advent of e-commerce gives an opportunity to shift the paradigm of customer communication into a highly interactive mode. The new generation of commercial Web servers, such as Blue Martini's server, combines the collection of data on customer behavior with real-time processing and dynamic tailoring of a feedback page. New opportunities for direct product marketing and cross selling are arriving. The key problem is what kind of information do we need to achieve these goals, or in other words, how do we model the customer? The paper is devoted to customer modeling and simulation. The focus is on modeling an individual customer. The model is based on the customer's transaction data, click stream data, and demographics. The model includes the hierarchical profile of a customer's preferences for different types of products and brands; consumption models for the different types of products; the current focus, trends, and stochastic models for time intervals between purchases; product affinity models; and some generalized features, such as purchasing power, sensitivity to advertising, price sensitivity, etc. This type of model is used for predicting the date of the next visit, overall spending, and spending for different types of products and brands. For some types of stores (for example, a supermarket) and stable customers, it is possible to forecast the shopping lists rather accurately. The forecasting techniques are discussed. The forecasting results can be used for on-line direct marketing, customer retention, and inventory management. The customer model can also be used as a generative model for simulating the customer's purchasing behavior in different situations and for estimating a customer's features.

  18. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique

    Directory of Open Access Journals (Sweden)

    Toru Higaki

    2017-08-01

    Full Text Available This article describes a quantitative evaluation of the visualization of small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm made with a 3D printer were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.

  19. Computing the total atmospheric refraction for real-time optical imaging sensor simulation

    Science.gov (United States)

    Olson, Richard F.

    2015-05-01

    Fast and accurate computation of light path deviation due to atmospheric refraction is an important requirement for real-time simulation of optical imaging sensor systems. A large body of existing literature covers various methods for application of Snell's Law to the light path ray tracing problem. This paper provides a discussion of the adaptation to real-time simulation of atmospheric refraction ray tracing techniques used in mid-1980s LOWTRAN releases. The refraction ray trace algorithm published in a LOWTRAN-6 technical report by Kneizys et al. has been coded in MATLAB for development, and in C language for simulation use. To this published algorithm we have added tuning parameters for variable path segment lengths, and extensions for Earth grazing and exoatmospheric "near Earth" ray paths. Model atmosphere properties used to exercise the refraction algorithm were obtained from tables published in another LOWTRAN-6 related report. The LOWTRAN-6 based refraction model is applicable to atmospheric propagation at wavelengths in the IR and visible bands of the electromagnetic spectrum. It has been used during the past two years by engineers at the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) in support of several advanced imaging sensor simulations. Recently, a faster (but sufficiently accurate) method using Gauss-Chebyshev Quadrature integration for evaluating the refraction integral was adopted.
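
    Gauss-Chebyshev quadrature applied to an ordinary (non-weighted) integral, as used here for the refraction integral, can be sketched as follows; the exponential refractivity profile is an illustrative integrand, not the LOWTRAN-6 one.

        import numpy as np

        def gauss_chebyshev(f, a, b, n=16):
            """Approximate the ordinary integral of f over [a, b] with Gauss-Chebyshev
            (first kind) nodes, dividing out the implicit 1/sqrt(1 - x^2) weight."""
            i = np.arange(1, n + 1)
            x = np.cos((2 * i - 1) * np.pi / (2 * n))      # nodes on (-1, 1)
            t = 0.5 * (b - a) * x + 0.5 * (b + a)          # mapped to [a, b]
            return 0.5 * (b - a) * np.pi / n * np.sum(f(t) * np.sqrt(1.0 - x ** 2))

        # Toy refractivity profile N(h) (h in km) integrated over a 0-10 km layer.
        refractivity = lambda h: 77.6e-6 * 1013.25 * np.exp(-h / 8.0) / 288.15
        layer_integral = gauss_chebyshev(refractivity, 0.0, 10.0, n=32)
        print(layer_integral)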

  20. NUMERICAL MODEL APPLICATION IN ROWING SIMULATOR DESIGN

    Directory of Open Access Journals (Sweden)

    Petr Chmátal

    2016-04-01

    Full Text Available The aim of the research was to carry out a hydraulic design of a rowing/sculling and paddling simulator. Nowadays there are two main approaches to simulator design. The first one uses static water with no artificial movement and relies on specially cut oars to provide the same resistance in the water. The second approach, on the other hand, uses pumps or similar devices to force the water to circulate, but both designs share many problems. Such problems affect already built facilities and can be summarized as an unrealistic feel, unwanted turbulent flow and a poor velocity profile. Therefore, the goal was to design a new rowing simulator that would provide nature-like conditions for the racers and an unmatched experience. In order to accomplish this challenge, it was decided to use in-depth numerical modeling to solve the hydraulic problems. The general measures for the design were taken in accordance with the space available in the simulator's housing. The entire research was coordinated with other stages of the construction using BIM. The detailed geometry was designed using a numerical model in Ansys Fluent and parametric auto-optimization tools, which led to minimal negative hydraulic phenomena and decreased investment and operational costs due to the reduced hydraulic losses in the system.

  1. Dynamic simulation of viscoelastic soft tissues in harmonic motion imaging application.

    Science.gov (United States)

    Shan, Baoxiang; Kogit, Megan L; Pelegri, Assimina A

    2008-10-20

    A finite element model was built to simulate the dynamic behavior of soft tissues subjected to sinusoidal excitation during harmonic motion imaging. In this study, soft tissues and tissue-like phantoms were modeled as isotropic, viscoelastic, and nearly incompressible media. A 3D incompressible mixed u-p element of eight nodes, S1P0, was developed to accurately calculate the stiffness matrix for soft tissues. The finite element equations of motion were solved using the Newmark method. The Voigt description for tissue viscosity was applied to estimate the relative viscous coefficient from the phase shift between the response and excitation in a harmonic case. After validating our model via ANSYS simulation and experiments, a MATLAB finite element program was then employed to explore the effect of excitation location, viscosity, and multiple frequencies on the dynamic displacement at the frequency of interest.
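
    The Voigt-model step mentioned above (estimating the viscous coefficient from the phase shift between excitation and response) follows from tan(phi) = omega * eta / E under harmonic loading. The sketch below recovers eta from two synthetic sinusoidal signals; the frequency, modulus and signal construction are illustrative assumptions.

        import numpy as np

        f = 50.0                       # excitation frequency (Hz)
        w = 2.0 * np.pi * f
        E = 4.0e3                      # elastic modulus (Pa), assumed known
        eta_true = 2.0                 # Pa*s, value to recover
        t = np.linspace(0.0, 0.2, 4000, endpoint=False)   # exactly 10 periods

        excitation = np.sin(w * t)                                  # applied strain (unit amplitude)
        response = np.sin(w * t + np.arctan(w * eta_true / E))      # stress, normalized amplitude

        # Phase shift at the drive frequency from the Fourier components.
        spec_exc = np.fft.rfft(excitation)
        spec_resp = np.fft.rfft(response)
        k = np.argmax(np.abs(spec_exc))                             # bin of the drive frequency
        phase = np.angle(spec_resp[k]) - np.angle(spec_exc[k])

        eta_est = E * np.tan(phase) / w                             # Voigt relation, ~2.0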

  2. A Deep Generative Deconvolutional Image Model

    Energy Technology Data Exchange (ETDEWEB)

    Pu, Yunchen; Yuan, Xin; Stevens, Andrew J.; Li, Chunyuan; Carin, Lawrence

    2016-05-09

    A deep generative model is developed for representation and analysis of images, based on a hierarchical convolutional dictionary-learning framework. Stochastic unpooling is employed to link consecutive layers in the model, yielding top-down image generation. A Bayesian support vector machine is linked to the top-layer features, yielding max-margin discrimination. Deep deconvolutional inference is employed when testing, to infer the latent features, and the top-layer features are connected with the max-margin classifier for discrimination tasks. The model is efficiently trained using a Monte Carlo expectation-maximization (MCEM) algorithm; the algorithm is implemented on graphical processor units (GPU) to enable large-scale learning, and fast testing. Excellent results are obtained on several benchmark datasets, including ImageNet, demonstrating that the proposed model achieves results that are highly competitive with similarly sized convolutional neural networks.

  3. OASIS: a simulator to prepare and interpret remote imaging of solar system bodies

    Science.gov (United States)

    Jorda, L.; Spjuth, S.; Keller, H. U.; Lamy, P.; Llebaria, A.

    2010-01-01

    We present a new tool, called "OASIS" (Optimized Astrophysical Simulator for Imaging Systems), whose aim is to generate synthetic calibrated images of solar system bodies. OASIS has been developed to support the operations and the scientific interpretation of visible images acquired by the OSIRIS visible camera aboard the Rosetta spacecraft, but it can be used to create synthetic images taken by the visible imaging system of any spacecraft. OASIS takes as input the shape model of the object, in the form of triangular facets defining its surface, geometric parameters describing the position and orientation of the objects included in the scene and of the observer, and instrumental parameters describing the geometric and radiometric properties of the camera. The rendering of the object is performed in several steps which involve: (i) sorting the triangular facets in planes perpendicular to the direction of the light source and to the direction of the line-of-sight, (ii) tracing rays from a given facet to the light source and to the observer to check if it is illuminated and in view from the observer, (iii) calculating the intersection between the projected coordinates of the facets and the pixels of the image, and finally (iv) radiometrically calibrating the images. The pixels of the final image contain the expected signal from the object in digital numbers (DN). We show in the article examples of synthetic images of the asteroid (2867) Steins created with OASIS, both for the preparation of the flyby and for the scientific interpretation of the acquired images later on.

  4. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    Science.gov (United States)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S Environments and Infrastructure.

  5. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  6. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    Danish summary: This Ph.D. thesis deals with the modelling and simulation of human motion. The topics in this thesis have at least two things in common. First, they deal with human motion. Even though the developed models can also be used for other purposes, it is ....... Furthermore, it can be used with any soft-body simulation model, such as finite elements or mass-spring systems. • A control method for deformable bodies based on space-time optimization. The approach can be used to control the contraction of muscles in a muscle simulation....

  7. Computer Modelling and Simulation for Inventory Control

    Directory of Open Access Journals (Sweden)

    G.K. Adegoke

    2012-07-01

    Full Text Available This study concerns the role of computer simulation as a device for conducting scientific experiments on inventory control. The stores function utilizes a large share of the physical assets and engages a large share of the financial resources of a manufacturing outfit; therefore, there is a need for efficient inventory control. The reason is that inventory control reduces the cost of production and thereby facilitates the effective and efficient accomplishment of the production objectives of an organization. Some mathematical and statistical models were used to compute the Economic Order Quantity (EOQ). Test data were obtained from a manufacturing company and simulated. The results generated were used to predict a real-life situation and have been presented and discussed. The language of implementation for the three models is Turbo Pascal, due to its capability, generality and flexibility as a scientific programming language.
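
    The Economic Order Quantity at the core of such simulations is the classical square-root formula EOQ = sqrt(2DS/H) for annual demand D, ordering cost S and per-unit holding cost H. The sketch below is written in Python rather than the Turbo Pascal used in the study, and the figures are illustrative, not the company's data.

        import math

        def economic_order_quantity(annual_demand, ordering_cost, holding_cost_per_unit):
            """Order size minimizing the sum of annual ordering and holding costs."""
            return math.sqrt(2.0 * annual_demand * ordering_cost / holding_cost_per_unit)

        eoq = economic_order_quantity(annual_demand=12000, ordering_cost=50.0,
                                      holding_cost_per_unit=2.5)
        orders_per_year = 12000 / eoq
        print(round(eoq), round(orders_per_year, 1))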

  8. Model parameters for simulation of physiological lipids

    Science.gov (United States)

    McGlinchey, Nicholas

    2016-01-01

    Coarse grain simulation of proteins in their physiological membrane environment can offer insight across timescales, but requires a comprehensive force field. Parameters are explored for multicomponent bilayers composed of unsaturated lipids DOPC and DOPE, mixed‐chain saturation POPC and POPE, and anionic lipids found in bacteria: POPG and cardiolipin. A nonbond representation obtained from multiscale force matching is adapted for these lipids and combined with an improved bonding description of cholesterol. Equilibrating the area per lipid yields robust bilayer simulations and properties for common lipid mixtures with the exception of pure DOPE, which has a known tendency to form nonlamellar phase. The models maintain consistency with an existing lipid–protein interaction model, making the force field of general utility for studying membrane proteins in physiologically representative bilayers. © 2016 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:26864972

  9. A generalized framework unifying image registration and respiratory motion models and incorporating image reconstruction, for partial image data or full images.

    Science.gov (United States)

    McClelland, Jamie R; Modat, Marc; Arridge, Simon; Grimes, Helen; D'Souza, Derek; Thomas, David; O'Connell, Dylan; Low, Daniel; Kaza, Evangelia; Collins, David; Leach, Martin; Hawkes, David

    2017-02-14

    Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of 'partial' imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated.

  10. A generalized framework unifying image registration and respiratory motion models and incorporating image reconstruction, for partial image data or full images

    Science.gov (United States)

    McClelland, Jamie R.; Modat, Marc; Arridge, Simon; Grimes, Helen; D'Souza, Derek; Thomas, David; O'Connell, Dylan; Low, Daniel A.; Kaza, Evangelia; Collins, David J.; Leach, Martin O.; Hawkes, David J.

    2017-06-01

    Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of ‘partial’ imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated.

  11. Theory, Modeling and Simulation Annual Report 2000

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, David A.; Garrett, Bruce C.; Straatsma, Tp; Jones, Donald R.; Studham, Ronald S.; Harrison, Robert J.; Nichols, Jeffrey A.

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM&S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  12. Theory, Modeling and Simulation Annual Report 2000

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, David A; Garrett, Bruce C; Straatsma, TP; Jones, Donald R; Studham, Scott; Harrison, Robert J; Nichols, Jeffrey A

    2001-11-01

    This annual report describes the 2000 research accomplishments for the Theory, Modeling, and Simulation (TM and S) directorate, one of the six research organizations in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). EMSL is a U.S. Department of Energy (DOE) national scientific user facility and is the centerpiece of the DOE commitment to providing world-class experimental, theoretical, and computational capabilities for solving the nation's environmental problems.

  13. Catalog of Wargaming and Military Simulation Models

    Science.gov (United States)

    1992-02-07

    PROPONENT: USAF ASD, McDonnell Douglas Corp. POINT OF CONTACT: Photon Research Associates (Alias): Mr. Jeff Johnson, (619) 455-9741; McDonnell Douglas... POINT OF CONTACT: Dr. R. Johnson, (DSN) 295-1593 or (301) 295-1593. PURPOSE: The model provides simulation of airland activities in a theater of operations... training, and education. PROPONENT: J-8 Political Military Affairs Directorate. POINT OF CONTACT: LTC Steven G. Stainer. PURPOSE: RDSS is a system...

  14. Fully Adaptive Radar Modeling and Simulation Development

    Science.gov (United States)

    2017-04-01

    Organization (NATO) Sensors Electronics Technology (SET)-227 Panel on Cognitive Radar. The FAR M&S architecture developed in Phase I allows for... Air Force's previously developed radar M&S tools. This report is organized as follows. In Chapter 3, we provide an overview of the FAR framework...

  15. Difficulties with True Interoperability in Modeling & Simulation

    Science.gov (United States)

    2011-12-01

    Standards in M&S cover multiple layers of technical abstraction. There are middleware specifications, such as the High Level Architecture (HLA), defined in IEEE Standard 1516-2010, Modeling and Simulation (M&S) High Level Architecture (HLA) – Framework and Rules (IEEE Xplore Digital Library, 2010)... using different communication protocols being able to allow da...

  16. The infrared imaging spectrograph (IRIS) for TMT: sensitivities and simulations

    CERN Document Server

    Wright, Shelley A; Larkin, James E; Moore, Anna M; Crampton, David; Simard, Luc

    2010-01-01

    We present sensitivity estimates for point and resolved astronomical sources for the current design of the InfraRed Imaging Spectrograph (IRIS) on the future Thirty Meter Telescope (TMT). IRIS, with TMT's adaptive optics system, will achieve unprecedented point source sensitivities in the near-infrared (0.84 - 2.45 µm) when compared to systems on current 8-10 m ground-based telescopes. The IRIS imager, in 5 hours of total integration, will be able to perform a few percent photometry on 26 - 29 magnitude (AB) point sources in the near-infrared broadband filters (Z, Y, J, H, K). The integral field spectrograph, with a range of scales and filters, will achieve good signal-to-noise on 22 - 26 magnitude (AB) point sources with a spectral resolution of R=4,000 in 5 hours of total integration time. We also present simulated 3D IRIS data of resolved high-redshift star forming galaxies (1 < z < 5), illustrating the extraordinary potential of this instrument to probe the dynamics, assembly, and chemical abunda...
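
    The kind of arithmetic behind such sensitivity figures can be sketched as follows: an AB magnitude is converted to a flux density and then to a photon rate through a broadband filter. The collecting area, throughput and filter width used below are illustrative assumptions, not IRIS/TMT design values.

```python
import math

# Back-of-the-envelope sketch: detected photon rate from an AB magnitude
# through a broadband filter. All instrument numbers below are assumed.

H_PLANCK = 6.626e-34   # J s
JY = 1e-26             # W m^-2 Hz^-1

def photon_rate(m_ab, lambda_um, delta_lambda_um, collecting_area_m2, throughput):
    """Approximate detected photon rate (photons/s) for a point source."""
    nu = 3e8 / (lambda_um * 1e-6)                 # central frequency, Hz
    delta_nu = nu * delta_lambda_um / lambda_um   # filter width, Hz
    f_nu = 3631.0 * JY * 10 ** (-0.4 * m_ab)      # AB zero point: 3631 Jy
    energy_flux = f_nu * delta_nu                 # W m^-2 collected in the band
    return energy_flux * collecting_area_m2 * throughput / (H_PLANCK * nu)

# Example: a K-band-like filter (~2.2 um, ~0.3 um wide), a 25th magnitude
# source, ~650 m^2 of collecting area and 30% end-to-end throughput (assumed).
print(f"{photon_rate(25.0, 2.2, 0.3, 650.0, 0.3):.1f} photons/s")
```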

  17. Radiative Models of Sgr A* from GRMHD Simulations

    CERN Document Server

    Moscibrodzka, Monika; Dolence, Joshua C; Shiokawa, Hotaka; Leung, Po Kin

    2009-01-01

    Using flow models based on axisymmetric general relativistic magnetohydrodynamics (GRMHD) simulations, we construct radiative models for Sgr A*. Spectral energy distributions that include the effects of thermal synchrotron emission and absorption, and Compton scattering, are calculated using a Monte Carlo technique. Images are calculated using a ray-tracing scheme. All models are scaled so that the 230 GHz flux density is 3.4 Jy. The key model parameters are the dimensionless black hole spin a*, the inclination i, and the ion-to-electron temperature ratio Ti/Te. We find that: (1) models with Ti/Te=1 are inconsistent with the observed submillimeter spectral slope; (2) the X-ray flux is a strongly increasing function of a*; (3) the X-ray flux is a strongly increasing function of i; (4) 230 GHz image size is a complicated function of i, a*, and Ti/Te, but the Ti/Te = 10 models are generally large and at most marginally consistent with the 230 GHz VLBI data; (5) for models with Ti/Te=10 and i=85 deg the event hor...
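
    Purely as an illustration of the flux normalization mentioned above (all models are scaled so the 230 GHz flux density is 3.4 Jy): in the GRMHD models this is done by tuning the accretion rate, but a naive rescaling of a tabulated SED, as below with invented numbers, shows the bookkeeping.

```python
import numpy as np

# Illustrative only: rescale a tabulated model SED so that it passes through
# 3.4 Jy at 230 GHz. The sample SED values are invented, and in the actual
# GRMHD models the normalization is achieved by adjusting the accretion rate,
# which changes the SED nonlinearly.

freq_ghz = np.array([86.0, 145.0, 230.0, 345.0, 690.0])   # observing frequencies
model_flux_jy = np.array([2.1, 2.6, 2.9, 2.4, 1.1])       # hypothetical model SED

target_jy = 3.4
scale = target_jy / np.interp(230.0, freq_ghz, model_flux_jy)

scaled_sed = model_flux_jy * scale
print(dict(zip(freq_ghz, np.round(scaled_sed, 2))))
```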

  18. Evaluation of a Rapid Anisotropic Model for ECG Simulation

    Directory of Open Access Journals (Sweden)

    Simone Pezzuto

    2017-05-01

    Full Text Available State-of-the-art cardiac electrophysiology models that are able to deliver physiologically motivated activation maps and electrocardiograms (ECGs can only be solved on high-performance computing architectures. This makes it nearly impossible to adopt such models in clinical practice. ECG imaging tools typically rely on simplified models, but these neglect the anisotropic electric conductivity of the tissue in the forward problem. Moreover, their results are often confined to the heart-torso interface. We propose a forward model that fully accounts for the anisotropic tissue conductivity and produces the standard 12-lead ECG in a few seconds. The activation sequence is approximated with an eikonal model in the 3d myocardium, while the ECG is computed with the lead-field approach. Both solvers were implemented on graphics processing units and massively parallelized. We studied the numerical convergence and scalability of the approach. We also compared the method to the bidomain model in terms of ECGs and activation maps, using a simplified but physiologically motivated geometry and 6 patient-specific anatomies. The proposed methods provided a good approximation of activation maps and ECGs computed with a bidomain model, in only a few seconds. Both solvers scaled very well to high-end hardware. These methods are suitable for use in ECG imaging methods, and may soon become fast enough for use in interactive simulation tools.
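
    A toy version of the activation-map step conveys the idea: earliest arrival times from a stimulus site are computed on a grid with a local conduction velocity, here with Dijkstra's algorithm on a 2D isotropic grid. The paper solves an anisotropic eikonal model on 3D patient-specific anatomies on GPUs and then obtains the 12-lead ECG via lead fields; none of that is reproduced here, and all values below are assumptions.

```python
import heapq
import numpy as np

# Crude eikonal-style activation map: earliest arrival times from a stimulus
# site on a 2D grid with a local conduction velocity, solved with Dijkstra's
# algorithm on an 8-connected grid. Grid size and velocities are arbitrary.

def activation_map(velocity, source, h=1.0):
    ny, nx = velocity.shape
    times = np.full((ny, nx), np.inf)
    times[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if t > times[i, j]:
            continue
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == 0 and dj == 0:
                    continue
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx:
                    dist = h * np.hypot(di, dj)
                    # local travel time = distance / average conduction velocity
                    v = 0.5 * (velocity[i, j] + velocity[ni, nj])
                    nt = t + dist / v
                    if nt < times[ni, nj]:
                        times[ni, nj] = nt
                        heapq.heappush(heap, (nt, (ni, nj)))
    return times

vel = np.full((64, 64), 0.6)      # conduction velocity, mm/ms (assumed)
vel[20:40, 20:40] = 0.2           # a slow-conducting patch
act = activation_map(vel, (32, 0))
print(f"latest activation: {act.max():.1f} ms")
```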

  19. Evaluation of Image-Assisted Forest Monitoring: A Simulation

    Directory of Open Access Journals (Sweden)

    Francis A. Roesch

    2015-08-01

    Full Text Available Fiscal uncertainties can sometimes affect national continuous forest monitoring efforts. One solution of interest is to lengthen the time it takes to collect a “full set” of plot data from five to 10 years in order to reduce costs. Here, we investigate using ancillary information to partially offset this proposed solution’s negative effects. We focus our discussion on the corresponding number of years between measurements of each plot while we investigate how thoroughly the detrimental effects of the reduced sampling effort can be ameliorated with change estimates obtained from temporally-dense remotely-sensed images. We simulate measured plot data under four sampling error structures, and we simulate remotely-sensed change estimates under three reliability assumptions, integrated with assumptions about the additional unobserved growth resulting from the lengthened observation window. We investigate a number of estimation systems with respect to their ability to provide compatible annual estimates of the components of change during years spanned by at least half of the full set of plot observations. We show that auxiliary data with shorter observation intervals can contribute to a significant improvement in estimation.
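
    One generic way of letting image-based change estimates offset a sparser plot panel is inverse-variance weighting of the two estimates, sketched below with invented numbers; the paper evaluates several, more elaborate estimation systems under different sampling-error structures.

```python
# Minimal sketch: combine a design-based plot estimate of annual change with an
# image-based change estimate by inverse-variance weighting. This is only one
# generic estimator, not one of the paper's specific systems; all numbers are
# invented.

def inverse_variance_combine(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    return combined, 1.0 / total   # combined estimate and its variance

# Annual net growth (e.g. m^3/ha/yr): sparse plot panel vs. dense imagery.
plot_est, plot_var = 4.2, 0.90     # few plots measured this year -> noisy
image_est, image_var = 3.6, 0.25   # image-based estimate, precise but possibly biased

est, var = inverse_variance_combine([plot_est, image_est], [plot_var, image_var])
print(f"combined estimate: {est:.2f} (variance {var:.2f})")
```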

  20. The Halo Model of Origin Images

    DEFF Research Database (Denmark)

    Josiassen, Alexander; Lukas, Bryan A.; Whitwell, Gregory J.

    2013-01-01

    National origin has gained importance as a marketing tool for practitioners to sell their goods and services. However, because origin-image research has been troubled by several fundamental limitations, academia has become sceptical of the current status and strategic implications of the concept. The aim of this paper was threefold, namely, to provide a state-of-the-art review of origin-image research in marketing, to develop and empirically test a new origin-image model, and to present the implications of the study.

  1. Imaging of Simple Defects in Austenitic Steel Welds Using a Simulated Ultrasonic Array

    Science.gov (United States)

    Connolly, G. D.; Lowe, M. J. S.; Rokhlin, S. I.; Temple, J. A. G.

    2009-03-01

    The use of ultrasonic arrays has increased dramatically within recent years due to their ability to perform multiple types of inspection and due to the fact that phased arrays allow the immediate production of images within the structure through post-processing of received signals. These arrays offer potential advantages to the inspection of austenitic steel welds where, for reasons of safety and economics, it is important to be able to detect and size any crack-like defects that may occur during service or may have occurred during welding. This paper outlines the theory behind the generation of images of simple planar defects within a previously developed weld model. Images generated using fundamental ray-tracing theory and from finite element simulations by selected inspection procedures will be shown and compared.
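
    The image-formation step referred to above is, at its simplest, delay-and-sum beamforming over the full set of transmit-receive signals (the Total Focusing Method). The sketch below assumes a homogeneous, isotropic medium with constant sound speed and a synthetic point scatterer; the weld model in the paper additionally accounts for the anisotropic austenitic weld metal via ray tracing and finite elements, which this sketch ignores.

```python
import numpy as np

# Simplified Total Focusing Method (delay-and-sum) for a linear array in a
# homogeneous, isotropic medium. Geometry, sampling and the synthetic full
# matrix capture (FMC) data below are all assumptions.

c = 5900.0                     # longitudinal velocity in steel, m/s (nominal)
fs = 50e6                      # sampling frequency, Hz
n_el, pitch = 16, 0.6e-3
elem_x = (np.arange(n_el) - (n_el - 1) / 2) * pitch   # element x-positions, z = 0

# Synthetic FMC data: one point scatterer at (x, z) = (0, 20 mm).
n_samp = 4000
fmc = np.zeros((n_el, n_el, n_samp))
scat = np.array([0.0, 20e-3])
for tx in range(n_el):
    for rx in range(n_el):
        d = (np.hypot(scat[0] - elem_x[tx], scat[1]) +
             np.hypot(scat[0] - elem_x[rx], scat[1]))
        fmc[tx, rx, int(round(d / c * fs))] = 1.0

# Delay-and-sum over a pixel grid.
xs = np.linspace(-8e-3, 8e-3, 81)
zs = np.linspace(5e-3, 35e-3, 121)
image = np.zeros((len(zs), len(xs)))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        t = (np.hypot(x - elem_x, z)[:, None] + np.hypot(x - elem_x, z)[None, :]) / c
        idx = np.clip((t * fs).astype(int), 0, n_samp - 1)
        image[iz, ix] = np.abs(
            fmc[np.arange(n_el)[:, None], np.arange(n_el)[None, :], idx].sum())

peak = np.unravel_index(image.argmax(), image.shape)
print(f"peak at x = {xs[peak[1]]*1e3:.1f} mm, z = {zs[peak[0]]*1e3:.1f} mm")
```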

  2. Modelling interplanetary CMEs using magnetohydrodynamic simulations

    Directory of Open Access Journals (Sweden)

    P. J. Cargill

    Full Text Available The dynamics of Interplanetary Coronal Mass Ejections (ICMEs are discussed from the viewpoint of numerical modelling. Hydrodynamic models are shown to give a good zero-order picture of the plasma properties of ICMEs, but they cannot model the important magnetic field effects. Results from MHD simulations are shown for a number of cases of interest. It is demonstrated that the strong interaction of the ICME with the solar wind leads to the ICME and solar wind velocities being close to each other at 1 AU, despite their having very different speeds near the Sun. It is also pointed out that this interaction leads to a distortion of the ICME geometry, making cylindrical symmetry a dubious assumption for the CME field at 1 AU. In the presence of a significant solar wind magnetic field, the magnetic fields of the ICME and solar wind can reconnect with each other, leading to an ICME that has solar wind-like field lines. This effect is especially important when an ICME with the right sense of rotation propagates down the heliospheric current sheet. It is also noted that a lack of knowledge of the coronal magnetic field makes such simulations of little use in space weather forecasts that require knowledge of the ICME magnetic field strength.

    Key words. Interplanetary physics (interplanetary magnetic fields); Solar physics, astrophysics, and astronomy (flares and mass ejections); Space plasma physics (numerical simulation studies)

  3. Interactive Modelling and Simulation of Human Motion

    DEFF Research Database (Denmark)

    Engell-Nørregård, Morten Pol

    Danish summary: This PhD thesis deals with the modelling and simulation of human motion. The topics of this thesis have at least two things in common. First, they deal with human motion. Although the developed models can also be used for other purposes, the ... human joints, which exhibit both non-convexity and several degrees of freedom • A general and versatile model for the activation of soft bodies. The model can be used as an animation tool, but is equally well suited to simulating human muscles, since it satisfies the fundamental physical principles ... primary focus on modelling the human body. Second, they all use simulation as a tool for synthesizing motion and thereby creating animations. This is an important point, because it means that we are not only creating tools that animators can use to make entertaining...

  4. WiBro Mobility Simulation Model

    Directory of Open Access Journals (Sweden)

    Junaid Qayyum

    2011-09-01

    Full Text Available WiBro, or Wireless Broadband, is the newest variety of mobile wireless broadband access. WiBro technology is being developed by the Korean telecoms industry. It is based on the IEEE 802.16e (Mobile WiMAX) international standard. Korea-based fixed-line operators KT and SK Telecom were the first to be licensed by the South Korean government to provide WiBro commercially. Samsung demonstrated WiBro mobile phones and systems at the APEC IT Exhibition 2006. WiBro comprises two phases, namely WiBro Phase I and WiBro Phase II. Samsung Electronics has been contributing extensively to Korea's WiBro (Wireless Broadband) initiative as well as to the IEEE 802.16 standards. WiBro is a specific subset of the 802.16 standards, focusing on support for full mobility of wireless access systems with an OFDMA PHY interface. In this work, we have developed a simulation model of the WiBro system consisting of a set of Base Stations and Mobile Subscriber Stations using the OPNET Modeler. The simulation model has been used to evaluate effective MAC layer throughput, resource usage efficiency, QoS class differentiation, and system capacity and performance under various simulation scenarios.

  5. Progress in Modeling and Simulation of Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Turner, John A [ORNL

    2016-01-01

    Modeling and simulation of batteries, in conjunction with theory and experiment, are important research tools that offer opportunities for advancement of technologies that are critical to electric motors. The development of data from the application of these tools can provide the basis for managerial and technical decision-making. Together, these will continue to transform batteries for electric vehicles. This collection of nine papers presents the modeling and simulation of batteries and the continuing contribution being made to this impressive progress, including topics that cover: * Thermal behavior and characteristics * Battery management system design and analysis * Moderately high-fidelity 3D capabilities * Optimization Techniques and Durability As electric vehicles continue to gain interest from manufacturers and consumers alike, improvements in economy and affordability, as well as adoption of alternative fuel sources to meet government mandates are driving battery research and development. Progress in modeling and simulation will continue to contribute to battery improvements that deliver increased power, energy storage, and durability to further enhance the appeal of electric vehicles.

  6. Simulating an Optimizing Model of Currency Substitution

    Directory of Open Access Journals (Sweden)

    Leonardo Leiderman

    1992-03-01

    Full Text Available This paper reports simulations based on the parameter estimates of an intertemporal model of currency substitution under nonexpected utility obtained by Bufman and Leiderman (1991). Here we first study the quantitative impact of changes in the degree of dollarization and in the elasticity of currency substitution on government seigniorage. Then we examine whether the model can account for the comovement of consumption growth and assets' returns after the 1985 stabilization program, and in particular for the consumption boom of 1986-87. The results are generally encouraging for future applications of optimizing models of currency substitution to policy and practical issues.

  7. Simulation of MR angiography imaging for validation of cerebral arteries segmentation algorithms.

    Science.gov (United States)

    Klepaczko, Artur; Szczypiński, Piotr; Deistung, Andreas; Reichenbach, Jürgen R; Materka, Andrzej

    2016-12-01

    Accurate vessel segmentation of magnetic resonance angiography (MRA) images is essential for computer-aided diagnosis of cerebrovascular diseases such as stenosis or aneurysm. The ability of a segmentation algorithm to correctly reproduce the geometry of the arterial system should be expressed quantitatively and observer-independently to ensure objectivism of the evaluation. This paper introduces a methodology for validating vessel segmentation algorithms using a custom-designed MRA simulation framework. For this purpose, a realistic reference model of an intracranial arterial tree was developed based on a real Time-of-Flight (TOF) MRA data set. With this specific geometry blood flow was simulated and a series of TOF images was synthesized using various acquisition protocol parameters and signal-to-noise ratios. The synthesized arterial tree was then reconstructed using a level-set segmentation algorithm available in the Vascular Modeling Toolkit (VMTK). Moreover, to present versatile application of the proposed methodology, validation was also performed for two alternative techniques: a multi-scale vessel enhancement filter and the Chan-Vese variant of the level-set-based approach, as implemented in the Insight Segmentation and Registration Toolkit (ITK). The segmentation results were compared against the reference model. The accuracy in determining the vessels centerline courses was very high for each tested segmentation algorithm (mean error rate = 5.6% if using VMTK). However, the estimated radii exhibited deviations from ground truth values with mean error rates ranging from 7% up to 79%, depending on the vessel size, image acquisition and segmentation method. We demonstrated the practical application of the designed MRA simulator as a reliable tool for quantitative validation of MRA image processing algorithms that provides objective, reproducible results and is observer independent. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
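
    The quantitative comparison described above boils down to matching segmented centerline points against the reference model and reporting positional and radius error rates. A minimal sketch, using nearest-neighbour matching and synthetic arrays rather than the paper's exact protocol, is given below.

```python
import numpy as np

# Sketch of centerline/radius validation against a reference model. The
# matching rule and the synthetic arrays are illustrative assumptions.

def centerline_errors(ref_pts, ref_radii, seg_pts, seg_radii):
    """For each reference point, find the nearest segmented point and report
    the mean positional error (mm) and mean relative radius error (%)."""
    pos_err, rad_err = [], []
    for p, r in zip(ref_pts, ref_radii):
        d = np.linalg.norm(seg_pts - p, axis=1)
        j = int(d.argmin())
        pos_err.append(d[j])
        rad_err.append(abs(seg_radii[j] - r) / r * 100.0)
    return np.mean(pos_err), np.mean(rad_err)

rng = np.random.default_rng(1)
ref_pts = np.column_stack([np.linspace(0, 40, 200), np.zeros(200), np.zeros(200)])
ref_radii = np.full(200, 2.0)                      # a 2 mm vessel (assumed)
seg_pts = ref_pts + 0.1 * rng.normal(size=ref_pts.shape)
seg_radii = ref_radii * (1 + 0.08 * rng.normal(size=200))

pos_mm, rad_pct = centerline_errors(ref_pts, ref_radii, seg_pts, seg_radii)
print(f"mean centerline error: {pos_mm:.2f} mm, mean radius error: {rad_pct:.1f} %")
```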

  8. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  9. Consolidation modelling for thermoplastic composites forming simulation

    Science.gov (United States)

    Xiong, H.; Rusanov, A.; Hamila, N.; Boisse, P.

    2016-10-01

    Pre-impregnated thermoplastic composites are widely used in the aerospace industry for their excellent mechanical properties. Thermoforming of thermoplastic prepregs is a fast manufacturing process, and the automotive industry has shown increasing interest in it; reconsolidation is an essential stage of this process. The intimate contact model is investigated as the consolidation model, and compression experiments have been carried out to identify the material parameters; several numerical tests show the influence of the temperature and pressure applied during processing. Finally, a new solid-shell prismatic element is presented for the simulation of the consolidation step in the thermoplastic composite forming process.

  10. Modeling and simulation of reactive flows

    CERN Document Server

    Bortoli, De AL; Pereira, Felipe

    2015-01-01

    Modelling and Simulation of Reactive Flows presents information on modeling and how to numerically solve reactive flows. The book offers a distinctive approach that combines diffusion flames and geochemical flow problems, providing users with a comprehensive resource that bridges the gap for scientists, engineers, and the industry. Specifically, the book looks at the basic concepts related to reaction rates, chemical kinetics, and the development of reduced kinetic mechanisms. It considers the most common methods used in practical situations, along with equations for reactive flows, and va

  11. Solar Electric Bicycle Body Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Zhikun Wang

    2013-10-01

    Full Text Available A new solar electric bicycle design and study are presented in this paper. CAD technology is applied to establish a three-dimensional geometric model, and kinetic analysis of the frame and other parts is used for numerical simulation and static strength analysis of the vehicle model design. After virtual assembly, complete frame dynamics and vibration analyses are carried out. Taking other factors into account, the frame structure is first improved, the safety of the design is then verified through calculation, analysis and comparison, and finally an ideal body design is obtained.

  12. Viscoelastic flow simulations in model porous media

    Science.gov (United States)

    De, S.; Kuipers, J. A. M.; Peters, E. A. J. F.; Padding, J. T.

    2017-05-01

    We investigate the flow of an unsteady three-dimensional viscoelastic fluid through an array of symmetric and asymmetric sets of cylinders constituting a model porous medium. The simulations are performed using a finite-volume methodology with a staggered grid. The solid-fluid interfaces of the porous structure are modeled using a second-order immersed boundary method [S. De et al., J. Non-Newtonian Fluid Mech. 232, 67 (2016), 10.1016/j.jnnfm.2016.04.002]. A finitely extensible nonlinear elastic constitutive model with Peterlin closure is used to model the viscoelastic part. By means of periodic boundary conditions, we model the flow behavior for a Newtonian as well as a viscoelastic fluid through successive contractions and expansions. We observe the presence of counterrotating vortices in the dead ends of our geometry. The simulations provide detailed insight into how flow structure, viscoelastic stresses, and viscoelastic work change with increasing Deborah number De. We observe completely different flow structures and different distributions of the viscoelastic work at high De in the symmetric and asymmetric configurations, even though they have the exact same porosity. Moreover, we find that even for the symmetric contraction-expansion flow, most energy dissipation is occurring in shear-dominated regions of the flow domain, not in extensional-flow-dominated regions.
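
    Two quantities central to this record are the Deborah number and the FENE-P (Peterlin closure) polymer stress. The sketch below evaluates both for illustrative parameter values, using one common form of the closure; the exact variant and parameters may differ from the solver used in the paper.

```python
import numpy as np

# Small numerical sketch: the Deborah number and one common form of the FENE-P
# polymer stress, tau_p = (eta_p / lam) * (f(A) A - I) with the Peterlin
# function f(A) = L^2 / (L^2 - tr(A)), where A is the conformation tensor.
# All parameter values are illustrative assumptions.

def deborah_number(relaxation_time, char_velocity, char_length):
    return relaxation_time * char_velocity / char_length

def fene_p_stress(A, eta_p, lam, L2):
    f = L2 / (L2 - np.trace(A))          # Peterlin function
    return (eta_p / lam) * (f * A - np.eye(3))

lam = 0.02          # polymer relaxation time, s (assumed)
U = 1e-3            # characteristic velocity through the pore throat, m/s
d = 1e-4            # cylinder spacing / pore size, m
print(f"De = {deborah_number(lam, U, d):.2f}")

A = np.diag([2.5, 0.8, 0.7])             # a stretched conformation state (assumed)
tau = fene_p_stress(A, eta_p=0.5, lam=lam, L2=100.0)
print(np.round(tau, 2))
```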

  13. LISP based simulation generators for modeling complex space processes

    Science.gov (United States)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  14. Ekofisk chalk: core measurements, stochastic reconstruction, network modeling and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Talukdar, Saifullah

    2002-07-01

    This dissertation deals with (1) experimental measurements on petrophysical, reservoir engineering and morphological properties of Ekofisk chalk, (2) numerical simulation of core flood experiments to analyze and improve relative permeability data, (3) stochastic reconstruction of chalk samples from limited morphological information, (4) extraction of pore space parameters from the reconstructed samples, development of a network model using pore space information, and computation of petrophysical and reservoir engineering properties from the network model, and (5) development of 2D and 3D idealized fractured reservoir models and verification of the applicability of several widely used conventional upscaling techniques in fractured reservoir simulation. Experiments have been conducted on eight Ekofisk chalk samples, and porosity, absolute permeability, formation factor, and oil-water relative permeability, capillary pressure and resistivity index are measured at laboratory conditions. Mercury porosimetry data and backscatter scanning electron microscope images have also been acquired for the samples. A numerical simulation technique involving history matching of the production profiles is employed to improve the relative permeability curves and to analyze hysteresis of the Ekofisk chalk samples. The technique was found to be a powerful tool to supplement the uncertainties in experimental measurements. Porosity and correlation statistics obtained from backscatter scanning electron microscope images are used to reconstruct microstructures of chalk and particulate media. The reconstruction technique involves a simulated annealing algorithm, which can be constrained by an arbitrary number of morphological parameters. This flexibility of the algorithm is exploited to successfully reconstruct particulate media and chalk samples using more than one correlation function. A technique based on conditional simulated annealing has been introduced for exact reproduction of vuggy
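
    The stochastic reconstruction step can be illustrated in miniature: a binary pore/solid image with fixed porosity is evolved by pixel swaps under simulated annealing until its two-point correlation function matches a target. The real reconstructions are three-dimensional and constrained by several correlation functions measured from backscatter SEM images; the grid size, cooling schedule and target used below are toy values.

```python
import numpy as np

# Toy stochastic reconstruction by simulated annealing: swap a pore pixel and a
# solid pixel (so porosity is preserved exactly) and accept/reject with a
# Metropolis criterion so that the two-point correlation S2 along the x-axis
# approaches a target. All sizes and schedule parameters are assumptions.

rng = np.random.default_rng(0)
N, MAX_LAG = 48, 12

def s2(img):
    """Two-point probability S2(r) along x (periodic), r = 0..MAX_LAG-1."""
    return np.array([(img * np.roll(img, -r, axis=1)).mean() for r in range(MAX_LAG)])

# Spatially correlated "reference" sample supplying the target statistics.
noise = rng.normal(size=(N, N))
smooth = sum(np.roll(np.roll(noise, i, 0), j, 1)
             for i in range(-2, 3) for j in range(-2, 3))
ref = (smooth > np.quantile(smooth, 0.7)).astype(float)   # ~30% porosity
target = s2(ref)

# Initial guess: same porosity, random arrangement.
img = rng.permutation(ref.ravel()).reshape(N, N)

def energy(im):
    return np.sum((s2(im) - target) ** 2)

E, T = energy(img), 1e-4
print(f"initial mismatch: {E:.2e}")
for step in range(3000):
    pores = np.argwhere(img == 1)
    solids = np.argwhere(img == 0)
    p = tuple(pores[rng.integers(len(pores))])
    s = tuple(solids[rng.integers(len(solids))])
    img[p], img[s] = 0.0, 1.0                 # propose the swap
    E_new = energy(img)
    if E_new < E or rng.random() < np.exp(-(E_new - E) / T):
        E = E_new                             # accept
    else:
        img[p], img[s] = 1.0, 0.0             # reject: undo the swap
    T *= 0.999                                # geometric cooling schedule
print(f"final mismatch:   {E:.2e}")
```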

  15. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models that explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  16. Simulation Model of Brushless Excitation System

    Directory of Open Access Journals (Sweden)

    Ahmed N.A.  Alla

    2007-01-01

    Full Text Available The excitation system is a key element in the dynamic performance of electric power systems, so accurate excitation models are of great importance in simulating and investigating power system transient phenomena. Parameter identification of a brushless excitation system is presented. First, a block diagram for the excitation system (EXS) parameters was proposed based on the documents and maps in the power station. To identify the parameters of this model, a test procedure to obtain the step response was presented. Using a genetic algorithm with the Matlab software it was possible to identify all the necessary parameters of the model. Using the same measured input signals, the response from the standard model showed nearly the same behavior as the excitation system.
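
    A toy version of the identification idea is sketched below: the parameters of a first-order exciter model are fitted to a measured step response with a basic genetic algorithm. The actual brushless excitation system model has many more blocks and parameters and was identified in Matlab; the numpy code and every number here are illustrative assumptions.

```python
import numpy as np

# Fit K and T of a first-order model, y(t) = K * (1 - exp(-t/T)) for a unit
# step input, to a "measured" step response using a simple genetic algorithm
# (truncation selection, blend crossover, Gaussian mutation). Synthetic data.

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 200)

def step_response(params):
    K, T = params
    return K * (1.0 - np.exp(-t / T))

measured = step_response((2.0, 0.8)) + 0.02 * rng.normal(size=t.size)

def fitness(pop):
    return np.array([np.mean((step_response(p) - measured) ** 2) for p in pop])

pop = rng.uniform([0.1, 0.05], [5.0, 3.0], size=(40, 2))   # initial population
for gen in range(60):
    err = fitness(pop)
    parents = pop[np.argsort(err)[:10]]                    # keep the 10 best
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(10, size=2)]
        w = rng.random()
        child = w * a + (1 - w) * b                        # blend crossover
        child += rng.normal(scale=[0.05, 0.02])            # mutation
        children.append(np.clip(child, [0.01, 0.01], [10.0, 10.0]))
    pop = np.vstack([parents, children])

best = pop[np.argmin(fitness(pop))]
print(f"identified K = {best[0]:.2f}, T = {best[1]:.2f} s")
```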

  17. Modeling and simulation of direct contact evaporators

    Directory of Open Access Journals (Sweden)

    F.B. Campos

    2001-09-01

    Full Text Available A dynamic model of a direct contact evaporator was developed and coupled to a recently developed superheated bubble model. The latter model takes into account heat and mass transfer during the bubble formation and ascension stages and is able to predict gas holdup in nonisothermal systems. The results of the coupled model, which does not have any adjustable parameter, were compared with experimental data. The transient behavior of the liquid-phase temperature and the vaporization rate under quasi-steady-state conditions were in very good agreement with experimental data. The transient behavior of liquid height was only reasonably simulated. In order to explain this partial disagreement, some possible causes were analyzed.
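
    A generic lumped-parameter sketch conveys the structure of such a dynamic model: during heat-up the gas-to-liquid heat goes into sensible heating, and once the liquid reaches its boiling point the quasi-steady period begins and the heat drives vaporization. This is only a stand-in for the coupled evaporator/bubble model of the paper; every coefficient below is assumed.

```python
# Generic lumped energy balance for a direct contact evaporator, integrated
# with explicit Euler. All coefficients are invented placeholder values.

cp_liq = 4180.0     # liquid heat capacity, J/(kg K)
latent = 2.26e6     # latent heat of vaporization, J/kg
UA = 200.0          # overall gas-to-liquid heat-transfer conductance, W/K
T_gas = 473.0       # superheated gas temperature, K
T_boil = 373.0      # liquid boiling point, K

m_liq, T_liq = 50.0, 298.0   # initial liquid mass (kg) and temperature (K)
dt = 1.0                     # time step, s
vap_rate = 0.0

for step in range(3600):                 # simulate one hour
    q = UA * (T_gas - T_liq)             # heat transferred from the bubbles, W
    if T_liq < T_boil:
        # heat-up period: all transferred heat is sensible
        T_liq = min(T_boil, T_liq + q * dt / (m_liq * cp_liq))
        vap_rate = 0.0
    else:
        # quasi-steady period: all transferred heat drives vaporization
        vap_rate = q / latent
        m_liq -= vap_rate * dt

print(f"final T = {T_liq:.1f} K, vaporization rate = {vap_rate:.4f} kg/s, "
      f"liquid left = {m_liq:.1f} kg")
```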

  18. Efficient Turbulence Modeling for CFD Wake Simulations

    DEFF Research Database (Denmark)

    van der Laan, Paul

    , that can accurately and efficiently simulate wind turbine wakes. The linear k-ε eddy viscosity model (EVM) is a popular turbulence model in RANS; however, it underpredicts the velocity wake deficit and cannot predict the anisotropic Reynolds-stresses in the wake. In the current work, nonlinear eddy...... viscosity models (NLEVM) are applied to wind turbine wakes. NLEVMs can model anisotropic turbulence through a nonlinear stress-strain relation, and they can improve the velocity deficit by the use of a variable eddy viscosity coefficient that delays the wake recovery. Unfortunately, all tested NLEVMs show...... numerically unstable behavior for fine grids, which inhibits a grid dependency study for numerical verification. Therefore, a simpler EVM is proposed, labeled as the k-ε-fp EVM, that has a linear stress-strain relation but still has a variable eddy viscosity coefficient. The k-ε-fp EVM is numerically...
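
    The eddy-viscosity idea discussed above can be written down directly: the linear model takes nu_t = C_mu k^2 / epsilon with a constant C_mu of about 0.09, while the k-ε-fp variant replaces the constant by a local, flow-dependent factor that reduces nu_t in the strongly sheared near-wake and so delays wake recovery. The snippet below only illustrates the effect of such a coefficient; the reduced value used is a placeholder, not the published closure.

```python
# Eddy viscosity of a k-epsilon-type model: nu_t = c_mu * f_p * k^2 / eps.
# f_p = 1 recovers the standard linear k-epsilon model; a value below 1 mimics
# the effect of a variable coefficient in the near-wake. The f_p value here is
# a placeholder, not the published k-epsilon-fp closure.

def eddy_viscosity(k, eps, f_p=1.0, c_mu=0.09):
    return c_mu * f_p * k ** 2 / eps

k, eps = 0.8, 0.05                       # turbulence quantities (assumed units)
print(eddy_viscosity(k, eps))            # standard k-epsilon coefficient
print(eddy_viscosity(k, eps, f_p=0.4))   # reduced coefficient (illustrative)
```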

  19. Integrating Visualizations into Modeling NEST Simulations.

    Science.gov (United States)

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data in order to investigate them effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.

  20. Integrating Visualizations into Modeling NEST Simulations

    Directory of Open Access Journals (Sweden)

    Christian eNowke

    2015-12-01

    Full Text Available Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use the visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that results from simulations, researchers want to relate data in order to investigate them effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and to maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into the modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach by common use cases we encountered in our collaborative work.