WorldWideScience

Sample records for inverse planning simulated

  1. Inverse planning IMRT

    International Nuclear Information System (INIS)

    Rosenwald, J.-C.

    2008-01-01

The lecture addressed the following topics: Optimizing radiotherapy dose distribution; IMRT contributes to optimization of energy deposition; Inverse vs direct planning; Main steps of IMRT; Background of inverse planning; General principle of inverse planning; The 3 main components of IMRT inverse planning; The simplest cost function (deviation from prescribed dose); The driving variable: the beamlet intensity; Minimizing a 'cost function' (or 'objective function') - the walker (or skier) analogy; Application to IMRT optimization (the gradient method); The gradient method - discussion; The simulated annealing method; The optimization criteria - discussion; Hard and soft constraints; Dose volume constraints; Typical user interface for definition of optimization criteria; Biological constraints (Equivalent Uniform Dose); The result of the optimization process; Semi-automatic solutions for IMRT; Generalisation of the optimization problem; Driving and driven variables used in RT optimization; Towards multi-criteria optimization; and Conclusions for the optimization phase. (P.A.)
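The "simplest cost function" (quadratic deviation from the prescribed dose) and the gradient method mentioned in the lecture can be sketched in a few lines; the influence matrix, prescription, and problem dimensions below are invented for illustration and are not from the lecture itself:

```python
import numpy as np

# Hypothetical toy problem: dose d = A @ x, where A maps non-negative
# beamlet intensities x to dose in a handful of voxels.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(6, 4))   # 6 voxels, 4 beamlets
d_presc = np.full(6, 2.0)                # prescribed dose per voxel

def cost(x):
    """Quadratic deviation from the prescribed dose."""
    r = A @ x - d_presc
    return float(r @ r)

# Projected gradient descent: a step of 1/L, with L the Lipschitz
# constant of the gradient, guarantees monotone descent.
step = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)
x = np.zeros(4)
for _ in range(500):
    grad = 2.0 * A.T @ (A @ x - d_presc)   # gradient of the cost
    x = np.maximum(x - step * grad, 0.0)   # keep intensities physical

print(cost(np.zeros(4)), round(cost(x), 4))
```

Replacing the deterministic gradient step with random perturbations accepted by a temperature-dependent rule turns this same setup into the simulated annealing method the lecture contrasts it with.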

  2. A new approach to integrate GPU-based Monte Carlo simulation into inverse treatment plan optimization for proton therapy.

    Science.gov (United States)

    Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2017-01-07

Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6 ± 15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.
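The core idea of non-uniform sampling, allocating MC histories to spots in proportion to their current intensity, can be illustrated in a few lines; the spot intensities and history budget below are hypothetical, and the actual APS scheme is considerably more involved:

```python
import numpy as np

# Hypothetical spot intensities from an intermediate optimization step.
rng = np.random.default_rng(1)
intensities = np.array([5.0, 1.0, 0.5, 3.5])  # one entry per spot
n_histories = 100_000                          # MC histories this round

# Draw histories in proportion to spot intensity, so high-weight spots
# (which dominate the delivered dose) receive most of the MC effort.
p = intensities / intensities.sum()
counts = rng.multinomial(n_histories, p)
print(dict(enumerate(counts.tolist())))
```

Each spot's dose estimate would then be normalized by its own sampled history count, so low-intensity spots remain unbiased, just noisier.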

  3. Inverse planning for low-dose-rate prostate brachytherapy by simulated annealing under fuzzy expert control

    International Nuclear Information System (INIS)

    Zerda Lerner, Alberto de la

    2004-01-01

Simulated annealing (SA) is a multivariate combinatorial optimization process that searches the configuration space of possible solutions by a random walk, guided only by the goal of minimizing the objective function. The decision-making capabilities of a fuzzy inference system are applied to guide the SA search, to look for solutions which, in addition to optimizing a plan in dosimetric terms, also present clinically desirable spatial features. No a priori constraints are placed on the number or position of needles or on the seed loading sequence of individual needles. These additional degrees of freedom are balanced by giving preference to plans with seed distributions that are balanced between the right/left and anterior/posterior halves of each axial slice, and with an approximately uniform local seed density. Piecewise linear membership functions are constructed to represent these requirements. Before a step in the random search is subjected to the SA test, the expert functions representing the spatial seed-distribution requirements are evaluated. Thus, the expert planner's knowledge enters into the decision as to the 'goodness' of a seed configuration with regard to the spatial seed-distribution goals. When a step in the random walk yields a seed configuration that is found wanting, a specific number of additional steps in the local neighborhood is attempted until either improvement in the spatial requirements is achieved or the allowed number of attempts is exhausted. In the latter case, the expert system desists and the unfavorable step is taken, moving on to the simulated annealing test. The number of attempts is determined by the fuzzy logic inference engine and depends on how badly the expert requirement is unmet. The program is interfaced with a commercial treatment planning system (TPS) to import optimized seed plans for isodose display and analysis. Execution on a 1.5 GHz computer takes less than a minute, adequate for real-time planning.
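The scheme described above can be sketched as SA gated by an expert spatial check; the binary seed-slot model, the piecewise-linear balance membership, and the retry rule below are simplified stand-ins for the paper's actual planning problem, invented only to show the control flow:

```python
import math
import random

random.seed(42)

# Toy configuration: 20 candidate seed slots, each occupied or empty.
N = 20
TARGET_SEEDS = 8  # stand-in for the dosimetric goal

def objective(cfg):
    """Surrogate dosimetric cost: deviation from the target seed count."""
    return abs(sum(cfg) - TARGET_SEEDS)

def balance_membership(cfg):
    """Piecewise-linear membership in [0, 1] for left/right balance."""
    left, right = sum(cfg[:N // 2]), sum(cfg[N // 2:])
    total = left + right
    if total == 0:
        return 0.0
    imbalance = abs(left - right) / total
    return max(0.0, 1.0 - 2.0 * imbalance)  # 1 = balanced, 0 = one-sided

def flip(cfg):
    """Random-walk step: toggle one slot."""
    new = cfg[:]
    new[random.randrange(N)] ^= 1
    return new

cfg = [random.randint(0, 1) for _ in range(N)]
T = 2.0
for _ in range(2000):
    cand = flip(cfg)
    # Expert gate: retry locally while the spatial score is poor; the
    # retry budget grows as the membership drops (fuzzy-style rule).
    retries = int(3 * (1.0 - balance_membership(cand)))
    while retries > 0 and balance_membership(cand) < 0.5:
        cand = flip(cfg)
        retries -= 1
    # If the budget runs out, the step proceeds to the SA test anyway.
    dE = objective(cand) - objective(cfg)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        cfg = cand
    T *= 0.999  # cooling schedule

print(objective(cfg), round(balance_membership(cfg), 2))
```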

  4. Fuzzy logic guided inverse treatment planning

    International Nuclear Information System (INIS)

    Yan Hui; Yin Fangfang; Guan Huaiqun; Kim, Jae Ho

    2003-01-01

A fuzzy logic technique was applied to optimize the weighting factors in the objective function of an inverse treatment planning system for intensity-modulated radiation therapy (IMRT). Based on this technique, the optimization of weighting factors is guided by fuzzy rules, while the intensity spectrum is optimized by a fast-monotonic-descent method. The resulting fuzzy logic guided inverse planning system is capable of finding the optimal combination of weighting factors for the different anatomical structures involved in treatment planning. This system was tested using one simulated (but clinically relevant) case and one clinical case. The results indicate that the optimal balance between the target dose and the critical organ dose is achieved by a refined combination of weighting factors. With the help of fuzzy inference, the efficiency and effectiveness of inverse planning for IMRT are substantially improved.
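A toy illustration of fuzzy-rule-driven weight adjustment in this spirit: the triangular membership functions, the two-rule base, and the numeric ranges below are invented for illustration and are not the paper's actual rules:

```python
def triangular(x, a, b, c):
    """Piecewise-linear (triangular) membership function on [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_weight(weight, violation):
    """Hypothetical rule base: the larger a structure's relative dose
    violation, the more its weighting factor is boosted (weighted
    average of the rule outputs, centroid-style)."""
    small = triangular(violation, -0.1, 0.0, 0.2)   # "violation is small"
    large = triangular(violation, 0.1, 0.5, 1.1)    # "violation is large"
    # Rule outputs: small violation -> keep weight (x1); large -> boost (x2).
    factor = (small * 1.0 + large * 2.0) / max(small + large, 1e-9)
    return weight * factor

print(round(adjust_weight(1.0, 0.05), 3))  # mild violation: weight kept at 1.0
print(round(adjust_weight(1.0, 0.5), 3))   # large violation: weight doubled to 2.0
```

The updated weights would then be fed back into the objective function before the next intensity-optimization pass.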

  5. Inverse planning for interstitial gynecologic template brachytherapy: truly anatomy-based planning

    International Nuclear Information System (INIS)

    Lessard, Etienne; Hsu, I-Chou; Pouliot, Jean

    2002-01-01

Purpose: Commercially available optimization schemes generally result in an undesirable dose distribution, because of the particular shapes of tumors extending laterally from the tandem. Dose distribution is therefore manually obtained by adjusting relative dwell time values until an acceptable solution is found. The objective of this work is to present the clinical application of an inverse planning dose optimization tool for the automatic determination of source dwell time values in the treatment of interstitial gynecologic templates. Methods and Materials: In cases where the tumor extends beyond the range of the tandem-ovoid applicator, catheters as well as the tandem are inserted into the paravaginal and parametrial region in an attempt to cover the tumor volume. CT scans of these patients are then used for CT-based dose planning. Dose distribution is obtained manually by varying the relative dwell times until adequate dose coverage is achieved. This manual planning is performed by an experienced physician. In parallel, our in-house inverse planning based on simulated annealing is used to automatically determine which of all possible dwell positions will become active and to calculate the dwell time values needed to fulfill dose constraints applied to the tumor volume and to each organ at risk. To compare the results of these planning methods, dose-volume histograms and isodose distributions were generated for the target and each organ at risk. Results: This procedure has been applied for the dose planning of 12 consecutive interstitial gynecologic template cases. For all cases, once the anatomy was contoured, the routine of inverse planning based on simulated annealing found the solution to the dose constraints within 1 min of CPU time. In comparison, manual planning took more than 45 min. The inverse planning-generated plans showed improved protection to organs at risk for the same coverage compared to manual planning. Conclusion: This inverse planning tool

  6. Review of ankle inversion sprain simulators in the biomechanics laboratory

    Directory of Open Access Journals (Sweden)

    Sophia Chui-Wai Ha

    2015-10-01

Ankle inversion ligamentous sprain is one of the most common sports injuries. The most direct way to study it is to investigate real injury incidents, but this is unethical and impossible to replicate with test participants. Simulators including tilt platforms, trapdoors, and fulcrum devices were designed to mimic ankle inversion movements in laboratories. Inversion angle was the only element considered in early designs; however, an ankle sprain is composed of inversion and plantarflexion in clinical observations. Inversion velocity is another parameter that increased the realism of simulation. This review summarised the simulators and aimed to compare and contrast their features and settings.

  7. When does treatment plan optimization require inverse planning?

    International Nuclear Information System (INIS)

    Sherouse, George W.

    1995-01-01

    which the most sophisticated techniques of inverse planning represent a cost-effective solution is relatively small, and that surprisingly simple optimization techniques (and correspondingly simple treatment delivery techniques) can reliably produce acceptable results for the majority of routine conformal radiotherapy practice

  8. Review of ankle inversion sprain simulators in the biomechanics laboratory

    OpenAIRE

    Ha, Sophia Chui-Wai; Fong, Daniel Tik-Pui; Chan, Kai-Ming

    2015-01-01

    Ankle inversion ligamentous sprain is one of the most common sports injuries. The most direct way is to investigate real injury incidents, but it is unethical and impossible to replicate on test participants. Simulators including tilt platforms, trapdoors, and fulcrum devices were designed to mimic ankle inversion movements in laboratories. Inversion angle was the only element considered in early designs; however, an ankle sprain is composed of inversion and plantarflexion in clinical observa...

  9. Inverse treatment planning based on MRI for HDR prostate brachytherapy

    International Nuclear Information System (INIS)

    Citrin, Deborah; Ning, Holly; Guion, Peter; Li Guang; Susil, Robert C.; Miller, Robert W.; Lessard, Etienne; Pouliot, Jean; Xie Huchen; Capala, Jacek; Coleman, C. Norman; Camphausen, Kevin; Menard, Cynthia

    2005-01-01

Purpose: To develop and optimize a technique for inverse treatment planning based solely on magnetic resonance imaging (MRI) during high-dose-rate brachytherapy for prostate cancer. Methods and materials: Phantom studies were performed to verify the spatial integrity of treatment planning based on MRI. Data were evaluated from 10 patients with clinically localized prostate cancer who had undergone two high-dose-rate prostate brachytherapy boosts under MRI guidance before and after pelvic radiotherapy. Treatment planning MRI scans were systematically evaluated to derive a class solution for inverse planning constraints that would reproducibly result in acceptable target and normal tissue dosimetry. Results: We verified the spatial integrity of MRI for treatment planning. MRI anatomic evaluation revealed no significant displacement of the prostate in the left lateral decubitus position, a mean distance of 14.47 mm from the prostatic apex to the penile bulb, and clear demarcation of the neurovascular bundles on postcontrast imaging. Derivation of a class solution for inverse planning constraints resulted in a mean of 95.69% of the target volume receiving 100% of the prescribed dose, while keeping the rectal volume receiving 75% of the prescribed dose below 5% (mean 1.36%) and the urethral volume receiving 125% of the prescribed dose below 2% (mean 0.54%). Conclusion: Systematic evaluation of image spatial integrity, delineation uncertainty, and inverse planning constraints in our procedure reduced uncertainty in planning and treatment.

  10. Multi-objective optimization of inverse planning for accurate radiotherapy

    International Nuclear Information System (INIS)

    Cao Ruifen; Pei Xi; Cheng Mengyun; Li Gui; Hu Liqin; Wu Yican; Jing Jia; Li Guoli

    2011-01-01

The multi-objective optimization of inverse planning based on the Pareto solution set, motivated by the multi-objective character of inverse planning in accurate radiotherapy, was studied in this paper. Firstly, the clinical requirements of a treatment plan were transformed into a multi-objective optimization problem with multiple constraints. Then, the fast and elitist multi-objective Non-dominated Sorting Genetic Algorithm (NSGA-II) was introduced to optimize the problem. A clinical example was tested using this method. The results show that the obtained set of non-dominated solutions was uniformly distributed, and the corresponding dose distribution of each solution not only approached the expected dose distribution but also met the dose-volume constraints. This indicates that the clinical requirements are better satisfied using this method and that the planner can select the optimal treatment plan from the non-dominated solution set. (authors)
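The core building block of NSGA-II, identifying the non-dominated (Pareto-optimal) members of a population, can be sketched as follows; the two objectives and their values below are hypothetical, standing in for quantities such as target underdose versus organ-at-risk overdose:

```python
import numpy as np

def non_dominated(points):
    """Boolean mask of Pareto-optimal rows, all objectives minimized.
    This is the basic test inside NSGA-II's non-dominated sorting."""
    pts = np.asarray(points, dtype=float)
    mask = np.ones(len(pts), dtype=bool)
    for i in range(len(pts)):
        # Row j dominates row i if j is <= everywhere and < somewhere.
        dominates_i = np.all(pts <= pts[i], axis=1) & np.any(pts < pts[i], axis=1)
        if dominates_i.any():
            mask[i] = False
    return mask

# Hypothetical plans scored on two competing objectives (lower is better).
objs = [(0.1, 0.9), (0.2, 0.5), (0.4, 0.4), (0.5, 0.6), (0.9, 0.1)]
print(non_dominated(objs).tolist())  # -> [True, True, True, False, True]
```

The fourth plan is dominated by the third (worse on both objectives); the remaining four form the Pareto front the planner would choose from.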

  11. The Trump tax plan halts inversions but increases treaty shopping

    NARCIS (Netherlands)

    Lejour, Arjen; Cnossen, Sybren; van 't Riet, Maarten

    2017-01-01

    Some US multinationals have displayed a willingness to relinquish their American nationality and move their headquarters abroad. Such ‘inversions’ generally aim to avoid and minimise taxes. This column argues that the new Trump tax plan is likely to halt tax inversions by US multinationals. However,

  12. Inverse planning for x-ray rotation therapy: a general solution of the inverse problem

    International Nuclear Information System (INIS)

    Oelfke, U.; Bortfeld, T.

    1999-01-01

    Rotation therapy with photons is currently under investigation for the delivery of intensity modulated radiotherapy (IMRT). An analytical approach for inverse treatment planning of this radiotherapy technique is described. The inverse problem for the delivery of arbitrary 2D dose profiles is first formulated and then solved analytically. In contrast to previously applied strategies for solving the inverse problem, it is shown that the most general solution for the fluence profiles consists of two independent solutions of different parity. A first analytical expression for both fluence profiles is derived. The mathematical derivation includes two different strategies, an elementary expansion of fluence and dose into polynomials and a more practical approach in terms of Fourier transforms. The obtained results are discussed in the context of previous work on this problem. (author)

  13. Intermediate simulation of the inverse seismic problem

    International Nuclear Information System (INIS)

    Brolley, J.E.

    1980-03-01

    An introductory study of the inverse seismic problem is performed. The complex cepstrum of a seismogram generated by the convolution of three factors, the Seggern-Blandford source function of an explosion, the Futterman mantle transfer function, and the SRO seismometer transfer function, is used. For a given Q and yield, a synthetic seismogram is computed. Arbitrary values of Q and yield are introduced, and a search is conducted to find that pair of values that minimized the cepstral difference between the original and arbitrary seismograms. The original values are accurately recovered. Spectral and amplitude characteristics of the various factors are presented. Possible application to the problem of studying a medium intervening between a source and receiver is discussed. 25 figures, 1 table
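The cepstral-matching search described above can be sketched with a toy signal; for brevity a real cepstrum replaces the study's complex cepstrum, and the wavelet and attenuation model below are invented for illustration, with a single decay constant standing in for Q:

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum of a signal (the study used the complex cepstrum;
    the real cepstrum keeps this sketch short)."""
    spec = np.abs(np.fft.rfft(x)) + 1e-12   # avoid log(0)
    return np.fft.irfft(np.log(spec), n=len(x))

# Hypothetical 'seismogram': an attenuated wavelet whose decay
# constant q plays the role of the attenuation parameter.
t = np.linspace(0.0, 1.0, 256)
def synth(q):
    return np.sin(2 * np.pi * 8 * t) * np.exp(-t / q)

observed = synth(0.30)  # pretend this is the recorded seismogram

# Grid search for the q minimizing the cepstral difference.
qs = np.linspace(0.05, 1.0, 96)
errs = [np.sum((real_cepstrum(synth(q)) - real_cepstrum(observed)) ** 2)
        for q in qs]
q_best = float(qs[int(np.argmin(errs))])
print(round(q_best, 3))
```

As in the abstract, the original parameter is recovered because the cepstral difference vanishes only where the trial and observed models coincide.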

  14. A framework for simulation and inversion in electromagnetics

    Science.gov (United States)

    Heagy, Lindsey J.; Cockett, Rowan; Kang, Seogi; Rosenkjaer, Gudni K.; Oldenburg, Douglas W.

    2017-10-01

Simulations and inversions of electromagnetic geophysical data are paramount for discerning meaningful information about the subsurface from these data. Depending on the nature of the source, electromagnetic experiments may be classified as time-domain or frequency-domain. Multiple heterogeneous and sometimes anisotropic physical properties, including electrical conductivity and magnetic permeability, may need to be considered in a simulation. Depending on what one wants to accomplish in an inversion, the parameters one inverts for may be a voxel-based description of the earth or some parametric representation that must be mapped onto a simulation mesh. Each of these permutations of the electromagnetic problem has implications for a numerical implementation of the forward simulation as well as for the computation of the sensitivities, which are required for gradient-based inversions. This paper proposes a framework for organizing and implementing electromagnetic simulations and gradient-based inversions in a modular, extensible fashion. We take an object-oriented approach for defining and organizing each of the necessary elements in an electromagnetic simulation, including: the physical properties, sources, formulation of the discrete problem to be solved, the resulting fields and fluxes, and receivers used to sample the electromagnetic responses. A corresponding implementation is provided as part of the open source simulation and parameter estimation project SIMPEG (http://simpeg.xyz). The application of the framework is demonstrated through two synthetic examples and one field example. The first example shows the application of the common framework for 1D time-domain and frequency-domain inversions. The second is a field example that demonstrates a 1D inversion of electromagnetic data collected over the Bookpurnong Irrigation District in Australia. The final example is a 3D example which shows how the modular implementation is used to compute the

  15. Beam's-Eye-View Dosimetrics-Guided Inverse Planning for Aperture-Modulated Arc Therapy

    International Nuclear Information System (INIS)

    Ma Yunzhi; Popple, Richard; Suh, Tae-Suk; Xing Lei

    2009-01-01

    Purpose: To use angular beam's-eye-view dosimetrics (BEVD) information to improve the computational efficiency and plan quality of inverse planning of aperture-modulated arc therapy (AMAT). Methods and Materials: In BEVD-guided inverse planning, the angular space spanned by a rotational arc is represented by a large number of fixed-gantry beams with angular spacing of ∼2.5 degrees. Each beam is assigned with an initial aperture shape determined by the beam's-eye-view (BEV) projection of the planning target volume (PTV) and an initial weight. Instead of setting the beam weights arbitrarily, which slows down the subsequent optimization process and may result in a suboptimal solution, a priori knowledge about the quality of the beam directions derived from a BEVD is adopted to initialize the weights. In the BEVD calculation, a higher score is assigned to directions that allow more dose to be delivered to the PTV without exceeding the dose tolerances of the organs at risk (OARs) and vice versa. Simulated annealing is then used to optimize the segment shapes and weights. The BEVD-guided inverse planning is demonstrated by using two clinical cases, and the results are compared with those of a conventional approach without BEVD guidance. Results: An a priori knowledge-guided inverse planning scheme for AMAT is established. The inclusion of BEVD guidance significantly improves the convergence behavior of AMAT inverse planning and results in much better OAR sparing as compared with the conventional approach. Conclusions: BEVD-guidance facilitates AMAT treatment planning and provides a comprehensive tool to maximally use the technical capacity of the new arc therapeutic modality.

  16. 3D inverse treatment planning for the tandem and ovoid applicator in cervical cancer

    International Nuclear Information System (INIS)

    DeWitt, Kelly D.; Hsu, I. Chow Joe; Speight, Joycelyn; Weinberg, Vivian K.; Lessard, Etienne; Pouliot, Jean

    2005-01-01

Purpose: Three-dimensional treatment planning systems and inverse planning optimization for brachytherapy are becoming commercially available. Guidelines for target delineation and dose constraints have not been established for this new software. In this study we describe a method of target delineation for the tandem and ovoids applicator. We then compare inverse planning dose distributions with the traditional methods of prescribing dose. Methods and Materials: Target and organ-at-risk volumes were defined using systematic guidelines on 15 patients treated in our department with high-dose-rate brachytherapy for cervical cancer using tandem and ovoids. High-dose-rate distributions were created according to three different dose optimization protocols: inverse planning simulated annealing (IPSA), point A, and point A with a normalization of 2 cc of the bladder receiving 80% of the dose (bladder-sparing method). A uniform cost function for dose constraints was applied to all IPSA-generated plans, and no manual optimization was allowed for any planning method. Results: Guidelines for target and structure-at-risk volumes, as well as dose constraint cost functions, were established. Dose-volume histogram analysis showed that the IPSA algorithm yielded no difference in tumor coverage compared with point A optimization while decreasing dose to the bladder and rectum. The IPSA algorithm provided better target volume coverage compared with the bladder-sparing method with equivalent doses to the bladder and rectum. Conclusion: This study uses a systematic approach for delineating target and organ-at-risk volumes and a uniform cost function for generating IPSA plans for cervical cancer using tandem and ovoids. Compared with conventional dose prescription methods, IPSA provides a consistent method of optimization that maintains or improves target coverage while decreasing dose to normal structures. Image-guided brachytherapy and inverse planning improve brachytherapy

  17. TH-A-9A-06: Inverse Planning of Gamma Knife Radiosurgery Using Natural Physical Models

    International Nuclear Information System (INIS)

Purpose: Treatment-planning systems rely on computationally intensive optimization algorithms in order to provide radiation dose localization. We are investigating a new optimization paradigm based on natural physical modeling and simulations, which tend to evolve in time and find the minimum energy state. In our research, we aim to match physical models with radiation therapy inverse planning problems, where the minimum energy state coincides with the optimal solution. As a prototype study, we have modeled the inverse planning of Gamma Knife radiosurgery using the dynamic interactions between charged particles and demonstrate the potential of the paradigm. Methods: For inverse planning of Gamma Knife radiosurgery: (1) positive charges are uniformly placed on the surface of tumors and critical structures. (2) The Gamma Knife dose kernels of 4 mm, 8 mm, and 16 mm radii are modeled as geometric objects with variable charges. (3) The number of shots for each kernel radius is obtained by solving a constrained integer linear problem. (4) The shots are placed into the tumor volume and move under electrostatic forces. The simulation is performed until internal forces are zero or maximum iterations are reached. (5) Finally, non-negative least squares (NNLS) is used to calculate the beam-on times for each shot. Results: A 3D C-shaped tumor surrounding a spherical critical structure was used for testing the new optimization paradigm. These tests showed that charges spread out evenly covering the tumor while keeping distance from the critical structure, resulting in a high quality plan. Conclusion: We have developed a new paradigm for dose optimization based on the simulation of physical models. As prototype studies, we applied electrostatic models to Gamma Knife radiosurgery and demonstrated the potential of the new paradigm. Further research and fine-tuning of the model are underway. NSF CBET-0853157
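The electrostatic placement step (4) can be sketched in 2D; the unit-disc "tumor", step size, and iteration count below are invented for illustration, and the subsequent NNLS beam-on-time fit of step (5) is omitted:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 2D stand-in for the tumor: shots must stay inside the
# unit disc; mutual Coulomb-like repulsion spreads them out evenly.
n_shots = 12
pos = 0.1 * rng.standard_normal((n_shots, 2))  # start clustered at centre

for _ in range(400):
    diff = pos[:, None, :] - pos[None, :, :]           # pairwise vectors
    dist = np.linalg.norm(diff, axis=-1) + np.eye(n_shots)  # avoid /0
    force = (diff / dist[..., None] ** 3).sum(axis=1)  # ~1/r^2 repulsion
    pos += 0.002 * force                               # relaxation step
    r = np.linalg.norm(pos, axis=1)
    outside = r > 1.0
    pos[outside] /= r[outside, None]                   # project back inside

# Even coverage: no two shots end up nearly coincident.
d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
print(round(float(d[d > 0].min()), 3))
```

Adding fixed charges on a critical-structure boundary would push the shots away from it, as described in the abstract.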

  18. Incorporating model parameter uncertainty into inverse treatment planning

    International Nuclear Information System (INIS)

    Lian Jun; Xing Lei

    2004-01-01

Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides us with an effective tool to minimize the effect caused by the uncertainties in a statistical sense. With the incorporation of the uncertainties, the technique has potential for us to maximally utilize the available radiobiology knowledge for better IMRT treatment
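The EUD model with an uncertain tissue parameter a can be sketched as follows; the dose values and the normal distribution over a are illustrative stand-ins, not clinical data or the paper's actual densities:

```python
import numpy as np

rng = np.random.default_rng(7)

def eud(dose, a):
    """Equivalent uniform dose: (mean(d_i^a))^(1/a)."""
    return float(np.mean(np.asarray(dose, float) ** a) ** (1.0 / a))

# Hypothetical inhomogeneous dose to an organ at risk (Gy).
dose = np.array([10.0, 20.0, 30.0, 60.0])

# Point estimate vs. a probability density over the tissue parameter a.
a_nominal = 8.0
a_samples = rng.normal(loc=8.0, scale=2.0, size=5000).clip(1.0, 15.0)

euds = np.array([eud(dose, a) for a in a_samples])
print(round(eud(dose, a_nominal), 2),   # single-parameter answer
      round(float(euds.mean()), 2),     # mean over the uncertainty
      round(float(euds.std()), 2))      # spread induced by uncertain a
```

Optimizing against the distribution of EUD values rather than the single nominal value is the statistical hedge the abstract describes.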

  19. Development of inverse-planning system for neutron capture therapy

    International Nuclear Information System (INIS)

    Kumada, Hiroaki; Yamamoto, Kazuyoshi; Maruo, Takeshi

    2006-01-01

To derive proper irradiation conditions efficiently, the Japan Atomic Energy Agency (JAEA) is developing an inverse-planning system for neutron capture therapy (NCT-IPS) based on the JAEA computational dosimetry system (JCDS) for BNCT. The methodology for finding an optimum condition in the NCT-IPS applies spatial channel theory with adjoint flux solutions of the Boltzmann transport equation. By analyzing the results obtained from the adjoint flux calculations according to this theory, the optimum incident point of the beam on the patient can be found, and the neutron spectrum of the beam that can generate an ideal distribution of neutron flux around the tumor region can be determined. The conceptual design of the NCT-IPS was investigated, and a prototype of the NCT-IPS coupled with JCDS is being developed. (author)

  20. A design of inverse Taylor projectiles using material simulation

    International Nuclear Information System (INIS)

    Tonks, Michael; Harstad, Eric; Maudlin, Paul; Trujillo, Carl

    2008-01-01

    The classic Taylor cylinder test, in which a right circular cylinder is projected at a rigid anvil, exploits the inertia of the projectile to access strain rates that are difficult to achieve with more traditional uniaxial testing methods. In this work we present our efforts to design inverse Taylor projectiles, in which a tapered projectile becomes a right circular cylinder after impact, from annealed copper and show that the self-correcting geometry leads to a uniform compressive strain in the radial direction. We design projectiles using finite element simulation and optimization that deform as desired in tests with minor deviations in the deformed geometry due to manufacturing error and uncertainty in the initial velocity. The inverse Taylor projectiles designed in this manner provide a simple means of validating constitutive models. This work is a step towards developing a general method of designing Taylor projectiles that provide stress–strain behavior relevant to particular engineering problems

  21. Sensitivity study on hydraulic well testing inversion using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Nakao, Shinsuke; Najita, J.; Karasaki, Kenzi

    1997-11-01

For environmental remediation, management of nuclear waste disposal, or geothermal reservoir engineering, it is very important to evaluate the permeabilities, spacing, and sizes of the subsurface fractures which control ground water flow. Cluster variable aperture (CVA) simulated annealing has been used as an inversion technique to construct fluid flow models of fractured formations based on transient pressure data from hydraulic tests. A two-dimensional fracture network system is represented as a filled regular lattice of fracture elements. The algorithm iteratively changes the aperture of a cluster of fracture elements, chosen randomly from a list of discrete apertures, to improve the match to observed pressure transients. The size of the clusters is held constant throughout the iterations. Sensitivity studies using simple fracture models with eight wells show that, in general, it is necessary to conduct interference tests using at least three different wells as the pumping well in order to reconstruct a fracture network with a transmissivity contrast of one order of magnitude, particularly when the cluster size is not known a priori. Because hydraulic inversion is inherently non-unique, it is important to utilize additional information. The authors investigated the relationship between the scale of heterogeneity and the optimum cluster size (and its shape) to enhance the reliability and convergence of the inversion. It appears that a cluster size corresponding to about 20-40% of the practical range of the spatial correlation is optimal. Inversion results for the Raymond test site data are also presented, and the practical range of spatial correlation is evaluated to be about 5-10 m from the optimal cluster size in the inversion.

  22. Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-01-01

Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. A meaningful adaptation will result in high-fidelity and robust adapted core simulator models. To perform adaptation, we propose an inverse theory approach in which the multitude of input data to core simulators, i.e., reactor physics and thermal-hydraulic data, are adjusted to improve agreement with measured observables while keeping the core simulator models themselves unadapted. At first glance, devising such an adaptation for typical core simulators with millions of input and observable data would spawn not only several prohibitive challenges but also numerous serious concerns. The challenges include the computational burden of the sensitivity-type calculations required to construct Jacobian operators for the core simulator models. Also, the computational burden of the uncertainty-type calculations required to estimate the uncertainty information of core simulator input data presents a demanding challenge. The concerns, however, are mainly related to the reliability of the adjusted input data. The methodologies of adaptive simulation are well established in the literature of data adjustment. We adopt the same general framework for data adjustment; however, we refrain from solving the fundamental adjustment equations in a conventional manner. We demonstrate the use of our so-called Efficient Subspace Methods (ESMs) to overcome the computational and storage burdens associated with the core adaptation problem. We illustrate the successful use of ESM-based adaptive techniques for a typical boiling water reactor core simulator adaptation problem

  3. Inverse planning of energy-modulated electron beams in radiotherapy

    International Nuclear Information System (INIS)

    Gentry, John R.; Steeves, Richard; Paliwal, Bhudatt A.

    2006-01-01

    The use of megavoltage electron beams often poses a clinical challenge in that the planning target volume (PTV) is anterior to other radiosensitive structures and has variable depth. Ensuring that the skin as well as the deepest extent of the PTV receives the prescribed dose entails prescribing to a point beyond the depth of peak dose for a single electron energy. This causes dose inhomogeneities and a heightened potential for tissue fibrosis, scarring, and possible soft tissue necrosis. Use of bolus on the skin improves the entrance dose at the cost of decreasing the therapeutic depth that can be treated, while selection of a higher energy to improve dose homogeneity results in increased dose to structures beyond the PTV, as well as enlargement of the volume receiving heightened dose. Measured electron data from a linear accelerator were used as input to create an inverse planning tool employing energy and intensity modulation using bolus (e-IMRT™). Using tools readily available in a radiotherapy department, the application of energy and intensity modulation on the central axis makes it possible to remove hot spots of 115% or more over the depths clinically encountered. The e-IMRT™ algorithm enables the development of patient-specific dose distributions with user-defined positions of peak dose, range, and reduced dose to points beyond the prescription point.
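
    The energy-modulation idea described here, mixing beams of several energies so that the combined depth dose is flat over the target, can be illustrated as a nonnegative least-squares fit. The Gaussian depth-dose curves and the tiny projected-gradient solver below are hypothetical stand-ins for measured linac data and a production optimizer.

```python
import numpy as np

# Hypothetical depth-dose curves for three electron energies, crudely modeled
# as Gaussians peaking at increasing depths -- stand-ins for measured linac
# data, not real electron beam models.
depths = np.linspace(0.0, 6.0, 61)                  # depth in cm
peaks, widths = [1.5, 2.5, 3.5], [1.0, 1.2, 1.4]
D = np.column_stack([np.exp(-((depths - p) / w) ** 2)
                     for p, w in zip(peaks, widths)])

def nnls_pg(A, b, iters=2000):
    """Tiny projected-gradient solver for min ||Aw - b||^2 with w >= 0:
    finds the nonnegative energy weights that best flatten the dose."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L, L = ||A||_2^2
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        w = np.clip(w - step * (A.T @ (A @ w - b)), 0.0, None)
    return w

target = np.ones_like(depths)       # flat relative dose over the PTV depths
w = nnls_pg(D, target)              # nonnegative energy weights
combined = D @ w                    # energy-modulated depth-dose curve
```

    The weighted combination evens out the peaks that any single energy would produce, which is the same mechanism the e-IMRT™ tool exploits on the central axis.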

  4. Inverse treatment planning using volume-based objective functions

    Science.gov (United States)

    Bednarz, Greg; Michalski, Darek; Anne, Pramila R.; Valicenti, Richard K.

    2004-06-01

    The results of optimization of inverse treatment plans depend on the choice of objective function. Even when the optimal solution for a given cost function can be obtained, a better solution may exist for a given clinical scenario, and it could be obtained with a revised objective function. In the approach presented in this work, mixed-integer programming was used to introduce a new volume-based objective function, which allowed for minimization of the number of under- or overdosed voxels in selected structures. By selecting and prioritizing components of this function the user could drive the computations towards the desired solution. This optimization approach was tested using cases of patients treated for prostate and oropharyngeal cancer. Initial solutions were obtained based on minimization/maximization of the dose to critical structures and targets. Subsequently, the volume-based objective functions were used to locate solutions that better satisfied clinical objectives particular to each case. For prostate cases, these additional solutions offered further improvements in sparing of the rectum or the bladder. For oropharyngeal cases, families of solutions were obtained satisfying an intensity-modulated radiation therapy protocol for this disease site, while offering significant improvement in the sparing of selected critical structures, e.g., the parotid glands. An additional advantage of the present approach was in providing a convenient mechanism to test the feasibility of dose-volume histogram constraints.
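
    The volume-based objective, i.e. the count of under- and overdosed voxels that the mixed-integer program minimizes via one binary indicator per voxel, can be evaluated directly. A minimal sketch with hypothetical dose values:

```python
import numpy as np

def violation_counts(dose, d_min=None, d_max=None):
    """Count under- and overdosed voxels in a structure -- the quantity the
    mixed-integer formulation minimizes via one binary indicator per voxel."""
    dose = np.asarray(dose, dtype=float)
    under = int(np.sum(dose < d_min)) if d_min is not None else 0
    over = int(np.sum(dose > d_max)) if d_max is not None else 0
    return under, over

# Hypothetical target voxel doses (Gy) against a 70-77 Gy prescription window:
dose = [68.0, 71.5, 74.0, 78.2, 76.9, 69.9]
u, o = violation_counts(dose, d_min=70.0, d_max=77.0)   # -> (2, 1)
```

    Minimizing these counts (rather than a squared-deviation surrogate) is what makes the formulation integer-valued and hence a mixed-integer program.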

  5. Rapid Evaluation of Particle Properties using Inverse SEM Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bekar, Kursat B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Thomas Martin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patton, Bruce W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Weber, Charles F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-01-01

    This report is the final deliverable of a 3-year project whose purpose was to investigate the possibility of using simulations of X-ray spectra generated inside a scanning electron microscope (SEM) as a means to perform quantitative analysis of the sample imaged in the SEM via an inverse analysis methodology. On the nine-point Technology Readiness Level (TRL) scale typically used by the US Department of Defense (DOD) and the National Aeronautics and Space Administration (NASA), this concept is now at TRL 3. In other words, this work has proven the feasibility of the concept, which is now ready for further investigation to address some of the issues highlighted by this initial proof of concept.

  6. Effect of objective function on multi-objective inverse planning of radiation therapy

    International Nuclear Information System (INIS)

    Li Guoli; Wu Yican; Song Gang; Wang Shifang

    2006-01-01

    There are two kinds of objective functions in radiotherapy inverse planning: dose distribution-based and dose-volume histogram (DVH)-based functions. Treatment planning today is still a trial-and-error process because the multi-objective problem is solved by transforming it into a single-objective problem using a specific set of weights for each objective. This work investigates the problem of objective function setting based on Pareto multi-objective optimization theory, and compares the effect of these two kinds of objective functions on multi-objective inverse planning, including calculation time, convergence speed, etc. The basis of objective function setting in inverse planning is discussed. (authors)
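
    The weighted-sum scalarization that makes such planning a trial-and-error process can be sketched as follows; the structures, prescriptions, and weights are illustrative only.

```python
import numpy as np

def composite_objective(dose, prescriptions, weights):
    """Weighted-sum scalarization of the multi-objective planning problem:
    each structure contributes a mean squared deviation from its prescribed
    dose, and the chosen weights pick out one point on the Pareto front."""
    return sum(w * np.mean((dose[name] - prescriptions[name]) ** 2)
               for name, w in weights.items())

# Hypothetical two-structure case: the PTV wants 70 Gy, the OAR wants 0 Gy.
dose = {"ptv": np.array([69.0, 70.5, 71.0]), "oar": np.array([20.0, 25.0])}
presc = {"ptv": 70.0, "oar": 0.0}
# Re-weighting the OAR term trades PTV homogeneity against OAR sparing:
f_low = composite_objective(dose, presc, {"ptv": 1.0, "oar": 0.1})   # -> 52.0
f_high = composite_objective(dose, presc, {"ptv": 1.0, "oar": 1.0})  # -> 513.25
```

    Each weight choice yields a different single-objective problem; a Pareto approach instead characterizes the whole family of trade-off solutions at once.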

  7. Inverse Flush Air Data System (FADS) for Real Time Simulations

    Science.gov (United States)

    Madhavanpillai, Jayakumar; Dhoaya, Jayanta; Balakrishnan, Vidya Saraswathi; Narayanan, Remesh; Chacko, Finitha Kallely; Narayanan, Shyam Mohan

    2017-12-01

    The Flush Air Data Sensing System (FADS) forms a mission-critical subsystem in future reentry vehicles. FADS makes use of surface pressure measurements from the nose cap of the vehicle to derive air data parameters such as angle of attack, angle of sideslip, and Mach number. These parameters are used by the flight control and guidance systems and also assist in overall mission management. The FADS under consideration in this paper makes use of nine pressure ports located in the nose cap of a technology demonstrator vehicle. In flight, the air data parameters are obtained from the FADS estimation algorithm using the pressure data at the nine ports. These pressure data, however, are not available for testing the FADS package during ground simulation. An inverse counterpart to FADS was therefore developed, which estimates the pressures at the ports for a given flight condition; these then serve as input to the FADS package during ground simulation. The software was run to generate the pressure data for the descent-phase trajectory of the technology demonstrator, and these data were used in turn to generate the air data parameters from the FADS algorithm. The computed results from the FADS algorithm match the trajectory data well.
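
    The forward (air-data-to-pressure) direction such an inverse tool must implement is commonly written with the standard FADS flow-incidence model. The nine-port layout and the calibration parameter eps below are hypothetical, not the actual demonstrator geometry.

```python
import numpy as np

# Standard FADS flow-incidence pressure model:
#   p_i = q_c * (cos^2 th_i + eps * sin^2 th_i) + p_inf,
# where th_i is the angle between the flow vector and the normal of port i.
# The port layout (cone angles lam, clock angles phi) is hypothetical.
lam = np.radians([0, 20, 20, 20, 20, 40, 40, 40, 40])    # cone angles
phi = np.radians([0, 0, 90, 180, 270, 0, 90, 180, 270])  # clock angles

def port_pressures(alpha, beta, q_c, p_inf, eps=0.15):
    """Forward model (the 'inverse' of FADS): air data -> nine port pressures."""
    cth = (np.cos(alpha) * np.cos(beta) * np.cos(lam)
           + np.sin(beta) * np.sin(phi) * np.sin(lam)
           + np.sin(alpha) * np.cos(beta) * np.cos(phi) * np.sin(lam))
    return q_c * (cth ** 2 + eps * (1.0 - cth ** 2)) + p_inf

# At zero incidence the stagnation port (lam = 0) reads q_c + p_inf.
p = port_pressures(alpha=0.0, beta=0.0, q_c=5000.0, p_inf=101325.0)
```

    Feeding such synthetic port pressures into the FADS estimation algorithm and recovering the original air data parameters is exactly the ground-simulation round trip the record describes.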

  8. Theory and Simulation of an Inverse Free Electron Laser Experiment

    Science.gov (United States)

    Guo, S. K.; Bhattacharjee, A.; Fang, J. M.; Marshall, T. C.

    1996-11-01

    An experimental demonstration of the acceleration of electrons using a high-power CO2 laser in an inverse free electron laser (IFEL) is underway at Brookhaven National Laboratory. This experiment has generated data that we are attempting to simulate. Included in our studies are such effects as: a low-loss metallic waveguide with a dielectric coating on the walls; multi-mode coupling due to self-consistent interaction between the electrons and the optical wave; space charge (which is significant at lower laser power); energy spread of the electrons; arbitrary wiggler field profile; and slippage. Two types of wiggler profile have been considered: a linear taper of the period and a step taper of the period (the period is ~3 cm, the field is ~1 T, and the wiggler length is 47 cm). The energy increment of the electrons (~1-2%) is analyzed in detail as a function of laser power, wiggler parameters, and the initial beam energy (40 MeV). For laser power ~0.5 GW, the predictions of the simulations are in good accord with experimental results. A matter currently under study is the discrepancy between theory and observations for the electron energy distribution observed at the end of the IFEL. This work is supported by the Department of Energy.

  9. Functional avoidance of lung in plan optimization with an aperture-based inverse planning system

    International Nuclear Information System (INIS)

    St-Hilaire, Jason; Lavoie, Caroline; Dagnault, Anne; Beaulieu, Frederic; Morin, Francis; Beaulieu, Luc; Tremblay, Daniel

    2011-01-01

    Purpose: To implement SPECT-based optimization in an anatomy-based aperture inverse planning system for the functional avoidance of lung in thoracic irradiation. Material and methods: SPECT information was introduced as a voxel-by-voxel modulation of lung importance factors, proportional to the local perfusion count. Fifteen cases of lung cancer were retrospectively analyzed by generating angle-optimized non-coplanar plans, comparing a purely anatomical approach and our functional approach. Planning target volume coverage and lung sparing were compared, with statistical significance assessed by a Wilcoxon matched-pairs test. Results: For similar target coverage, the perfusion-weighted volume receiving 10 Gy was reduced by a median of 2.2% (p = 0.022) and the mean perfusion-weighted lung dose by a median of 0.9 Gy (p = 0.001). A separate analysis of patients with localized or non-uniform hypoperfusion could not determine which group would benefit more from SPECT-based treatment planning. Redirection of dose sometimes created overdosage regions in the target volume. Plans consisted of a similar number of segments and monitor units. Conclusions: Angle optimization and SPECT-based modulation of importance factors allowed for functional avoidance of the lung while preserving target coverage. The technique could also be applied to implement PET-based modulation inside the target volume, leading to a safer dose escalation.
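
    The voxel-by-voxel modulation described here, and the perfusion-weighted dose metric reported in the results, can both be sketched in a few lines; the voxel doses and SPECT counts below are hypothetical.

```python
import numpy as np

def perfusion_weights(counts, base=1.0):
    """Voxel importance factors modulated in proportion to the local SPECT
    perfusion count (normalized to the maximum count)."""
    counts = np.asarray(counts, dtype=float)
    return base * counts / counts.max()

def perfusion_weighted_mean_dose(dose, counts):
    """Perfusion-weighted mean lung dose: well-perfused voxels count more."""
    dose, counts = np.asarray(dose, float), np.asarray(counts, float)
    return float(np.sum(counts * dose) / np.sum(counts))

# Hypothetical 4-voxel lung region: planned dose (Gy) and SPECT counts.
dose = [5.0, 10.0, 20.0, 8.0]
counts = [100.0, 400.0, 50.0, 450.0]
mld = perfusion_weighted_mean_dose(dose, counts)   # -> 9.1 Gy
```

    With these weights the optimizer is penalized most for dose deposited in well-perfused lung, which is how dose gets redirected toward hypoperfused regions.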

  10. Inverse planning in brachytherapy from radium to high rate 192 iridium afterloading

    International Nuclear Information System (INIS)

    Lahanas, M.; Mould, R.F.; Baltas, D.; Karauzakis, K.; Giannouli, S.; Baltas, D.

    2004-01-01

    We consider the inverse planning problem in brachytherapy, i.e., the problem of determining an optimal number of catheters, the number of sources for low-dose-rate (LDR) brachytherapy, and the optimal dwell times for high-dose-rate (HDR) brachytherapy necessary to obtain as optimal a dose distribution as possible. Starting from the 1930s, inverse planning for LDR brachytherapy used geometrically derived rules to determine the optimal placement of sources in order to achieve a uniform dose distribution of a specific level in planes, spheres and cylinders. Rules and nomograms were derived which are still widely used. With the rapid development of 3D imaging technologies and rapidly increasing computer power, we have now entered the new era of computer-based inverse planning in brachytherapy. Inverse planning is now an optimisation process adapted to the individual geometry of the patient. New inverse planning optimisation algorithms are anatomy-based: they consider the real anatomy of the tumour and the organs at risk (OAR). Computer-based inverse planning considers effects such as the stability of solutions under seed misplacement, which could never be solved analytically without gross simplifications. In the last few years, multiobjective (MO) inverse planning algorithms have been developed which recognise the MO optimisation problem inherent in inverse planning in brachytherapy. Previous methods used a trial-and-error approach to obtain a satisfactory solution; MO optimisation replaces this by presenting a representative set of the dose distributions that can be obtained. With MO optimisation it is possible to obtain information that can be used to determine the optimum number of catheters, their positions, and the optimum distribution of dwell times for HDR brachytherapy. For LDR brachytherapy, the stability of solutions with respect to seed migration can also be improved. A spectrum of alternative solutions is available and the treatment planner

  11. Alaska Simulator - A Journey to Planning

    Science.gov (United States)

    Weber, Barbara; Pinggera, Jakob; Zugal, Stefan; Wild, Werner

    The Alaska Simulator is an interactive software tool developed at the University of Innsbruck which allows people to test, analyze and improve their own planning behavior. In addition, the Alaska Simulator can be used for studying research questions in the context of software project management and other related fields. The simulator uses a journey as a metaphor for planning a software project. In the context of software project management, it can be used to compare traditional, plan-driven project management methods with more agile approaches. Instead of pre-planning everything in advance, agile approaches spread planning activities throughout the project and provide mechanisms for effectively dealing with uncertainty. The biggest challenge is to find the right balance between pre-planning activities and keeping options open. The Alaska Simulator allows users to explore how much planning is needed under different circumstances.

  12. Fusion Simulation Program Execution Plan

    International Nuclear Information System (INIS)

    Brooks, Jeffrey

    2011-01-01

    The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities [1]. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling, including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions cannot be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to those needs. 
The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics and interactions with the plasma wall

  13. Prostate Dose Escalation by Innovative Inverse Planning-Driven IMRT

    National Research Council Canada - National Science Library

    Xing, Lei

    2005-01-01

    .... Because of the tacit ignorance of intra-structural tradeoff, the IMRT plans generated by these systems for prostate treatment are, at best, sub-optimal and our endeavor of providing the best possible...

  14. Prostate Dose Escalation by an Innovative Inverse Planning-Driven IMRT

    National Research Council Canada - National Science Library

    Xing, Lei

    2008-01-01

    ...) Developed a voxel-specific penalty scheme for TRV-based inverse planning; (iv) Established a cine-EPID image retrospective dose reconstruction in IMRT dose delivery for adaptive planning and IMRT dose verification. These works are both timely and important and should lead to widespread impact on prostate cancer management.

  15. Poster - 33: Dosimetry Comparison of Prone Breast Forward and Inverse Treatment planning considering daily setup variations

    International Nuclear Information System (INIS)

    Jiang, Runqing; Zhan, Lixin; Osei, Ernest

    2016-01-01

    Introduction: The purpose of this study is to investigate the effects of daily setup variations on prone breast forward field-in-field (FinF) and inverse IMRT treatment planning. Methods: A Rando phantom (left breast) and a Pixy phantom (right breast) were prepared and CT scanned in the prone position. Treatment planning (TP) was performed in the Eclipse TP system. A forward FinF plan and an inverse IMRT plan were created to satisfy the CTV coverage and OAR criteria. The daily setup variations were assumed to be 5 mm in the left-right, superior-inferior, and anterior-posterior directions. The DVHs of CTV coverage and OARs under the 5 mm setup variation were compared for the forward FinF and inverse IMRT plans. Results and Discussion: The CTV coverage DVHs showed only small variations under the 5 mm setup shifts for both plans on both phantoms. However, for setup variations in the left-right direction, the CTV coverage DVH of the IMRT plan degraded the most on both phantoms. For anterior-posterior variation, the CTV could not obtain full coverage when the breast chest wall is shallow; however, with the guidance of MV imaging, the chest wall position is checked during setup. Setup variations therefore affect the inverse IMRT plan more than the forward FinF plan, especially in the left-right direction. Conclusions: Considering daily setup variations, the forward FinF plan is recommended clinically.

  16. Bead Game Simulation. Lesson Plan.

    Science.gov (United States)

    Ripp, Ken

    This lesson plan offers students the opportunity to participate in the three basic economic systems (market, command, and tradition). By working in each of the systems, students will internalize the fundamental values present in each system and will gain insights into the basic advantages and disadvantages of each system. The lesson plan provides…

  17. Sparsity constrained split feasibility for dose-volume constraints in inverse planning of intensity-modulated photon or proton therapy

    Science.gov (United States)

    Penfold, Scott; Zalas, Rafał; Casiraghi, Margherita; Brooke, Mark; Censor, Yair; Schulte, Reinhard

    2017-05-01

    A split feasibility formulation for the inverse problem of intensity-modulated radiation therapy treatment planning with dose-volume constraints included in the planning algorithm is presented. It involves a new type of sparsity constraint that enables the inclusion of a percentage-violation constraint in the model problem and its handling by continuous (as opposed to integer) methods. We propose an iterative algorithmic framework for solving such a problem by applying the feasibility-seeking CQ-algorithm of Byrne combined with the automatic relaxation method that uses cyclic projections. Detailed implementation instructions are furnished. Functionality of the algorithm was demonstrated through the creation of an intensity-modulated proton therapy plan for a simple 2D C-shaped geometry and also for a realistic base-of-skull chordoma treatment site. Monte Carlo simulations of proton pencil beams of varying energy were conducted to obtain dose distributions for the 2D test case. A research release of the Pinnacle³ proton treatment planning system was used to extract pencil beam doses for a clinical base-of-skull chordoma case. In both cases the beamlet doses were calculated to satisfy dose-volume constraints according to our new algorithm. Examination of the dose-volume histograms following inverse planning with our algorithm demonstrated that it performed as intended. The application of our proposed algorithm to dose-volume constraint inverse planning was successfully demonstrated. Comparison with optimized dose distributions from the research release of the Pinnacle³ treatment planning system showed the algorithm could achieve equivalent or superior results.
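
    Byrne's CQ iteration, which this framework builds on, takes the form x ← P_C(x + γ Aᵀ(P_Q(Ax) − Ax)) with step size γ in (0, 2/||A||²). A minimal sketch on a toy two-beamlet problem follows; the sparsity and percentage-violation machinery of the paper is omitted, and the matrices and bounds are illustrative.

```python
import numpy as np

def cq_iterate(A, x0, proj_C, proj_Q, iters=500):
    """Byrne's CQ algorithm for the split feasibility problem: find x in C
    with Ax in Q, iterating x <- P_C(x + gamma * A^T (P_Q(Ax) - Ax))."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2    # in the valid (0, 2/||A||^2)
    x = proj_C(np.asarray(x0, dtype=float))
    for _ in range(iters):
        y = A @ x
        x = proj_C(x + gamma * (A.T @ (proj_Q(y) - y)))
    return x

# Toy instance: beamlet intensities x must be nonnegative (the set C) and
# the resulting "doses" Ax must lie in prescribed windows (the set Q).
A = np.array([[1.0, 0.5], [0.2, 1.0]])
lo, hi = np.array([1.0, 0.5]), np.array([2.0, 1.5])
proj_C = lambda x: np.clip(x, 0.0, None)       # nonnegativity constraint
proj_Q = lambda y: np.clip(y, lo, hi)          # dose-window constraint
x = cq_iterate(A, np.zeros(2), proj_C, proj_Q)
dose = A @ x                                   # lands inside [lo, hi]
```

    Because only projections onto C and Q are needed, the iteration scales to the very large beamlet-dose matrices of clinical inverse planning.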

  18. Evaluation of an artificial intelligence guided inverse planning system: Clinical case study

    International Nuclear Information System (INIS)

    Yan Hui; Yin Fangfang; Willett, Christopher

    2007-01-01

    Purpose: An artificial intelligence (AI) guided method for parameter adjustment of inverse planning was implemented on a commercial inverse treatment planning system. For evaluation purposes, four typical clinical cases were tested and the results of plans achieved by the automated and manual methods were compared. Methods and materials: The procedure of parameter adjustment consists of three major loops. Each loop is in charge of modifying parameters of one category, which is carried out by a specially customized fuzzy inference system. Physician-prescribed multiple constraints for a selected volume were adopted to account for the tradeoff between prescription dose to the PTV and dose-volume constraints for critical organs. The search for an optimal parameter combination began with the first constraint and proceeded to the next until a plan with acceptable dose was achieved. The initial setup of the plan parameters was the same for each case and was adjusted independently by both the manual and automated methods. After the parameters of one category were updated, the intensity maps of all fields were re-optimized and the plan dose was subsequently re-calculated. When the final plan was reached, dose statistics were calculated for both plans and compared. Results: For the planning target volume (PTV), the dose to 95% of the volume was up to 10% higher in plans using the automated method than in those using the manual method. For critical organs, an average decrease of the plan dose was achieved. However, the automated method could not improve the plan dose for some critical organs due to limitations of the inference rules currently employed. For normal tissue, there was no significant difference between plan doses achieved by the automated and manual methods. 
Conclusion: With the application of the AI-guided method, the basic parameter adjustment task can be accomplished automatically, and a plan dose comparable to that achieved by the manual method was obtained

  19. A comparison of forward and inverse treatment planning for intensity-modulated radiotherapy of head and neck cancer

    International Nuclear Information System (INIS)

    Baer, Werner; Schwarz, Marco; Alber, Markus; Bos, Luc J.; Mijnheer, Ben J.; Rasch, Coen; Schneider, Christoph; Nuesslin, Fridtjof; Damen, Eugene M.F.

    2003-01-01

    Background and purpose: To compare intensity-modulated treatment plans for patients with head and neck cancer generated by forward and inverse planning. Materials and methods: Ten intensity-modulated treatment plans, planned and treated with a step-and-shoot technique using a forward planning approach, were retrospectively re-planned with an inverse planning algorithm. For this purpose, two strategies were applied: first, inverse planning was performed with the same beam directions as forward planning; in addition, nine equidistant, coplanar incidences were used. The main objective of the optimisation process was sparing of the parotid glands besides adequate treatment of the planning target volume (PTV). Inverse planning was performed both with pencil beam and Monte Carlo dose computation to investigate the influence of dose computation on the result of the optimisation. Results: In most cases, both inverse planning strategies managed to improve the treatment plans distinctly, through better target coverage, better sparing of the parotid glands, or both. A reduction of the mean dose by 3-11 Gy for at least one of the parotid glands could be achieved for most of the patients. For three patients, inverse planning made it possible to spare a parotid gland that had to be sacrificed in forward planning. Inverse planning increased the number of segments compared to forward planning by a factor of about 3, from 9-15 to 27-46. No significant differences for the PTV and parotid glands between the two inverse planning approaches were found. The use of Monte Carlo instead of pencil beam dose computation also did not influence the results significantly. Conclusion: The results demonstrate the potential of inverse planning to improve intensity-modulated treatment plans for head and neck cases compared to forward planning, while retaining clinical utility in terms of treatment time and quality assurance.

  20. A gEUD-based inverse planning technique for HDR prostate brachytherapy: Feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Giantsoudi, D. [Department of Radiological Sciences, University of Texas Health Sciences Center, San Antonio, Texas 78229 (United States); Department of Radiation Oncology, Francis H. Burr Proton Therapy Center, Boston, Massachusetts 02114 (United States); Baltas, D. [Department of Medical Physics and Engineering, Strahlenklinik, Klinikum Offenbach GmbH, 63069 Offenbach (Germany); Nuclear and Particle Physics Section, Physics Department, University of Athens, 15701 Athens (Greece); Karabis, A. [Pi-Medical Ltd., Athens 10676 (Greece); Mavroidis, P. [Department of Radiological Sciences, University of Texas Health Sciences Center, San Antonio, Texas 78299 and Department of Medical Radiation Physics, Karolinska Institutet and Stockholm University, 17176 (Sweden); Zamboglou, N.; Tselis, N. [Strahlenklinik, Klinikum Offenbach GmbH, 63069 Offenbach (Germany); Shi, C. [St. Vincent's Medical Center, 2800 Main Street, Bridgeport, Connecticut 06606 (United States); Papanikolaou, N. [Department of Radiological Sciences, University of Texas Health Sciences Center, San Antonio, Texas 78299 (United States)

    2013-04-15

    Purpose: The purpose of this work was to study the feasibility of a new inverse planning technique based on the generalized equivalent uniform dose (gEUD) for image-guided high-dose-rate (HDR) prostate cancer brachytherapy, in comparison to conventional dose-volume based optimization. Methods: The quality of 12 clinical HDR brachytherapy implants for prostate utilizing HIPO (Hybrid Inverse Planning Optimization) is compared with alternative plans produced through inverse planning using the gEUD. All the common dose-volume indices for the prostate and the organs at risk were considered together with radiobiological measures. The clinical effectiveness of the different dose distributions was investigated by comparing dose-volume histogram and gEUD evaluators. Results: Our results demonstrate the feasibility of gEUD-based inverse planning in HDR brachytherapy implants for prostate. A statistically significant decrease in D10 and/or final gEUD values for the organs at risk (urethra, bladder, and rectum) was found while improving dose homogeneity or dose conformity of the target volume. Conclusions: Following the promising results of gEUD-based optimization in intensity-modulated radiation therapy treatment optimization, as reported in the literature, the implementation of a similar model in HDR brachytherapy treatment plan optimization is suggested by this study. The potential for improved sparing of organs at risk was shown for various gEUD-based optimization parameter protocols, which indicates the ability of this method to adapt to the user's preferences.
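
    The gEUD itself is a one-line formula, gEUD = (mean(d_i^a))^(1/a); a small sketch with hypothetical voxel doses shows how the parameter a shifts the measure between mean-dose and maximum-dose behavior.

```python
import numpy as np

def gEUD(dose, a):
    """Generalized equivalent uniform dose of a structure:
    gEUD = ( mean(d_i^a) )^(1/a).  a = 1 gives the mean dose; large
    positive a moves toward the maximum dose (serial organs at risk);
    negative a emphasizes cold spots (targets)."""
    d = np.asarray(dose, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))

# Hypothetical organ-at-risk voxel doses (Gy):
d = [10.0, 20.0, 40.0, 55.0]
mean_dose = gEUD(d, 1.0)      # -> 31.25 (simple mean)
serial_like = gEUD(d, 8.0)    # pulled toward the 55 Gy hot spot
```

    Optimizing on gEUD values per structure, rather than on many separate dose-volume points, is what gives this class of objectives its compactness.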

  1. Communication Systems Simulation Laboratory (CSSL): Simulation Planning Guide

    Science.gov (United States)

    Schlesinger, Adam

    2012-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CSSL. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  2. Inverse Planned High-Dose-Rate Brachytherapy for Locoregionally Advanced Cervical Cancer: 4-Year Outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Tinkle, Christopher L.; Weinberg, Vivian [Department of Radiation Oncology, University of California, San Francisco, California (United States); Chen, Lee-May [Department of Obstetrics, Gynecology, and Reproductive Sciences, University of California, San Francisco, California (United States); Littell, Ramey [Gynecologic Oncology, The Permanente Medical Group, San Francisco, California (United States); Cunha, J. Adam M.; Sethi, Rajni A. [Department of Radiation Oncology, University of California, San Francisco, California (United States); Chan, John K. [Gynecologic Oncology, California Pacific Medical Center, San Francisco, California (United States); Hsu, I-Chow, E-mail: ichow.hsu@ucsf.edu [Department of Radiation Oncology, University of California, San Francisco, California (United States)

    2015-08-01

    Purpose: Evaluate the efficacy and toxicity of image guided brachytherapy using inverse planning simulated annealing (IPSA) high-dose-rate brachytherapy (HDRB) boost for locoregionally advanced cervical cancer. Methods and Materials: From December 2003 through September 2009, 111 patients with primary cervical cancer were treated definitively with IPSA-planned HDRB boost (28 Gy in 4 fractions) after external radiation at our institution. We performed a retrospective review of our experience using image guided brachytherapy. Of the patients, 70% had a tumor size >4 cm, 38% had regional nodal disease, and 15% had clinically evident distant metastasis, including nonregional nodal disease, at the time of diagnosis. Surgical staging involving pelvic lymph node dissection was performed in 15% of patients, and 93% received concurrent cisplatin-based chemotherapy. Toxicities are reported according to the Common Terminology Criteria for Adverse Events version 4.0 guidelines. Results: With a median follow-up time of 42 months (range, 3-84 months), no acute or late toxicities of grade 4 or higher were observed, and grade 3 toxicities (both acute and late) developed in 8 patients (1 constitutional, 1 hematologic, 2 genitourinary, 4 gastrointestinal). The 4-year Kaplan-Meier estimate of late grade 3 toxicity was 8%. Local recurrence developed in 5 patients (4 to 9 months after HDRB), regional recurrence in 3 (6, 16, and 72 months after HDRB), and locoregional recurrence in 1 (4 months after HDR boost). The 4-year estimates of local, locoregional, and distant control of disease were 94.0%, 91.9%, and 69.1%, respectively. The overall and disease-free survival rates at 4 years were 64.3% (95% confidence interval [CI] of 54%-73%) and 61.0% (95% CI, 51%-70%), respectively. Conclusions: Definitive radiation by use of inverse planned HDRB boost for locoregionally advanced cervical cancer is well tolerated and achieves excellent local control of disease. However, overall

  3. invertFREGENE: software for simulating inversions in population genetic data.

    Science.gov (United States)

    O'Reilly, Paul F; Coin, Lachlan J M; Hoggart, Clive J

    2010-03-15

    Inversions are a common form of structural variation and may have a marked effect both on the genome and on methods that infer quantities of interest, such as those relating to population structure and natural selection. However, because inversions are difficult to detect, little is presently known about their impact. Software to simulate inversions could provide a better understanding of how to detect and account for them; but while there are several software packages for simulating population genetic data, none incorporates inversion polymorphisms. Here, we describe a software package, modified from the forward-in-time simulator FREGENE, which simulates the evolution of an inversion polymorphism of specified length, location, frequency and age in a population of sequences. We describe previously unreported signatures of inversions in SNP data, observed both in invertFREGENE results and in a known inversion in humans. Availability: C++ source code and user manual are available for download from http://www.ebi.ac.uk/projects/BARGEN/ under the GPL licence. Contact: l.coin@ic.ac.uk; c.hoggart@ic.ac.uk. Supplementary data are available at Bioinformatics online.
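As a hedged illustration of the core data manipulation (not the FREGENE model itself, which also handles recombination, selection and polymorphism age), the sketch below reverses a marker segment in a fraction of simulated haplotypes; all function names and parameters are hypothetical:

```python
import random

def apply_inversion(haplotype, start, end):
    """Return a copy of the haplotype with the segment [start, end) reversed,
    mimicking how an inversion rearranges marker order on a carrier chromosome."""
    return haplotype[:start] + haplotype[start:end][::-1] + haplotype[end:]

def simulate_carriers(n_haplotypes=10, n_sites=20, inv_start=5, inv_end=15,
                      inv_freq=0.3, seed=1):
    """Draw random biallelic haplotypes and reverse the inversion segment in
    roughly a fraction inv_freq of them, labelling each as carrier or not."""
    rng = random.Random(seed)
    pop = []
    for _ in range(n_haplotypes):
        hap = [rng.randint(0, 1) for _ in range(n_sites)]
        carrier = rng.random() < inv_freq
        if carrier:
            hap = apply_inversion(hap, inv_start, inv_end)
        pop.append((carrier, hap))
    return pop
```

Note that this toy setup only relabels marker order; the package's forward-in-time dynamics are not reproduced here.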

  4. Power system restoration: planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hazarika, D. [Assam Engineering Coll., Dept. of Electrical Engineering, Assam (India); Sinha, A.K. [Indian Inst. of Technology, Dept. of Electrical Engineering, Kharagpur (India)

    2003-03-01

    This paper describes a restoration guidance simulator that allows a power system operator/planner to simulate and plan restoration events in an interactive mode. The simulator provides a list of restoration events prioritized according to restoration rules and a list of priority loads. It also provides, interactively, the list of events that become possible as the system grows during restoration. Further, each selected event is validated through a load flow and other analytical tools to show the consequences of implementing the planned event. (Author)
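The priority-and-prerequisite logic described above can be sketched as follows; this is a toy stand-in, not the simulator's actual rule base, and the event tuples are hypothetical:

```python
def plan_restoration(events):
    """Order restoration events by priority while honouring prerequisites: at
    each step, restore the highest-priority event (lowest number) among those
    whose prerequisites are already energized. New events become possible as
    the restored system grows, as in the simulator described above."""
    pending = list(events)            # each event: (priority, name, prerequisites)
    restored, order = set(), []
    while pending:
        ready = [e for e in pending if set(e[2]) <= restored]
        if not ready:
            break                     # remaining events are blocked
        ev = min(ready)               # lowest priority number goes first
        pending.remove(ev)
        restored.add(ev[1])
        order.append(ev[1])
    return order
```

For example, a load fed through a transformer cannot be picked up until its generator and transformer are energized, regardless of its priority.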

  5. Comparison of step-and-shoot treatments generated by different inverse planning systems

    International Nuclear Information System (INIS)

    Perez Moreno, J. M.; Zucca Aparicio, D.; Fernandez Leton, P.; Garcia Ruiz-Zorrilla, J.; Minambres Moro, A.

    2011-01-01

    A drawback of IMRT treatments delivered with the static, step-and-shoot technique is the number of segments and monitor units (MU) required, parameters that depend largely on the inverse planning system that generates the treatment. Three commercial planning systems are evaluated; with each one, clinical dosimetry was performed for the same series of patients, and the dosimetric results, calculated MU, and number of segments are compared.

  6. Mixed integer programming improves comprehensibility and plan quality in inverse optimization of prostate HDR Brachytherapy

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.; Hoffmann, A.L.

    2013-01-01

    Current inverse treatment planning methods that optimize both catheter positions and dwell times in prostate HDR brachytherapy use surrogate linear or quadratic objective functions that have no direct interpretation in terms of dose-volume histogram (DVH) criteria, do not result in an optimum or

  7. A study of planning dose constraints for treatment of nasopharyngeal carcinoma using a commercial inverse treatment planning system.

    Science.gov (United States)

    Xia, Ping; Lee, Nancy; Liu, Yu-Ming; Poon, Ian; Weinberg, Vivian; Shin, Edward; Quivey, Jeanne M; Verhey, Lynn J

    2004-07-01

    The purpose of this study was to develop and test planning dose constraint templates for tumor and normal structures in the treatment of nasopharyngeal carcinoma (NPC) using a specific commercial inverse treatment planning system. Planning dose constraint templates were developed based on the analyses of dose-volume histograms (DVHs) of tumor targets and adjacent sensitive structures by clinically approved treatment plans of 9 T1-2 and 16 T3-4 NPC patients treated with inverse planned intensity-modulated radiation therapy (IP-IMRT). DVHs of sensitive structures were analyzed by examining multiple defined endpoints, based on the characteristics of each sensitive structure. For each subgroup of patients with T1-2 and T3-4 NPC, the resulting mean values of these defined endpoint doses were considered as templates for planning dose constraints and subsequently applied to a second group of patients, 5 with T1-2 NPC and 5 with T3-4 NPC. The 10 regenerated plans (called new plans) were compared to the original clinical plans that were used to treat the second group of patients, based on plan conformity index and DVHs. The conformity indices of the new plans were comparable to the original plans with no statistical difference (p = 0.85). Among the serial sensitive structures evaluated, there was a significant decrease with the new plans in the dose to the spinal cord when analyzed by the maximum dose (p = 0.001), doses encompassing 1 cc of the spinal cord volume (p = 0.001) and 3 cc of the spinal cord volume (p = 0.001). There was no significant difference in the mean maximum dose to the brainstem between the new plans and the original plans (p = 0.36). However, a significant difference in the mean maximum dose to the brainstem was seen among the different T-stages (p = 0.04). A decrease with the new plan to the brainstem in the doses encompassing 5% and 10% of the volume was of borderline statistical significance (p = 0.08 and p = 0.06, respectively). There were no
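The DVH endpoints used in this kind of analysis (maximum dose, dose to the hottest 1 cc or 3 cc, dose to the hottest p% of volume) can be computed from a structure's voxel doses roughly as below; this is an illustrative sketch, not the planning system's implementation:

```python
import numpy as np

def dvh_endpoints(dose, voxel_cc, volumes_cc=(1.0, 3.0), percents=(5, 10)):
    """Dose-volume endpoints from one structure's voxel doses (Gy).
    D(v cc): minimum dose received by the hottest v cc of the structure.
    D(p%):   minimum dose received by the hottest p% of its volume."""
    d = np.sort(np.asarray(dose, float))[::-1]       # hottest voxels first
    out = {"Dmax": float(d[0]), "Dmean": float(d.mean())}
    for v in volumes_cc:
        n = max(1, int(round(v / voxel_cc)))         # voxels in v cc
        out[f"D{v}cc"] = float(d[:n].min())
    for p in percents:
        n = max(1, int(round(len(d) * p / 100.0)))   # voxels in p% of volume
        out[f"D{p}%"] = float(d[:n].min())
    return out
```

A comparison such as the one above would then test these endpoint values between the new and original plans across patients.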

  8. Inverse treatment planning for intensity modulated radiation therapy: CDVH treatment prescription with integral cost function

    International Nuclear Information System (INIS)

    Carol, M.P.; Nash, R.; Campbell, R.C.; Huber, R.

    1997-01-01

    Purpose/Objective: Inverse planning is a required approach when dealing with the complexity of variables present in an intensity modulated plan. However, an inverse planning system is only as useful as it is 1) easy to use and 2) predictable in its result. This is especially the case when the target goals and structure limits specified by the user all cannot be achieved. We have previously developed two interfaces for specifying how such conflicts should be resolved when they occur, that, although allowing a range of results to be obtained, still require 'trial and error' on the part of the user and are case dependent. A new method is explored with goals of allowing the desired results to be specified in an intuitive manner and producing predictable results that are case independent. Materials and Methods: Target goals and structure limits are specified by entering partial volume data: goal/limit, % under/over goal/limit, minimum, maximum. This data is converted to a CDVH curve for each target/structure. During the simulated annealing process used to produce an optimized solution, the actual CDVHs are compared to the desired CDVHs after each iteration and a cost is computed for the difference between the curves. For each curve, the cost is proportional to the difference in area between the desired and actual curves. This cost is controlled by three variables: offset (amount of difference before there is any cost), scale (the range the cost can take) and shape (the shape of the curve for difference versus cost). A range of values were explored for these variables in order to determine if predictable trade-offs would be made automatically by the system. The cost function was tested against a range of cases: a highly irregularly shaped intracranial lesion, a head and neck case with three target volumes with different prescriptions, and a prostate cancer. Results: By varying the values assigned to the control variables, a variety of predictable results could be
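A minimal sketch of the described cost term, assuming CDVH curves sampled on a common dose grid. The offset/scale/shape semantics follow the abstract, but the exact functional form used by the authors is not given, so this rectangle-rule version is an assumption:

```python
import numpy as np

def cdvh_cost(actual, desired, bin_width=1.0, offset=0.0, scale=1.0, shape=1.0):
    """Cost for one structure: area between the actual and desired cumulative
    DVH curves, passed through the three control variables from the abstract:
    offset (area difference tolerated at zero cost), scale (range the cost can
    take) and shape (curvature of the difference-to-cost mapping)."""
    a, d = np.asarray(actual, float), np.asarray(desired, float)
    area = float(np.sum(np.abs(a - d)) * bin_width)   # rectangle-rule area
    return scale * max(area - offset, 0.0) ** shape
```

During simulated annealing, the sum of such per-structure costs would be re-evaluated after each iteration, as the abstract describes.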

  9. A Treatment Planning Analysis of Inverse-Planned and Forward-Planned Intensity-Modulated Radiation Therapy in Nasopharyngeal Carcinoma

    International Nuclear Information System (INIS)

    Poon, Ian M; Xia Ping; Weinberg, Vivien; Sultanem, Khalil; Akazawa, Clayton C.; Akazawa, Pamela C.; Verhey, Lynn; Quivey, Jeanne Marie; Lee, Nancy

    2007-01-01

    Purpose: To compare dose-volume histograms of target volumes and organs at risk in 57 patients with nasopharyngeal carcinoma (NPC) treated with inverse-planned (IP) or forward-planned (FP) intensity-modulated radiation therapy (IMRT). Methods and Materials: The DVHs of 57 patients with NPC treated with IMRT with or without chemotherapy were reviewed. Thirty-one patients underwent IP IMRT, and 26 patients underwent FP IMRT. Treatment goals were to prescribe a minimum dose of 66-70 Gy for the gross tumor volume and 59.4 Gy for the planning target volume to greater than 95% of the volume. Multiple selected end points were used to compare dose-volume histograms of the targets, including minimum, mean, and maximum doses and the percentage of target volume receiving less than 90% (1-V90%), less than 95% (1-V95%), and greater than 105% (1-V105%). Dose-volume histograms of organs at risk were evaluated with characteristic end points. Results: Both planning methods provided excellent target coverage with no statistically significant differences found, although a trend was suggested in favor of improved target coverage with IP IMRT in patients with T3/T4 NPC (p = 0.10). Overall, IP IMRT statistically decreased the dose to the parotid gland, temporomandibular joint, brain stem, and spinal cord, whereas IP led to a dose decrease to the middle/inner ear only in the T1/T2 subgroup. Conclusions: Use of IP and FP IMRT can lead to good target coverage while maintaining critical structures within tolerance. IP IMRT selectively spared these critical organs to a greater degree and should be considered the standard of treatment in patients with NPC, particularly those with T3/T4 disease. FP IMRT is an effective second option in centers with limited IP IMRT capacity; as a modification of conformal techniques, the human/departmental resources needed to incorporate FP IMRT should be nominal.

  10. Robotic path-finding in inverse treatment planning for stereotactic radiosurgery with continuous dose delivery

    International Nuclear Information System (INIS)

    Vandewouw, Marlee M.; Aleman, Dionne M.; Jaffray, David A.

    2016-01-01

    Purpose: Continuous dose delivery in radiation therapy treatments has been shown to decrease total treatment time while improving dose conformity and distribution homogeneity over the conventional step-and-shoot approach. The authors develop an inverse treatment planning method for Gamma Knife® Perfexion™ that continuously delivers dose along a path in the target. Methods: The authors' method consists of two steps: find a path within the target, then solve a mixed integer optimization model to find the optimal collimator configurations and durations along the selected path. Robotic path-finding techniques, specifically simultaneous localization and mapping (SLAM) using an extended Kalman filter, are used to obtain a path that travels sufficiently close to selected isocentre locations. SLAM is extended, in a novel way, to explore a 3D discrete environment: the target discretized into voxels. Further novel extensions are incorporated into the steering mechanism to account for target geometry. Results: The SLAM method was tested on seven clinical cases and compared to clinical, Hamiltonian path continuous delivery, and inverse step-and-shoot treatment plans. The SLAM approach improved dose metrics compared to the clinical plans and Hamiltonian path continuous delivery plans. Beam-on times improved over clinical plans and had mixed performance compared to Hamiltonian path continuous plans. The SLAM method is also shown to be robust to path selection inaccuracies, isocentre selection, and dose distribution. Conclusions: The SLAM method for continuous delivery provides decreased total treatment time and increased treatment quality compared to both clinical and inverse step-and-shoot plans, and outperforms existing path methods in treatment quality. It also accounts for uncertainty in treatment planning by accommodating inaccuracies.
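As a greatly simplified stand-in for the paper's EKF-SLAM path-finding (which explores the voxelized target subject to steering constraints), a nearest-neighbour tour of the selected isocentres conveys the basic goal of a short path that travels close to all of them:

```python
import math

def greedy_isocentre_path(start, isocentres):
    """Visit each isocentre via repeated nearest-neighbour steps. Coordinates
    are (x, y, z) voxel indices. This greedy tour is only an illustration of
    the path-finding goal, not the SLAM algorithm from the abstract."""
    path, current = [start], start
    remaining = list(isocentres)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        path.append(nxt)
        current = nxt
    return path
```

The collimator configurations and durations along the resulting path would then be chosen by the mixed integer optimization step.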

  11. Incorporating organ movements in IMRT treatment planning for prostate cancer: Minimizing uncertainties in the inverse planning process

    International Nuclear Information System (INIS)

    Unkelbach, Jan; Oelfke, Uwe

    2005-01-01

    We investigate an off-line strategy to incorporate interfraction organ movements in IMRT treatment planning. Nowadays, imaging modalities located in the treatment room allow for several CT scans of a patient during the course of treatment. These multiple CT scans can be used to estimate a probability distribution of possible patient geometries, which can subsequently be used to calculate the expectation value of the delivered dose distribution. In order to incorporate organ movements into the treatment planning process, it was suggested that inverse planning could be based on that probability distribution of patient geometries instead of a single snapshot. However, it was shown that a straightforward optimization of the expectation value of the dose may be insufficient, since the expected dose distribution is subject to several uncertainties: first, the probability distribution has to be estimated from only a few images; second, the distribution is only sparsely sampled over the treatment course due to the finite number of fractions. In order to obtain a robust treatment plan, these uncertainties should be considered and minimized in the inverse planning process. In the current paper, we calculate a 3D variance distribution in addition to the expectation value of the dose distribution, and the two are simultaneously optimized. The variance is used as a surrogate to quantify the associated risks of a treatment plan. The feasibility of this approach is demonstrated on clinical data of prostate patients. Different scenarios of dose expectation values and corresponding variances are discussed.
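The expectation-plus-variance idea can be sketched as a composite objective over sampled patient geometries; the variance weight and function names below are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def robust_objective(dose_scenarios, prescription, var_weight=0.5):
    """Squared deviation of the expected dose (over sampled patient
    geometries) from the prescription, plus a weighted per-voxel variance
    term used as a surrogate for the risk associated with the plan."""
    D = np.asarray(dose_scenarios, float)     # shape: (n_scenarios, n_voxels)
    mean, var = D.mean(axis=0), D.var(axis=0)
    value = float(np.sum((mean - prescription) ** 2) + var_weight * np.sum(var))
    return value, mean, var
```

An inverse planner would minimize this value over beamlet weights, trading expected-dose fidelity against scenario-to-scenario variability.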

  12. Path planning of master-slave manipulator using graphic simulator

    International Nuclear Information System (INIS)

    Lee, J. Y.; Kim, S. H.; Song, T. K.; Park, B. S.; Yoon, J. S.

    2002-01-01

    To handle high-level radioactive materials such as spent fuel remotely, the master-slave manipulator is generally used as remote handling equipment in the hot cell. To analyze its motion and to implement a training system based on virtual reality technology, a computer-graphics simulator for the M-S manipulator was developed. The parts are modelled in 3-D graphics, assembled, and assigned kinematics. The inverse kinematics of the manipulator is defined, and the slave is coupled with the master according to the manipulator's specification. In addition, a virtual work cell is implemented in a graphical environment identical to the real one, and a path planning method using collision detection for the manipulator is proposed. This graphic simulator can be used effectively in designing maintenance processes for hot cell equipment and enhances the reliability of spent fuel management.

  13. Dose-volume and biological-model based comparison between helical tomotherapy and (inverse-planned) IMAT for prostate tumours

    International Nuclear Information System (INIS)

    Iori, Mauro; Cattaneo, Giovanni Mauro; Cagni, Elisabetta; Fiorino, Claudio; Borasi, Gianni; Riccardo, Calandrino; Iotti, Cinzia; Fazio, Ferruccio; Nahum, Alan E.

    2008-01-01

    Background and purpose: Helical tomotherapy (HT) and intensity-modulated arc therapy (IMAT) are two arc-based approaches to the delivery of intensity-modulated radiotherapy (IMRT). Through plan comparisons we have investigated the potential of IMAT, both with constant (conventional, or IMAT-C) and variable (non-conventional, or IMAT-NC, a theoretical exercise) dose-rate, to serve as an alternative to helical tomotherapy. Materials and methods: Six patients with prostate tumours treated by HT with a moderately hypo-fractionated protocol, involving a simultaneous integrated boost, were re-planned as IMAT treatments. A method for IMAT inverse planning using a commercial module for static IMRT combined with multi-leaf collimator (MLC) arc-sequencing was developed. IMAT plans were compared to HT plans in terms of dose statistics and radiobiological indices. Results: Concerning the planning target volume (PTV), the mean doses for all PTVs were similar for HT and IMAT-C plans, with minimum dose, target coverage, equivalent uniform dose (EUD) and tumour control probability (TCP) values generally higher for HT; maximum dose and degree of heterogeneity were instead higher for IMAT-C. For the organs at risk, mean doses and normal tissue complication probability (NTCP) values were similar between the two modalities, except for the penile bulb, where IMAT was significantly better. Re-normalizing all plans to the same rectal toxicity (NTCP = 5%), the HT modality yielded higher TCP than IMAT-C, but there was no significant difference between HT and IMAT-NC. The integral dose with HT was higher than that for IMAT. Conclusions: With regard to plan analysis, HT is superior to IMAT-C in terms of target coverage and dose homogeneity within the PTV. Introducing dose-rate variation during arc rotation, not deliverable with current linac technology, the simulations result in comparable plan indices between IMAT-NC and HT.
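One of the radiobiological indices used in such comparisons, the generalized equivalent uniform dose, has a standard closed form (gEUD = (mean(d^a))^(1/a)) that can be sketched as:

```python
import numpy as np

def geud(dose, a):
    """Generalized equivalent uniform dose of a structure's voxel doses.
    Large positive a approaches the maximum dose (serial organs); a = 1 gives
    the mean dose (parallel organs); large negative a approaches the minimum
    dose (targets)."""
    d = np.asarray(dose, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))
```

TCP and NTCP values are then typically computed from EUD-like summaries via sigmoid dose-response models, which are omitted here.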

  14. Reduction of computational dimensionality in inverse radiotherapy planning using sparse matrix operations

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Paul S. [Department of Radiation Oncology, University of Washington, Box 356043, Seattle, WA 98195-6043 (United States)]. E-mail: cho@radonc.washington.edu; Phillips, Mark H. [Department of Radiation Oncology, University of Washington, Box 356043, Seattle, WA 98195-6043 (United States)

    2001-05-01

    For dynamic multileaf collimator-based intensity modulated radiotherapy in which small beam elements are used to generate continuous modulation, the sheer size of the dose calculation matrix could pose serious computational challenges. In order to circumvent this problem, the dose calculation matrix was reduced to a sparse matrix by truncating the weakly contributing entries below a certain cutoff to zero. Subsequently, the sparse matrix was compressed and matrix indexing vectors were generated to facilitate matrix-vector and matrix-matrix operations used in inverse planning. The application of sparsity permitted the reduction of overall memory requirement by an order of magnitude. In addition, the effect of disregarding the small scatter components on the quality of optimization was investigated by repeating the inverse planning using the dense dose calculation matrix. Comparison of dense and sparse matrix-based plans revealed an insignificant difference in optimization outcome, thus demonstrating the feasibility and usefulness of the sparse method in inverse planning. Furthermore, two additional methods of memory minimization are suggested, namely hexagonal dose sampling and limited normal tissue sampling. (author)
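The truncate-and-compress scheme can be sketched with a hand-rolled CSR (compressed sparse row) layout; in practice a library such as scipy.sparse would be used, but the pure-NumPy version below makes the matrix indexing vectors explicit. Names and the cutoff semantics are illustrative:

```python
import numpy as np

def to_sparse_csr(dense, cutoff):
    """Truncate weakly contributing entries (below cutoff in magnitude) of a
    dose calculation matrix to zero and compress the result into CSR-style
    arrays: values, column indices, and row pointers."""
    A = np.where(np.abs(dense) >= cutoff, dense, 0.0)
    values, cols, indptr = [], [], [0]
    for row in A:
        nz = np.nonzero(row)[0]
        values.extend(row[nz])
        cols.extend(nz)
        indptr.append(len(values))
    return np.array(values), np.array(cols), np.array(indptr)

def csr_matvec(values, cols, indptr, x):
    """Sparse matrix-vector product, the operation used repeatedly in
    inverse planning iterations."""
    y = np.zeros(len(indptr) - 1)
    for i in range(len(y)):
        s, e = indptr[i], indptr[i + 1]
        y[i] = values[s:e] @ x[cols[s:e]]
    return y
```

Only the retained entries are stored, which is the source of the order-of-magnitude memory reduction reported in the abstract.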

  17. Using measurable dosimetric quantities to characterize the inter-structural tradeoff in inverse planning

    Science.gov (United States)

    Liu, Hongcheng; Dong, Peng; Xing, Lei

    2017-08-01

    Traditional inverse planning relies on the use of weighting factors to balance the conflicting requirements of different structures. Manual trial-and-error determination of weighting factors has long been recognized as a time-consuming part of treatment planning. The purpose of this work is to develop an inverse planning framework that parameterizes the dosimetric tradeoff among the structures with physically meaningful quantities to simplify the search for clinically sensible plans. In this formalism, instead of using weighting factors, the permissible variation range of the prescription dose or dose volume histogram (DVH) of the involved structures are used to characterize the ‘importance’ of the structures. The inverse planning is then formulated into a convex feasibility problem, called the dosimetric variation-controlled model (DVCM), whose goal is to generate plans with dosimetric or DVH variations of the structures consistent with the pre-specified values. For simplicity, the dosimetric variation range for a structure is extracted from a library of previous cases which possess similar anatomy and prescription. A two-phase procedure (TPP) is designed to solve the model. The first phase identifies a physically feasible plan to satisfy the prescribed dosimetric variation, and the second phase automatically improves the plan in case there is room for further improvement. The proposed technique is applied to plan two prostate cases and two head-and-neck cases and the results are compared with those obtained using a conventional CVaR approach and with a moment-based optimization scheme. Our results show that the strategy is able to generate clinically sensible plans with little trial and error. In all cases, the TPP generates a very competitive plan as compared to those obtained using the alternative approaches. Particularly, in the planning of one of the head-and-neck cases, the TPP leads to a non-trivial improvement in the resultant dose distribution

  18. TU-EF-304-08: LET-Based Inverse Planning for IMPT

    Energy Technology Data Exchange (ETDEWEB)

    Gorissen, BL; Giantsoudi, D; Unkelbach, J; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)

    2015-06-15

    Purpose: Cell survival experiments suggest that the relative biological effectiveness (RBE) of proton beams depends on linear energy transfer (LET), leading to higher RBE near the end of range. With intensity-modulated proton therapy (IMPT), multiple treatment plans that differ in the dose contribution per field may yield a similar physical dose distribution, but the RBE-weighted dose distribution may be disparate. RBE models currently do not have the required predictive power to be included in an optimization model due to the variations in experimental data. We propose an LET-based planning method that guides IMPT optimization models towards plans with reduced RBE-weighted dose in surrounding organs at risk (OARs) compared to inverse planning based on physical dose alone. Methods: Optimization models for physical dose are extended with a term for dose times LET (doseLET). Monte Carlo code is used to generate the physical dose and doseLET distribution of each individual pencil beam. The method is demonstrated for an atypical meningioma patient in whom the target volume abuts the brainstem and partially overlaps with the optic nerve. Results: A reference plan optimized based on physical dose alone yields high doseLET values in parts of the brainstem and optic nerve. Minimizing doseLET in these critical structures as an additional planning goal reduces the risk of high RBE-weighted dose. The resulting treatment plan avoids the distal fall-off of the Bragg peaks for shaping the dose distribution in front of critical structures. The maximum dose in the OARs evaluated with RBE models from the literature is reduced by 8-14% with our method compared to conventional planning. Conclusion: LET-based inverse planning for IMPT offers the ability to reduce the RBE-weighted dose in OARs without sacrificing target dose. This project was in part supported by NCI - U19 CA 21239.
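A hedged sketch of the extended objective, assuming precomputed per-beamlet dose and dose-times-LET influence matrices (e.g. from Monte Carlo, as in the abstract); the matrix names and the simple linear penalty form are illustrative assumptions:

```python
import numpy as np

def impt_objective(x, D, DL, d_presc, oar_idx, let_weight):
    """Physical-dose fidelity term extended with a doseLET penalty on OAR
    voxels. D and DL map beamlet weights x to per-voxel dose and per-voxel
    dose*LET contributions, respectively."""
    dose = D @ x
    fidelity = float(np.sum((dose - d_presc) ** 2))
    let_penalty = let_weight * float(np.sum((DL @ x)[oar_idx]))
    return fidelity + let_penalty
```

Setting let_weight to zero recovers planning on physical dose alone; increasing it steers the optimizer away from placing distal edges in the OARs.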

  19. An efficient inverse radiotherapy planning method for VMAT using quadratic programming optimization.

    Science.gov (United States)

    Hoegele, W; Loeschel, R; Merkle, N; Zygmanski, P

    2012-01-01

    The purpose of this study is to investigate the feasibility of an inverse planning optimization approach for Volumetric Modulated Arc Therapy (VMAT) based on quadratic programming and the projection method. The performance of this method is evaluated against a reference commercial planning system (eclipse(TM) for rapidarc(TM)) for clinically relevant cases. The inverse problem is posed in terms of a linear combination of basis functions representing arclet dose contributions, with their respective linear coefficients as degrees of freedom. MLC motion is decomposed into basic motion patterns in an intuitive manner, leading to a system of equations with a relatively small number of equations and unknowns. These equations are solved using quadratic programming under certain limiting physical conditions on the solution, such as the avoidance of negative dose during optimization and Monitor Unit reduction. The modeling by the projection method assures a unique treatment plan with beneficial properties, such as the explicit relation between organ weightings and the final dose distribution. Clinical cases studied include prostate and spine treatments. The optimized plans are evaluated by comparing isodose lines, DVH profiles for target and normal organs, and Monitor Units to those obtained with the clinical treatment planning system eclipse(TM). The resulting dose distributions for a prostate case (with rectum and bladder as organs at risk) and for a spine case (with kidneys, liver, lung and heart as organs at risk) are presented. Overall, the results indicate that similar plan qualities for quadratic programming (QP) and rapidarc(TM) could be achieved with significantly less computational and planning effort using QP. Additionally, results for the QUASIMODO phantom [Bohsung et al., "IMRT treatment planning: A comparative inter-system and inter-centre planning exercise of the ESTRO QUASIMODO group," Radiother. Oncol. 76(3), 354-361 (2005)] are presented as an example.
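A minimal stand-in for the QP solve with the non-negativity condition mentioned above (avoidance of negative dose weights) is projected gradient descent on the least-squares objective; this is an illustrative sketch, not the authors' solver:

```python
import numpy as np

def projected_gradient_qp(D, d, n_iter=500):
    """Minimize ||D x - d||^2 subject to x >= 0 by projected gradient
    descent. D maps arclet coefficients x to dose; d is the desired dose."""
    x = np.zeros(D.shape[1])
    step = 0.5 / np.linalg.norm(D.T @ D, 2)   # safe step from the Lipschitz bound
    for _ in range(n_iter):
        grad = 2.0 * D.T @ (D @ x - d)
        x = np.maximum(x - step * grad, 0.0)  # project onto the feasible set
    return x
```

Dedicated QP solvers converge faster, but the projection step shows how the physical constraint is enforced at every iteration.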

  20. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners, partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  1. Improving IMRT delivery efficiency with reweighted L1-minimization for inverse planning

    International Nuclear Information System (INIS)

    Kim, Hojin; Becker, Stephen; Lee, Rena; Lee, Soonhyouk; Shin, Sukyoung; Candès, Emmanuel; Xing Lei; Li Ruijiang

    2013-01-01

    Purpose: This study presents an improved technique to further simplify the fluence-map in intensity modulated radiation therapy (IMRT) inverse planning, thereby reducing plan complexity and improving delivery efficiency, while maintaining the plan quality. Methods: First-order total-variation (TV) minimization (min.) based on the L1-norm has been proposed to reduce the complexity of the fluence-map in IMRT by generating sparse fluence-map variations. However, with stronger dose sparing to the critical structures, the inevitable increase in fluence-map complexity can lead to inefficient dose delivery. Theoretically, L0-min. is the ideal solution to the sparse signal recovery problem, yet it is practically intractable due to the nonconvexity of its objective function. As an alternative, the authors use the iteratively reweighted L1-min. technique to incorporate the benefits of the L0-norm into the tractability of L1-min. The weight multiplied to each element is inversely related to the magnitude of the corresponding element and is iteratively updated by the reweighting process. The proposed penalizing process combined with TV min. further improves sparsity in the fluence-map variations, ultimately enhancing the delivery efficiency. To validate the proposed method, this work compares three treatment plans obtained from quadratic min. (generally used in clinical IMRT), conventional TV min., and the proposed reweighted TV min. techniques, implemented with a large-scale L1-solver (Templates for First-Order Conic Solvers), for five clinical patient datasets. Criteria such as conformation number (CN), modulation index (MI), and estimated treatment time are employed to assess the relationship between plan quality and delivery efficiency. Results: The proposed method yields simpler fluence-maps than the quadratic and conventional TV based techniques. To attain a given CN and dose sparing to the critical organs for the 5 clinical cases, the proposed method reduces the number of segments
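
The reweighting step described above has a simple closed form when the penalty is separable. The sketch below is a toy illustration of iteratively reweighted L1 minimization on an invented "fluence-variation" signal; plain soft-thresholding stands in for the paper's TFOCS-based solver, and the signal, λ and ε are made up for illustration:

```python
import numpy as np

def reweighted_l1(y, lam=0.1, eps=1e-3, iters=5):
    """Iteratively reweighted L1-min. for min_x 0.5||x - y||^2 + lam*sum_i w_i|x_i|.

    Each inner weighted-L1 problem is solved in closed form by
    soft-thresholding, x_i = sign(y_i)*max(|y_i| - lam*w_i, 0); the
    weights w_i = 1/(|x_i| + eps) are then refreshed, so elements that
    are already small are penalized harder on the next pass, pushing
    the solution toward the (intractable) L0 ideal.
    """
    w = np.ones_like(y)
    x = y.copy()
    for _ in range(iters):
        x = np.sign(y) * np.maximum(np.abs(y) - lam * w, 0.0)
        w = 1.0 / (np.abs(x) + eps)
    return x

# Toy "fluence-map variation" signal: a few genuine intensity steps
# buried in many small spurious variations.
rng = np.random.default_rng(0)
y = 0.05 * rng.standard_normal(50)
y[[5, 20, 35]] += np.array([2.0, -1.5, 1.0])

one_pass = np.sign(y) * np.maximum(np.abs(y) - 0.1, 0.0)  # a single plain-L1 pass
rw = reweighted_l1(y)
print(np.count_nonzero(one_pass), np.count_nonzero(rw))
```

The reweighted result keeps the three true steps essentially unshrunk (their weights are small) while the spurious variations are driven exactly to zero, which is what makes the resulting fluence variations deliverable with fewer segments.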

  2. Simultaneous optimization of beam orientations, wedge filters and field weights for inverse planning with anatomy-based MLC fields

    International Nuclear Information System (INIS)

    Beaulieu, Frederic; Beaulieu, Luc; Tremblay, Daniel; Roy, Rene

    2004-01-01

    As an intermediate approach between manual planning and beamlet-based IMRT, we have developed an optimization system for inverse planning with anatomy-based MLC fields. In this system, named Ballista, the orientation (table and gantry), the wedge filter and the field weights are simultaneously optimized for every beam. An interesting feature is that the system is coupled to Pinnacle3 by means of the PinnComm interface, and uses its convolution dose calculation engine. A fully automatic MLC segmentation algorithm is also included. The plan evaluation is based on quasi-random sampling and on a quadratic objective function with penalty-like constraints. For efficiency, optimal wedge angles and wedge orientations are determined using the concept of the super-omni wedge. A bound-constrained quasi-Newton algorithm performs field weight optimization, while a fast simulated annealing algorithm selects the optimal beam orientations. Moreover, in order to generate directly deliverable plans, the following practical considerations have been incorporated into the system: avoidance of collisions between the gantry and the table, and avoidance of the radio-opaque elements of the table top. We illustrate the performance of the new system on two patients. In a rhabdomyosarcoma case, the system generated plans improving both the target coverage and the sparing of the parotid, as compared to a manually designed plan. In the second case presented, the system successfully produced an adequate plan for the treatment of the prostate while avoiding both hip prostheses. For the many cases where full IMRT may not be necessary, the system efficiently generates satisfactory plans meeting the clinical objectives, while keeping treatment verification much simpler.
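
Fast simulated annealing for beam-orientation selection can be sketched generically. Everything below is a toy stand-in: the candidate gantry angles, the fast cooling schedule T(i) = t0/(1 + i), and the separation-based cost are invented for illustration and are not Ballista's dose-based objective:

```python
import math
import random

def circ_sep(a, b):
    """Smaller arc (degrees) between two gantry angles."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def anneal_beam_angles(cost, candidates, k=3, steps=2000, t0=1.0, seed=1):
    """Fast-simulated-annealing skeleton for beam-orientation selection.

    A state is a set of k gantry angles; a move swaps one selected angle
    for an unused candidate.  The fast schedule T(i) = t0/(1 + i) cools
    quickly; worse moves are accepted with probability exp(-delta/T),
    equal-cost moves are accepted freely (plateau exploration).
    """
    rng = random.Random(seed)
    state = rng.sample(candidates, k)
    best, best_cost = list(state), cost(state)
    cur = best_cost
    for i in range(steps):
        t = t0 / (1 + i)
        trial = list(state)
        trial[rng.randrange(k)] = rng.choice([a for a in candidates if a not in state])
        c = cost(trial)
        if c <= cur or rng.random() < math.exp((cur - c) / t):
            state, cur = trial, c
            if c < best_cost:
                best, best_cost = list(trial), c
    return sorted(best), best_cost

# Toy objective standing in for the dosimetric score: reward mutually
# well-separated beams (the real system evaluates a quadratic dose objective).
def spread_cost(angles):
    return -min(circ_sep(a, b) for a in angles for b in angles if a != b)

angles, c = anneal_beam_angles(spread_cost, list(range(0, 360, 10)))
print(angles, c)
```

With this cost, the annealer should spread three beams roughly 120° apart; swapping the cost function for a dose-based one recovers the structure of the orientation search described in the abstract.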

  3. Satellite chartography of atmospheric methane from SCIAMACHY onboard ENVISAT: 2. Evaluation based on inverse model simulations

    NARCIS (Netherlands)

    Bergamaschi, P.; Frankenberg, C.; Meirink, J.F.; Krol, M.C.; Dentener, F.; Wagner, T.; Platt, U.; Kaplan, J.O.; Körner, S.; Heimann, M.; Dlugokencky, E.J.; Goede, A.

    2007-01-01

    We extend the analysis of a global CH4 data set retrieved from SCIAMACHY (Frankenberg et al., 2006) by making a detailed comparison with inverse TM5 model simulations for 2003 that are optimized versus high accuracy CH4 surface measurements from the NOAA ESRL network. The comparison of column

  4. The influence of air temperature inversions on snowmelt and glacier mass-balance simulations, Ammassalik island, SE Greenland

    Energy Technology Data Exchange (ETDEWEB)

    Mernild, Sebastian Haugard [Los Alamos National Laboratory]; Liston, Glen [Colorado State University]

    2009-01-01

    In many applications, a realistic description of air temperature inversions is essential for accurate snow and glacier ice melt, and glacier mass-balance simulations. A physically based snow-evolution modeling system (SnowModel) was used to simulate eight years (1998/99 to 2005/06) of snow accumulation and snow and glacier ice ablation from numerous small coastal marginal glaciers on the SW part of Ammassalik Island in SE Greenland. These glaciers are regularly influenced by inversions and sea breezes associated with the adjacent relatively cold and frequently ice-choked fjords and ocean. To account for the influence of these inversions on the spatiotemporal variation of air temperature and snow and glacier melt rates, temperature inversion routines were added to MicroMet, the meteorological distribution sub-model used in SnowModel. The inversions were observed and modeled to occur during 84% of the simulation period. Modeled inversions were defined not to occur during days with strong winds and high precipitation rates due to the potential for inversion break-up. Field observations showed inversions to extend from sea level to approximately 300 m a.s.l., and this inversion level was prescribed in the model simulations. Simulations with and without the inversion routines were compared. The inversion model produced air temperature distributions with warmer lower-elevation areas and cooler higher-elevation areas than the simulations without inversion routines, due to the use of cold sea-breeze base temperature data from underneath the inversion. This yielded up to 2 weeks earlier snowmelt in the lower areas and 1 to 3 weeks later snowmelt in the higher-elevation areas of the simulation domain. The average mean annual modeled surface mass-balance for all glaciers (mainly located above the inversion layer) was -720 ± 620 mm w.eq. y⁻¹ for inversion simulations, and -880 ± 620 mm w.eq. y⁻¹ without the inversion routines, a difference of 160 mm w.eq. y⁻¹.

  5. hp-HGS strategy for inverse 3D DC resistivity logging measurement simulations

    KAUST Repository

    Gajda-Zagórska, Ewa

    2012-06-02

    In this paper we present a twin adaptive strategy, hp-HGS, for solving inverse problems related to 3D DC borehole resistivity measurement simulations. The term "simulation of measurements" is widely used by the geophysical community. A quantity of interest, the voltage, is measured at a receiver electrode located in the logging instrument. We use self-adaptive, goal-oriented hp-Finite Element Method (hp-FEM) computer simulations of the measurement process in deviated wells (where the angle between the borehole and the formation layers is less than 90°). We also employ the hierarchical genetic search (HGS) algorithm to solve the inverse problem. Each individual in the population represents a single configuration of the formation layers. The evaluation of an individual is performed by solving the direct problem by means of the hp-FEM algorithm and comparing the result with the measured logging curve. We conclude the paper with a discussion of the parallelization of the algorithm.
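
The HGS evaluation loop (forward-solve each individual, score it against the measured log) can be illustrated with a deliberately tiny stand-in for the hp-FEM forward model. The two-layer "forward model", population size, and genetic operators below are all invented for illustration:

```python
import random

def toy_forward(resistivities):
    """Stand-in forward model: an apparent-resistivity 'log' as weighted
    mixes of two layer resistivities (the real study solves this with hp-FEM)."""
    r1, r2 = resistivities
    return [0.8 * r1 + 0.2 * r2, 0.5 * r1 + 0.5 * r2, 0.2 * r1 + 0.8 * r2]

def misfit(individual, measured):
    """Sum-of-squares mismatch between simulated and measured log."""
    sim = toy_forward(individual)
    return sum((s - m) ** 2 for s, m in zip(sim, measured))

def genetic_search(measured, pop_size=40, gens=60, seed=2):
    """Minimal genetic algorithm: keep the elite half, breed children by
    blend crossover plus Gaussian mutation.  Each individual is one
    candidate formation-layer configuration."""
    rng = random.Random(seed)
    pop = [[rng.uniform(1, 100), rng.uniform(1, 100)] for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=lambda ind: misfit(ind, measured))[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            w = rng.random()
            children.append([w * x + (1 - w) * y + rng.gauss(0, 1.0)
                             for x, y in zip(a, b)])
        pop = elite + children
    return min(pop, key=lambda ind: misfit(ind, measured))

true_layers = [10.0, 60.0]
measured = toy_forward(true_layers)   # synthetic "measured logging curve"
best = genetic_search(measured)
print(best, misfit(best, measured))
```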

  6. From capture to simulation: connecting forward and inverse problems in fluids

    KAUST Repository

    Gregson, James

    2014-07-27

    We explore the connection between fluid capture, simulation and proximal methods, a class of algorithms commonly used for inverse problems in image processing and computer vision. Our key finding is that the proximal operator constraining fluid velocities to be divergence-free is directly equivalent to the pressure-projection methods commonly used in incompressible flow solvers. This observation lets us treat the inverse problem of fluid tracking as a constrained flow problem all while working in an efficient, modular framework. In addition it lets us tightly couple fluid simulation into flow tracking, providing a global prior that significantly increases tracking accuracy and temporal coherence as compared to previous techniques. We demonstrate how we can use these improved results for a variety of applications, such as re-simulation, detail enhancement, and domain modification. We furthermore give an outlook of the applications beyond fluid tracking that our proximal operator framework could enable by exploring the connection of deblurring and fluid guiding.

  7. IPIP: A new approach to inverse planning for HDR brachytherapy by directly optimizing dosimetric indices

    International Nuclear Information System (INIS)

    Siauw, Timmy; Cunha, Adam; Atamtuerk, Alper; Hsu, I-Chow; Pouliot, Jean; Goldberg, Ken

    2011-01-01

    Purpose: Many planning methods for high dose rate (HDR) brachytherapy require an iterative approach. A set of computational parameters is hypothesized that will give a dose plan that meets the dosimetric criteria. A dose plan is computed using these parameters, and if any dosimetric criteria are not met, the process is iterated until a suitable dose plan is found. In this way, the dose distribution is controlled through abstract parameters. The purpose of this study is to develop a new approach for HDR brachytherapy planning that directly optimizes the dose distribution based on dosimetric criteria. Methods: The authors developed inverse planning by integer program (IPIP), an optimization model for computing HDR brachytherapy dose plans, together with a fast heuristic for it. They used the heuristic to compute dose plans for 20 anonymized prostate cancer image data sets from patients previously treated at their clinic. Dosimetry was evaluated and compared to the dosimetric criteria. Results: Dose plans computed from IPIP satisfied all given dosimetric criteria for the target and healthy tissue after a single iteration. The average target coverage was 95%. The average computation time for IPIP was 30.1 s on an Intel Core 2 Duo 1.67 GHz processor with 3 GiB RAM. Conclusions: IPIP is an HDR brachytherapy planning system that directly incorporates dosimetric criteria. The authors have demonstrated that IPIP has clinically acceptable performance, in both dosimetry and runtime, for the prostate cases and dosimetric criteria used in this study. Further study is required to determine whether IPIP performs well for a more general group of patients and dosimetric criteria, including other cancer sites such as GYN.
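
IPIP itself is an integer program (with a fast heuristic) that optimizes dosimetric indices directly; a linear-programming sketch conveys the flavor of optimizing dwell times against explicit dose criteria in a single solve. The inverse-square kernel, geometry, and unit prescription below are invented toy data, not the paper's model:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
dwells = rng.uniform(0.0, 1.0, (5, 2))      # candidate dwell positions (toy 2D)
target = rng.uniform(0.2, 0.8, (20, 2))     # dose points inside the target
healthy = rng.uniform(-0.5, 1.5, (15, 2))   # surrounding healthy-tissue points

def kernel(points, sources):
    """Clipped inverse-square dose-rate kernel (toy physics)."""
    d2 = ((points[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
    return 1.0 / np.maximum(d2, 1e-2)

A_tgt, A_hlt = kernel(target, dwells), kernel(healthy, dwells)

# LP over dwell times t >= 0: minimize total healthy-tissue dose subject
# to every target point receiving at least 1 (arbitrary) dose unit.
res = linprog(c=A_hlt.sum(axis=0),
              A_ub=-A_tgt, b_ub=-np.ones(len(target)),
              bounds=[(0, None)] * len(dwells),
              method="highs")
times = res.x
print(np.round(times, 3))
```

The dose criteria appear directly as constraints of the optimization, so a feasible solution satisfies them in one pass; IPIP extends this idea with integer coverage indicators so that dosimetric *indices* (fractions of points covered) are optimized rather than every point constrained.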

  8. IPIP: A new approach to inverse planning for HDR brachytherapy by directly optimizing dosimetric indices.

    Science.gov (United States)

    Siauw, Timmy; Cunha, Adam; Atamtürk, Alper; Hsu, I-Chow; Pouliot, Jean; Goldberg, Ken

    2011-07-01

    Many planning methods for high dose rate (HDR) brachytherapy require an iterative approach. A set of computational parameters is hypothesized that will give a dose plan that meets the dosimetric criteria. A dose plan is computed using these parameters, and if any dosimetric criteria are not met, the process is iterated until a suitable dose plan is found. In this way, the dose distribution is controlled through abstract parameters. The purpose of this study is to develop a new approach for HDR brachytherapy planning that directly optimizes the dose distribution based on dosimetric criteria. The authors developed inverse planning by integer program (IPIP), an optimization model for computing HDR brachytherapy dose plans, together with a fast heuristic for it. They used the heuristic to compute dose plans for 20 anonymized prostate cancer image data sets from patients previously treated at their clinic. Dosimetry was evaluated and compared to the dosimetric criteria. Dose plans computed from IPIP satisfied all given dosimetric criteria for the target and healthy tissue after a single iteration. The average target coverage was 95%. The average computation time for IPIP was 30.1 s on an Intel Core 2 Duo 1.67 GHz processor with 3 GiB RAM. IPIP is an HDR brachytherapy planning system that directly incorporates dosimetric criteria. The authors have demonstrated that IPIP has clinically acceptable performance, in both dosimetry and runtime, for the prostate cases and dosimetric criteria used in this study. Further study is required to determine whether IPIP performs well for a more general group of patients and dosimetric criteria, including other cancer sites such as GYN.

  9. IPIP: A new approach to inverse planning for HDR brachytherapy by directly optimizing dosimetric indices

    Energy Technology Data Exchange (ETDEWEB)

    Siauw, Timmy; Cunha, Adam; Atamtuerk, Alper; Hsu, I-Chow; Pouliot, Jean; Goldberg, Ken [Department of Civil and Environmental Engineering, University of California, Berkeley, 760 Davis Hall, Berkeley, California 94720-1710 (United States); Department of Radiation Oncology, University of California, San Francisco, Comprehensive Cancer Center, 1600 Divisadero Street, Suite H1031, San Francisco, California 94143-1708 (United States); Department of Industrial Engineering and Operations, University of California, Berkeley, 4141 Etcheverry Hall, Berkeley, California 94720-1777 (United States); Department of Radiation Oncology, University of California, San Francisco, Comprehensive Cancer Center, 1600 Divisadero Street, Suite H1031, San Francisco, California 94143-1708 (United States); Department of Industrial Engineering and Operations Research and Department of Electrical Engineering and Computer Science, University of California, Berkeley, 4141 Etcheverry Hall, Berkeley, California 94720-1777 (United States)

    2011-07-15

    Purpose: Many planning methods for high dose rate (HDR) brachytherapy require an iterative approach. A set of computational parameters is hypothesized that will give a dose plan that meets the dosimetric criteria. A dose plan is computed using these parameters, and if any dosimetric criteria are not met, the process is iterated until a suitable dose plan is found. In this way, the dose distribution is controlled through abstract parameters. The purpose of this study is to develop a new approach for HDR brachytherapy planning that directly optimizes the dose distribution based on dosimetric criteria. Methods: The authors developed inverse planning by integer program (IPIP), an optimization model for computing HDR brachytherapy dose plans, together with a fast heuristic for it. They used the heuristic to compute dose plans for 20 anonymized prostate cancer image data sets from patients previously treated at their clinic. Dosimetry was evaluated and compared to the dosimetric criteria. Results: Dose plans computed from IPIP satisfied all given dosimetric criteria for the target and healthy tissue after a single iteration. The average target coverage was 95%. The average computation time for IPIP was 30.1 s on an Intel Core 2 Duo 1.67 GHz processor with 3 GiB RAM. Conclusions: IPIP is an HDR brachytherapy planning system that directly incorporates dosimetric criteria. The authors have demonstrated that IPIP has clinically acceptable performance, in both dosimetry and runtime, for the prostate cases and dosimetric criteria used in this study. Further study is required to determine whether IPIP performs well for a more general group of patients and dosimetric criteria, including other cancer sites such as GYN.

  10. Design by dragging: an interface for creative forward and inverse design with simulation ensembles.

    Science.gov (United States)

    Coffey, Dane; Lin, Chi-Lun; Erdman, Arthur G; Keefe, Daniel F

    2013-12-01

    We present an interface for exploring large design spaces as encountered in simulation-based engineering, design of visual effects, and other tasks that require tuning parameters of computationally intensive simulations and visually evaluating results. The goal is to enable a style of design with simulations that feels as direct as possible, so users can concentrate on creative design tasks. The approach integrates forward design via direct manipulation of simulation inputs (e.g., geometric properties, applied forces) in the same visual space with inverse design via 'tugging' and reshaping simulation outputs (e.g., scalar fields from finite element analysis (FEA) or computational fluid dynamics (CFD)). The interface includes algorithms for interpreting the intent of users' drag operations relative to parameterized models, morphing arbitrary scalar fields output from FEA and CFD simulations, and in-place interactive ensemble visualization. The inverse design strategy can be extended to use multi-touch input in combination with an as-rigid-as-possible shape manipulation to support rich visual queries. The potential of this new design approach is confirmed via two applications: medical device engineering of a vacuum-assisted biopsy device and visual effects design using a physically based flame simulation.

  11. Inverse simulation system for manual-controlled rendezvous and docking based on artificial neural network

    Science.gov (United States)

    Zhou, Wanmeng; Wang, Hua; Tang, Guojin; Guo, Shuai

    2016-09-01

    The time-consuming experimental method for handling-qualities assessment cannot meet the increasingly fast design requirements of manned space flight. As a tool for aircraft handling-qualities research, the model-predictive-control structured inverse simulation (MPC-IS) has potential applications in the aerospace field to guide astronauts' operations and evaluate handling qualities more effectively. Therefore, this paper establishes MPC-IS for manual-controlled rendezvous and docking (RVD) and proposes a novel artificial neural network inverse simulation system (ANN-IS) to further decrease the computational cost. The novel system was obtained by replacing the inverse model of MPC-IS with an artificial neural network. The optimal neural network was trained by the genetic Levenberg-Marquardt algorithm, and finally determined by the Levenberg-Marquardt algorithm. In order to validate MPC-IS and ANN-IS, manual-controlled RVD experiments were carried out on the simulator. The comparisons between simulation results and experimental data demonstrated the validity of the two systems and the high computational efficiency of ANN-IS.
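
The core inverse-simulation idea, iterating the control input at each step until the model's output matches the desired response, can be sketched on a toy model. The double-integrator "vehicle" and Newton solver below illustrate the generic scheme only, not the paper's MPC-IS or ANN-IS:

```python
import numpy as np

def forward_step(x, u, dt=0.1):
    """Toy double-integrator forward model: state x = [position, velocity]."""
    pos, vel = x
    return np.array([pos + vel * dt, vel + u * dt])

def inverse_simulate(x0, desired_pos, dt=0.1, newton_iters=5):
    """Classical inverse-simulation loop: at each step, numerically solve
    for the control u that drives the model output (position two steps
    ahead) onto the desired value, then advance the model one step."""
    x, controls = np.array(x0, dtype=float), []
    for target in desired_pos:
        def two_step_pos(uu):
            return forward_step(forward_step(x, uu, dt), uu, dt)[0]
        u = 0.0
        for _ in range(newton_iters):              # Newton iteration on the residual
            r = two_step_pos(u) - target
            grad = (two_step_pos(u + 1e-6) - two_step_pos(u)) / 1e-6
            u -= r / grad
        controls.append(u)
        x = forward_step(x, u, dt)
    return np.array(controls)

desired = np.linspace(0.1, 1.0, 10)                # desired position track
u_seq = inverse_simulate([0.0, 0.0], desired)
print(np.round(u_seq, 2))
```

Replaying the recovered controls through the forward model reproduces the desired track (with the inherent two-step lag of this toy formulation); ANN-IS replaces the inner numerical inversion with a trained network to cut this per-step cost.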

  12. SU-F-T-346: Dose Mimicking Inverse Planning Based On Helical Delivery Treatment Plans for Head and Neck Patients

    Energy Technology Data Exchange (ETDEWEB)

    Kumaran Nair, C; Hoffman, D; Wright, C; Yamamoto, T; Rao, S; Benedict, S; Rong, Y [University of California Davis Medical Center, Sacramento, CA (United States); Markham, J [Raysearch Laboratories, Garden City, NY (United States)

    2016-06-15

    Purpose: We aim to evaluate a new commercial dose-mimicking inverse-planning application, designed to provide cross-platform treatment planning, for its dosimetric quality and efficiency. The clinical benefit of this application is that it allows patients treated on an O-shaped linac to receive an equivalent plan on a conventional L-shaped linac as needed for workflow or machine downtime. Methods: The dose-mimicking optimization process seeks to recreate the DVH of an O-shaped linac-based plan with an alternative treatment technique (IMRT or VMAT), by maintaining target conformity and penalizing dose falloff outside the target. Ten head and neck (HN) helical delivery plans, including simple and complex cases, were selected for re-planning with the dose-mimicking application. All plans were generated for a 6 MV beam model, using 7-field/9-field IMRT and VMAT techniques. PTV coverage (D1, D99 and homogeneity index [HI]) and OAR avoidance (Dmean/Dmax) were compared. Results: The resulting dose-mimicked HN plans achieved acceptable PTV coverage for HI (VMAT 7.0±2.3, 7-field 7.3±2.4, and 9-field 7.0±2.4), D99 (98.0%±0.7%, 97.8%±0.7%, and 98.0%±0.7%), as well as D1 (106.4%±2.1%, 106.5%±2.2%, and 106.4%±2.1%), respectively. The OAR dose discrepancy varied: brainstem (2% to 4%), cord (3% to 6%), esophagus (−4% to −8%), larynx (−4% to 2%), and parotid (4% to 14%). Mimicked plans would typically be needed for 1–5 fractions of a treatment course, and we estimate that <1% variance would be introduced in target coverage while maintaining comparably low dose to OARs. All mimicked plans were approved by an independent physician and passed patient-specific QA within our established tolerance. Conclusion: Dose-mimicked plans provide a practical alternative for responding to clinical workflow issues, and provide reliability for patient treatment. The quality of dose mimicking for HN patients depends highly on the delivery technique, field numbers and angles, as well as user
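
The DVH quantities compared above (D1, D99, HI) are straightforward to compute from a dose grid and a structure mask. A minimal sketch on synthetic dose data; the HI convention (D1 − D99)/prescription × 100 is one common definition and is assumed here, as are the grid, mask and 60 Gy prescription:

```python
import numpy as np

def dvh_percentile_dose(dose, mask, pct):
    """D_pct: the minimum dose received by the hottest pct% of the structure,
    read off the cumulative dose-volume histogram."""
    d = np.sort(dose[mask])[::-1]                  # voxel doses, descending
    idx = max(int(np.ceil(pct / 100.0 * d.size)) - 1, 0)
    return d[idx]

rng = np.random.default_rng(4)
dose = rng.normal(60.0, 1.5, (40, 40))             # synthetic dose grid (Gy)
ptv = np.zeros((40, 40), dtype=bool)
ptv[10:30, 10:30] = True                           # synthetic target mask

d1 = dvh_percentile_dose(dose, ptv, 1)             # near-maximum dose
d99 = dvh_percentile_dose(dose, ptv, 99)           # near-minimum dose
hi = (d1 - d99) / 60.0 * 100.0                     # HI vs. a 60 Gy prescription
print(round(d1, 1), round(d99, 1), round(hi, 1))
```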

  13. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    Directory of Open Access Journals (Sweden)

    Wenz Frederik

    2009-09-01

    Abstract. Background: Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. Methods: The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error, which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Results: Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. Conclusion: The
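
A fuzzy inference step of the kind driving such a system can be sketched with triangular membership functions and two if-then rules. The rule set, membership shapes and output centroids below are invented for illustration and are far simpler than the paper's ANFIS:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_constraint(violation):
    """Two-rule Mamdani-style FIS with weighted-average defuzzification:
      IF the DVH violation is small THEN the constraint adjustment is small
      IF the DVH violation is large THEN the constraint adjustment is large
    """
    mu_small = tri(violation, -0.1, 0.0, 0.5)   # degree of "small violation"
    mu_large = tri(violation, 0.0, 1.0, 2.0)    # degree of "large violation"
    small_out, large_out = 0.02, 0.20           # centroid of each output set
    num = mu_small * small_out + mu_large * large_out
    den = mu_small + mu_large
    return num / den if den else 0.0

print(adjust_constraint(0.1), adjust_constraint(0.8))
```

ANFIS learns exactly these membership parameters and rule consequents from observation data instead of having them hand-tuned, which is the knowledge-acquisition step the abstract automates.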

  14. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning.

    Science.gov (United States)

    Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang

    2009-09-25

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way to automatically perform

  15. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    International Nuclear Information System (INIS)

    Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang

    2009-01-01

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be 'translated' to a set of 'if-then rules' for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the 'behavior' of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way

  16. Inverse planning in the age of digital LINACs: station parameter optimized radiation therapy (SPORT)

    International Nuclear Information System (INIS)

    Xing, Lei; Li, Ruijiang

    2014-01-01

    The last few years have seen a number of technical and clinical advances which give rise to a need for innovations in dose optimization and delivery strategies. Technically, a new generation of digital linacs has become available, offering features such as programmable motion between station parameters and high dose-rate Flattening Filter Free (FFF) beams. Current inverse planning methods are designed for traditional machines and cannot accommodate these features of new-generation linacs without compromising dose conformality and/or delivery efficiency. Furthermore, SBRT is becoming increasingly important, which elevates the need for more efficient delivery and improved dose distributions. Here we will give an overview of our recent work in SPORT designed to harness the digital linacs and highlight the essential components of SPORT. We will summarize the pros and cons of traditional beamlet-based optimization (BBO) and direct aperture optimization (DAO) and introduce a new type of algorithm, compressed sensing (CS)-based inverse planning, that is capable of automatically removing redundant segments during optimization and providing a plan with high deliverability in the presence of a large number of station control points (potentially non-coplanar, non-isocentric, and even multi-isocenter). We show that the CS approach takes the interplay between planning and delivery into account, allows us to balance dose optimality and delivery efficiency in a controlled way, and provides a viable framework to address various unmet demands of the new generation of linacs. A few specific implementation strategies of SPORT, in the forms of fixed-gantry and rotational-arc delivery, are also presented.

  17. Inverse planning in the age of digital LINACs: station parameter optimized radiation therapy (SPORT)

    Science.gov (United States)

    Xing, Lei; Li, Ruijiang

    2014-03-01

    The last few years have seen a number of technical and clinical advances which give rise to a need for innovations in dose optimization and delivery strategies. Technically, a new generation of digital linacs has become available, offering features such as programmable motion between station parameters and high dose-rate Flattening Filter Free (FFF) beams. Current inverse planning methods are designed for traditional machines and cannot accommodate these features of new-generation linacs without compromising dose conformality and/or delivery efficiency. Furthermore, SBRT is becoming increasingly important, which elevates the need for more efficient delivery and improved dose distributions. Here we will give an overview of our recent work in SPORT designed to harness the digital linacs and highlight the essential components of SPORT. We will summarize the pros and cons of traditional beamlet-based optimization (BBO) and direct aperture optimization (DAO) and introduce a new type of algorithm, compressed sensing (CS)-based inverse planning, that is capable of automatically removing redundant segments during optimization and providing a plan with high deliverability in the presence of a large number of station control points (potentially non-coplanar, non-isocentric, and even multi-isocenter). We show that the CS approach takes the interplay between planning and delivery into account, allows us to balance dose optimality and delivery efficiency in a controlled way, and provides a viable framework to address various unmet demands of the new generation of linacs. A few specific implementation strategies of SPORT, in the forms of fixed-gantry and rotational-arc delivery, are also presented.

  18. Forward versus inverse planning in oropharyngeal cancer: A comparative study using physical and biological indices

    Directory of Open Access Journals (Sweden)

    T Sundaram

    2013-01-01

    Context: Possible benefits of inverse planning. Aims: To analyze the possible benefits of inverse-planning intensity modulated radiation therapy (IMRT) over field-in-field 3D conformal radiation therapy (FIF-3DCRT), and to evaluate the differences, if any, between low-energy (6 MV) and high-energy (15 MV) IMRT plans. Materials and Methods: Ten patients with squamous cell carcinoma of the oropharynx, previously treated with 6 MV step-and-shoot IMRT, were studied. V100, V33, V66, mean dose and normal tissue complication probabilities (NTCP) were evaluated for the parotid glands. Maximum dose and NTCP were the parameters for the spinal cord. Statistical Analysis Used: A two-tailed t-test was applied to analyze statistical significance between the different techniques. Results: For the combined parotid gland, reductions of 4.374 Gy, 9.343 Gy and 7.883 Gy were achieved for D100, D66 and D33, respectively, in 6 MV-IMRT when compared with FIF-3DCRT. Spinal cord sparing was better in 6 MV-IMRT (40.963 ± 2.650 Gy), with an average reduction of the maximum spinal cord dose of 7.355 Gy from that using the FIF-3DCRT technique. The uncomplicated tumor control probability values were higher in IMRT plans, thus raising the possibility of dose escalation. Conclusions: Though low-energy IMRT is the preferred choice for treatment of oropharyngeal cancers, FIF-3DCRT must be given due consideration as a second choice for its well-established advantages over the traditional conventional technique.
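
The NTCP values referenced above are commonly computed with the Lyman-Kutcher-Burman (LKB) model, a normal CDF in the dose summary (EUD or, for the parotid, often the mean dose). A minimal sketch; the parotid-style parameters shown are illustrative literature-style values, not this study's fitted ones:

```python
import math

def lkb_ntcp(eud, td50, m):
    """Lyman-Kutcher-Burman NTCP: the standard-normal CDF evaluated at
    t = (EUD - TD50) / (m * TD50), where TD50 is the dose giving 50%
    complication probability and m sets the slope."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative parotid-style parameters (TD50 ~ 39.9 Gy, m ~ 0.40);
# treat them as placeholders, not as the paper's values.
for mean_dose in (20.0, 30.0, 39.9, 50.0):
    print(mean_dose, round(lkb_ntcp(mean_dose, 39.9, 0.40), 3))
```

By construction, NTCP is exactly 0.5 at TD50 and increases monotonically with dose, which is why the parotid mean-dose reductions reported above translate directly into lower complication probabilities.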

  19. A comparison of conventional 'forward planning' with inverse planning for 3D conformal radiotherapy of the prostate

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony; Webb, Steve

    1995-01-01

    A radiotherapy treatment plan optimisation algorithm has been applied to 48 prostate plans and the results compared with those of an experienced human planner. Twelve patients were used in the study, and 3-, 4-, 6- and 8-field plans (with standard coplanar beam angles for each plan type) were optimised by both the human planner and the optimisation algorithm. The human planner 'optimised' the plan by conventional forward planning techniques. The optimisation algorithm was based on fast simulated annealing using a cost-function designed to achieve a homogeneous dose in the 'planning-target-volume' and to minimise the integral dose to the organs at risk. 'Importance factors' assigned to different regions of the patient provide a method for controlling the algorithm, and it was found that the same values gave good results for almost all plans. A study of the convergence of the algorithm is presented and optimal convergence parameters are determined. The plans were compared on the basis of both dose statistics and 'normal-tissue-complication-probability' (NTCP) and 'tumour-control-probability' (TCP). The results of the comparison study show that the optimisation algorithm yielded results that were at least as good as those of the human planner, and on the whole slightly better. A study of the beam-weights chosen by the optimisation algorithm and the planner revealed differences that increased with the number of beams in the plan. The planner was found to make small perturbations about a conceived optimal beam-weight set. The optimisation algorithm showed much greater variation, in response to individual patient geometry, frequently deselecting certain beams altogether from the plan. The algorithm is shown to be a useful tool for radiotherapy treatment planning. For simple (e.g., three-field) plans it was found to consistently achieve slightly higher TCP and lower NTCP values. For more complicated (e.g., eight-field) plans the optimisation also achieved
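The annealing scheme described above (a cost function with regional importance factors, Metropolis acceptance, geometric cooling) can be sketched as follows; this is a minimal illustration, not the authors' code, and the influence matrix, importance factors and cooling schedule are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy influence matrix: dose per unit beam weight from 3 beams to 5 voxels
# (first 3 voxels = planning target volume, last 2 = organ at risk).
D = rng.uniform(0.2, 1.0, size=(5, 3))
prescribed = np.array([1.0, 1.0, 1.0, 0.0, 0.0])  # PTV dose 1, OAR dose 0
importance = np.array([1.0, 1.0, 1.0, 0.3, 0.3])  # regional importance factors

def cost(w):
    dose = D @ w
    return float(np.sum(importance * (dose - prescribed) ** 2))

def anneal(w, T=1.0, cooling=0.995, steps=2000):
    best, best_c = w.copy(), cost(w)
    c = best_c
    for _ in range(steps):
        # Perturb beam weights; weights must stay non-negative.
        trial = np.clip(w + rng.normal(0, 0.05 * T, size=w.shape), 0, None)
        c_trial = cost(trial)
        # Metropolis rule: accept improvements always, worse moves sometimes.
        if c_trial < c or rng.random() < np.exp((c - c_trial) / max(T, 1e-9)):
            w, c = trial, c_trial
            if c < best_c:
                best, best_c = w.copy(), c
        T *= cooling  # geometric cooling schedule
    return best, best_c

w0 = np.ones(3) * 0.1
w_opt, c_opt = anneal(w0)
print(c_opt <= cost(w0))  # the best cost never exceeds the starting cost
```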

  20. Simulated scatter performance of an inverse-geometry dedicated breast CT system.

    Science.gov (United States)

    Bhagtani, Reema; Schmidt, Taly Gilat

    2009-03-01

    The purpose of this work was to quantify the effects of scatter for inverse-geometry dedicated breast CT compared to cone-beam breast CT through simulations. The inverse geometry was previously proposed as an alternative to cone-beam acquisition for volumetric CT. The inverse geometry consists of a large-area scanned source opposite a detector array that is smaller in the transverse direction. While the gantry rotates, the x-ray beam is rapidly sequenced through an array of positions, acquiring a truncated projection image at each position. Inverse-geometry CT (IGCT) is expected to detect less scatter than cone-beam methods because only a fraction of the object is irradiated at any time and the fast detector isolates the measurements from sequential x-ray beams. An additional scatter benefit is the increased air gap due to the inverted geometry. In this study, we modeled inverse-geometry and cone-beam dedicated breast CT systems of equivalent resolution, field of view, and photon fluence. Monte Carlo simulations generated scatter and primary projections of three cylindrical phantoms of diameters 10, 14, and 18 cm composed of 50% adipose/50% glandular tissue. The scatter-to-primary ratio (SPR) was calculated for each breast diameter. Monte Carlo simulations were combined with analytical simulations to generate inverse-geometry and cone-beam images of breast phantoms embedded with tumors. Noise representing the photon fluence of a realistic breast CT scan was added to the simulated projections. Cone-beam data were reconstructed with and without an ideal scatter correction. The contrast-to-noise ratio (CNR) between breast tumor and background was compared for the inverse and cone-beam geometries for the three phantom diameters. Results demonstrated an order of magnitude reduction in SPR for the IGCT system compared to the cone-beam system. For example, the peak IGCT SPRs were 0.05 and 0.09 for the 14 and 18 cm phantoms, respectively, compared to 0.42 and 1 for the cone-beam system. For both
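A rough way to see why the order-of-magnitude SPR reduction matters: to first order, a uniform scatter background reduces subject contrast by a factor of 1/(1 + SPR). The sketch below applies this textbook approximation (not the paper's CNR calculation) to the peak SPRs quoted above for the 18 cm phantom:

```python
def contrast_reduction(spr):
    """First-order approximation: a roughly uniform scatter background
    reduces subject contrast by 1 / (1 + SPR)."""
    return 1.0 / (1.0 + spr)

# Peak SPRs reported in the abstract for the 18 cm phantom:
igct_spr, cone_beam_spr = 0.09, 1.0
print(round(contrast_reduction(igct_spr), 3))       # 0.917 -> ~8% contrast loss
print(round(contrast_reduction(cone_beam_spr), 3))  # 0.5   -> 50% contrast loss
```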

  1. Planning Study Comparison of Real-Time Target Tracking and Four-Dimensional Inverse Planning for Managing Patient Respiratory Motion

    International Nuclear Information System (INIS)

    Zhang Peng; Hugo, Geoffrey D.; Yan Di

    2008-01-01

    Purpose: Real-time target tracking (RT-TT) and four-dimensional inverse planning (4D-IP) are two potential methods to manage respiratory target motion. In this study, we evaluated each method using the cumulative dose-volume criteria in lung cancer radiotherapy. Methods and Materials: Respiration-correlated computed tomography scans were acquired for 4 patients. Deformable image registration was applied to generate a displacement mapping for each phase image of the respiration-correlated computed tomography images. First, the dose distribution for the organs of interest obtained from an idealized RT-TT technique was evaluated, assuming perfect knowledge of organ motion and beam tracking. Inverse planning was performed on each phase image separately. The treatment dose to the organs of interest was then accumulated from the optimized plans. Second, 4D-IP was performed using the probability density function of respiratory motion. The beam arrangement, prescription dose, and objectives were consistent in both planning methods. The dose-volume and equivalent uniform dose in the target volume, lung, heart, and spinal cord were used for the evaluation. Results: The cumulative dose in the target was similar for both techniques. The equivalent uniform dose of the lung, heart, and spinal cord was 4.6 ± 2.2, 11 ± 4.4, and 11 ± 6.6 Gy for RT-TT with a 0-mm target margin, 5.2 ± 3.1, 12 ± 5.9, and 12 ± 7.8 Gy for RT-TT with a 2-mm target margin, and 5.3 ± 2.3, 11.9 ± 5.0, and 12 ± 5.6 Gy for 4D-IP, respectively. Conclusion: The results of our study have shown that 4D-IP can achieve plans similar to those achieved by RT-TT. Considering clinical implementation, 4D-IP could be a more reliable and practical method to manage patient respiration-induced motion.
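The core of 4D-IP as described above is weighting the dose of each respiratory phase by the probability density function of the motion. A minimal sketch of that accumulation for a single voxel, with invented phase doses and probabilities:

```python
import numpy as np

# Hypothetical dose to one voxel in each of 4 respiratory phases (Gy).
phase_dose = np.array([2.0, 1.8, 1.6, 1.9])
# Probability of residing in each phase over the breathing cycle (sums to 1).
phase_pdf = np.array([0.4, 0.3, 0.1, 0.2])

# Motion-averaged (expected) dose = PDF-weighted sum of phase doses.
expected_dose = float(phase_pdf @ phase_dose)
print(round(expected_dose, 2))  # 1.88
```

In a full 4D-IP workflow this expectation is taken voxel-by-voxel after deformable registration maps every phase onto a reference geometry, and the optimizer then works on the expected dose rather than a single-phase dose.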

  2. Conventional treatment planning optimization using simulated annealing

    International Nuclear Information System (INIS)

    Morrill, S.M.; Langer, M.; Lane, R.G.

    1995-01-01

    Purpose: Simulated annealing (SA) allows for the implementation of realistic biological and clinical cost functions into treatment plan optimization. However, a drawback to the clinical implementation of SA optimization is that large numbers of beams appear in the final solution, some with insignificant weights, preventing the delivery of these optimized plans using conventional (limited to a few coplanar beams) radiation therapy. A preliminary study suggested two promising algorithms for restricting the number of beam weights. The purpose of this investigation was to compare these two algorithms using our current SA algorithm, with the aim of producing an algorithm to allow clinically useful radiation therapy treatment planning optimization. Method: Our current SA algorithm, Variable Stepsize Generalized Simulated Annealing (VSGSA), was modified with two algorithms to restrict the number of beam weights in the final solution. The first selected combinations of a fixed number of beams from the complete solution space at each iterative step of the optimization process. The second reduced the allowed number of beams by a factor of two at periodic steps during the optimization process until only the specified number of beams remained. Results of optimization of beam weights and angles using these algorithms were compared using a standard cadre of abdominal cases. The solution space was defined as a set of 36 custom-shaped open and wedge-filtered fields at 10 deg. increments with a constant target volume margin of 1.2 cm. For each case a clinically accepted cost function, minimum tumor dose, was maximized subject to a set of normal tissue binary dose-volume constraints. For this study, the optimized plan was restricted to four (4) fields suitable for delivery with conventional therapy equipment.
    Results: The table gives the mean value of the minimum target dose obtained for each algorithm averaged over 5 different runs and the comparable manual treatment

  3. Planning of step-stress accelerated degradation test based on the inverse Gaussian process

    International Nuclear Information System (INIS)

    Wang, Huan; Wang, Guan-jun; Duan, Feng-jun

    2016-01-01

    The step-stress accelerated degradation test (SSADT) is a useful tool for assessing the lifetime distribution of highly reliable or expensive products. Efficient SSADT plans have been proposed for cases where the underlying degradation follows a Wiener or Gamma process; however, how to design an efficient SSADT plan for the inverse Gaussian (IG) process remains an open problem. The aim of this paper is to provide an optimal SSADT plan for the IG degradation process. A cumulative exposure model for the SSADT is adopted, in which the product degradation path depends only on the current stress level and the degradation accumulated so far, not on the way it was accumulated. Under the constraint of the total experimental budget, the design variables are optimized by minimizing the asymptotic variance of the estimated p-quantile of the lifetime distribution of the product. Finally, we apply the proposed method to the optimal SSADT design for a type of electrical connector based on a set of stress relaxation data. The sensitivity and stability of the SSADT plan are studied, and we find that the optimal test plan is quite robust to moderate departures from the parameter values. - Highlights: • We propose an optimal SSADT plan for the IG degradation process. • A CE model is assumed in describing the degradation path of the SSADT. • The asymptotic variance of the estimated p-quantile is used as the objective function. • A set of stress relaxation data is analyzed and used for illustration of our method.
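An IG degradation path of the kind assumed above can be simulated from independent inverse-Gaussian increments (NumPy exposes the distribution as `wald`): with mean function Λ(t) = t, an increment over dt is IG(μ·dt, λ·dt²). The parameters and threshold below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def ig_path(mu, lam, n_steps, dt=1.0):
    """Simulate an IG degradation process: independent increments
    distributed IG(mu*dt, lam*dt**2), so the path is strictly increasing."""
    inc = rng.wald(mu * dt, lam * dt**2, size=n_steps)
    return np.concatenate(([0.0], np.cumsum(inc)))

path = ig_path(mu=0.5, lam=4.0, n_steps=100)

# Pseudo failure time: first step at which degradation crosses a threshold
# (the lifetime whose p-quantile the SSADT plan is designed to estimate).
threshold = 20.0
failure_step = int(np.argmax(path >= threshold))
print(path[-1] > path[0])  # degradation only accumulates
```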

  4. Elastic Cherenkov effects in transversely isotropic soft materials-I: Theoretical analysis, simulations and inverse method

    Science.gov (United States)

    Li, Guo-Yang; Zheng, Yang; Liu, Yanlin; Destrade, Michel; Cao, Yanping

    2016-11-01

    A body force concentrated at a point and moving at a high speed can induce shear-wave Mach cones in dusty-plasma crystals or soft materials, as observed experimentally and named the elastic Cherenkov effect (ECE). The ECE in soft materials forms the basis of the supersonic shear imaging (SSI) technique, an ultrasound-based dynamic elastography method applied in clinics in recent years. Previous studies on the ECE in soft materials have focused on isotropic material models. In this paper, we investigate the existence and key features of the ECE in anisotropic soft media, by using both theoretical analysis and finite element (FE) simulations, and we apply the results to the non-invasive and non-destructive characterization of biological soft tissues. We also theoretically study the characteristics of the shear waves induced in a deformed hyperelastic anisotropic soft material by a source moving with high speed, considering that contact between the ultrasound probe and the soft tissue may lead to finite deformation. On the basis of our theoretical analysis and numerical simulations, we propose an inverse approach to infer both the anisotropic and hyperelastic parameters of incompressible transversely isotropic (TI) soft materials. Finally, we investigate the properties of the solutions to the inverse problem by deriving the condition numbers in analytical form and performing numerical experiments. In Part II of the paper, both ex vivo and in vivo experiments are conducted to demonstrate the applicability of the inverse method in practical use.

  5. Anatomy-based inverse planning dose optimization in HDR prostate implant: A toxicity study

    International Nuclear Information System (INIS)

    Mahmoudieh, Alireza; Tremblay, Christine; Beaulieu, Luc; Lachance, Bernard; Harel, Francois; Lessard, Etienne; Pouliot, Jean; Vigneault, Eric

    2005-01-01

    Background and purpose: The aim of this study is to evaluate the acute and late complications in patients who received an HDR implant boost using inverse planning, and to determine dose-volume correlations. Patients and methods: Between September 1999 and October 2002, 44 patients with locally advanced prostate cancer (PSA ≥10 ng/ml, and/or Gleason score ≥7, and/or Stage T2c or higher) were treated with a 40-45 Gy external pelvic field followed by 2-3 fractions of inverse-planned HDR implant boost (6-9.5 Gy/fraction). Median follow-up time was 1.7 years, with 81.8% of patients having at least 12 months of follow-up (range 8.6-42.5). Acute and late morbidity data were collected and graded according to RTOG criteria. Questionnaires were used to collect prostate-related measures of quality of life and the international prostate symptom score (IPSS) before and after treatment. Dose-volume histograms for the prostate, urethra, bladder, penile bulb and rectum were analyzed. Results: The median patient age was 64 years. Of the patients, 32% were in the high-risk group and 61% in the intermediate-risk group; 3 patients (7%) had no adverse prognostic factors. A single grade 3 GU acute toxicity was reported, but no grade 3-4 acute GI toxicity. No grade 3-4 late GU or GI toxicity was reported. Acute (late) grade 2 urinary and rectal symptoms were reported in 31.8% (11.4%) and 4.6% (4.6%) of patients, respectively. A trend toward predicting acute GU toxicity is seen for a total HDR dose of more than 18 Gy (OR=3.6, 95%CI=[0.96-13.5], P=0.058). The evolution of toxicity is presented for acute and late GU/GI toxicity. Erectile dysfunction occurred in approximately 27% of patients who were not on hormonal deprivation but may have been taking sildenafil. The IPSS peaked on average 6 weeks post-implant and returned to baseline at a median of 6 months. Conclusions: Inverse-planned HDR brachytherapy is a viable option to deliver higher dose to the prostate as a boost without increasing GU or rectal

  6. The dosimetric impact of inversely optimized arc radiotherapy plan modulation for real-time dynamic MLC tracking delivery

    DEFF Research Database (Denmark)

    Falk, Marianne; Larsson, Tobias; Keall, P.

    2012-01-01

    Purpose: Real-time dynamic multileaf collimator (MLC) tracking for management of intrafraction tumor motion can be challenging for highly modulated beams, as the leaves need to travel far to adjust for target motion perpendicular to the leaf travel direction. The plan modulation can be reduced by using a leaf position constraint (LPC) that reduces the difference in the position of adjacent MLC leaves in the plan. The purpose of this study was to investigate the impact of the LPC on the quality of inversely optimized arc radiotherapy plans and the effect of the MLC motion pattern on the dosimetric robustness of MLC tracking delivery. The results indicate that the dosimetric robustness of MLC tracking delivery of an inversely optimized arc radiotherapy plan can be improved by incorporating leaf position constraints in the objective function without otherwise affecting the plan quality. The dosimetric robustness may be estimated prior to delivery by evaluating the ALDw of the plan.

  7. Investigation of using a power function as a cost function in inverse planning optimization.

    Science.gov (United States)

    Xia, Ping; Yu, Naichang; Xing, Lei; Sun, Xuepeng; Verhey, Lynn J

    2005-04-01

    The purpose of this paper is to investigate the use of a power function as a cost function in inverse planning optimization. The cost function for each structure is implemented as an exponential power function of the deviation between the resultant dose and prescribed or constrained dose. The total cost function for all structures is a summation of the cost function of every structure. When the exponents of all terms in the cost function are set to 2, the cost function becomes a classical quadratic cost function. An independent optimization module was developed and interfaced with a research treatment planning system from the University of North Carolina for dose calculation and display of results. Three clinical cases were tested for this study with various exponents set for tumor targets and sensitive structures. Treatment plans with these exponent settings were compared, using dose volume histograms. The results of our study demonstrated that using an exponent higher than 2 in the cost function for the target achieved better dose homogeneity than using an exponent of 2. An exponent higher than 2 for serial sensitive structures can effectively reduce the maximum dose. Varying the exponent from 2 to 4 resulted in the most effective changes in dose volume histograms while the change from 4 to 8 is less drastic, indicating a situation of saturation. In conclusion, using a power function with exponent greater than 2 as a cost function can effectively achieve homogeneous dose inside the target and/or minimize maximum dose to the critical structures.
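The power cost function described above reduces to the classical quadratic case at exponent 2, while higher exponents penalize large deviations more heavily. A minimal sketch (the structure doses, prescription and weight are invented for illustration):

```python
import numpy as np

def power_cost(dose, prescribed, exponent, weight=1.0):
    """Cost for one structure: weighted sum of |d - d_p|^exponent over voxels.
    exponent = 2 recovers the classical quadratic cost; larger exponents
    push the optimizer harder against the largest deviations."""
    return float(weight * np.sum(np.abs(dose - prescribed) ** exponent))

dose = np.array([58.0, 60.0, 63.0])  # hypothetical voxel doses (Gy)
rx = 60.0                            # prescription

print(power_cost(dose, rx, 2))  # quadratic: 4 + 0 + 9  = 13.0
print(power_cost(dose, rx, 4))  # quartic:  16 + 0 + 81 = 97.0
```

Note how the 3 Gy overdose contributes 9/13 of the quadratic cost but 81/97 of the quartic cost: raising the exponent concentrates the penalty on the worst voxels, which matches the observed improvement in target homogeneity and maximum-dose control.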

  8. Development of a residency program in radiation oncology physics: an inverse planning approach.

    Science.gov (United States)

    Khan, Rao F H; Dunscombe, Peter B

    2016-03-08

    Over the last two decades, there has been a concerted effort in North America to organize medical physicists' clinical training programs along more structured and formal lines. This effort has been prompted by the Commission on Accreditation of Medical Physics Education Programs (CAMPEP), which has now accredited about 90 residency programs. Initially, the accreditation focused on standardized, higher-quality clinical physics training; only recently has the development of rounded professionals who can function at a high level in a multidisciplinary environment been recognized as a priority of a radiation oncology physics residency. In this report, we identify the essential components of, and discuss the implementation of, a radiation oncology physics residency designed to produce knowledgeable and effective clinical physicists for today's safety-conscious and collaborative work environment. Our approach is that of inverse planning, by now familiar to all radiation oncology physicists, in which objectives and constraints are identified prior to the design of the program. Our inverse planning objectives not only include those associated with traditional residencies (i.e., clinical physics knowledge and critical clinical skills), but also encompass other attributes essential for success in a modern radiation therapy clinic. These attributes include formal training in management skills and leadership, teaching and communication skills, and knowledge of error management techniques and patient safety. The constraints in our optimization exercise are associated with the limited duration of a residency and the training resources available. Without compromising the knowledge and skills needed for clinical tasks, we have successfully applied the model to the University of Calgary's two-year residency program. The program requires 3840 hours of overall commitment from the trainee, of which 7%-10% is spent obtaining formal training in nontechnical "soft skills".

  9. Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-01-01

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, ''Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory,'' describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, describes several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption, or core observables recorded at core conditions that differ from those at which the adaption was completed. This paper also demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, it illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitudes of input data to one code are adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. The robustness of such an application is also demonstrated.

  10. Expected treatment dose construction and adaptive inverse planning optimization: Implementation for offline head and neck cancer adaptive radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Yan Di; Liang Jian [Department of Radiation Oncology, Beaumont Health System, Royal Oak, Michigan 48073 (United States)

    2013-02-15

    Purpose: To construct the expected treatment dose for adaptive inverse planning optimization, and to evaluate it for head and neck (H&N) cancer adaptive treatment modification. Methods: An adaptive inverse planning engine was developed and integrated in our in-house adaptive treatment control system. The engine includes an expected treatment dose, constructed using the daily cone beam CT (CBCT) images, in its objective and constraints. Feasibility of the adaptive inverse planning optimization was evaluated retrospectively using daily CBCT images obtained from the image-guided IMRT treatment of 19 H&N cancer patients. Adaptive treatment modification strategies with respect to the timing and number of adaptive inverse planning optimizations during the treatment course were evaluated using the cumulative treatment dose in organs of interest constructed from all daily CBCT images. Results: The expected treatment dose was constructed to include both the dose delivered to date and the estimated dose for the remaining treatment during the adaptive treatment course. It was used in treatment evaluation, as well as in constructing the objective and constraints for adaptive inverse planning optimization. The optimization engine is able to perform planning optimization based on a preassigned treatment modification schedule. Compared to conventional IMRT, the adaptive treatment for H&N cancer showed clear dose-volume improvement for all critical normal organs. The dose-volume reductions of the right and left parotid glands, spinal cord, brain stem and mandible were (17 ± 6)%, (14 ± 6)%, (11 ± 6)%, (12 ± 8)%, and (5 ± 3)% respectively with a single adaptive modification performed after the second treatment week; (24 ± 6)%, (22 ± 8)%, (21 ± 5)%, (19 ± 8)%, and (10 ± 6)% with three weekly modifications; and (28 ± 5)%, (25 ± 9)%, (26 ± 5)%, (24 ± 8)%, and (15 ± 9)% with five weekly modifications.
Conclusions

  11. Capability of leaf interdigitation with different inverse planning strategies in Monaco: an investigation of representative tumour sites

    International Nuclear Information System (INIS)

    Duan, Jinghao; Meng, Xiangjuan; Liu, Tonghai; Yin, Yong

    2016-01-01

    The aim of this study was to experimentally assess the dosimetric impact of leaf interdigitation using different inverse treatment strategies for representative tumour sites and to identify the situations in which leaf interdigitation can benefit these tumour sites. Sixty previously treated patients (15 nasopharyngeal carcinoma (NPC), 15 multiple brain metastasis (MBM), 15 cervical cancer and 15 prostate cancer) were re-planned for volumetric modulated arc therapy (VMAT), sliding window IMRT (dMLC) and step-and-shoot IMRT (ssIMRT) with and without leaf interdigitation. Various dosimetric variables, such as PTV coverage, OARs sparing, delivery efficiency and planning time, were evaluated for each plan. In addition, a protocol developed by our group was applied to identify the situations in which leaf interdigitation can achieve benefits in clinical practice. Leaf interdigitation produced few benefits in PTV homogeneity for the MBM VMAT plans and NPC ssIMRT plans. For OARs, sparing was equivalent with and without leaf interdigitation. Leaf interdigitation showed an increase in MUs for dMLC plans and a decrease in MUs for ssIMRT plans. Leaf interdigitation resulted in an increase in segments for dMLC plans and a decrease in segments for NPC and MBM ssIMRT plans. For beam on time, leaf interdigitation showed an increase in MBM dMLC, NPC ssIMRT and prostate ssIMRT plans. In addition, leaf interdigitation saved planning time for VMAT and dMLC plans but increased planning time for ssIMRT plans. Leaf interdigitation does not improve plan quality when performing inverse treatment strategies, regardless of whether the target is simple or complex. However, it influences the delivery efficiency and planning time. Based on these observations, our study suggests that leaf interdigitation should be utilized when performing MBM VMAT plans and NPC ssIMRT plans. 
The online version of this article (doi:10.1186/s13014-016-0655-1) contains supplementary material, which is available to

  12. Mixed integer programming improves comprehensibility and plan quality in inverse optimization of prostate HDR brachytherapy.

    Science.gov (United States)

    Gorissen, Bram L; den Hertog, Dick; Hoffmann, Aswin L

    2013-02-21

    Current inverse treatment planning methods that optimize both catheter positions and dwell times in prostate HDR brachytherapy use surrogate linear or quadratic objective functions that have no direct interpretation in terms of dose-volume histogram (DVH) criteria, do not result in an optimum or have long solution times. We decrease the solution time of the existing linear and quadratic dose-based programming models (LP and QP, respectively) to allow optimizing over potential catheter positions using mixed integer programming. An additional average speed-up of 75% can be obtained by stopping the solver at an early stage, without deterioration of the plan quality. For a fixed catheter configuration, the dwell time optimization model LP solves to optimality in less than 15 s, which confirms earlier results. We propose an iterative procedure for QP that allows us to prescribe the target dose as an interval, while retaining independence between the solution time and the number of dose calculation points. This iterative procedure is comparable in speed to the LP model and produces better plans than the non-iterative QP. We formulate a new dose-volume-based model that maximizes V(100%) while satisfying pre-set DVH criteria. This model optimizes both catheter positions and dwell times within a few minutes depending on prostate volume and number of catheters, optimizes dwell times within 35 s and gives better DVH statistics than dose-based models. The solutions suggest that the correlation between the objective value and the clinical plan quality is weak in the existing dose-based models.
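The LP dwell-time model discussed above can be illustrated with a toy linear program: minimize total dose to organ-at-risk points subject to a minimum target dose and non-negative dwell times. This sketch uses `scipy.optimize.linprog` with an invented dose-rate kernel and bounds, and is far simpler than the paper's models (no catheter-position variables, no DVH criteria):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)

# Toy dose-rate kernel: dose per second of dwell time from 4 dwell
# positions to 6 calculation points (first 4 = target, last 2 = OAR).
A = rng.uniform(0.5, 2.0, size=(6, 4))
target_rows, oar_rows = A[:4], A[4:]

L = 10.0  # minimum dose to every target point (arbitrary units)

# Objective: total OAR dose. Constraints: target dose >= L, dwell times >= 0.
# linprog takes A_ub @ t <= b_ub, so "target dose >= L" becomes
# (-target_rows) @ t <= -L.
c = oar_rows.sum(axis=0)
res = linprog(c, A_ub=-target_rows, b_ub=-np.full(4, L),
              bounds=[(0, None)] * 4)

print(res.success)  # the LP solves to optimality
print(bool(np.all(target_rows @ res.x >= L - 1e-6)))
```

Real formulations add binary variables for candidate catheter positions (the mixed integer part) and dose-volume counting constraints, which is what drives the solution times reported above.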

  13. An Interactive Simulation Tool for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard

    1994-01-01

    The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory.

  14. The dose-volume constraint satisfaction problem for inverse treatment planning with field segments

    International Nuclear Information System (INIS)

    Michalski, Darek; Xiao, Ying; Censor, Yair; Galvin, James M

    2004-01-01

    The prescribed goals of radiation treatment planning are often expressed in terms of dose-volume constraints. We present a novel formulation of a dose-volume constraint satisfaction search for the discretized radiation therapy model. This approach does not rely on any explicit cost function. Inverse treatment planning uses the aperture-based approach, with segmental fields predefined according to geometric rules. The solver utilizes the simultaneous version of the cyclic subgradient projection algorithm, a deterministic iterative method designed for solving convex feasibility problems. A prescription is expressed as a set of inequalities imposed on the dose at the voxel resolution. Additional constraint functions control compliance with selected points of the expected cumulative dose-volume histograms. The performance of this method is tested on prostate and head-and-neck cases. The relationships with other models and algorithms of similar conceptual origin are discussed. The demonstrated advantages of the method are: the equivalence of the algorithmic and prescription parameters, the intuitive setup of the free parameters, and the improved speed of the method compared to similar iterative and other techniques. The technique reported here delivers approximate solutions for inconsistent prescriptions.
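The simultaneous-projection idea underlying such feasibility solvers — project the current weight vector onto each dose half-space and average the results, with no explicit cost function — can be sketched on a toy problem. Everything here is invented for illustration; in particular, the dose bounds are built around a known feasible solution so that the constraint intersection is guaranteed nonempty:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy influence matrix: dose to 5 voxels from 3 segment weights.
A = rng.uniform(0.1, 1.0, size=(5, 3))
x_feas = np.array([0.5, 0.8, 0.3])           # known feasible weights
d_feas = A @ x_feas
lower, upper = d_feas - 0.05, d_feas + 0.05  # per-voxel dose bounds

def project_halfspace(x, a, b):
    """Orthogonal projection of x onto the half-space {x : a.x <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def simultaneous_projection(x, n_iter=2000):
    for _ in range(n_iter):
        moves = [project_halfspace(x, a, u) for a, u in zip(A, upper)]
        moves += [project_halfspace(x, -a, -l) for a, l in zip(A, lower)]
        x = np.clip(np.mean(moves, axis=0), 0.0, None)  # weights >= 0
    return x

x = simultaneous_projection(np.zeros(3))
dose = A @ x
violation = max(float(np.max(dose - upper)), float(np.max(lower - dose)), 0.0)
print(violation < 0.05)  # iterates approach the feasible set
```

Each inequality is handled by a cheap closed-form projection, which is why such methods scale well with the number of voxels; the full algorithm in the paper additionally uses subgradient steps for the (nonlinear) dose-volume counting constraints.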

  15. Compact nuclear simulator and its upgrade plan

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Park, Jae-Chang; Jung, Chul-Hwan; Lee, Jang-Soo; Kim, Jang-Yeol

    1997-01-01

    The Compact Nuclear Simulator (CNS) was installed at the nuclear training center of the Korea Atomic Energy Research Institute (KAERI) in 1998. The CNS has been used for training non-operator personnel, such as NSSS design engineers, maintenance personnel, and inspectors of the regulatory body, and for testing fuzzy control algorithms. The CNS mathematical model represents a three-loop Westinghouse Pressurized Water Reactor (PWR) of 993 MWe, commonly referred to in Korea as Kori Units 3 and 4. However, the main computer (Micro VAX II), the interface card between the main computer and the operator panel, and the graphic display system suffer frequent failures due to obsolescence and a lack of spare parts. Accordingly, the CNS hardware should be replaced by state-of-the-art equipment. There are plans to replace the main computer with an HP workstation, the dedicated interface card with a PLC-based interface system, and the graphic interface system with an X-terminal-based full graphics system. The full graphics user interface supports an easy and friendly interface between the CNS and its users. The software for the instructor console will also be converted from a text-based to a Motif-based user interface, which provides more efficient and easier operation of the instructor console. The real-time executive software programmed under the MicroVMS operating system should also be replaced by software programmed under the HP-UX operating system. (author)

  16. Exploration of the impact of nearby sources on urban atmospheric inversions using large eddy simulation

    Directory of Open Access Journals (Sweden)

    Brian J. Gaudet

    2017-10-01

    The Indianapolis Flux Experiment (INFLUX) aims to quantify and improve the effectiveness of inferring greenhouse gas (GHG) source strengths from downstream concentration measurements in urban environments. Mesoscale models such as the Weather Research and Forecasting (WRF) model can provide realistic depictions of planetary boundary layer (PBL) structure and flow fields at horizontal grid lengths (Δx) down to a few km. Nevertheless, a number of potential sources of error exist in the use of mesoscale models for urban inversions, including accurate representation of the dispersion of GHGs by turbulence close to a point source. Here we evaluate the predictive skill of a 1-km chemistry-adapted WRF (WRF-Chem) simulation of daytime CO2 transport from an Indianapolis power plant for a single INFLUX case (28 September 2013). We compare the simulated plume release on domains at different resolutions, as well as on a domain run in large eddy simulation (LES) mode, enabling us to study the impact of both spatial resolution and parameterization of PBL turbulence on the transport of CO2. Sensitivity tests demonstrate that much of the difference between 1-km mesoscale and 111-m LES plumes, including substantially lower maximum concentrations in the mesoscale simulation, is due to the different horizontal resolutions. However, resolution is insufficient to account for the slower rate of ascent of the LES plume with downwind distance, which results in much higher surface concentrations for the LES plume in the near field but a near absence of tracer aloft. Physics sensitivity experiments and theoretical analytical models demonstrate that this effect is an inherent problem with the parameterization of turbulent transport in the mesoscale PBL scheme. A simple transformation is proposed that may be applied to mesoscale model concentration footprints to correct for their near-field biases. Implications for longer-term source inversion are discussed.

  17. Simulation of inverse Compton scattering and its implications on the scattered linewidth

    Science.gov (United States)

    Ranjan, N.; Terzić, B.; Krafft, G. A.; Petrillo, V.; Drebot, I.; Serafini, L.

    2018-03-01

    Rising interest in inverse Compton sources has increased the need for efficient models that properly quantify the behavior of scattered radiation given a set of interaction parameters. The current state-of-the-art simulations rely on Monte Carlo-based methods, which, while properly expressing scattering behavior in high-probability regions of the produced spectra, may not correctly simulate such behavior in low-probability regions (e.g. tails of spectra). Moreover, sampling may take an inordinate amount of time for the desired accuracy to be achieved. In this paper, we present an analytic derivation of the expression describing the scattered radiation linewidth and propose a model to describe the effects of horizontal and vertical emittance on the properties of the scattered radiation. We also present an improved version of the code initially reported in Krafft et al. [Phys. Rev. Accel. Beams 19, 121302 (2016), 10.1103/PhysRevAccelBeams.19.121302], which can perform the same simulations as CAIN and give accurate results in low-probability regions by integrating over the emissions of the electrons. Finally, we use these codes to carry out simulations that closely verify the behavior predicted by the analytically derived scaling law.

  18. Leaf nitrogen spectral reflectance model of winter wheat (Triticum aestivum) based on PROSPECT: simulation and inversion

    Science.gov (United States)

    Yang, Guijun; Zhao, Chunjiang; Pu, Ruiliang; Feng, Haikuan; Li, Zhenhai; Li, Heli; Sun, Chenhong

    2015-01-01

    Through its association with proteins and plant pigments, leaf nitrogen (N) plays an important regulatory role in photosynthesis, leaf respiration, and net primary production. However, traditional methods of measuring leaf N are rooted in sample-based laboratory spectroscopy, and deriving leaf N from nondestructive, field-measured leaf spectra remains a major challenge. In this study, the original PROSPECT model was extended by replacing its chlorophyll absorption coefficient with an equivalent N absorption coefficient, yielding a nitrogen-based PROSPECT model (N-PROSPECT). N-PROSPECT was evaluated by comparing model-simulated reflectance values with measured leaf reflectance values; the correlation coefficient (R) was 0.98 over the 400 to 2500 nm wavelength range. Finally, N-PROSPECT was used to simulate leaf reflectance using different combinations of input parameters, and partial least squares regression (PLSR) was used to establish the relationship between the N-PROSPECT-simulated reflectance and the corresponding leaf nitrogen density (LND). The inverse of the PLSR-based N-PROSPECT model was used to retrieve LND from the measured reflectance with relatively high accuracy (R²=0.77, RMSE=22.15 μg cm-2). This result demonstrates that the N-PROSPECT model established in this study can accurately simulate nitrogen spectral contributions and retrieve LND.
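
The PLSR step described above can be sketched with a minimal single-response PLS (PLS1, NIPALS) implementation on synthetic data; the band count, sample size, component number, and data are illustrative assumptions, not the authors' calibration:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Single-response PLS regression (NIPALS). Returns coefficients b and
    intercept b0 so that y_hat = X @ b + b0."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # score
        p = Xc.T @ t / (t @ t)            # X loading
        qa = yc @ t / (t @ t)             # y loading
        Xc = Xc - np.outer(t, p)          # deflate X and y
        yc = yc - qa * t
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)   # regression coefficients
    b0 = y.mean() - X.mean(axis=0) @ b
    return b, b0

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))             # e.g. simulated reflectance features
y = X @ rng.normal(size=10) + 0.01 * rng.normal(size=60)   # e.g. LND proxy
b, b0 = pls1_fit(X, y, n_components=4)
r2 = 1 - np.sum((y - (X @ b + b0))**2) / np.sum((y - y.mean())**2)
```

Inverting the calibration, as in the abstract, then amounts to applying the fitted coefficients to new measured spectra.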

  19. Objective mapping of observed sub-surface mesoscale cold core eddy in the Bay of Bengal by stochastic inverse technique with tomographically simulated travel times

    Digital Repository Service at National Institute of Oceanography (India)

    Murty, T.V.R.; Rao, M.M.M.; Sadhuram, Y.; Sridevi, B.; Maneesha, K.; SujithKumar, S.; Prasanna, P.L.; Murthy, K.S.R.

    of Bengal during south-west monsoon season and explore possibility to reconstruct the acoustic profile of the eddy by Stochastic Inverse Technique. A simulation experiment on forward and inverse problems for observed sound velocity perturbation field has...

  20. SU-E-T-628: A Cloud Computing Based Multi-Objective Optimization Method for Inverse Treatment Planning.

    Science.gov (United States)

    Na, Y; Suh, T; Xing, L

    2012-06-01

    Multi-objective (MO) plan optimization entails generation of an enormous number of IMRT or VMAT plans constituting the Pareto surface, which presents a computationally challenging task. The purpose of this work is to overcome the hurdle by developing an efficient MO method using the emerging cloud computing platform. As the backbone of cloud computing for optimizing inverse treatment planning, Amazon Elastic Compute Cloud with a master node (17.1 GB memory, 2 virtual cores, 420 GB instance storage, 64-bit platform) is used. The master node can seamlessly scale a number of working-group instances, called workers, based on user-defined settings that account for the MO functions of a clinical setting. Each worker solves the objective function with an efficient sparse decomposition method, and workers are automatically terminated when their tasks are finished. The optimized plans are archived to the master node to generate the Pareto solution set. Three clinical cases have been planned using the developed MO IMRT and VMAT planning tools to demonstrate the advantages of the proposed method. The target dose coverage and critical-structure sparing of plans obtained using the cloud computing platform are identical to those obtained using a desktop PC (Intel Xeon® CPU 2.33 GHz, 8 GB memory). MO planning on the cloud substantially speeds up generation of the Pareto set for both types of plans, with a speedup that scales approximately linearly with the number of nodes used for computing. With the use of N nodes, the computation time follows the fitted model 0.2+2.3/N (r²>0.99) on average over the cases, making real-time MO planning possible. A cloud computing infrastructure is developed for MO optimization. The algorithm substantially improves the speed of inverse plan optimization. The platform is valuable for both MO planning and future off- or on-line adaptive re-planning. © 2012 American Association of Physicists in Medicine.
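
The reported scaling model T(N) = 0.2 + 2.3/N is linear in 1/N, so it can be recovered by ordinary least squares; a minimal sketch with idealized timings (the node counts below are illustrative assumptions):

```python
import numpy as np

# Hypothetical per-case optimization times (arbitrary units) vs. node count,
# generated here from the model reported in the abstract
N = np.array([1, 2, 4, 8, 16, 32])
T = 0.2 + 2.3 / N

# T = a + b/N is linear in x = 1/N, so a degree-1 polynomial fit suffices;
# np.polyfit returns the highest-degree coefficient first (slope b, then a)
b, a = np.polyfit(1.0 / N, T, 1)
```

With real (noisy) timings the same fit yields the intercept a (irreducible serial time) and slope b (parallelizable work per node).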

  1. The use of mixed-integer programming for inverse treatment planning with pre-defined field segments

    International Nuclear Information System (INIS)

    Bednarz, Greg; Michalski, Darek; Houser, Chris; Huq, M. Saiful; Xiao Ying; Rani, Pramila Anne; Galvin, James M.

    2002-01-01

    Complex intensity patterns generated by traditional beamlet-based inverse treatment plans are often very difficult to deliver. In the approach presented in this work, the intensity maps are controlled by pre-defining the field segments to be used for dose optimization. A set of simple rules was used to define a pool of allowable delivery segments, and the mixed-integer programming (MIP) method was used to optimize segment weights. The optimization problem was formulated by combining real variables describing segment weights with a set of binary variables used to enumerate voxels in targets and critical structures. The MIP method was compared to the previously used Cimmino projection algorithm, and the field segmentation approach was compared to an inverse planning system with traditional beamlet-based beam intensity optimization. In four complex cases of oropharyngeal cancer, segmental inverse planning produced treatment plans that competed with traditional beamlet-based IMRT plans. Mixed-integer programming provided a mechanism for imposing dose-volume constraints and allowed identification of the optimal solution for feasible problems. Additional advantages of the segmental technique presented here are simplified dosimetry, quality assurance and treatment delivery. (author)
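
The real-variable part of this formulation, finding nonnegative weights for a fixed pool of segments, can be sketched as a projected-gradient least-squares solve. This is only the continuous subproblem; the paper's full MIP with binary voxel variables needs an integer-programming solver, and the dose-influence matrix and prescription below are synthetic assumptions:

```python
import numpy as np

def optimize_segment_weights(A, d, iters=5000):
    """Nonnegative segment weights w minimizing ||A w - d||^2 by projected
    gradient descent. A[i, j] = dose to voxel i per unit weight of segment j;
    d = prescribed voxel doses."""
    w = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # safe step from spectral norm
    for _ in range(iters):
        grad = A.T @ (A @ w - d)
        w = np.maximum(w - step * grad, 0.0)      # project onto w >= 0
    return w

rng = np.random.default_rng(1)
A = rng.uniform(0.0, 1.0, size=(50, 8))           # 50 voxels, 8 allowed segments
w_true = np.array([2.0, 0.0, 1.5, 0.0, 3.0, 0.0, 0.5, 0.0])
d = A @ w_true                                    # consistent prescription
w = optimize_segment_weights(A, d)
residual = np.linalg.norm(A @ w - d)
```

Because the prescription is consistent here, the residual converges toward zero while all weights stay deliverable (nonnegative).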

  2. Stereotactic intensity-modulated radiation therapy (IMRT) and inverse treatment planning for advanced pleural mesothelioma. Feasibility and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Muenter, M.W.; Thilmann, C.; Hof, H.; Debus, J. [Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center (dkfz), Heidelberg (Germany); Nill, S.; Hoess, A.; Partridge, M. [Dept. of Medical Physics, German Cancer Research Center (dkfz), Heidelberg (Germany); Haering, P. [Dept. of Central Dosimetry, German Cancer Research Center (dkfz), Heidelberg (Germany); Manegold, C. [Dept. of Medical Oncology/Internal Medicine, Thoraxklinik Heidelberg gGmbH, Heidelberg (Germany); Wannenmacher, M. [Dept. of Clinical Radiology, Univ. of Heidelberg, Heidelberg (Germany)

    2003-08-01

    Background and Purpose: Complex-shaped malignant pleural mesotheliomas (MPMs) with challenging volumes are extremely difficult to treat by conventional radiotherapy due to tolerance doses of the surrounding normal tissue. In a feasibility study, we evaluated whether inversely planned stereotactic intensity-modulated radiation therapy (IMRT) could be applied in the treatment of MPM. Patients and Methods: Eight patients with unresectable lesions were treated after failure of chemotherapy. All patients were positioned using noninvasive fixation techniques that attach to the extracranial stereotactic system used. Due to the craniocaudal extension of the tumor, it was necessary to develop special software, coupled to the inverse planning program KonRad, which can connect two inverse treatment plans and account for the dose applied by the first treatment plan in the region of the matchline of the second. Results: Except for one patient, in whom radiotherapy was canceled due to abdominal metastasis, treatment could be completed in all patients and was well tolerated. Median survival was 20 months after diagnosis and 6.5 months after IMRT. Both the 1-year actuarial overall survival from the start of radiotherapy and the 2-year actuarial overall survival since diagnosis were 28%. IMRT did not result in clinically significant acute side effects. By using the described inverse planning software, over- or underdosage in the region of the field matchline could be prevented. Net treatment time ranged between 10 and 21 min. Conclusion: This study showed that IMRT is feasible in advanced unresectable MPM. The presented possibilities of stereotactic IMRT in the treatment of MPM justify the evaluation of IMRT in early-stage pleural mesothelioma combined with chemotherapy in a study protocol, in order to improve the outcome of these patients. Furthermore, dose escalation should be possible by using IMRT. (orig.)

  3. Seismic signal simulation and study of underground nuclear sources by moment inversion

    International Nuclear Information System (INIS)

    Crusem, R.

    1986-09-01

    Some problems of underground nuclear explosions are examined from the seismological point of view. In the first part, a model is developed for mean seismic propagation through the lagoon of Mururoa atoll and for the calculation of synthetic seismograms (at intermediate distances: 5 to 20 km) by summation of discrete wave numbers. In the second part, this ground model is used with a linear inversion method on seismic moments to estimate the elastic source terms equivalent to the nuclear source. Only the isotropic part is investigated; solution stability is increased by using spectral smoothing and a minimum-phase hypothesis. Some examples of applications are presented: total energy estimation of a nuclear explosion, and simulation of mechanical effects induced by an underground explosion [fr

  4. COMPENSATED INVERSE PID CONTROLLER FOR ACTIVE VIBRATION CONTROL WITH PIEZOELECTRIC PATCHES: MODELING, SIMULATION AND IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Asan Gani

    2010-09-01

    Full Text Available Active vibration control of the first three modes of a vibrating cantilever beam using collocated piezoelectric sensor and actuator is examined in this paper. To achieve this, a model based on the Euler-Bernoulli beam equation is adopted and extended to the case of three bonded piezoelectric patches that act as sensor, actuator and exciter, respectively. A compensated inverse PID controller has been designed and developed to damp the first three modes of vibration. Controllers were designed for each mode and later combined in parallel to damp any of the three modes. The individual controllers give better reduction in sensor output for the second and third modes, while the combined controller performs better for the first mode. Simulation studies are carried out using MATLAB, the results are compared and verified experimentally, and the real-time implementation is carried out with the xPC Target toolbox in MATLAB.
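
As background to the control law, a standard discrete PID loop can be sketched as follows. This is a generic PID driving an illustrative first-order plant, not the paper's compensated inverse PID or its beam model; the gains and time step are assumptions:

```python
class PID:
    """Discrete PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a first-order plant x' = -x + u to a setpoint of 1.0
dt, x, setpoint = 0.01, 0.0, 1.0
ctrl = PID(kp=2.0, ki=5.0, kd=0.0, dt=dt)
for _ in range(2000):                 # simulate 20 s
    u = ctrl.update(setpoint - x)
    x += dt * (-x + u)                # forward-Euler plant step
```

The integral term removes steady-state error; per-mode controllers like those in the paper would run several such loops in parallel on the modal coordinates.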

  5. Comparison of step and shoot IMRT treatment plans generated by three inverse treatment planning systems; Comparacion de tratamientos de IMRT estatica generados por tres sistemas de planificacion inversa

    Energy Technology Data Exchange (ETDEWEB)

    Perez Moreno, J. M.; Zucca Aparicio, D.; Fernandez leton, P.; Garcia Ruiz-Zorrilla, J.; Minambres Moro, A.

    2011-07-01

    One of the most important issues in intensity-modulated radiation therapy (IMRT) treatments using the step-and-shoot technique is the number of segments and monitor units (MU) needed for treatment delivery. These parameters depend heavily on the inverse optimization module of the treatment planning system (TPS) used. Three commercial treatment planning systems were evaluated: CMS XiO, iPlan and Prowess Panther. With each of them we generated a treatment plan for the same group of patients, corresponding to clinical cases, and compared the dosimetric results, calculated MU and number of segments. The Prowess treatment planning system generates plans with a significantly lower number of segments than the other systems, while the MU are less than half. This implies important reductions in leakage radiation and delivery time. Degradation in the final dose calculation is very small, because Prowess directly optimizes the positions of the multileaf collimator (MLC). (Author) 13 refs.

  6. Theory and simulation of an inverse free-electron laser experiment

    Science.gov (United States)

    Gou, S. K.; Bhattacharjee, A.; Fang, J.-M.; Marshall, T. C.

    1997-03-01

    An experimental demonstration of the acceleration of electrons using a high-power CO2 laser interacting with a relativistic electron beam moving along a wiggler has been carried out at the Accelerator Test Facility of the Brookhaven National Laboratory [Phys. Rev. Lett. 77, 2690 (1996)]. The data generated by this inverse free-electron-laser (IFEL) experiment are studied by means of theory and simulation. Included in the simulations are such effects as: a low-loss metallic waveguide with a dielectric coating on the walls; multi-mode coupling due to self-consistent interaction between the electrons and the optical wave; space charge; energy spread of the electrons; and arbitrary wiggler-field profile. Two types of wiggler profile are considered: a linear taper of the period, and a step-taper of the period. (The period of the wiggler is ~3 cm, its magnetic field is ~1 T, and the wiggler length is 0.47 m.) The energy increment of the electrons (~1-2%) is analyzed in detail as a function of laser power, wiggler parameters, and the initial beam energy (~40 MeV). At a laser power level ~0.5 GW, the simulation results on energy gain are in reasonable agreement with the experimental results. Preliminary results on the electron energy distribution at the end of the IFEL are presented. Whereas the experiment produces a near-monotone distribution of electron energies with the peak shifted to higher energy, the simulation shows a more structured and non-monotonic distribution at the end of the wiggler. Effects that may help reconcile these differences are considered.

  7. Thermohaline structure of an inverse estuary--The Gulf of Kachchh: measurements and model simulations.

    Science.gov (United States)

    Vethamony, P; Babu, M T; Ramanamurty, M V; Saran, A K; Joseph, Antony; Sudheesh, K; Padgaonkar, Rupali S; Jayakumar, S

    2007-06-01

    The Gulf of Kachchh (GoK) is situated in the northeastern Arabian Sea. The presence of several industries along its coastal belt makes GoK a highly sensitive coastal ecosystem. In the present study, an attempt is made for the first time to study GoK thermohaline structure and its variability, based on field measurements and model simulations. Though GoK is considered a well-mixed system, the study reveals that only the central Gulf is well mixed. Vertical gradients in temperature and salinity fields are noticed in the eastern Gulf, where a cold and high saline tongue is observed in the subsurface layers. Salinity indicates the characteristic feature of an inverse estuary with low values (37.20 psu) near the mouth and high values (>40.0 psu) near the head of the Gulf. The model simulated temperature and salinity fields exhibit semidiurnal oscillations similar to that of field observations. Model results show cold, high saline waters advecting from the east during ebb forming a transition zone, which oscillates with tides. A high salinity tongue is seen in the bottom layer, indicating a westward flowing bottom current. The transition zone acts as a dynamic barrier, and plays a vital role in the pollutant transport.

  8. Compression moulding simulations of SMC using a multiobjective surrogate-based inverse modeling approach

    Science.gov (United States)

    Marjavaara, B. D.; Ebermark, S.; Lundström, T. S.

    2009-09-01

    A multiobjective surrogate-based inverse modeling technique to predict the spatial and temporal pressure distribution numerically during the fabrication of sheet moulding compounds (SMCs) is introduced. Specifically, an isotropic temperature-dependent Newtonian viscosity model of a SMC charge is fitted to experimental measurements via numerical simulations in order to mimic the temporal pressure distribution at two spatial locations simultaneously. The simulations are performed using the commercial computational fluid dynamics (CFD) code ANSYS CFX-10.0, and the proposed multiobjective surrogate-based fitting procedure is carried out with a hybrid formulation of the NSGA-IIa evolutionary algorithm and the response surface methodology in Matlab. The outcome of the analysis shows the ability of the optimization framework to efficiently reduce the total computational load of the problem. Furthermore, the assumed viscosity model seems able to resolve the temporal pressure distribution and the advancing flow front accurately, which cannot be said of the spatial pressure distribution. Hence, it is recommended to improve the proposed CFD model in order to better capture the true behaviour of the mould flow.
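
The response-surface half of such a hybrid procedure can be sketched by fitting a quadratic surrogate to a handful of expensive evaluations and minimizing the fit analytically. This is a generic illustration; the objective function and sample design below are assumptions, not the SMC mould-flow model:

```python
import numpy as np

def quadratic_surrogate(X, f):
    """Least-squares fit of f(x1, x2) ~ c0 + c1*x1 + c2*x2 + c3*x1^2
    + c4*x2^2 + c5*x1*x2 from sampled evaluations."""
    x1, x2 = X[:, 0], X[:, 1]
    basis = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(basis, f, rcond=None)
    return coef

def surrogate_minimum(coef):
    """Stationary point of the fitted quadratic (minimum if Hessian is PD)."""
    c0, c1, c2, c3, c4, c5 = coef
    H = np.array([[2 * c3, c5], [c5, 2 * c4]])
    g = np.array([c1, c2])
    return np.linalg.solve(H, -g)

# Stand-in "expensive" simulation: paraboloid with minimum at (1.5, -0.5)
expensive = lambda x1, x2: (x1 - 1.5)**2 + 2 * (x2 + 0.5)**2
rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(20, 2))        # sampled design points
f = expensive(X[:, 0], X[:, 1])             # expensive evaluations
xmin = surrogate_minimum(quadratic_surrogate(X, f))
```

In the multiobjective case, an evolutionary algorithm such as NSGA-II searches over surrogates like this one instead of calling the CFD solver directly.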

  9. Plan Validation Using DES and Agent-based Simulation

    National Research Council Canada - National Science Library

    Wong, Teck H; Ong, Kim S

    2008-01-01

    .... This thesis explores the possibility of using a multi-agent system (MAS) to generate the aggressor's air strike plans, which could be coupled with a low resolution Discrete Event Simulation (DES...

  10. Identifying the contribution of capillary, film and vapour flow by inverse simulation of transient evaporation experiments

    Science.gov (United States)

    Iden, Sascha; Diamantopoulos, Efstathios; Durner, Wolfgang

    2017-04-01

    Evaporation from bare soil is an important component of the water cycle and the surface energy balance in arid and semi-arid regions. Modeling soil water movement in dry soil and predicting the evaporation fluxes to the atmosphere still face considerable challenges. Flow simulations rely on a proper conceptual model for water flow and an adequate parameterization of soil hydraulic properties. While the inclusion of vapor flow into variably-saturated flow models has become more widespread recently, the parametrization of the unsaturated hydraulic conductivity function in dry soil is often still based on sparse literature data from the past which do not extend into the dry range. Another shortcoming is that standard models of hydraulic conductivity do not account for water flow in incompletely-filled pores, i.e. film and corner flow. The objective of this study was to identify soil hydraulic properties by inverse modeling, with a particular focus on the medium to dry moisture range. We conducted evaporation experiments on large soil columns under laboratory conditions and used an extended instrumentation, consisting of minitensiometers and relative humidity sensors, to measure the pressure head over a wide range from saturation to -100 MPa. Evaporation rate and column-averaged water content were measured gravimetrically. The resulting data were evaluated by inverse modeling using the isothermal Richards equation as process model. Our results clearly demonstrate that classic models of soil hydraulic conductivity which are based on the assumption that water flows exclusively in water-filled capillaries, cannot describe the observed time series of pressure head and relative humidity. An adequate description of the observations was only possible by accounting for isothermal vapor flow and an additional flow of liquid water. The physical cause of the latter could be film and corner flow as proposed before based on a theoretical analysis of water flow in angular porous

  11. IPIP: A new approach to inverse planning for HDR brachytherapy by directly optimizing dosimetric indices

    OpenAIRE

    Pouliot, Jean; Cunha, Jason Adam; Hsu, I-Chow

    2011-01-01

    Purpose: Many planning methods for high dose rate (HDR) brachytherapy require an iterative approach. A set of computational parameters are hypothesized that will give a dose plan that meets dosimetric criteria. A dose plan is computed using these parameter

  12. The Paper Airplane Challenge: A Market Economy Simulation. Lesson Plan.

    Science.gov (United States)

    Owens, Kimberly

    This lesson plan features a classroom simulation that helps students understand the characteristics of a market economic system. The lesson plan states a purpose; cites student objectives; suggests a time duration; lists materials needed; and details a step-by-step teaching procedure. The "Paper Airplane Challenge" handout is attached. (BT)

  13. New inverse planning technology for image-guided cervical cancer brachytherapy: Description and evaluation within a clinical frame

    International Nuclear Information System (INIS)

    Trnkova, Petra; Poetter, Richard; Baltas, Dimos; Karabis, Andreas; Fidarova, Elena; Dimopoulos, Johannes; Georg, Dietmar; Kirisits, Christian

    2009-01-01

    Purpose: To test the feasibility of a new inverse planning technology based on the Hybrid Inverse treatment Planning and Optimisation (HIPO) algorithm for image-guided cervical cancer brachytherapy in comparison to conventional manual optimisation as applied in recent clinical practice, based on long-term intracavitary cervical cancer brachytherapy experience. Materials and methods: The clinically applied treatment plans of 10 tandem/ring (T/R) cases and 10 cases with additional needles (T/R + N), planned with PLATO v14.3, were included. Standard loading patterns were manually optimised to reach optimal coverage with 7 Gy per fraction to the High Risk CTV and to fulfil dose constraints for organs at risk. For each of these patients an inverse plan was retrospectively created with Oncentra GYN v0.9.14. Anatomy-based automatic source activation was based on the topography of target and organs. The HIPO algorithm included individual gradient and modification restrictions for the T/R and needle dwell times to preserve the spatial high-dose distribution known from long-term clinical experience with standard cervical cancer brachytherapy and manual planning. Results: HIPO achieved better target coverage (V100) for all T/R and 7 T/R + N patients. Changes in the shape of the overdose volume (V200/400) were limited. The D2cc per fraction for bladder, rectum and sigmoid colon was on average lower by 0.2 Gy, 0.4 Gy and 0.2 Gy, respectively, for T/R patients and by 0.6 Gy, 0.3 Gy and 0.3 Gy for T/R + N patients (a decrease from 4.5 to 4 Gy per fraction means a total dose reduction of 5 Gy EQD2 for a 4-fraction schedule). In general the dwell times in the additional needles were lower compared to manual planning. The sparing factors were always better for HIPO plans. Additionally, in 7 T/R and 7 T/R + N patients all three of D0.1cc, D1cc and D2cc for the vagina wall were lower and a smaller area of vagina was covered by the reference dose in HIPO plans. Overall loading

  14. Estimating tectonic history through basin simulation-enhanced seismic inversion: Geoinformatics for sedimentary basins

    Science.gov (United States)

    Tandon, K.; Tuncay, K.; Hubbard, K.; Comer, J.; Ortoleva, P.

    2004-01-01

    A data assimilation approach is demonstrated whereby seismic inversion is both automated and enhanced using a comprehensive numerical sedimentary basin simulator to study the physics and chemistry of sedimentary basin processes in response to geothermal gradient in much greater detail than previously attempted. The approach not only reduces costs by integrating the basin analysis and seismic inversion activities to understand the sedimentary basin evolution with respect to geodynamic parameters, but also has the potential to serve as a geoinformatics platform for understanding various physical and chemical processes operating at different scales within a sedimentary basin. Tectonic history has a first-order effect on the physical and chemical processes that govern the evolution of sedimentary basins. We demonstrate how such tectonic parameters may be estimated by minimizing the difference between observed seismic reflection data and synthetic ones constructed from the output of a reaction, transport, mechanical (RTM) basin model. We demonstrate the method by reconstructing the geothermal gradient. As thermal history strongly affects the rate of RTM processes operating in a sedimentary basin, variations in geothermal gradient history alter the present-day fluid pressure, effective stress, porosity, fracture statistics and hydrocarbon distribution. All these properties, in turn, affect the mechanical wave velocity and sediment density profiles for a sedimentary basin. The present-day state of the sedimentary basin is imaged by reflection seismology data to a high degree of resolution, but it does not give any indication of the processes that contributed to the evolution of the basin or causes for heterogeneities within the basin that are being imaged. Using texture and fluid properties predicted by our Basin RTM simulator, we generate synthetic seismograms. Linear correlation using power spectra as an error measure and an efficient quadratic

  15. On the Structure and Adjustment of Inversion-Capped Neutral Atmospheric Boundary-Layer Flows: Large-Eddy Simulation Study

    DEFF Research Database (Denmark)

    Pedersen, Jesper Grønnegaard; Gryning, Sven-Erik; Kelly, Mark C.

    2014-01-01

    A range of large-eddy simulations, with differing free atmosphere stratification and zero or slightly positive surface heat flux, is investigated to improve understanding of the neutral and near-neutral, inversion-capped, horizontally homogeneous, barotropic atmospheric boundary layer with emphas...

  16. System Planning With The Hanford Waste Operations Simulator

    International Nuclear Information System (INIS)

    Crawford, T.W.; Certa, P.J.; Wells, M.N.

    2010-01-01

    At the U. S. Department of Energy's Hanford Site in southeastern Washington State, 216 million liters (57 million gallons) of nuclear waste is currently stored in aging underground tanks, threatening the Columbia River. The River Protection Project (RPP), a fully integrated system of waste storage, retrieval, treatment, and disposal facilities, is in varying stages of design, construction, operation, and future planning. These facilities face many overlapping technical, regulatory, and financial hurdles to achieve site cleanup and closure. Program execution is ongoing, but completion is currently expected to take approximately 40 more years. Strategic planning for the treatment of Hanford tank waste is by nature a multi-faceted, complex and iterative process. To help manage the planning, a report referred to as the RPP System Plan is prepared to provide a basis for aligning the program scope with the cost and schedule, from upper-tier contracts to individual facility operating plans. The Hanford Tank Waste Operations Simulator (HTWOS), a dynamic flowsheet simulation and mass balance computer model, is used to simulate the current planned RPP mission, evaluate the impacts of changes to the mission, and assist in planning near-term facility operations. Development of additional modeling tools, including an operations research model and a cost model, will further improve long-term planning confidence. The most recent RPP System Plan, Revision 4, was published in September 2009.

  18. IPIP: A New Approach to Inverse Planning for HDR Brachytherapy by Directly Optimizing Dosimetric Indices

    OpenAIRE

    Siauw, Timmy; Cunha, Adam; Atamturk, Alper; Hsu, I-Chow; Pouliot, Jean; Goldberg, Ken

    2010-01-01

    Purpose: Many planning methods for high dose rate (HDR) brachytherapy treatment planning require an iterative approach. A set of computational parameters are hypothesized that will give a dose plan that meets dosimetric criteria. A dose plan is computed using these parameters, and if any dosimetric criteria are not met, the process is iterated until a suitable dose plan is found. In this way, the dose distribution is controlled by abstract parameters. The purpose of this study is to improve H...

  19. A Simulation Tool for Hurricane Evacuation Planning

    Directory of Open Access Journals (Sweden)

    Daniel J. Fonseca

    2009-01-01

    Full Text Available Atlantic hurricanes and severe tropical storms are a serious threat for the communities in the Gulf of Mexico region. Such storms are violent and destructive. In response to these dangers, coastal evacuation may be ordered. This paper describes the development of a simulation model to analyze the movement of vehicles through I-65, a major US Interstate highway that runs north off the coastal City of Mobile, Alabama, towards the State of Tennessee, during a massive evacuation originated by a disastrous event such as a hurricane. The constructed simulation platform consists of a primary and two secondary models. The primary model is based on the entry of vehicles from the 20 on-ramps to I-65. The two secondary models assist the primary model with related traffic events such as car breakdowns and accidents, traffic control measures, interarrival signaling, and unforeseen emergency incidents, among others. Statistical testing was performed on the data generated by the simulation model to identify variation in relevant traffic variables affecting the timely flow of vehicles travelling north. The statistical analysis focused on the closing of alternative on-ramps throughout the Interstate.
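
The event-driven core of such a traffic simulation can be sketched with a priority queue of timestamped events. This toy single-merge model borrows only the 20-on-ramp count from the abstract; the Poisson arrival rates and fixed merge headway are assumptions, and the secondary models (breakdowns, accidents, signaling) are omitted:

```python
import heapq
import random

def simulate_merge(n_ramps=20, rate_per_ramp=0.05, headway=0.9, horizon=3600.0):
    """Discrete-event simulation: vehicles arrive at each on-ramp as a Poisson
    process and merge FIFO into a single lane that admits one vehicle per
    fixed headway (s). Returns (vehicles served, mean wait in seconds)."""
    random.seed(42)
    events = []                                   # (time, ramp) arrival events
    for ramp in range(n_ramps):
        t = random.expovariate(rate_per_ramp)
        while t < horizon:
            heapq.heappush(events, (t, ramp))
            t += random.expovariate(rate_per_ramp)
    lane_free_at, served, total_wait = 0.0, 0, 0.0
    while events:
        t, ramp = heapq.heappop(events)           # next arrival in time order
        start = max(t, lane_free_at)              # queue if the lane is busy
        lane_free_at = start + headway
        served += 1
        total_wait += start - t
    return served, total_wait / served

served, mean_wait = simulate_merge()
```

Closing a ramp, as studied in the paper, would simply drop that ramp's arrival stream and shift its demand elsewhere.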

  20. Acute small bowel toxicity and preoperative chemoradiotherapy for rectal cancer: Investigating dose-volume relationships and role for inverse planning

    International Nuclear Information System (INIS)

    Tho, Lye Mun; Glegg, Martin; Paterson, Jennifer; Yap, Christina; MacLeod, Alice; McCabe, Marie; McDonald, Alexander C.

    2006-01-01

Purpose: The relationship between volume of irradiated small bowel (VSB) and acute toxicity in rectal cancer radiotherapy is poorly quantified, particularly in patients receiving concurrent preoperative chemoradiotherapy. Using treatment planning data, we studied a series of such patients. Methods and Materials: Details of 41 patients with locally advanced rectal cancer were reviewed. All received 45 Gy in 25 fractions over 5 weeks, using 3-4 field three-dimensional conformal radiotherapy with daily 5-fluorouracil and folinic acid during Weeks 1 and 5. Toxicity was assessed prospectively in a weekly clinic. Using computed tomography planning software, the VSB was determined at 5 Gy dose intervals (V5, V10, etc.). Eight patients with maximal VSB had dosimetry and radiobiological modeling outcomes compared between inverse and conformal three-dimensional planning. Results: VSB correlated strongly with diarrheal severity at every dose level, most strongly at V5 and V15. Conclusions: A strong dose-volume relationship exists between VSB and acute diarrhea at all dose levels during preoperative chemoradiotherapy. Our constructed model may be useful in predicting toxicity, and this has been derived without the confounding influence of surgical excision on bowel function. Inverse planning can reduce calculated dose to small bowel and late NTCP, and its clinical role warrants further investigation.
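Dose-volume metrics such as V5 or V15 are read off a dose grid by counting the voxels at or above each threshold. A minimal sketch of that calculation (the dose array below is randomly generated for illustration, not patient data):

```python
import numpy as np

def v_dose(voxel_doses_gy, voxel_volume_cc, threshold_gy):
    """Absolute volume (cc) of the structure receiving at least threshold_gy."""
    return np.count_nonzero(voxel_doses_gy >= threshold_gy) * voxel_volume_cc

# Toy small-bowel dose grid: 1000 voxels of 0.1 cc each.
rng = np.random.default_rng(0)
doses = rng.uniform(0.0, 45.0, size=1000)                  # Gy, uniform for illustration
dvh = {d: v_dose(doses, 0.1, d) for d in range(5, 50, 5)}  # V5, V10, ..., V45
```

By construction V_d is non-increasing in the dose threshold d, which is a handy sanity check on any DVH implementation.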

  1. Four dimensional data assimilation (FDDA) impacts on WRF performance in simulating inversion layer structure and distributions of CMAQ-simulated winter ozone concentrations in Uintah Basin

    Science.gov (United States)

    Tran, Trang; Tran, Huy; Mansfield, Marc; Lyman, Seth; Crosman, Erik

    2018-03-01

    Four-dimensional data assimilation (FDDA) was applied in WRF-CMAQ model sensitivity tests to study the impact of observational and analysis nudging on model performance in simulating inversion layers and O3 concentration distributions within the Uintah Basin, Utah, U.S.A. in winter 2013. Observational nudging substantially improved WRF model performance in simulating surface wind fields, correcting a 10 °C warm surface temperature bias, correcting overestimation of the planetary boundary layer height (PBLH) and correcting underestimation of inversion strengths produced by regular WRF model physics without nudging. However, the combined effects of poor performance of WRF meteorological model physical parameterization schemes in simulating low clouds, and warm and moist biases in the temperature and moisture initialization and subsequent simulation fields, likely amplified the overestimation of warm clouds during inversion days when observational nudging was applied, impacting the resulting O3 photochemical formation in the chemistry model. To reduce the impact of a moist bias in the simulations on warm cloud formation, nudging with the analysis water mixing ratio above the planetary boundary layer (PBL) was applied. However, due to poor analysis vertical temperature profiles, applying analysis nudging also increased the errors in the modeled inversion layer vertical structure compared to observational nudging. Combining both observational and analysis nudging methods resulted in unrealistically extreme stratified stability that trapped pollutants at the lowest elevations at the center of the Uintah Basin and yielded the worst WRF performance in simulating inversion layer structure among the four sensitivity tests. The results of this study illustrate the importance of carefully considering the representativeness and quality of the observational and model analysis data sets when applying nudging techniques within stable PBLs, and the need to evaluate model results
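Nudging of the kind used in FDDA adds a Newtonian relaxation term that pulls the model state toward observations, dx/dt = F(x) + G·(x_obs − x). A toy one-variable sketch (the drift term and coefficients are made up for illustration, not WRF settings):

```python
def nudged_step(x, x_obs, f, g, dt):
    """One Euler step of dx/dt = f(x) + g * (x_obs - x) (Newtonian relaxation)."""
    return x + dt * (f(x) + g * (x_obs - x))

f = lambda x: 0.1      # placeholder model dynamics: a constant warm drift
x_obs = 2.0            # "observed" value, held fixed for illustration
x = 0.0
for _ in range(1000):
    x = nudged_step(x, x_obs, f, g=0.5, dt=0.1)
# Steady state solves f(x) + g*(x_obs - x) = 0, i.e. x = x_obs + 0.1/0.5 = 2.2
```

Note the residual offset of 0.2: relaxation bounds a persistent model bias but does not eliminate it, which is consistent with nudged runs still carrying bias into the chemistry model.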

  2. How inverse solver technologies can support die face development and process planning in the automotive industry

    Science.gov (United States)

    Huhn, Stefan; Peeling, Derek; Burkart, Maximilian

    2017-10-01

With die face design tools and incremental solver technologies available to provide detailed forming feasibility results in a timely fashion, the value of inverse solver technologies and the resulting process improvements during the product development of stamped parts is often underestimated. This paper presents some applications of inverse technologies that are currently used in the automotive industry to streamline the product development process and greatly increase the quality of the developed process and the resulting product. The first focus is on the so-called target strain technology. Application examples show how inverse forming analysis can be applied to support the process engineer during the development of a die face geometry for Class `A' panels. The drawing process is greatly affected by the die face design, and the process designer has to ensure that the resulting drawn panel will meet specific requirements regarding surface quality and a minimum strain distribution to ensure dent resistance. The target strain technology provides almost immediate feedback to the process engineer during the die face design process on whether a specific change of the die face design will help to achieve these requirements or will be counterproductive. The paper further shows how an optimization of the material flow can be achieved through the use of a newly developed technology called Sculptured Die Face (SDF). The die face generation in SDF is better suited for use in optimization loops than any other conventional die face design technology based on cross-section design. A second focus of this paper is on the use of inverse solver technologies for secondary forming operations. The paper shows how inverse technology can be applied to accurately and quickly develop trim lines on simple as well as complex support geometries.

  3. Monte Carlo simulation of inverse geometry x-ray fluoroscopy using a modified MC-GPU framework

    Science.gov (United States)

    Dunkerley, David A. P.; Tomkowiak, Michael T.; Slagowski, Jordan M.; McCabe, Bradley P.; Funk, Tobias; Speidel, Michael A.

    2015-01-01

    Scanning-Beam Digital X-ray (SBDX) is a technology for low-dose fluoroscopy that employs inverse geometry x-ray beam scanning. To assist with rapid modeling of inverse geometry x-ray systems, we have developed a Monte Carlo (MC) simulation tool based on the MC-GPU framework. MC-GPU version 1.3 was modified to implement a 2D array of focal spot positions on a plane, with individually adjustable x-ray outputs, each producing a narrow x-ray beam directed toward a stationary photon-counting detector array. Geometric accuracy and blurring behavior in tomosynthesis reconstructions were evaluated from simulated images of a 3D arrangement of spheres. The artifact spread function from simulation agreed with experiment to within 1.6% (rRMSD). Detected x-ray scatter fraction was simulated for two SBDX detector geometries and compared to experiments. For the current SBDX prototype (10.6 cm wide by 5.3 cm tall detector), x-ray scatter fraction measured 2.8–6.4% (18.6–31.5 cm acrylic, 100 kV), versus 2.1–4.5% in MC simulation. Experimental trends in scatter versus detector size and phantom thickness were observed in simulation. For dose evaluation, an anthropomorphic phantom was imaged using regular and regional adaptive exposure (RAE) scanning. The reduction in kerma-area-product resulting from RAE scanning was 45% in radiochromic film measurements, versus 46% in simulation. The integral kerma calculated from TLD measurement points within the phantom was 57% lower when using RAE, versus 61% lower in simulation. This MC tool may be used to estimate tomographic blur, detected scatter, and dose distributions when developing inverse geometry x-ray systems. PMID:26113765

  4. A Simulation-as-a-Service Framework Facilitating WebGIS-Based Installation Planning

    Science.gov (United States)

    Zheng, Z.; Chang, Z. Y.; Fei, Y. F.

    2017-09-01

Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper spatial deployment and functional configuration of facilities, making them a cohesive and supportive system that meets users' operational needs. Based on a requirements analysis, we propose a framework that combines GIS and agent-based simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, the agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service can utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits: it is easy to use, available on demand, supports shared understanding, and boosts performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  5. Simulation based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.

  6. Study on Real-Time Simulation Analysis and Inverse Analysis System for Temperature and Stress of Concrete Dam

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2015-01-01

Full Text Available In concrete dam construction, it is essential to strengthen real-time monitoring and scientific management of concrete temperature control. This paper constructs a simulation analysis and inverse analysis system for temperature and stress, based on the data collected in real time during concrete construction. The system automatically produces the data files for temperature and stress calculation and then performs remote real-time simulation of temperature stress using high-performance computing techniques, so that inverse analysis can be carried out on the basis of the monitoring data in the database. It performs automatic feedback calculation according to the error requirement and generates the corresponding curves and charts after automatically processing and analyzing the results. The system automates the complex data analysis and preparation work of the simulation process and the complex data adjustment of the inverse analysis process, which facilitates real-time tracking simulation and feedback analysis of concrete temperature stress during construction, enables problems to be discovered and addressed in a timely manner, supports timely adjustment of the construction scheme, and helps ensure project quality.

  7. BRUS2. An energy system simulator for long term planning

    DEFF Research Database (Denmark)

    Skytte, K.; Skjerk Christensen, P.

    1999-01-01

the energy system by four demand sectors: residential, service, production, and transport, and by two supply systems: electricity, and gas and oil. The simulations are carried out in three years: a base year for calibration, a midterm year preferably in accordance with the planning horizon of national...... and regional plans, and an end year, used to study long-term trends in the development of society and technology. The results of simulations include fuel demand, emissions of pollutants, and economic consequences. BRUS2 has been implemented in several countries, recently in Mexico. The methodology is described...

  8. Evaluation of hybrid inverse planning and optimization (HIPO) algorithm for optimization in real-time, high-dose-rate (HDR) brachytherapy for prostate.

    Science.gov (United States)

    Pokharel, Shyam; Rana, Suresh; Blikenstaff, Joseph; Sadeghi, Amir; Prestidge, Bradley

    2013-07-08

The purpose of this study is to investigate the effectiveness of the HIPO planning and optimization algorithm for real-time prostate HDR brachytherapy. This study consists of 20 patients who underwent ultrasound-based real-time HDR brachytherapy of the prostate using the treatment planning system called Oncentra Prostate (SWIFT version 3.0). The treatment plans for all patients were optimized using inverse dose-volume histogram-based optimization followed by graphical optimization (GRO) in real time. GRO is manual manipulation of isodose lines slice by slice, so the quality of the plan depends heavily on planner expertise and experience. The data for all patients were retrieved later, and treatment plans were created and optimized using the HIPO algorithm with the same set of dose constraints, number of catheters, and set of contours as in the real-time optimization. The HIPO algorithm is a hybrid because it combines both stochastic and deterministic algorithms. The stochastic algorithm, called simulated annealing, searches the optimal catheter distributions for a given set of dose objectives. The deterministic algorithm, called dose-volume histogram-based optimization (DVHO), optimizes the three-dimensional dose distribution quickly by moving straight downhill once it is in the advantageous region of the search space given by the stochastic algorithm. The PTV receiving 100% of the prescription dose (V100) was 97.56% and 95.38% with GRO and HIPO, respectively. The mean dose (D(mean)) and minimum dose to 10% volume (D10) for the urethra, rectum, and bladder were all statistically lower with HIPO compared to GRO using the paired Student's t-test at the 5% significance level. HIPO can provide treatment plans with comparable target coverage to that of GRO with a reduction in dose to the critical structures.
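The stochastic half of HIPO, simulated annealing, accepts occasional uphill moves with probability exp(-ΔC/T) so the search can escape local optima before the deterministic DVHO stage moves straight downhill. A generic sketch (the one-dimensional cost and neighbor move are toy stand-ins, not the actual catheter-distribution objective):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.99, steps=5000):
    """Generic simulated annealing: accept worse moves with prob exp(-dC/T)."""
    rng = random.Random(42)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy < c or rng.random() < math.exp(-(cy - c) / t):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        t *= cooling          # geometric cooling schedule
    return best, best_c

# Toy stand-in for the search: a parabola with ripples (several local minima).
cost = lambda x: (x - 3.0) ** 2 + math.sin(5 * x)
neighbor = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_best, c_best = simulated_annealing(cost, neighbor, x0=0.0)
```

Because the method is stochastic, repeated runs (different seeds) return slightly different solutions, matching the run-to-run variation simulated-annealing planners report.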

  9. Convection methodology for fission track annealing: direct and inverse numerical simulations in the multi-exponential case

    International Nuclear Information System (INIS)

    Miellou, J.C.; Igli, H.; Grivet, M.; Rebetez, M.; Chambaudet, A.

    1994-01-01

    In minerals, the uranium fission tracks are sensitive to temperature and time. The consequence is that the etchable lengths are reduced. To simulate the phenomenon, at the last International Conference on Nuclear Tracks in solids at Beijing in 1992, we proposed a convection model for fission track annealing based on a reaction situation associated with only one activation energy. Moreover a simple inverse method based on the resolution of an ordinary differential equation was described, making it possible to retrace the thermal history in this mono-exponential situation. The aim of this paper is to consider a more involved class of models including multi-exponentials associated with several activation energies. We shall describe in this framework the modelling of the direct phenomenon and the resolution of the inverse problem. Results of numerical simulations and comparison with the mono-exponential case will be presented. 5 refs. (author)
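In the mono-exponential case, annealing of the reduced track length r can be written as an ODE with a single Arrhenius rate, dr/dt = -A·exp(-E/(k_B·T(t)))·r, integrated forward along an assumed thermal history; the inverse problem then recovers T(t) from measured lengths. A direct-model sketch with purely illustrative constants (A, E, and the cooling history are hypothetical, not fitted values):

```python
import math

K_B = 8.617e-5                           # Boltzmann constant, eV/K
A, E = 1e17, 1.5                         # frequency factor (1/Myr), activation energy (eV)
T = lambda t_myr: 400.0 - 2.0 * t_myr    # assumed linear cooling history, K

def reduced_length(t_end_myr, steps=10000):
    """Euler integration of dr/dt = -A * exp(-E / (k_B * T(t))) * r, r(0) = 1."""
    r, dt = 1.0, t_end_myr / steps
    for i in range(steps):
        r -= dt * A * math.exp(-E / (K_B * T(i * dt))) * r
    return r

r50 = reduced_length(50.0)   # fractional etchable length after 50 Myr of cooling
```

The multi-exponential models in the paper sum several such Arrhenius terms, one per activation energy, which is what makes the inversion harder than solving a single ODE.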

  10. Very fast simulated reannealing in radiation therapy treatment plan optimization

    International Nuclear Information System (INIS)

    Morrill, Steven M.; Lam, Kam Shing; Lane, Richard G.; Langer, Mark; Rosen, Isaac I.

    1995-01-01

    Purpose: Very Fast Simulated Reannealing is a relatively new (1989) and sophisticated algorithm for simulated annealing applications. It offers the advantages of annealing methods while requiring shorter execution times. The purpose of this investigation was to adapt Very Fast Simulated Reannealing to conformal treatment planning optimization. Methods and Materials: We used Very Fast Simulated Reannealing to optimize treatments for three clinical cases with two different cost functions. The first cost function was linear (minimum target dose) with nonlinear dose-volume normal tissue constraints. The second cost function (probability of uncomplicated local control) was a weighted product of normal tissue complication probabilities and the tumor control probability. Results: For the cost functions used in this study, the Very Fast Simulated Reannealing algorithm achieved results within 5-10% of the final solution (100,000 iterations) after 1000 iterations and within 3-5% of the final solution after 5000-10000 iterations. These solutions were superior to those produced by a conventional treatment plan based on an analysis of the resulting dose-volume histograms. However, this technique is a stochastic method and results vary in a statistical manner. Successive solutions may differ by up to 10%. Conclusion: Very Fast Simulated Reannealing, with modifications, is suitable for radiation therapy treatment planning optimization. It produced results within 3-10% of the optimal solution, produced using another optimization algorithm (Mixed Integer Programming), in clinically useful execution times
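The shorter execution times of Very Fast Simulated Reannealing come from its annealing schedule, T(k) = T0·exp(-c·k^(1/D)) for a D-dimensional search space, which cools far faster than the classical Boltzmann schedule T0/log k. A sketch comparing the two (constants illustrative):

```python
import math

def vfsr_temperature(t0, c, k, dims):
    """VFSR annealing schedule: T(k) = T0 * exp(-c * k**(1/D))."""
    return t0 * math.exp(-c * k ** (1.0 / dims))

def boltzmann_temperature(t0, k):
    """Classical annealing schedule: T(k) = T0 / log(k + e)."""
    return t0 / math.log(k + math.e)
```

For a 2-D problem with c = 1, the VFSR temperature is already orders of magnitude below the Boltzmann temperature by k = 100, which is why useful solutions appear within a few thousand iterations.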

  11. The role of simulated small-scale ocean variability in inverse computations for ocean acoustic tomography.

    Science.gov (United States)

    Dushaw, Brian D; Sagen, Hanne

    2017-12-01

    Ocean acoustic tomography depends on a suitable reference ocean environment with which to set the basic parameters of the inverse problem. Some inverse problems may require a reference ocean that includes the small-scale variations from internal waves, small mesoscale, or spice. Tomographic inversions that employ data of stable shadow zone arrivals, such as those that have been observed in the North Pacific and Canary Basin, are an example. Estimating temperature from the unique acoustic data that have been obtained in Fram Strait is another example. The addition of small-scale variability to augment a smooth reference ocean is essential to understanding the acoustic forward problem in these cases. Rather than a hindrance, the stochastic influences of the small scale can be exploited to obtain accurate inverse estimates. Inverse solutions are readily obtained, and they give computed arrival patterns that matched the observations. The approach is not ad hoc, but universal, and it has allowed inverse estimates for ocean temperature variations in Fram Strait to be readily computed on several acoustic paths for which tomographic data were obtained.

  12. Insights into the use of time-lapse GPR data as observations for inverse multiphase flow simulations of DNAPL migration

    Science.gov (United States)

    Johnson, R.H.; Poeter, E.P.

    2007-01-01

Perchloroethylene (PCE) saturations determined from GPR surveys were used as observations for inversion of multiphase flow simulations of a PCE injection experiment (Borden 9 m cell), allowing for the estimation of optimal bulk intrinsic permeability values. The resulting fit statistics and analysis of residuals (observed minus simulated PCE saturations) were used to improve the conceptual model. These improvements included adjustment of the elevation of a permeability contrast, use of the van Genuchten versus Brooks-Corey capillary pressure-saturation curve, and a weighting scheme to account for greater measurement error with larger saturation values. A limitation in determining PCE saturations through one-dimensional GPR modeling is non-uniqueness when multiple GPR parameters are unknown (i.e., permittivity, depth, and gain function). Site knowledge, fixing the gain function, and multiphase flow simulations assisted in evaluating non-unique conceptual models of PCE saturation, where depth and layering were reinterpreted to provide alternate conceptual models. Remaining bias in the residuals is attributed to the violation of assumptions in the one-dimensional GPR interpretation (which assumes flat, infinite, horizontal layering) resulting from multidimensional influences that were not included in the conceptual model. While the limitations and errors in using GPR data as observations for inverse multiphase flow simulations are frustrating and difficult to quantify, simulation results indicate that the error and bias in the PCE saturation values are small enough to still provide reasonable optimal permeability values. The effort to improve model fit and reduce residual bias decreases simulation error even for an inversion based on biased observations and provides insight into alternate GPR data interpretations. Thus, this effort is warranted and provides information on bias in the observation data when this bias is otherwise difficult to assess.

  13. Lean Supply Chain Planning: A Performance Evaluation through Simulation

    Directory of Open Access Journals (Sweden)

    Rossini Matteo

    2016-01-01

Full Text Available Nowadays, companies increasingly seek to improve their efficiency to excel in the market. At the same time, competition has moved from the firm level to the level of the whole supply chain. Supply chains are very complex systems, and lack of coordination among their members leads to inefficiency. The task of supply chain planning is to improve coordination among supply chain members, and which planning solution best improves efficiency is an open issue. Meanwhile, the Lean approach is becoming more and more popular among managers: it is recognized as an efficiency engine for production systems, but the effect of Lean implementation beyond single-firm boundaries is not clear. This paper aims to provide a theoretical and practical starting point for Lean implementation in supply chain planning. To this end, a discrete-event simulation (DES) model of a three-echelon, multi-product supply chain was set up. Lean management is a very broad topic, and this paper focuses on two principles: "pull" and "create the flow". A kanban system and setup-time and batch-size reductions are implemented in the Lean-configured supply chain to apply "pull" and "create the flow", respectively. The Lean principle implementations were analyzed and compared with other supply chain planning policies: EOQ and information sharing (visibility). Supported by the simulation study, this paper shows that Lean supply chain planning is a competitive planning policy for increasing efficiency.
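The EOQ baseline that the Lean configuration is compared against is the classical economic order quantity, Q* = sqrt(2DS/H) for annual demand D, fixed cost per order S, and per-unit annual holding cost H. A one-function sketch with made-up numbers:

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Economic order quantity: Q* = sqrt(2 * D * S / H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

# Example: 12,000 units/year, $100 per order, $2.40 per unit per year to hold.
q = eoq(12000, 100, 2.4)    # -> 1000.0 units per order
```

Kanban, by contrast, caps work-in-progress directly rather than batching to amortize a fixed order cost, which is why setup-time reduction is a precondition for the "pull" configuration.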

  14. Empowering stakeholders through simulation in water resources planning

    International Nuclear Information System (INIS)

    Palmer, R.N.; Keyes, A.M.; Fisher, S.

    1993-01-01

During the past two years, researchers at the University of Washington (UW) have had the unique opportunity to facilitate and observe the development of drought planning activities associated with the National Drought Study (NDS) and its Drought Preparedness Studies (DPS) sites as sponsored by the Institute of Water Resources of the US Army Corps of Engineers. Each of the DPS sites is unique, with different study objectives and institutional constraints. However, one uniform requirement of the study is to develop tactical and strategic drought plans that can be successfully implemented within the study region. At the onset of the study, it was recognized that successful implementation is directly related to the active involvement of affected parties and agencies (denoted as stakeholders) and the degree to which they support the plan's conclusions. Their involvement is also necessary because the problems addressed by the DPSs require the experience and knowledge of a variety of water resource interests in order to arrive at effective alternatives. Their support of the plan conclusions enables regional implementation. Several techniques were used to encourage stakeholder participation in the planning process. Individuals representing the stakeholders had a wide range of professional backgrounds. This paper concentrates on one specific approach found useful in encouraging comprehensive and meaningful participation by a wide range of stakeholders: the development of object-oriented simulation models for the water resource systems under study. Simulation models were used to develop tactical and strategic drought plans and to ensure acceptance of the plans by building consensus among the stakeholders. The remainder of this paper describes how simulation models became a part of the National Drought Study, the procedures used to develop the DPS models, and how the models empowered stakeholders.

  15. WE-AB-209-02: A New Inverse Planning Framework with Principle-Based Modeling of Inter-Structural Dosimetric Tradeoffs

    International Nuclear Information System (INIS)

    Liu, H; Dong, P; Xing, L

    2016-01-01

Purpose: Traditional radiotherapy inverse planning relies on weighting factors to phenomenologically balance the conflicting criteria for different structures. The resulting manual trial-and-error determination of the weights has long been recognized as the most time-consuming part of treatment planning. The purpose of this work is to develop an inverse planning framework that parameterizes the inter-structural dosimetric tradeoff with physically more meaningful quantities to simplify the search for a clinically sensible plan. Methods: A permissible dosimetric uncertainty is introduced for each of the structures to balance their conflicting dosimetric requirements. The inverse planning is then formulated as a convex feasibility problem, which aims to generate plans with acceptable dosimetric uncertainties. A sequential procedure (SP) is derived to decompose the model into three submodels that constrain the uncertainty in the planning target volume (PTV), the critical structures, and all other structures to spare, sequentially. The proposed technique is applied to plan a liver case and a head-and-neck case and compared with a conventional approach. Results: Our results show that the strategy is able to generate clinically sensible plans with little trial-and-error. In the case of liver IMRT, the fractional volumes of liver and heart receiving more than 20 Gy are found to be 22% and 10%, respectively, which are 15.1% and 33.3% lower than those of the counterpart conventional plan while maintaining the same PTV coverage. The planning of the head-and-neck IMRT shows the same level of success, with the DVHs for all organs at risk and the PTV very competitive with those of a counterpart plan. Conclusion: A new inverse planning framework has been established. With physically more meaningful modeling of the inter-structural tradeoff, the technique enables us to substantially reduce the need for trial-and-error adjustment of the model parameters and opens new opportunities of incorporating prior
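A convex feasibility problem of this kind is classically attacked by alternating projections onto the constraint sets (POCS). The sketch below projects a variable vector onto per-constraint "dose bands" |a_i·x - b_i| <= eps and onto the nonnegativity orthant in turn; the matrix, tolerance, and two-set structure are random stand-ins, not the paper's three submodels:

```python
import numpy as np

def project_nonneg(x):
    """Projection onto {x : x >= 0} (e.g. nonnegative beamlet intensities)."""
    return np.maximum(x, 0.0)

def project_bands(x, A, b, eps):
    """Sequential projection onto each band {x : |a_i @ x - b_i| <= eps}."""
    for a, bi in zip(A, b):
        r = a @ x - bi
        if abs(r) > eps:
            x = x - (r - np.sign(r) * eps) * a / (a @ a)
    return x

rng = np.random.default_rng(3)
A = rng.random((5, 20))
b = A @ rng.random(20)     # a feasible nonnegative point exists by construction
x = np.zeros(20)
for _ in range(500):
    x = project_nonneg(project_bands(x, A, b, eps=0.05))
max_violation = float(np.max(np.abs(A @ x - b)))
```

With a nonempty intersection, cyclic projections converge to a feasible point; widening or tightening eps plays the role of the permissible dosimetric uncertainty.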

  16. Treatment planning for a small animal using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Chow, James C. L.; Leung, Michael K. K.

    2007-01-01

    The development of a small animal model for radiotherapy research requires a complete setup of customized imaging equipment, irradiators, and planning software that matches the sizes of the subjects. The purpose of this study is to develop and demonstrate the use of a flexible in-house research environment for treatment planning on small animals. The software package, called DOSCTP, provides a user-friendly platform for DICOM computed tomography-based Monte Carlo dose calculation using the EGSnrcMP-based DOSXYZnrc code. Validation of the treatment planning was performed by comparing the dose distributions for simple photon beam geometries calculated through the Pinnacle3 treatment planning system and measurements. A treatment plan for a mouse based on a CT image set by a 360-deg photon arc is demonstrated. It is shown that it is possible to create 3D conformal treatment plans for small animals with consideration of inhomogeneities using small photon beam field sizes in the diameter range of 0.5-5 cm, with conformal dose covering the target volume while sparing the surrounding critical tissue. It is also found that Monte Carlo simulation is suitable to carry out treatment planning dose calculation for small animal anatomy with voxel size about one order of magnitude smaller than that of the human

  17. Creating virtual humans for simulation-based training and planning

    Energy Technology Data Exchange (ETDEWEB)

    Stansfield, S.; Sobel, A.

    1998-05-12

Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to computer-generated forces (CGF) to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.

  18. Multi-GPU configuration of 4D intensity modulated radiation therapy inverse planning using global optimization

    Science.gov (United States)

    Hagan, Aaron; Sawant, Amit; Folkerts, Michael; Modiri, Arezoo

    2018-01-01

    We report on the design, implementation and characterization of a multi-graphic processing unit (GPU) computational platform for higher-order optimization in radiotherapy treatment planning. In collaboration with a commercial vendor (Varian Medical Systems, Palo Alto, CA), a research prototype GPU-enabled Eclipse (V13.6) workstation was configured. The hardware consisted of dual 8-core Xeon processors, 256 GB RAM and four NVIDIA Tesla K80 general purpose GPUs. We demonstrate the utility of this platform for large radiotherapy optimization problems through the development and characterization of a parallelized particle swarm optimization (PSO) four dimensional (4D) intensity modulated radiation therapy (IMRT) technique. The PSO engine was coupled to the Eclipse treatment planning system via a vendor-provided scripting interface. Specific challenges addressed in this implementation were (i) data management and (ii) non-uniform memory access (NUMA). For the former, we alternated between parameters over which the computation process was parallelized. For the latter, we reduced the amount of data required to be transferred over the NUMA bridge. The datasets examined in this study were approximately 300 GB in size, including 4D computed tomography images, anatomical structure contours and dose deposition matrices. For evaluation, we created a 4D-IMRT treatment plan for one lung cancer patient and analyzed computation speed while varying several parameters (number of respiratory phases, GPUs, PSO particles, and data matrix sizes). The optimized 4D-IMRT plan enhanced sparing of organs at risk by an average reduction of 26% in maximum dose, compared to the clinical optimized IMRT plan, where the internal target volume was used. We validated our computation time analyses in two additional cases. The computation speed in our implementation did not monotonically increase with the number of GPUs. The optimal number of GPUs (five, in our study) is directly related to the
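Particle swarm optimization itself is compact: each particle's velocity blends inertia with attraction to its personal best and the swarm's global best. A minimal serial sketch on a toy objective (the sphere function stands in for the 4D-IMRT cost; none of this reflects the vendor scripting interface or the multi-GPU data layout):

```python
import numpy as np

def pso(cost, bounds, particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization over a box-bounded search space."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(particles, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_c = np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_c)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # velocity update
        x = np.clip(x + v, lo, hi)                              # stay in bounds
        c = np.array([cost(p) for p in x])
        improved = c < pbest_c
        pbest[improved], pbest_c[improved] = x[improved], c[improved]
        g = pbest[np.argmin(pbest_c)].copy()
    return g, float(pbest_c.min())

# Toy stand-in for the treatment-plan objective: sphere function, minimum at 0.
best_x, best_c = pso(lambda z: float(np.sum(z * z)),
                     (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
```

The particle evaluations inside each iteration are independent, which is exactly the structure that maps onto multiple GPUs; the NUMA cost of gathering the results each iteration is what limits scaling.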

  19. A High-Speed Train Operation Plan Inspection Simulation Model

    Directory of Open Access Journals (Sweden)

    Yang Rui

    2018-01-01

    Full Text Available We developed a train operation simulation tool to inspect a train operation plan. Applying an improved Petri net, the train was regarded as a token, and the lines and stations were regarded as places, in accordance with high-speed train operation characteristics and network function. Location changes and running-information transfer of the high-speed train were realized by customizing a variety of transitions. The model was built on the concept of component combination, taking into account random disturbances in the process of train running. The simulation framework can be generated quickly and the system operation completed according to different test requirements and the required network data. We tested the simulation tool on the real-world Wuhan to Guangzhou high-speed line. The results showed that the simulation outputs basically coincide with objective reality, and that the tool can not only test the feasibility of a high-speed train operation plan but also serve as a supporting model for a simulation platform with more capabilities.
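    The token/place/transition idea above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the class, place names, and train ID are assumptions.

    ```python
    class PetriNet:
        """Minimal place/transition net; trains are tokens (illustrative sketch)."""

        def __init__(self):
            self.marking = {}          # place -> set of train tokens
            self.transitions = []      # (name, source place, destination place)

        def add_place(self, place):
            self.marking[place] = set()

        def add_transition(self, name, src, dst):
            self.transitions.append((name, src, dst))

        def step(self):
            """Fire each transition enabled at the start of the step, once."""
            enabled = [t for t in self.transitions if self.marking[t[1]]]
            fired = []
            for name, src, dst in enabled:
                if self.marking[src]:             # token may already have moved
                    train = self.marking[src].pop()
                    self.marking[dst].add(train)
                    fired.append((name, train))
            return fired

    net = PetriNet()
    for p in ("Wuhan", "section_1", "Guangzhou"):
        net.add_place(p)
    net.add_transition("depart", "Wuhan", "section_1")
    net.add_transition("arrive", "section_1", "Guangzhou")

    net.marking["Wuhan"].add("G1001")   # one train token at the origin station
    net.step()                          # train moves into the line section
    net.step()                          # train reaches the terminal
    print(net.marking["Guangzhou"])     # {'G1001'}
    ```

    A real inspection model would attach timetable guards and stochastic delays to each transition; the structure, however, stays the same.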

  20. Real-time inverse high-dose-rate brachytherapy planning with catheter optimization by compressed sensing-inspired optimization strategies.

    Science.gov (United States)

    Guthier, C V; Aschenbrenner, K P; Müller, R; Polster, L; Cormack, R A; Hesser, J W

    2016-08-21

    This paper demonstrates that optimization strategies derived from the field of compressed sensing (CS) improve computational performance in inverse treatment planning (ITP) for high-dose-rate (HDR) brachytherapy. Following an approach applied to low-dose-rate brachytherapy, we developed a reformulation of the ITP problem with the same mathematical structure as standard CS problems. Two greedy methods, derived from hard thresholding and subspace pursuit, are presented and their performance is compared to state-of-the-art ITP solvers. Applied to clinical prostate brachytherapy plans, the new methods achieve a speed-up by a factor of 56-350 compared to state-of-the-art methods. Based on a Wilcoxon signed rank test, the novel method statistically significantly decreases the final objective function value. Optimization times were below one second, and thus planning can be considered real-time capable. The novel CS-inspired strategy enables real-time ITP for HDR brachytherapy including catheter optimization. The generated plans are either clinically equivalent or show a better performance with respect to dosimetric measures.
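    The hard-thresholding family the abstract refers to can be illustrated with textbook iterative hard thresholding (IHT) on a toy sparse-recovery problem. This is a generic sketch of the method class, not the authors' brachytherapy code; the problem sizes and step-size rule are assumptions.

    ```python
    # IHT for: minimize ||y - A x||^2 subject to ||x||_0 <= k.
    import random

    def mat_vec(A, x):
        return [sum(a * b for a, b in zip(row, x)) for row in A]

    def iht(A, y, k, iters=500):
        m, n = len(A), len(A[0])
        # step kept below 1/||A||^2 (rough bound for Gaussian A)
        step = 1.0 / ((m ** 0.5 + n ** 0.5) ** 2)
        x = [0.0] * n
        for _ in range(iters):
            r = [yi - vi for yi, vi in zip(y, mat_vec(A, x))]            # residual
            g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # A^T r
            x = [xj + step * gj for xj, gj in zip(x, g)]                 # gradient step
            keep = set(sorted(range(n), key=lambda j: -abs(x[j]))[:k])
            x = [xj if j in keep else 0.0 for j, xj in enumerate(x)]     # hard threshold
        return x

    random.seed(0)
    m, n, k = 20, 8, 2
    A = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]
    x_true = [0.0] * n
    x_true[1], x_true[5] = 3.0, -2.0       # a 2-sparse "plan"
    y = mat_vec(A, x_true)                 # noiseless measurements
    x_hat = iht(A, y, k)
    print([round(v, 2) for v in x_hat])
    ```

    In the HDR setting the sparsity constraint plays the role of limiting the number of active dwell positions or catheters, which is what makes the CS reformulation natural.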

  1. Treatment planning in radiosurgery: parallel Monte Carlo simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Scielzo, G. [Galliera Hospitals, Genova (Italy). Dept. of Hospital Physics; Grillo Ruggieri, F. [Galliera Hospitals, Genova (Italy) Dept. for Radiation Therapy; Modesti, M.; Felici, R. [Electronic Data System, Rome (Italy); Surridge, M. [University of Southampton (United Kingdom). Parallel Application Centre

    1995-12-01

    The main objective of this research was to evaluate the possibility of direct Monte Carlo simulation for accurate dosimetry with short computation time. We made use of a graphics workstation, a linear accelerator, and water, PMMA and anthropomorphic phantoms for validation purposes; ionometric, film and thermoluminescent techniques for dosimetry; and a treatment planning system for comparison. Benchmarking results suggest that short computing times can be obtained with the parallel version of EGS4 that was developed. Parallelism was obtained by assigning simulated incident photons to separate processors, and the development of a parallel random number generator was necessary. Validation consisted of phantom irradiation and comparison of predicted and measured values, with good agreement in PDD and dose profiles. Experiments on anthropomorphic phantoms (with inhomogeneities) were carried out, and these values are being compared with results obtained with the conventional treatment planning system.

  2. Monte-Carlo simulation of a high-resolution inverse geometry spectrometer on the SNS. Long Wavelength Target Station

    International Nuclear Information System (INIS)

    Bordallo, H.N.; Herwig, K.W.

    2001-01-01

    Using the Monte-Carlo simulation program McStas, we present the design principles of the proposed high-resolution inverse geometry spectrometer on the SNS-Long Wavelength Target Station (LWTS). The LWTS will provide the high flux of long wavelength neutrons at the requisite pulse rate required by the spectrometer design. The resolution of this spectrometer lies between that routinely achieved by spin echo techniques and the design goal of the high power target station backscattering spectrometer. Covering this niche in energy resolution will allow systematic studies over the large dynamic range required by many disciplines, such as protein dynamics. (author)

  3. Numerical Simulations of an Inversion Fog Event in the Salt Lake Valley during the MATERHORN-Fog Field Campaign

    Science.gov (United States)

    Chachere, Catherine N.; Pu, Zhaoxia

    2018-01-01

    An advanced research version of the Weather Research and Forecasting (WRF) Model is employed to simulate a wintertime inversion fog event in the Salt Lake Valley during the Mountain Terrain Atmospheric Modeling and Observations Program (MATERHORN) field campaign in January 2015. Simulation results are compared to observations obtained from the field program. The sensitivity of numerical simulations to available cloud microphysical (CM), planetary boundary layer (PBL), radiation, and land surface models (LSMs) is evaluated. The influence of differing visibility algorithms and initialization times on simulation results is also examined. Results indicate that the numerical simulations of the fog event are sensitive to the choice of CM, PBL, radiation, and LSM schemes, as well as the visibility algorithm and initialization time. Although the majority of experiments accurately captured the synoptic setup environment, errors were found in most experiments within the boundary layer, specifically a 3° warm bias in simulated surface temperatures compared to observations. Accurate representation of surface and boundary layer variables is vital for correctly predicting fog in the numerical model.

  4. Importance of simulation tools for the planning of optical network

    Science.gov (United States)

    Martins, Indayara B.; Martins, Yara; Rudge, Felipe; Moschim, Edson

    2015-10-01

    The main purpose of this work is to show the importance of using simulation tools to design optical networks. The simulation method supports the investigation of several system and network parameters, such as bit error rate and blocking probability, as well as physical layer issues such as attenuation, dispersion, and nonlinearities, as these are all important to evaluate and validate the operability of optical networks. The work was divided into two parts: firstly, physical layer preplanning was proposed for the distribution of amplifiers and compensation of the attenuation and dispersion effects in span transmission; in this part, we also analyzed the quality of the transmitted signal. In the second part, an analysis of the transport layer was completed, proposing wavelength distribution planning according to the total utilization of each link. The main network parameters used to evaluate the transport and physical layer design were delay (latency), blocking probability, and bit error rate (BER). This work was carried out with commercially available simulation tools.

  5. TU-G-BRB-02: A New Mathematical Framework for IMRT Inverse Planning with Voxel-Dependent Optimization Parameters.

    Science.gov (United States)

    Zarepisheh, M; Uribe-Sanchez, A; Li, N; Jia, X; Jiang, S

    2012-06-01

    To establish a new mathematical framework for IMRT treatment optimization with voxel-dependent optimization parameters. In IMRT inverse treatment planning, a physician seeks a plan that delivers a prescribed dose to the target while sparing the nearby healthy tissues. The conflict between these objectives makes multi-criteria optimization an appropriate tool. Traditionally, a clinically acceptable plan is generated by fine-tuning organ-based parameters. We establish a new mathematical framework by using voxel-based parameters for optimization. We introduce three different Pareto surfaces, prove the relationship between those surfaces, and compare voxel-based and organ-based methods. We prove some new theorems providing conditions under which Pareto optimality is guaranteed. The new mathematical framework has shown that: 1) Using an increasing voxel penalty function with an increasing derivative, in particular the popular power function, it is possible to explore the entire Pareto surface by changing voxel-based weighting factors, which increases the chances of obtaining a more desirable plan. 2) Pareto optimality is always guaranteed by adjusting voxel-based weighting factors. 3) If the plan is initially produced by adjusting organ-based weighting factors, it is impossible to improve all the DVH curves at the same time by adjusting voxel-based weighting factors. 4) A larger Pareto surface is explored by changing voxel-based weighting factors than by changing organ-based weighting factors, possibly leading to a plan with better trade-offs. 5) Pareto optimality is not necessarily guaranteed while adjusting the voxel reference doses; hence, adjusting voxel-based weighting factors is preferred in terms of preserving Pareto optimality. We have developed a mathematical framework for IMRT optimization using voxel-based parameters. We can improve the plan quality by adjusting voxel-based weighting factors after organ-based parameter
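    The effect of voxel-based weighting factors can be seen in a two-voxel, two-beamlet toy problem with the quadratic objective F(x) = Σ_v w_v (dose_v(x) − d_v)². Everything below (dose matrix, prescriptions, weights, solver settings) is an illustrative assumption, not the paper's formulation.

    ```python
    def optimize(D, d, w, iters=2000, step=0.05):
        """Projected gradient descent on F(x) = sum_v w_v (dose_v(x) - d_v)^2."""
        m, n = len(D), len(D[0])
        x = [0.0] * n                                  # beamlet intensities >= 0
        for _ in range(iters):
            dose = [sum(D[v][j] * x[j] for j in range(n)) for v in range(m)]
            g = [2 * sum(w[v] * (dose[v] - d[v]) * D[v][j] for v in range(m))
                 for j in range(n)]
            x = [max(0.0, xj - step * gj) for xj, gj in zip(x, g)]
        return x

    # Two voxels (one target, one organ-at-risk) and two beamlets.
    D = [[1.0, 0.5],    # dose to the target voxel per unit beamlet intensity
         [0.5, 1.0]]    # dose to the OAR voxel
    d = [1.0, 0.0]      # prescribe 1 to the target, 0 to the OAR

    x_lo = optimize(D, d, w=[1.0, 1.0])    # equal voxel weights
    x_hi = optimize(D, d, w=[1.0, 10.0])   # heavily penalize the OAR voxel

    oar_dose = lambda x: 0.5 * x[0] + 1.0 * x[1]
    print(oar_dose(x_lo), oar_dose(x_hi))  # raising the OAR weight lowers its dose
    ```

    Re-running the optimizer with different per-voxel weights traces out different points of the trade-off between target coverage and OAR sparing, which is the mechanism behind exploring the Pareto surface via voxel-based weighting factors.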

  6. A novel approach to multi-criteria inverse planning for IMRT

    International Nuclear Information System (INIS)

    Breedveld, Sebastiaan; Storchi, Pascal R M; Keijzer, Marleen; Heemink, Arnold W; Heijmen, Ben J M

    2007-01-01

    Treatment plan optimization is a multi-criteria process. Optimizing solely on one objective, or on a sum of a priori weighted objectives, may result in inferior treatment plans. Manually adjusting weights or constraints in a trial-and-error procedure is time consuming. In this paper we introduce a novel multi-criteria optimization approach to automatically optimize treatment constraints (dose-volume and maximum-dose). The algorithm tries to meet these constraints as well as possible, but in the case of conflicts it relaxes lower-priority constraints so that higher-priority constraints can be met. Afterwards, all constraints are tightened, starting with the highest-priority constraints. The applied constraint priority lists can be used as class solutions for patients with similar tumour types. The presented algorithm iteratively applies an underlying algorithm for beam profile optimization, based on a quadratic objective function with voxel-dependent importance factors. These voxel-dependent importance factors are automatically adjusted to reduce dose-volume and maximum-dose constraint violations
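    The relax-lower-priority/tighten-in-order idea can be shown on a deliberately tiny model: one scalar plan parameter, each constraint an allowed interval, handled lexicographically by priority. This is purely illustrative of the priority mechanism, not the authors' algorithm.

    ```python
    def violation(interval, x):
        """How far x lies outside the allowed interval (0 if satisfied)."""
        lo, hi = interval
        return max(0.0, lo - x, x - hi)

    def lexicographic_best(constraints, grid):
        """constraints: intervals in decreasing priority order.

        Minimize the highest-priority violation first; among the ties,
        minimize the next priority's violation, and so on. A conflicting
        lower-priority constraint is thus relaxed just enough.
        """
        candidates = list(grid)
        for interval in constraints:
            best = min(violation(interval, x) for x in candidates)
            candidates = [x for x in candidates
                          if abs(violation(interval, x) - best) < 1e-9]
        return candidates[0]

    # Target wants x in [60, 70] (priority 1); OAR wants x in [0, 55] (priority 2).
    # The constraints conflict, so the lower-priority OAR constraint is relaxed
    # just enough: the chosen plan sits at the feasible edge nearest the OAR goal.
    grid = [v / 10 for v in range(0, 1001)]   # candidate plans 0.0 .. 100.0
    x = lexicographic_best([(60, 70), (0, 55)], grid)
    print(x)  # 60.0
    ```

    In the paper this ordering is realized inside a continuous beam-profile optimizer rather than by grid search, but the priority semantics are the same.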

  7. 3D Simulation of Elastic Wave Propagation in Heterogeneous Anisotropic Media in Laplace Domain for Electromagnetic-Seismic Inverse Modeling

    Science.gov (United States)

    Petrov, P.; Newman, G. A.

    2011-12-01

    averaging elastic coefficients and three averaging densities are necessary to describe the heterogeneous medium with VTI anisotropy. The resulting system is solved with iterative Krylov methods. The developed method will be incorporated in an inversion scheme for joint seismic-electromagnetic imaging. References. Brown, B.M., M. Jais, I.W. Knowles, 2005, A variational approach to an elastic inverse problem: Inverse Problems, 21, 1953-1973. Commer, M., G. Newman, 2008, New advances in three-dimensional controlled-source electromagnetic inversion: Geophysical Journal International, 172, 513-535. Newman, G. A., M. Commer and J.J. Carazzone, 2010, Imaging CSEM data in the presence of electrical anisotropy: Geophysics 75, 51-61 Petrov, P.V., G. A. Newman (2010), Using 3D Simulation of Elastic Wave Propagation in Laplace Domain for Electromagnetic-Seismic Inverse Modeling, Abstract T21A-2140 presented at 2010 Fall Meeting, AGU, San Francisco, Calif., 13-17 Dec. Shin, C. , W. Ha, 2008, A comparison between the behavior of objective functions for waveform inversion in the frequency and Laplace domains: Geophysics, 73, 119-133. Shin, C. , Y. H. Cha, 2008. Waveform inversion in the Laplace domain: Geophysical Journal International, 173, 922-931.

  8. A Graphical Interactive Simulation Environment for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    1994-01-01

    The paper describes a graphical interactive simulation tool for production planning in bacon factories.

  9. COUPLED FREE AND DISSOLVED PHASE TRANSPORT: NEW SIMULATION CAPABILITIES AND PARAMETER INVERSION

    Science.gov (United States)

    The vadose zone free-phase simulation capabilities of the US EPA Hydrocarbon Spill Screening Model (HSSM) (Weaver et al., 1994) have been linked with the 3-D multi-species dissolved-phase contaminant transport simulator MT3DMS (Zheng and Wang, 1999; Zheng, 2005). The linkage pro...

  10. Effective rates of heavy metal release from alkaline wastes — Quantified by column outflow experiments and inverse simulations

    Science.gov (United States)

    Wehrer, Markus; Totsche, Kai Uwe

    2008-10-01

    Column outflow experiments operated at steady-state flow conditions do not allow the identification of rate-limited release processes; an alternative experimental methodology is required. In this study, the aim was to apply such a methodology in order to identify and quantify effective release rates of heavy metals from granular wastes. Column experiments were conducted with demolition waste and municipal solid waste incineration (MSWI) bottom ash using different flow velocities and multiple flow interruptions. The effluent was analyzed for heavy metals, DOC, electrical conductivity and pH. The breakthrough curves were inversely modeled with a numerical code based on the advection-dispersion equation with first-order mass transfer and nonlinear interaction terms. Chromium, copper, nickel and arsenic are usually released under non-equilibrium conditions. DOC might play a role as a carrier for these trace metals. The inverse simulations generally yield good model fits. Although some parameters are correlated and some model deficiencies can be revealed, we are able to deduce physically reasonable release mass-transfer time scales. Applying forward simulations, the parameter space with equifinal parameter sets was delineated. The results demonstrate that the presented experimental design is capable of identifying and quantifying non-equilibrium conditions. They also show that the possibility of rate-limited release must not be neglected in release and transport studies involving inorganic contaminants.
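    The diagnostic role of flow interruptions can be sketched with a minimal first-order mass-transfer model: a well-mixed column cell exchanges solute with an immobile solid phase at rate k, and during an interruption the aqueous concentration rebounds toward equilibrium — the non-equilibrium signature the experiments exploit. All parameter values below are illustrative assumptions.

    ```python
    def simulate(flow_profile, k=0.05, kd=1.0, s0=10.0, dt=0.1):
        """flow_profile: flushing rate per step (0 = flow interruption)."""
        s, c = s0, 0.0                         # sorbed mass, aqueous concentration
        out = []
        for q in flow_profile:
            transfer = k * (kd * s - c) * dt   # first-order mass transfer
            s -= transfer
            c += transfer
            c -= q * c * dt                    # advective flushing of the cell
            out.append(c)
        return out

    steps = int(20 / 0.1)
    profile = [1.0] * steps + [0.0] * steps + [1.0] * steps  # flow, stop, flow
    c = simulate(profile)
    # Effluent concentration just before vs. at the end of the interruption:
    print(c[steps - 1], c[2 * steps - 1])  # the rebound reveals rate limitation
    ```

    Under equilibrium release the interruption would leave the concentration unchanged; the size of the rebound is what an inverse simulation fits to recover the mass-transfer rate k.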

  11. Enhanced Simulated Annealing for Solving Aggregate Production Planning

    Directory of Open Access Journals (Sweden)

    Mohd Rizam Abu Bakar

    2016-01-01

    Full Text Available Simulated annealing (SA) has been an effective means of addressing difficulties related to optimisation problems, and is now a common research discipline with several productive applications such as production planning. Because aggregate production planning (APP) is one of the most considerable problems in production planning, in this paper we present a multiobjective linear programming model for APP and optimise it with SA. In the course of optimising the APP problem, we found that the capability of SA was inadequate and its performance substandard, particularly for a sizable constrained APP problem with many decision variables and plenty of constraints. Because the algorithm works sequentially, the current state generates only one successor state, which makes the search slower; a further drawback is that the search may fall into a local minimum, i.e. the best solution in only part of the solution space. In order to enhance its performance and alleviate these deficiencies, a modified SA (MSA) is proposed. We attempt to augment the search space by starting with N+1 solutions instead of one. To analyse and investigate the operation of the MSA against the standard SA and harmony search (HS), evaluations are made using the real performance of an industrial company and simulation. The results show that, compared to SA and HS, MSA offers better-quality solutions with regard to convergence and accuracy.
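    The multi-start idea (N+1 current states instead of one) can be sketched with standard SA on a toy multimodal objective standing in for the APP cost. The objective, schedule, and parameters are illustrative assumptions, not the paper's model.

    ```python
    import math, random

    def cost(x):
        """Toy multimodal objective with several local minima."""
        return (x - 3) ** 2 + 3 * math.sin(4 * x)

    def sa(x0, t0=5.0, cooling=0.995, iters=2000, rng=random):
        """Standard simulated annealing from a single start state x0."""
        x, t = x0, t0
        best, best_c = x, cost(x)
        for _ in range(iters):
            cand = x + rng.gauss(0, 0.5)          # neighbour move
            dc = cost(cand) - cost(x)
            if dc < 0 or rng.random() < math.exp(-dc / t):
                x = cand                          # Metropolis acceptance
            if cost(x) < best_c:
                best, best_c = x, cost(x)
            t *= cooling                          # geometric cooling
        return best, best_c

    def multi_start_sa(n_plus_1=5, seed=1):
        """Run SA from N+1 states and keep the overall best (the MSA idea)."""
        rng = random.Random(seed)
        runs = [sa(rng.uniform(-5, 10), rng=rng) for _ in range(n_plus_1)]
        return min(runs, key=lambda r: r[1])

    best_x, best_c = multi_start_sa()
    print(best_x, best_c)
    ```

    With several independent current states, a single run trapped in a poor local minimum no longer decides the outcome, which is the deficiency of sequential SA the abstract describes.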

  12. Two-dimensional inverse planning and delivery with a preclinical image guided microirradiator

    International Nuclear Information System (INIS)

    Stewart, James M. P.; Lindsay, Patricia E.; Jaffray, David A.

    2013-01-01

    profiles taken through the sock distribution of 3.9%. Mean absolute delivery error across the 0–1 Gy linear dose gradient over 7.5 mm was 0.01 Gy.Conclusions: The work presented here demonstrates the potential for complex dose distributions to be planned and automatically delivered with millimeter scale heterogeneity at submillimeter accuracy. This capability establishes the technical foundation for preclinical validation of biologically guided radiotherapy investigations and development of unique radiobiological experiments

  13. Comparison of step-and-shoot treatments generated by different inverse planning systems; Comparacion de tratamiento de steep and shoot generados por diferentes sistemas de planificacion inversa

    Energy Technology Data Exchange (ETDEWEB)

    Perez Moreno, J. M.; Zucca Aparicio, D.; Fernandez Leton, P.; Garcia Ruiz-Zorrilla, J.; Minambres Moro, A.

    2011-07-01

    A challenge of IMRT treatments delivered with the step-and-shoot (static) technique is the number of segments and monitor units used in the treatment. These parameters depend largely on the inverse planning system that generates the treatment. Three commercial planning systems are evaluated, with each one used to perform clinical dosimetry for the same series of patients. Dosimetric results, calculated monitor units, and the number of segments are compared.

  14. Decision support for simulation-based operation planning

    Science.gov (United States)

    Schubert, Johan; Hörling, Pontus

    2016-05-01

    In this paper, we develop methods for analyzing large amounts of data from a military ground combat simulation system. Through a series of processes, we focus the big data set on situations that correspond to important questions and show advantageous outcomes. The result is a decision support methodology that provides commanders with results that answer specific questions of interest, such as what the consequences for the Blue side are in various Red scenarios or what a particular Blue force can withstand. This approach is a step toward taking the traditional data farming methodology from its analytical view into a prescriptive operation planning context and a decision making mode.

  15. Building Performance Simulation tools for planning of energy efficiency retrofits

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    2014-01-01

    Designing energy efficiency retrofits for existing buildings will bring environmental, economic, social, and health benefits. However, selecting specific retrofit strategies is complex and requires careful planning. In this study, we describe a methodology for adopting Building Performance Simulation (BPS) tools as energy and environmentally conscious decision-making aids. The methodology has been developed to screen buildings for potential improvements and to support the development of retrofit strategies. We present a case study of a Danish renovation project, implementing BPS approaches to energy efficiency retrofits in social housing. To generate energy savings, we focus on optimizing the building envelope. We evaluate alternative building envelope actions using procedural solar radiation and daylight simulations. In addition, we identify the digital information flow and the information...

  16. Virtual environment simulation as a tool to support evacuation planning

    International Nuclear Information System (INIS)

    Mol, Antonio C.; Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Carvalho, Paulo V.R.; Jorge, Carlos A.F.; Sales, Douglas S.; Couto, Pedro M.; Botelho, Felipe M.; Bastos, Felipe R.

    2007-01-01

    This work is a preliminary study of the use of a free game engine as a tool to build and navigate virtual environments, with a good degree of realism, for virtual simulations of evacuation from buildings and risk zones. To achieve this goal, some adjustments to the game engine have been implemented. A real building with four floors, consisting of rooms with furniture and people, has been virtually implemented. Simulations of simple, different evacuation scenarios have been performed, measuring the total time spent in each case. The measured times have been compared with the corresponding real evacuation times, measured in the real building. The first results demonstrate that the virtual environment built with the free game engine is capable of reproducing the real situation with a satisfactory level of realism. However, it is important to emphasize that such virtual simulations serve only as an aid in the planning of real evacuation exercises, and as such must never substitute for the latter. (author)

  17. Forward and Inverse Kinematics Analysis of a 5-Degree-of-Freedom Robot Arm Simulator; Analisa Forward Dan Inverse Kinematics Pada Simulator Arm Robot 5 Derajat Kebebasan

    OpenAIRE

    Utomo, Budi; Munadi, Munadi

    2013-01-01

    An arm robot simulator with 5 dof (degrees of freedom), equipped with a two-finger gripper, was designed to determine the movement of the robot manipulator. To build the simulator, we used acrylic as the base material, servomotors as actuators, and an Arduino Uno SMD as the microcontroller. Acrylic was chosen because it is light, strong and durable. The Arduino Uno SMD was chosen because it can interact with LabVIEW, which is used to manually control the movement angle of each servomotor. The purpos...
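    The forward/inverse kinematics pair named in the title can be illustrated on a reduced planar arm. The link lengths, target point, and two-link simplification are assumptions for brevity (the actual device has 5 DOF); this is a textbook sketch, not the authors' code.

    ```python
    import math

    def forward(thetas, lengths):
        """End-effector (x, y) of a planar serial arm (joint angles accumulate)."""
        x = y = phi = 0.0
        for t, l in zip(thetas, lengths):
            phi += t
            x += l * math.cos(phi)
            y += l * math.sin(phi)
        return x, y

    def inverse(x, y, l1, l2):
        """Closed-form inverse kinematics for a 2-link planar arm (one elbow branch)."""
        c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
        t2 = math.acos(max(-1.0, min(1.0, c2)))        # clamp for numerical safety
        t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                           l1 + l2 * math.cos(t2))
        return t1, t2

    t1, t2 = inverse(1.0, 1.0, 1.0, 1.0)
    print(forward([t1, t2], [1.0, 1.0]))   # recovers (1.0, 1.0) up to rounding
    ```

    Round-tripping a target through `inverse` and back through `forward` is the standard sanity check for a kinematics implementation; for 5 DOF the inverse is underdetermined and is usually solved numerically.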

  18. Design and Simulation Plant Layout Using Systematic Layout Planning

    Science.gov (United States)

    Suhardini, D.; Septiani, W.; Fauziah, S.

    2017-12-01

    This research aims to redesign the factory layout of PT. Gunaprima Budiwijaya in order to increase production capacity. The problem faced by this company is an inappropriate layout that causes cross traffic on the production floor. The re-layout procedure consists of three steps: analysing the existing layout, designing the plant layout based on SLP, and evaluating and selecting alternative layouts using ProModel version 6 simulation. Systematic layout planning is used to generate the re-layout without relying on the initial layout. The SLP produces four layout alternatives, and each alternative is evaluated against two criteria, namely material handling cost, using the Material Handling Evaluation Sheet (MHES), and processing time, by simulation. The results showed that production capacity increases by as much as 37.5% with the addition of a machine and an operator, while material handling cost is reduced by improvement of the layout. The use of the systematic layout planning method reduces material handling cost by 10.98% from the initial layout, amounting to Rp1.229.813,34.

  19. Thermohaline structure of an inverse estuary - The Gulf of Kachchh: Measurements and model simulations

    Digital Repository Service at National Institute of Oceanography (India)

    Vethamony, P.; Babu, M.T.; Ramanamurty, M.V.; Saran, A.K.; Joseph, A.; Sudheesh, K.; Patgaonkar, R.S.; Jayakumar, S.

    K thermohaline structure and its variability, based on field measurements and model simulations. Though GoK is considered as a well-mixed system, the study reveals that only the central Gulf is well mixed. Vertical gradients in temperature and salinity fields...

  20. Acoustical characteristics and simulated tomographic inversion of a cold core eddy in the Bay of Bengal

    Digital Repository Service at National Institute of Oceanography (India)

    PrasannaKumar, S.; Navelkar, G.S.; Murty, T.V.R.; Murty, C.S.

    sound speed by about 10 m·s⁻¹. Under the influence of the eddy, the depth of the SOFAR channel axis remains constant (1600 m), which otherwise should have shown a deepening in this region. The simulated ray arrival structure depicts the typical characteristics...

  1. Planning of overhead contact lines and simulation of the pantograph running; Oberleitungsplanung und Simulation des Stromabnehmerlaufes

    Energy Technology Data Exchange (ETDEWEB)

    Hofbauer, Gerhard [ALPINE-ENERGIE Oesterreich GmbH, Linz (Austria); Hofbauer, Werner

    2009-07-01

    Using the software FLTG, all planning steps for overhead contact lines can be carried out based on the parameters of the contact line type and the line data. Contact line supports and individual spans are presented graphically. The geometric interaction of pantograph and contact line can be simulated, taking into account the pantograph type, its sway and the wind action. Thus, the suitability of a line for the interoperability of the trans-European rail system can be demonstrated. (orig.)

  2. SU-E-T-14: A Comparative Study Between Forward and Inverse Planning in Gamma Knife Radiosurgery for Acoustic Neuroma Tumours

    Energy Technology Data Exchange (ETDEWEB)

    Gopishankar, N; Agarwal, Priyanka; Bisht, Raj Kishor; Kale, S S; Rath, G K; Chander, S; Sharma, B S [All India Institute of Medical Sciences, New Delhi (India)

    2015-06-15

    Purpose: To evaluate forward and inverse planning methods for acoustic neuroma cases treated with the Gamma Knife Perfexion. Methods: Five patients with acoustic neuroma tumours abutting the brainstem were each planned twice in the LGP TPS (Version 10.1) using the TMR10 algorithm. The first plan was based entirely on forward planning (FP), in which each shot was chosen manually. The second plan was generated using inverse planning (IP), for which planning parameters such as coverage, selectivity, gradient index (GI) and beam-on time threshold were set. The number of shots in IP was selected automatically by the objective function using an iterative process. In both planning methods, MRI MPRAGE sequence images were used for tumour localization and planning. A planning dose of 12 Gy at the 50% isodose level was chosen. Results and Discussion: The number of shots used in FP was greater than in IP, and beam-on time in FP was on average 1.4 times longer than in IP. One advantage of FP was that the brainstem volume receiving a 6 Gy dose (25% isodose) was smaller in FP than in IP. Our results showed that using a larger number of shots, as in FP, yields a GI less than or equal to 2.55, which is close to its lower limit. Dose homogeneity index (DHI) analysis of FP and IP showed average values of 0.59 and 0.67, respectively. The general trend in GK planning for acoustic neuroma cases is to use small collimator shots to avoid dose to adjacent critical structures. A larger number of shots and prolonged treatment time cause inconvenience to patients; similarly, overuse of automatic shot shaping, as in IP, results in increased scatter dose. A compromise is required in shot selection for these cases. Conclusion: The IP method could be used in acoustic neuroma cases to decrease treatment time, provided the source sector openings near the brainstem are shielded or adjusted appropriately to reduce brainstem dose.

  3. Asian Rhinoplasty: Preoperative Simulation and Planning Using Adobe Photoshop.

    Science.gov (United States)

    Kiranantawat, Kidakorn; Nguyen, Anh H

    2015-11-01

    A rhinoplasty in Asians differs from a rhinoplasty performed in patients of other ethnicities. Surgeons should understand the concept of Asian beauty, the nasal anatomy of Asians, and common problems encountered while operating on the Asian nose. With this understanding, surgeons can set appropriate goals, choose proper operative procedures, and provide an outcome that satisfies patients. In this article the authors define the concept of an Asian rhinoplasty (a paradigm shift from the traditional on-top augmentation rhinoplasty to a structurally integrated augmentation rhinoplasty) and provide a step-by-step procedure for the use of Adobe Photoshop as a preoperative program to simulate the expected surgical outcome for patients and to develop a preoperative plan for surgeons.

  4. Solving Assembly Sequence Planning using Angle Modulated Simulated Kalman Filter

    Science.gov (United States)

    Mustapa, Ainizar; Yusof, Zulkifli Md.; Adam, Asrul; Muhammad, Badaruddin; Ibrahim, Zuwairie

    2018-03-01

    This paper presents an implementation of Simulated Kalman Filter (SKF) algorithm for optimizing an Assembly Sequence Planning (ASP) problem. The SKF search strategy contains three simple steps; predict-measure-estimate. The main objective of the ASP is to determine the sequence of component installation to shorten assembly time or save assembly costs. Initially, permutation sequence is generated to represent each agent. Each agent is then subjected to a precedence matrix constraint to produce feasible assembly sequence. Next, the Angle Modulated SKF (AMSKF) is proposed for solving ASP problem. The main idea of the angle modulated approach in solving combinatorial optimization problem is to use a function, g(x), to create a continuous signal. The performance of the proposed AMSKF is compared against previous works in solving ASP by applying BGSA, BPSO, and MSPSO. Using a case study of ASP, the results show that AMSKF outperformed all the algorithms in obtaining the best solution.
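    The angle-modulation trick mentioned above maps a small continuous parameter vector onto a discrete string through a fixed trigonometric generating function. The form of g(x) below follows the angle-modulated PSO/DE literature and may differ from the paper's exact function; the parameter values are arbitrary illustrations.

    ```python
    import math

    def g(x, a, b, c, d):
        """Generating function used in angle modulation (assumed standard form)."""
        return math.sin(2 * math.pi * (x - a) * b *
                        math.cos(2 * math.pi * (x - a) * c)) + d

    def decode(params, n_bits):
        """Sample g at evenly spaced points; sign of g gives each bit."""
        a, b, c, d = params
        return [1 if g(j, a, b, c, d) > 0 else 0 for j in range(n_bits)]

    # The continuous optimizer (SKF in the paper) searches over (a, b, c, d);
    # every candidate decodes deterministically to one bit pattern.
    bits = decode((0.0, 0.5, 0.8, 0.0), 8)
    print(bits)
    ```

    For ASP the discrete object is an assembly sequence rather than a bit string, so a further decoding step (e.g. ranking the sampled values into a permutation, subject to the precedence matrix) would be layered on top of this mapping.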

  5. Comparison of CT-based 3D treatment planning with simulator planning of pelvic irradiation of primary cervical carcinoma

    International Nuclear Information System (INIS)

    Knocke, T.H.; Pokrajac, B.; Fellner, C.; Poetter, R.

    1999-01-01

    In a prospective study of 20 consecutive patients with primary cervical carcinoma in Stages I to III, simulator planning of a 4-field box technique was performed. After defining the planning target volume (PTV) in the 3D planning system, the field configuration of the simulator planning was transferred. The resulting plan was compared to a second one based on the defined PTV and evaluated regarding a possible geographical miss and encompassment of the PTV by the treated volume (ICRU). Volumes of open and shaped portals were calculated for both techniques. Planning by simulation resulted in 1 geographical miss, and in 10 more cases the encompassment of the PTV by the treated volume was inadequate. For a PTV of 1729 cm³ (mean), the mean volume defined by simulation was 3120 cm³ for the open portals and 2702 cm³ for the shaped portals. The volume reduction by blocks was 13.4% (mean). With CT-based 3D treatment planning, the volume of the open portals was enlarged by 3.3% (mean) to 3224 cm³. The resulting mean volume of the shaped portals was 2458 cm³; the reduction compared to the open portals was 23.8% (mean). The treated volumes were 244 cm³ or 9% (mean) smaller compared to simulator planning. The 'treated volume/planning target volume ratio' decreased from 1.59 to 1.42. (orig.)

  6. Dosimetry audit simulation of treatment planning system in multicenters radiotherapy

    Science.gov (United States)

    Kasmuri, S.; Pawiro, S. A.

    2017-07-01

    The treatment planning system (TPS) is an important modality that determines radiotherapy outcome. A TPS requires input data obtained through commissioning, in which errors can potentially occur; an error at this stage may result in systematic error. The aim of this study was to verify TPS dosimetry and determine the deviation range between calculated and measured dose. This study used the CIRS phantom 002LFC, representing the human thorax, and simulated all external beam radiotherapy stages. The phantom was scanned using a CT scanner, and 8 test cases similar to clinical practice situations were planned and tested in four radiotherapy centers. Doses were measured using a 0.6 cc ionization chamber. The results of this study showed that the deviations of all test cases in the four centers were generally within the agreement criteria, with average deviations of about -0.17±1.59%, -1.64±1.92%, 0.34±1.34% and 0.13±1.81%. The conclusion of this study was that all TPSs involved showed good performance. The superposition algorithm showed poorer performance than either the analytic anisotropic algorithm (AAA) or the convolution algorithm, with average deviations of about -1.64±1.92%, -0.17±1.59% and -0.27±1.51%, respectively.

  7. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2007-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  8. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2006-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  9. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    Energy Technology Data Exchange (ETDEWEB)

    Carlsbad Field Office

    2006-04-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  10. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    Energy Technology Data Exchange (ETDEWEB)

    Carlsbad Field Office

    2007-11-19

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  11. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    Energy Technology Data Exchange (ETDEWEB)

    Carlsbad Field Office

    2007-11-13

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  12. Optimal multi-agent path planning for fast inverse modeling in UAV-based flood sensing applications

    KAUST Repository

    Abdelkader, Mohamed

    2014-05-01

    Floods are the most common natural disasters, causing thousands of casualties every year worldwide. Flash flood events are particularly deadly because of the short timescales on which they occur. Unmanned aerial vehicles equipped with mobile microsensors could be capable of sensing flash floods in real time, saving lives and greatly improving the efficiency of the emergency response. However, one of the main issues is the difficulty of planning the paths of the sensing agents in advance so as to obtain meaningful data as quickly as possible. In this article, we present a fast numerical scheme to quickly compute the trajectories of a set of UAVs in order to maximize the accuracy of model parameter estimation over a time horizon. Simulation results are presented, a preliminary testbed is briefly described, and future research directions and problems are discussed. © 2014 IEEE.
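    The core idea, planning sensing locations so that the collected data are maximally informative for parameter estimation, can be illustrated with a greedy D-optimal design over candidate waypoints. This is a generic sketch, not the paper's numerical scheme; the sensitivity matrix and all sizes below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitivities: each candidate waypoint yields a measurement
# whose sensitivity to the 3 unknown model parameters is a row vector.
n_candidates, n_params = 50, 3
J = rng.normal(size=(n_candidates, n_params))

def greedy_d_optimal(J, n_pick, ridge=1e-6):
    """Greedily pick waypoints maximizing log-det of the Fisher information."""
    chosen, M = [], ridge * np.eye(J.shape[1])
    for _ in range(n_pick):
        best, best_gain = None, -np.inf
        for i in range(len(J)):
            if i in chosen:
                continue
            gain = np.linalg.slogdet(M + np.outer(J[i], J[i]))[1]
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
        M = M + np.outer(J[best], J[best])
    return chosen, M

waypoints, info = greedy_d_optimal(J, n_pick=5)
print(waypoints)
```

    A trajectory planner would additionally constrain consecutive waypoints to be reachable within the vehicle's dynamics; the greedy information criterion stays the same.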

  13. Single-Column Model Simulations of Subtropical Marine Boundary-Layer Cloud Transitions Under Weakening Inversions

    Science.gov (United States)

    Neggers, R. A. J.; Ackerman, A. S.; Angevine, W. M.; Bazile, E.; Beau, I.; Blossey, P. N.; Boutle, I. A.; de Bruijn, C.; Cheng, A.; van der Dussen, J.; Fletcher, J.; Dal Gesso, S.; Jam, A.; Kawai, H.; Cheedela, S. K.; Larson, V. E.; Lefebvre, M.-P.; Lock, A. P.; Meyer, N. R.; de Roode, S. R.; de Rooy, W.; Sandu, I.; Xiao, H.; Xu, K.-M.

    2017-10-01

    Results are presented of the GASS/EUCLIPSE single-column model intercomparison study on the subtropical marine low-level cloud transition. A central goal is to establish the performance of state-of-the-art boundary-layer schemes for weather and climate models for this cloud regime, using large-eddy simulations of the same scenes as a reference. A novelty is that the comparison covers four different cases instead of one, in order to broaden the covered parameter space. Three cases are situated in the North-Eastern Pacific, while one reflects conditions in the North-Eastern Atlantic. A set of variables is considered that reflects key aspects of the transition process, making use of simple metrics to establish the model performance. Using this method, some longstanding problems in low-level cloud representation are identified. Considerable spread exists among models concerning the cloud amount, its vertical structure, and the associated impact on radiative transfer. The sign and amplitude of these biases differ somewhat per case, depending on how far the transition has progressed. After cloud breakup the ensemble median exhibits the well-known "too few too bright" problem. The boundary-layer deepening rate and its state of decoupling are both underestimated, while the representation of the thin capping cloud layer appears complicated by a lack of vertical resolution. Encouragingly, some models are successful in representing the full set of variables, in particular, the vertical structure and diurnal cycle of the cloud layer in transition. An intriguing result is that the median of the model ensemble performs best, inspiring a new approach in subgrid parameterization.

  14. RTSTEP regional transportation simulation tool for emergency planning - final report.

    Energy Technology Data Exchange (ETDEWEB)

    Ley, H.; Sokolov, V.; Hope, M.; Auld, J.; Zhang, K.; Park, Y.; Kang, X. (Energy Systems)

    2012-01-20

    such materials over a large area, with responders trying to mitigate the immediate danger to the population in a variety of ways that may change over time (e.g., in-place evacuation, staged evacuations, and declarations of growing evacuation zones over time). In addition, available resources will be marshaled in unusual ways, such as the repurposing of transit vehicles to support mass evacuations. Thus, any simulation strategy will need to be able to address highly dynamic effects and will need to be able to handle any mode of ground transportation. Depending on the urgency and timeline of the event, emergency responders may also direct evacuees to leave largely on foot, keeping roadways as clear as possible for emergency responders, logistics, mass transport, and law enforcement. This RTSTEP project developed a regional emergency evacuation modeling tool for the Chicago Metropolitan Area that emergency responders can use to pre-plan evacuation strategies and compare different response strategies on the basis of a rather realistic model of the underlying complex transportation system. This approach is a significant improvement over existing response strategies that are largely based on experience gained from small-scale events, anecdotal evidence, and extrapolation to the scale of the assumed emergency. The new tool will thus add to the toolbox available to emergency response planners to help them design appropriate generalized procedures and strategies that lead to an improved outcome when used during an actual event.

  15. TU-EF-304-07: Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Y [Tsinghua University, Beijing, Beijing (China); UT Southwestern Medical Center, Dallas, TX (United States); Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Song, T [Southern Medical University, Guangzhou, Guangdong (China); UT Southwestern Medical Center, Dallas, TX (United States); Wu, Z; Liu, Y [Tsinghua University, Beijing, Beijing (China)

    2015-06-15

    Purpose: Intensity-modulated proton therapy (IMPT) is increasingly used in proton therapy. For IMPT optimization, Monte Carlo (MC) is desired for spot dose calculation because of its high accuracy, especially in cases with a high level of heterogeneity. It is also preferred in biological optimization problems because of its capability of computing quantities related to biological effects. However, MC simulation is typically too slow to be used for this purpose. Although GPU-based MC engines have become available, the achieved efficiency is still not ideal. The purpose of this work is to develop a new optimization scheme to include GPU-based MC in IMPT. Methods: A conventional approach using MC in IMPT simply calls the MC dose engine repeatedly for each spot's dose calculation. This is not optimal, because of the unnecessary computations on spots that turn out to have very small weights after the optimization problem is solved. A GPU memory-writing conflict occurring at small beam sizes also reduces computational efficiency. To solve these problems, we developed a new framework that iteratively performs MC dose calculations and plan optimizations. At each dose calculation step, the particles were sampled from all spots together with a Metropolis algorithm, such that the particle number per spot is proportional to the latest optimized spot intensity. Simultaneously transporting particles from multiple spots also mitigated the memory-writing conflict problem. Results: We validated the proposed MC-based optimization scheme in one prostate case. The total computation time of our method was ∼5–6 min on one NVIDIA GPU card, including both spot dose calculation and plan optimization, whereas a conventional method naively using the same GPU-based MC engine was ∼3 times slower. Conclusion: A fast GPU-based MC dose calculation method along with a novel optimization workflow is developed. The high efficiency makes it attractive for clinical

  16. TU-EF-304-07: Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Proton Therapy

    International Nuclear Information System (INIS)

    Li, Y; Tian, Z; Jiang, S; Jia, X; Song, T; Wu, Z; Liu, Y

    2015-01-01

    Purpose: Intensity-modulated proton therapy (IMPT) is increasingly used in proton therapy. For IMPT optimization, Monte Carlo (MC) is desired for spot dose calculation because of its high accuracy, especially in cases with a high level of heterogeneity. It is also preferred in biological optimization problems because of its capability of computing quantities related to biological effects. However, MC simulation is typically too slow to be used for this purpose. Although GPU-based MC engines have become available, the achieved efficiency is still not ideal. The purpose of this work is to develop a new optimization scheme to include GPU-based MC in IMPT. Methods: A conventional approach using MC in IMPT simply calls the MC dose engine repeatedly for each spot's dose calculation. This is not optimal, because of the unnecessary computations on spots that turn out to have very small weights after the optimization problem is solved. A GPU memory-writing conflict occurring at small beam sizes also reduces computational efficiency. To solve these problems, we developed a new framework that iteratively performs MC dose calculations and plan optimizations. At each dose calculation step, the particles were sampled from all spots together with a Metropolis algorithm, such that the particle number per spot is proportional to the latest optimized spot intensity. Simultaneously transporting particles from multiple spots also mitigated the memory-writing conflict problem. Results: We validated the proposed MC-based optimization scheme in one prostate case. The total computation time of our method was ∼5–6 min on one NVIDIA GPU card, including both spot dose calculation and plan optimization, whereas a conventional method naively using the same GPU-based MC engine was ∼3 times slower. Conclusion: A fast GPU-based MC dose calculation method along with a novel optimization workflow is developed. The high efficiency makes it attractive for clinical
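    The adaptive sampling step described in the Methods allocates simulated particles in proportion to the latest optimized spot intensities. A minimal sketch of that allocation, using plain multinomial sampling as a stand-in for the Metropolis sampler (the spot intensities below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical latest optimized spot intensities (arbitrary units).
intensities = np.array([0.0, 4.0, 1.0, 3.0, 2.0])
n_particles = 100_000

# Draw each simulated particle's source spot with probability proportional
# to the current intensity, so high-weight spots receive most of the MC
# effort and zero-weight spots receive none.
p = intensities / intensities.sum()
spots = rng.choice(len(intensities), size=n_particles, p=p)
counts = np.bincount(spots, minlength=len(intensities))
print(counts)
```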

  17. Performance demonstration program plan for analysis of simulated headspace gases

    International Nuclear Information System (INIS)

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP

  18. A framework for inverse planning of beam-on times for 3D small animal radiotherapy using interactive multi-objective optimisation

    International Nuclear Information System (INIS)

    Balvert, Marleen; Den Hertog, Dick; Van Hoof, Stefan J; Granton, Patrick V; Trani, Daniela; Hoffmann, Aswin L; Verhaegen, Frank

    2015-01-01

    Advances in precision small animal radiotherapy hardware enable the delivery of increasingly complicated dose distributions on the millimeter scale. Manual creation and evaluation of treatment plans becomes difficult or even infeasible as the number of degrees of freedom for dose delivery and the amount of available image data increase. The goal of this work is to develop an optimisation model that determines beam-on times for a given beam configuration, and to assess the feasibility and benefits of an automated treatment planning system for small animal radiotherapy. The developed model determines a Pareto optimal solution using operator-defined weights for a multiple-objective treatment planning problem. An interactive approach allows the planner to navigate towards, and to select, the Pareto optimal treatment plan that yields the most preferred trade-off between the conflicting objectives. This model was evaluated using four small animal cases based on cone-beam computed tomography images. The resulting treatment plan quality was compared to that of manually optimised treatment plans using dose-volume histograms and metrics. Results show that the developed framework is well capable of optimising beam-on times for 3D dose distributions and offers several advantages over manual treatment plan optimisation. For all cases but the simple flank tumour case, a similar amount of time was needed for manual and automated beam-on time optimisation. In this time frame, manual optimisation generates a single treatment plan, while the inverse planning system yields a set of Pareto optimal solutions which provides quantitative insight into the sensitivity of the conflicting objectives. Treatment planning automation decreases the dependence on operator experience and allows the use of class solutions for similar treatment scenarios. This can shorten the time required for treatment planning and therefore increase animal throughput. In addition, this can improve treatment standardisation and
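    The weighted-sum scalarization behind such Pareto navigation can be sketched with a tiny projected-gradient solver for nonnegative beam-on times. The dose-influence matrix, objective weights, and prescription below are hypothetical, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dose-influence matrix: dose[i] = sum_j D[i, j] * t[j],
# with t the per-beam beam-on times (the decision variables).
n_vox, n_beams = 40, 6
D = rng.uniform(0.0, 1.0, size=(n_vox, n_beams))
is_target = np.arange(n_vox) < 20        # first half: tumour voxels
prescribed = 2.0                          # assumed target dose (Gy)

def optimise_times(D, w_target=1.0, w_oar=0.2, iters=5000, lr=1e-3):
    """Weighted-sum scalarization solved by projected gradient (t >= 0)."""
    t = np.zeros(D.shape[1])
    goal = np.where(is_target, prescribed, 0.0)
    w = np.where(is_target, w_target, w_oar)
    for _ in range(iters):
        grad = D.T @ (w * (D @ t - goal))
        t = np.maximum(t - lr * grad, 0.0)  # project onto t >= 0
    return t

t = optimise_times(D)
print(np.round(t, 3))
```

    Sweeping the weight pair (w_target, w_oar) and re-solving traces out the Pareto front the planner navigates interactively.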

  19. Perceived Speech Privacy in Computer Simulated Open-plan Offices

    DEFF Research Database (Denmark)

    Pop, Claudiu B.; Rindel, Jens Holger

    2005-01-01

    In open plan offices the lack of speech privacy between workstations is one of the major acoustic problems. Improving the speech privacy in an open plan design is therefore the main concern for a successful open plan environment. The project described in this paper aimed to find an objective parameter that correlates well with the perceived degree of speech privacy and to derive a clear method for evaluating the acoustic conditions in open plan offices. Acoustic measurements were carried out in an open plan office, followed by data analysis at the Acoustic Department, DTU. A computer model

  20. Retrieval of Seasonal Leaf Area Index from Simulated EnMAP Data through Optimized LUT-Based Inversion of the PROSAIL Model

    Directory of Open Access Journals (Sweden)

    Matthias Locherer

    2015-08-01

    The upcoming satellite mission EnMAP offers the opportunity to retrieve information on the seasonal development of vegetation parameters on a regional scale based on hyperspectral data. This study investigates whether an analysis method for the retrieval of leaf area index (LAI), developed and validated at the 4 m resolution scale of six airborne datasets covering the 2012 growing period, is transferable to the spaceborne 30 m resolution scale of the future EnMAP mission. The widely used PROSAIL model is applied to generate look-up-table (LUT) libraries, by which the model is inverted to derive LAI information. To define the impact of different selection criteria in the inversion process, different techniques for the LUT-based inversion are tested, such as several cost functions, the type and amount of artificial noise, the number of considered solutions, and the type of averaging method. The optimal inversion procedure (Laplace cost function, median, 4% inverse multiplicative noise, averaging 350 out of 100,000 solutions) is identified by validating the results against corresponding in-situ measurements (n = 330) of LAI. Finally, the best performing LUT inversion (R² = 0.65, RMSE = 0.64) is adapted to simulated EnMAP data generated from the airborne acquisitions. The comparison of the retrieval results to upscaled maps of LAI, previously validated at the 4 m scale, shows that the optimized retrieval method can successfully be transferred to spaceborne EnMAP data.
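    The LUT-inversion recipe (simulate many candidate spectra, rank them with a cost function against the measurement, keep the best 350 of 100,000, and take the median LAI) can be sketched with a toy forward model standing in for PROSAIL. A plain RMSE cost replaces the study's Laplace cost here, and every model detail below is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy forward model standing in for PROSAIL: reflectance in a
# few bands as a saturating function of LAI, plus band-dependent offsets.
def toy_canopy_model(lai, n_bands=10):
    bands = np.linspace(0.0, 1.0, n_bands)
    return 0.4 * (1.0 - np.exp(-0.6 * lai))[:, None] + 0.05 * bands

# Build the look-up table: simulate spectra for many candidate LAI values.
lut_lai = rng.uniform(0.0, 7.0, size=100_000)
lut_spectra = toy_canopy_model(lut_lai)

# "Measured" spectrum: model output for LAI = 3 with 4% multiplicative noise.
true_lai = 3.0
measured = toy_canopy_model(np.array([true_lai]))[0]
measured *= 1.0 + 0.04 * rng.standard_normal(measured.size)

# Invert: rank LUT entries by RMSE cost, take the median LAI of the best 350.
cost = np.sqrt(np.mean((lut_spectra - measured) ** 2, axis=1))
best = np.argsort(cost)[:350]
lai_estimate = np.median(lut_lai[best])
print(round(lai_estimate, 2))
```

    Averaging over many good candidates rather than taking the single best entry is what makes the inversion robust to noise and to the ill-posedness of the forward model.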

  1. Sequential use of simulation and optimization in analysis and planning

    Science.gov (United States)

    Hans R. Zuuring; Jimmie D. Chew; J. Greg Jones

    2000-01-01

    Management activities are analyzed at landscape scales employing both simulation and optimization. SIMPPLLE, a stochastic simulation modeling system, is initially applied to assess the risks associated with a specific natural process occurring on the current landscape without management treatments, but with fire suppression. These simulation results are input into...

  2. WE-H-BRC-09: Simulated Errors in Mock Radiotherapy Plans to Quantify the Effectiveness of the Physics Plan Review

    International Nuclear Information System (INIS)

    Gopan, O; Kalet, A; Smith, W; Hendrickson, K; Kim, M; Young, L; Nyflot, M; Chvetsov, A; Phillips, M; Ford, E

    2016-01-01

    Purpose: A standard tool for ensuring the quality of radiation therapy treatments is the initial physics plan review. However, little is known about its performance in practice. The goal of this study is to measure the effectiveness of the physics plan review by introducing simulated errors into “mock” treatment plans and measuring the performance of plan review by physicists. Methods: We generated six mock treatment plans containing multiple errors. These errors were based on incident learning system data both within the department and internationally (SAFRON). The errors were scored for severity and frequency, and those with the highest scores were included in the simulations (13 errors total). Observer bias was minimized using a multiple co-correlated distractor approach. Eight physicists reviewed these plans for errors, with each physicist reviewing, on average, 3 of the 6 plans. The confidence interval for the proportion of errors detected was computed using the Wilson score interval. Results: Simulated errors were detected in 65% of reviews [51–75%] (95% confidence interval [CI] in brackets). The following error scenarios had the highest detection rates: incorrect isocenter in DRRs/CBCT (91% [73–98%]) and a planned dose different from the prescribed dose (100% [61–100%]). Errors with low detection rates involved incorrect field parameters in the record-and-verify system (38% [18–61%]) and incorrect isocenter localization in the planning system (29% [8–64%]). Although pre-treatment QA failure was reliably identified (100%), fewer than 20% of participants reported the error that caused the failure. Conclusion: This is one of the first quantitative studies of error detection. Although the physics plan review is a key safety measure and can identify some errors with high fidelity, other errors are more challenging to detect. These data will guide future work on standardization and automation. Creating new checks or improving existing ones (i.e., via automation) will help in
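    The Wilson score interval quoted throughout the Results can be computed directly from a detection count; the 13-of-20 example below is hypothetical, not the study's raw counts:

```python
import math

def wilson_interval(successes, trials, z=1.96):
    """Wilson score interval (~95% for z = 1.96) for a detection proportion."""
    p = successes / trials
    denom = 1.0 + z * z / trials
    centre = (p + z * z / (2.0 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1.0 - p) / trials
                                   + z * z / (4.0 * trials * trials))
    return centre - half, centre + half

# Hypothetical example: an error detected in 13 of 20 reviews.
lo, hi = wilson_interval(13, 20)
print(f"detected {13 / 20:.0%} [{lo:.0%}-{hi:.0%}]")
```

    Unlike the naive normal approximation, the Wilson interval stays inside [0, 1] and behaves sensibly at small counts and extreme proportions, which matters here given only a handful of reviews per error scenario.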

  3. Planning of general practitioners in the Netherlands: a simulation model.

    NARCIS (Netherlands)

    Greuningen, M. van; Batenburg, R.S.; Velden, L.F.J. van der

    2010-01-01

    Manpower planning can be an important instrument to control shortages (or oversupply) within the health care labour market. The Netherlands is one of the countries with a relatively long tradition of manpower planning in health care. In 1973 the government introduced the numerus clausus for the

  4. The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment

    Science.gov (United States)

    Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are conventionally modeled based on specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must often be validated by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. However, if used as a source of domain knowledge, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.

  5. Fast Simulation of 3-D Surface Flanging and Prediction of the Flanging Lines Based On One-Step Inverse Forming Algorithm

    International Nuclear Information System (INIS)

    Bao Yidong; Hu Sibo; Lang Zhikui; Hu Ping

    2005-01-01

    A fast simulation scheme for 3D curved-binder flanging and blank shape prediction of sheet metal based on a one-step inverse finite element method is proposed, in which total plasticity theory and the proportional loading assumption are used. The scheme can be used to simulate 3D flanging with a complex curved binder shape and is suitable for simulating any type of flanging model by numerically determining the flanging height and flanging lines. Compared with other methods, such as analytic algorithms and the blank sheet-cut return method, the prominent advantage of the present scheme is that it can directly predict the location of the 3D flanging lines while simulating the flanging process, so the prediction time for the flanging lines is markedly decreased. Two typical 3D curved-binder flanging cases, including stretch and shrink characteristics, are simulated with both the present scheme and an incremental, non-inverse FE algorithm based on incremental plasticity theory, which shows the validity and high efficiency of the present scheme

  6. Enhancing Student’s Understanding in Entrepreneurship Through Business Plan Simulation

    Directory of Open Access Journals (Sweden)

    Guzairy M.

    2018-01-01

    A business plan is an important document that guides entrepreneurs in managing their business; it also helps them set strategy and manage future growth. For this reason the Malaysian government has required all higher education providers to offer entrepreneurship education as a compulsory course, one learning outcome of which is that students can write an effective business plan. This study focused on enhancing students' understanding of entrepreneurship through business plan simulation, and considered which factors most facilitate the simulation's role in helping students prepare an effective business plan. The study used a quantitative approach with a pre- and post-test design: 114 students took part in the business simulation and answered pre- and post-surveys. The crucial finding is that student characteristic factors after playing the simulation contribute most to facilitating business plan learning. The results show that business plan simulation can enhance undergraduate students' understanding of entrepreneurship by having them prepare an effective business plan before opening a new startup.
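One common way to summarise pre/post instruments like the one used in this study is the average normalized gain, the improvement each student realises relative to the headroom they had. This is a generic illustration with made-up scores, not the statistic the authors report.

```python
def average_normalized_gain(pre, post, max_score=100.0):
    """Mean of per-student normalized gains (post - pre) / (max - pre).

    Students already at the maximum score are skipped (zero headroom)."""
    gains = [(b - a) / (max_score - a)
             for a, b in zip(pre, post) if a < max_score]
    return sum(gains) / len(gains)
```

Two students scoring 40 to 70 and 60 to 80 each realise half of their available headroom, so the average normalized gain is 0.5.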

  7. Path planning and Ground Control Station simulator for UAV

    Science.gov (United States)

    Ajami, A.; Balmat, J.; Gauthier, J.-P.; Maillot, T.

    In this paper we present a Universal and Interoperable Ground Control Station (UIGCS) simulator for fixed- and rotary-wing Unmanned Aerial Vehicles (UAVs) and all types of payloads. One of the major constraints is to operate and manage multiple legacy and future UAVs while remaining compliant with the NATO Combined/Joint Services Operational Environment (STANAG 4586). Another purpose of the station is to give the UAV a certain degree of autonomy via autonomous planning/replanning strategies. The paper is organized as follows. In Section 2, we describe the nonlinear models of the fixed- and rotary-wing UAVs used in the simulator. In Section 3, we describe the simulator architecture, which is based upon independently programmed, interacting modules. The simulator is linked with an open-source flight simulator to render the video flow and the moving target in 3D. To conclude this part, we briefly address the connection of the Matlab/Simulink software (used to model the UAV dynamics) with the simulation of the virtual environment. Section 5 deals with the flight path control module of the UAV; the control system is divided into four hierarchical layers: flight path, navigation controller, autopilot, and flight control surfaces controller. In Section 6, we focus on trajectory planning/replanning for fixed-wing UAVs, since one goal of this work is to increase the autonomy of the UAV. We propose two types of algorithms, based upon 1) a tangent method and 2) an original Lyapunov-type method; these allow the UAV either to join a fixed pattern or to track a moving target. Finally, Section 7 presents simulation results obtained on our simulator for a rather complicated mission scenario.
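The Lyapunov-type guidance idea of Section 6 — steer so that a distance-like function to a (possibly moving) target decreases — can be sketched with a planar unicycle model and a proportional heading law. Everything below (speeds, gain, time step, the target trajectory) is an illustrative assumption, not the paper's algorithm.

```python
import math

def simulate_pursuit(px, py, heading, v=15.0, k=2.0, dt=0.05, steps=400,
                     target=lambda t: (200.0 + 5.0 * t, 100.0)):
    """Steer a fixed-wing-like unicycle toward a moving target by driving
    the heading error to zero (a Lyapunov-style heading law).

    Returns the final distance to the target."""
    for i in range(steps):
        tx, ty = target(i * dt)
        bearing = math.atan2(ty - py, tx - px)
        # wrap the heading error into (-pi, pi] before applying the gain
        err = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
        heading += k * err * dt
        px += v * math.cos(heading) * dt
        py += v * math.sin(heading) * dt
    tx, ty = target(steps * dt)
    return math.hypot(tx - px, ty - py)
```

Starting about 224 m from a target moving at 5 m/s, a 15 m/s vehicle under this law closes most of the gap within the 20 s simulated here.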

  8. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    Science.gov (United States)

    Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.

    2015-04-01

    Organizations' inability to develop realistic, practical, adequate and unambiguous mechanisms for radiological and nuclear emergency preparedness and response (EPR) plans causes emergency plan disorder and severe disasters; 65.6% of such situations result from poorly defined or unidentified roles and duties of the disaster coordinator. These failures bring a huge aftermath to first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of the Malaysian radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was used to formalize the simulation model requirements; prototyping as systems requirements validation confirms the correctness of the model against stakeholders' intentions in resolving this organizational incapability. Assumptions made for the proposed model through the simulation software address two expected mechanisms: planning and handling of the respective emergency plan, and mitigating the hazards involved. The resulting RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator demonstrates training prerequisites for emergency response rather than intervention principles alone. The demonstrations involved screening the casualties' absorbed dose range and coordinating capacity planning for the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though it remains equally complex.

  9. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    International Nuclear Information System (INIS)

    Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.

    2015-01-01

    Organizations' inability to develop realistic, practical, adequate and unambiguous mechanisms for radiological and nuclear emergency preparedness and response (EPR) plans causes emergency plan disorder and severe disasters; 65.6% of such situations result from poorly defined or unidentified roles and duties of the disaster coordinator. These failures bring a huge aftermath to first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of the Malaysian radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was used to formalize the simulation model requirements; prototyping as systems requirements validation confirms the correctness of the model against stakeholders' intentions in resolving this organizational incapability. Assumptions made for the proposed model through the simulation software address two expected mechanisms: planning and handling of the respective emergency plan, and mitigating the hazards involved. The resulting RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator demonstrates training prerequisites for emergency response rather than intervention principles alone. The demonstrations involved screening the casualties' absorbed dose range and coordinating capacity planning for the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though it remains equally complex

  10. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    Energy Technology Data Exchange (ETDEWEB)

    Hamid, AHA., E-mail: amyhamijah@nm.gov.my [Malaysian Nuclear Agency (NM), Bangi, 43000 Kajang, Selangor (Malaysia); Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, 81310 Johor Bahru, Johor (Malaysia); Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A. [Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, 81310 Johor Bahru, Johor (Malaysia)

    2015-04-29

    Organizations' inability to develop realistic, practical, adequate and unambiguous mechanisms for radiological and nuclear emergency preparedness and response (EPR) plans causes emergency plan disorder and severe disasters; 65.6% of such situations result from poorly defined or unidentified roles and duties of the disaster coordinator. These failures bring a huge aftermath to first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of the Malaysian radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was used to formalize the simulation model requirements; prototyping as systems requirements validation confirms the correctness of the model against stakeholders' intentions in resolving this organizational incapability. Assumptions made for the proposed model through the simulation software address two expected mechanisms: planning and handling of the respective emergency plan, and mitigating the hazards involved. The resulting RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator demonstrates training prerequisites for emergency response rather than intervention principles alone. The demonstrations involved screening the casualties' absorbed dose range and coordinating capacity planning for the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though it remains equally complex.

  11. Lean engineering for planning systems redesign - staff participation by simulation

    NARCIS (Netherlands)

    van der Zee, D.J.; Pool, A.; Wijngaard, J.; Mason, S.J.; Hill, R.R.; Moench, L.; Rose, O.

    2008-01-01

    Lean manufacturing aims at flexible and efficient manufacturing systems by reducing waste in all forms, such as, production of defective parts, excess inventory, unnecessary processing steps, and unnecessary movements of people or materials. Recent research stresses the need to include planning

  12. Simulating a multi-phase tephra fall event: inversion modelling for the 1707 Hoei eruption of Mount Fuji, Japan

    Science.gov (United States)

    Magill, Christina; Mannen, Kazutaka; Connor, Laura; Bonadonna, Costanza; Connor, Charles

    2015-09-01

    Fuji Volcano last erupted in AD 1707, depositing approximately 40 mm of tephra in the area that is now central Tokyo. New high-resolution data describe 17 eruptive phases occurring over a period of 16 days (Miyaji et al., J Volcanol Geotherm Res 207(3-4):113-129, 2011). Inversion techniques were used to best replicate geological data and eyewitness accounts and to estimate eruption source parameters. Inversion results based on data from individual eruptive phases suggest a total erupted mass of 2.09 × 10¹² kg; comparatively, results based on a single data set describing the entire eruption sequence suggest a total mass of 1.69 × 10¹² kg. Values for total erupted mass determined by inversion were compared with those calculated using various curve-fitting approaches. An exponential (two-segment) method, taking into account missing distal data, was found to be most compatible with the inversion results, giving an erupted mass of 2.52 × 10¹² kg when combining individual phases and 1.59 × 10¹² kg when utilising a single data set describing the Hoei sequence. Similarly, a Weibull fitting method determined a total erupted mass of 1.54 × 10¹² kg for the single data set and compared favourably with inversion results when enough data were available. Partitioning extended eruption scenarios into multiple phases and including detailed geological data close to the eruption source replicated the observed deposit more accurately, capturing subtleties such as lobes deposited during transient increases in eruption rate and variations in wind velocity or direction throughout the eruption.
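The curve-fitting side of such mass estimates can be sketched in a few lines: fit a single-segment exponential thinning law T = T0·exp(−k·√A) in log space, then integrate it analytically (∫T dA = 2·T0/k²). This is a one-segment toy version of the two-segment and Weibull fits discussed in the abstract; the synthetic numbers and the deposit-density conversion are illustrative assumptions.

```python
import numpy as np

def erupted_volume_exponential(sqrt_area_km, thickness_m):
    """Fit ln(T) = ln(T0) - k*sqrt(A) and integrate: V = 2*T0/k**2.

    sqrt_area_km: square roots of isopach areas (km);
    thickness_m: corresponding tephra thicknesses (m).
    Returns (T0_km, k_per_km, volume_km3)."""
    x = np.asarray(sqrt_area_km, dtype=float)
    t_km = np.asarray(thickness_m, dtype=float) / 1000.0  # thickness in km
    slope, intercept = np.polyfit(x, np.log(t_km), 1)     # log-linear fit
    T0, k = np.exp(intercept), -slope
    volume_km3 = 2.0 * T0 / k**2
    # erupted mass would follow as volume times deposit density
    return T0, k, volume_km3
```

Synthetic isopach data generated with T0 = 2 m and k = 0.1 km⁻¹ are recovered exactly, giving V = 2·0.002/0.1² = 0.4 km³.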

  13. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, support the study of advanced nuclear weapons design and manufacturing processes, enable analysis of accident scenarios and weapons aging, and provide the tools for stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  14. BRUS2. An energy system simulator for long term planning

    DEFF Research Database (Denmark)

    Skytte, K.; Skjerk Christensen, P.

    1999-01-01

    BRUS2 is a technical-economic bottom-up scenario model. The objective of BRUS2 is to provide decision-makers with information on consequences of given trends of parameters of society like population growth and productivity, and of political goals, e.g., energy saving initiatives. BRUS2 simulates ...

  15. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Science.gov (United States)

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  16. Simulation model for planning metallurgical treatment of large-size billets

    International Nuclear Information System (INIS)

    Timofeev, M.A.; Echeistova, L.A.; Kuznetsov, V.G.; Semakin, S.V.; Krivonogov, A.B.

    1989-01-01

    The computerized simulation system "Ritm" for planning the metallurgical treatment of billets has been developed. Three principles specifying the organizational structure of the treatment cycle are formulated: a cycling principle, a priority principle and a principle of group treatment. The "Ritm" software consists of three independently operating systems: preparation of source data, simulation, and data output

  17. NOTE: Implementation of biologically conformal radiation therapy (BCRT) in an algorithmic segmentation-based inverse planning approach

    Science.gov (United States)

    Vanderstraeten, Barbara; DeGersem, Werner; Duthoy, Wim; DeNeve, Wilfried; Thierens, Hubert

    2006-08-01

    The development of new biological imaging technologies offers the opportunity to further individualize radiotherapy. Biologically conformal radiation therapy (BCRT) implies the use of the spatial distribution of one or more radiobiological parameters to guide the IMRT dose prescription. Our aim was to implement BCRT in an algorithmic segmentation-based planning approach. A biology-based segmentation tool was developed to generate initial beam segments that reflect the biological signal intensity pattern. The weights and shapes of the initial segments are optimized by means of an objective function that minimizes the root mean square deviation between the actual and intended dose values within the PTV. As proof of principle, [18F]FDG-PET-guided BCRT plans for two different levels of dose escalation were created for an oropharyngeal cancer patient. Both plans proved to be dosimetrically feasible without violating the planning constraints for the expanded spinal cord and the contralateral parotid gland as organs at risk. The obtained biological conformity was better for the first (2.5 Gy per fraction) than for the second (3 Gy per fraction) dose escalation level.
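The weight-optimization step described above minimizes the root-mean-square deviation between actual and intended dose within the PTV. A minimal sketch of that step, as a projected gradient descent over nonnegative segment weights, is given below; the dose-influence matrix, step size and iteration count are illustrative assumptions, and the paper's simultaneous optimization of segment shapes is omitted.

```python
import numpy as np

def optimize_segment_weights(D, prescription, iters=3000, lr=None):
    """Minimize ||D w - p||^2 subject to w >= 0 by projected gradient descent.

    D[i, j]: dose to PTV voxel i per unit weight of segment j;
    prescription: intended dose per voxel."""
    n_seg = D.shape[1]
    w = np.zeros(n_seg)
    if lr is None:
        lr = 0.5 / np.linalg.norm(D, 2) ** 2   # safe step for this quadratic
    for _ in range(iters):
        grad = 2.0 * D.T @ (D @ w - prescription)
        w = np.maximum(w - lr * grad, 0.0)     # project onto deliverable weights
    return w
```

On a small synthetic case with a nonnegative ground truth, the recovered weights reproduce the prescribed dose to high accuracy.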

  18. The use of discrete-event simulation modelling to improve radiation therapy planning processes

    International Nuclear Information System (INIS)

    Werker, Greg; Saure, Antoine; French, John; Shechter, Steven

    2009-01-01

    Background and purpose: The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. Materials and methods: A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. Results: The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Conclusions: Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  19. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    Science.gov (United States)

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
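A discrete-event view of a planning pipeline like the one modelled in Arena can be sketched as a tandem of single-server queues: each patient's start at a stage is the later of their arrival there and the stage becoming free. Stage names, service-time means and the arrival rate below are illustrative assumptions, not the BC Cancer Agency's data.

```python
import random

def simulate_planning(n_patients=500, seed=1):
    """Tandem single-server queues for a radiotherapy planning pipeline.

    Returns the mean planning turnaround time (hours)."""
    rng = random.Random(seed)
    stages = [("contouring", 2.0), ("oncologist_review", 6.0), ("dosimetry", 3.0)]
    free_at = [0.0] * len(stages)           # time each stage next becomes free
    total_time, t_arrival = 0.0, 0.0
    for _ in range(n_patients):
        t_arrival += rng.expovariate(1 / 8.0)   # mean 8 h between referrals
        t = t_arrival
        for i, (_, mean_service) in enumerate(stages):
            start = max(t, free_at[i])          # wait for the stage to free up
            finish = start + rng.expovariate(1 / mean_service)
            free_at[i] = finish
            t = finish
        total_time += t - t_arrival
    return total_time / n_patients
```

With the oncologist-review stage loaded at 75% utilization, the mean turnaround comes out well above the 11 h of raw processing time, illustrating why the paper finds oncologist-related delays dominate.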

  20. Strategic planning for skills and simulation labs in colleges of nursing.

    Science.gov (United States)

    Gantt, Laura T

    2010-01-01

    While simulation laboratories for clinical nursing education are predicted to grow, budget cuts may threaten these programs. One way to develop a new lab, and to keep an existing one on track, is to develop and regularly update a strategic plan. The planning process not only keeps the lab faculty and staff apprised of the challenges to be faced, it also keeps senior-level management engaged, since their input and approval of the plan are required. The strategic planning documents drafted by those who supervised the development of the new building and Concepts Integration Labs (CILs) helped guide and orient the faculty and other personnel hired to implement the plan and fulfill the vision. As the CILs' strategic plan was formalized, the draft plans, including the SWOT analysis, were reviewed to provide historical perspective, stimulate discussion, and ensure that past or potential mistakes were not repeated.

  1. Simulation-based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison

    2012-11-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. This is particularly true in pediatric cardiology, due to the wide variation in anatomy observed in congenital heart disease patients. While medical imaging provides increasingly detailed anatomical information, clinicians currently have limited knowledge of important fluid mechanical parameters. Treatment decisions are therefore often made using anatomical information alone, despite the known links between fluid mechanics and disease progression. Patient-specific simulations now offer the means to provide this missing information, and, more importantly, to perform in-silico testing of new surgical designs at no risk to the patient. In this talk, we will outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We will then present new methodology for coupling optimization with simulation and uncertainty quantification to customize treatments for individual patients. Finally, we will present examples in pediatric cardiology that illustrate the potential impact of these tools in the clinical setting.

  2. Plans for RegCM4 CORDEX-CORE simulations

    Science.gov (United States)

    Giorgi, Filippo; Coppola, Erika; Giuliani, Graziano

    2017-04-01

    One of the initiatives of the next phase of CORDEX activities is the so-called CORDEX-CORE program, in which a core set of regional climate models (RCMs) will downscale a core set of Global Climate Model (GCM) 21st-century projections over all, or most, CORDEX continental-scale domains. This effort aims to provide a homogeneous set of RCM-based projections across the land regions of the world for use in impact assessment studies. The RegCM4 model will participate in this effort through contributions from its user community (the Regional Climate research NETwork, or RegCNET). Although the details of the CORDEX-CORE experiment protocol have yet to be finalized by the CORDEX community, it is envisioned that ensembles of RegCM4 projections for the period 1950-2100 (or minimally 1970-2100), downscaling 3-6 GCMs over all CORDEX domains (except the polar ones), will be produced, with forcing from a high-end (likely RCP8.5) and a low-end (likely RCP2.6) GHG concentration pathway. Depending on the availability of GCM simulations, CMIP5 and/or CMIP6 GCMs will be downscaled. The model grid spacing will be 20-25 km, except for the European domain, where it will be 12.5 km. The newest version of the model, RegCM4.6, which includes several new physics options compared to previous versions, will be used. The model will be validated and customized for the different domains via ERA-Interim-driven simulations for the period 1979-2014. The simulations will be conducted by the International Centre for Theoretical Physics (ICTP) team as well as several institutes located in the different CORDEX regions, and the data will be stored in CORDEX output format at different repositories. Analysis teams and targeted workshops will be organized to carefully assess the simulations. This presentation will describe the RegCM4 CORE experiment framework and discuss preliminary results over different CORDEX domains from the ERA-Interim-driven simulations.

  3. Inverse photoemission

    International Nuclear Information System (INIS)

    Namatame, Hirofumi; Taniguchi, Masaki

    1994-01-01

    Photoelectron spectroscopy is regarded as a most powerful technique, since it can measure the occupied electronic states almost completely. Inverse photoelectron spectroscopy, in turn, measures the unoccupied electronic states by using the inverse process, so that, in principle, experiments analogous to photoelectron spectroscopy become feasible. Experimental techniques for inverse photoelectron spectroscopy have been developed energetically by many research groups. At present, work focuses on improving the resolution of inverse photoelectron spectroscopy and on developing spectrometers with tunable photon energy, but no inverse photoelectron spectrometer for the vacuum ultraviolet region is commercially available. This report describes the principle of inverse photoelectron spectroscopy and the present state of the spectrometers, and explores directions for future development. As experimental equipment, electron guns, photon detectors and related components are explained. As experimental examples, inverse photoelectron spectroscopy of semimagnetic semiconductors and resonance inverse photoelectron spectroscopy are reported. (K.I.)

  4. Irradiation test plan of the simulated DUPIC fuel

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Ki Kwang; Yang, M. S.; Kim, B. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-11-01

    Simulated DUPIC fuel was irradiated from Aug. 4 to Oct. 4, 1999, in order to produce data on its in-core behavior, to verify the design of the non-instrumented DUPIC capsule, and to confirm that DUPIC fuel meets the irradiation requirements of HANARO. The welding process was qualified for manufacturing the mini-elements, and simulated DUPIC fuel rods were manufactured with simulated DUPIC pellets through examination and testing. The non-instrumented capsule for the irradiation test of DUPIC fuel was designed and manufactured with reference to the design specification of the HANARO fuel, and is to serve as the design basis of the instrumented capsule under consideration. A verification experiment, checking whether the capsule loaded in the OR4 hole meets the HANARO requirements under normal operating conditions, was carried out along with structural analysis; the test items included a pressure drop test, a vibration test, and an integrity test. Each experimental result met the HANARO operational requirements. For the safety analysis of the non-instrumented DUPIC capsule loaded in the HANARO core, the nuclear/mechanical compatibility, thermodynamic compatibility, and integrity of the irradiation samples under reactor conditions, as well as the safety of HANARO, were analyzed. The core reactivity effects during the irradiation test of the DUPIC capsule are also discussed. The average power of each fuel rod in the DUPIC capsule was calculated, and the maximum linear power, reflecting the axial peaking power factor from the MCNP results, was evaluated; from these calculation results, the HANARO core safety was evaluated. Similar overseas cases are introduced at the end of this report. 9 refs., 16 figs., 10 tabs. (Author)

  5. Multi-institutional comparison of simulated treatment delivery errors in ssIMRT, manually planned VMAT and autoplan-VMAT plans for nasopharyngeal radiotherapy

    DEFF Research Database (Denmark)

    Pogson, Elise M; Aruguman, Sankar; Hansen, Christian R

    2017-01-01

    PURPOSE: To quantify the impact of simulated errors for nasopharynx radiotherapy across multiple institutions and planning techniques (auto-plan generated Volumetric Modulated Arc Therapy (ap-VMAT), manually planned VMAT (mp-VMAT) and manually planned step-and-shoot Intensity Modulated Radiation Therapy (mp-ssIMRT)). METHODS: Ten patients were retrospectively planned with VMAT according to three institutions' protocols. Within one institution, two further treatment plans were generated using differing treatment planning techniques. This resulted in mp-ssIMRT, mp-VMAT, and ap-VMAT plans. Introduced ...

  6. Training Community Modeling and Simulation Business Plan: 2008 Edition

    Science.gov (United States)

    2009-12-01

    ... and small teams that do not have ready access to technical aids. Project: the FY07 DoD Standards Vetting Tool (SVT), funded by the M&S CO, for Service use; development of the SVT was led by the Space & Naval Warfare Systems Center Pacific (SSC Pacific) and executed by SSC Pacific. Another goal is a general-purpose interface that provides a common and interoperable "look and feel" across different simulations.

  7. Visual simulation: a planning and design tool for surface mining

    International Nuclear Information System (INIS)

    Scott, R.D.

    1984-01-01

    Due to the controversial nature of the high level nuclear waste repository project proposed near Canyonlands National Park, public reaction has been considerable, particularly on visual impacts. Canyonlands is a primitive park; its appeal is solitude and pristine landscape. The Bureau of Land Management (BLM), the lead review agency, strongly recommended that the DOE employ visual simulation for displaying effects of visual change for the exploratory shaft, railroad access routes, and repository. As a result this study was conducted to address the concern for scenic values and the potential visual change within and surrounding the lands of Canyonlands National Park

  8. Using total-variation regularization for intensity modulated radiation therapy inverse planning with field-specific numbers of segments

    International Nuclear Information System (INIS)

    Zhu Lei; Lee, Louis; Ma Yunzhi; Xing Lei; Ye Yinyu; Mazzeo, Rafe

    2008-01-01

    Currently, there are two types of treatment planning algorithms for intensity modulated radiation therapy (IMRT). The beamlet-based algorithm generates beamlet intensity maps with high complexity, resulting in large numbers of segments in the delivery after a leaf-sequencing algorithm is applied. The segment-based direct aperture optimization (DAO) algorithm includes the physical constraints of the deliverable apertures in the calculation, and achieves a conformal dose distribution using a small number of segments. However, the number of segments is pre-fixed in most of the DAO approaches, and the typical random search scheme in the optimization is computationally intensive. A regularization-based algorithm is proposed to overcome the drawbacks of the DAO method. Instead of smoothing the beamlet intensity maps as in many existing methods, we include a total-variation term in the optimization objective function to reduce the number of signal levels of the beam intensity maps. An aperture rectification algorithm is then applied to generate a significantly reduced number of deliverable apertures. As compared to the DAO algorithm, our method has an efficient form of quadratic optimization, with an additional advantage of optimizing field-specific numbers of segments based on the modulation complexity. The proposed approach is evaluated using two clinical cases. Under the condition that the clinical acceptance criteria of the treatment plan are satisfied, for the prostate patient, the total number of segments for five fields is reduced from 61 using the Eclipse planning system to 35 using the proposed algorithm; for the head and neck patient, the total number of segments for seven fields is reduced from 107 to 28. The head and neck result is also compared to that using an equal number of four segments for each field. The comparison shows that using field-specific numbers of segments achieves a much improved dose distribution.
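The core idea above — add a total-variation penalty so the optimized intensity map collapses to a few signal levels — can be sketched in one dimension with a smoothed TV term (|u| approximated by sqrt(u² + eps)) and plain gradient descent. The identity "dose" operator, penalty weight, smoothing constant and step size below are illustrative assumptions; the paper solves a quadratic optimization over beamlets and then applies aperture rectification, which this sketch omits.

```python
import numpy as np

def tv_regularized_profile(d, lam=0.5, eps=1e-4, iters=3000, lr=0.004):
    """Gradient descent on ||x - d||^2 + lam * sum_i sqrt((x_{i+1}-x_i)^2 + eps),
    a smoothed total-variation objective that drives the profile x toward a
    piecewise-constant (few-level) beam intensity map."""
    x = np.asarray(d, dtype=float).copy()
    for _ in range(iters):
        diff = np.diff(x)
        w = diff / np.sqrt(diff**2 + eps)     # derivative of the smoothed |diff|
        # chain rule: each difference contributes to its two endpoints
        tv_grad = np.concatenate(([-w[0]], w[:-1] - w[1:], [w[-1]]))
        x -= lr * (2.0 * (x - d) + lam * tv_grad)
    return x
```

Applied to a noisy two-level profile, the result has lower total variation than the input, i.e. fewer effective signal levels for the leaf-sequencing step to reproduce.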

  9. The impact of automatic enrollment in 401(k) plans on future retirement accumulations: a simulation study based on plan design modifications of large plan sponsors.

    Science.gov (United States)

    VanDerhei, Jack

    2010-04-01

SIGNIFICANCE OF AUTO-ENROLLMENT: Automatic enrollment of participants in 401(k) plans, which was encouraged by provisions in the Pension Protection Act of 2006, is designed to overcome the drawbacks of voluntary enrollment by getting more workers to save in their workplace retirement plan. Auto-enrollment for 401(k) plans has been demonstrated by previous EBRI research to have substantial potential benefits for some employees. NEW EBRI RESEARCH: This EBRI study analyzes plan-specific data of 1,000 large defined contribution plans for salaried employees from Benefit SpecSelect (Hewitt Associates LLC) in 2005 and 2009 to compare a subsample of plan sponsors that did not have auto-enrollment in 2005 but had adopted it by 2009. Plan information on both auto-enrollment and match rates was coded before and after the adoption of auto-enrollment for 225 large 401(k) plan sponsors; the average change under auto-enrollment was positive in each of the following three categories: the first-tier match rate, the effective match rate, and the average total employer contribution rate. MODELING ANALYSIS: This analysis created a series of simulation programs using these data. The analysis indicates that the adoption of automatic enrollment in 401(k) plans is likely to have a very significant positive impact (even greater than EBRI projected in 2008) in generating additional retirement savings for many workers, especially young and low-income workers: Under baseline assumptions, the median 401(k) accumulations for the lowest-income quartile of workers currently age 25-29 (assuming all 401(k) plans were voluntary enrollment plans as typified by the 225 large plan sponsors described above) would be only 0.08 times final earnings at age 65. However, if all 401(k) plans are assumed to be using the large plan sponsor auto-enrollment provisions, the median 401(k) accumulations for the lowest-income quartile jumps to 4.96 times final

  10. Mathematical simulation of heat exchange process in regenerator of gas pumping unit using the tools of inverse problems

    Directory of Open Access Journals (Sweden)

    Леонид Михайлович Замиховский

    2015-04-01

Full Text Available The necessity of technical state control of regenerators during operation of a gas pumping unit was substantiated in the article. It was proposed to develop a new method based on mathematical modeling of heat distribution on the surface of the regenerator, together with hardware methods to determine its temperature. A regularization algorithm is considered for the ill-posed inverse heat-conduction problem in the regenerator material, using experimentally determined values of the temperature fields.

  11. A SIMULATION-AS-A-SERVICE FRAMEWORK FACILITATING WEBGIS BASED INSTALLATION PLANNING

    Directory of Open Access Journals (Sweden)

    Z. Zheng

    2017-09-01

Full Text Available Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for properly deploying facilities in space and configuring their functions so that they form a cohesive, mutually supportive system that meets users' operational needs. Based on a requirements analysis, we propose a framework that combines GIS and agent-based simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, the agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service can invoke various geoprocessing services within the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand availability, shared understanding, and improved performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  12. Clinical treatment planning for stereotactic radiotherapy, evaluation by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Kairn, T.; Aland, T.; Kenny, J.; Knight, R.T.; Crowe, S.B.; Langton, C.M.; Franich, R.D.; Johnston, P.N.

    2010-01-01

Full text: This study re-evaluates the doses delivered by a series of clinical stereotactic radiotherapy treatments, to test the accuracy of treatment planning predictions for very small radiation fields. Stereotactic radiotherapy treatment plans for meningiomas near the petrous temporal bone and the foramen magnum (incorporating fields smaller than 1 cm²) were examined using Monte Carlo simulations. Important differences between treatment planning predictions and Monte Carlo calculations of doses delivered to stereotactic radiotherapy patients are apparent. For example, in one case the Monte Carlo calculation shows that the delivery of a planned meningioma treatment would spare the patient's critical structures (eyes, brainstem) more effectively than the treatment plan predicted, and therefore suggests that this patient could safely receive an increased dose to the tumour. Monte Carlo simulations can be used to test the dose predictions made by a conventional treatment planning system for dosimetrically challenging small fields, and can thereby suggest valuable modifications to clinical treatment plans. This research was funded by the Wesley Research Institute, Australia. The authors wish to thank Andrew Fielding and David Schlect for valuable discussions of aspects of this work. The authors are also grateful to Muhammad Kakakhel, for assisting with the design and calibration of our linear accelerator model, and to the stereotactic radiation therapy team at Premion, who designed the treatment plans. Computational resources and services used in this work were provided by the HPC and Research Support Unit, QUT, Brisbane, Australia. (author)

  13. Simulation-optimization model for production planning in the blood supply chain.

    Science.gov (United States)

    Osorio, Andres F; Brailsford, Sally C; Smith, Honora K; Forero-Matiz, Sonia P; Camacho-Rodríguez, Bernardo A

    2017-12-01

    Production planning in the blood supply chain is a challenging task. Many complex factors such as uncertain supply and demand, blood group proportions, shelf life constraints and different collection and production methods have to be taken into account, and thus advanced methodologies are required for decision making. This paper presents an integrated simulation-optimization model to support both strategic and operational decisions in production planning. Discrete-event simulation is used to represent the flows through the supply chain, incorporating collection, production, storing and distribution. On the other hand, an integer linear optimization model running over a rolling planning horizon is used to support daily decisions, such as the required number of donors, collection methods and production planning. This approach is evaluated using real data from a blood center in Colombia. The results show that, using the proposed model, key indicators such as shortages, outdated units, donors required and cost are improved.
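
The perishable-inventory bookkeeping underlying such a model can be sketched as a toy discrete-time simulation with FIFO issuing (the daily collection, demand, and shelf-life figures below are invented for illustration and are not from the Colombian data; the paper's actual model couples discrete-event simulation with an integer linear program):

```python
from collections import deque

def simulate(days, collect_per_day, demand_per_day, shelf_life):
    """FIFO blood inventory: each batch in `stock` is [units, days_left]."""
    stock = deque()
    issued = outdated = shortage = collected = 0
    for _ in range(days):
        stock.append([collect_per_day, shelf_life])   # today's collection
        collected += collect_per_day
        need = demand_per_day
        while need > 0 and stock:                     # issue oldest units first
            batch = stock[0]
            take = min(batch[0], need)
            batch[0] -= take
            issued += take
            need -= take
            if batch[0] == 0:
                stock.popleft()
        shortage += need                              # unmet demand today
        for batch in stock:                           # age remaining stock
            batch[1] -= 1
        while stock and stock[0][1] == 0:             # discard expired units
            outdated += stock.popleft()[0]
    remaining = sum(b[0] for b in stock)
    return dict(collected=collected, issued=issued,
                outdated=outdated, shortage=shortage, remaining=remaining)

result = simulate(days=30, collect_per_day=12, demand_per_day=10, shelf_life=5)
```

With collections slightly above demand, shortages stay at zero but the surplus eventually outdates, which is exactly the shortage/outdate trade-off the optimization layer has to balance.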

  14. Spatial policy, planning and infrastructure investment: lessons from urban simulations in three South African cities

    CSIR Research Space (South Africa)

    Coetzee, M

    2014-05-01

Full Text Available ...in infrastructure that will not have the desired impact on city functioning. The time has arrived to harness hard evidence to develop fresh perspectives on planning and infrastructure investment that will help to extricate ourselves from the dilemma of neo... Spatial policy, planning and infrastructure investment: Lessons from urban simulations in three South African cities. Maria Coetzee, Louis Waldeck, Alize le Roux, Cathy Meiklejohn, Willemien van...

  15. Inverse Limits

    CERN Document Server

    Ingram, WT

    2012-01-01

    Inverse limits provide a powerful tool for constructing complicated spaces from simple ones. They also turn the study of a dynamical system consisting of a space and a self-map into a study of a (likely more complicated) space and a self-homeomorphism. In four chapters along with an appendix containing background material the authors develop the theory of inverse limits. The book begins with an introduction through inverse limits on [0,1] before moving to a general treatment of the subject. Special topics in continuum theory complete the book. Although it is not a book on dynamics, the influen

  16. An Interprofessional Approach to Continuing Education With Mass Casualty Simulation: Planning and Execution.

    Science.gov (United States)

    Saber, Deborah A; Strout, Kelley; Caruso, Lisa Swanson; Ingwell-Spolan, Charlene; Koplovsky, Aiden

    2017-10-01

    Many natural and man-made disasters require the assistance from teams of health care professionals. Knowing that continuing education about disaster simulation training is essential to nursing students, nurses, and emergency first responders (e.g., emergency medical technicians, firefighters, police officers), a university in the northeastern United States planned and implemented an interprofessional mass casualty incident (MCI) disaster simulation using the Project Management Body of Knowledge (PMBOK) management framework. The school of nursing and University Volunteer Ambulance Corps (UVAC) worked together to simulate a bus crash with disaster victim actors to provide continued education for community first responders and train nursing students on the MCI process. This article explains the simulation activity, planning process, and achieved outcomes. J Contin Educ Nurs. 2017;48(10):447-453. Copyright 2017, SLACK Incorporated.

  17. Three-dimensional virtual reality surgical planning and simulation workbench for orthognathic surgery.

    Science.gov (United States)

    Xia, J; Samman, N; Yeung, R W; Shen, S G; Wang, D; Ip, H H; Tideman, H

    2000-01-01

    A new integrated computer system, the 3-dimensional (3D) virtual reality surgical planning and simulation workbench for orthognathic surgery (VRSP), is presented. Five major functions are implemented in this system: post-processing and reconstruction of computed tomographic (CT) data, transformation of 3D unique coordinate system geometry, generation of 3D color facial soft tissue models, virtual surgical planning and simulation, and presurgical prediction of soft tissue changes. The basic mensuration functions, such as linear and spatial measurements, are also included. The surgical planning and simulation are based on 3D CT reconstructions, whereas soft tissue prediction is based on an individualized, texture-mapped, color facial soft tissue model. The surgeon "enters" the virtual operatory with virtual reality equipment, "holds" a virtual scalpel, and "operates" on a virtual patient to accomplish actual surgical planning, simulation of the surgical procedure, and prediction of soft tissue changes before surgery. As a final result, a quantitative osteotomy-simulated bone model and predicted color facial model with photorealistic quality can be visualized from any arbitrary viewing point in a personal computer system. This system can be installed in any hospital for daily use.
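
The coordinate-system transformation step listed among VRSP's functions can be illustrated with a minimal homogeneous-transform sketch (the rotation angle, translation, and landmark coordinates are invented for illustration, not taken from the system):

```python
import numpy as np

def rigid_transform(angle_deg, translation):
    """4x4 homogeneous matrix: rotation about the z-axis plus translation,
    the kind of coordinate-system transformation applied to CT landmarks."""
    a = np.radians(angle_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)],
                 [np.sin(a),  np.cos(a)]]
    T[:3, 3] = translation
    return T

def apply(T, points):
    """Apply a homogeneous transform to an (N, 3) array of landmarks."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ T.T)[:, :3]

T = rigid_transform(90.0, [10.0, 0.0, 5.0])
landmarks = np.array([[1.0, 0.0, 0.0]])      # e.g. a point on the mandible
moved = apply(T, landmarks)
```

Composing such matrices lets bone segments be repositioned in the simulated osteotomy without ever deforming the underlying point data.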

  18. The potential impact of urban growth simulation on the long-term planning of our cities

    CSIR Research Space (South Africa)

    Waldeck, L

    2012-10-01

Full Text Available Slide fragments: consumption patterns of municipal services (water, energy, waste water, solid waste, public transport, libraries, revenue, ...); validation example from the City of Johannesburg, with growth (hu/ha) aggregated to Traffic Analysis Zones versus actual urban growth; motivations for urban growth simulation (reduced carbon footprint, reduced resource consumption, ...). From: The potential impact of urban growth simulation on the long-term planning of our cities, 4th Biennial Conference, presented by Dr Louis Waldeck, 10 October 2012.

  19. RUMBLE Technical Report on Inversion Models

    Science.gov (United States)

    Simons, Dick G.; Ainslie, Michael A.; Muller, Simonette H. E.; Boek, Wilco

    2002-06-01

    The performance of long range low frequency active sonar (LFAS) systems in shallow water is very sensitive to the properties of the sea bed, because of the impact of these on propagation, reverberation and (to a lesser extent) ambient noise. Direct measurement of sea bed parameters using cores or grab samples is impractical for covering a wide area, and instead we consider the possibility of using the LFAS system itself to measure its operating environment. The advantages of this approach are that it exploits existing (or planned) equipment and potentially offers a wide coverage. Geo-acoustic inversion methods are reviewed, with particular consideration for the problems associated with inversion of reverberation data. Three global optimisation methods are described, known as "simulated annealing", "genetic algorithms" and "differential evolution". The Levenberg-Marquardt and downhill simplex local methods are also described. The advantages and disadvantages of each individual method, as well as some hybrid combinations, are discussed in the context of geo-acoustic inversion. A new inversion method has been developed that exploits both the shape and height of the reverberation vs time curve to obtain information about the sea bed reflection loss and scattering strength separately. Tests on synthetic reverberation data show that the inversion method is able to extract parameters representing reflection loss and scattering strength, but cannot always unambiguously separate the effects of sediment sound speed and attenuation. The method is robust to small mismatches in water depth, sonar depth, sediment sound speed gradient and wind speed.
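
A minimal differential-evolution loop of the kind reviewed here can be sketched on a toy two-parameter inversion (the forward reverberation model, parameter values, and DE settings are illustrative assumptions, not RUMBLE's; a production run would use a full propagation/reverberation model):

```python
import numpy as np

def reverb_db(t, refl_loss_db, scatter_db):
    """Toy forward model: reverberation level falls off with time at a rate
    set by bottom reflection loss, offset by the scattering strength."""
    return scatter_db - 2.0 * refl_loss_db * np.log10(1.0 + t)

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 50)
true = np.array([3.0, -27.0])                # (reflection loss dB, scattering dB)
observed = reverb_db(t, *true)               # synthetic "measured" curve

def misfit(p):
    return np.sum((reverb_db(t, p[0], p[1]) - observed) ** 2)

# Minimal differential evolution, rand/1/bin scheme.
bounds = np.array([[0.0, 10.0], [-60.0, 0.0]])
npop, F, CR = 20, 0.8, 0.9
pop = bounds[:, 0] + rng.random((npop, 2)) * (bounds[:, 1] - bounds[:, 0])
cost = np.array([misfit(p) for p in pop])
for _ in range(200):
    for i in range(npop):
        a, b, c = pop[rng.choice(npop, 3, replace=False)]
        mutant = np.clip(a + F * (b - c), bounds[:, 0], bounds[:, 1])
        cross = rng.random(2) < CR
        cross[rng.integers(2)] = True        # ensure at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        tc = misfit(trial)
        if tc <= cost[i]:                    # greedy one-to-one replacement
            pop[i], cost[i] = trial, tc
best = pop[np.argmin(cost)]
```

On this noiseless toy problem the population collapses onto the true parameters; the ambiguity the report describes (sound speed vs attenuation) shows up in real inversions as multiple parameter sets with nearly equal misfit.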

  20. Development of a Searchable Database of Cryoablation Simulations for Use in Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Boas, F. Edward, E-mail: boasf@mskcc.org; Srimathveeravalli, Govindarajan, E-mail: srimaths@mskcc.org; Durack, Jeremy C., E-mail: durackj@mskcc.org [Memorial Sloan Kettering Cancer Center, Department of Radiology (United States); Kaye, Elena A., E-mail: kayee@mskcc.org [Memorial Sloan Kettering Cancer Center, Department of Medical Physics (United States); Erinjeri, Joseph P., E-mail: erinjerj@mskcc.org; Ziv, Etay, E-mail: zive@mskcc.org; Maybody, Majid, E-mail: maybodym@mskcc.org; Yarmohammadi, Hooman, E-mail: yarmohah@mskcc.org; Solomon, Stephen B., E-mail: solomons@mskcc.org [Memorial Sloan Kettering Cancer Center, Department of Radiology (United States)

    2017-05-15

Purpose: To create and validate a planning tool for multiple-probe cryoablation, using simulations of ice ball size and shape for various ablation probe configurations, ablation times, and types of tissue ablated. Materials and Methods: Ice ball size and shape was simulated using the Pennes bioheat equation. Five thousand six hundred and seventy different cryoablation procedures were simulated, using 1–6 cryoablation probes and 1–2 cm spacing between probes. The resulting ice ball was measured along three perpendicular axes and recorded in a database. Simulated ice ball sizes were compared to gel experiments (26 measurements) and clinical cryoablation cases (42 measurements). The clinical cryoablation measurements were obtained from a HIPAA-compliant retrospective review of kidney and liver cryoablation procedures between January 2015 and February 2016. Finally, we created a web-based cryoablation planning tool, which uses the cryoablation simulation database to look up the probe spacing and ablation time that produces the desired ice ball shape and dimensions. Results: Average absolute error between the simulated and experimentally measured ice balls was 1 mm in gel experiments and 4 mm in clinical cryoablation cases. The simulations accurately predicted the degree of synergy in multiple-probe ablations. The cryoablation simulation database covers a wide range of ice ball sizes and shapes up to 9.8 cm. Conclusion: Cryoablation simulations accurately predict the ice ball size in multiple-probe ablations. The cryoablation database can be used to plan ablation procedures: given the desired ice ball size and shape, it will find the number and type of probes, probe configuration and spacing, and ablation time required.
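
The Pennes bioheat equation named in this record can be sketched as a 1D explicit finite-difference freeze simulation (the tissue constants below are textbook-order values, and phase change, 3D geometry, and multiple probes are all ignored, so this is far simpler than the authors' model):

```python
import numpy as np

# 1D explicit finite-difference sketch of the Pennes bioheat equation:
#   rho*c dT/dt = k d2T/dx2 + w_b*rho_b*c_b*(T_a - T)
# with a cryoprobe boundary held at -40 C (latent heat neglected).
k, rho, c = 0.5, 1050.0, 3600.0        # tissue conductivity / density / heat capacity
w = 1900.0                             # lumped perfusion term w_b*rho_b*c_b [W/m^3/K]
T_a, T_probe = 37.0, -40.0
L, n = 0.05, 51                        # 5 cm domain, 1 mm grid
dx = L / (n - 1)
alpha = k / (rho * c)
dt = 0.5                               # satisfies the stability bound dx^2/(2*alpha)
T = np.full(n, T_a)
T[0] = T_probe
for _ in range(int(600 / dt)):         # 10 minutes of freezing
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
    T[1:-1] += dt * (alpha * lap + w * (T_a - T[1:-1]) / (rho * c))
    T[0], T[-1] = T_probe, T_a         # fixed probe and far-field boundaries
x = np.linspace(0.0, L, n)
front = x[np.argmax(T > 0.0)]          # first node above 0 C ~ ice-front edge
```

Sweeping probe count, spacing, and time through a solver like this (in 3D, with freezing physics) and recording the resulting isotherm dimensions is what populates a lookup database of the kind described.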

  1. Development of a Searchable Database of Cryoablation Simulations for Use in Treatment Planning

    International Nuclear Information System (INIS)

    Boas, F. Edward; Srimathveeravalli, Govindarajan; Durack, Jeremy C.; Kaye, Elena A.; Erinjeri, Joseph P.; Ziv, Etay; Maybody, Majid; Yarmohammadi, Hooman; Solomon, Stephen B.

    2017-01-01

Purpose: To create and validate a planning tool for multiple-probe cryoablation, using simulations of ice ball size and shape for various ablation probe configurations, ablation times, and types of tissue ablated. Materials and Methods: Ice ball size and shape was simulated using the Pennes bioheat equation. Five thousand six hundred and seventy different cryoablation procedures were simulated, using 1–6 cryoablation probes and 1–2 cm spacing between probes. The resulting ice ball was measured along three perpendicular axes and recorded in a database. Simulated ice ball sizes were compared to gel experiments (26 measurements) and clinical cryoablation cases (42 measurements). The clinical cryoablation measurements were obtained from a HIPAA-compliant retrospective review of kidney and liver cryoablation procedures between January 2015 and February 2016. Finally, we created a web-based cryoablation planning tool, which uses the cryoablation simulation database to look up the probe spacing and ablation time that produces the desired ice ball shape and dimensions. Results: Average absolute error between the simulated and experimentally measured ice balls was 1 mm in gel experiments and 4 mm in clinical cryoablation cases. The simulations accurately predicted the degree of synergy in multiple-probe ablations. The cryoablation simulation database covers a wide range of ice ball sizes and shapes up to 9.8 cm. Conclusion: Cryoablation simulations accurately predict the ice ball size in multiple-probe ablations. The cryoablation database can be used to plan ablation procedures: given the desired ice ball size and shape, it will find the number and type of probes, probe configuration and spacing, and ablation time required.

  2. Development of a Searchable Database of Cryoablation Simulations for Use in Treatment Planning.

    Science.gov (United States)

    Boas, F Edward; Srimathveeravalli, Govindarajan; Durack, Jeremy C; Kaye, Elena A; Erinjeri, Joseph P; Ziv, Etay; Maybody, Majid; Yarmohammadi, Hooman; Solomon, Stephen B

    2017-05-01

    To create and validate a planning tool for multiple-probe cryoablation, using simulations of ice ball size and shape for various ablation probe configurations, ablation times, and types of tissue ablated. Ice ball size and shape was simulated using the Pennes bioheat equation. Five thousand six hundred and seventy different cryoablation procedures were simulated, using 1-6 cryoablation probes and 1-2 cm spacing between probes. The resulting ice ball was measured along three perpendicular axes and recorded in a database. Simulated ice ball sizes were compared to gel experiments (26 measurements) and clinical cryoablation cases (42 measurements). The clinical cryoablation measurements were obtained from a HIPAA-compliant retrospective review of kidney and liver cryoablation procedures between January 2015 and February 2016. Finally, we created a web-based cryoablation planning tool, which uses the cryoablation simulation database to look up the probe spacing and ablation time that produces the desired ice ball shape and dimensions. Average absolute error between the simulated and experimentally measured ice balls was 1 mm in gel experiments and 4 mm in clinical cryoablation cases. The simulations accurately predicted the degree of synergy in multiple-probe ablations. The cryoablation simulation database covers a wide range of ice ball sizes and shapes up to 9.8 cm. Cryoablation simulations accurately predict the ice ball size in multiple-probe ablations. The cryoablation database can be used to plan ablation procedures: given the desired ice ball size and shape, it will find the number and type of probes, probe configuration and spacing, and ablation time required.

  3. CT-Based Brachytherapy Treatment Planning using Monte Carlo Simulation Aided by an Interface Software

    Directory of Open Access Journals (Sweden)

    Vahid Moslemi

    2011-03-01

Full Text Available Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo based dose calculation for brachytherapy treatment planning, using an interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient's anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Materials and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using an interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes, compared with approximately 1 hour for conventional MCNP geometry entry. The discrepancy between the simulated and measured doses at point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in the rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1

  4. Inverse geometric approach for the simulation of close-to-circular growth. The case of multicellular tumor spheroids

    Science.gov (United States)

    Brutovsky, B.; Horvath, D.; Lisy, V.

    2008-02-01

    We demonstrate the power of genetic algorithms to construct a cellular automata model simulating the growth of 2D close-to-circular clusters, revealing the desired properties, such as the growth rate and, at the same time, the fractal behavior of their contours. The possible application of the approach in the field of tumor modeling is outlined.
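
The genetic-algorithm machinery described here can be sketched with a stand-in fitness function (in the paper the fitness would come from a full cellular-automaton cluster run; the target growth rate, gene encoding, and GA settings below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

def fitness(p, target_rate=0.4):
    """Stand-in for an expensive cellular-automaton run: score how close a
    cell-division probability p brings the cluster's radial growth rate to
    the desired target (lower is better)."""
    return (p - target_rate) ** 2

npop, ngen, sigma = 30, 60, 0.1
pop = rng.random(npop)                 # genes: division probabilities in [0, 1]
for _ in range(ngen):
    scores = fitness(pop)
    elite = pop[np.argsort(scores)[: npop // 2]]      # truncation selection
    children = elite + sigma * rng.standard_normal(elite.size)  # mutation
    pop = np.clip(np.concatenate([elite, children]), 0.0, 1.0)  # elitism kept
best = pop[np.argmin(fitness(pop))]
```

Swapping the analytic fitness for an actual CA growth simulation (growth rate plus contour fractality, as in the abstract) turns this into the inverse geometric approach the authors describe, at vastly higher computational cost per evaluation.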

  5. A comprehensive simulation for wait time reduction and capacity planning applied in general surgery

    NARCIS (Netherlands)

    Vanberkel, P.T.; Blake, John T.

    2007-01-01

    This paper describes the use of operational research techniques to analyze the wait list for the Division of General Surgery at the Capital District Health Authority in Halifax, Nova Scotia, Canada. A discrete event simulation model was developed to aid capacity planning decisions and to analyze the

  6. Fast thermal simulations and temperature optimization for hyperthermia treatment planning, including realistic 3D vessel networks

    NARCIS (Netherlands)

    Kok, H. P.; van den Berg, C. A. T.; Bel, A.; Crezee, J.

    2013-01-01

    Accurate thermal simulations in hyperthermia treatment planning require discrete modeling of large blood vessels. The very long computation time of the finite difference based DIscrete VAsculature model (DIVA) developed for this purpose is impractical for clinical applications. In this work, a fast

  7. Patient dose simulation in X-ray CT using a radiation treatment-planning system

    International Nuclear Information System (INIS)

    Nakae, Yasuo; Oda, Masahiko; Minamoto, Takahiro

    2003-01-01

Medical irradiation dosage has been increasing with the development of new radiological equipment and new techniques such as interventional radiology. Patient dose, in particular, has increased with the advent of multi-slice CT. A number of studies on the irradiation dose of CT have been reported, and the computed tomography dose index (CTDI) is now used as a general means of determining CT dose. However, patient dose distribution in the body varies with the patient's constitution, bowel gas, and exposure conditions. In this study, patient dose was analyzed from the viewpoint of dose distribution, using a radiation treatment-planning computer. Percent depth dose (PDD) and the off-center ratio (OCR) of the CT beam are needed for the planning computer to calculate dose distribution. Therefore, X-ray CT data were measured with various apparatuses, and the beam data were sent to the planning computer. Measured and simulated doses in an elliptical phantom (Mix-Dp: water-equivalent material) were compared, and the CT irradiation dose was determined for patient dose simulation. The rotational radiation treatment technique was used to obtain the patient dose distribution of CT, and patient dose was evaluated through simulation of the dose distribution. CT images of the thorax were sent to the planning computer and simulated. As a result, the patient dose distribution of the thorax was obtained for the CT examination. (author)
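
The PDD/OCR lookup a planning computer performs can be sketched as table interpolation (all table values below are invented placeholders, not measured CT beam data):

```python
import numpy as np

# Illustrative beam-data tables: dose(depth, offset) = D_ref * PDD(depth) * OCR(offset),
# with linear interpolation between measured points.
depths = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 15.0])    # depth in phantom [cm]
pdd = np.array([0.70, 1.00, 0.95, 0.80, 0.60, 0.45])   # percent depth dose
offsets = np.array([0.0, 5.0, 10.0, 15.0, 20.0])       # off-axis distance [cm]
ocr = np.array([1.00, 0.98, 0.90, 0.50, 0.05])         # off-center ratio

def dose(depth_cm, offset_cm, d_ref=20.0):             # reference dose in mGy
    """Interpolate the beam tables to estimate dose at a point."""
    return (d_ref * np.interp(depth_cm, depths, pdd)
                  * np.interp(offset_cm, offsets, ocr))
```

Evaluating this over a grid of points, then summing contributions around a full rotation, is the essence of the rotational-treatment trick used to reproduce a CT dose distribution on a therapy planning system.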

  8. Experience Report: Constraint-Based Modelling and Simulation of Railway Emergency Response Plans

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Sandberg, Lene

    2016-01-01

    We report on experiences from a case study applying a constraint-based process-modelling and -simulation tool, dcrgraphs.net, to the modelling and rehearsal of railway emergency response plans with domain experts. The case study confirmed the approach as a viable means for domain experts to analyse...

  9. Image formation simulation for computer-aided inspection planning of machine vision systems

    Science.gov (United States)

    Irgenfried, Stephan; Bergmann, Stephan; Mohammadikaji, Mahsa; Beyerer, Jürgen; Dachsbacher, Carsten; Wörn, Heinz

    2017-06-01

    In this work, a simulation toolset for Computer Aided Inspection Planning (CAIP) of systems for automated optical inspection (AOI) is presented along with a versatile two-robot-setup for verification of simulation and system planning results. The toolset helps to narrow down the large design space of optical inspection systems in interaction with a system expert. The image formation taking place in optical inspection systems is simulated using GPU-based real time graphics and high quality off-line-rendering. The simulation pipeline allows a stepwise optimization of the system, from fast evaluation of surface patch visibility based on real time graphics up to evaluation of image processing results based on off-line global illumination calculation. A focus of this work is on the dependency of simulation quality on measuring, modeling and parameterizing the optical surface properties of the object to be inspected. The applicability to real world problems is demonstrated by taking the example of planning a 3D laser scanner application. Qualitative and quantitative comparison results of synthetic and real images are presented.

  10. Simulated Prosthesis Overlay for Patient-Specific Planning of Transcatheter Aortic Valve Implantation Procedures.

    Science.gov (United States)

    Sündermann, Simon H; Gessat, Michael; Maier, Willibald; Kempfert, Jörg; Frauenfelder, Thomas; Nguyen, Thi D L; Maisano, Francesco; Falk, Volkmar

    2015-01-01

We tested the hypothesis that simulated three-dimensional prosthesis overlay procedure planning may support valve selection in transcatheter aortic valve implantation (TAVI) procedures. Preoperative multidetector computed tomography (MDCT) data sets from 81 consecutive TAVI patients were included in the study. A planning tool was developed, which semiautomatically creates a three-dimensional model of the aortic root from these data. Three-dimensional templates of the commonly used TAVI implants are spatially registered with the patient data and presented as graphic overlay. Fourteen physicians used the tool to perform retrospective planning of TAVI procedures. Results of prosthesis sizing were compared with the prosthesis size used in the actually performed procedure, and the patients were accordingly divided into three groups: those with equal size (concordance with retrospective planning), oversizing (retrospective planning of a smaller prosthesis), and undersizing (retrospective planning of a larger prosthesis). In the oversizing group, 85% of the patients had new pacemaker implantation. In the undersizing group, in 66%, at least mild paravalvular leakage was observed (greater than grade 1 in one third of the cases). In 46% of the patients in the equal-size group, neither of these complications was observed. Three-dimensional prosthesis overlay in MDCT-derived patient data for patient-specific planning of TAVI procedures is feasible. It may improve valve selection compared with two-dimensional MDCT planning and thus yield better outcomes.

  11. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    Energy Technology Data Exchange (ETDEWEB)

    Wimmer, Thomas, E-mail: thomas.wimmer@medunigraz.at; Srimathveeravalli, Govindarajan; Gutta, Narendra [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States); Ezell, Paula C. [The Rockefeller University, Research Animal Resource Center, Memorial Sloan-Kettering Cancer Center, Weill Cornell Medical College (United States); Monette, Sebastien [The Rockefeller University, Laboratory of Comparative Pathology, Memorial Sloan-Kettering Cancer Center, Weill Cornell Medical College (United States); Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C. [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States); Coleman, Jonathan A. [Memorial Sloan-Kettering Cancer Center, Urology Service, Department of Surgery (United States); Solomon, Stephen B. [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States)

    2015-02-15

Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  12. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    International Nuclear Information System (INIS)

    Wimmer, Thomas; Srimathveeravalli, Govindarajan; Gutta, Narendra; Ezell, Paula C.; Monette, Sebastien; Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C.; Coleman, Jonathan A.; Solomon, Stephen B.

    2015-01-01

    Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes were simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  13. 77 FR 31026 - Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...

    Science.gov (United States)

    2012-05-24

    ...] Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency... entitled: ``Use of Computer Simulation of the United States Blood Supply in Support of Planning for... stakeholders a forum for discussion of data needs and to obtain feedback on possible modeling scenarios to...

  14. A Simulation Based Approach for Contingency Planning for Aircraft Turnaround Operation System Activities in Airline Hubs

    Science.gov (United States)

    Adeleye, Sanya; Chung, Christopher

    2006-01-01

    Commercial aircraft undergo a significant number of maintenance and logistical activities during the turnaround operation at the departure gate. By analyzing the sequencing of these activities, more effective turnaround contingency plans may be developed for logistical and maintenance disruptions. Turnaround contingency plans are particularly important, as any delay in a hub-based system may cascade into further delays for subsequent connections. The contingency sequencing of the maintenance and logistical turnaround activities was analyzed using a combined network and computer simulation modeling approach. Experimental analysis of both current and alternative policies provides a framework to aid in more effective tactical decision making.
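
    The network-analysis half of the approach described above amounts to finding the longest (critical) path through the precedence graph of turnaround activities: delays on that path propagate one-for-one into departure delay, while off-path activities have slack. A minimal sketch with hypothetical activities and durations (the record does not give the actual activity network):

```python
# Hypothetical turnaround activities: (duration in minutes, prerequisite activities).
ACTIVITIES = {
    "park":     (5,  []),
    "deboard":  (15, ["park"]),
    "cater":    (20, ["deboard"]),
    "clean":    (25, ["deboard"]),
    "fuel":     (30, ["park"]),
    "board":    (20, ["cater", "clean", "fuel"]),
    "pushback": (5,  ["board"]),
}

def earliest_finish(activity):
    """Earliest completion time, assuming every prerequisite finishes first."""
    duration, prereqs = ACTIVITIES[activity]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

def critical_path(activity):
    """Chain of activities whose delay propagates directly to departure."""
    _, prereqs = ACTIVITIES[activity]
    if not prereqs:
        return [activity]
    return critical_path(max(prereqs, key=earliest_finish)) + [activity]
```

    Contingency planning then asks how the critical path shifts under a disruption: in this toy network a 10-minute fueling delay leaves departure unchanged, because fueling carries 10 minutes of slack relative to cabin cleaning.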

  15. Benefits of a clinical planning and coordination module: a simulation study

    DEFF Research Database (Denmark)

    Jensen, Sanne; Vingtoft, Søren; Nøhr, Christian

    2013-01-01

    Digital Clinical Practice Guidelines are commonly used in Danish health care. Planning and decision support are particularly important to patients with chronic diseases, who often are in contact with General Practitioners, Community Nurses and hospitals. In the Capital Region of Denmark, the potential benefits of a planning and coordination module have been assessed in a full-scale simulation test including 18 health care professionals. The results showed that health care professionals can benefit from such a module. Furthermore, unexpected new possible benefits concerning communication and quality management emerged during the test, and potential new groups of users were identified...

  16. Renal Tumor Cryoablation Planning. The Efficiency of Simulation on Reconstructed 3D CT Scan

    Directory of Open Access Journals (Sweden)

    Ciprian Valerian LUCAN

    2010-12-01

    Introduction & Objective: Nephron-sparing surgical technique risks are related to tumor relationships with adjacent anatomic structures. The complexity of renal anatomy drives the interest to develop tools for 3D reconstruction and surgery simulation. The aim of this article was to assess simulation on reconstructed 3D CT scans used for planning cryoablation. Material & Method: A prospective randomized study was performed between Jan. 2007 and July 2009 on 27 patients who underwent retroperitoneoscopic T1a renal tumor cryoablation (RC). All patients were assessed preoperatively by CT scan, also used for 3D volume rendering. In Gr.A, patients underwent surgery planning by simulation on the 3D CT scan. In Gr.B, patients underwent standard RC. The two groups were compared in terms of surgical time, bleeding, postoperative drainage, analgesics requirement, hospital stay, and time to socio-professional reintegration. Results: Fourteen patients underwent preoperative cryoablation planning (Gr.A) and 13 patients underwent standard RC (Gr.B). All parameters analyzed were shorter in Gr.A. On multivariate logistic regression, only the shortening of surgical time (138.79±5.51 min in Gr.A vs. 140.92±5.54 min in Gr.B) and bleeding (164.29±60.22 mL in Gr.A vs. 215.38±100.80 mL in Gr.B) achieved statistical significance (p<0.05). The number of cryoneedles assessed by simulation had 92.52% accuracy when compared with those effectively used. Conclusions: Simulation of cryoablation using a reconstructed 3D CT scan improves surgical results. The application used for simulation was able to accurately assess the number of cryoneedles required for tumor ablation, their direction, and approach.

  17. Spatial Variability in Column CO2 Inferred from High Resolution GEOS-5 Global Model Simulations: Implications for Remote Sensing and Inversions

    Science.gov (United States)

    Ott, L.; Putman, B.; Collatz, J.; Gregg, W.

    2012-01-01

    Column CO2 observations from current and future remote sensing missions represent a major advancement in our understanding of the carbon cycle and are expected to help constrain source and sink distributions. However, data assimilation and inversion methods are challenged by the difference in scale of models and observations. OCO-2 footprints represent an area of several square kilometers, while NASA's future ASCENDS lidar mission is likely to have an even smaller footprint. In contrast, the resolution of models used in global inversions is typically hundreds of kilometers, often covering areas that include combinations of land, ocean, and coastal zones and areas of significant topographic, land cover, and population density variation. To improve understanding of the scales of atmospheric CO2 variability and the representativeness of satellite observations, we will present results from a global, 10-km simulation of meteorology and atmospheric CO2 distributions performed using NASA's GEOS-5 general circulation model. This resolution, typical of mesoscale atmospheric models, represents an order of magnitude increase over typical global simulations of atmospheric composition, allowing new insight into small-scale CO2 variations across a wide range of surface flux and meteorological conditions. The simulation includes high resolution flux datasets provided by NASA's Carbon Monitoring System Flux Pilot Project at half-degree resolution that have been down-scaled to 10 km using remote sensing datasets. Probability distribution functions are calculated over larger areas more typical of global models (100–400 km) to characterize subgrid-scale variability in these models. Particular emphasis is placed on coastal regions and regions containing megacities and fires to evaluate the ability of coarse resolution models to represent these small-scale features. Additionally, model output is sampled using averaging kernels characteristic of OCO-2 and ASCENDS measurement
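
    The subgrid-variability diagnostic described above reduces to block statistics: aggregate the fine 10-km field into coarse cells of inversion-model size and examine the spread of fine-grid values inside each cell. A minimal sketch of that computation (pure Python, with a toy 2D field standing in for GEOS-5 output):

```python
def subgrid_std(field, block):
    """Standard deviation of fine-grid values inside each block x block coarse cell."""
    n = len(field)
    coarse = []
    for bi in range(0, n, block):
        row = []
        for bj in range(0, n, block):
            vals = [field[i][j]
                    for i in range(bi, bi + block)
                    for j in range(bj, bj + block)]
            mean = sum(vals) / len(vals)
            row.append((sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5)
        coarse.append(row)
    return coarse
```

    Coarse cells with large subgrid spread (coastlines, megacities, fire plumes) are exactly where a single coarse-model value is least representative of a small satellite footprint.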

  18. Parallel N-Body Simulation Based on the PM and P3M Methods Using Multigrid Schemes in conjunction with Generic Approximate Sparse Inverses

    Directory of Open Access Journals (Sweden)

    P. E. Kyziropoulos

    2015-01-01

    During the last decades, Multigrid methods have been extensively used for solving large sparse linear systems. Considering their efficiency and convergence behavior, Multigrid methods are used in many scientific fields as solvers or preconditioners. Herewith, we propose two hybrid parallel algorithms for N-Body simulations using the Particle Mesh method and the Particle Particle Particle Mesh method, respectively, based on the V-Cycle Multigrid method in conjunction with Generic Approximate Sparse Inverses. The N-Body problem resides in a three-dimensional torus space, and the bodies are subject only to gravitational forces. In each time step of the above methods, a large sparse linear system is solved to compute the gravity potential at each nodal point in order to interpolate the solution to each body. Then the Velocity Verlet method is used to compute the new position and velocity from the acceleration of each respective body. Moreover, a parallel Multigrid algorithm, with a truncated approach in the levels computed in parallel, is proposed for solving large linear systems. Furthermore, parallel results are provided, indicating the efficiency of the proposed Multigrid N-Body scheme. Theoretical estimates for the complexity of the proposed simulation schemes are provided.
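
    The Velocity Verlet update mentioned above advances each body with a position step from the current acceleration, followed by a velocity step averaging the old and new accelerations. A one-dimensional sketch (the gravitational force evaluation via PM/Multigrid is replaced here by a simple harmonic restoring force, purely for illustration):

```python
def velocity_verlet(x, v, accel, dt, steps):
    """Advance (x, v) by `steps` Velocity Verlet updates of size dt."""
    a = accel(x)
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt      # position update
        a_new = accel(x)                     # force evaluation at the new position
        v += 0.5 * (a + a_new) * dt          # velocity update with averaged accel
        a = a_new
    return x, v
```

    For the oscillator accel(x) = -x started at x = 1, v = 0, the energy 0.5*(x**2 + v**2) stays within O(dt**2) of its initial value over a full period; this long-term energy behavior is what makes the scheme attractive for N-body integration.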

  19. A computer simulation model for the practical planning of cervical cancer screening programmes.

    OpenAIRE

    Parkin, D. M.

    1985-01-01

    There is ample evidence of the efficacy of cytological screening in the prevention of cervical cancer but disagreement on the form which screening programmes should take. Simulation models have been used as a convenient and rapid method of exploring the outcome of different screening policies and of demonstrating the importance and interrelationships of the variables concerned. However, most such models are either too abstract or too simplistic to be of practical value in planning screening p...

  20. MRI-based experimentations of fingertip flat compression: Geometrical measurements and finite element inverse simulations to investigate material property parameters.

    Science.gov (United States)

    Dallard, Jérémy; Merlhiot, Xavier; Petitjean, Noémie; Duprey, Sonia

    2018-01-23

    Modeling human-object interactions is a necessary step in the ergonomic assessment of products. Fingertip finite element models can help investigate these interactions if they are built based on realistic geometrical data and material properties. The aim of this study was to investigate the fingertip geometry and its mechanical response under compression, and to identify the parameters of a hyperelastic material property associated with the fingertip soft tissues. Fingertip compression tests in an MRI device were performed on 5 subjects at either 2 or 4 N and at 15° or 50°. The MRI images allowed documentation of both the internal and external fingertip dimensions and the building of 5 subject-specific finite element models. Simulations reproducing the fingertip compression tests were run to obtain the material property parameters of the soft tissues. Results indicated that two ellipses in the sagittal and longitudinal planes could describe the external fingertip geometry. The internal geometries indicated an averaged maximal soft-tissue thickness of 6.4 ± 0.8 mm and a 4 ± 1 mm height for the phalanx bone. The averaged deflections under loading went from 1.8 ± 0.3 mm at 2 N, 50° to 3.1 ± 0.2 mm at 4 N, 15°. Finally, the following set of parameters for a second-order hyperelastic law to model the fingertip soft tissues was proposed: C01 = 0.59 ± 0.09 kPa and C20 = 2.65 ± 0.88 kPa. These data should facilitate further efforts on fingertip finite element modeling. Copyright © 2017 Elsevier Ltd. All rights reserved.
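
    The constants C01 and C20 reported above are consistent with a two-term instance of the generalized Rivlin (polynomial) strain-energy form commonly used for soft tissue. A plausible reading (an assumption here; the exact form used in the paper is not stated in the record) is:

```latex
W \;=\; \sum_{i+j \ge 1} C_{ij}\,(\bar{I}_1 - 3)^i (\bar{I}_2 - 3)^j
\quad\Longrightarrow\quad
W \;=\; C_{01}\,(\bar{I}_2 - 3) \;+\; C_{20}\,(\bar{I}_1 - 3)^2 ,
```

    where \(\bar{I}_1\) and \(\bar{I}_2\) are the first and second deviatoric strain invariants of the deformation.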

  1. Monte Carlo fluence simulation for prospective evaluation of interstitial photodynamic therapy treatment plans

    Science.gov (United States)

    Cassidy, Jeffrey; Betz, Vaughn; Lilge, Lothar

    2015-03-01

    Photodynamic therapy (PDT) delivers a localized cytotoxic dose that is a function of tissue oxygen availability, photosensitive drug concentration, and light fluence. Providing safe and effective PDT requires an understanding of all three elements and the physiological response to the radicals generated. Interstitial PDT (IPDT) for solid tumours poses particular challenges due to complex organ geometries and the associated limitations for diffusion theory based fluence rate prediction, in addition to restricted access for light delivery and dose monitoring. As a first step towards enabling a complete prospective IPDT treatment-planning platform, we demonstrate use of our previously developed FullMonte tetrahedral Monte Carlo simulation engine for modeling of the interstitial fluence field due to intravesicular insertion of brief light sources. The goal is to enable a complete treatment planning and monitoring work flow analogous to that used in ionizing radiation therapy, including plan evaluation through dose-volume histograms and algorithmic treatment plan optimization. FullMonte is to our knowledge the fastest open-source tetrahedral MC light propagation software. Using custom hardware acceleration, we achieve 4x faster computing with 67x better power efficiency for limited-size meshes compared to the software. Ongoing work will improve the performance advantage to 16x with unlimited mesh size, enabling algorithmic plan optimization in reasonable time. Using FullMonte, we demonstrate significant new plan-evaluation capabilities including fluence field visualization, generation of organ dose-volume histograms, and rendering of isofluence surfaces for a representative bladder cancer mesh from a real patient. We also discuss the advantages of MC simulations for dose-volume histogram generation and the need for online personalized fluence-rate monitoring.
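
    A dose-volume histogram of the kind mentioned above reduces a per-voxel dose (or fluence) map to the fraction of an organ's volume receiving at least each threshold level. A minimal cumulative-DVH sketch, independent of FullMonte itself:

```python
def cumulative_dvh(voxel_doses, thresholds):
    """Fraction of volume receiving at least each threshold dose.

    Assumes equal-volume voxels; `voxel_doses` is a flat list of per-voxel
    dose (or fluence) values for one organ.
    """
    n = len(voxel_doses)
    return [sum(d >= t for d in voxel_doses) / n for t in thresholds]
```

    Plan evaluation then reads off summary points (e.g., the fraction of the target at or above the prescription level) directly from the resulting curve, one curve per organ.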

  2. Capacity planning for maternal-fetal medicine using discrete event simulation.

    Science.gov (United States)

    Ferraro, Nicole M; Reamer, Courtney B; Reynolds, Thomas A; Howell, Lori J; Moldenhauer, Julie S; Day, Theodore Eugene

    2015-07-01

    Maternal-fetal medicine is a rapidly growing field requiring collaboration from many subspecialties. We provide an evidence-based estimate of capacity needs for our clinic, as well as demonstrate how simulation can aid in capacity planning in similar environments. A Discrete Event Simulation of the Center for Fetal Diagnosis and Treatment and Special Delivery Unit at The Children's Hospital of Philadelphia was designed and validated. This model was then used to determine the time until demand overwhelms inpatient bed availability under increasing capacity. No significant deviation was found between historical inpatient censuses and simulated censuses for the validation phase (p = 0.889). Prospectively increasing capacity was found to delay time to balk (the inability of the center to provide bed space for a patient in need of admission). With current capacity, the model predicts mean time to balk of 276 days. Adding three beds delays mean time to first balk to 762 days; an additional six beds to 1,335 days. Providing sufficient access is a patient safety issue, and good planning is crucial for targeting infrastructure investments appropriately. Computer-simulated analysis can provide an evidence base for both medical and administrative decision making in a complex clinical environment. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
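
    The time-to-balk quantity reported above can be reproduced with a few lines of discrete-event logic: admissions arrive at random, occupy a bed for a random stay, and the first arrival that finds every bed occupied balks. A toy sketch with assumed Poisson arrivals and exponential stays (the clinic's actual model and parameters are not given in the record):

```python
import heapq
import random

def time_to_first_balk(beds, arrivals_per_day=2.0, mean_stay_days=3.0, seed=1):
    """Simulated days until a patient first finds no free bed (None if never)."""
    rng = random.Random(seed)
    t = 0.0
    discharges = []                              # min-heap of scheduled discharge times
    while t < 10_000.0:                          # simulation horizon in days
        t += rng.expovariate(arrivals_per_day)   # next admission request
        while discharges and discharges[0] <= t:
            heapq.heappop(discharges)            # discharge patients up to now
        if len(discharges) >= beds:
            return t                             # every bed occupied: patient balks
        heapq.heappush(discharges, t + rng.expovariate(1.0 / mean_stay_days))
    return None
```

    With the same random seed, adding beds can only postpone (or eliminate) the first balk, mirroring the study's finding that each capacity increment pushes the mean time to balk further out.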

  3. Planning of development strategy for establishment of advanced simulation of nuclear system

    International Nuclear Information System (INIS)

    Chung, Bubdong; Ko, Wonil; Kwon Junhyun

    2013-12-01

    In this report, a long-term development plan for each technical area is proposed, together with a plan for a coupled code system. A consolidated code system for safety analysis is proposed for future needs, and the computing hardware needed for advanced simulation is also specified. The best approach for future safety analysis simulation capabilities may be a dual-path program, i.e., development programs for an integrated analysis tool and for multi-scale/multi-physics analysis tools, where the former aims at reducing uncertainty and the latter at enhancing accuracy. The integrated analysis tool, with risk-informed safety margin quantification, requires a significant extension of the phenomenological and geometric capabilities of existing reactor safety analysis software, to be capable of detailed simulations that reduce the uncertainties. For the multi-scale, multi-physics analysis tools: simplifications of complex phenomenological models and dependencies have been made in current safety analyses to accommodate computer hardware limitations. With the advent of modern computer hardware, these limitations may be removed to permit greater accuracy in representing the physical behavior of materials in design basis and beyond-design-basis conditions, and hence more accurate assessment of the true safety margins based on first-principles methodology. These proposals can be utilized to develop the advanced simulation project and to formulate the organization and establishment of a high-performance computing system at KAERI.

  4. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    Science.gov (United States)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  5. Modeling, simulation, and optimal initiation planning for needle insertion into the liver.

    Science.gov (United States)

    Sharifi Sedeh, R; Ahmadian, M T; Janabi-Sharifi, F

    2010-04-01

    Needle insertion simulation and planning systems (SPSs) will play an important role in diminishing inappropriate insertions into soft tissues and resultant complications. Difficulties in SPS development are due in large part to the computational requirements of the extensive calculations in finite element (FE) models of tissue. For clinical feasibility, the computational speed of SPSs must be improved. At the same time, a realistic model of tissue properties that reflects large and velocity-dependent deformations must be employed. The purpose of this study is to address the aforementioned difficulties by presenting a cost-effective SPS platform for needle insertions into the liver. The study was constrained to planar (2D) cases, but can be extended to 3D insertions. To accommodate large and velocity-dependent deformations, a hyperviscoelastic model was devised to produce an FE model of liver tissue. Material constants were identified by a genetic algorithm applied to the experimental results of unconfined compressions of bovine liver. The approach for SPS involves B-spline interpolations of sample data generated from the FE model of liver. Two interpolation-based models are introduced to approximate puncture times and to approximate the coordinates of FE model nodes interacting with the needle tip as a function of the needle initiation pose; the latter was also a function of postpuncture time. A real-time simulation framework is provided, and its computational benefit is highlighted by comparing its performance with the FE method. A planning algorithm for optimal needle initiation was designed, and its effectiveness was evaluated by analyzing its accuracy in reaching a random set of targets at different resolutions of sampled data using the FE model. The proposed simulation framework can easily surpass haptic rates (>500 Hz), even with a high pose resolution level (approximately 30). The computational time required to update the coordinates of the node at the

  6. Statistical perspectives on inverse problems

    DEFF Research Database (Denmark)

    Andersen, Kim Emil

    Inverse problems arise in many scientific disciplines and pertain to situations where inference is to be made about a particular phenomenon from indirect measurements. A typical example, arising in diffusion tomography, is the inverse boundary value problem for non-invasive reconstruction of the interior of an object from electrical boundary measurements. One part of this thesis concerns statistical approaches for solving, possibly non-linear, inverse problems. Thus inverse problems are recast in a form suitable for statistical inference. In particular, a Bayesian approach for regularisation... problem is given in terms of probability distributions. Posterior inference is obtained by Markov chain Monte Carlo methods, and new, powerful simulation techniques based on e.g. coupled Markov chains and simulated tempering are developed to improve the computational efficiency of the overall simulation...
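
    A Metropolis sampler of the kind used for posterior inference in such Bayesian inverse problems can be sketched in a few lines. Here the "indirect measurement" is a toy linear forward model y = 2*theta + noise with a standard-normal prior on theta; all numbers are illustrative, and the thesis's coupled chains and simulated tempering refinements are omitted:

```python
import math
import random

def metropolis(log_post, x0, steps=20_000, step=0.5, seed=7):
    """Random-walk Metropolis: returns a list of (correlated) posterior draws."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, step)               # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:     # accept with posterior ratio
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy inverse problem: infer theta from one indirect observation y = 2*theta + noise,
# noise ~ N(0, SIGMA^2), prior theta ~ N(0, 1).
Y_OBS, SIGMA = 3.0, 0.5

def log_post(theta):
    return -0.5 * theta ** 2 - 0.5 * ((Y_OBS - 2.0 * theta) / SIGMA) ** 2
```

    This toy case is conjugate, so the chain can be checked against the exact posterior mean 2*Y_OBS/(SIGMA**2 + 4) ≈ 1.41; for a real non-linear inverse problem no such closed form exists, which is precisely why MCMC is used.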

  7. Preliminary results of radiation therapy for locally advanced or recurrent adenoid cystic carcinomas of the head and neck using combined conventional radiation therapy and hypofractionated inverse planned stereotactic radiation therapy

    International Nuclear Information System (INIS)

    Nomoto, Satoshi; Shioyama, Yoshiyuki; Ohga, Saiji

    2009-01-01

    The purpose of this study was to investigate the clinical outcomes and feasibility of combined conventional radiation therapy (RT) and hypofractionated inverse planned stereotactic radiation therapy (SRT) for locally advanced or recurrent adenoid cystic carcinomas (ACCs) of the head and neck. Five patients with ACCs of the head and neck were treated with combined conventional RT and inverse planned SRT. Radiation doses of 40 to 50 Gy were delivered with 20 to 25 fractions using conventional RT, and then an additional 20 to 25 Gy was delivered by 4 to 5 fractions of SRT. Median follow-up was 12 months. Local control was obtained in all 5 patients, partial response (PR) in 2 patients and stable disease (SD) in 3 patients. According to the Radiation Therapy Oncology Group (RTOG) late-radiation morbidity scoring criteria, adverse effects included Grade 2 xerostomia in 1 patient, Grade 2 trismus in 1 patient, and Grade 4 mucosal ulceration in 1 patient. Combined treatment with conventional RT and hypofractionated inverse planned SRT may be effective for short-term local control in patients with locally advanced or recurrent ACCs. Further evaluation is needed for long-term follow-up. (author)

  8. Simulation of complex data structures for planning of studies with focus on biomarker comparison.

    Science.gov (United States)

    Schulz, Andreas; Zöller, Daniela; Nickels, Stefan; Beutel, Manfred E; Blettner, Maria; Wild, Philipp S; Binder, Harald

    2017-06-13

    There are a growing number of observational studies that do not only focus on single biomarkers for predicting an outcome event, but address questions in a multivariable setting. For example, when quantifying the added value of new biomarkers in addition to established risk factors, the aim might be to rank several new markers with respect to their prediction performance. This makes it important to consider the marker correlation structure for planning such a study. Because of the complexity, a simulation approach may be required to adequately assess sample size or other aspects, such as the choice of a performance measure. In a simulation study based on real data, we investigated how to generate covariates with realistic distributions and what generating model should be used for the outcome, aiming to determine the least amount of information and complexity needed to obtain realistic results. As a basis for the simulation a large epidemiological cohort study, the Gutenberg Health Study was used. The added value of markers was quantified and ranked in subsampling data sets of this population data, and simulation approaches were judged by the quality of the ranking. One of the evaluated approaches, the random forest, requires original data at the individual level. Therefore, also the effect of the size of a pilot study for random forest based simulation was investigated. We found that simple logistic regression models failed to adequately generate realistic data, even with extensions such as interaction terms or non-linear effects. The random forest approach was seen to be more appropriate for simulation of complex data structures. Pilot studies starting at about 250 observations were seen to provide a reasonable level of information for this approach. We advise to avoid oversimplified regression models for simulation, in particular when focusing on multivariable research questions. 
More generally, a simulation should be based on real data for adequately reflecting

  9. Simulation of complex data structures for planning of studies with focus on biomarker comparison

    Directory of Open Access Journals (Sweden)

    Andreas Schulz

    2017-06-01

    Background: There are a growing number of observational studies that do not only focus on single biomarkers for predicting an outcome event, but address questions in a multivariable setting. For example, when quantifying the added value of new biomarkers in addition to established risk factors, the aim might be to rank several new markers with respect to their prediction performance. This makes it important to consider the marker correlation structure for planning such a study. Because of the complexity, a simulation approach may be required to adequately assess sample size or other aspects, such as the choice of a performance measure. Methods: In a simulation study based on real data, we investigated how to generate covariates with realistic distributions and what generating model should be used for the outcome, aiming to determine the least amount of information and complexity needed to obtain realistic results. As a basis for the simulation a large epidemiological cohort study, the Gutenberg Health Study, was used. The added value of markers was quantified and ranked in subsampling data sets of this population data, and simulation approaches were judged by the quality of the ranking. One of the evaluated approaches, the random forest, requires original data at the individual level. Therefore, also the effect of the size of a pilot study for random forest based simulation was investigated. Results: We found that simple logistic regression models failed to adequately generate realistic data, even with extensions such as interaction terms or non-linear effects. The random forest approach was seen to be more appropriate for simulation of complex data structures. Pilot studies starting at about 250 observations were seen to provide a reasonable level of information for this approach. Conclusions: We advise to avoid oversimplified regression models for simulation, in particular when focusing on multivariable research questions. 
More generally

  10. Multi-period multi-objective electricity generation expansion planning problem with Monte-Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tekiner, Hatice [Industrial Engineering, College of Engineering and Natural Sciences, Istanbul Sehir University, 2 Ahmet Bayman Rd, Istanbul (Turkey); Coit, David W. [Department of Industrial and Systems Engineering, Rutgers University, 96 Frelinghuysen Rd., Piscataway, NJ (United States); Felder, Frank A. [Edward J. Bloustein School of Planning and Public Policy, Rutgers University, Piscataway, NJ (United States)

    2010-12-15

    A new approach to the electricity generation expansion problem is proposed to minimize simultaneously multiple objectives, such as cost and air emissions, including CO2 and NOx, over a long-term planning horizon. In this problem, system expansion decisions are made to select the type of power generation, such as coal, nuclear, wind, etc., where the new generation asset should be located, and at which time period expansion should take place. We are able to find a Pareto front for the multi-objective generation expansion planning problem that explicitly considers availability of the system components over the planning horizon and operational dispatching decisions. Monte-Carlo simulation is used to generate numerous scenarios based on the component availabilities and anticipated demand for energy. The problem is then formulated as a mixed integer linear program, and optimal solutions are found based on the simulated scenarios with a combined objective function considering the multiple problem objectives. The different objectives are combined using dimensionless weights and a Pareto front can be determined by varying these weights. The mathematical model is demonstrated on an example problem with interesting results indicating how expansion decisions vary depending on whether minimizing cost or minimizing greenhouse gas emissions or pollutants is given higher priority. (author)
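
    The weighted-sum scalarization described above can be sketched directly: combine normalized cost and emissions with dimensionless weights, minimize for each weight, and collect the winners to trace out (supported points of) the Pareto front. The candidate plans and their (cost, emissions) values below are hypothetical, standing in for the paper's mixed-integer subproblem:

```python
# Hypothetical expansion candidates: (cost, emissions), already made dimensionless.
PLANS = {
    "coal":    (1.0, 9.0),
    "gas":     (1.4, 5.0),
    "wind":    (2.2, 1.0),
    "nuclear": (2.0, 0.5),
}

def best_plan(w):
    """Minimize w*cost + (1 - w)*emissions for a weight w in [0, 1]."""
    return min(PLANS, key=lambda p: w * PLANS[p][0] + (1.0 - w) * PLANS[p][1])

# Sweeping the weight traces out the supported Pareto-optimal plans.
pareto = {best_plan(w / 10.0) for w in range(11)}
```

    Note that "wind" in this toy data is dominated by "nuclear" (worse on both axes) and is therefore never selected at any weight: the weighted-sum sweep recovers only non-dominated solutions.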

  11. A multileaf collimator phantom for the quality assurance of radiation therapy planning systems and CT simulators

    International Nuclear Information System (INIS)

    McNiven, Andrea; Kron, Tomas; Van Dyk, Jake

    2004-01-01

    Purpose: The evolution of three-dimensional conformal radiation treatment has led to the use of multileaf collimators (MLCs) in intensity-modulated radiation therapy (IMRT) and other treatment techniques to increase the conformity of the dose distribution. A new quality assurance (QA) phantom has been designed to check the handling of MLC settings in treatment planning and delivery. Methods and materials: The phantom consists of a Perspex block with stepped edges that can be rotated in all planes. The design allows for the assessment of several MLC and micro-MLC types from various manufacturers, and is therefore applicable to most radiation therapy institutions employing MLCs. The phantom is computed tomography (CT) scanned as is a patient, and QA assessments can be made of field edge display for a variety of shapes and orientations on both radiation treatment planning systems (RTPS) and computed tomography simulators. Results: The dimensions of the phantom were verified to be physically correct within an uncertainty range of 0-0.7 mm. Errors in leaf position larger than 1 mm were easily identified by multiple observers. Conclusions: The MLC geometry phantom is a useful tool in the QA of radiation therapy with application to RTPS, CT simulators, and virtual simulation packages with MLC display capabilities

  12. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    Science.gov (United States)

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. Changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), whose usage reached 58.9%. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results: the average waiting time per patient could be reduced by 45% when 2 therapists were pooled. We found that treating up to 12 new patients per session had no significant negative impact on returning patients. Moreover, the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
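
    The pooling effect reported above can be reproduced with a minimal discrete-event sketch: one shared queue feeding two therapists versus two dedicated streams. The arrival and service rates below are invented for illustration, not the clinic's data:

```python
import heapq
import random

def sim_wait(pooled, n=20000, lam=1.7, mu=1.0, seed=7):
    """Mean patient waiting time with 2 therapists.
    pooled=True : one shared queue (patients take the first free therapist).
    pooled=False: arrivals alternate between two dedicated therapists."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n):
        t += rng.expovariate(lam)          # Poisson arrivals
        arrivals.append(t)
    services = [rng.expovariate(mu) for _ in range(n)]
    total = 0.0
    if pooled:
        free = [0.0, 0.0]                  # next-free times of both therapists
        for a, s in zip(arrivals, services):
            f = heapq.heappop(free)        # earliest-free therapist
            start = max(a, f)
            total += start - a
            heapq.heappush(free, start + s)
    else:
        free = [0.0, 0.0]
        for i, (a, s) in enumerate(zip(arrivals, services)):
            k = i % 2                      # fixed, dedicated assignment
            start = max(a, free[k])
            total += start - a
            free[k] = start + s
    return total / n

pooled_wait = sim_wait(pooled=True)
dedicated_wait = sim_wait(pooled=False)
```

    At this utilization (about 85%) the pooled queue gives a markedly lower mean wait, in line with the reported benefit of pooling.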

  13. Optimal Acceleration-Velocity-Bounded Trajectory Planning in Dynamic Crowd Simulation

    Directory of Open Access Journals (Sweden)

    Fu Yue-wen

    2014-01-01

    Full Text Available Creating complex and realistic crowd behaviors, such as pedestrian navigation behavior with dynamic obstacles, is a difficult and time-consuming task. In this paper, we study one special type of crowd, composed of urgent individuals, normal individuals, and normal groups. We use three steps to construct the crowd simulation in a dynamic environment. The first is that the urgent individuals move forward along a given path around dynamic obstacles and other crowd members. An optimal acceleration-velocity-bounded trajectory planning method is utilized to model their behaviors, which ensures that the durations of the generated trajectories are minimal and that the urgent individuals are collision-free with dynamic obstacles (e.g., dynamic vehicles). In the second step, a pushing model is adopted to simulate the interactions between urgent members and normal ones, which keeps the computational cost of the optimal trajectory planning acceptable. The third step imitates the interactions among normal members using collision avoidance behavior and flocking behavior. Various simulation results demonstrate that these three steps produce realistic crowd phenomena.
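
    For a single axis, the minimal-duration rest-to-rest trajectory under acceleration and velocity bounds is the classic triangular/trapezoidal velocity profile. A small sketch of that core computation (a 1-D simplification of the paper's planner, with illustrative bounds):

```python
import math

def min_time_profile(d, v_max, a_max):
    """Minimal-duration rest-to-rest motion over distance d > 0 with
    |v| <= v_max and |a| <= a_max. Returns (T, pos) where pos(t) is the
    position along the bang-bang (triangular or trapezoidal) profile."""
    if d * a_max <= v_max ** 2:        # never reaches v_max: triangular
        ta = math.sqrt(d / a_max)      # acceleration (= deceleration) time
        T = 2.0 * ta
    else:                              # trapezoidal with a cruise phase
        ta = v_max / a_max
        T = d / v_max + ta
    vp = a_max * ta                    # peak velocity actually reached

    def pos(t):
        t = max(0.0, min(t, T))
        if t <= ta:                              # accelerate
            return 0.5 * a_max * t * t
        if t <= T - ta:                          # cruise
            return 0.5 * a_max * ta * ta + vp * (t - ta)
        dt = T - t                               # decelerate
        return d - 0.5 * a_max * dt * dt
    return T, pos
```

    For d = 10, v_max = 2, a_max = 1 this yields the trapezoidal duration T = d/v_max + v_max/a_max = 7; shorter distances fall back to the triangular case T = 2*sqrt(d/a_max).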

  14. Preoperative simulation for the planning of microsurgical clipping of intracranial aneurysms.

    Science.gov (United States)

    Marinho, Paulo; Vermandel, Maximilien; Bourgeois, Philippe; Lejeune, Jean-Paul; Mordon, Serge; Thines, Laurent

    2014-12-01

    The safety and success of intracranial aneurysm (IA) surgery could be improved through the dedicated application of simulation covering the procedure from the 3-dimensional (3D) description of the surgical scene to the visual representation of the clip application. We aimed in this study to validate the technical feasibility and clinical relevance of such a protocol. All patients preoperatively underwent 3D magnetic resonance imaging and 3D computed tomography angiography to build 3D reconstructions of the brain, cerebral arteries, and surrounding cranial bone. These 3D models were segmented and merged using Osirix, a DICOM image processing application. This provided the surgical scene that was subsequently imported into Blender, a modeling platform for 3D animation. Digitized clips and appliers could then be manipulated in the virtual operative environment, allowing the visual simulation of clipping. This simulation protocol was assessed in a series of 10 IAs by 2 neurosurgeons. The protocol was feasible in all patients. The visual similarity between the surgical scene and the operative view was excellent in 100% of the cases, and the identification of the vascular structures was accurate in 90% of the cases. The neurosurgeons found the simulation helpful for planning the surgical approach (ie, the bone flap, cisternal opening, and arterial tree exposure) in 100% of the cases. The correct number of final clip(s) needed was predicted from the simulation in 90% of the cases. The preoperatively expected characteristics of the optimal clip(s) (ie, their number, shape, size, and orientation) were validated during surgery in 80% of the cases. This study confirmed that visual simulation of IA clipping based on the processing of high-resolution 3D imaging can be effective. This is a new and important step toward the development of a more sophisticated integrated simulation platform dedicated to cerebrovascular surgery.

  15. Single-Column Model Simulations of Subtropical Marine Boundary-Layer Cloud Transitions Under Weakening Inversions

    Energy Technology Data Exchange (ETDEWEB)

    Neggers, R. A. J. [Institute for Geophysics and Meteorology, Department of Geosciences, University of Cologne, Cologne Germany; Royal Netherlands Meteorological Institute, De Bilt The Netherlands; Ackerman, A. S. [NASA Goddard Institute for Space Studies, New York NY USA; Angevine, W. M. [CIRES, University of Colorado, Boulder CO USA; NOAA Earth System Research Laboratory, Boulder CO USA; Bazile, E. [Météo France/CNRM, Toulouse France; Beau, I. [Météo France/ENM, Toulouse France; Blossey, P. N. [Department of Atmospheric Sciences, University of Washington, Seattle WA USA; Boutle, I. A. [Met Office, Exeter UK; de Bruijn, C. [Royal Netherlands Meteorological Institute, De Bilt The Netherlands; Cheng, A. [NOAA Center for Weather and Climate Prediction, Environmental Modeling Center, College Park MD USA; van der Dussen, J. [Department of Geoscience and Remote Sensing, Delft University of Technology, Delft The Netherlands; Fletcher, J. [Department of Atmospheric Sciences, University of Washington, Seattle WA USA; University of Leeds, Leeds UK; Dal Gesso, S. [Institute for Geophysics and Meteorology, Department of Geosciences, University of Cologne, Cologne Germany; Royal Netherlands Meteorological Institute, De Bilt The Netherlands; Jam, A. [Météo-France/CNRM & CNRS/IPSL/LMD, Toulouse France; Kawai, H. [Meteorological Research Institute, Climate Research Department, Japan Meteorological Agency, Tsukuba Japan; Cheedela, S. K. [Department of Atmosphere in the Earth System, Max-Planck Institut für Meteorologie, Hamburg Germany; Larson, V. E. [Department of Mathematical Sciences, University of Wisconsin-Milwaukee, Milwaukee WI USA; Lefebvre, M. -P. [Météo-France/CNRM & CNRS/IPSL/LMD, Toulouse France; Lock, A. P. [Met Office, Exeter UK; Meyer, N. R. [Department of Mathematical Sciences, University of Wisconsin-Milwaukee, Milwaukee WI USA; de Roode, S. R. [Department of Geoscience and Remote Sensing, Delft University of Technology, Delft The Netherlands; de Rooy, W. 
[Royal Netherlands Meteorological Institute, De Bilt The Netherlands; Sandu, I. [Section of Physical Aspects, European Centre for Medium-Range Weather Forecasts, Reading UK; Xiao, H. [University of California at Los Angeles, Los Angeles CA USA; Pacific Northwest National Laboratory, Richland WA USA; Xu, K. -M. [NASA Langley Research Centre, Hampton VA USA

    2017-10-01

    Results are presented of the GASS/EUCLIPSE single-column model inter-comparison study on the subtropical marine low-level cloud transition. A central goal is to establish the performance of state-of-the-art boundary-layer schemes for weather and climate models for this cloud regime, using large-eddy simulations of the same scenes as a reference. A novelty is that the comparison covers four different cases instead of one, in order to broaden the covered parameter space. Three cases are situated in the North-Eastern Pacific, while one reflects conditions in the North-Eastern Atlantic. A set of variables is considered that reflects key aspects of the transition process, making use of simple metrics to establish the model performance. Using this method some longstanding problems in low-level cloud representation are identified. Considerable spread exists among models concerning the cloud amount, its vertical structure and the associated impact on radiative transfer. The sign and amplitude of these biases differ somewhat per case, depending on how far the transition has progressed. After cloud breakup the ensemble median exhibits the well-known “too few too bright” problem. The boundary layer deepening rate and its state of decoupling are both underestimated, while the representation of the thin capping cloud layer appears complicated by a lack of vertical resolution. Encouragingly, some models are successful in representing the full set of variables, in particular the vertical structure and diurnal cycle of the cloud layer in transition. An intriguing result is that the median of the model ensemble performs best, inspiring a new approach in subgrid parameterization.

  16. Fast computation of the inverse CMH model

    Science.gov (United States)

    Patel, Umesh D.; Della Torre, Edward

    2001-12-01

    A fast computational method based on a differential-equation approach for the inverse Della Torre, Oti, Kádár (DOK) model has been extended to the inverse Complete Moving Hysteresis (CMH) model. A cobweb technique for calculating the inverse CMH model is also presented. The two techniques differ in flexibility, accuracy, and computation time. Simulation results of the inverse computation are presented for both methods.
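
    The cobweb idea can be illustrated on a generic monotone transfer curve: repeatedly map the output error back through a slope bound until the input converges. The tanh curve below is a stand-in for illustration, not the CMH model itself:

```python
import math

def invert_cobweb(f, target, slope_bound, x0=0.0, tol=1e-10, max_iter=1000):
    """Invert a monotone increasing f by the cobweb-style iteration
    x <- x + (target - f(x)) / L, where L bounds f' from above.
    The iteration map is a contraction whenever 0 < f'(x) <= L."""
    x = x0
    for _ in range(max_iter):
        r = target - f(x)
        if abs(r) < tol:
            break
        x += r / slope_bound
    return x

# stand-in saturating curve (an illustration, NOT the CMH model itself)
h = invert_cobweb(math.tanh, 0.5, slope_bound=1.0)
```

    Here the iteration recovers h = atanh(0.5); the slope bound trades convergence speed for guaranteed stability, mirroring the flexibility/computation-time trade-off mentioned in the abstract.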

  17. Preparations for a high gradient inverse free electron laser experiment at Brookhaven national laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Duris, J.; Li, R. K.; Musumeci, P.; Sakai, Y.; Threlkeld, E.; Williams, O.; Fedurin, M.; Kusche, K.; Pogorelsky, I.; Polyanskiy, M.; Yakimenko, V. [UCLA Department of Physics and Astronomy, Los Angeles, CA 90095 (United States); Accelerator Test Facility, Brookhaven National Laboratory, Upton, NY, 11973 (United States)

    2012-12-21

    Preparations for an inverse free electron laser experiment at Brookhaven National Laboratory's Accelerator Test Facility are presented. Details of the experimental setup, including beam and laser transport optics, are first discussed. Next, the driving laser pulse structure is investigated, and initial diagnostics are explored and compared to simulations. Finally, planned improvements to the experimental setup are discussed.

  18. Efficient dam break flood simulation methods for developing a preliminary evacuation plan after the Wenchuan Earthquake

    Directory of Open Access Journals (Sweden)

    Y. Li

    2012-01-01

    Full Text Available The Xiaojiaqiao barrier lake, the second largest barrier lake formed by the Wenchuan Earthquake, seriously threatened the lives and property of the population downstream. The lake was finally dredged successfully on 7 June 2008. Because of the limited time available to conduct an inundation potential analysis and make an evacuation plan, barrier lake information extraction and real-time dam break flood simulation had to be carried out quickly, integrating remote sensing and geographic information system (GIS) techniques with hydrologic/hydraulic analysis. In this paper, a technical framework and several key techniques for this real-time preliminary evacuation planning are introduced. An object-oriented method was used to extract hydrological information on the barrier lake from unmanned aerial vehicle (UAV) remote sensing images. The real-time flood routing was calculated using the shallow-water equations, which were solved by means of a finite volume scheme on multiblock structured grids. The results of the hydraulic computations are visualized and analyzed in a 3-D geographic information system for inundation potential analysis, and an emergency response plan is made. The results show that if either a full-break or a half-break situation had occurred for the Chapinghe barrier lake on 19 May 2008, then the Xiaoba Town region and the Sangzao Town region would have been affected, but the downstream towns would have been less influenced. Preliminary evacuation plans under different dam break situations can be effectively made using these methods.
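
    The hydraulic core of such a computation can be sketched in one dimension: the shallow-water equations advanced by a Lax-Friedrichs finite-volume update over a dam-break initial condition. This is only the 1-D essence of the paper's 2-D multiblock scheme, with invented numbers:

```python
G = 9.81  # gravitational acceleration, m/s^2

def lf_step(h, hu, dx, dt):
    """One Lax-Friedrichs update of the 1-D shallow-water equations
    with reflective walls (mirror depth, negate momentum)."""
    n = len(h)
    H = [h[0]] + h + [h[-1]]          # ghost cells
    HU = [-hu[0]] + hu + [-hu[-1]]

    def flux(i):
        u = HU[i] / H[i]
        return HU[i], HU[i] * u + 0.5 * G * H[i] * H[i]

    nh, nhu = [], []
    c = dt / (2.0 * dx)
    for i in range(1, n + 1):
        fL, gL = flux(i - 1)
        fR, gR = flux(i + 1)
        nh.append(0.5 * (H[i - 1] + H[i + 1]) - c * (fR - fL))
        nhu.append(0.5 * (HU[i - 1] + HU[i + 1]) - c * (gR - gL))
    return nh, nhu

# dam break: 2 m of water on the left, 1 m on the right
n, dx, dt = 100, 1.0, 0.05            # CFL number ~ 0.25 here
h = [2.0 if i < n // 2 else 1.0 for i in range(n)]
hu = [0.0] * n
for _ in range(100):
    h, hu = lf_step(h, hu, dx, dt)
```

    With mirrored ghost cells the update conserves total water volume exactly; a real evacuation study would replace this with the paper's 2-D scheme and terrain data.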

  19. Discharge planning simulation: training the interprofessional team for the future workplace.

    Science.gov (United States)

    Kraft, Sara; Wise, Holly H; Jacques, Paul F; Burik, Jerry K

    2013-01-01

    The integration of interprofessional education (IPE) into health professions curricula offers a possible way to increase collaboration among health professionals. In this paper we introduce an innovative IPE model of a team-based discharge planning case scenario. Occupational therapy, physician assistant, and physical therapy students (n=173) participated in a discharge planning simulation (DPS) focused on a patient with a stroke and subsequent hip fracture. A discharge-planning meeting DVD was developed and disseminated to the students. Pre and post surveys were sent to the students. Eighty-nine percent (n=153) of the students responded to the pre-DPS survey and 77% (n=132) responded to the post-DPS survey. There was no significant difference when comparing pre-DPS or post-DPS by program of study, but significant differences were found in three of the four questions when comparing individual answers. Participation in the DPS model resulted in significant changes in perception of a student's individual role as well as the role of their interprofessional team members in discharge planning for the complex patient. Preliminary results indicate that this model can be a useful tool to effectively teach the interprofessional team.

  20. Simulation-Based Planning and Control of Transport Flows in Port Logistic Systems

    Directory of Open Access Journals (Sweden)

    Antonio Diogo Passos Lima

    2015-01-01

    Full Text Available In highly dynamic and uncertain transport conditions, transport transit time has to be continuously monitored so that the service level is ensured at a proper cost. The aim of this research is to propose and to test a procedure which allows an agile planning and control of transport flows in port logistic systems. The procedure couples an agent-based simulation and a queueing theory model. In this paper, the transport scheduling performed by an agent at the intermodal terminal was taken into consideration. The decision-making agent takes into account data which is acquired in remote points of the system. The obtained results indicate the relevance of continuously considering, for the transport planning and control, the expected transit time and further waiting times along port logistic systems.
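
    The queueing-theory side of such a procedure can be as simple as an M/M/1 estimate of the expected transit time through a terminal. A sketch under that assumption (rates are illustrative):

```python
def expected_transit(arrival_rate, service_rate, travel_time):
    """Expected door-to-door transit time through one terminal modeled as
    an M/M/1 queue: travel + mean queueing delay + mean handling time.
    Requires arrival_rate < service_rate (a stable queue)."""
    lam, mu = float(arrival_rate), float(service_rate)
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    wq = lam / (mu * (mu - lam))       # M/M/1 mean wait in queue
    return travel_time + wq + 1.0 / mu
```

    As utilization grows, the queueing term dominates the estimate, which is why the planning agent must track expected waiting times along the port logistic chain.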

  1. Semi-automatic watershed medical image segmentation methods for customized cancer radiation treatment planning simulation

    International Nuclear Information System (INIS)

    Kum Oyeon; Kim Hye Kyung; Max, N.

    2007-01-01

    A cancer radiation treatment planning simulation requires image segmentation to define the gross tumor volume, clinical target volume, and planning target volume. Manual segmentation, which is usual in clinical settings, depends on the operator's experience and may, in addition, change between trials by the same operator. To overcome this difficulty, we developed semi-automatic watershed medical image segmentation tools using both the top-down watershed algorithm in the Insight Segmentation and Registration Toolkit (ITK) and Vincent-Soille's bottom-up watershed algorithm with region merging. We applied our algorithms to segment two- and three-dimensional head phantom CT data and to find pixel (or voxel) numbers for each segmented area, which are needed for radiation treatment optimization. A semi-automatic method is useful to avoid errors incurred by both human and machine sources, and provides clear and visible information for pedagogical purposes. (orig.)
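
    The bottom-up watershed idea can be sketched without any toolkit: let every pixel follow its steepest descending neighbour to a local minimum, and group pixels by the minimum they drain to. This toy version ignores plateaus and the region-merging step that the paper's Vincent-Soille implementation handles:

```python
def watershed_labels(img):
    """Toy bottom-up watershed: every pixel follows its steepest strictly
    descending 4-neighbour until it reaches a local minimum; pixels that
    drain to the same minimum form one basin."""
    rows, cols = len(img), len(img[0])
    labels = {}          # pixel -> basin id
    n_basins = 0

    def descend(p):
        nonlocal n_basins
        path = []
        while p not in labels:
            path.append(p)
            r, c = p
            best, best_v = None, img[r][c]
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < rows and 0 <= nc < cols and img[nr][nc] < best_v:
                    best, best_v = (nr, nc), img[nr][nc]
            if best is None:             # local minimum: start a new basin
                labels[p] = n_basins
                n_basins += 1
                break
            p = best
        lab = labels[p]
        for q in path:                   # memoize the whole drainage path
            labels[q] = lab
        return lab

    return [[descend((r, c)) for c in range(cols)] for r in range(rows)]
```

    On an image with two well-separated minima this yields two basins; the basin sizes are exactly the per-region pixel counts the abstract mentions needing for treatment optimization.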

  2. Accuracy of standard measures of family planning service quality: findings from the simulated client method.

    Science.gov (United States)

    Tumlinson, Katherine; Speizer, Ilene S; Curtis, Siân L; Pence, Brian W

    2014-12-01

    In the field of international family planning, quality of care as a reproductive right is widely endorsed, yet we lack validated data-collection instruments that can accurately assess quality in terms of its public health importance. This study, conducted within 19 public and private facilities in Kisumu, Kenya, used the simulated client method to test the validity of three standard data-collection instruments used in large-scale facility surveys: provider interviews, client interviews, and observation of client-provider interactions. Results found low specificity and low positive predictive values in each of the three instruments for a number of quality indicators, suggesting that the quality of care provided may be overestimated by traditional methods of measurement. Revised approaches to measuring family planning service quality may be needed to ensure accurate assessment of programs and to better inform quality-improvement interventions. © 2014 The Population Council, Inc.
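
    The validity measures used in this study reduce to a confusion matrix between facility-reported indicators and the simulated-client gold standard. A minimal sketch:

```python
def validity_metrics(reported, observed):
    """Sensitivity, specificity and positive predictive value (PPV) of a
    facility-reported quality indicator against the simulated-client
    observation taken as the gold standard (0/1 flags)."""
    pairs = list(zip(reported, observed))
    tp = sum(1 for r, o in pairs if r and o)
    fp = sum(1 for r, o in pairs if r and not o)
    tn = sum(1 for r, o in pairs if not r and not o)
    fn = sum(1 for r, o in pairs if not r and o)
    sensitivity = tp / (tp + fn) if tp + fn else None
    specificity = tn / (tn + fp) if tn + fp else None
    ppv = tp / (tp + fp) if tp + fp else None
    return sensitivity, specificity, ppv
```

    A low specificity or PPV means the instrument frequently reports care that the simulated client never received, which is exactly how traditional instruments overstate quality.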

  3. Telematics-based online client-server/client collaborative environment for radiotherapy planning simulations.

    Science.gov (United States)

    Kum, Oyeon

    2007-11-01

    Customized cancer radiation treatment planning for each patient is very useful for both a patient and a doctor because it provides the ability to deliver higher doses to a more accurately defined tumor and at the same time lower doses to organs at risk and normal tissues. This can be realized by building an accurate planning simulation system to provide better treatment strategies based on each patient's tomographic data such as CT, MRI, PET, or SPECT. In this study, we develop a real-time online client-server/client collaborative environment between the client (health care professionals or hospitals) and the server/client under a secure network using telematics (the integrated use of telecommunications and medical informatics). The implementation is based on a point-to-point communication scheme between client and server/client following the WYSIWIS (what you see is what I see) paradigm. After uploading the patient tomographic data, the client is able to collaborate with the server/client for treatment planning. Consequently, the level of health care services can be improved, specifically for small radiotherapy clinics in rural or remote areas that do not possess much experience or equipment such as a treatment planning simulator. The telematics service of the system can also be used to provide continuing medical education in radiotherapy. Moreover, the system is easy to use: a client can use the system if s/he is familiar with the Windows(TM) operating system, because it is designed and built on a user-friendly concept. The system does not require the client to perform ongoing hardware and software maintenance or updates; these are performed automatically by the server.
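
    The WYSIWIS point-to-point scheme boils down to sending the same view-state update to both endpoints so each renders an identical scene. A toy sketch using a local socket pair; the JSON message shape is invented for illustration and is not the system's actual protocol:

```python
import json
import socket

def send_state(sock, state):
    """Ship one view-state update as newline-delimited JSON."""
    sock.sendall(json.dumps(state).encode("utf-8") + b"\n")

def recv_state(sock):
    """Read one newline-delimited JSON view-state update."""
    buf = b""
    while not buf.endswith(b"\n"):
        chunk = sock.recv(1024)
        if not chunk:
            raise ConnectionError("peer closed")
        buf += chunk
    return json.loads(buf)

# a local socket pair stands in for the client <-> server/client link
a, b = socket.socketpair()
send_state(a, {"slice": 42, "zoom": 1.5})
peer_view = recv_state(b)       # the peer now renders the same view
a.close()
b.close()
```

    In a deployed system the socket pair would be a secured network connection, with the same framing applied in both directions.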

  4. Faster-than-real-time robot simulation for plan development and robot safety

    International Nuclear Information System (INIS)

    Crane, C.D. III; Dalton, R.; Ogles, J.; Tulenko, J.S.; Zhou, X.

    1990-01-01

    The University of Florida, in cooperation with the Universities of Texas, Tennessee, and Michigan and Oak Ridge National Laboratory (ORNL), is developing an advanced robotic system for the US Department of Energy under the University Program for Robotics for Advanced Reactors. As part of this program, the University of Florida has been pursuing the development of a faster-than-real-time robotic simulation program for planning and control of mobile robotic operations to ensure the efficient and safe operation of mobile robots in nuclear power plants and other hazardous environments

  5. Inverse Kinematics Solution and Verification of 4-DOF Hydraulic Manipulator

    Science.gov (United States)

    Chao, Zhiqiang; Wang, Fei; Zhang, Chuanqing; Li, Huaying

    2017-10-01

    Aimed at a four-degree-of-freedom (DOF) hydraulic manipulator, an inverse kinematics solution from Cartesian space to drive space is proposed based on a geometrical method. Given the structural and drive characteristics of the manipulator, the forward kinematics is derived using the D-H method, and the position and orientation of the manipulator’s end-effector are obtained under the kinematic constraints. By analyzing the structure, the inverse kinematics solution of the manipulator is obtained, and the conversion between drive space and joint space follows from the mechanism kinematics. To meet the needs of motion planning and control of the manipulator, the inverse kinematics and the conversion are validated in simulation.
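
    The geometric approach is easiest to see on a planar two-link arm, to which such manipulators typically reduce in their main working plane. A closed-form sketch (link lengths and targets are illustrative), validated by a forward-kinematics round trip:

```python
import math

def ik_2link(x, y, l1, l2, elbow_up=True):
    """Closed-form (geometric) inverse kinematics of a planar 2-link arm:
    law of cosines for the elbow angle, then a triangle correction for
    the shoulder angle. Two mirror-image solutions exist."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    s2 = math.sqrt(1.0 - c2 * c2) * (1.0 if elbow_up else -1.0)
    q2 = math.atan2(s2, c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return q1, q2

def fk_2link(q1, q2, l1, l2):
    """Forward kinematics, used to check the inverse solution."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))
```

    The remaining joint-space-to-drive-space conversion would map each q to a hydraulic cylinder length through the linkage geometry, which is specific to the mechanism.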

  6. Simulating behaviour change interventions based on the theory of planned behaviour: Impacts on intention and action.

    Science.gov (United States)

    Fife-Schaw, Chris; Sheeran, Paschal; Norman, Paul

    2007-03-01

    The theory of planned behaviour (TPB; Ajzen, 1991) has been used extensively to predict social and health behaviours. However, a critical test of the TPB is whether interventions that increased scores on the theory's predictors would engender behaviour change. The present research deployed a novel technique in order to provide this test. Statistical simulations were conducted on data for 30 behaviours (N=211) that estimated the impact of interventions that generated maximum positive changes in attitudes, subjective norms and perceived behavioural control (PBC) on subsequent intentions and behaviour. Findings indicated that interventions that maximized TPB variables had a substantial impact on behavioural intentions. Although TPB maximization increased the proportion of the sample that performed respective behaviours by 28% compared with baseline, the behaviour of a substantial minority of the sample (26%) did not change. The research also identified several interactions among TPB variables in predicting simulated intention and behaviour scores and investigated the mediating role of intentions in predicting behaviour.
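
    The simulation logic can be sketched as follows: score each respondent's intention from the TPB predictors, set the predictors to their scale maximum, and compare the proportions acting. All coefficients, thresholds, and the non-TPB "barrier" term below are invented to mirror the finding that maximization does not convert everyone:

```python
import random

def simulate_tpb(n=211, seed=1):
    """Toy TPB maximization experiment (all coefficients invented).
    Intention is a weighted sum of attitude, subjective norm and PBC on
    1-5 scales; acting also requires the absence of an individual
    non-TPB barrier, so maximization cannot convert everyone."""
    rng = random.Random(seed)
    people = [(rng.uniform(1, 5), rng.uniform(1, 5), rng.uniform(1, 5),
               rng.random()) for _ in range(n)]

    def performs(att, sn, pbc, barrier):
        intention = 0.4 * att + 0.2 * sn + 0.3 * pbc
        return intention > 3.0 and barrier > 0.2

    baseline = sum(performs(*p) for p in people) / n
    maximized = sum(performs(5, 5, 5, p[3]) for p in people) / n
    return baseline, maximized

baseline, maximized = simulate_tpb()
```

    The maximized proportion rises well above baseline yet stays below 1.0, echoing the residual minority whose behaviour the TPB variables alone do not move.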

  7. Industrial Robotics Platform for Simulation Design, Planning and Optimization based on Off-line CAD Programming

    Directory of Open Access Journals (Sweden)

    Baizid Khelifa

    2016-01-01

    Full Text Available This paper presents IRoSim, an Industrial Robotics Simulation, Design, Planning and Optimization platform that we developed based on the SolidWorks API. The main objective is to integrate features from mechanical and robotics CAD software into the same platform in order to facilitate the development process through a friendly interaction interface. The platform provides the important steps in developing a given robotized task, such as: defining the task, CAD learning of the end-effector’s trajectory, checking the manipulator’s reachability to perform the task, simulating the motion, and preventing the trajectory from possible collisions. To assess the usability of the proposed platform, a car-door painting task using a 6-DOF industrial manipulator has been developed.

  8. Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors

    Science.gov (United States)

    Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.

    2017-12-01

    We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs.The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate.The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: If the physical act of sensing has a cost, how does the system determine if the science value of a measurement is worth the cost and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. 
The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data

  9. Thermal measurements and inverse techniques

    CERN Document Server

    Orlande, Helcio RB; Maillet, Denis; Cotta, Renato M

    2011-01-01

    With its uncommon presentation of instructional material regarding mathematical modeling, measurements, and solution of inverse problems, Thermal Measurements and Inverse Techniques is a one-stop reference for those dealing with various aspects of heat transfer. Progress in mathematical modeling of complex industrial and environmental systems has enabled numerical simulations of most physical phenomena. In addition, recent advances in thermal instrumentation and heat transfer modeling have improved experimental procedures and indirect measurements for heat transfer research of natural phenomena.

  10. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density.
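
    The explicit posterior in such a linearized Gaussian setting follows the standard linear-Gaussian update. A generic sketch for a forward model d = Gm + e (not Buland's actual convolutional operator), using NumPy:

```python
import numpy as np

def gaussian_posterior(G, d, mu_p, C_p, C_e):
    """Posterior mean and covariance for m ~ N(mu_p, C_p) under the
    linear forward model d = G m + e with e ~ N(0, C_e)."""
    Ci_p = np.linalg.inv(C_p)
    Ci_e = np.linalg.inv(C_e)
    C_post = np.linalg.inv(G.T @ Ci_e @ G + Ci_p)
    m_post = C_post @ (G.T @ Ci_e @ d + Ci_p @ mu_p)
    return m_post, C_post

# scalar illustration: prior N(0, 1), one observation d = 2 with unit noise
m_post, C_post = gaussian_posterior(
    np.array([[1.0]]), np.array([2.0]),
    np.array([0.0]), np.array([[1.0]]), np.array([[1.0]]))
```

    In the scalar example the posterior mean 1.0 sits halfway between the prior mean 0 and the datum 2, and the posterior variance is half the prior variance; the closed form is what makes the inversion fast and gives exact prediction intervals.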

  11. Production Planning with Respect to Uncertainties. Simulator Based Production Planning of Average Sized Combined Heat and Power Production Plants; Produktionsplanering under osaekerhet. Simulatorbaserad produktionsplanering av medelstora kraftvaermeanlaeggningar

    Energy Technology Data Exchange (ETDEWEB)

    Haeggstaahl, Daniel [Maelardalen Univ., Vaesteraas (Sweden); Dotzauer, Erik [AB Fortum, Stockholm (Sweden)

    2004-12-01

    Production planning in Combined Heat and Power (CHP) systems is considered. The focus is on development and use of mathematical models and methods. Different aspects of production planning are discussed, including weather and load predictions. Questions relevant to the different planning horizons are illuminated. The main purpose of short-term (one week) planning is to decide when to start and stop the production units, and to decide how to use the heat storage. The main conclusion from the outline of pros and cons of commercial planning software is that several use Mixed Integer Programming (MIP); in that sense they are similar. Building a production planning model means that the planning problem is formulated as a mathematical optimization problem. The accuracy of the input data determines the practical detail level of the model. Two alternatives to the methods used in today's commercial programs are proposed: stochastic optimization and simulator-based optimization. The basic concepts of mathematical optimization are outlined. A simulator-based model for short-term planning is developed. The purpose is to minimize the production costs, depending on the heat demand in the district heating system, prices of electricity and fuels, emission taxes and fees, etc. The problem is simplified by not including any time-linking conditions. The process model is developed in IPSEpro, a heat and mass-balance software from SimTech Simulation Technology. TOMLAB, an optimization toolbox in MATLAB, is used as the optimizer. Three different solvers are applied: glcFast, glcCluster and SNOPT. The link between TOMLAB and IPSEpro is accomplished using the Microsoft COM technology. MATLAB is the automation client and controls IPSEpro and TOMLAB. The simulator-based model is applied to the CHP plant in Eskilstuna. Two days are chosen and analyzed. The optimized production is compared to the measured production. A sensitivity analysis on how variations in outdoor
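
    Simulator-based optimization in this sense treats the process model as a black box: the optimizer only queries it for a cost. A toy sketch of that loop, with an invented stand-in "simulator" in place of IPSEpro and a plain grid search in place of TOMLAB's solvers:

```python
def plant_cost(boiler_on, load, heat_demand):
    """Stand-in 'simulator': operating cost of one dispatch decision, or
    None when the heat demand cannot be met (all coefficients invented)."""
    if not boiler_on:
        return 0.0 if heat_demand == 0.0 else None
    heat = 10.0 * load                    # MW heat delivered at this load
    if heat + 1e-9 < heat_demand:
        return None                       # infeasible dispatch
    fuel = 2.0 + 30.0 * load              # start/idle cost plus fuel
    revenue = 8.0 * load                  # electricity sold alongside heat
    return fuel - revenue

def best_dispatch(heat_demand, steps=101):
    """Derivative-free search over the decision grid, querying the
    simulator for each candidate, in the spirit of the IPSEpro/TOMLAB loop."""
    best = None
    for boiler_on in (False, True):
        for k in range(steps):
            load = k / (steps - 1)
            cost = plant_cost(boiler_on, load, heat_demand)
            if cost is not None and (best is None or cost < best[0]):
                best = (cost, boiler_on, load)
    return best
```

    Because there are no time-linking conditions, each hour can be optimized independently exactly as in the simplified model above; global solvers such as glcFast replace the brute-force grid in practice.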

  12. Temporary Workforce Planning with Firm Contracts: A Model and a Simulated Annealing Heuristic

    Directory of Open Access Journals (Sweden)

    Muhammad Al-Salamah

    2011-01-01

    Full Text Available The aim of this paper is to introduce a model for temporary staffing when temporary employment is managed by firm contracts and to propose a simulated annealing-based method to solve the model. Temporary employment is a policy frequently used to adjust the working hour capacity to fluctuating demand. Temporary workforce planning models have been unnecessarily simplified to account for only periodic hiring and laying off; a company can review its workforce requirement every period and make hire-fire decisions accordingly, usually with a layoff cost. We present a more realistic temporary workforce planning model that assumes a firm contract between the worker and the company, which can extend to several periods. The model assumes the traditional constraints, such as inventory balance constraints, worker availability, and labor hour mix. The costs are the inventory holding cost, training cost of the temporary workers, and the backorder cost. The mixed integer model developed for this case has been found to be difficult to solve even for small problem sizes; therefore, a simulated annealing algorithm is proposed to solve the mixed integer model. The performance of the SA algorithm is compared with the CPLEX solution.
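
    A generic simulated annealing loop of the kind used here accepts worsening moves with probability exp(-delta/T) under a cooling schedule. The toy staffing instance below (demands, wage and backorder costs) is invented for illustration:

```python
import math
import random

def anneal(cost, neighbor, x0, t0=10.0, cooling=0.995, iters=5000, seed=3):
    """Generic simulated annealing: always accept improvements, accept a
    worsening move with probability exp(-delta/T), cool T geometrically."""
    rng = random.Random(seed)
    x, cx = x0, cost(x0)
    best, cbest = x, cx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy <= cx or rng.random() < math.exp(-(cy - cx) / t):
            x, cx = y, cy
            if cx < cbest:
                best, cbest = x, cx
        t *= cooling
    return best, cbest

# toy staffing instance: weekly temp-worker levels against demand
demand = [3, 5, 4, 6, 2]

def staffing_cost(w):
    hire = 4 * sum(w)                                   # wage/training
    backorder = sum(10 * max(d - x, 0) for d, x in zip(demand, w))
    return hire + backorder

def tweak(w, rng):
    w = list(w)
    i = rng.randrange(len(w))
    w[i] = max(0, w[i] + rng.choice((-1, 1)))           # +/- one worker
    return w

plan, plan_cost_value = anneal(staffing_cost, tweak, [0] * 5)
```

    On this separable instance the optimum is to match demand exactly (cost 80); the real model adds inventory balance, contract-length and labor-mix constraints that make the landscape far less benign.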

  13. Path planning and kinematics simulation of surfacing cladding for hot forging die

    Directory of Open Access Journals (Sweden)

    Wang Huajun

    2015-01-01

    Full Text Available During the course of their work, a variety of damage and failure modes of hot forging dies occur and seriously affect the service life. A multi-layer metal hot forging die with a functionally graded material structure can effectively extend the service life. In this paper, according to the needs of strengthening the forging cavity, the CAD model of a surfacing forming center was designed. Based on the technological requirements of surface cladding for the die cavity, the coupled movement equation of the weld torch was established, and the kinematics of the welding positioner and the Cartesian robot were solved. The weld torch path was planned according to the typical methods used in plane welding, and the surfacing path data were extracted through secondary development with UG/OPEN. Then the kinematics solver program, which can output the control function for motion simulation, was written in MATLAB to solve the kinematics equation. Finally, in UG NX7.5, the kinematics simulation model was built to verify the correctness of the mathematical model and the rationality of the welding path planning. These studies can provide technical support for the repair and manufacturing of multi-layer metal forging dies.

  14. Computer-assisted surgical planning and simulation for unilateral condylar benign lesions causing facial asymmetry.

    Science.gov (United States)

    Lu, Chuan; He, Dongmei; Yang, Chi; Huang, Dong; Ellis, Edward

    2017-04-01

    The purpose of this study was to investigate the best surgical sequence for the treatment of unilateral condylar benign lesions causing facial asymmetry by applying computer-assisted surgical planning and simulation. Computed tomography (CT) data from 12 patients whose maxillary cant was corrected by maintaining the vertical position of the central incisors and equally intruding the long side of the maxilla and extruding the short side were analyzed with ProPlan CMF 1.4 software (Materialise Medical, Leuven, Belgium). Condylectomy and double-jaw orthognathic surgery with 2 different surgical sequences were simulated: 1) maxillary LeFort I osteotomy first (MaxF), then condylectomy, followed by bilateral sagittal split ramus osteotomy (BSSO); and 2) mandible first (ManF), beginning with condylectomy, then BSSO, and lastly LeFort I osteotomy. The greatest space between the maxillary and mandibular first molars in the interim positions was measured virtually to compare the 2 surgical sequences. The vertical distance between the upper and lower teeth of ManF patients was significantly smaller than that of MaxF patients (mean 2.99 mm, P < .001). When occlusal cants are corrected by equally intruding one side and extruding the other side of the maxillary dentition, the interim position is more conducive to sequencing corrective surgery by performing condylectomy, then BSSO, followed by LeFort I osteotomy. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Strategic energy planning: Modelling and simulating energy market behaviours using system thinking and systems dynamics principles

    International Nuclear Information System (INIS)

    Papageorgiou, George Nathaniel

    2005-01-01

    In the face of limited energy reserves and the global warming phenomenon, Europe is undergoing a transition from rapidly depleting fossil fuels to renewable, unconventional energy sources. During this transition period, energy shortfalls will occur and energy prices will increase in an oscillating manner. As a result of the turbulence and dynamics that will accompany the transition period, energy analysts need new appropriate methods, techniques and tools to develop forecasts of the behaviour of energy markets, which would assist in long-term strategic energy planning and policy analysis. This paper reviews energy market behaviour as related to policy formation and, from a dynamic point of view, through the use of "systems thinking" and "system dynamics" principles, provides a framework for modelling the energy production and consumption process in relation to its environment. Thereby, effective energy planning can be developed via computerised simulation using policy experimentation. In a demonstration model depicted in this paper, it is shown that disasters due to superficially attractive policies can be avoided by using simple computer simulation. (Author)
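A system dynamics model of the kind described is built from stocks (installed capacity, projects under construction) and flows (investment starts, completions, depreciation) integrated over time. The minimal stock-and-flow sketch below, with invented and uncalibrated parameters, shows the mechanism the paper relies on: a construction delay between the price signal and new capacity is what produces oscillating prices during a transition:

```python
# Minimal stock-and-flow sketch (illustrative parameters, not calibrated):
# demand responds to price; capacity responds to price only after a
# construction delay, so prices overshoot and oscillate.
def simulate(steps=200, dt=0.25):
    price, capacity, pipeline = 1.0, 1.0, 0.0
    prices = []
    for _ in range(steps):
        demand = 1.2 - 0.3 * price               # price-elastic demand
        gap = demand - capacity                  # shortage (+) or surplus (-)
        price = max(price + dt * 0.8 * gap, 0.01)
        starts = max(0.0, 0.5 * (price - 1.0))   # invest when price > unit cost
        completions = pipeline / 8.0             # ~8-period construction delay
        pipeline += dt * (starts - completions)
        capacity += dt * (completions - 0.02 * capacity)  # 2% depreciation
        prices.append(price)
    return prices

prices = simulate()
```

Policy experiments in this framework amount to changing a flow equation (e.g. subsidizing `starts`) and re-running the integration.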

  16. Quality assurance for online adapted treatment plans: Benchmarking and delivery monitoring simulation

    International Nuclear Information System (INIS)

    Li, Taoran; Wu, Qiuwen; Yang, Yun; Rodrigues, Anna; Yin, Fang-Fang; Jackie Wu, Q.

    2015-01-01

    Purpose: An important challenge facing online adaptive radiation therapy is the development of feasible and efficient quality assurance (QA). This project aimed to validate the deliverability of online adapted plans and develop a proof-of-concept online delivery monitoring system for online adaptive radiation therapy QA. Methods: The first part of this project benchmarked automatically online adapted prostate treatment plans using traditional portal dosimetry IMRT QA. The portal dosimetry QA results of online adapted plans were compared to original (unadapted) plans as well as randomly selected prostate IMRT plans from our clinic. In the second part, an online delivery monitoring system was designed and validated via a simulated treatment with intentional multileaf collimator (MLC) errors. This system was based on inputs from the dynamic machine information (DMI), which continuously reports actual MLC positions and machine monitor units (MUs) at intervals of 50 ms or less during delivery. Based on the DMI, the system performed two levels of monitoring/verification during the delivery: (1) dynamic monitoring of cumulative fluence errors resulting from leaf position deviations and visualization using fluence error maps (FEMs); and (2) verification of MLC positions against the treatment plan for potential errors in MLC motion and data transfer at each control point. Validation of the online delivery monitoring system was performed by introducing intentional systematic MLC errors (ranging from 0.5 to 2 mm) to the DMI files for both leaf banks. These DMI files were analyzed by the proposed system to evaluate the system’s performance in quantifying errors and revealing the source of errors, as well as to understand patterns in the FEMs. In addition, FEMs from 210 actual prostate IMRT beams were analyzed using the proposed system to further validate its ability to catch and identify errors, as well as establish error magnitude baselines for prostate IMRT delivery
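The fluence-error-map idea can be sketched for a single leaf pair: cumulative fluence at a point is the MU delivered while that point lies between the leaves, and the FEM is the delivered-minus-planned difference. The control points and the 1 mm systematic error below are invented toy values, not the paper's DMI data:

```python
import numpy as np

def fluence(control_points, grid):
    """Cumulative fluence: each control point contributes its MU wherever the
    grid point lies between the left and right leaf positions."""
    f = np.zeros_like(grid)
    for mu, left, right in control_points:
        f += mu * ((grid >= left) & (grid <= right))
    return f

grid = np.linspace(-50.0, 50.0, 201)               # mm along leaf travel
planned = [(1.0, -20.0, 0.0), (1.0, 0.0, 20.0)]    # (MU, left, right)
shift = 1.0                                        # 1 mm systematic leaf error
delivered = [(mu, left + shift, right) for mu, left, right in planned]
fem = fluence(delivered, grid) - fluence(planned, grid)   # fluence error map
```

In the real system this accumulation runs over the 50 ms DMI snapshots across all leaf pairs, giving a 2D map whose pattern reveals the error source.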

  17. Simulation of heat exchanger network (HEN) and planning the optimum cleaning schedule

    International Nuclear Information System (INIS)

    Sanaye, Sepehr; Niroomand, Behzad

    2007-01-01

    Modeling and simulation of heat exchanger networks for estimating the amount of fouling, variations in the overall heat transfer coefficient, and variations in the outlet temperatures of hot and cold streams has a significant effect on production analysis. In this analysis, parameters such as the exchangers' types and arrangements, their heat transfer surface areas, mass flow rates of hot and cold streams, heat transfer coefficients and variations of fouling with time are required input data. The main goal is to find the variations of the outlet temperatures of the hot and cold streams with time to plan the optimum cleaning schedule of heat exchangers that provides the minimum operational cost or maximum amount of savings. In this paper, the simulation of heat exchanger networks is performed by choosing an asymptotic fouling function. The two main parameters in the asymptotic fouling formation model, i.e. the decay time of fouling formation (τ) and the asymptotic fouling resistance (R_f∞), were obtained from empirical data as input parameters to the simulation relations. These data were extracted from the technical history sheets of the Khorasan Petrochemical Plant to guarantee the consistency between our model outputs and the real operating conditions. The output results of the software program developed, including the variations with time of the outlet temperatures of the hot and cold streams, the heat transfer coefficient and the heat transfer rate in the exchangers, are presented for two case studies. Then, an objective function (operational cost) was defined, and the optimal cleaning schedule of the HEN (heat exchanger network) in the Urea and Ammonia units was found by minimizing the objective function using a numerical search method. Based on this minimization procedure, the decision was made whether a heat exchanger should be cleaned or continue to operate. The final result was the most cost effective plan for the HEN cleaning schedule. The corresponding savings by
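The asymptotic fouling model referred to can be written directly: the fouling resistance approaches R_f∞ with decay time τ, and adds in series with the clean thermal resistance, degrading the overall coefficient U. The parameter values below are illustrative only, not the Khorasan plant data:

```python
import math

def fouling_resistance(t, r_inf, tau):
    """Asymptotic fouling: R_f(t) = R_f_inf * (1 - exp(-t / tau))."""
    return r_inf * (1.0 - math.exp(-t / tau))

def overall_u(t, u_clean, r_inf, tau):
    """Fouling adds in series with the clean resistance 1/U_clean."""
    return 1.0 / (1.0 / u_clean + fouling_resistance(t, r_inf, tau))

u_clean = 800.0            # W/(m^2 K), clean overall coefficient (invented)
r_inf, tau = 5e-4, 30.0    # m^2 K/W and days (invented)
u_now = overall_u(90.0, u_clean, r_inf, tau)
```

Feeding U(t) into the exchanger energy balances yields the drifting outlet temperatures whose cost penalty the cleaning schedule trades against cleaning cost.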

  18. MRI-based treatment plan simulation and adaptation for ion radiotherapy using a classification-based approach

    International Nuclear Information System (INIS)

    Rank, Christopher M; Tremmel, Christoph; Hünemohr, Nora; Nagel, Armin M; Jäkel, Oliver; Greilich, Steffen

    2013-01-01

    In order to benefit from the highly conformal irradiation of tumors in ion radiotherapy, sophisticated treatment planning and simulation are required. The purpose of this study was to investigate the potential of MRI for ion radiotherapy treatment plan simulation and adaptation using a classification-based approach. Firstly, a voxelwise tissue classification was applied to derive pseudo CT numbers from MR images using up to 8 contrasts. Appropriate MR sequences and parameters were evaluated in cross-validation studies of three phantoms. Secondly, ion radiotherapy treatment plans were optimized using both MRI-based pseudo CT and reference CT and recalculated on reference CT. Finally, a target shift was simulated and a treatment plan adapted to the shift was optimized on a pseudo CT and compared to reference CT optimizations without plan adaptation. The derivation of pseudo CT values led to mean absolute errors in the range of 81-95 HU. The most significant deviations appeared at borders between air and different tissue classes and originated from partial volume effects. Simulations of ion radiotherapy treatment plans using pseudo CT for optimization revealed only small underdosages in distal regions of a target volume, with deviations of the mean dose of the PTV between 1.4% and 3.1% compared to reference CT optimizations. A plan adapted to the target volume shift and optimized on the pseudo CT exhibited target dose coverage comparable to a non-adapted plan optimized on a reference CT. We were able to show that an MRI-based derivation of pseudo CT values using a purely statistical classification approach is feasible, although no direct physical relationship exists. Large errors appeared for compact bone classes and came from an imperfect distinction of bone and other tissue types in MRI. In simulations of treatment plans, it was demonstrated that these deviations are comparable to the uncertainties of a target volume shift of 2 mm in two directions, indicating that especially
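The classification-based derivation of pseudo CT numbers can be sketched as nearest-centroid labelling in a multi-contrast MR feature space followed by a class-to-HU lookup. The two-contrast centroids and HU values below are invented for illustration; the study used up to 8 contrasts and a statistical classifier:

```python
import numpy as np

CLASSES = {  # tissue class -> (centroid in 2-contrast feature space, HU)
    "air":    (np.array([0.05, 0.05]), -1000),
    "fat":    (np.array([0.90, 0.30]),  -100),
    "muscle": (np.array([0.50, 0.60]),    40),
    "bone":   (np.array([0.10, 0.80]),   700),
}

def pseudo_ct(voxels):
    """voxels: (N, 2) MR intensities -> (N,) pseudo CT numbers in HU."""
    names = list(CLASSES)
    centroids = np.stack([CLASSES[n][0] for n in names])
    hu_table = np.array([CLASSES[n][1] for n in names])
    # distance of every voxel to every class centroid
    dist = np.linalg.norm(voxels[:, None, :] - centroids[None, :, :], axis=2)
    return hu_table[dist.argmin(axis=1)]

hu = pseudo_ct(np.array([[0.00, 0.10], [0.88, 0.32], [0.12, 0.79]]))
```

The hard assignment is also where the reported bone/soft-tissue confusions arise: voxels near a class boundary flip to the wrong HU.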

  19. Offline motion planning and simulation of two-robot welding coordination

    Science.gov (United States)

    Zhang, Tie; Ouyang, Fan

    2012-03-01

    This paper focuses on the two-robot welding coordination of a complex curved seam, in which one robot grasps the workpiece and the other holds the torch, the two robots working on the same workpiece simultaneously. The paper first builds the dual-robot coordinate system, and a three-point calibration method for the two robots' relative base coordinate systems is presented. After that, a non-master/slave scheme is chosen for the motion planning: it sets the pose-versus-time function of a point u on the workpiece and automatically calculates the two robots' end-effector trajectories through the constrained relationship matrix. Moreover, downhand welding is employed, which guarantees that the torch and the seam stay in good contact throughout the welding. Finally, a SolidWorks/SimMechanics simulation platform is established and a simulation of curved steel pipe welding is conducted. The results of the simulation illustrate that the welding process can meet the requirements of downhand welding; the joint displacement curves are smooth and continuous, and no joint velocity exceeds its working range.
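The kinematic constraint that couples the two robots is a chain of homogeneous transforms: once the relative base transform is known from the three-point calibration, the torch pose can be expressed in the (moving) workpiece frame. The concrete poses below are invented for illustration, not the paper's calibration data:

```python
import numpy as np

def transform(rz_deg, txyz):
    """Homogeneous transform: rotation about z by rz_deg, then translation."""
    a = np.deg2rad(rz_deg)
    t = np.eye(4)
    t[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    t[:3, 3] = txyz
    return t

t_b1_work = transform(30.0, (0.5, 0.0, 0.3))    # workpiece in robot-1 base
t_b1_b2 = transform(0.0, (2.0, 0.0, 0.0))       # calibrated base-to-base
t_b2_torch = transform(-30.0, (0.4, 0.1, 0.5))  # torch in robot-2 base

# Torch pose expressed in the workpiece frame:
t_work_torch = np.linalg.inv(t_b1_work) @ t_b1_b2 @ t_b2_torch
```

Holding `t_work_torch` on the seam while keeping the weld pool downhand is exactly the constraint the non-master/slave planner solves at every time step.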

  20. Inverse planned stereotactic intensity modulated radiotherapy (IMRT) in the treatment of incompletely and completely resected adenoid cystic carcinomas of the head and neck: initial clinical results and toxicity of treatment

    International Nuclear Information System (INIS)

    Münter, MW; Schulz-Ertner, D; Hof, H; Nikoghosyan, A; Jensen, A; Nill, S; Huber, P; Debus, J

    2006-01-01

    Presenting the initial clinical results in the treatment of complex shaped adenoid cystic carcinomas (ACC) of the head and neck region by inverse planned stereotactic IMRT. 25 patients with huge ACC in different areas of the head and neck were treated. At the time of radiotherapy two patients already suffered from distant metastases. A complete resection of the tumor was possible in only 4 patients. The remaining patients were incompletely resected (R2: 20; R1: 1). 21 patients received an integrated boost IMRT (IBRT), which allow the use of different single doses for different target volumes in one fraction. All patients were treated after inverse treatment planning and stereotactic target point localization. The mean folllow-up was 22.8 months (91 – 1490 days). According to Kaplan Meier the three year overall survival rate was 72%. 4 patients died caused by a systemic progression of the disease. The three-year recurrence free survival was according to Kaplan Meier in this group of patients 38%. 3 patients developed an in-field recurrence and 3 patient showed a metastasis in an adjacent lymph node of the head and neck region. One patient with an in-field recurrence and a patient with the lymph node recurrence could be re-treated by radiotherapy. Both patients are now controlled. Acute side effects >Grade II did only appear so far in a small number of patients. The inverse planned stereotactic IMRT is feasible in the treatment of ACC. By using IMRT, high control rates and low side effects could by achieved. Further evaluation concerning the long term follow-up is needed. Due to the technical advantage of IMRT this treatment modality should be used if a particle therapy is not available

  1. Impact of urban planning on household's residential decisions: An agent-based simulation model for Vienna.

    Science.gov (United States)

    Gaube, Veronika; Remesch, Alexander

    2013-07-01

    Interest in assessing the sustainability of socio-ecological systems of urban areas has increased notably, with additional attention generated due to the fact that half the world's population now lives in cities. Urban areas face both a changing urban population size and increasing sustainability issues in terms of providing good socioeconomic and environmental living conditions. Urban planning has to deal with both challenges. Households play a major role by being affected by urban planning decisions on the one hand and by being responsible - among many other factors - for the environmental performance of a city (e.g. energy use). We here present an agent-based decision model referring to the city of Vienna, the capital of Austria, with a population of about 1.7 million (2.3 million within the metropolitan area, the latter being more than 25% of Austria's total population). Since the early 1990s, after decades of negative population growth, Vienna has been experiencing a steady increase in population, mainly driven by immigration. The aim of the agent-based decision model is to simulate new residential patterns of different household types based on demographic development and migration scenarios. Model results were used to assess spatial patterns of energy use caused by different household types in the four scenarios (1) conventional urban planning, (2) sustainable urban planning, (3) expensive centre and (4) no green area preference. Outcomes show that changes in preferences of households relating to the presence of nearby green areas have the most important impact on the distribution of households across the small-scale city area. Additionally, the results demonstrate the importance of the distribution of different household types regarding spatial patterns of energy use.
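The household decision step of such an agent-based model can be sketched as a discrete choice over zones scored by an accessibility term plus a household-specific green-area preference. The zones, scores and preference distribution below are invented, not the Vienna model's data; the point is that shifting the green preference redistributes households across the city:

```python
import random

ZONES = {  # invented scores per zone: accessibility and nearby green area
    "centre": {"access": 0.9, "green": 0.1},
    "inner":  {"access": 0.7, "green": 0.5},
    "fringe": {"access": 0.2, "green": 0.9},
}

def choose_zone(green_pref):
    """Each household maximizes access + green_pref * green."""
    return max(ZONES, key=lambda z: ZONES[z]["access"]
                                    + green_pref * ZONES[z]["green"])

def distribution(green_prefs):
    """Count where a population of households with given preferences settles."""
    counts = {z: 0 for z in ZONES}
    for g in green_prefs:
        counts[choose_zone(g)] += 1
    return counts

rng = random.Random(1)
counts = distribution([rng.uniform(0.0, 2.0) for _ in range(500)])
```

Scenario experiments such as "no green area preference" then correspond to changing the preference distribution and re-running the allocation.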

  3. Photon energy-modulated radiotherapy: Monte Carlo simulation and treatment planning study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jong Min; Kim, Jung-in; Heon Choi, Chang; Chie, Eui Kyu; Kim, Il Han; Ye, Sung-Joon [Interdisciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744, Korea and Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of); Interdisciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744 (Korea, Republic of); Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of); Interdisciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744 (Korea, Republic of) and Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of); Interdisciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744 (Korea, Republic of); Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of) and Department of Intelligent Convergence Systems, Seoul National University, Seoul, 151-742 (Korea, Republic of)

    2012-03-15

    Purpose: To demonstrate the feasibility of photon energy-modulated radiotherapy during beam-on time. Methods: A cylindrical device made of aluminum was conceptually proposed as an energy modulator. The frame of the device was connected to 20 tubes through which mercury could be injected or drained to adjust the thickness of mercury along the beam axis. In Monte Carlo (MC) simulations, the flattening filter of a 6 or 10 MV linac was replaced with the device. The thickness of mercury inside the device varied from 0 to 40 mm at field sizes of 5 x 5 cm^2 (FS5), 10 x 10 cm^2 (FS10), and 20 x 20 cm^2 (FS20). At least 5 billion histories were followed for each simulation to create phase space files at 100 cm source-to-surface distance (SSD). In-water beam data were acquired by additional MC simulations using the above phase space files. A treatment planning system (TPS) was commissioned to generate a virtual machine using the MC-generated beam data. Intensity modulated radiation therapy (IMRT) plans for six clinical cases were generated using conventional 6 MV, 6 MV flattening filter free, and energy-modulated photon beams of the virtual machine. Results: With increasing thickness of mercury, the percentage depth doses (PDDs) of modulated 6 and 10 MV beams beyond the depth of dose maximum increased continuously. The PDD increase at depths of 10 and 20 cm for modulated 6 MV was 4.8% and 5.2% at FS5, 3.9% and 5.0% at FS10, and 3.2%-4.9% at FS20 as the thickness of mercury increased from 0 to 20 mm. The corresponding increase for modulated 10 MV was 4.5% and 5.0% at FS5, 3.8% and 4.7% at FS10, and 4.1% and 4.8% at FS20 as the thickness of mercury increased from 0 to 25 mm. The outputs of modulated 6 MV with 20 mm mercury and of modulated 10 MV with 25 mm mercury were reduced to 30% and 56% of the conventional linac output, respectively. The energy-modulated IMRT plans had lower integral doses than 6 MV IMRT or 6 MV flattening filter free plans for tumors located in the
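The physics behind the reported PDD increase can be illustrated with a toy two-bin spectrum: mercury attenuates low-energy photons more strongly, so adding thickness both cuts output and raises the beam's mean energy (beam hardening), which deepens the depth-dose curve. The bin energies, weights and attenuation coefficients below are invented, not the MC data:

```python
import math

BINS = [  # (energy MeV, relative fluence, invented mu for mercury in 1/mm)
    (1.0, 0.7, 0.080),
    (4.0, 0.3, 0.045),
]

def transmitted(thickness_mm):
    """Relative output and mean energy after the mercury modulator."""
    weights = [w * math.exp(-mu * thickness_mm) for _, w, mu in BINS]
    output = sum(weights)
    mean_energy = sum(e * w for (e, _, _), w in zip(BINS, weights)) / output
    return output, mean_energy

out0, e0 = transmitted(0.0)     # no mercury
out20, e20 = transmitted(20.0)  # 20 mm mercury: less output, harder beam
```

The same trade-off appears in the abstract's numbers: a deeper PDD bought at the cost of output dropping to 30% of the conventional linac.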

  4. Eliminating Inconsistencies in Simulation and Treatment Planning Orders in Radiation Therapy

    International Nuclear Information System (INIS)

    Santanam, Lakshmi; Brame, Ryan S.; Lindsey, Andrew; Dewees, Todd; Danieley, Jon; Labrash, Jason; Parikh, Parag; Bradley, Jeffrey; Zoberi, Imran; Michalski, Jeff; Mutic, Sasa

    2013-01-01

    Purpose: To identify deficiencies in simulation and treatment planning orders and to develop corrective measures to improve safety and quality. Methods and Materials: At Washington University, the DMAIIC formalism is used for process management, whereby the process is understood as comprising Define, Measure, Analyze, Improve, Implement, and Control activities. Two complementary tools were used to provide quantitative assessments: failure modes and effects analysis, and reported event data. The events were classified by the user according to severity. The event rates (i.e., the number of events divided by the number of opportunities to generate an event) related to simulation and treatment plan orders were determined. Results: We analyzed event data from the period 2008-2009 to design an intelligent SIMulation and treatment PLanning Electronic (SIMPLE) order system. Before implementation of SIMPLE, event rates of 0.16 (420 of 2558) for a group of physicians subsequently used as a pilot group and 0.13 (787 of 6023) for all physicians were obtained. An interdisciplinary group evaluated the process and decided to replace the Microsoft Word-based form with a Web-based order system. This order system has mandatory fields and context-sensitive logic, supports templates, and enables automated communication of orders through an enterprise management system. After the implementation of SIMPLE, the event rate decreased to 0.09 (96 of 1001) for the pilot group and to 0.06 (145 of 2140) for all physicians (P<.0001). The average time to complete the SIMPLE form was 3 minutes, compared with 7 minutes for the Word-based form. The number of severe events decreased from 10.7% (45 of 420) and 12.1% (96 of 787) to 6.2% (6 of 96) and 10.3% (15 of 145) for the pilot group and all physicians, respectively. Conclusions: There was a dramatic reduction in the total number of events and in the number of potentially severe events through use of the SIMPLE system. In addition
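The event-rate arithmetic used in the study is simply events divided by opportunities; using the counts reported for all physicians, the relative reduction after SIMPLE follows directly:

```python
def event_rate(events, opportunities):
    """Rate as defined in the study: events per opportunity to generate one."""
    return events / opportunities

before = event_rate(787, 6023)   # all physicians, Word-based orders
after = event_rate(145, 2140)    # all physicians, SIMPLE orders
relative_reduction = 1.0 - after / before
```

That is roughly a halving of the event rate, consistent with the 0.13 to 0.06 figures quoted in the abstract.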

  5. Simulation of 3D-treatment plans in head and neck tumors aided by matching of digitally reconstructed radiographs (DRR) and on-line distortion corrected simulator images

    International Nuclear Information System (INIS)

    Lohr, Frank; Schramm, Oliver; Schraube, Peter; Sroka-Perez, Gabriele; Seeber, Steffen; Schlepple, Gerd; Schlegel, Wolfgang; Wannenmacher, Michael

    1997-01-01

    Background and purpose: Simulation of 3D-treatment plans for head and neck malignancy is difficult due to complex anatomy. Therefore, CT-simulation and stereotactic techniques are becoming more common in treatment preparation, obviating the need for conventional simulation. However, if simulation is still performed, it is an important step in the treatment preparation/execution chain, since simulation errors, if not detected immediately, can compromise the success of treatment. A recently developed PC-based system for on-line image matching and comparison of digitally reconstructed radiographs (DRRs) and distortion-corrected simulator monitor images, which enables instant correction of field placement errors during the simulation process, was evaluated. The range of field placement errors with non-computer-aided simulation is reported. Materials and methods: For 14 patients, either a primary 3D-treatment plan or a 3D-boost plan after initial treatment with opposing laterals for head and neck malignancy with a coplanar or non-coplanar two- or three-field technique was simulated. After determining the robustness of the matching process and the accuracy of field placement error detection with phantom measurements, DRRs were generated from the treatment planning CT dataset of each patient and were interactively matched with on-line simulator images that had undergone correction for geometrical distortion, using a landmark algorithm. Translational field placement errors in all three planes as well as in-plane rotational errors were studied and corrected immediately. Results: The interactive matching process is very robust, with a tolerance of <2 mm when suitable anatomical landmarks are chosen. The accuracy of detection of translational errors in phantom measurements was <1 mm, and for in-plane rotational errors the maximum deviation was only 1.5°. For patient simulation, the mean absolute distance of the planned versus simulated isocenter was 6.4 ± 3.9 mm. The in
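The landmark-based matching step can be sketched as a 2D rigid least-squares fit (Kabsch / orthogonal Procrustes): given matched landmarks on the DRR and the distortion-corrected simulator image, recover the in-plane rotation and translation, i.e. the field placement error to correct. The landmark coordinates below are synthetic:

```python
import numpy as np

def rigid_2d(src, dst):
    """Least-squares rotation R and translation t with dst ~ src @ R.T + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    h = (src - cs).T @ (dst - cd)            # cross-covariance of landmarks
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
    r = vt.T @ np.diag([1.0, d]) @ u.T
    return r, cd - r @ cs

# Synthetic field placement error: 1.5 deg in-plane rotation, (6, -2) mm shift
theta = np.deg2rad(1.5)
r_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [40.0, 5.0], [10.0, 30.0], [-25.0, 12.0]])
dst = src @ r_true.T + np.array([6.0, -2.0])
r_fit, t_fit = rigid_2d(src, dst)
```

With noisy landmarks the same fit returns the best rigid estimate, which is why landmark choice drives the <2 mm tolerance reported above.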

  6. Optimal expansion planning of stand-alone systems with stochastic simulations

    Energy Technology Data Exchange (ETDEWEB)

    Hoese, Alejandro [Instituto de Energia Electrica (IEE), Universidad Nacional de San Juan, (Argentina)

    1997-12-31

    Stand-alone systems in the range of 1 kW - 10 MW are gaining relevance in the new (global) liberal concept of the energy market. State and private investors are paying increasing attention to the use of renewables for these systems, but it must be shown that these non-conventional solutions are competitive with the established conventional ones. The high investment costs and the technical and economic uncertainties, coupled with the use of time-dependent energy sources, are the main factors inhibiting decision makers from choosing these systems instead of conventional ones. In this paper a new model for optimal expansion planning of hybrid stand-alone generating systems under consideration of uncertainties is presented. This model is at present in development. Results already obtained in the first steps of this research are promising and some of them are presented here.

  7. Simulation as a planning tool for job-shop production environment

    Science.gov (United States)

    Maram, Venkataramana; Nawawi, Mohd Kamal Bin Mohd; Rahman, Syariza Abdul; Sultan, Sultan Juma

    2015-12-01

    In this paper, we attempt to use the discrete event simulation software ARENA® as a planning tool for a job shop production environment. We considered a job shop that produces three types of jigs with different sequences of operations in order to study and improve shop floor performance. The purpose of the study is to identify options for improving machine utilization and reducing job waiting times at bottleneck machines. First, the performance of the existing system was evaluated using ARENA®. Improvement opportunities were then identified by analyzing the base system results. Second, the model was updated with the most economical options. The proposed system outperforms the current base system, with an 816% improvement in delay times at the paint shop (by increasing capacity from 2 to 3) and jig cycle time reductions of 92% for Jig1, 65% for Jig2 and 41% for Jig3; hence the new proposal was recommended.
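A discrete event simulation of the kind built in ARENA® can be sketched in plain Python for a single bottleneck machine: random arrivals, a service distribution, and statistics for utilization and queue delay. The arrival and service parameters below are invented, not the jig shop's data:

```python
import random

def simulate(n_jobs=1000, seed=42):
    """Single bottleneck machine: exponential arrivals, uniform service."""
    rng = random.Random(seed)
    t_arrive, busy_until, busy_time = 0.0, 0.0, 0.0
    delays = []
    for _ in range(n_jobs):
        t_arrive += rng.expovariate(1.0 / 10.0)  # mean 10 min between jobs
        service = rng.uniform(6.0, 10.0)         # minutes on the machine
        start = max(t_arrive, busy_until)        # wait if machine is busy
        delays.append(start - t_arrive)
        busy_until = start + service
        busy_time += service
    return busy_time / busy_until, sum(delays) / len(delays)

utilization, mean_delay = simulate()
```

Capacity experiments such as the paper's "increase from 2 to 3" then amount to re-running the model with an extra server and comparing the delay statistics.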

  8. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  9. A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.

    Science.gov (United States)

    Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus

    2016-11-01

    Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the great development of computer technologies, computer-aided surgery has been widely used for minimizing the risks and improving the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source on current and future developments in computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.

  10. WE-DE-201-01: BEST IN PHYSICS (THERAPY): A Fast Multi-Target Inverse Treatment Planning Strategy Optimizing Dosimetric Measures for High-Dose-Rate (HDR) Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Guthier, C [Brigham and Women’s Hospital, Boston, MA (United States); University Medical Center Mannheim, Mannheim (Germany); Harvard Medical School, Boston, MA (United States); Damato, A; Viswanathan, A; Cormack, R [Dana Farber Cancer Institute/Brigham and Women’s Hospital, Boston, MA (United States); Harvard Medical School, Boston, MA (United States); Hesser, J [University Medical Center Mannheim, Mannheim (Germany)

    2016-06-15

    Purpose: Inverse treatment planning (ITP) for interstitial HDR brachytherapy of gynecologic cancers seeks to maximize coverage of the clinical target volumes (tumor and vagina) while respecting dose-volume-histogram related dosimetric measures (DMs) for organs at risk (OARs). Commercially available ITP tools do not support DM-based planning because it is computationally too expensive to solve. In this study we present a novel approach that allows fast ITP for gynecologic cancers based on DMs for the first time. Methods: This novel strategy is an optimization model based on a smooth DM-based objective function. The smooth approximation is achieved by utilizing a logistic function for the evaluation of DMs. The resulting nonconvex and constrained optimization problem is then optimized with a BFGS algorithm. The model was evaluated using the implant geometry extracted from 20 patient treatment plans under an IRB-approved retrospective study. For each plan, the final DMs were evaluated and compared to the original clinical plans. The CTVs were the contoured tumor volume and the contoured surface of the vagina. Statistical significance was evaluated with a one-sided paired Wilcoxon signed-rank test. Results: As did the clinical plans, all generated plans fulfilled the defined DMs for OARs. The proposed strategy showed a statistically significant improvement (p<0.001) in coverage of the tumor and vagina, with absolute improvements of related DMs of (6.9 ± 7.9)% and (28.2 ± 12.0)%, respectively. This was achieved with a statistically significant (p<0.01) decrease of the high-dose-related DM for the tumor. The runtime of the optimization was (2.3 ± 2.0) seconds. Conclusion: We demonstrated using clinical data that our novel approach allows rapid DM-based optimization with improved coverage of CTVs with fewer hot spots. Being up to three orders of magnitude faster than the current clinical practice, the method dramatically shortens planning time.
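
The smoothing idea in this abstract can be sketched at toy scale: a dose-volume measure such as the fraction of voxels at or above a threshold dose is a sum of step functions, which gradient-based optimizers like BFGS cannot differentiate; replacing each step with a logistic sigmoid yields a smooth surrogate. The sketch below is illustrative only, not the authors' implementation; the voxel doses and the steepness parameter are invented for the example.

```python
import math

def smooth_fraction_above(doses, threshold, steepness=50.0):
    """Differentiable approximation of a DVH measure: the fraction of
    voxels receiving at least `threshold` dose. The hard indicator
    d >= threshold is replaced by a logistic sigmoid."""
    s = sum(1.0 / (1.0 + math.exp(-steepness * (d - threshold)))
            for d in doses)
    return s / len(doses)

doses = [0.8, 0.95, 1.1, 1.3, 0.7]   # toy voxel doses (arbitrary units)
v100 = smooth_fraction_above(doses, threshold=1.0)
```

As `steepness` grows, the smooth value approaches the exact fraction (here 2/5 = 0.4), while remaining differentiable everywhere so that a quasi-Newton method can use its gradient.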

  11. A surgical simulator for planning and performing repair of cleft lips.

    Science.gov (United States)

    Schendel, Stephen; Montgomery, Kevin; Sorokin, Andrea; Lionetti, Giancarlo

    2005-08-01

    The objective of this project was to develop a computer-based surgical simulation system for planning and performing cleft lip repair. This system allows the user to interact with a virtual patient to perform the traditional steps of cleft-lip repair (rotation-advancement technique). The system interfaces to force-feedback (haptic) devices to track the user's motion and provide feedback during the procedure, while performing real-time soft-tissue simulation. An 11-day-old unilateral cleft lip, alveolus and palate patient was previously CT scanned for ancillary diagnostic purposes using standard imaging protocols and 1mm slices. High-resolution 3D meshes were automatically generated from this data using the ROVE software developed in-house. The resulting 3D meshes of bone and soft tissue were instilled with physical properties of soft tissues for purposes of simulation. Once these preprocessing steps were completed, the patient's bone and soft tissue data are presented on the computer screen in stereo and the user can freely view, rotate, and otherwise interact with the patient's data in real time. The user is prompted to select anatomical landmarks on the patient's data for preoperative planning purposes, then their locations are compared against that of a 'gold standard' and a score, derived from their deviation from that standard and time required, is generated. The user can then move a haptic stylus and guide the motion of the virtual cutting tool. The soft tissues can thus be incised using this virtual cutting tool, moved using virtual forceps, and fused in order to perform any of the major procedures for cleft lip repair. Real-time soft tissue deformation of the mesh realistically simulates normal tissues and haptic-rate (>1 kHz) force-feedback is provided. The surgical result of the procedure can then be immediately visualized and the entire training process can be repeated at will. A short evaluation study was also performed. Two groups (non-medical and

  12. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  13. Human-robot collaborated path planning for bevel-tip needle steering in simulated human environment.

    Science.gov (United States)

    Jing Xiong; Zeyang Xia; Yangzhou Gan

    2016-08-01

    Clinical application of linear percutaneous needle insertion is restricted by issues such as limited path selection and needle deflection, so steering of flexible needles is in critical demand in the clinic. Previous studies tended to use autonomous methods to conduct path planning for needle steering. However, these methods had very limited adaptability, and they also reduced the human operator's domination of the operation, which is clinically required. Teleoperation has therefore been an option, but in complicated environments teleoperation alone is not sufficient for a human operator to generate a multi-curved insertion path. In this paper, we therefore propose a semiautonomous human-robot collaborative path planning method for teleoperated bevel-tip needle steering. The key module of this method is a human-robot collaboration mechanism that combines operator input, environment constraints, and path constraints. The proposed method was tested semi-physically in a simulated human environment, and the results validated that it can efficiently assist the operator in generating multi-curved paths while keeping the operator in command.

  14. Forward and Inverse Predictive Model for the Trajectory Tracking Control of a Lower Limb Exoskeleton for Gait Rehabilitation: Simulation modelling analysis

    Science.gov (United States)

    Zakaria, M. A.; Majeed, A. P. P. A.; Taha, Z.; Alim, M. M.; Baarath, K.

    2018-03-01

    The movement of a lower limb exoskeleton requires a reasonably accurate control method to allow an effective gait therapy session to transpire. Trajectory tracking is a nontrivial passive rehabilitation technique for correcting the motion of a patient's impaired limb. This paper proposes an inverse predictive model that is coupled with the forward kinematics of the exoskeleton to estimate the behaviour of the system. A conventional PID control system is used to drive the joint angles toward the desired input from the inverse predictive model. The present study demonstrates that the inverse predictive model is capable of meeting the trajectory demand within an acceptable error tolerance. The findings further suggest that the predictive model of the exoskeleton can issue correct joint angle commands to the system.
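
The conventional PID loop mentioned above can be sketched with a minimal discrete implementation. The first-order joint model (angle rate equal to the control signal), the gains, and the time step below are all hypothetical, chosen only to illustrate convergence toward a commanded joint angle; the actual exoskeleton controller is not described in this record.

```python
def pid_track(target, steps=2000, dt=0.01, kp=8.0, ki=2.0, kd=0.5):
    """Discrete PID loop driving a simple first-order joint model
    (angle_dot = u) toward a commanded joint angle `target` (rad)."""
    angle, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = target - angle
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        angle += u * dt          # integrate the toy plant dynamics
        prev_err = err
    return angle

final = pid_track(1.2)           # commanded angle of 1.2 rad
```

With these gains the closed loop is stable and the tracking error decays toward zero over the simulated interval.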

  15. Assembly Line Productivity Assessment by Comparing Optimization-Simulation Algorithms of Trajectory Planning for Industrial Robots

    Directory of Open Access Journals (Sweden)

    Francisco Rubio

    2015-01-01

    Full Text Available In this paper an analysis of productivity is carried out from the resolution of the trajectory planning problem for industrial robots. The analysis entails economic considerations, thus overcoming some limitations of the existing literature. Two methodologies based on optimization-simulation procedures are compared to calculate the time needed to perform an industrial robot task. The simulation methodology relies on the use of the robotics and automation software GRASP. The optimization methodology developed in this work is based on the kinematics and dynamics of industrial robots. It allows us to pose a multiobjective optimization problem to assess the trade-offs between the economic variables by means of Pareto fronts. The comparison is carried out for different examples and from a multidisciplinary point of view in order to determine the impact of using each method. Results show the opportunity costs of not using the methodology with optimized time trajectories. Furthermore, it allows companies to stay competitive through quick adaptation to rapidly changing markets.

  16. Exploring strategies in integrated container terminal planning tasks : A data-intensive simulation game analysis

    NARCIS (Netherlands)

    Kurapati, S.; Lukosch, H.K.; Cunningham, S.; Kwakkel, J.H.; Verbraeck, A.

    2016-01-01

    Planning tasks in modern, fully automated container terminals require a high awareness of the complex situation, and successful planning strategies. Operational planning includes both strategies of planning and resource management. As planning procedures are not (yet) fully automated, a skilled

  17. Normative models and healthcare planning: network-based simulations within a geographic information system environment.

    Science.gov (United States)

    Walsh, S J; Page, P H; Gesler, W M

    1997-06-01

    This study used network analysis to integrate patient, transportation, and hospital characteristics for healthcare planning and to assess the role of geographic information systems (GIS). A normative model of base-level responses of patient flows to hospitals, based on estimated travel times, was developed for this purpose. A GIS database was developed to include patient discharge data, locations of hospitals, US TIGER/Line files of the transportation network, enhanced address-range data, and U.S. Census variables. The study area included a 16-county region centered on the city of Charlotte and Mecklenburg County, North Carolina, and contained 25 hospitals serving nearly 2 million people over a geographic area of nearly 9,000 square miles. Normative models as a tool for healthcare planning were derived through a spatial network analysis and a distance optimization model implemented within a GIS. Scenarios were developed and tested that involved patient discharge data geocoded to the five-digit zip code, hospital locations geocoded to their individual addresses, and a transportation network of varying road types and corresponding estimated travel speeds, to examine both baseline patient discharge levels and a doubling of discharge levels associated with total discharges and DRG 391 (Normal Newborns). The network analysis used location/allocation modeling to optimize for travel time and integrated measures of supply, demand, and impedance. Patient discharge data from the North Carolina Medical Database Commission, address ranges from the North Carolina Institute for Transportation Research and Education, and U.S. Census TIGER/Line files were entered into the ARC/INFO GIS software system for analysis. A relational database structure was used to organize the information and to link spatial features to their attributes.
Advances in healthcare planning can be achieved by examining baseline responses of patient flows to distance optimization simulations and healthcare scenarios conducted
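
The location/allocation step described above, which integrates supply, demand, and impedance to optimize for travel time, can be sketched as a toy assignment of demand points to the hospital with the smallest estimated travel time. The coordinates, hospital names, and road speeds below are invented placeholders for illustration, not data from the study.

```python
import math

def travel_time(p, q, speed):
    """Estimated travel time between two points, assuming a straight
    road segment traversed at the road class's estimated speed."""
    return math.dist(p, q) / speed

def allocate(demand_points, hospitals):
    """Assign each demand point to the hospital with minimal estimated
    travel time -- the core of a simple location/allocation model."""
    assignment = {}
    for name, p in demand_points.items():
        best = min(hospitals, key=lambda h: travel_time(p, h[1], h[2]))
        assignment[name] = best[0]
    return assignment

# hypothetical zip-code centroids (miles) and hospitals
# (name, location, estimated road speed in mph)
demand = {"28202": (0.0, 0.0), "28078": (10.0, 8.0)}
hospitals = [("Hospital A", (1.0, 1.0), 30.0),
             ("Hospital B", (9.0, 9.0), 45.0)]
assignment = allocate(demand, hospitals)
```

A full GIS implementation would replace the straight-line distance with shortest paths over the TIGER/Line road network, but the objective being minimized is the same.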

  18. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    O'Hara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.

  19. A bi-level integrated generation-transmission planning model incorporating the impacts of demand response by operation simulation

    International Nuclear Information System (INIS)

    Zhang, Ning; Hu, Zhaoguang; Springer, Cecilia; Li, Yanning; Shen, Bo

    2016-01-01

    Highlights: • We put forward a novel bi-level integrated power system planning model. • Generation expansion planning and transmission expansion planning are combined. • The effects of two sorts of demand response in reducing peak load are considered. • Operation simulation is conducted to reflect the actual effects of demand response. • The interactions between the two levels guarantee a reasonably optimal result. - Abstract: If the resources on the power supply side, in the transmission network, and on the demand side are all considered together, an expansion scheme that is optimal from the perspective of the whole system can be achieved. In this paper, generation expansion planning and transmission expansion planning are combined into one model. Moreover, the effects of demand response in reducing peak load are taken into account in the planning model, which can cut back the required generation and transmission expansion capacity. Existing approaches to considering demand response in planning tend to overestimate its impact on peak load reduction: they usually focus on power reduction at the moment of peak load without considering that load demand at another moment may unexpectedly become the new peak load due to demand response. These situations are analyzed in this paper. Accordingly, a novel approach to incorporating demand response in a planning model is proposed. A modified unit commitment model with demand response is utilized. The planning model is thereby a bi-level model with interactions between generation-transmission expansion planning and operation simulation, which reflect the actual effects of demand response and lead to a reasonably optimal planning result.
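
The paper's key caveat, that naively subtracting demand response from the peak hour overestimates its benefit because shifted load can create a new peak elsewhere, can be demonstrated in a few lines. The hourly load profile and shift parameters below are invented for illustration.

```python
def peak_after_dr(load, reduction, shift_to):
    """Shift `reduction` MW away from the original peak hour, add it
    back at hour `shift_to`, and report the new system peak."""
    new = list(load)
    peak_hr = max(range(len(load)), key=load.__getitem__)
    new[peak_hr] -= reduction
    new[shift_to] += reduction
    return max(new)

load = [70.0, 72.0, 95.0, 100.0, 93.0]   # hypothetical hourly load (MW)
naive_peak = max(load) - 10.0            # naive estimate: 90 MW
actual_peak = peak_after_dr(load, reduction=10.0, shift_to=2)
```

Here the shifted 10 MW lands on an hour that was already near the peak, so the true post-DR peak (105 MW) exceeds both the naive estimate (90 MW) and the original peak, which is exactly why the paper couples planning with operation simulation.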

  20. Simulation in Pre-departure Training for Residents Planning Clinical Work in a Low-Income Country

    Directory of Open Access Journals (Sweden)

    Kevin R. Schwartz

    2015-12-01

    Full Text Available Introduction: Increasingly, pediatric and emergency medicine (EM) residents are pursuing clinical rotations in low-income countries. Optimal pre-departure preparation for such rotations has not yet been established. High-fidelity simulation represents a potentially effective modality for such preparation. This study was designed to assess whether a pre-departure high-fidelity medical simulation curriculum is effective in helping to prepare residents for clinical rotations in a low-income country. Methods: 43 pediatric and EM residents planning clinical rotations in Liberia, West Africa, participated in a simulation-based curriculum focused on severe pediatric malaria and malnutrition and were then assessed by survey at three time points: pre-simulation, post-simulation, and after returning from work abroad. Results: Prior to simulation, 1/43 (2%) of participants reported they were comfortable with the diagnosis and management of severe malnutrition; this increased to 30/42 (71%) after simulation and 24/31 (77%) after working abroad. Prior to simulation, 1/43 (2%) of residents reported comfort with the diagnosis and management of severe malaria; this increased to 26/42 (62%) after simulation and 28/31 (90%) after working abroad; 36/42 (86%) of residents agreed that a simulation-based global health curriculum is more useful than a didactic curriculum alone, and 41/42 (98%) felt a simulator-based curriculum should be offered to all residents planning a clinical trip to a low-income country. Conclusion: High-fidelity simulation is effective in increasing residents’ self-rated comfort in management of pediatric malaria and malnutrition, and a majority of participating residents feel it should be included as a component of pre-departure training for all residents rotating clinically to low-income countries.

  1. Inverse Interval Matrix: A Survey

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří; Farhadsefat, R.

    2011-01-01

    Roč. 22, - (2011), s. 704-719 E-ISSN 1081-3810 R&D Projects: GA ČR GA201/09/1957; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords : interval matrix * inverse interval matrix * NP-hardness * enclosure * unit midpoint * inverse sign stability * nonnegative invertibility * absolute value equation * algorithm Subject RIV: BA - General Mathematics Impact factor: 0.808, year: 2010 http://www.math.technion.ac.il/iic/ela/ela-articles/articles/vol22_pp704-719.pdf

  2. Inverse problem of Ocean Acoustic Tomography (OAT) - A numerical experiment

    Digital Repository Service at National Institute of Oceanography (India)

    Murty, T.V.R.; Somayajulu, Y.K.; Mahadevan, R.; Murty, C.S.

    Acoustic model simulation experiments related to the forward and inverse aspects of ocean tomography have been taken up with a view to estimating the vertical sound speed field by inverting the travel time data. Two methods of inversion have been...

  3. Inverse Kinematics with Closed Form Solution for Denso Robot Manipulator

    OpenAIRE

    Prasetia, Ikhsan Eka; Agustinah, Trihastuti

    2015-01-01

    In this paper, forward kinematics and inverse kinematics are applied to the Denso robot manipulator, which has 6 DOF. The forward kinematics yields the desired position of the end-effector, while the inverse kinematics produces the angle of each joint. The inverse kinematics problem is very difficult; therefore, its solution is obtained in closed form with a geometric approach. The simulation result obtained from forward kinematics and inverse kinematics is determining desire...
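
The record does not give the 6-DOF Denso solution, but the closed-form geometric approach can be illustrated on a simpler mechanism: a planar 2-link arm admits a textbook closed-form inverse via the law of cosines. The link lengths below are arbitrary, and forward kinematics is included to verify the inverse solution.

```python
import math

def ik_two_link(x, y, l1=0.35, l2=0.25):
    """Closed-form (geometric) inverse kinematics for a planar 2-link
    arm. Returns (theta1, theta2) for the elbow-down configuration."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)   # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)                          # elbow angle
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)  # shoulder angle
    return theta1, theta2

def fk_two_link(theta1, theta2, l1=0.35, l2=0.25):
    """Forward kinematics, used to verify the inverse solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

t1, t2 = ik_two_link(0.4, 0.2)
```

Round-tripping the joint angles through the forward kinematics recovers the commanded end-effector position, which is the standard sanity check for any closed-form IK derivation.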

  4. Optimal Inversion Parameters for Full Waveform Inversion using OBS Data Set

    Science.gov (United States)

    Kim, S.; Chung, W.; Shin, S.; Kim, D.; Lee, D.

    2017-12-01

    In recent years, Full Waveform Inversion (FWI) has been the most researched technique in seismic data processing. It uses the residuals between observed and modeled data as an objective function; thereafter, the final subsurface velocity model is generated through a series of iterations meant to minimize the residuals. Research on FWI has expanded from acoustic media to elastic media. In acoustic media, the subsurface property is defined by P-velocity; however, in elastic media, properties are defined by multiple parameters, such as P-velocity, S-velocity, and density. Further, elastic media can also be defined by the Lamé constants and density, or by the impedances (PI, SI); consequently, research is being carried out to ascertain the optimal parameters. With results from advanced exploration equipment and Ocean Bottom Seismic (OBS) surveys, it is now possible to obtain multi-component seismic data. However, to perform FWI on these data and generate an accurate subsurface model, it is important to determine the optimal inversion parameters among (Vp, Vs, ρ), (λ, μ, ρ), and (PI, SI) in elastic media. In this study, a staggered-grid finite difference method was applied to simulate the OBS survey. For the inversion, the l2-norm was set as the objective function. Further, the gradient direction was computed accurately using the back-propagation technique and scaled using the pseudo-Hessian matrix. In acoustic media, only Vp is used as the inversion parameter. In contrast, various sets of parameters, such as (Vp, Vs, ρ) and (λ, μ, ρ), can be used to define the inversion in elastic media. Therefore, it is important to ascertain the parameter set that gives the most accurate inversion result with the OBS data set. In this study, we generated Vp and Vs subsurface models by using (λ, μ, ρ) and (Vp, Vs, ρ) as inversion parameters in every iteration, and compared the final two FWI results. This research was supported by the Basic Research Project(17-3312) of the Korea Institute of
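
The misfit-minimization loop at the heart of FWI can be shown at toy scale. The sketch below inverts a single P-velocity from one travel-time observation by gradient descent on an l2 misfit; it stands in for, and greatly simplifies, the staggered-grid elastic modeling and pseudo-Hessian scaling described above, and all numbers are synthetic.

```python
def invert_velocity(t_obs, distance, v0=2000.0, lr=2.0e6, iters=200):
    """Gradient-descent inversion of a single P-velocity (m/s) from one
    travel-time observation (s), minimizing the l2 misfit
    J(v) = 0.5 * (distance / v - t_obs) ** 2."""
    v = v0
    for _ in range(iters):
        residual = distance / v - t_obs       # modeled minus observed
        grad = residual * (-distance / v**2)  # dJ/dv by the chain rule
        v -= lr * grad                        # steepest-descent update
    return v

v_true = 2500.0                               # synthetic true velocity
v_est = invert_velocity(t_obs=3000.0 / v_true, distance=3000.0)
```

Real FWI replaces the one-line forward model with wave-equation simulation and computes the gradient of the misfit with respect to millions of model cells via back-propagation (the adjoint-state method), but the iteration structure is the same.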

  5. Three-Dimensional Printing and Surgical Simulation for Preoperative Planning of Deformity Correction in Foot and Ankle Surgery.

    Science.gov (United States)

    Jastifer, James R; Gustafson, Peter A

    A paucity of published data is available describing the methods for the integration of 3-dimensional (3D) printing technology and surgical simulation into orthopedic surgery. The cost of this technology has decreased and the ease of use has increased, making routine use of 3D printed models and surgical simulation for difficult orthopedic problems a realistic option. We report the use of 3D printed models and surgical simulation for preoperative planning and patient education in the case of deformity correction in foot and ankle surgery using open source, free software. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  6. Inverse problems of geophysics

    International Nuclear Information System (INIS)

    Yanovskaya, T.B.

    2003-07-01

    This report gives an overview and the mathematical formulation of geophysical inverse problems. General principles of statistical estimation are explained. The maximum likelihood and least square fit methods, the Backus-Gilbert method and general approaches for solving inverse problems are discussed. General formulations of linearized inverse problems, singular value decomposition and properties of pseudo-inverse solutions are given
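
The least-square-fit machinery mentioned above can be made concrete with a minimal example: an overdetermined linear inverse problem G m = d solved through the normal equations, which is equivalent to applying the pseudo-inverse when G has full column rank. The two-parameter straight-line fit below is an illustrative sketch with synthetic, noise-free data.

```python
def lstsq_2param(G, d):
    """Least-squares solution of an overdetermined system G m = d with
    two model parameters, via the normal equations (G^T G) m = G^T d."""
    # accumulate G^T G (2x2, symmetric) and G^T d (2-vector)
    a11 = sum(g[0] * g[0] for g in G)
    a12 = sum(g[0] * g[1] for g in G)
    a22 = sum(g[1] * g[1] for g in G)
    b1 = sum(g[0] * di for g, di in zip(G, d))
    b2 = sum(g[1] * di for g, di in zip(G, d))
    det = a11 * a22 - a12 * a12      # invert the 2x2 system directly
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# fit the model d = m0 + m1 * x to synthetic observations
xs = [0.0, 1.0, 2.0, 3.0]
G = [(1.0, x) for x in xs]
d = [2.0 + 0.5 * x for x in xs]
m0, m1 = lstsq_2param(G, d)
```

For ill-conditioned or rank-deficient G, one would instead use the singular value decomposition and truncate small singular values, which is exactly the pseudo-inverse construction the report discusses.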

  7. Fuzzy Inverse Compactness

    Directory of Open Access Journals (Sweden)

    Halis Aygün

    2008-01-01

    Full Text Available We introduce definitions of fuzzy inverse compactness, fuzzy inverse countable compactness, and fuzzy inverse Lindelöfness on arbitrary -fuzzy sets in -fuzzy topological spaces. We prove that the proposed definitions are good extensions of the corresponding concepts in ordinary topology and obtain different characterizations of fuzzy inverse compactness.

  8. Complexity in Foresight: experiences with INTERSECTIONS: an agent-based simulation workbench to help achieve adaptiveness in strategic planning

    NARCIS (Netherlands)

    M.P. Schilperoord (Michel)

    2005-01-01

    “Complexity in Foresight” is a new synthetic paradigm that crosses areas in strategic planning and the complexity sciences. It connects the fields of agent-based simulation and complex adaptive systems, and provides the overall blueprint for the construction of a new generation of

  9. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au [Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia); Bridge, Pete [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland (Australia); Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet [Radiation Oncology, Princess Alexandra Hospital, Brisbane, Queensland (Australia); Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  10. INPRES (intraoperative presentation of surgical planning and simulation results): augmented reality for craniofacial surgery

    Science.gov (United States)

    Salb, Tobias; Brief, Jakob; Welzel, Thomas; Giesler, Bjoern; Hassfeld, Steffan; Muehling, Joachim; Dillmann, Ruediger

    2003-05-01

    In this paper we present recent developments and pre-clinical validation results of our approach for augmented reality (AR, for short) in craniofacial surgery. A commercial Sony Glasstron display is used for optical see-through overlay of surgical planning and simulation results with a patient inside the operation room (OR). For the tracking of the glasses, of the patient and of various medical instruments an NDI Polaris system is used as standard solution. A complementary inside-out navigation approach has been realized with a panoramic camera. This device is mounted on the head of the surgeon for tracking of fiducials placed on the walls of the OR. Further tasks described include the calibration of the head-mounted display (HMD), the registration of virtual objects with the real world and the detection of occlusions in the object overlay with help of two miniature CCD cameras. The evaluation of our work took place in the laboratory environment and showed promising results. Future work will concentrate on the optimization of the technical features of the prototype and on the development of a system for everyday clinical use.

  11. Analysis of RAE-1 inversion

    Science.gov (United States)

    Hedland, D. A.; Degonia, P. K.

    1974-01-01

    The RAE-1 spacecraft inversion performed October 31, 1972 is described based upon the in-orbit dynamical data in conjunction with results obtained from previously developed computer simulation models. The computer simulations used are predictive of the satellite dynamics, including boom flexing, and are applicable during boom deployment and retraction, inter-phase coast periods, and post-deployment operations. Attitude data, as well as boom tip data, were analyzed in order to obtain a detailed description of the dynamical behavior of the spacecraft during and after the inversion. Runs were made using the computer model and the results were analyzed and compared with the real time data. Close agreement between the actual recorded spacecraft attitude and the computer simulation results was obtained.

  12. Generalizable open source urban water portfolio simulation framework demonstrated using a multi-objective risk-based planning benchmark problem.

    Science.gov (United States)

    Trindade, B. C.; Reed, P. M.

    2017-12-01

    The growing access and reduced cost for computing power in recent years has promoted rapid development and application of multi-objective water supply portfolio planning. As this trend continues there is a pressing need for flexible risk-based simulation frameworks and improved algorithm benchmarking for emerging classes of water supply planning and management problems. This work contributes the Water Utilities Management and Planning (WUMP) model: a generalizable and open source simulation framework designed to capture how water utilities can minimize operational and financial risks by regionally coordinating planning and management choices, i.e. making more efficient and coordinated use of restrictions, water transfers and financial hedging combined with possible construction of new infrastructure. We introduce the WUMP simulation framework as part of a new multi-objective benchmark problem for planning and management of regionally integrated water utility companies. In this problem, a group of fictitious water utilities seek to balance the use of the mentioned reliability-driven actions (e.g., restrictions, water transfers and infrastructure pathways) and their inherent financial risks. Several traits make this an ideal benchmark problem, namely the presence of (1) strong non-linearities and discontinuities in the Pareto front caused by the step-wise nature of the decision-making formulation and by the abrupt addition of storage through infrastructure construction, (2) noise due to the stochastic nature of the streamflows and water demands, and (3) non-separability resulting from the cooperative formulation of the problem, in which decisions made by one stakeholder may substantially impact others. Both the open source WUMP simulation framework and its demonstration in a challenging benchmarking example hold value for promoting broader advances in urban water supply portfolio planning for regions confronting change.

  13. PDCI Wide-Area Damping Control: PSLF Simulations of the 2016 Open and Closed Loop Test Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wilches Bernal, Felipe [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pierre, Brian Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schoenwald, David A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Jason C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trudnowski, Daniel J. [Montana Tech of the Univ. of Montana, Butte, MT (United States); Donnelly, Matthew K. [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2017-03-01

    To demonstrate and validate the performance of the wide-area damping control system, the project plans to conduct closed-loop tests on the PDCI in summer/fall 2016. A test plan details the open and closed loop tests to be conducted on the PDCI using the wide-area damping control system. To ensure the appropriate level of preparedness, simulations were performed in order to predict and evaluate any possible unsafe operations before hardware experiments are attempted. This report contains the results from these simulations using the power system dynamics software PSLF (Power System Load Flow, trademark of GE). The simulations use the WECC (Western Electricity Coordinating Council) 2016 light summer and heavy summer base cases.

  14. Future planning: default network activity couples with frontoparietal control network and reward-processing regions during process and outcome simulations.

    Science.gov (United States)

    Gerlach, Kathy D; Spreng, R Nathan; Madore, Kevin P; Schacter, Daniel L

    2014-12-01

    We spend much of our daily lives imagining how we can reach future goals and what will happen when we attain them. Despite the prevalence of such goal-directed simulations, neuroimaging studies on planning have mainly focused on executive processes in the frontal lobe. This experiment examined the neural basis of process simulations, during which participants imagined themselves going through steps toward attaining a goal, and outcome simulations, during which participants imagined events they associated with achieving a goal. In the scanner, participants engaged in these simulation tasks and an odd/even control task. We hypothesized that process simulations would recruit default and frontoparietal control network regions, and that outcome simulations, which allow us to anticipate the affective consequences of achieving goals, would recruit default and reward-processing regions. Our analysis of brain activity that covaried with process and outcome simulations confirmed these hypotheses. A functional connectivity analysis with posterior cingulate, dorsolateral prefrontal cortex and anterior inferior parietal lobule seeds showed that their activity was correlated during process simulations and associated with a distributed network of default and frontoparietal control network regions. During outcome simulations, medial prefrontal cortex and amygdala seeds covaried together and formed a functional network with default and reward-processing regions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  15. Policy planning under uncertainty: efficient starting populations for simulation-optimization methods applied to municipal solid waste management.

    Science.gov (United States)

    Huang, Gordon H; Linton, Jonathan D; Yeomans, Julian Scott; Yoogalingam, Reena

    2005-10-01

    Evolutionary simulation-optimization (ESO) techniques can be adapted to model a wide variety of problem types in which system components are stochastic. Grey programming (GP) methods have been previously applied to numerous environmental planning problems containing uncertain information. In this paper, ESO is combined with GP for policy planning to create a hybrid solution approach named GESO. It can be shown that multiple policy alternatives meeting required system criteria, or modelling-to-generate-alternatives (MGA), can be quickly and efficiently created by applying GESO to this case data. The efficacy of GESO is illustrated using a municipal solid waste management case taken from the regional municipality of Hamilton-Wentworth in the Province of Ontario, Canada. The MGA capability of GESO is especially meaningful for large-scale real-world planning problems and the practicality of this procedure can easily be extended from MSW systems to many other planning applications containing significant sources of uncertainty.

  16. Chapter 8: Planning Tools to Simulate and Optimize Neighborhood Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhivov, Alexander Michael; Case, Michael Patrick; Jank, Reinhard; Eicker, Ursula; Booth, Samuel

    2017-03-15

    This section introduces different energy modeling tools available in Europe and the USA for the community energy master planning process, varying from strategic Urban Energy Planning to more detailed Local Energy Planning. Two modeling tools used for Energy Master Planning of primarily residential communities, the 3D city model with CityGML and the Net Zero Planner tool developed for US Department of Defense installations, are described in more detail.

  17. Dose/volume–response relations for rectal morbidity using planned and simulated motion-inclusive dose distributions

    International Nuclear Information System (INIS)

    Thor, Maria; Apte, Aditya; Deasy, Joseph O.; Karlsdóttir, Àsa; Moiseenko, Vitali; Liu, Mitchell; Muren, Ludvig Paul

    2013-01-01

    Background and purpose: Many dose-limiting normal tissues in radiotherapy (RT) display considerable internal motion between fractions over a course of treatment, potentially reducing the appropriateness of using planned dose distributions to predict morbidity. Accounting explicitly for rectal motion could improve the predictive power of modelling rectal morbidity. To test this, we simulated the effect of motion in two cohorts. Materials and methods: The included patients (232 and 159 cases) received RT for prostate cancer to 70 and 74 Gy. Motion-inclusive dose distributions were introduced as simulations of random or systematic motion applied to the planned dose distributions. Six rectal morbidity endpoints were analysed. A probit model using the QUANTEC-recommended parameters was also applied to the cohorts. Results: The differences in associations using the planned versus the motion-inclusive dose distributions were modest. Statistically significant associations were obtained with four of the endpoints, mainly at high doses (55–70 Gy), using both the planned and the motion-inclusive dose distributions, primarily when simulating random motion. The strongest associations were observed for GI toxicity and rectal bleeding (Rs = 0.12–0.21; Rs = 0.11–0.20). Applying the probit model, significant associations were found for tenesmus and rectal bleeding (Rs = 0.13, p = 0.02). Conclusion: Equally strong associations with rectal morbidity were observed at high doses (>55 Gy) for the planned and the simulated dose distributions, in particular those including random rectal motion. Future studies should explore patient-specific descriptions of rectal motion to achieve improved predictive power.
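The probit dose-response model applied in the record above has a standard closed form, NTCP = Φ((D − TD50)/(m·TD50)), with Φ the standard normal CDF. A minimal sketch, with illustrative TD50 and m values rather than the fit used in the study:

```python
import math

def probit_ntcp(dose_gy, td50_gy, m):
    """Probit (Lyman) dose-response: NTCP = Phi((D - TD50) / (m * TD50)),
    with Phi the standard normal CDF expressed via erf."""
    t = (dose_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative parameters only (not the fit from this study):
ntcp_at_td50 = probit_ntcp(76.9, 76.9, 0.13)  # exactly 0.5 at D = TD50
```

By construction the model returns 50% complication probability at D = TD50, and the slope parameter m controls how steeply risk rises around it.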

  18. Three-dimensional planning and simulation of hip operations and computer-assisted construction of endoprostheses in bone tumor surgery.

    Science.gov (United States)

    Handels, H; Ehrhardt, J; Plötz, W; Pöppl, S J

    2001-01-01

    This article presents the VIRTOPS (VIRTual Operation Planning in Orthopaedic Surgery) software system for virtual preoperative planning and simulation of hip operations. The system is applied to simulate the endoprosthetic reconstruction of the hip joint with hemipelvic replacement, and supports the individual design of anatomically adaptable, modular prostheses in bone tumor surgery. The virtual planning of the operation and the construction of the individual implant are supported by virtual reality techniques. The central step of the operation planning procedure, the placement of the cutting plane in the hip bone, depends strongly on the tumor's position. Segmentation of the tumor and the bones in MR and CT data, as well as fusion of MR and CT image sequences, is necessary to visualize the tumor's position within the hip bone. Three-dimensional models of the patient's hip are generated based on CT image data. A ROI-based segmentation algorithm enables the separation of the bone tumor in multispectral MR image sequences. A special registration method using segmentation results has been developed to transfer CT and MR data into one common coordinate system. During the 3D planning process, the surgeon simulates the operation and defines the position and geometry of the custom-made endoprosthesis. Stereoscopic visualization and 3D input devices facilitate navigation and 3D interaction in the virtual environment. Special visualization techniques such as texture mapping, color coding of quantitative parameters, and transparency support the determination of the correct position and geometry of the prosthesis. The VIRTOPS system enables the complete virtual planning of hip operations with endoprosthetic reconstruction, as well as the optimal placement and design of endoprostheses. After the registration and segmentation of CT and MR data, 3D visualizations of the tumor within the bone are generated to support the surgeon during the planning procedure. In the virtual

  19. A Simulation Study for Radiation Treatment Planning Based on the Atomic Physics of the Proton-Boron Fusion Reaction

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sunmi; Yoon, Do-Kun; Shin, Han-Back; Jung, Joo-Young; Kim, Moo-Sub; Kim, Kyeong-Hyeon; Jang, Hong-Seok; Suh, Tae Suk [the Catholic University of Korea, Seoul (Korea, Republic of)

    2017-03-15

    The purpose of this research is to demonstrate, based on a Monte Carlo simulation code, the procedure of radiation treatment planning for proton-boron fusion therapy (PBFT). A discrete proton beam (60 - 120 MeV) relevant to the Bragg peak was simulated using a Monte Carlo n-particle extended (MCNPX, Ver. 2.6.0, National Laboratory, Los Alamos NM, USA) simulation code. After computed tomography (CT) scanning of a virtual water phantom including air cavities, the acquired CT images were converted using the simulation source code. We set the boron uptake regions (BURs) in the simulated water phantom to achieve the proton-boron fusion reaction. Proton sources irradiated the BURs in the phantom. The acquired dose maps were overlapped with the original CT image of the phantom to analyze the dose volume histogram (DVH). We successfully confirmed amplifications of the proton doses (average: 130%) at the target regions. From the DVH result for each simulation, we acquired a relatively accurate dose map for the treatment. A simulation was conducted to characterize the dose distribution and verify the feasibility of proton boron fusion therapy (PBFT). We observed a variation in proton range and developed a tumor targeting technique for treatment that was more accurate and powerful than both conventional proton therapy and boron-neutron capture therapy.

  20. A simulation study for radiation treatment planning based on the atomic physics of the proton-boron fusion reaction

    Science.gov (United States)

    Kim, Sunmi; Yoon, Do-Kun; Shin, Han-Back; Jung, Joo-Young; Kim, Moo-Sub; Kim, Kyeong-Hyeon; Jang, Hong-Seok; Suh, Tae Suk

    2017-03-01

    The purpose of this research is to demonstrate, based on a Monte Carlo simulation code, the procedure of radiation treatment planning for proton-boron fusion therapy (PBFT). A discrete proton beam (60 - 120 MeV) relevant to the Bragg peak was simulated using a Monte Carlo n-particle extended (MCNPX, Ver. 2.6.0, National Laboratory, Los Alamos NM, USA) simulation code. After computed tomography (CT) scanning of a virtual water phantom including air cavities, the acquired CT images were converted using the simulation source code. We set the boron uptake regions (BURs) in the simulated water phantom to achieve the proton-boron fusion reaction. Proton sources irradiated the BURs in the phantom. The acquired dose maps were overlapped with the original CT image of the phantom to analyze the dose volume histogram (DVH). We successfully confirmed amplifications of the proton doses (average: 130%) at the target regions. From the DVH result for each simulation, we acquired a relatively accurate dose map for the treatment. A simulation was conducted to characterize the dose distribution and verify the feasibility of proton-boron fusion therapy (PBFT). We observed a variation in proton range and developed a tumor-targeting technique for treatment that was more accurate and powerful than both conventional proton therapy and boron-neutron capture therapy.
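The dose-volume histogram (DVH) analysis used in the two records above reduces to a cumulative histogram over the voxel doses of a structure: for each dose level, the fraction of the structure volume receiving at least that dose. A minimal sketch with a hypothetical toy dose array:

```python
import numpy as np

def cumulative_dvh(dose, bin_width=1.0):
    """Cumulative DVH: fraction of structure volume receiving >= each dose level."""
    edges = np.arange(0.0, dose.max() + bin_width, bin_width)
    volume_fraction = np.array([(dose >= d).mean() for d in edges])
    return edges, volume_fraction

# Toy voxel doses (Gy) for one structure; real input would be the simulated
# dose grid masked to the structure contour.
dose = np.array([10.0, 20.0, 20.0, 30.0])
edges, vf = cumulative_dvh(dose, bin_width=10.0)
# vf starts at 1.0 (every voxel receives >= 0 Gy) and is non-increasing.
```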

  1. Enhancing Student’s Understanding in Entrepreneurship Through Business Plan Simulation

    OpenAIRE

    Guzairy M.; Mohamad N.; Yunus A.R.

    2018-01-01

    Business Plan is an important document for entrepreneurs to guide them in managing their business. A Business Plan also assists the entrepreneur to strategize their business and manage future growth. That is why the Malaysian government has required all Higher Education Providers to set entrepreneurship education as a compulsory course. One of the entrepreneurship education learning outcomes is that the student can write an effective business plan. This study focused on enhancing student’s understanding in entrepren...

  2. Merging Methods to Manage Uncertainty: Combining Simulation Modeling and Scenario Planning to Inform Resource Management Under Climate Change

    Science.gov (United States)

    Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.

    2017-12-01

    Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is `actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management. It also identified impactful uncertainties related to the

  3. Model-based inversion for the characterization of crack-like defects detected by ultrasound in a cladded component; Etude d'une methode d'inversion basee sur la simulation pour la caracterisation de fissures detectees par ultrasons dans un composant revetu

    Energy Technology Data Exchange (ETDEWEB)

    Haiat, G

    2004-03-01

    This work deals with the inversion of ultrasonic data. The industrial context of the study is the non-destructive evaluation of the internal walls of French reactor pressure vessels. These inspections aim at detecting and characterizing cracks. Ultrasonic data correspond to echographic responses obtained with a transducer acting in pulse-echo mode. Cracks are detected by the crack-tip diffraction effect. The analysis of measured data can become difficult because of the presence of a cladding whose surface is irregular. Moreover, its constituent material differs from that of the reactor vessel. A model-based inverse method uses simulation of the propagation and diffraction of ultrasound, taking into account the irregular properties of the cladding surface as well as the heterogeneous nature of the component. The method developed was implemented and tested on a set of representative cases. Its performance was evaluated by the analysis of experimental results. The precision obtained in the laboratory on the experimental cases treated conforms to the industrial expectations motivating this study. (author)

  4. An observation planning algorithm applied to multi-objective astronomical observations and its simulation in COSMOS field

    Science.gov (United States)

    Jin, Yi; Gu, Yonggang; Zhai, Chao

    2012-09-01

    Multi-object fiber spectroscopic sky surveys are now booming, such as LAMOST, already built by China, the BIGBOSS project put forward by the U.S. Lawrence Berkeley National Lab, and the GTC (Gran Telescopio Canarias) telescope developed by the United States, Mexico and Spain. They all use or will use this approach, and each fiber can be moved within a certain area to reach one astronomical target, so observation planning is particularly important for these sky surveys. An observation planning algorithm for multi-object astronomical observations is developed. It avoids collision and interference between the fiber positioning units in the focal plane during observation in one field of view, so that the objects of interest can be observed in a limited number of rounds with maximum efficiency. An observation simulation can also be made for a wide field of view through multi-FOV observation. After the observation plan is built, a simulation is made in the COSMOS field using the GTC telescope. Galaxies, stars, and high-redshift LBG galaxies of interest are selected after removal of the mask areas, which may contain bright stars. A 9-FOV simulation is then completed, and the observation efficiency and fiber utilization ratio for every round are given. In addition, allocating a certain number of fibers to the background sky, giving different weights to different objects, and how to move the FOV to improve the overall observation efficiency are discussed.

  5. Large maneuverable flight control using neural networks dynamic inversion

    Science.gov (United States)

    Yang, Enquan; Gao, Jinyuan

    2003-09-01

    An adaptive dynamic-inversion-based neural network controller is applied to aircraft large-maneuver flight control. The neural network is used to cancel the inversion error that may arise from imperfect modeling or approximate inversion. Simulation results for an aircraft model are presented to illustrate the performance of the flight control system.
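The basic dynamic-inversion idea behind this record can be sketched for a scalar plant x' = f(x) + g(x)·u: the control u = (v − f̂(x))/ĝ(x) cancels the modeled dynamics and imposes the desired dynamics x' = v. The neural network in the paper exists to cancel the residual error when f̂ or ĝ is imperfect; in this minimal sketch the model is taken as perfect, and all functions and gains are assumptions:

```python
# Minimal scalar sketch of dynamic inversion (not the paper's aircraft model).
def f(x): return -x + 0.5 * x**2      # true plant dynamics (assumed)
def g(x): return 1.0                  # true control effectiveness (assumed)

def f_hat(x): return -x + 0.5 * x**2  # model used for inversion (here: perfect)
def g_hat(x): return 1.0

x, x_ref, dt, k = 2.0, 0.0, 0.01, 3.0
for _ in range(1000):
    v = -k * (x - x_ref)              # desired (stable) error dynamics
    u = (v - f_hat(x)) / g_hat(x)     # dynamic inversion control law
    x += dt * (f(x) + g(x) * u)       # Euler step of the closed loop
# With a perfect model the closed loop is exactly x' = -k*x, so x -> x_ref.
```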

  6. EA-based evacuation planning using agent-based crowd simulation

    NARCIS (Netherlands)

    Zhong, J.; Cai, W.; Luo, L.; Lees, M.; Tolk, A.; Diallo, S.Y.; Ryzhov, I.O.; Yilmaz, L.; Buckley, S.; Miller, J.A.

    2014-01-01

    Safety planning for crowd evacuation is an important and active research topic nowadays. One important issue is to devise the evacuation plans of individuals in emergency situations so as to reduce the total evacuation time. This paper proposes a novel evolutionary algorithm (EA)-based methodology,

  7. An Innovative Tool for Intraoperative Electron Beam Radiotherapy Simulation and Planning: Description and Initial Evaluation by Radiation Oncologists

    Energy Technology Data Exchange (ETDEWEB)

    Pascau, Javier, E-mail: jpascau@mce.hggm.es [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Departamento de Bioingenieria e Ingenieria Aeroespacial, Universidad Carlos III de Madrid, Madrid (Spain); Santos Miranda, Juan Antonio [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Calvo, Felipe A. [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Departamento de Oncologia, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Bouche, Ana; Morillo, Virgina [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); Gonzalez-San Segundo, Carmen [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Ferrer, Carlos; Lopez Tarjuelo, Juan [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); and others

    2012-06-01

    Purpose: Intraoperative electron beam radiation therapy (IOERT) involves a modified strategy of conventional radiation therapy and surgery. The lack of specific planning tools limits the spread of this technique. The purpose of the present study is to describe a new simulation and planning tool and its initial evaluation by clinical users. Methods and Materials: The tool works on a preoperative computed tomography scan. A physician contours regions to be treated and protected and simulates applicator positioning, calculating isodoses and the corresponding dose-volume histograms depending on the selected electron energy. Three radiation oncologists evaluated data from 15 IOERT patients, including different tumor locations. Segmentation masks, applicator positions, and treatment parameters were compared. Results: High parameter agreement was found in the following cases: three breast and three rectal cancer, retroperitoneal sarcoma, and rectal and ovary monotopic recurrences. All radiation oncologists performed similar segmentations of tumors and high-risk areas. The average applicator position difference was 1.2 {+-} 0.95 cm. The remaining cancer sites showed higher deviations because of differences in the criteria for segmenting high-risk areas (one rectal, one pancreas) and different surgical access simulated (two rectal, one Ewing sarcoma). Conclusions: The results show that this new tool can be used to simulate IOERT cases involving different anatomic locations, and that preplanning has to be carried out with specialized surgical input.

  8. Intensity-modulated radiotherapy as the boost or salvage treatment of nasopharyngeal carcinoma: The appropriate parameters in the inverse planning and the effect of patient's anatomic factors on the planning results

    International Nuclear Information System (INIS)

    Hsiung, C.-Y.; Hunt, Margie A.; Yorke, Ellen D.; Chui, C.-S.; Hu, Jason; Xiong, J.-P.; Ling, Clifton C.; Lo, S.-K.; Wang, C.-J.; Huang, E.-Y.; Amols, Howard I.

    2005-01-01

    The current study demonstrates that a large increase in the normal tissue penalty often degrades target dose uniformity without a concomitant large improvement in normal tissue dose, especially in anatomically unfavorable patients. Excessively large normal tissue penalties do not improve treatment plans for patients with unfavorable geometry.

  9. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  10. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  11. Adjoint modeling for acoustic inversion

    Science.gov (United States)

    Hursky, Paul; Porter, Michael B.; Cornuelle, B. D.; Hodgkiss, W. S.; Kuperman, W. A.

    2004-02-01

    The use of adjoint modeling for acoustic inversion is investigated. An adjoint model is derived from a linearized forward propagation model to propagate data-model misfit at the observation points back through the medium to the medium perturbations not being accounted for in the model. This adjoint model can be used to aid in inverting for these unaccounted medium perturbations. Adjoint methods are being applied to a variety of inversion problems, but have not drawn much attention from the underwater acoustic community. This paper presents an application of adjoint methods to acoustic inversion. Inversions are demonstrated in simulation for both range-independent and range-dependent sound speed profiles using the adjoint of a parabolic equation model. Sensitivity and error analyses are discussed showing how the adjoint model enables calculations to be performed in the space of observations, rather than the often much larger space of model parameters. Using an adjoint model enables directions of steepest descent in the model parameters (what we invert for) to be calculated using far fewer modeling runs than if a forward model only were used.
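For a linearized forward model, the adjoint machinery described above reduces to a transpose: the misfit gradient comes from one forward run plus one adjoint run, instead of one forward run per model parameter. A toy sketch in which a matrix G stands in for the linearized parabolic-equation model (all values synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.standard_normal((5, 3))   # linearized forward model (toy stand-in)
m = rng.standard_normal(3)        # model parameters (e.g. sound speed perturbations)
d = rng.standard_normal(5)        # observations

def misfit(model):
    r = G @ model - d
    return 0.5 * r @ r

# Adjoint gradient of the misfit: the data-model residual is propagated back
# through the adjoint (transpose) of the forward operator.
grad_adjoint = G.T @ (G @ m - d)

# Finite-difference check of the first gradient component.
eps = 1e-6
e0 = np.zeros(3); e0[0] = eps
fd = (misfit(m + e0) - misfit(m - e0)) / (2 * eps)
```

The adjoint gradient then feeds a steepest-descent (or quasi-Newton) update of the model parameters, which is the use the paper makes of it.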

  12. Solving inverse problems of optical microlithography

    Science.gov (United States)

    Granik, Yuri

    2005-05-01

    The direct problem of microlithography is to simulate printing features on the wafer under given mask, imaging system, and process characteristics. The goal of inverse problems is to find the best mask and/or imaging system and/or process to print the given wafer features. In this study we describe and compare solutions of inverse mask problems. The pixel-based inverse problem of mask optimization (or "layout inversion") is harder than the inverse source problem, especially for partially coherent systems. It can be stated as a non-linear constrained minimization problem over a complex domain, with a large number of variables. We compare the method of Nashold projections, variations of Fienup phase-retrieval algorithms, coherent approximation with deconvolution, local variations, and descent searches. We propose an electrical field caching technique to substantially speed up the searching algorithms. We demonstrate applications to phase-shifted masks, assist features, and maskless printing.

  13. Recurrent Neural Network for Computing Outer Inverse.

    Science.gov (United States)

    Živković, Ivan S; Stanimirović, Predrag S; Wei, Yimin

    2016-05-01

    Two linear recurrent neural networks for generating outer inverses with prescribed range and null space are defined. Each of the proposed recurrent neural networks is based on the matrix-valued differential equation, a generalization of dynamic equations proposed earlier for the nonsingular matrix inversion, the Moore-Penrose inversion, as well as the Drazin inversion, under the condition of zero initial state. The application of the first approach is conditioned by the properties of the spectrum of a certain matrix; the second approach eliminates this drawback, though at the cost of increasing the number of matrix operations. The cases corresponding to the most common generalized inverses are defined. The conditions that ensure stability of the proposed neural network are presented. Illustrative examples present the results of numerical simulations.
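A minimal sketch of this family of matrix-valued dynamics, using the gradient flow dV/dt = −γ·Aᵀ(AV − I) from a zero initial state, which converges to the Moore-Penrose inverse when A has full column rank. This illustrates the general idea only, not the two outer-inverse networks defined in the paper:

```python
import numpy as np

def gnn_pseudoinverse(A, gamma=1.0, dt=0.01, steps=5000):
    """Euler integration of the matrix-valued dynamics
    dV/dt = -gamma * A.T @ (A @ V - I), started from the zero state.
    For full-column-rank A the unique equilibrium A.T A V = A.T
    gives V = pinv(A)."""
    m, n = A.shape
    V = np.zeros((n, m))          # zero initial state
    I = np.eye(m)
    for _ in range(steps):
        V += dt * (-gamma) * (A.T @ (A @ V - I))
    return V

A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
V = gnn_pseudoinverse(A)
# V approximates np.linalg.pinv(A); stability requires dt*gamma < 2/lambda_max(A.T A).
```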

  14. Forward modeling. Route to electromagnetic inversion

    Energy Technology Data Exchange (ETDEWEB)

    Groom, R.; Walker, P. [PetRos EiKon Incorporated, Ontario (Canada)

    1996-05-01

    Inversion of electromagnetic data is a topical subject in the literature, and much time has been devoted to understanding the convergence properties of various inverse methods. The relative lack of success of electromagnetic inversion techniques is partly attributable to difficulties in the kernel forward modeling software. These difficulties come in two broad classes: (1) completeness and robustness, and (2) convergence, execution time, and model simplicity. It was demonstrated that if such problems exist in the forward modeling kernel, inversion can fail to generate reasonable results. It was suggested that classical inversion techniques, which are based on minimizing a norm of the error between data and the simulated data, will only be successful when these difficulties in forward modeling kernels are properly dealt with. 4 refs., 5 figs.

  15. Simulation in Quality Management – An Approach to Improve Inspection Planning

    Directory of Open Access Journals (Sweden)

    H.-A. Crostack

    2005-01-01

    Full Text Available Production is a multi-step process involving many different articles produced in different jobs by various machining stations. Quality inspection has to be integrated in the production sequence in order to ensure the conformance of the products. The interactions between manufacturing processes and inspections are very complex, since three aspects (quality, cost, and time) should all be considered at the same time while determining the suitable inspection strategy. Therefore, a simulation approach was introduced to solve this problem. The simulator called QUINTE [the QUINTE simulator has been developed at the University of Dortmund in the course of two research projects funded by the German Federal Ministry of Economics and Labour (BMWA: Bundesministerium für Wirtschaft und Arbeit), the Arbeitsgemeinschaft industrieller Forschungsvereinigungen (AiF), Cologne/Germany, and the Forschungsgemeinschaft Qualität, Frankfurt a.M./Germany] was developed to simulate the machining as well as the inspection. It can be used to investigate and evaluate inspection strategies in manufacturing processes. The investigation into the application of the QUINTE simulator in industry was carried out at two pilot companies. The results show the validity of this simulator. An attempt to run QUINTE in a user-friendly environment, i.e., the commercial simulation software Arena®, is also described in this paper. NOTATION: QUINTE: Qualität in der Teilefertigung (Quality in the manufacturing process)

  16. ``Exact''-N-body simulations for star clusters and galaxies, GRAPE, and future plans

    Science.gov (United States)

    Spurzem, R.

    1998-07-01

    The subjects and key questions faced by computational astrophysics using N-body simulations are discussed in the fields of globular star cluster dynamics and galactic nuclei, with the focus of interest centered on the so-called ``exact'' or Aarseth-type collisional N-body simulations. Various algorithms are briefly described. A new concept for a more flexible customized special-purpose computer based on a combination of GRAPE and FPGA special-purpose hardware is proposed. It is an ideal machine for all kinds of N-body simulations using neighbour schemes, such as the Ahmad-Cohen direct N-body codes and smoothed particle hydrodynamics for systems including gas.

  17. Multiparameter Optimization for Electromagnetic Inversion Problem

    Directory of Open Access Journals (Sweden)

    M. Elkattan

    2017-10-01

    Full Text Available Electromagnetic (EM) methods have been extensively used in geophysical investigations such as mineral and hydrocarbon exploration, as well as in geological mapping and structural studies. In this paper, we developed an inversion methodology for electromagnetic data to determine the physical parameters of a set of horizontal layers. The forward model was computed using the transmission line method. In the inversion part, we solved a multi-parameter optimization problem in which the parameters are the conductivity, dielectric constant, and permeability of each layer. The optimization problem was solved by a simulated annealing approach. The inversion methodology was tested on a set of models representing common geological formations.
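A generic simulated annealing loop of the kind used for such multi-parameter inversions can be sketched as follows. The misfit function, bounds, cooling schedule, and step size here are illustrative placeholders; in the paper's setting the misfit would compare observed EM responses with the transmission-line forward model.

```python
import math
import random

def simulated_annealing(misfit, x0, bounds, n_iter=5000, t0=1.0, seed=0):
    """Minimize a misfit function by simulated annealing.

    misfit: callable mapping a parameter vector to a non-negative scalar.
    bounds: per-parameter (lo, hi) intervals constraining the search.
    """
    rng = random.Random(seed)
    x, fx = list(x0), misfit(x0)
    best, fbest = list(x), fx
    for k in range(n_iter):
        t = t0 * (1.0 - k / n_iter) + 1e-9           # linear cooling schedule
        i = rng.randrange(len(x))                     # perturb one parameter
        lo, hi = bounds[i]
        cand = list(x)
        cand[i] = min(hi, max(lo, x[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
        fc = misfit(cand)
        # Metropolis rule: always accept improvements, occasionally accept
        # worse moves so the search can escape local minima.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest
```

The stochastic acceptance of uphill moves is what distinguishes annealing from local gradient descent and makes it attractive for the non-convex misfit surfaces typical of EM inversion.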

  18. PEGASO - simulation model for the operation of nuclear power plants for planning purposes

    International Nuclear Information System (INIS)

    Ribeiro, A.A.T.; Muniz, A.A.

    1979-07-01

    The user manual for PEGASO is presented. PEGASO consists of a set of programs whose objective is to simulate the monthly operation of nuclear power plants (up to 10 NPPs), determining the principal physical parameters and criticality. (Author) [pt

  19. Inter-Enterprise Planning of Manufacturing Systems Applying Simulation with IPR Protection

    Science.gov (United States)

    Mertins, Kai; Rabe, Markus

    Discrete-event simulation is a well-proven method for analysing the dynamic behaviour of manufacturing systems. However, simulation is still rarely applied to external supply chains or virtual enterprises encompassing several legal entities. Most conventional simulation systems provide no means to protect intellectual property rights (IPR), nor methods to support cross-enterprise teamwork. This paper describes a solution that keeps enterprise models private while still providing their functionality for cross-enterprise evaluation purposes. Using the new modelling system, the user specifies the inter-enterprise business process, including a specification of the objects exchanged between the local models. The environment required for a distributed simulation is generated automatically. The mechanisms have been tested with a large supply chain model.

  20. Laterally constrained inversion for CSAMT data interpretation

    Science.gov (United States)

    Wang, Ruo; Yin, Changchun; Wang, Miaoyue; Di, Qingyun

    2015-10-01

    Laterally constrained inversion (LCI) has been successfully applied to the inversion of dc resistivity, TEM and airborne EM data. However, it has not yet been applied to the interpretation of controlled-source audio-frequency magnetotelluric (CSAMT) data. In this paper, we apply the LCI method to CSAMT data inversion by preconditioning the Jacobian matrix. We apply a weighting matrix to the Jacobian to balance the sensitivity of the model parameters, so that the resolution with respect to different model parameters becomes more uniform. Numerical experiments confirm that this improves the convergence of the inversion. We first invert a synthetic dataset with and without noise to investigate the effect of applying LCI to CSAMT data. For the noise-free data, the results show that the LCI method recovers the true model better than traditional single-station inversion; for the noisy data, the true model is recovered even at a noise level of 8%, indicating that LCI inversion is to some extent insensitive to noise. We then re-invert two CSAMT datasets collected, respectively, in a watershed and in a coal mine area in Northern China, and compare our results with those from previous inversions. The comparison for the coal mine shows that the LCI method delivers smoother layer interfaces that correlate well with seismic data, while the comparison with a global search algorithm, simulated annealing (SA), for the watershed shows that although both methods deliver similarly good results, the LCI algorithm presented in this paper runs much faster. The inversion results for the coal mine CSAMT survey show that a conductive water-bearing zone that was not revealed by the previous inversions has been identified by the LCI. This further demonstrates that the method presented in this paper works for CSAMT data inversion.
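The preconditioning idea, balancing parameter sensitivities by weighting the Jacobian, can be sketched with a damped least-squares update in which each Jacobian column is scaled to unit norm before solving. This column scaling is a common generic choice, not necessarily the paper's exact weighting matrix.

```python
import numpy as np

def balanced_gauss_newton_step(J, r, lam=1e-2):
    """One damped least-squares model update with a column-balanced Jacobian.

    J:   Jacobian of the forward response w.r.t. model parameters.
    r:   data residual vector (observed minus modeled).
    lam: damping factor stabilizing the normal equations.

    Scaling each column to unit norm balances parameters with very
    different sensitivities (e.g. layer resistivities vs. thicknesses),
    making the resolution across parameters more uniform.
    """
    col_norms = np.linalg.norm(J, axis=0)
    col_norms[col_norms == 0] = 1.0
    W = np.diag(1.0 / col_norms)        # weighting (preconditioning) matrix
    Jw = J @ W                          # balanced Jacobian
    # Solve (Jw^T Jw + lam I) y = Jw^T r, then undo the scaling: dx = W y.
    A = Jw.T @ Jw + lam * np.eye(J.shape[1])
    y = np.linalg.solve(A, Jw.T @ r)
    return W @ y
```

Because the scaled system is far better conditioned when column norms differ by orders of magnitude, iterative inversion typically converges in fewer steps, which is the convergence improvement the abstract reports.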

  1. Urban sewage plant investment costs in the Water Treatment Plan for Catalonia; Costes de inversion en las depuradoras de aguas residuales urbanas en el Plan de Saneamiento de Catalunya

    Energy Technology Data Exchange (ETDEWEB)

    Queralt Torrell, R. [Perito Industrial, miembro de ADECAGUA, Junta de Sanejament, Dpto. Medi Ambient, Generalitat de Catalunya (Spain)

    1997-06-01

    A brief historical overview is provided of the waste water treatment plants built in Catalonia, with special reference to those included in the Water Treatment Plan. The 96 plants constructed between 1991 and 1996 are listed in a table showing the year they came into service, their location, the number of inhabitants served, inhabitant equivalents, daily volume of water, pollution charge and investments. The correlations between different parameters are examined and the causes of the most extreme figures are pointed out. A graph and a function showing the relationship between the daily volume of water to be treated and the capital investment cost of building the plant are also provided. (Author)

  2. Preoperative planning of thoracic surgery with use of three-dimensional reconstruction, rapid prototyping, simulation and virtual navigation.

    Science.gov (United States)

    Heuts, Samuel; Sardari Nia, Peyman; Maessen, Jos G

    2016-01-01

    Over the past decades, surgery has become more complex due to the increasing age of the patient population referred for thoracic surgery, more complex pathology, and the emergence of minimally invasive thoracic surgery. Together with the early detection of thoracic disease as a result of innovations in diagnostic possibilities and the paradigm shift to personalized medicine, preoperative planning is becoming an indispensable and crucial aspect of surgery. Several new techniques facilitating this paradigm shift have emerged. Preoperative marking and staining of lesions are already widely accepted methods of preoperative planning in thoracic surgery. However, three-dimensional (3D) image reconstruction, virtual simulation and rapid prototyping (RP) are still in the development phase. These new techniques are expected to become an important part of the standard work-up of patients undergoing thoracic surgery in the future. This review aims to graphically present and summarize these new diagnostic and therapeutic tools.

  3. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  4. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  5. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  6. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    International Nuclear Information System (INIS)

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-01-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  7. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  8. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  9. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  10. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  11. Production planning and scheduling with material handling using modelling and simulation

    Directory of Open Access Journals (Sweden)

    Krenczyk Damian

    2017-01-01

    Full Text Available Increasing the flexibility of manufacturing requires the implementation of effective IT planning tools. Planning and scheduling problems have traditionally considered production resources as the only constraints. Since raw-material and intermediate transfer times are significant relative to production times, the material handling equipment and its activities must also be modelled. The problem of scheduling subject to manufacturing and logistical constraints, based on a data-driven model generation method, is presented. The KbRS and FlexSim systems are used as scheduling tools.
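The effect of including material-handling times in a schedule can be shown with a minimal list-scheduling sketch: each job visits a fixed route of machines, and travel between stations adds to the start time of the next operation. The data layout and the greedy dispatching are illustrative assumptions, not the KbRS/FlexSim approach itself.

```python
def schedule_with_transfers(jobs, proc_time, transfer_time):
    """Completion times for jobs routed through sequences of machines,
    including inter-machine transfer (material-handling) times.

    jobs:          {job: [machine, machine, ...]} processing routes
    proc_time:     {(job, machine): processing duration}
    transfer_time: {(m1, m2): handling time between machines}
    """
    machine_free = {}   # when each machine next becomes available
    completion = {}
    for job, route in jobs.items():     # jobs dispatched in given order
        t = 0.0
        prev = None
        for m in route:
            if prev is not None:
                t += transfer_time[(prev, m)]   # travel between stations
            start = max(t, machine_free.get(m, 0.0))
            t = start + proc_time[(job, m)]
            machine_free[m] = t
            prev = m
        completion[job] = t
    return completion
```

Dropping the `transfer_time` term reproduces the traditional resource-only model; comparing the two completion-time sets quantifies how much the logistics constraints matter.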

  12. Protocol for quality control of scanners used in the simulation of radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Morales, Jorge l; Alfonso, Rodolfo; Vega, Manuel

    2009-01-01

    Treatment planning of HDR brachytherapy with Ir-192 is performed at the INOR on the basis of semi-orthogonal X-ray images. In the case of mould implants for head and neck lesions, used to boost the external radiation dose, combining the isodose distributions of both modalities yields valuable information. CT imaging of the patient with the applicator mould in place makes it possible to obtain three-dimensional dose distributions in different anatomical views. The aim of this study was to implement post-plan verification of dose distributions and the possibility of combining distributions. (author)

  13. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    Science.gov (United States)

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  14. Reducing Patient Waiting Times for Radiation Therapy and Improving the Treatment Planning Process: a Discrete-event Simulation Model (Radiation Treatment Planning).

    Science.gov (United States)

    Babashov, V; Aivas, I; Begen, M A; Cao, J Q; Rodrigues, G; D'Souza, D; Lock, M; Zaric, G S

    2017-06-01

    We analysed the radiotherapy planning process at the London Regional Cancer Program to determine the bottlenecks and to quantify the effect of specific resource levels with the goal of reducing waiting times. We developed a discrete-event simulation model of a patient's journey from the point of referral to a radiation oncologist to the start of radiotherapy, considering the sequential steps and resources of the treatment planning process. We measured the effect of several resource changes on the ready-to-treat to treatment (RTTT) waiting time and on the percentage treated within a 14 calendar day target. Increasing the number of dosimetrists by one reduced the mean RTTT by 6.55%, leading to 84.92% of patients being treated within the 14 calendar day target. Adding one more oncologist decreased the mean RTTT from 10.83 to 10.55 days, whereas a 15% increase in arriving patients increased the waiting time by 22.53%. The model was relatively robust to the changes in quantity of other resources. Our model identified sensitive and non-sensitive system parameters. A similar approach could be applied by other cancer programmes, using their respective data and individualised adjustments, which may be beneficial in making the most effective use of limited resources. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
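The core mechanism of such a discrete-event model, patients queueing for a limited pool of planning resources, can be sketched as a minimal multi-server queue. The single-stage structure, exponential timing, and parameter values below are illustrative; the study modelled the full multi-step planning process from referral to treatment.

```python
import heapq
import random

def simulate_queue(n_servers, arrival_rate, service_rate, n_patients, seed=0):
    """Minimal discrete-event simulation of an M/M/c-style queue.

    Returns the mean waiting time from arrival until service starts,
    e.g. patients waiting for one of n_servers dosimetrists.
    """
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)   # exponential inter-arrival times
        arrivals.append(t)
    server_free = [0.0] * n_servers          # heap of server-available times
    heapq.heapify(server_free)
    total_wait = 0.0
    for a in arrivals:
        free_at = heapq.heappop(server_free)
        start = max(a, free_at)              # wait if all servers are busy
        total_wait += start - a
        heapq.heappush(server_free, start + rng.expovariate(service_rate))
    return total_wait / n_patients
```

Re-running the simulation with one extra server (or a 15% higher arrival rate) and comparing mean waits mirrors the what-if analyses reported in the abstract.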

  15. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Phillips, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wampler, Cheryl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meisner, Robert [National Nuclear Security Administration (NNSA), Washington, DC (United States)

    2010-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality, and scientific details); to quantify critical margins and uncertainties; and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  16. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive

  17. Dosimetric study of prostate brachytherapy using techniques of Monte-Carlo simulation, experimental measurements and comparison with a treatment plan

    International Nuclear Information System (INIS)

    Teles, Pedro; Barros, Silvia; Vaz, Pedro; Goncalves, Isabel; Facure, Alessandro; Rosa, Luiz da; Santos, Maira; Pereira Junior, Pedro Paulo; Zankl, Maria

    2013-01-01

    Prostate brachytherapy is a radiotherapy technique that consists in inserting a number of radioactive seeds (usually containing the radionuclides 125I, 241Am, or 103Pd) within, or in the vicinity of, prostate tumor tissue. The main objective of this technique is to maximize the radiation dose to the tumor and minimize it in healthy tissues and organs, in order to reduce morbidity. The absorbed dose distribution in the prostate using this technique is usually non-homogeneous and time dependent. Various parameters, such as the type of seed, the attenuation interactions between seeds, their geometrical arrangement within the prostate, the actual geometry of the seeds, and subsequent swelling of the prostate gland after implantation, greatly influence the absorbed dose in the prostate and surrounding areas. Quantification of these parameters is therefore extremely important for dose optimization and for improving conventional treatment plans, which in many cases do not fully take them into account. Monte Carlo techniques allow these parameters to be studied quickly and effectively. In this work, we used the MCNPX program and the generic voxel phantom GOLEM to simulate different geometric arrangements of seeds containing 125I (Amersham Health model 6711) in prostates of different sizes, in order to quantify some of these parameters. The computational model was validated using a cubic RW3-type prostate phantom consisting of tissue-equivalent material, together with thermoluminescent dosimeters. Finally, to provide a comparison with a real treatment plan, we simulated, with exactly the same parameters and our computational model, a treatment plan used in a hospital in Rio de Janeiro. The results obtained in our study seem to indicate that the parameters described above may be a source of uncertainty in the correct evaluation of the dose required for actual treatment plans. The use of Monte Carlo techniques can serve as a complementary
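As a back-of-the-envelope illustration of why seed arrangement matters (this is not the MCNPX model used in the study), the dose rate at a point in a multi-seed implant can be approximated by superposing point-source inverse-square kernels; the seed strength and dose-rate constant below are illustrative values only, and the simplification deliberately omits the very effects the study quantifies:

```python
import numpy as np

def implant_dose_rate(point, seed_positions, sk=0.5, dose_rate_const=0.965):
    """Point-source approximation of the dose rate (cGy/h) at `point`.

    sk              : air-kerma strength per seed in U (illustrative value)
    dose_rate_const : approximate 125I dose-rate constant in cGy/(h*U)
    Deliberately ignores the radial dose function, seed anisotropy, and
    inter-seed attenuation -- the effects the study quantifies.
    """
    r = np.linalg.norm(np.asarray(seed_positions, float) - np.asarray(point, float),
                       axis=1)
    # Inverse-square superposition over all seeds
    return float(np.sum(sk * dose_rate_const / r**2))

# Two seeds placed 1 cm on either side of the evaluation point
seeds = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)]
dose = implant_dose_rate((0.0, 0.0, 0.0), seeds)  # two equal contributions
```

A full treatment-planning calculation would replace the bare inverse-square kernel with the TG-43-style radial dose and anisotropy functions, which is where the attenuation and geometry effects discussed above enter.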

  18. Design, development and clinical validation of computer-aided surgical simulation system for streamlined orthognathic surgical planning.

    Science.gov (United States)

    Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M; Teichgraeber, John F; Gateno, Jaime; Xia, James J

    2017-12-01

    There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. When comparing the results of segmentation, virtual osteotomy, and transformation achieved with AnatomicAligner to those achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with an SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as that of the ones generated by Mimics. We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be freely available to the broader clinical and research communities.

  19. pyGIMLi: An open-source library for modelling and inversion in geophysics

    Science.gov (United States)

    Rücker, Carsten; Günther, Thomas; Wagner, Florian M.

    2017-12-01

    Many tasks in applied geosciences cannot be solved by single measurements, but require the integration of geophysical, geotechnical, and hydrological methods. Numerical simulation techniques are essential both for planning and interpretation and for the process understanding of modern geophysical methods. These trends encourage open, simple, and modern software architectures aiming at a uniform interface for interdisciplinary and flexible modelling and inversion approaches. We present pyGIMLi (Python Library for Inversion and Modelling in Geophysics), an open-source framework that provides tools for modelling and inversion of various geophysical, but also hydrological, methods. The modelling component supplies discretization management and the numerical basis for finite-element and finite-volume solvers in 1D, 2D, and 3D on arbitrarily structured meshes. The generalized inversion framework solves the minimization problem with a Gauss-Newton algorithm for any physical forward operator and provides opportunities for uncertainty and resolution analyses. More general requirements, such as flexible regularization strategies, time-lapse processing, and different ways of coupling individual methods, are provided independently of the actual methods used. The usage of pyGIMLi is first demonstrated by solving the steady-state heat equation, followed by a demonstration of more complex capabilities for the combination of different geophysical data sets. A fully coupled hydrogeophysical inversion of electrical resistivity tomography (ERT) data from a simulated tracer experiment is presented, which allows the underlying hydraulic conductivity distribution of the aquifer to be reconstructed directly. Another example demonstrates the improvement gained by jointly inverting ERT and ultrasonic data with respect to saturation, using a new approach that incorporates petrophysical relations in the inversion. Potential applications of the presented framework are manifold and include time
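The damped Gauss-Newton iteration at the core of such inversion frameworks can be sketched generically in NumPy (an illustrative implementation under a simple Tikhonov damping assumption, not pyGIMLi's actual API):

```python
import numpy as np

def gauss_newton(forward, m0, d_obs, lam=1e-3, n_iter=50):
    """Generic damped Gauss-Newton inversion: min ||d_obs - forward(m)||^2.

    forward : callable mapping a model vector m to a predicted data vector
    m0      : starting model
    lam     : Tikhonov-style damping weight (stabilizes the normal equations)
    """
    m = np.array(m0, dtype=float)
    for _ in range(n_iter):
        d_pred = forward(m)
        r = d_obs - d_pred                       # data residual
        # Finite-difference Jacobian J[i, j] = d f_i / d m_j
        eps = 1e-6
        J = np.empty((d_obs.size, m.size))
        for j in range(m.size):
            dm = np.zeros_like(m)
            dm[j] = eps
            J[:, j] = (forward(m + dm) - d_pred) / eps
        # Damped normal equations: (J^T J + lam I) delta = J^T r
        delta = np.linalg.solve(J.T @ J + lam * np.eye(m.size), J.T @ r)
        m += delta
        if np.linalg.norm(delta) < 1e-10:        # converged
            break
    return m

# Toy forward model: y = a * exp(-b * x); recover (a, b) from noise-free data
x = np.linspace(0.0, 1.0, 50)
f = lambda m: m[0] * np.exp(-m[1] * x)
m_est = gauss_newton(f, [1.0, 1.0], f(np.array([2.0, 3.0])))
```

A production framework like the one described additionally supports flexible regularization operators, line searches, and resolution analysis; the sketch only shows the core update step.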

  20. SIMULATION OF CARS ACCUMULATION PROCESSES FOR SOLVING TASKS OF OPERATIONAL PLANNING IN CONDITIONS OF INITIAL INFORMATION UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    О. A. Tereshchenko

    2017-06-01

    Full Text Available Purpose. The article develops a methodological basis for simulating car accumulation processes when solving operational planning problems under conditions of initial information uncertainty, in order to assess the sustainability of the adopted planning scenario and to calculate the associated technological risks. Methodology. The solution of the problem under investigation is based on general scientific approaches, the apparatus of probability theory, and the theory of fuzzy sets. To achieve this purpose, the factors influencing the entropy of operational plans are systematized. It is established that when planning the operational work of railway stations, sections, and nodes, the most significant factors causing uncertainty in the initial information are: (a) conditions external to the railway facility in question, expressed as uncertainty in the timing of car arrivals; and (b) external, hard-to-identify goals of other participants in the logistics chain (primarily customers), expressed as uncertainty in the completion time of freight operations with cars. It is suggested that these factors be taken into account in automated planning through statistical analysis, i.e., the establishment and study of the residual time (prediction errors). As a result, analytical dependencies are proposed for rational representation of the probability density functions of the residual time distribution in the form of point, piecewise-defined, and continuous analytic models. The developed models of car accumulation, whose application depends on the identified states of the predicted incoming car flow to the accumulation system, are then presented. The last proposed model is a general case of models of accumulation processes with an arbitrary level of reliability of the initial information for any structure of the incoming flow of cars. In conclusion, a technique for estimating the results of

  1. The Comparison of Layout Arrangements for the Material Flow Ordering Planning in Production Systems through Simulation Analysis

    Directory of Open Access Journals (Sweden)

    Mehmet AKSARAYLI

    2009-02-01

    Full Text Available Enterprises today have to plan facility layouts appropriately to decrease product cost and increase productivity. The aim of this study is to use simulation to compare the basic layout types used in arranging facility interiors. The study focuses in particular on material handling times between machines and on the share of these times in total production time. First, a new production system is designed for the basic layout types; then, within the designed production system, the machines are arranged according to each machine layout type. The machine layout types are transferred to the PROMODEL simulation software. Finally, based on the results of the analysis, the material handling times of the different machine layout types and their share of total production time are compared, and the results obtained from the analysis are presented and discussed.

  2. The application of dynamic micro-simulation model of urban planning based on multi-agent system

    Science.gov (United States)

    Xu, J.; Shiming, W.

    2012-12-01

    The dynamic micro-simulation model of urban planning based on multi-agent systems is mainly used to measure and predict the impact of policy on urban land use, employment opportunities, and real estate prices. The model represents the supply and characteristics of land and of real estate development at a spatial scale, and uses real estate markets as a central organizing focus, with consumer choices and supplier choices explicitly represented, as well as the resulting effects on real estate prices. Tying agents to real estate at specific locations provides a clean accounting of space and its use. Finally, the model produces a map compositing the dynamic demographic distribution and the dynamic employment transfer with geographic spatial data. With the data produced by the urban micro-simulation model, it can provide a useful forecast reference for scientific urban land use.

  3. Using ProModel as a simulation tools to assist plant layout design and planning: Case study plastic packaging factory

    Directory of Open Access Journals (Sweden)

    Pochamarn Tearwattanarattikal

    2008-01-01

    Full Text Available This study concerns the application of a simulation model to assist decision making on capacity expansion and plant layout design and planning. The plant layout design concept is applied first to create the physical layouts; the simulation model is then used to test the plant's capability to meet various demand forecast scenarios. The study employed the ProModel package as a tool, using the model to compare performance in terms of % utilization, characteristics of WIP, and ability to meet due dates. The verification and validation stages were performed before running the scenarios. The model runs daily production, and the capacity-constrained resources are then identified by % utilization. The capacity expansion policy can be extra shift-working hours or an increased number of machines. After capacity expansion solutions are found, the physical layout is selected based on the criteria of space available for WIP and easy material flow.

  4. Inverse Smith-Purcell effect in the submillimeter wave region. Theoretical analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Jongsuck; Furuya, Kazuyuki; Shirai, Hirokazu; Nozokido, Tatsuo; Mizuno, Koji

    1988-03-01

    The inverse Smith-Purcell effect is theoretically investigated in the submillimeter wave region for planning experiments. The effect, in which coherent light waves interact resonantly with charged particles through a diffraction grating, is one of the candidates for laser-driven linacs. The optimum grating dimensions with a rectangular groove profile are designed by analyzing the fields just in front of the grating surface. The energy spreads of the electrons resulting from interactions with the laser field were evaluated by computer simulations. The simulations show that a laser power of 1 W can produce a ∼30 eV increase in the electron energy spectrum.

  5. The Decision to Emigrate: A Simulation Model Based on the Theory of Planned Behaviour

    NARCIS (Netherlands)

    Willekens, F.J.; Grow, A.; Van Bavel, J.

    2016-01-01

    The theory of planned behaviour (TPB) is one of the most widely used theories of behaviour. It was developed by Ajzen as an extension of Fishbein’s theory of reasoned action (Fishbein and Ajzen, Predicting and changing behaviour. Psychology Press, New York, 2010). The theory states that intentions

  6. A discrete event simulation design for block-based maintenance planning under random machine usage

    NARCIS (Netherlands)

    de Jonge, Bram

    2015-01-01

    Existing research on block-based preventive maintenance planning generally assumes that machines are either used continuously, or that times until failure do not depend on the actual usage of machines. In practice, however, it is often more realistic to assume that machines are not used continuously

  7. Preparatory co-activation of the ankle muscles may prevent ankle inversion injuries

    Science.gov (United States)

    DeMers, Matthew S.; Hicks, Jennifer L.; Delp, Scott L.

    2018-01-01

    Ankle inversion sprains are the most frequent acute musculoskeletal injuries occurring in physical activity. Interventions that retrain muscle coordination have helped rehabilitate injured ankles, but it is unclear which muscle coordination strategies, if any, can prevent ankle sprains. The purpose of this study was to determine whether coordinated activity of the ankle muscles could prevent excessive ankle inversion during a simulated landing on a 30-degree incline. We used a set of musculoskeletal simulations to evaluate the efficacy of two strategies for coordinating the ankle evertor and invertor muscles during simulated landing scenarios: planned co-activation and stretch reflex activation with physiologic latency (60-millisecond delay). A full-body musculoskeletal model of landing was used to generate simulations of a subject dropping onto an inclined surface with each coordination condition. Within each condition, the intensity of evertor and invertor co-activity or stretch reflexes were varied systematically. The simulations revealed that strong preparatory co-activation of the ankle evertors and invertors prior to ground contact prevented ankle inversion from exceeding injury thresholds by rapidly generating eversion moments after initial contact. Conversely, stretch reflexes were too slow to generate eversion moments before the simulations reached the threshold for inversion injury. These results suggest that training interventions to protect the ankle should focus on stiffening the ankle with muscle co-activation prior to landing. The musculoskeletal models, controllers, software, and simulation results are freely available online at http://simtk.org/home/ankle-sprains, enabling others to reproduce the results and explore new injury scenarios and interventions. PMID:28057351

  8. Treatment plan evaluation for interstitial photodynamic therapy in a mouse model by Monte Carlo simulation with FullMonte

    Directory of Open Access Journals (Sweden)

    Jeffrey eCassidy

    2015-02-01

    Full Text Available Monte Carlo (MC) simulation is recognized as the gold standard for biophotonic simulation, capturing all relevant physics and material properties at the perceived cost of high computing demands. Tetrahedral-mesh-based MC simulations are particularly attractive due to the ability to refine the mesh at will to conform to complicated geometries or user-defined resolution requirements. Since no approximations of material or light-source properties are required, MC methods are applicable to the broadest set of biophotonic simulation problems. MC methods also have other attractive implementation features, including inherent parallelism, and permit a continuously variable quality-runtime tradeoff. We demonstrate here a complete MC-based prospective fluence dose evaluation system for interstitial PDT that generates dose-volume histograms on a tetrahedral-mesh geometry description. To our knowledge, this is the first such system for general interstitial photodynamic therapy employing MC methods, and it is therefore applicable to a very broad cross-section of anatomies and material properties. We demonstrate that evaluation of dose-volume histograms is an effective variance-reduction scheme in its own right, greatly reducing the number of packets and hence the runtime required to achieve acceptable result confidence. We conclude that MC methods are feasible for general PDT treatment evaluation and planning, and considerably less costly than widely believed.
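On a tetrahedral mesh, a cumulative dose-volume histogram reduces to a volume-weighted survival function of the per-element dose. A minimal sketch (hypothetical per-element data, not FullMonte's API):

```python
import numpy as np

def dose_volume_histogram(dose, volume, dose_levels):
    """Cumulative DVH: fraction of total volume receiving >= each dose level.

    dose        : per-element (e.g. per-tetrahedron) dose values
    volume      : per-element volumes (same length as dose)
    dose_levels : dose thresholds at which to evaluate the DVH
    """
    dose = np.asarray(dose, float)
    volume = np.asarray(volume, float)
    total = volume.sum()
    # For each threshold, sum the volume of elements at or above it
    return np.array([volume[dose >= d].sum() / total for d in dose_levels])

# Toy example: four equal-volume elements with doses 1..4
dvh = dose_volume_histogram([1.0, 2.0, 3.0, 4.0], [1, 1, 1, 1], [0.0, 2.5, 5.0])
# Volume fractions 1.0, 0.5, 0.0 at thresholds 0, 2.5, 5
```

Because each simulated packet only needs to update per-element dose tallies that are then binned once, the DVH itself requires far fewer packets to stabilize than a pointwise fluence map, which is the variance-reduction observation the abstract makes.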

  9. Integration of scheduling and discrete event simulation systems to improve production flow planning

    Science.gov (United States)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and of computer-aided technologies such as MRP I/II, ERP, and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. Integration of production scheduling with computer modelling, simulation, and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed for eliminating problems associated with model complexity and with the labour-intensive and time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied. The approach is illustrated with examples of practical implementation using the KbRS scheduling system and the Enterprise Dynamics simulation system.

  10. Clinical trial optimization: Monte Carlo simulation Markov model for planning clinical trials recruitment.

    Science.gov (United States)

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2007-05-01

    The patient recruitment process of clinical trials is an essential element which needs to be designed properly. In this paper we describe different simulation models under continuous and discrete time assumptions for the design of recruitment in clinical trials. The results of hypothetical examples of clinical trial recruitments are presented. The recruitment time is calculated and the number of recruited patients is quantified for a given time and probability of recruitment. The expected delay and the effective recruitment durations are estimated using both continuous and discrete time modeling. The proposed type of Monte Carlo simulation Markov models will enable optimization of the recruitment process and the estimation and the calibration of its parameters to aid the proposed clinical trials. A continuous time simulation may minimize the duration of the recruitment and, consequently, the total duration of the trial.
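A minimal Monte Carlo sketch of the recruitment-duration question, under the common simplifying assumption (not necessarily the authors' exact model) that each centre recruits as an independent Poisson process:

```python
import numpy as np

rng = np.random.default_rng(42)

def recruitment_time(n_target, n_centres, rate_per_centre, n_sims=2000):
    """Monte Carlo estimate of the time needed to recruit n_target patients.

    Each centre is assumed to recruit as an independent Poisson process, so
    the pooled inter-arrival gaps are exponential with the summed rate.
    Returns (mean completion time, 95th-percentile completion time).
    """
    pooled_rate = n_centres * rate_per_centre          # patients per time unit
    # Completion time = sum of n_target exponential inter-arrival gaps
    waits = rng.exponential(1.0 / pooled_rate,
                            size=(n_sims, n_target)).sum(axis=1)
    return waits.mean(), np.percentile(waits, 95)

mean_t, t95 = recruitment_time(n_target=100, n_centres=10, rate_per_centre=0.5)
# Analytic mean is 100 / (10 * 0.5) = 20 time units; t95 quantifies delay risk
```

The gap between the mean and the upper percentile is exactly the "expected delay" quantity such models let planners budget for; richer versions add centre start-up lags, dropout, and time-varying rates.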

  11. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  12. Inverse Kinematics using Quaternions

    DEFF Research Database (Denmark)

    Henriksen, Knud; Erleben, Kenny; Engell-Nørregård, Morten

    In this project I describe the status of inverse kinematics research, with the focus firmly on the methods that solve the core problem. An overview of the different methods is presented. Three common methods used in inverse kinematics computation have been chosen as subjects for closer inspection.

  13. Inverse logarithmic potential problem

    CERN Document Server

    Cherednichenko, V G

    1996-01-01

    The Inverse and Ill-Posed Problems Series is a series of monographs publishing postgraduate level information on inverse and ill-posed problems for an international readership of professional scientists and researchers. The series aims to publish works which involve both theory and applications in, e.g., physics, medicine, geophysics, acoustics, electrodynamics, tomography, and ecology.

  14. Transforming network simulation data to semantic data for network attack planning

    CSIR Research Space (South Africa)

    Chan, Ke Fai Peter

    2017-03-01

    Full Text Available This research paper investigates a technique to transform network simulation data into linked data through the use of ontology models. Transforming the data allows one to use semantic reasoners to infer additional insight. A case...

  15. A Recreational Visitor Travel Simulation Model as an Aid to Management Planning

    Science.gov (United States)

    Lucas, Robert C.; Shechter, Mordechai

    1977-01-01

    The article describes the use of a simulation for outdoor recreation management which is applicable for any type of dispersed recreation area where visitor flows are of concern, where there are capacity constraints, where visitor encounters are significant, and where it is desired to allow visitors substantial freedom to move about flexibly. (MJB)

  16. Simulation of recreational use in backcountry settings: an aid to management planning

    Science.gov (United States)

    David N. Cole

    2002-01-01

    Simulation models of recreation use patterns can be a valuable tool to managers of backcountry areas, such as wilderness areas and national parks. They can help fine-tune existing management programs, particularly in places that ration recreation use or that require the use of designated campsites. They can assist managers in evaluating the likely effects of increasing...

  17. Linking population viability, habitat suitability, and landscape simulation models for conservation planning

    Science.gov (United States)

    Michael A. Larson; Frank R., III Thompson; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley

    2004-01-01

    Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...

  18. Modeling, Simulation, and Analysis for State and Local Emergency Planning and Response. Operational Requirements Document

    Science.gov (United States)

    2009-01-01

    ...quickly can users teach themselves to use the model and its outputs? Efficient to use. Once users have learned to use the model, how fast can they... The following are examples: Common Alerting Protocol, Emergency Data Exchange Language Resource Messaging, Hospital Availability Exchange. ...videoconferencing or other technology, implementers need to know what they must consider when planning and implementing the solution. Guidance in the

  19. Comprehensive MRI simulation methodology using a dedicated MRI scanner in radiation oncology for external beam radiation treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Eric S., E-mail: epaulson@mcw.edu [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 and Department of Radiology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 (United States); Erickson, Beth; Schultz, Chris; Allen Li, X. [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 (United States)

    2015-01-15

    Purpose: The use of magnetic resonance imaging (MRI) in radiation oncology is expanding rapidly, and more clinics are integrating MRI into their radiation therapy workflows. However, radiation therapy presents a new set of challenges and places additional constraints on MRI compared to diagnostic radiology that, if not properly addressed, can undermine the advantages MRI offers for radiation treatment planning (RTP). The authors introduce here strategies to manage several challenges of using MRI for virtual simulation in external beam RTP. Methods: A total of 810 clinical MRI simulation exams were performed using a dedicated MRI scanner for external beam RTP of brain, breast, cervix, head and neck, liver, pancreas, prostate, and sarcoma cancers. Patients were imaged in treatment position using MRI-optimal immobilization devices. Radiofrequency (RF) coil configurations and scan protocols were optimized based on RTP constraints. Off-resonance and gradient nonlinearity-induced geometric distortions were minimized or corrected prior to using images for RTP. A multidisciplinary MRI simulation guide, along with window width and level presets, was created to standardize use of MR images during RTP. A quality assurance program was implemented to maintain accuracy and repeatability of MRI simulation exams. Results: The combination of a large bore scanner, high field strength, and circumferentially wrapped, flexible phased array RF receive coils permitted acquisition of thin slice images with high contrast-to-noise ratio (CNR) and image intensity uniformity, while simultaneously accommodating patient setup and immobilization devices. Postprocessing corrections and alternative acquisition methods were required to reduce or correct off-resonance and gradient nonlinearity induced geometric distortions. Conclusions: The methodology described herein contains practical strategies the authors have implemented through lessons learned performing clinical MRI simulation exams. In

  20. Temperature simulations in hyperthermia treatment planning of the head and neck region. Rigorous optimization of tissue properties

    International Nuclear Information System (INIS)

    Verhaart, Rene F.; Rijnen, Zef; Verduijn, Gerda M.; Paulides, Margarethus M.; Fortunati, Valerio; Walsum, Theo van; Veenland, Jifke F.

    2014-01-01

    Hyperthermia treatment planning (HTP) is used in the head and neck (H&N) region for pretreatment optimization, decision making, and real-time HTP-guided adaptive application of hyperthermia. In current clinical practice, HTP is based on power-absorption predictions, but thermal dose-effect relationships advocate its extension to temperature predictions. Exploitation of temperature simulations requires region- and temperature-specific thermal tissue properties due to the strong thermoregulatory response of H&N tissues. The purpose of our work was to develop a technique for patient-group-specific optimization of thermal tissue properties based on invasively measured temperatures, and to evaluate the accuracy achievable. Data from 17 treated patients were used to optimize the perfusion and thermal conductivity values for the Pennes bioheat-equation-based thermal model. A leave-one-out approach was applied to accurately assess the difference between measured and simulated temperature (ΔT). The improvement in ΔT for optimized thermal property values was assessed by comparison with the ΔT for values from the literature, i.e., at baseline and under thermal stress. The optimized perfusion and conductivity values of tumor, muscle, and fat led to an improvement in simulation accuracy (ΔT: 2.1 ± 1.2 °C) compared with the accuracy for baseline (ΔT: 12.7 ± 11.1 °C) or thermal stress (ΔT: 4.4 ± 3.5 °C) property values. The presented technique leads to patient-group-specific temperature property values that effectively improve simulation accuracy for the challenging H&N region, thereby making simulations an elegant addition to invasive measurements. The rigorous leave-one-out assessment indicates that improvements in accuracy are required to rely only on temperature-based HTP in the clinic. (orig.)
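For reference, the Pennes bioheat equation underlying such thermal models balances heat conduction, blood perfusion, and deposited power; in a common form (notation and units vary across the literature):

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot (k \nabla T)
  + \omega_b c_b \,(T_a - T)
  + Q
```

where ρ and c are the tissue density and specific heat, k the thermal conductivity, ω_b the volumetric blood perfusion rate, c_b the specific heat of blood, T_a the arterial blood temperature, and Q the absorbed (applied plus metabolic) power density. The perfusion term ω_b c_b (T_a − T) is where the patient-group-specific perfusion values optimized in this study enter the model.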

  1. A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Grevillot, L; Freud, N; Sarrut, D [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Universite Lyon 1, Centre Leon Berard, Lyon (France); Bertrand, D; Dessy, F, E-mail: loic.grevillot@creatis.insa-lyon.fr [IBA, B-1348, Louvain-la Neuve (Belgium)

    2011-08-21

    This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
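The 2%/2 mm gamma-index comparison used for the 2D test pattern can be illustrated with a minimal 1D implementation (the dose profiles below are hypothetical, not the measured IBA data):

```python
import numpy as np

def gamma_pass_rate(ref, evl, positions, dose_tol=0.02, dist_tol=2.0):
    """1D global gamma analysis: for each reference point, take the minimum
    combined dose-difference / distance-to-agreement metric over all
    evaluated points; a point passes when that minimum is <= 1."""
    ref_max = ref.max()
    gammas = []
    for x_r, d_r in zip(positions, ref):
        g = np.sqrt(((positions - x_r) / dist_tol) ** 2 +
                    ((evl - d_r) / (dose_tol * ref_max)) ** 2)
        gammas.append(g.min())
    return float(np.mean(np.array(gammas) <= 1.0))

# Hypothetical 1D dose profiles sampled every 1 mm.
x = np.arange(0.0, 50.0, 1.0)
reference = np.exp(-((x - 25.0) / 10.0) ** 2)
evaluated = 1.01 * reference          # a 1% global dose offset
rate = gamma_pass_rate(reference, evaluated, x)
```

Because the 1% offset is within the 2% dose tolerance, every point passes; a 97% pass rate, as reported above, means 3% of points exceed the combined criterion.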

  3. Walking Planning Based on Artificial Vector Field with Prediction Simulation for Biped Robot

    Science.gov (United States)

    Yamaguchi, Takashi; Shibata, Masaaki

This paper proposes a method of gait trajectory generation with an artificial vector field for stable walking of a biped robot. During walking, the tip of the robot can deviate from the desired trajectory because of disturbances from unexpected external factors. In our approach, no trajectory is prepared a priori; instead, the tip follows the artificial vectors designed in the workspace. Moreover, a prediction simulation is performed on-line. The simulator judges stability by comparing the present state with the prediction results, and the gait parameters are then adaptively adjusted in feedforward for a stable walk. Numerical and physical experimental results show the validity of the proposed method for continuous walking.
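A minimal sketch of vector-field-guided motion in the spirit of the approach; the field, the obstacle and the gains below are invented for illustration and are not the authors' gait generator:

```python
import numpy as np

def field(p, goal):
    """Artificial vector field: attraction toward the goal plus a
    short-range repulsion from an obstacle at the origin."""
    to_goal = goal - p
    d = np.linalg.norm(p)
    repel = p / d**3 if d < 1.0 else np.zeros(2)
    return to_goal + 0.5 * repel

def follow(start, goal, step=0.05, max_iters=500):
    """Integrate the field with fixed-size steps: no trajectory is
    specified a priori, the tip simply follows the local vector."""
    p = np.array(start, dtype=float)
    g = np.array(goal, dtype=float)
    path = [p.copy()]
    for _ in range(max_iters):
        v = field(p, g)
        p = p + step * v / np.linalg.norm(v)
        path.append(p.copy())
        if np.linalg.norm(g - p) < 0.1:
            break
    return np.array(path)

path = follow(start=(-2.0, 0.5), goal=(2.0, 0.5))
```

A disturbance that pushes the tip off course is self-correcting here: wherever the tip ends up, the local vector still points it back toward the goal.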

  4. Hydrodynamic simulations of integrated experiments planned for OMEGA/OMEGA EP laser systems

    International Nuclear Information System (INIS)

    Delettrez, J. A.; Myatt, J.; Radha, P. B.; Stoeckl, C.; Meyerhofer, D. D.

    2005-01-01

Integrated fast-ignition experiments for the combined OMEGA/OMEGA EP laser systems have been simulated with the multidimensional hydrodynamic code DRACO. In the simplified electron transport model included in DRACO, the electrons are introduced at the pole of a 2-D simulation and transported in a straight line toward the target core, depositing their energy according to a recently published slowing-down formula [1]. Simulations, including alpha transport, of an OMEGA cryogenic target designed to reach a 1-D fuel ρR of 500 mg/cm2 have been carried out for 1-D (clean) and more realistic 2-D (with nonuniformities) implosions to assess the sensitivity to energy, timing, and irradiance of the Gaussian fast-ignitor beam. The OMEGA laser system provides up to 30 kJ of compression energy, and OMEGA EP will provide two short-pulse beams, each with energies up to 2.6 kJ. For the 1-D case, the neutron yield is predicted to be in excess of 1015 (compared to 1014 with no ignitor beam) over a timing range of about 80 ps. This talk will present these results and new 2-D simulation results that include the effects of realistic cryogenic target perturbations on the compressed core. This work was supported by the U.S. Department of Energy Office of Inertial Confinement Fusion under Cooperative Agreement No. DE-FC52-92SF19460, the University of Rochester, and the New York State Energy Research and Development Authority. The support of DOE does not constitute an endorsement by DOE of the views expressed in this article. (Author)

  5. Simulation-Based Planning of Optimal Conditions for Industrial Computed Tomography

    DEFF Research Database (Denmark)

    Reisinger, S.; Kasperl, S.; Franz, M.

    2011-01-01

We present a method to optimise conditions for industrial computed tomography (CT). This optimisation is based on a deterministic simulation. Our algorithm finds task-specific CT equipment settings to achieve optimal exposure parameters by means of an STL model of the specimen and a ray-tracing method. These parameters are the positioning and orientation of the specimen, the X-ray tube voltage and the prefilter thickness.

  6. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
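The "stabilized linear inverse theory" step can be illustrated with a Tikhonov-damped least-squares solve; the kernel and data below are hypothetical toys, not the SEARCH/INVERT codes:

```python
import numpy as np

def stabilized_inversion(G, d, alpha):
    """Tikhonov-damped least squares: minimize ||G m - d||^2 + alpha^2 ||m||^2.
    The damping term stabilizes the otherwise ill-conditioned inversion."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha**2 * np.eye(n), G.T @ d)

# Hypothetical linearized kernel: each gravity datum is a smoothed average
# of the subsurface density contrast, which makes the problem ill-posed.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
G = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
m_true = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # buried anomaly
d = G @ m_true + 0.01 * rng.standard_normal(len(x))  # noisy Bouguer-like data
m_est = stabilized_inversion(G, d, alpha=0.05)
```

Without the damping term the normal equations amplify the data noise; with it, the recovered anomaly stays centered where the true one is.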

  7. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Kimberlyn C. Mousseau

    2011-10-01

    The Nuclear Energy Computational Fluid Dynamics Advanced Modeling and Simulation (NE-CAMS) system is being developed at the Idaho National Laboratory (INL) in collaboration with Bettis Laboratory, Sandia National Laboratory (SNL), Argonne National Laboratory (ANL), Utah State University (USU), and other interested parties with the objective of developing and implementing a comprehensive and readily accessible data and information management system for computational fluid dynamics (CFD) verification and validation (V&V) in support of nuclear energy systems design and safety analysis. The two key objectives of the NE-CAMS effort are to identify, collect, assess, store and maintain high resolution and high quality experimental data and related expert knowledge (metadata) for use in CFD V&V assessments specific to the nuclear energy field and to establish a working relationship with the U.S. Nuclear Regulatory Commission (NRC) to develop a CFD V&V database, including benchmark cases, that addresses and supports the associated NRC regulations and policies on the use of CFD analysis. In particular, the NE-CAMS system will support the Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program, which aims to develop and deploy advanced modeling and simulation methods and computational tools for reliable numerical simulation of nuclear reactor systems for design and safety analysis. Primary NE-CAMS Elements There are four primary elements of the NE-CAMS knowledge base designed to support computer modeling and simulation in the nuclear energy arena as listed below. Element 1. The database will contain experimental data that can be used for CFD validation that is relevant to nuclear reactor and plant processes, particularly those important to the nuclear industry and the NRC. Element 2. Qualification standards for data evaluation and classification will be incorporated and applied such that validation data sets will result in well

  8. Simulation of electricity demand in a remote island for optimal planning of a hybrid renewable energy system

    Science.gov (United States)

    Koskinas, Aristotelis; Zacharopoulou, Eleni; Pouliasis, George; Engonopoulos, Ioannis; Mavroyeoryos, Konstantinos; Deligiannis, Ilias; Karakatsanis, Georgios; Dimitriadis, Panayiotis; Iliopoulou, Theano; Koutsoyiannis, Demetris; Tyralis, Hristos

    2017-04-01

We simulate the electrical energy demand on the remote island of Astypalaia. To this end, we first obtain information on the local socioeconomic conditions and energy demand. Second, the available hourly demand data are analysed at various time scales (hourly, weekly, daily, seasonal). The cross-correlations between the electrical energy demand and the mean daily temperature, as well as other climatic variables for the same period, are computed. We also investigate the cross-correlations between those climatic variables and other variables related to renewable energy resources, drawn from numerous observations around the globe, in order to assess the impact of each on a hybrid renewable energy system. An exploratory data analysis including all variables is performed with the purpose of finding hidden relationships. Finally, the demand is simulated considering all the periodicities found in the analysis. The simulated time series will be used in the development of a framework for planning a hybrid renewable energy system on Astypalaia. Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
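The cross-correlation analysis between demand and a climatic variable can be sketched as follows; the series are synthetic toys with a built-in two-day lag, not the Astypalaia data:

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Normalized cross-correlation for lags 0..max_lag, where lag k
    compares x shifted k steps later against y (y 'leads' x by k)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return [float(np.mean(x[k:] * y[:n - k])) for k in range(max_lag + 1)]

# Toy daily series: demand follows temperature with a 2-day lag; the
# temperature here is pure day-to-day variability, purely illustrative.
rng = np.random.default_rng(1)
temperature = 20.0 + 3.0 * rng.standard_normal(365)
demand = np.empty(365)
demand[2:] = 50.0 + 1.5 * temperature[:-2]
demand[:2] = demand[2]
demand += 0.5 * rng.standard_normal(365)
ccf = cross_correlation(demand, temperature, max_lag=5)
best_lag = int(np.argmax(ccf))
```

The lag at which the correlation peaks indicates how quickly demand responds to the climatic driver, which is exactly what a planner needs when sizing a hybrid system.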

  9. Neural Network Learning as an Inverse Problem

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2005-01-01

    Roč. 13, č. 5 (2005), s. 551-559 ISSN 1367-0751 R&D Projects: GA AV ČR 1ET100300517 Institutional research plan: CEZ:AV0Z10300504 Keywords : learning from data * generalization * empirical error functional * inverse problem * evaluation operator * kernel methods Subject RIV: BA - General Mathematics Impact factor: 0.382, year: 2005

  10. Inversion of e-simple Block Matrices

    Czech Academy of Sciences Publication Activity Database

    Fiedler, Miroslav

    2005-01-01

    Roč. 400, - (2005), s. 231-241 ISSN 0024-3795 R&D Projects: GA AV ČR IAA1030302 Institutional research plan: CEZ:AV0Z10300504 Keywords : matrix inversion * M-matrix * unipathic graph * e-simple graph Subject RIV: BA - General Mathematics Impact factor: 0.590, year: 2005

  11. Modelling "reality" in tectonics: Simulation of the mechanical evolution of the Jura Mountains-Molasse Basin system, and routes to forward-inverse modelling of fold thrust belts.

    Science.gov (United States)

    Hindle, David; Kley, Jonas

    2016-04-01

The ultimate validation of any numerical model of any geological process comes when it can accurately forward model a case study from the geological record. However, as the example of the Jura-Molasse fold thrust belt demonstrates, geological information on even the most basic aspects of the present-day state of such systems is highly incomplete and usually known only with large uncertainties. Fold thrust belts are studied and understood by geologists in an iterative process of constructing their subsurface geometries and structures (folds, faults, bedding, etc.) based on limited subsurface information from boreholes, tunnels or seismic data where available, and on surface information on outcrops of different layers and their dips. These data are usually processed through geometric models which involve conservation of the line length of different beds over an entire cross section. Constructing such sections is the art of cross-section balancing. A balanced cross section can easily be restored to its pre-deformation state, assuming (usually) originally horizontal bedding, to remove the effects of folding and faulting. Such a pre-deformation state can then form an initial condition for a forward mechanical model of the section. A mechanical model introduces new parameters into the system, such as rock elasticity, cohesion, and frictional properties. However, a forward mechanical model can also potentially show the continuous evolution of a fold thrust belt, including dynamic quantities like stress. Moreover, a forward mechanical model, if correct in most aspects, should match, in its final state, the present-day geological cross section it is simulating. However, when attempting to achieve a match between geometric and mechanical models, it becomes clear that many more aspects of the geodynamic history of a fold thrust belt have to be taken into account. Erosion of the uppermost layers of an evolving thrust belt is the most obvious one of these. This can potentially
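Line-length conservation, the core of the cross-section balancing mentioned above, is easy to state in code (the fold geometry below is hypothetical):

```python
import math

def line_length(points):
    """Length of a digitized bed trace (polyline) in a cross section."""
    return sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))

# Hypothetical folded bed: one gentle fold across a 10 km wide section.
deformed = [(x / 10.0, 0.3 * math.sin(2 * math.pi * x / 100.0))
            for x in range(101)]

bed_length = line_length(deformed)
section_width = deformed[-1][0] - deformed[0][0]
shortening = bed_length - section_width   # length consumed by folding
# Restoring the bed to horizontal conserves its line length exactly:
restored = [(0.0, 0.0), (bed_length, 0.0)]
```

A section balances when every bed restores to the same length; the excess of bed length over section width is the tectonic shortening the forward model must reproduce.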

  12. Improving D and D Planning and Waste Management with Cutting and Packaging Simulation

    International Nuclear Information System (INIS)

    Richard H. Meservey; Jean-Louis Bouchet

    2005-01-01

The increased amount of decontamination and decommissioning (D and D) being performed throughout the world not only strains nuclear cleanup budgets, but also places severe demands on the capacities of nuclear waste disposal sites. Although budgets and waste disposal sites have been able to accommodate the demand thus far, the increasing number of large facilities being decommissioned will have major impacts on the waste disposal process. It is thus imperative that new and innovative technologies be applied within the D and D industry to reduce costs and waste disposal requirements for the decommissioning of our inventory of large and aging nuclear facilities. One of the most significant problems reactor owners deal with is the accurate determination of the types and volumes of wastes that will be generated during decommissioning of their facilities. Waste disposal costs, restrictions, and transportation issues can account for as much as 30% of the total cost to decommission a facility, so it is very important to have accurate waste volume estimates. The use of simulation technologies to estimate and reduce decommissioning waste volumes provides a new way to manage the risks associated with this work. Simulation improves the process by allowing facility owners to obtain accurate estimates of the types and amounts of waste before the actual D and D work starts. This reduces risk by permitting earlier and better negotiations with the disposal sites and more time to resolve transportation issues. While simulation is a tool to be used by the D and D contractors, its real value lies in reducing risks and costs to the reactor owners.

  13. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson; Kimberlyn C. Mousseau; Hyung Lee

    2011-09-01

The NE-KAMS knowledge base will assist computational analysts, physics model developers, experimentalists, nuclear reactor designers, and federal regulators by: (1) establishing accepted standards, requirements and best practices for V&V and UQ of computational models and simulations; (2) establishing accepted standards and procedures for qualifying and classifying experimental and numerical benchmark data; (3) providing readily accessible databases for nuclear energy related experimental and numerical benchmark data that can be used in V&V assessments and computational methods development; (4) providing a searchable knowledge base of information, documents and data on V&V and UQ; and (5) providing web-enabled applications, tools and utilities for V&V and UQ activities, data assessment and processing, and information and data searches. From its inception, NE-KAMS will directly support nuclear energy research, development and demonstration programs within the U.S. Department of Energy (DOE), including the Consortium for Advanced Simulation of Light Water Reactors (CASL), the Nuclear Energy Advanced Modeling and Simulation (NEAMS), the Light Water Reactor Sustainability (LWRS), the Small Modular Reactors (SMR), and the Next Generation Nuclear Power Plant (NGNP) programs. These programs all involve computational modeling and simulation (M&S) of nuclear reactor systems, components and processes, and it is envisioned that NE-KAMS will help to coordinate and facilitate collaboration and sharing of resources and expertise for V&V and UQ across these programs. In addition, from the outset, NE-KAMS will support the use of computational M&S in the nuclear industry by developing guidelines and recommended practices aimed at quantifying the uncertainty and assessing the applicability of existing analysis models and methods. The NE-KAMS effort will initially focus on supporting the use of computational fluid dynamics (CFD) and thermal hydraulics (T/H) analysis for M&S of nuclear

  14. Sharp spatially constrained inversion

    DEFF Research Database (Denmark)

Vignoli, Giulio; Fiandaca, Gianluca; Christiansen, Anders Vest

    2013-01-01

We present sharp reconstruction of multi-layer models using a spatially constrained inversion with minimum gradient support regularization; in particular, its application to airborne electromagnetic data is discussed. Airborne surveys produce extremely large datasets, traditionally inverted using smoothly varying 1D models. Smoothness is a result of the regularization constraints applied to address the ill-posedness of the inversion. The standard Occam-type regularized multi-layer inversion produces results where boundaries between layers are smeared. The sharp regularization overcomes this limitation. The sharp inversions are compared against classical smooth results and against available boreholes. With the focusing approach, the obtained blocky results agree with the underlying geology and allow easier interpretation by the end-user.
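The minimum-gradient-support idea can be illustrated in one dimension: instead of the full spatially constrained airborne-EM inversion, the sketch below applies the same stabilizer to a noisy layered profile via iteratively reweighted least squares (all values hypothetical, not the authors' implementation):

```python
import numpy as np

def mgs_denoise(d, lam=0.5, beta=0.05, iters=30):
    """Blocky 1D reconstruction with a minimum-gradient-support stabilizer,
    solved by iteratively reweighted least squares.  The stabilizer
    sum_i (dm_i)^2 / ((dm_i)^2 + beta^2) penalizes the *presence* of a
    gradient rather than its size, so sharp boundaries survive."""
    n = len(d)
    D = np.diff(np.eye(n), axis=0)                  # first-difference operator
    m = d.copy()
    for _ in range(iters):
        w = beta**2 / (np.diff(m) ** 2 + beta**2)   # ~1 on flats, ~0 on jumps
        A = np.eye(n) + lam * D.T @ (w[:, None] * D)
        m = np.linalg.solve(A, d)
    return m

# Hypothetical noisy profile with one sharp layer boundary.
rng = np.random.default_rng(2)
m_true = np.where(np.arange(80) < 40, 1.0, 3.0)
d = m_true + 0.1 * rng.standard_normal(80)
m_sharp = mgs_denoise(d)
```

An Occam-style quadratic penalty would smear the interface over many samples; here the reweighting removes the penalty exactly where the jump is, which is what produces the blocky, easier-to-interpret models described above.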

  15. Submucous Myoma Induces Uterine Inversion

    Directory of Open Access Journals (Sweden)

    Yu-Li Chen

    2006-06-01

    Conclusion: Nonpuerperal inversion of the uterus is rarely encountered by gynecologists. Diagnosis of uterine inversion is often not easy and imaging studies might be helpful. Surgical treatment is the method of choice in nonpuerperal uterine inversion.

  16. Use of modeling and simulation in the planning, analysis and interpretation of ultrasonic testing

    International Nuclear Information System (INIS)

    Algernon, Daniel; Grosse, Christian U.

    2016-01-01

Acoustic testing methods such as ultrasound and impact-echo are an important tool in building diagnostics. Applications range from thickness measurements and imaging of the internal component geometry to the detection of voids (gravel pockets) and delaminations, and possibly the location of grouting faults inside the metallic cladding tubes of tendon ducts. Acoustic methods for non-destructive testing (NDT) are based on the excitation of elastic waves that interact with the target object (e.g. a discontinuity in the component) at an acoustic interface. From the signal received at the component surface, this interaction must be detected and interpreted to infer the presence of the target object and, where possible, to estimate its size and position. Although the basic physical principles underlying the use of elastic waves in NDT are well known, their application can be complicated by restricted access, complex component geometries, or the type and shape of the reflectors. Even estimating the chances of success of a test is often not trivial. These circumstances highlight the value of simulations, which provide a theoretically sound basis for testing and make it easy to optimize test systems. The available simulation methods are varied; common choices include the finite element method, the elastodynamic finite integration technique (EFIT) and semi-analytical calculation methods.

  17. Planning for Regional Water Resources in Northwest China Using a Dynamic Simulation Model

    Science.gov (United States)

    Chen, C.; Kalra, A.; Ahmad, S.

    2014-12-01

The problem of water scarcity is prominent in northwest China because of its typical desert climate. Exceeding the sustainable yield of groundwater resources has resulted in groundwater depletion, which has raised a series of issues such as drying wells, increasing pumping costs and environmental damage. With rapid agricultural and economic development, population growth has added extra stress on available water resources by increasing municipal, agricultural and industrial demands. This necessitates efficient water resources management strategies, with a better understanding of the causes of water stress and of the options for sustainable development of the economy and management of the environment. This study focuses on simulating water supply and demand, under the influence of a changing climate, for Shanshan County in northwest China. A dynamic simulation model of the monthly water balance is developed using the modeling tool Stella for the period 2000-2030. Different future water demand and supply scenarios are developed to represent: (1) a base scenario with current practices; (2) a change of the primary water source; (3) improvement of irrigation efficiency; (4) reduction of the irrigated area; and (5) reduction of industrial water demand. The results indicate that besides growing demand, low water-use efficiency and a low level of water reuse are the primary causes of water scarcity. Groundwater recharge and abstraction could be balanced by 2030 by reducing industrial demand by 50% and using high-efficiency irrigation for agriculture. The model provides a better understanding of the effects of different policies and can help in identifying water resources management strategies.
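The kind of stock-and-flow bookkeeping a Stella model expresses can be sketched as follows; all quantities are hypothetical, not the Shanshan County inputs:

```python
def simulate(months, storage0, recharge, demand, irrigation_efficiency):
    """Monthly groundwater stock: abstraction is the net crop demand
    divided by how efficiently irrigation water reaches the field."""
    storage = storage0
    history = []
    for m in range(months):
        abstraction = demand[m] / irrigation_efficiency
        storage += recharge[m] - abstraction
        history.append(storage)
    return history

months = 12 * 30                    # monthly steps over a 30-year horizon
recharge = [1.0] * months           # hypothetical recharge, Mm3/month
demand = [0.9] * months             # hypothetical net crop demand, Mm3/month

baseline = simulate(months, 500.0, recharge, demand, irrigation_efficiency=0.6)
improved = simulate(months, 500.0, recharge, demand, irrigation_efficiency=0.95)
```

Under the baseline efficiency, abstraction (1.5 Mm3/month) exceeds recharge and the aquifer is steadily depleted; with high-efficiency irrigation the balance reverses, mirroring the finding that recharge and abstraction can be brought into balance by 2030.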

  18. Experimental verification of lung dose with radiochromic film: comparison with Monte Carlo simulations and commercially available treatment planning systems

    Science.gov (United States)

    Paelinck, L.; Reynaert, N.; Thierens, H.; DeNeve, W.; DeWagter, C.

    2005-05-01

    The purpose of this study was to assess the absorbed dose in and around lung tissue by performing radiochromic film measurements, Monte Carlo simulations and calculations with superposition convolution algorithms. We considered a layered polystyrene phantom of 12 × 12 × 12 cm3 containing a central cavity of 6 × 6 × 6 cm3 filled with Gammex RMI lung-equivalent material. Two field configurations were investigated, a small 1 × 10 cm2 field and a larger 10 × 10 cm2 field. First, we performed Monte Carlo simulations to investigate the influence of radiochromic film itself on the measured dose distribution when the film intersects a lung-equivalent region and is oriented parallel to the central beam axis. To that end, the film and the lung-equivalent materials were modelled in detail, taking into account their specific composition. Next, measurements were performed with the film oriented both parallel and perpendicular to the central beam axis to verify the results of our Monte Carlo simulations. Finally, we digitized the phantom in two commercially available treatment planning systems, Helax-TMS version 6.1A and Pinnacle version 6.2b, and calculated the absorbed dose in the phantom with their incorporated superposition convolution algorithms to compare with the Monte Carlo simulations. Comparing Monte Carlo simulations with measurements reveals that radiochromic film is a reliable dosimeter in and around lung-equivalent regions when the film is positioned perpendicular to the central beam axis. Radiochromic film is also able to predict the absorbed dose accurately when the film is positioned parallel to the central beam axis through the lung-equivalent region. However, attention must be paid when the film is not positioned along the central beam axis, in which case the film gradually attenuates the beam and decreases the dose measured behind the cavity. This underdosage disappears by offsetting the film a few centimetres. We find deviations of about 3.6% between

  19. Simulations of the Fuel Economy and Emissions of Hybrid Transit Buses over Planned Local Routes

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Zhiming [ORNL; LaClair, Tim J [ORNL; Daw, C Stuart [ORNL; Smith, David E [ORNL; Franzese, Oscar [ORNL

    2014-01-01

We present simulated fuel economy and emissions of city transit buses powered by conventional diesel engines and by diesel-hybrid electric powertrains of varying size. Six representative city drive cycles were included in the study. In addition, we included previously published aftertreatment device models for control of CO, HC, NOx, and particulate matter (PM) emissions. Our results reveal that bus hybridization can significantly enhance fuel economy by reducing engine idling time, reducing demands for accessory loads, exploiting regenerative braking, and shifting engine operation to speeds and loads with higher fuel efficiency. Increased hybridization also tends to monotonically reduce engine-out emissions, but trends in the tailpipe (post-aftertreatment) emissions involve more complex interactions that depend significantly on motor size and drive cycle details.

  20. USING THE GEOMETRIC SIMULATION AT PLANNING OF MIXERS OF TELESCOPIC CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    K. K. Miroshnychenko

    2015-07-01

Purpose. The use of traditional mixing methods to obtain homogeneous fiber-reinforced concrete does not ensure the creation of high-quality, homogeneous construction materials. This study aims to develop, with the use of geometric simulation, different variants of the working bodies of mixers that ensure effective mixing of fiber-reinforced concrete. Methodology. A set of theoretical studies allowed formulating design principles for resource-saving technologies of production of particulate-reinforced compounds with high performance properties. Using geometric simulation, different versions of the blades of the working bodies of mixers with complex geometric shapes were developed, providing thorough mixing of the fiber-reinforced fine-grained material. Findings. As a result of theoretical and experimental studies aimed at developing fundamentally new approaches to the preparation (mixing) of fiber-reinforced concrete with different types of fibers, and to the manufacture of products from it, the author obtained the following results. A technology for preparing fiber-reinforced concrete using telescopic mixers with effective blades of complex shape was developed. Application of the developed blades yields a homogeneous fiber-reinforced concrete composition. Because of the high quality of mixing, the preparation time of the mixture is reduced, which in turn lowers equipment repair costs and electricity consumption. Originality. The author developed the design of a mixer with a telescopic working body carrying blades of complex shape. Practical value. The proposed technology of mixing a particle-reinforced material, using a mixer with a telescopic working body and blades of complex geometric shape, provides high uniformity of the fiber-reinforced concrete composition. The proposed technological methods of production allow expanding the scope of

  1. Vadose zone transport field study: Detailed test plan for simulated leak tests

    International Nuclear Information System (INIS)

    AL Ward; GW Gee

    2000-01-01

    : identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project

  2. Vadose zone transport field study: Detailed test plan for simulated leak tests

    Energy Technology Data Exchange (ETDEWEB)

    AL Ward; GW Gee

    2000-06-23

    Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  3. Inverse problem in neutron transport and radiation

    International Nuclear Information System (INIS)

    Barichello, L.B.; Vilhena, M.T. de

    1993-01-01

    In this work the LTS N method is applied to solve an inverse problem which consists in determining the incident angular fluxes at the boundary from known values of the scalar flux at interior points. Numerical simulations are presented. (author)

  4. Inverse Kinematics With Closed Form Solution For Denso Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Ikhsan Eka Prasetia

    2015-03-01

    Full Text Available In this paper, forward kinematics and inverse kinematics are applied to the Denso robot manipulator, which has 6-DOF. Forward kinematics yields the desired position of the end-effector, while inverse kinematics produces the angle of each joint. The inverse kinematics problem is very difficult; therefore, the solution of the inverse kinematics is obtained in closed form with a geometric approach. The simulation results obtained from forward and inverse kinematics determine the desired position of the Denso robot manipulator. Forward kinematics produces the desired position of the end-effector. Inverse kinematics produces the joint angles, where the closed-form solution with the geometric approach yields eight configurations that reach the desired position of the end-effector.
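    The closed-form geometric approach described in this record can be illustrated in a much simpler setting than the 6-DOF Denso arm: a planar 2-link manipulator, where the law of cosines yields the elbow angle and two configurations (elbow-up/elbow-down) exist, mirroring the eight configurations of the 6-DOF case. The link lengths and solver below are illustrative assumptions, not taken from the paper.

```python
import math

def ik_2link(x, y, l1, l2, elbow_up=True):
    """Closed-form inverse kinematics for a planar 2-link arm.

    Returns joint angles (t1, t2) such that the end-effector
    reaches (x, y). Two solutions exist (elbow-up / elbow-down).
    """
    r2 = x * x + y * y
    # Law of cosines for the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    if elbow_up:
        t2 = -t2
    # Shoulder angle: target direction minus the elbow offset.
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

def fk_2link(t1, t2, l1, l2):
    """Forward kinematics, used to verify the IK solution."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y
```

    Feeding both elbow configurations back through forward kinematics and checking that the target position is reproduced is the standard sanity check for a closed-form IK solution.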

  5. Planning and Analysis of the Company’s Financial Performances by Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Meri BOSHKOSKA

    2017-06-01

    Full Text Available Information technology includes a wide range of software solutions that help managers in decision-making processes in order to increase the company's business performance. Using software in financial analysis is a valuable tool for managers in the financial decision-making process. The objective of the study was accomplished by developing software that easily determines the financial performance of the company through integration of the analysis of financial indicators and the DuPont profitability analysis model. Through this software, managers will be able to calculate the current financial state and visually analyze how their actions will affect the financial performance of the company. This will enable them to identify the best ways to improve the company's financial performance. The software can perform a financial analysis and give a clear, useful overview of current business performance, and can also help in planning the growth of the company. The software can also be used for educational purposes for students and managers in the field of financial management.

  6. Assessment of urban pluvial flood risk and efficiency of adaptation options through simulations - A new generation of urban planning tools

    Science.gov (United States)

    Löwe, Roland; Urich, Christian; Sto. Domingo, Nina; Mark, Ole; Deletic, Ana; Arnbjerg-Nielsen, Karsten

    2017-07-01

    We present a new framework for flexible testing of flood risk adaptation strategies in a variety of urban development and climate scenarios. This framework couples the 1D-2D hydrodynamic simulation package MIKE FLOOD with the agent-based urban development model DAnCE4Water and provides the possibility to systematically test various flood risk adaptation measures ranging from large infrastructure changes over decentralised water management to urban planning policies. We tested the framework in a case study in Melbourne, Australia, considering 9 scenarios for urban development and climate and 32 potential combinations of flood adaptation measures. We found that the performance of adaptation measures strongly depended on the considered climate and urban development scenario and on the other adaptation measures implemented, suggesting that adaptive strategies are preferable over one-off investments. Urban planning policies proved to be an efficient means for the reduction of flood risk, while implementing property buyback and pipe increases in a guideline-oriented manner was too costly. Random variations in the location and timing of urban development could have significant impact on flood risk and would in some cases outweigh the benefits of less efficient adaptation strategies. The results of our setup can serve as an input for robust decision making frameworks and thus support the identification of flood risk adaptation measures that are economically efficient and robust to variations of climate and urban layout.

  7. Energy planning of a hospital using Mathematical Programming and Monte Carlo simulation for dealing with uncertainty in the economic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Mavrotas, George; Florios, Kostas; Vlachou, Dimitra [Laboratory of Industrial and Energy Economics, School of Chemical Engineering, National Technical University of Athens, Zographou Campus, 15780 Athens (Greece)

    2010-04-15

    For more than 40 years, Mathematical Programming has been the traditional tool for energy planning at the national or regional level, aiming at cost minimization subject to specific technological, political and demand satisfaction constraints. The liberalization of the energy market, along with ongoing technical progress, increased the level of competition and forced energy consumers, even at the unit level, to make their choices among a large number of alternative or complementary energy technologies, fuels and/or suppliers. In the present work we develop a modelling framework for energy planning in units of the tertiary sector, giving special emphasis to model reduction and to the uncertainty of the economic parameters. In the given case study, the energy rehabilitation of a hospital in Athens is examined, and the installation of cogeneration, absorption and compression units is considered for the supply of the electricity, heating and cooling load. The basic innovation of the given energy model lies in the uncertainty modelling through the combined use of Mathematical Programming (namely, Mixed Integer Linear Programming, MILP) and Monte Carlo simulation, which permits risk management for the most volatile parameters of the objective function, such as the fuel costs and the interest rate. The results come in the form of probability distributions that provide fruitful information to the decision maker. The effect of model reduction through appropriate compression of the load data is also addressed. (author)
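    The combined optimization/Monte Carlo idea in this record can be sketched in miniature: sample the volatile economic parameters (fuel cost, interest rate), solve the deterministic cost model for each draw, and report the resulting cost distribution to the decision maker. The toy model below, a choice between two supply options with hypothetical capital costs, prices and demand, is an illustrative assumption, not the hospital model from the paper.

```python
import random
import statistics

def annualized_cost(capex, fuel_price, interest_rate, demand_mwh, years=15):
    """Deterministic cost model: capital recovery plus fuel cost."""
    # Capital recovery factor converts capex into an equivalent annuity.
    crf = interest_rate / (1.0 - (1.0 + interest_rate) ** -years)
    return capex * crf + fuel_price * demand_mwh

def monte_carlo_plan(n=5000, seed=42):
    """Sample the volatile parameters and, for each draw, pick the
    cheaper of two hypothetical supply options (cogeneration vs. a
    conventional boiler). Returns the mean and spread of the cost."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n):
        fuel = rng.gauss(60.0, 10.0)     # EUR/MWh, assumed volatility
        rate = rng.uniform(0.03, 0.08)   # interest rate range, assumed
        cogen = annualized_cost(capex=900000.0, fuel_price=fuel,
                                interest_rate=rate, demand_mwh=4000.0)
        boiler = annualized_cost(capex=200000.0, fuel_price=1.4 * fuel,
                                 interest_rate=rate, demand_mwh=4000.0)
        costs.append(min(cogen, boiler))
    return statistics.mean(costs), statistics.stdev(costs)
```

    Instead of a single cost figure, the decision maker receives a distribution (here summarized by its mean and standard deviation), which is the output format the record emphasizes.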

  8. Operational Simulation Tools and Long Term Strategic Planning for High Penetrations of PV in the Southeastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Tuohy, Aidan [Electric Power Research Institute, Knoxville, TN (United States); Smith, Jeff [Electric Power Research Institute, Knoxville, TN (United States); Rylander, Matt [Electric Power Research Institute, Knoxville, TN (United States); Singhvi, Vikas [Electric Power Research Institute, Knoxville, TN (United States); Enbar, Nadav [Electric Power Research Institute, Knoxville, TN (United States); Coley, Steven [Electric Power Research Institute, Knoxville, TN (United States); Roark, Jeff [Electric Power Research Institute, Knoxville, TN (United States); Ela, Erik [Electric Power Research Institute, Knoxville, TN (United States); Lannoye, Eamonn [Electric Power Research Institute, Knoxville, TN (United States); Pilbrick, Charles Russ [Electric Power Research Institute, Knoxville, TN (United States); Rudkevich, Alex [Electric Power Research Institute, Knoxville, TN (United States); Hansen, Cliff [Electric Power Research Institute, Knoxville, TN (United States)

    2016-07-11

    Increasing levels of distributed and utility-scale solar photovoltaics (PV) will have an impact on many utility functions, including distribution system operations, bulk system performance, business models and scheduling of generation. In this project, EPRI worked with Southern Company Services and its affiliates and the Tennessee Valley Authority to assist these utilities in their strategic planning efforts for integrating PV, based on modeling, simulation and analysis using a set of innovative tools. Advanced production simulation models were used to investigate operating reserve requirements. To leverage existing work and datasets, this last task was carried out on the California system. Overall, the project provided useful information to both of the utilities involved, through the final reports and interactions during the project. The results from this project can be used to inform the industry about new and improved methodologies for understanding solar PV penetration, and will influence ongoing and future research. This report summarizes each of the topics investigated over the 2.5-year project period.

  9. The development and implementation of a performance appraisal framework for radiation therapists in planning and simulation.

    Science.gov (United States)

    Becker, Jillian; Bridge, Pete; Brown, Elizabeth; Ferrari-Anderson, Janet; Lusk, Ryan

    2017-12-01

    It is a challenge for radiation therapists (RTs) to keep pace with changing planning technology and techniques while maintaining appropriate skill levels. The ability of individual RTs to meet the demands of this constantly changing practice can only be assured by establishing clearly defined standards for practice and a systematic process for providing feedback on performance. Investigation into existing models for performance appraisal produced minimal results, so a radiation therapy-specific framework was developed. The goal of this initiative was to establish a framework that would reflect the complexity of practice and provide a clear measure of performance against defined standards. This paper outlines the implementation of this framework into practice and discusses some lessons learned in the process. The framework was developed and implemented in six stages: (1) project team, (2) scope, (3) dosimetry pilot, (4) staff consultation, (5) finalisation and implementation and (6) future development and evaluation. Both cultural and organisational obstacles needed to be addressed before the framework could be successfully introduced. Even though this slowed progress, addressing these obstacles during the development process was essential to the success of the framework. The incremental approach provided the opportunity for each aspect to be tested and for the development of subsequent stages to be informed by lessons learned during the previous one. This approach may be beneficial when developing and implementing projects involving performance appraisal to promote consistency, fairness and quality. © 2017 The Authors. Journal of Medical Radiation Sciences published by John Wiley & Sons Australia, Ltd on behalf of Australian Society of Medical Imaging and Radiation Therapy and New Zealand Institute of Medical Radiation Technology.

  10. Atmospheric inverse modeling via sparse reconstruction

    Directory of Open Access Journals (Sweden)

    N. Hase

    2017-10-01

    Full Text Available Many applications in atmospheric science involve ill-posed inverse problems. A crucial component of many inverse problems is the proper formulation of a priori knowledge about the unknown parameters. In most cases, this knowledge is expressed as a Gaussian prior. This formulation often performs well at capturing smoothed, large-scale processes but is often ill equipped to capture localized structures like large point sources or localized hot spots. Over the last decade, scientists from a diverse array of applied mathematics and engineering fields have developed sparse reconstruction techniques to identify localized structures. In this study, we present a new regularization approach for ill-posed inverse problems in atmospheric science. It is based on Tikhonov regularization with sparsity constraint and allows bounds on the parameters. We enforce sparsity using a dictionary representation system. We analyze its performance in an atmospheric inverse modeling scenario by estimating anthropogenic US methane (CH4) emissions from simulated atmospheric measurements. Different measures indicate that our sparse reconstruction approach is better able to capture large point sources or localized hot spots than other methods commonly used in atmospheric inversions. It captures the overall signal equally well but adds details on the grid scale. This feature can be of value for any inverse problem with point or spatially discrete sources. We show an example for source estimation of synthetic methane emissions from the Barnett shale formation.

  11. Atmospheric inverse modeling via sparse reconstruction

    Science.gov (United States)

    Hase, Nils; Miller, Scot M.; Maaß, Peter; Notholt, Justus; Palm, Mathias; Warneke, Thorsten

    2017-10-01

    Many applications in atmospheric science involve ill-posed inverse problems. A crucial component of many inverse problems is the proper formulation of a priori knowledge about the unknown parameters. In most cases, this knowledge is expressed as a Gaussian prior. This formulation often performs well at capturing smoothed, large-scale processes but is often ill equipped to capture localized structures like large point sources or localized hot spots. Over the last decade, scientists from a diverse array of applied mathematics and engineering fields have developed sparse reconstruction techniques to identify localized structures. In this study, we present a new regularization approach for ill-posed inverse problems in atmospheric science. It is based on Tikhonov regularization with sparsity constraint and allows bounds on the parameters. We enforce sparsity using a dictionary representation system. We analyze its performance in an atmospheric inverse modeling scenario by estimating anthropogenic US methane (CH4) emissions from simulated atmospheric measurements. Different measures indicate that our sparse reconstruction approach is better able to capture large point sources or localized hot spots than other methods commonly used in atmospheric inversions. It captures the overall signal equally well but adds details on the grid scale. This feature can be of value for any inverse problem with point or spatially discrete sources. We show an example for source estimation of synthetic methane emissions from the Barnett shale formation.
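    Sparsity-regularized inversion of the kind described in these two records is typically minimized with a proximal-gradient scheme. The sketch below applies ISTA (iterative soft-thresholding) to a tiny synthetic source-estimation problem; the 4×4 smoothing forward operator, the regularization weight, and the step size are all made-up toy values chosen to illustrate how an $\ell_1$ penalty recovers a point-like source, not anything from the papers.

```python
def soft_threshold(v, t):
    """Proximal operator of the l1 norm (element-wise shrinkage)."""
    return [max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0) for x in v]

def ista(A, y, lam, step, iters=1000):
    """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by iterative
    soft-thresholding (proximal gradient descent)."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual r = A x - y.
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        # Gradient g = A^T r.
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Gradient step followed by the l1 proximal step.
        x = soft_threshold([x[j] - step * g[j] for j in range(n)], step * lam)
    return x

# Toy example: a smoothing forward operator observes a sparse source
# vector; the sparsity constraint recovers the single point source.
A = [[1.0, 0.5, 0.0, 0.0],
     [0.5, 1.0, 0.5, 0.0],
     [0.0, 0.5, 1.0, 0.5],
     [0.0, 0.0, 0.5, 1.0]]
x_true = [0.0, 2.0, 0.0, 0.0]   # one point source
y = [sum(A[i][j] * x_true[j] for j in range(4)) for i in range(4)]
x_hat = ista(A, y, lam=0.05, step=0.25)
```

    The step size must stay below the reciprocal of the largest eigenvalue of A^T A for the iteration to converge; with the toy operator above, 0.25 satisfies this. A smooth (Gaussian-prior) inversion of the same data would spread the source across neighbouring cells, which is exactly the behaviour the records contrast against.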

  12. Numerical simulation supports formation testing planning; Simulacao numerica auxilia planejamento de teste de formacao

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Rogerio Marques; Fonseca, Carlos Eduardo da [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2008-07-01

    A well test is an operation that allows the engineer to assess reservoir performance and fluid properties by measuring flow rates and pressures under a range of flowing conditions. In most well tests, a limited amount of fluid is allowed to flow from the formation being tested. The formation is isolated behind cemented casing and perforated at the formation depth or, in open hole, the formation is straddled by a pair of packers that isolate it. During the flow period, the pressure at the formation is monitored over time. Then the formation is closed (or shut in) and the pressure is monitored while the fluid within the formation equilibrates. The analysis of these pressure changes can provide information on the size and shape of the formation as well as its ability to produce fluids. The flow of fluid through the test string causes its heating and hence its elongation. Several factors affect the rate of heat exchange, such as the characteristics of the fluid, the duration and rate of flow, and the existence of deep water. Predicting the temperature along the well, in its various components, and the resulting effect on the test string is not a trivial task. Some authors, for example, describe a method for calculating the behaviour of production strings that assumes, for simplicity, a constant temperature variation along the entire string, which does not occur in practice. This work presents the advantages of using numerical simulation to determine the loads and corresponding movements of the formation test string. (author)

  13. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable...... performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used....

  14. Inverse scale space decomposition

    DEFF Research Database (Denmark)

    Schmidt, Marie Foged; Benning, Martin; Schönlieb, Carola-Bibiane

    2018-01-01

    We investigate the inverse scale space flow as a decomposition method for decomposing data into generalised singular vectors. We show that the inverse scale space flow, based on convex and even and positively one-homogeneous regularisation functionals, can decompose data represented...... by the application of a forward operator to a linear combination of generalised singular vectors into its individual singular vectors. We verify that for this decomposition to hold true, two additional conditions on the singular vectors are sufficient: orthogonality in the data space and inclusion of partial sums...... of the subgradients of the singular vectors in the subdifferential of the regularisation functional at zero. We also address the converse question of when the inverse scale space flow returns a generalised singular vector given that the initial data is arbitrary (and therefore not necessarily in the range...

  15. Mesoscale inversion of carbon sources and sinks

    International Nuclear Information System (INIS)

    Lauvaux, T.

    2008-01-01

    Inverse methods at large scales are used to infer the spatial variability of carbon sources and sinks over the continents, but their uncertainties remain large. Atmospheric concentrations integrate the surface flux variability, but atmospheric transport models at low resolution are not able to simulate properly the local atmospheric dynamics at the measurement sites. However, the inverse estimates are more representative of the large spatial heterogeneity of the ecosystems compared to direct flux measurements. Top-down and bottom-up methods that aim at quantifying the carbon exchanges between the surface and the atmosphere correspond to different scales and are not easily comparable. During this PhD, a mesoscale inverse system was developed to correct carbon fluxes at 8 km resolution. The high-resolution transport model MesoNH was used to simulate accurately the variability of the atmospheric concentrations, which allowed us to reduce the uncertainty of the retrieved fluxes. All the measurements used here were observed during the intensive regional campaign CERES of May and June 2005, during which several instrumented towers measured CO2 concentrations and fluxes in the southwest of France. Airborne measurements allowed us to observe concentrations at high altitude but also CO2 surface fluxes over large parts of the domain. First, the capacity of the inverse system to correct the CO2 fluxes was estimated using pseudo-data experiments. The largest fraction of the concentration variability was attributed to regional surface fluxes over an area of about 300 km around the site locations, depending on the meteorological conditions. Second, an ensemble of simulations allowed us to define the spatial and temporal structures of the transport errors. Finally, the inverse fluxes at 8 km resolution were compared to direct flux measurements. The inverse system has been validated in space and time and showed an improvement of the first guess fluxes from a vegetation model

  16. Temperature simulations in hyperthermia treatment planning of the head and neck region. Rigorous optimization of tissue properties

    Energy Technology Data Exchange (ETDEWEB)

    Verhaart, Rene F.; Rijnen, Zef; Verduijn, Gerda M.; Paulides, Margarethus M. [Erasmus MC - Cancer Institute, Department of Radiation Oncology, Hyperthermia Unit, Rotterdam (Netherlands); Fortunati, Valerio; Walsum, Theo van; Veenland, Jifke F. [Erasmus MC, Departments of Medical Informatics and Radiology, Biomedical Imaging Group Rotterdam, Rotterdam (Netherlands)

    2014-12-15

    Hyperthermia treatment planning (HTP) is used in the head and neck region (H and N) for pretreatment optimization, decision making, and real-time HTP-guided adaptive application of hyperthermia. In current clinical practice, HTP is based on power-absorption predictions, but thermal dose-effect relationships advocate its extension to temperature predictions. Exploitation of temperature simulations requires region- and temperature-specific thermal tissue properties due to the strong thermoregulatory response of H and N tissues. The purpose of our work was to develop a technique for patient group-specific optimization of thermal tissue properties based on invasively measured temperatures, and to evaluate the accuracy achievable. Data from 17 treated patients were used to optimize the perfusion and thermal conductivity values for the Pennes bioheat equation-based thermal model. A leave-one-out approach was applied to accurately assess the difference between measured and simulated temperature (ΔT). The improvement in ΔT for optimized thermal property values was assessed by comparison with the ΔT for values from the literature, i.e., baseline and under thermal stress. The optimized perfusion and conductivity values of tumor, muscle, and fat led to an improvement in simulation accuracy (ΔT: 2.1 ± 1.2 °C) compared with the accuracy for baseline (ΔT: 12.7 ± 11.1 °C) or thermal stress (ΔT: 4.4 ± 3.5 °C) property values. The presented technique leads to patient group-specific temperature property values that effectively improve simulation accuracy for the challenging H and N region, thereby making simulations an elegant addition to invasive measurements. The rigorous leave-one-out assessment indicates that improvements in accuracy are required to rely only on temperature-based HTP in the clinic. (orig.)
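    The thermal model named in this record is the Pennes bioheat equation, ρc ∂T/∂t = k ∇²T − ωρ_b c_b (T − T_a) + Q, in which perfusion acts as a heat sink pulling tissue back toward arterial temperature while the absorbed power Q heats it. A minimal 1-D explicit finite-difference sketch is shown below; all tissue property values are generic placeholder numbers, not the optimized patient-group values from the study.

```python
def pennes_1d(n=25, dx=2e-3, dt=5.0, steps=2000,
              k=0.5, rho_c=3.6e6, w_rhob_cb=2e3, T_a=37.0, q=1e4):
    """Explicit finite-difference solution of the 1-D Pennes bioheat
    equation with fixed body-temperature boundaries.

    k         : thermal conductivity [W/m/K] (placeholder value)
    rho_c     : tissue density * specific heat [J/m^3/K]
    w_rhob_cb : perfusion * blood density * blood heat capacity [W/m^3/K]
    T_a       : arterial temperature [degC]
    q         : absorbed power density from the applicator [W/m^3]
    """
    T = [T_a] * n
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            diffusion = k * (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx ** 2
            perfusion = w_rhob_cb * (T[i] - T_a)   # heat sink toward T_a
            Tn[i] = T[i] + dt / rho_c * (diffusion - perfusion + q)
        T = Tn
    return T
```

    With these placeholder values the tissue settles a few degrees above body temperature, which is the hyperthermia regime; increasing the perfusion value flattens the temperature rise, illustrating why the perfusion parameter dominates the simulation accuracy discussed in the record. The explicit scheme is only stable when dt is small enough relative to rho_c*dx^2/(2k), which the defaults satisfy.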

  17. Computer-Assisted Orthognathic Surgery for Patients with Cleft Lip/Palate: From Traditional Planning to Three-Dimensional Surgical Simulation.

    Directory of Open Access Journals (Sweden)

    Daniel Lonic

    Full Text Available Although conventional two-dimensional (2D) methods for orthognathic surgery planning are still popular, the use of three-dimensional (3D) simulation is steadily increasing. In facial asymmetry cases such as in cleft lip/palate patients, the additional information can dramatically improve planning accuracy and outcome. The purpose of this study is to investigate which parameters are changed most frequently in transferring a traditional 2D plan to a 3D simulation, and which planning parameters can be better adjusted by this method. This prospective study enrolled 30 consecutive patients with cleft lip and/or cleft palate (mean age 18.6±2.9 years, range 15 to 32 years). All patients received two-jaw single-splint orthognathic surgery. 2D orthodontic surgery plans were transferred into a 3D setting. Severe bony collisions in the ramus area after 2D plan transfer were noted. The position of the maxillo-mandibular complex was evaluated and, where necessary, adjusted. Position changes of roll, midline, pitch, yaw and genioplasty, and their frequency within the patient group, were recorded as alterations of the initial 2D plan. Patients were divided into groups of no change from the original 2D plan and changes in one, two, three and four of the aforementioned parameters, as well as subgroups of unilateral and bilateral cleft lip/palate and isolated cleft palate cases. Postoperative OQLQ scores were obtained for 20 patients who finished orthodontic treatment. 83.3% of 2D plans were modified, mostly concerning yaw (63.3%) and midline (36.7%) adjustments. Yaw adjustments had the highest mean values in total and in all subgroups. Severe bony collisions as a result of 2D planning were seen in 46.7% of patients. Possible asymmetry was regularly foreseen and corrected in the 3D simulation. Based on our findings, 3D simulation renders important information for accurate planning in complex cleft lip/palate cases involving facial asymmetry that is regularly missed in conventional 2D

  18. Direct and Inverse problems in Electrocardiography

    Science.gov (United States)

    Boulakia, M.; Fernández, M. A.; Gerbeau, J. F.; Zemzemi, N.

    2008-09-01

    We present numerical results related to the direct and the inverse problems in electrocardiography. The electrical activity of the heart is described by the bidomain equations. The electrocardiograms (ECGs) recorded in different points on the body surface are obtained by coupling the bidomain equation to a Laplace equation in the torso. The simulated ECGs are quite satisfactory. As regards the inverse problem, our goal is to estimate the parameters of the bidomain-torso model. Here we present some preliminary results of a parameter estimation for the torso model.

  19. Inversion assuming weak scattering

    DEFF Research Database (Denmark)

    Xenaki, Angeliki; Gerstoft, Peter; Mosegaard, Klaus

    2013-01-01

    due to the complex nature of the field. A method based on linear inversion is employed to infer information about the statistical properties of the scattering field from the obtained cross-spectral matrix. A synthetic example based on an active high-frequency sonar demonstrates that the proposed...

  20. Locative Inversion in English

    NARCIS (Netherlands)

    Broekhuis, H.

    2005-01-01

    This article aims at reformulating in more current terms Hoekstra and Mulder’s (1990) analysis of the Locative Inversion (LI) construction. The new proposal is crucially based on the assumption that Small Clause (SC) predicates agree with their external argument in phi-features, which may be

  1. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist

  2. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-06-01

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with the soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation, and the stochastic characteristics of nutrient loading can be investigated, providing the inputs for decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries, and the associated system risk, by incorporating the concepts of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results can not only facilitate identification of optimal effluent-trading schemes, but also give insight into the effects of the trading ratio and treatment rate on decision making. The results also reveal that the decision maker's preference towards risk affects both the decision alternatives on the trading scheme and the system benefit. Compared with conventional optimization methods, BESMA is shown to be advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision

  3. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and the associated quantification of uncertainty have become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  4. Pseudo waveform inversion

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Chang Soo; Park, Keun Pil [Korea Inst. of Geology Mining and Materials, Taejon (Korea, Republic of); Suh, Jung Hee; Hyun, Byung Koo; Shin, Sung Ryul [Seoul National University, Seoul (Korea, Republic of)

    1995-12-01

    The seismic reflection exploration technique, one of the geophysical methods for oil exploration, became effective for imaging subsurface structure with the rapid development of computers. However, imaging of the subsurface based on conventional data processing makes it almost impossible to obtain information on the physical properties of the subsurface, such as velocity and density. Since seismic data are implicitly a function of subsurface velocities, it is necessary to develop an inversion method that can delineate the velocity structure using seismic tomography and waveform inversion. As a tool to perform seismic inversion, a seismic forward modeling program using ray tracing should be developed. In this study, we have developed an algorithm that calculates the travel times of a complex geologic structure using shooting ray tracing, by subdividing the geologic model into blocky structures having constant velocity. With the travel time calculation, the partial derivatives of travel time can be calculated efficiently without difficulty. Since the current ray tracing technique has limitations in calculating travel times for extremely complex geologic models, our aim in the future is to develop a powerful ray tracer using the finite element technique. After applying the pseudo waveform inversion to seismic data from offshore Korea, we can obtain the subsurface velocity model and use the result to improve the quality of the seismic data processing. If conventional seismic data processing and seismic interpretation are linked with this inversion technique, high-quality imaging of the subsurface structure can be expected. A future research area is to develop a powerful ray tracer that can calculate travel times for extremely complex geologic models. (author). 39 refs., 32 figs., 2 tabs.
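    For a model subdivided into constant-velocity blocks, the travel time along a ray is the sum of path-length-over-velocity contributions, and the partial derivative of travel time with respect to a block's slowness is simply the ray's path length in that block, which is why the record notes that the derivatives come almost for free once the ray path is known. A minimal sketch for the simplest case, a vertical ray through horizontal layers with hypothetical thicknesses and velocities, is:

```python
def travel_time(thicknesses, velocities):
    """Vertical-ray travel time through constant-velocity layers:
    t = sum_i h_i / v_i."""
    return sum(h / v for h, v in zip(thicknesses, velocities))

def travel_time_derivatives(thicknesses, velocities):
    """d(travel time)/d(slowness_i) = path length in layer i,
    since t = sum_i h_i * s_i with slowness s_i = 1/v_i.
    (velocities is unused here; the derivative is linear in slowness.)"""
    return list(thicknesses)
```

    Parameterizing the inversion in slowness rather than velocity makes the forward map linear along a fixed ray path, which is the property that lets the Jacobian be assembled without extra ray tracing; for bent rays in 2-D or 3-D block models, the path lengths per block replace the layer thicknesses.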

  5. Calculation of the inverse data space via sparse inversion

    KAUST Repository

    Saragiotis, Christos

    2011-01-01

    The inverse data space provides a natural separation of primaries and surface-related multiples, as the surface multiples map onto the area around the origin while the primaries map elsewhere. However, the calculation of the inverse data is far from trivial, as theory requires infinite time and offset recording. Furthermore, regularization issues arise during the inversion. We perform the inversion by minimizing the least-squares norm of the misfit function while constraining the ℓ1 norm of the solution, i.e. of the inverse data space. In this way a sparse inversion approach is obtained. We show results on field data with an application to surface multiple removal.
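
    The ℓ1-constrained least-squares formulation in this record is a standard sparse-inversion setup. A common solver for it is iterative shrinkage-thresholding (ISTA); the sketch below, on a synthetic problem (not the authors' data or code), shows the basic mechanics of minimizing the misfit while soft-thresholding drives the solution toward sparsity.

```python
import numpy as np

def ista(A, b, lam, n_iter=500):
    """Iterative shrinkage-thresholding for min ||Ax - b||_2^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the LS gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the least-squares misfit
        z = x - grad / L                   # gradient step
        # soft-thresholding enforces the l1 (sparsity) constraint
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# Toy example: recover a sparse vector from underdetermined measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
```

    In the paper's setting, x would be the inverse data space and A the operator mapping it to the recorded data; the synthetic matrix here merely stands in for that operator.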

  6. Modification of planned postoperative occlusion in orthognathic surgery, based on computer-aided design/computer-aided manufacturing-engineered preoperative surgical simulation.

    Science.gov (United States)

    Kang, Sang-Hoon; Kim, Moon-Key; You, Tae-Kwon; Lee, Ji-Yeon

    2015-01-01

    In orthognathic surgery, it is important to have a planned postoperative occlusion. A 3-dimensional preoperative simulation, based on 3-dimensional optically scanned occlusion data, can predict how the planned postoperative occlusion will affect the maxilla-mandibular relationship that results from orthognathic surgery. In this study we modified the planned postoperative occlusion, based on computer-aided design/computer-aided manufacturing-engineered preoperative surgical simulations. This modification made it possible to resolve the facial asymmetry of the patient successfully with a simple bilateral intraoral vertical ramus osteotomy and no additional maxillary or mandibular surgery. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  7. A decision support tool for sustainable planning of urban water systems: presenting the Dynamic Urban Water Simulation Model.

    Science.gov (United States)

    Willuweit, Lars; O'Sullivan, John J

    2013-12-15

    Population growth, urbanisation and climate change represent significant pressures on urban water resources, requiring water managers to consider a wider array of management options that account for economic, social and environmental factors. The Dynamic Urban Water Simulation Model (DUWSiM) developed in this study links urban water balance concepts with the land use dynamics model MOLAND and the climate model LARS-WG, providing a platform for long term planning of urban water supply and water demand by analysing the effects of urbanisation scenarios and climatic changes on the urban water cycle. Based on potential urbanisation scenarios and their effects on a city's water cycle, DUWSiM provides the functionality for assessing the feasibility of centralised and decentralised water supply and water demand management options based on forecasted water demand, stormwater and wastewater generation, whole-life cost and energy, and potential for water recycling. DUWSiM has been tested using data from Dublin, the capital of Ireland, and it has been shown that the model is able to satisfactorily predict water demand and stormwater runoff. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Data Analysis of Heating Systems for Buildings—A Tool for Energy Planning, Policies and Systems Simulation

    Directory of Open Access Journals (Sweden)

    Michel Noussan

    2018-01-01

    Full Text Available Heating and cooling in buildings is a central aspect of adopting energy efficiency measures and implementing local policies for energy planning. Knowledge of the features and performance of existing systems is fundamental to conceiving realistic energy-saving strategies. Thanks to the development of Information and Communication Technologies (ICT) and progress in energy regulations, the amount of data that can be collected and processed allows detailed analyses of entire regions or even countries. However, big data need to be handled through proper analyses, to identify and highlight the main trends by selecting the most significant information. To do so, careful attention must be paid to data collection and preprocessing, to ensure the coherence of the associated analyses and the accuracy of the results and discussion. This work presents an insightful analysis of building heating systems in the most populated Italian region, Lombardy. From a dataset of almost 2.9 million heating systems, selected reference values are presented, aiming at describing the features of current heating systems in households, offices and public buildings. Several aspects are considered, including the type of heating system, thermal power, fuel, age, and nominal and measured efficiency. The results of this work can support local energy planners and policy makers, and enable more accurate simulation of existing energy systems in buildings.

  9. A review of recent programs and future plans for rotorcraft in-flight simulation at Ames Research Center

    Science.gov (United States)

    Eshow, Michelle M.; Aiken, Edwin W.; Hindson, William S.; Lebacqz, J. V.; Denery, Dallas G.

    1991-01-01

    A new flight research vehicle, the Rotorcraft-Aircrew Systems Concepts Airborne Laboratory (RASCAL), is being developed by the U.S. Army and NASA at Ames Research Center. The requirements for this new facility stem from a perception of rotorcraft system technology requirements for the next decade together with operational experience with the CH-47B research helicopter that was operated as an in-flight simulator at Ames during the past 10 years. Accordingly, both the principal design features of the CH-47B variable-stability system and the flight-control and cockpit-display programs that were conducted using this aircraft at Ames are reviewed. Another U.S. Army helicopter, a UH-60A Black Hawk, has been selected as the baseline vehicle for the RASCAL. The research programs that influence the design of the RASCAL are summarized, and the resultant requirements for the RASCAL research system are described. These research programs include investigations of advanced, integrated control concepts for achieving high levels of agility and maneuverability, and guidance technologies, employing computer/sensor-aiding, designed to assist the pilot during low-altitude flight in conditions of limited visibility. The approach to the development of the new facility is presented and selected plans for the preliminary design of the RASCAL are described.

  10. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: A 4D lung radiation treatment plan was created with the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the treatment-plan computing time on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It was found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment planning, which requires Monte Carlo dose calculations on all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected for simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
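
    The diminishing return with node count reported here is the classic Amdahl-style behavior: the serial portion of the workflow (e.g. dose reconstruction) does not shrink as nodes are added, so each extra node buys less. A minimal model (illustrative numbers, not the study's measurements):

```python
def compute_time(n_nodes, serial_time, parallel_time):
    """Amdahl-style model: the serial part (e.g. dose reconstruction) is fixed;
    the Monte Carlo simulation divides evenly across nodes."""
    return serial_time + parallel_time / n_nodes

# Hypothetical workload: 10 time units serial, 100 units parallelizable
times = {n: compute_time(n, serial_time=10.0, parallel_time=100.0)
         for n in (1, 5, 10, 15, 20)}
# total time: 110 -> 30 -> 20 -> ~16.7 -> 15; each added node helps less
```

    Under such a model the marginal saving per node shrinks hyperbolically, which is consistent with the study's recommendation to cap the node count rather than scale indefinitely.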

  11. Electrochemically driven emulsion inversion

    International Nuclear Information System (INIS)

    Johans, Christoffer; Kontturi, Kyoesti

    2007-01-01

    It is shown that emulsions stabilized by ionic surfactants can be inverted by controlling the electrical potential across the oil-water interface. The potential dependent partitioning of sodium dodecyl sulfate (SDS) was studied by cyclic voltammetry at the 1,2-dichlorobenzene|water interface. In the emulsion the potential control was achieved by using a potential-determining salt. The inversion of a 1,2-dichlorobenzene-in-water (O/W) emulsion stabilized by SDS was followed by conductometry as a function of added tetrapropylammonium chloride. A sudden drop in conductivity was observed, indicating the change of the continuous phase from water to 1,2-dichlorobenzene, i.e. a water-in-1,2-dichlorobenzene emulsion was formed. The inversion potential is well in accordance with that predicted by the hydrophilic-lipophilic deviation if the interfacial potential is appropriately accounted for

  12. Channelling versus inversion

    DEFF Research Database (Denmark)

    Gale, A.S.; Surlyk, Finn; Anderskouv, Kresten

    2013-01-01

    Evidence from regional stratigraphical patterns in Santonian−Campanian chalk is used to infer the presence of a very broad channel system (5 km across) with a depth of at least 50 m, running NNW−SSE across the eastern Isle of Wight; only the western part of the channel wall and fill is exposed. W......−Campanian chalks in the eastern Isle of Wight, involving penecontemporaneous tectonic inversion of the underlying basement structure, are rejected....

  13. Intersections, ideals, and inversion

    International Nuclear Information System (INIS)

    Vasco, D.W.

    1998-01-01

    Techniques from computational algebra provide a framework for treating large classes of inverse problems. In particular, the discretization of many types of integral equations, and of partial differential equations with undetermined coefficients, leads to systems of polynomial equations. The structure of the solution set of such equations may be examined using algebraic techniques. For example, the existence and dimensionality of the solution set may be determined. Furthermore, it is possible to bound the total number of solutions. The approach is illustrated by a numerical application to the inverse problem associated with the Helmholtz equation. The algebraic methods are used in the inversion of a set of transverse electric (TE) mode magnetotelluric data from Antarctica. The existence of solutions is demonstrated and the number of solutions is found to be finite, bounded above by 50. The best fitting structure is dominantly one-dimensional with a low crustal resistivity of about 2 ohm-m. Such a low value is compatible with studies suggesting lower surface wave velocities than found in typical stable cratons.
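
    The idea of bounding the number of solutions of a polynomial system can be shown on a tiny example (not the paper's Helmholtz system): the Bézout bound says a system of polynomials of degrees d1, ..., dn has at most d1 * ... * dn isolated complex solutions. For {x² + y² − 1 = 0, x − y = 0} the bound is 2 · 1 = 2, and elimination confirms exactly two real solutions.

```python
import math

def bezout_bound(degrees):
    """Bezout bound: at most prod(d_i) isolated complex solutions
    for a square polynomial system with these total degrees."""
    return math.prod(degrees)

# Eliminate y from {x^2 + y^2 - 1 = 0, x - y = 0}: substituting y = x
# gives 2x^2 - 1 = 0, with real roots x = +/- 1/sqrt(2).
roots = [math.sqrt(0.5), -math.sqrt(0.5)]
solutions = [(r, r) for r in roots]

assert len(solutions) <= bezout_bound([2, 1])
```

    The paper's "bounded above by 50" statement is the same kind of bound, obtained for the much larger discretized magnetotelluric system.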

  14. Intersections, ideals, and inversion

    Energy Technology Data Exchange (ETDEWEB)

    Vasco, D.W.

    1998-10-01

    Techniques from computational algebra provide a framework for treating large classes of inverse problems. In particular, the discretization of many types of integral equations, and of partial differential equations with undetermined coefficients, leads to systems of polynomial equations. The structure of the solution set of such equations may be examined using algebraic techniques. For example, the existence and dimensionality of the solution set may be determined. Furthermore, it is possible to bound the total number of solutions. The approach is illustrated by a numerical application to the inverse problem associated with the Helmholtz equation. The algebraic methods are used in the inversion of a set of transverse electric (TE) mode magnetotelluric data from Antarctica. The existence of solutions is demonstrated and the number of solutions is found to be finite, bounded above by 50. The best fitting structure is dominantly one-dimensional with a low crustal resistivity of about 2 ohm-m. Such a low value is compatible with studies suggesting lower surface wave velocities than found in typical stable cratons.

  15. Inverse transition radiation

    International Nuclear Information System (INIS)

    Steinhauer, L.C.; Romea, R.D.; Kimura, W.D.

    1997-01-01

    A new method for laser acceleration is proposed based upon the inverse process of transition radiation. The laser beam intersects an electron-beam traveling between two thin foils. The principle of this acceleration method is explored in terms of its classical and quantum bases and its inverse process. A closely related concept based on the inverse of diffraction radiation is also presented: this concept has the significant advantage that apertures are used to allow free passage of the electron beam. These concepts can produce net acceleration because they do not satisfy the conditions in which the Lawson-Woodward theorem applies (no net acceleration in an unbounded vacuum). Finally, practical aspects such as damage limits at optics are employed to find an optimized set of parameters. For reasonable assumptions an acceleration gradient of 200 MeV/m requiring a laser power of less than 1 GW is projected. An interesting approach to multi-staging the acceleration sections is also presented. copyright 1997 American Institute of Physics

  16. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

    Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty, and how do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received
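
    The value-of-information quantities in this record can be made concrete with a toy two-stage problem (all numbers invented for illustration): EVPI is the gap between the here-and-now stochastic optimum and the wait-and-see expectation, and the cost of ignoring uncertainty is the penalty for planning as if one scenario (e.g. "no carbon cap") were certain.

```python
# First-stage decisions (rows) vs. scenarios (columns), cost to minimize
costs = {"clean":        {"cap": 80.0,  "no_cap": 90.0},
         "conventional": {"cap": 120.0, "no_cap": 60.0}}
p = {"cap": 0.5, "no_cap": 0.5}               # scenario probabilities

# Here-and-now: pick one decision, then face both scenarios
expected = {d: sum(p[s] * c[s] for s in p) for d, c in costs.items()}
sp_cost = min(expected.values())              # stochastic-program optimum (85)

# Wait-and-see: decide after the scenario is revealed
ws_cost = sum(p[s] * min(costs[d][s] for d in costs) for s in p)  # 70
evpi = sp_cost - ws_cost                      # expected value of perfect info (15)

# Cost of ignoring uncertainty: plan as if "no_cap" were certain,
# then evaluate that plan's true expected cost
naive = min(costs, key=lambda d: costs[d]["no_cap"])   # -> "conventional"
eciu = expected[naive] - sp_cost              # 90 - 85 = 5
```

    The same arithmetic, scaled up to the MARKAL model's investment decisions and scenario tree, yields the dissertation's reported values.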

  17. NACP Regional: Gridded 1-deg Observation Data and Biosphere and Inverse Model Outputs

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set contains standardized gridded observation data, terrestrial biosphere model output data, and inverse model simulations of carbon flux...

  18. NACP Regional: Gridded 1-deg Observation Data and Biosphere and Inverse Model Outputs

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains standardized gridded observation data, terrestrial biosphere model output data, and inverse model simulations of carbon flux parameters that...

  19. NACP Regional: Original Observation Data and Biosphere and Inverse Model Outputs

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains the originally-submitted observation measurement data, terrestrial biosphere model output data, and inverse model simulations that various...

  20. NACP Regional: Original Observation Data and Biosphere and Inverse Model Outputs

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set contains the originally-submitted observation measurement data, terrestrial biosphere model output data, and inverse model simulations that...

  1. The Open-source Data Inventory for Anthropogenic CO2, version 2016 (ODIAC2016): a global monthly fossil fuel CO2 gridded emissions data product for tracer transport simulations and surface flux inversions

    Directory of Open Access Journals (Sweden)

    T. Oda

    2018-01-01

    Full Text Available The Open-source Data Inventory for Anthropogenic CO2 (ODIAC) is a global high-spatial-resolution gridded emissions data product that distributes carbon dioxide (CO2) emissions from fossil fuel combustion. The emissions spatial distributions are estimated at a 1 × 1 km spatial resolution over land using power plant profiles (emissions intensity and geographical location) and satellite-observed nighttime lights. This paper describes the year 2016 version of the ODIAC emissions data product (ODIAC2016) and presents analyses that help guide data users, especially for atmospheric CO2 tracer transport simulations and flux inversion analysis. Since the original publication in 2011, we have made modifications to our emissions modeling framework in order to deliver a comprehensive global gridded emissions data product. Major changes from the 2011 publication are (1) the use of emissions estimates made by the Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL) by fuel type (solid, liquid, gas, cement manufacturing, gas flaring, and international aviation and marine bunkers); (2) the use of multiple spatial emissions proxies by fuel type, such as (a) nighttime light data specific to gas flaring and (b) ship/aircraft fleet tracks; and (3) the inclusion of emissions temporal variations. Using global fuel consumption data, we extrapolated the CDIAC emissions estimates for the recent years and produced the ODIAC2016 emissions data product that covers 2000–2015. Our emissions data can be viewed as an extended version of the CDIAC gridded emissions data product, which should allow data users to impose global fossil fuel emissions in a more comprehensive manner than the original CDIAC product. Our new emissions modeling framework allows us to produce future versions of the ODIAC emissions data product with a timely update. Such capability has become more significant given the CDIAC/ORNL's shutdown. The ODIAC data

  2. The Open-source Data Inventory for Anthropogenic CO2, version 2016 (ODIAC2016): a global monthly fossil fuel CO2 gridded emissions data product for tracer transport simulations and surface flux inversions

    Science.gov (United States)

    Oda, Tomohiro; Maksyutov, Shamil; Andres, Robert J.

    2018-01-01

    The Open-source Data Inventory for Anthropogenic CO2 (ODIAC) is a global high-spatial-resolution gridded emissions data product that distributes carbon dioxide (CO2) emissions from fossil fuel combustion. The emissions spatial distributions are estimated at a 1 × 1 km spatial resolution over land using power plant profiles (emissions intensity and geographical location) and satellite-observed nighttime lights. This paper describes the year 2016 version of the ODIAC emissions data product (ODIAC2016) and presents analyses that help guide data users, especially for atmospheric CO2 tracer transport simulations and flux inversion analysis. Since the original publication in 2011, we have made modifications to our emissions modeling framework in order to deliver a comprehensive global gridded emissions data product. Major changes from the 2011 publication are (1) the use of emissions estimates made by the Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL) by fuel type (solid, liquid, gas, cement manufacturing, gas flaring, and international aviation and marine bunkers); (2) the use of multiple spatial emissions proxies by fuel type such as (a) nighttime light data specific to gas flaring and (b) ship/aircraft fleet tracks; and (3) the inclusion of emissions temporal variations. Using global fuel consumption data, we extrapolated the CDIAC emissions estimates for the recent years and produced the ODIAC2016 emissions data product that covers 2000-2015. Our emissions data can be viewed as an extended version of CDIAC gridded emissions data product, which should allow data users to impose global fossil fuel emissions in a more comprehensive manner than the original CDIAC product. Our new emissions modeling framework allows us to produce future versions of the ODIAC emissions data product with a timely update. Such capability has become more significant given the CDIAC/ORNL's shutdown. 
The ODIAC data product could play an important
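
    The core gridding step that ODIAC-style products perform, spreading a national emissions total over grid cells in proportion to a spatial proxy such as nighttime-light intensity, can be sketched in a few lines (values invented for illustration; the real product combines several proxies per fuel type):

```python
import numpy as np

def disaggregate(national_total, proxy):
    """Allocate a national emissions total across grid cells in proportion
    to a spatial proxy field (e.g. nighttime-light intensity)."""
    w = np.asarray(proxy, dtype=float)
    return national_total * w / w.sum()   # weights sum to 1, total is conserved

# Hypothetical 2x2 grid of proxy intensities; total of 100 units to allocate
grid = disaggregate(100.0, [[0.0, 1.0], [3.0, 6.0]])
# cells receive 0, 10, 30 and 60 units; the national total is conserved
```

    Mass conservation under the allocation is the key property: whatever the proxy, summing the gridded field recovers the inventory total, which is what lets flux inversions treat the product as a hard constraint on fossil fuel emissions.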

  3. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  4. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    International Nuclear Information System (INIS)

    Brown, D.L.

    2009-01-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. 
It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  5. Analysis of Extreme Heat in Historical and Projected Climate Simulations for Regional Climate Planning Purposes in the U.S.

    Science.gov (United States)

    Geil, K.; Zeng, X.; McMahan, B.; Ferguson, D. B.

    2015-12-01

    The U.S. National Climate Assessment (NCA) states that global climate models predict more extreme temperatures and more frequent, intense, and longer heat waves on a regional basis as global temperatures rise throughout the 21st century, but a thorough test of whether these models can simulate observed heat metrics and trends over the historical period was not included in the assessment. Understanding the capabilities of climate models over the historical period is crucial to assessing our confidence in their predictive ability at regional scales. Our work fills this research gap by evaluating the performance of Coupled Model Intercomparison Project Phase 5 (CMIP5) models as compared to observational data using multiple heat metrics. Our metrics are targeted for the southwest United States, but our regional analysis covers the entire continental U.S. and Alaska using 7 of the regions delineated by the NCA. The heat metrics include heat wave and cold wave frequency, intensity, and duration, overnight low temperatures, onset and length of the hot season, and human heat stress. For the best performing models, we compute the same heat metrics for the RCP scenarios. In addition to presenting the results of our CMIP5 historical and RCP analyses, we also describe how our results may be applied to the benefit of our community in Southern Arizona as a case study. Our research will be used by NOAA's Climate Assessment for the Southwest (CLIMAS) and by an interdisciplinary collaborative team of researchers from the University of Arizona working with an electric utility to integrate climate information into their strategic planning.
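
    Heat wave frequency and duration metrics of the kind listed in this record are typically computed by scanning a daily temperature series for runs of consecutive days above a threshold. A minimal sketch (the threshold and minimum-run definitions vary by study; those below are assumptions for illustration):

```python
def heat_waves(tmax, threshold, min_days=3):
    """Count heat waves (runs of >= min_days consecutive days with tmax above
    threshold) and return (frequency, longest duration in days)."""
    runs, current = [], 0
    for t in tmax:
        if t > threshold:
            current += 1                 # extend the current hot spell
        else:
            if current >= min_days:      # spell just ended; was it long enough?
                runs.append(current)
            current = 0
    if current >= min_days:              # series may end mid-spell
        runs.append(current)
    return len(runs), max(runs, default=0)

# Toy daily-maximum series (deg C) with a 37-degree threshold
freq, longest = heat_waves([38, 39, 40, 35, 41, 42, 43, 44, 30], threshold=37)
# -> two events; the longest lasts 4 days
```

    Applied to each CMIP5 model's historical run and to gridded observations, the same counting logic yields directly comparable frequency and duration statistics per NCA region.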

  6. Challenges while Updating Planning Parameters of an ERP System and How a Simulation-Based Support System Can Support Material Planners

    Directory of Open Access Journals (Sweden)

    Ulrike Stumvoll

    2016-01-01

    Full Text Available In an Enterprise Resource Planning (ERP) system, production planning is influenced by a variety of parameters. Previous investigations show that setting parameter values is highly relevant to a company's target system. Parameter settings should be checked and adjusted by material planners, e.g., after a change in environmental factors. In practice, updating the parameters is difficult for several reasons. This paper presents a simulation-based decision support system that helps material planners in all stages of the decision-making process. It presents the system prototype's user interface and the results of applying the system to a case study.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies the responsibilities of ASC management and software project teams in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  8. Helios: a Multi-Purpose LIDAR Simulation Framework for Research, Planning and Training of Laser Scanning Operations with Airborne, Ground-Based Mobile and Stationary Platforms

    Science.gov (United States)

    Bechtold, S.; Höfle, B.

    2016-06-01

    In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.

  9. Limits to Nonlinear Inversion

    DEFF Research Database (Denmark)

    Mosegaard, Klaus

    2012-01-01

    For non-linear inverse problems, the mathematical structure of the mapping from model parameters to data is usually unknown or partly unknown. Absence of information about the mathematical structure of this function prevents us from presenting an analytical solution, so our solution depends on our ability to produce efficient search algorithms. Such algorithms may be completely problem-independent (which is the case for the so-called 'meta-heuristics' or 'blind-search' algorithms), or they may be designed with the structure of the concrete problem in mind. We show that pure meta...
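A 'blind-search' algorithm of the kind discussed uses no information about the structure of the forward mapping at all. A minimal sketch on a toy nonlinear inverse problem (the exponential forward model, parameter ranges, and sample budget are all invented for illustration):

```python
import math
import random

random.seed(42)

# Toy nonlinear forward problem: d(t) = a * exp(-b * t).
def forward(a, b, times):
    return [a * math.exp(-b * t) for t in times]

times = [0.0, 0.5, 1.0, 2.0, 4.0]
d_obs = forward(2.0, 0.7, times)  # "observed" data from known parameters

def misfit(model):
    a, b = model
    return sum((o - s) ** 2 for o, s in zip(d_obs, forward(a, b, times)))

# Pure blind search: sample the model space uniformly and keep the best,
# ignoring the structure of the forward mapping entirely.
best_model, best_cost = None, float("inf")
for _ in range(20000):
    trial = (random.uniform(0.0, 5.0), random.uniform(0.0, 2.0))
    cost = misfit(trial)
    if cost < best_cost:
        best_model, best_cost = trial, cost

print(best_model, best_cost)  # should land near (2.0, 0.7)
```

The point of the abstract is precisely the cost of this ignorance: such problem-independent search wastes most of its samples, which is why structure-aware algorithms scale better.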

  10. SU-F-T-520: Dosimetric Comparison of Radiation Treatment Plans for Whole Breast Irradiation Between 3D Conformal in Prone and Supine Positions Vs. VMAT and IMRT in Supine Positions

    Energy Technology Data Exchange (ETDEWEB)

    Bejarano Buele, A; Parsai, E [University of Toledo Medical Center, Toledo, OH (United States)

    2016-06-15

    Purpose: The target volume for Whole Breast Irradiation (WBI) is dictated by the location of the tumor mass, breast tissue distribution, and involvement of lymph nodes. Dose coverage and Organs at Risk (OARs) sparing can be difficult to achieve in patients with unfavorable thoracic geometries. For these cases, inverse-planned and 3D-conformal prone treatments can be alternatives to traditional supine 3D-conformal plans. A dosimetric comparison can determine which of these techniques achieves optimal target coverage while sparing OARs. Methods: This study included simulation datasets for 8 patients, 5 of whom were simulated in both supine and prone positions. Positioning devices included breast boards and Vaclok bags for the supine position, and prone breast boards for the prone position. WBI 3D-conformal plans were created for patients simulated in both positions. Additional VMAT and IMRT WBI plans were made for all patients in the supine position. Results: Prone and supine 3D-conformal plans had comparable PTV coverage. Prone 3D-conformal plans showed a significant 50% decrease in V20, V10, V5 and V30 for the ipsilateral lung in contrast to the supine plans. The heart also received a 10% decrease in maximum dose in the prone position, and V20, V10, V5 and V2 had significantly lower values than in the supine plan. Supine IMRT and VMAT breast plans obtained comparable PTV coverage. The heart received a 10% decrease in maximum dose with inverse-modulated plans compared with the supine 3D-conformal plan, while V20, V10, V5 and V2 showed higher values with inverse-modulated plans than with supine 3D-conformal plans. Conclusion: Prone 3D-conformal and supine inverse-planned treatments were generally superior to supine 3D-conformal plans in sparing OARs, with comparable PTV coverage. IMRT and VMAT plans spare OARs from high-dose regions at the cost of an increased irradiated volume in the low-dose regions.
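The Vx metrics quoted above denote the percentage of a structure's volume receiving at least x Gy. A minimal sketch of that computation, with made-up voxel doses standing in for a planning system's dose grid:

```python
def v_x(doses_gy, threshold_gy):
    """Percentage of structure volume receiving at least threshold_gy,
    assuming equal-sized voxels (doses_gy holds one dose per voxel)."""
    hit = sum(1 for d in doses_gy if d >= threshold_gy)
    return 100.0 * hit / len(doses_gy)

# Illustrative voxel doses (Gy) for an ipsilateral-lung-like structure;
# the values are invented for demonstration only.
lung_doses = [1.0, 2.5, 4.0, 6.0, 8.0, 12.0, 18.0, 22.0, 25.0, 3.0]

for x in (20, 10, 5, 2):
    print(f"V{x} = {v_x(lung_doses, x):.1f}%")
```

Comparing these values between the prone and supine plans is exactly the kind of OAR-sparing comparison the abstract reports.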

  11. Dynamic Inversion of Intraslab Intermediate Depth Earthquakes

    Science.gov (United States)

    Madariaga, R. I.; Ruiz, S.

    2011-12-01

    We perform kinematic and dynamic inversions of the 24 July 2008 (Mw 6.8) Iwate, northern Japan, and 16 December 2007 (Mw 6.8) Michilla, Chile, earthquakes using near-field strong-motion digital data. The data were filtered between 0.02 and 1 Hz. The rupture process is simulated with elliptical patches because we are looking for the average properties of the seismic rupture. The direct dynamic simulation problem was solved by a combination of finite-difference modeling on a 32 km³ grid with 200 m spacing, and propagation from source to recorders using the AXITRA spectral program. For both earthquakes we used layered models of the structure. The Neighborhood algorithm and Monte Carlo methods are used to obtain the best-fitting solutions and to explore the solution space. The optimum solutions are found by comparing observed and synthetic records using an L2 norm. Both kinematic and dynamic inversions fit the observed data with misfits lower than 0.3. For both earthquakes, kinematic inversion shows a strong trade-off between rupture velocity and maximum slip, although the seismic moment remains invariant. Rupture velocities vary between sub-shear speeds and almost Rayleigh wave speeds. In the dynamic inversions, 10 seismic source parameters were inverted for the Michilla earthquake and 8 parameters for the Iwate event, among them stress, friction and geometrical parameters. For the Iwate event the properties of the initial asperity at the source were not inverted because they could not be resolved by the data. In the dynamic inversion we observed a strong trade-off among the friction-law parameters. The best dynamic models form a family that shares similar values of seismic moment and kappa (the ratio of released strain energy to energy release rate for friction). Kinematic and dynamic inversions in the 0.02 - 1 Hz frequency range form a set of non-unique solutions controlled by specific combinations of seismic source parameters. We discuss the origin of the non-uniqueness of
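The comparison of observed and synthetic records by an L2 norm can be sketched as follows. The normalization by observed-record energy is an assumption made for illustration, not necessarily the authors' exact convention, and the toy traces are invented:

```python
import math

def l2_misfit(observed, synthetic):
    """Normalized L2 misfit between an observed and a synthetic record;
    0 means a perfect fit. Normalizing by the observed-record energy is
    one common convention (an assumption here)."""
    num = sum((o - s) ** 2 for o, s in zip(observed, synthetic))
    den = sum(o ** 2 for o in observed)
    return math.sqrt(num / den)

# Toy samples standing in for band-passed (0.02-1 Hz) strong-motion traces.
obs = [0.0, 0.4, 1.0, 0.3, -0.5, -0.2]
syn_good = [0.0, 0.38, 0.95, 0.33, -0.48, -0.18]
syn_poor = [0.0, 0.1, 0.3, 0.8, 0.2, 0.5]

print(l2_misfit(obs, syn_good))  # small: an acceptable trial model
print(l2_misfit(obs, syn_poor))  # large: a rejected trial model
```

In the inversion, a search algorithm (Neighborhood or Monte Carlo) proposes trial source models, synthetics are computed for each, and this misfit ranks them; the abstract's acceptance level of 0.3 is a threshold on such a norm.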

  12. Toward Online Adaptive Hyperthermia Treatment Planning: Correlation Between Measured and Simulated Specific Absorption Rate Changes Caused by Phase Steering in Patients

    Energy Technology Data Exchange (ETDEWEB)

    Kok, H. Petra, E-mail: H.P.Kok@amc.uva.nl [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Ciampa, Silvia [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Department of Civil Engineering and Computer Science, University of Rome Tor Vergata, Rome (Italy); Kroon-Oldenhof, Rianne de; Steggerda-Carvalho, Eva J.; Stam, Gerard van; Zum Vörde Sive Vörding, Paul J.; Stalpers, Lukas J.A.; Geijsen, Elisabeth D. [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Bardati, Fernando [Department of Civil Engineering and Computer Science, University of Rome Tor Vergata, Rome (Italy); Bel, Arjan; Crezee, Johannes [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands)

    2014-10-01

    Purpose: Hyperthermia is the clinical application of heat, in which tumor temperatures are raised to 40°C to 45°C. This proven radiation and chemosensitizer significantly improves clinical outcome for several tumor sites. Earlier studies of the use of pre-treatment planning for hyperthermia showed good qualitative but disappointing quantitative reliability. The purpose of this study was to investigate whether hyperthermia treatment planning (HTP) can be used more reliably for online adaptive treatment planning during locoregional hyperthermia treatments. Methods and Materials: This study included 78 treatment sessions for 15 patients with non-muscle-invasive bladder cancer. At the start of treatments, temperature rise measurements were performed with 3 different antenna settings optimized for each patient, from which the absorbed power (specific absorption rate [SAR]) was derived. HTP was performed based on a computed tomography (CT) scan in treatment position with the bladder catheter in situ. The SAR along the thermocouple tracks was extracted from the simulated SAR distributions. Correlations between measured and simulated (average) SAR values were determined. To evaluate phase steering, correlations between the changes in simulated and measured SAR values averaged over the thermocouple probe were determined for all 3 combinations of antenna settings. Results: For 42% of the individual treatment sessions, the correlation coefficient between measured and simulated SAR profiles was higher than 0.5, whereas 58% showed a weak correlation (R of <0.5). The overall correlation coefficient between measured and simulated average SAR was weak (R=0.31; P<.001). The measured and simulated changes in average SAR after adapting antenna settings correlated much better (R=0.70; P<.001). The ratio between the measured and simulated quotients of maximum and average SARs was 1.03 ± 0.26 (mean ± SD), indicating that HTP can also correctly predict the relative amplitude of
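The correlation coefficients quoted (R = 0.31 for absolute SAR, R = 0.70 for SAR changes) are ordinary Pearson correlations between paired measured and simulated values. A minimal sketch with made-up per-session SAR changes:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (invented) changes in average SAR (W/kg) after switching
# between two antenna settings, one pair per treatment session.
measured_change = [1.2, -0.5, 0.8, 2.1, -1.0, 0.3, 1.5]
simulated_change = [1.0, -0.2, 0.9, 1.8, -1.3, 0.6, 1.2]

r = pearson_r(measured_change, simulated_change)
print(f"R = {r:.2f}")
```

The study's finding is that this correlation is much stronger for *changes* in SAR under phase steering than for absolute SAR values, which is what makes online adaptive re-planning plausible.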

  13. Simulating stakeholder behavior in a marine setting: Integrated coastal zone planning and the influential power of selected stakeholders in Frøya, Norway

    Directory of Open Access Journals (Sweden)

    Rachel Gjelsvik Tiller

    2015-11-01

    Full Text Available Aquaculture expansion is a political priority in Norway, despite simmering conflicts and competing claims. We expand on this hypothesis and analyze the Norwegian governance system, adding stakeholder theory to a simulated model of the effects of municipal coastal zone planning in the municipality of Frøya, Norway. One cannot analyze the governance system in Norway without fully comprehending the perspectives of the stakeholders involved. Different stakeholders react and respond differently and hold conflicting presumptions on which they base their actions towards the planning process for coastal areas. They also have different levels of power and ability to influence the system. The article presents the interdisciplinary, first-generation development of an agent-based simulation model that mimics the outcomes of coastal zone planning for two stakeholder groups, the commercial fishers and the aquaculture industry, based on qualitative input from legislation, regulations and stakeholder workshops. We proceed by verifying the applicability of this simulator in light of the key actors involved, namely the commercial fishers. We found that the simulator had two consistently recurring outcomes for the commercial fishers, namely collapse and stability, based on the simulated occurrences of complaints by the stakeholders, with the latter matching the commercial fishers' de facto perceptions of actuality. Using stakeholder theory, we argue that the aquaculture industry, which has the saliency of an Important Stakeholder in Frøya, has steered the commercial fishers, who have the role of Dependent Stakeholders according to stakeholder theory, to no longer see any legitimacy in the process, in that their complaints were never upheld because of their lack of the attribute Power.

  14. Inverse plasma equilibria

    International Nuclear Information System (INIS)

    Hicks, H.R.; Dory, R.A.; Holmes, J.A.

    1983-01-01

    We illustrate in some detail a 2D inverse-equilibrium solver that was constructed to analyze tokamak configurations and stellarators (the latter in the context of the average method). To ensure that the method is suitable not only to determine equilibria, but also to provide appropriately represented data for existing stability codes, it is important to be able to control the Jacobian, J̃ ≡ ∂(R,Z)/∂(ρ,θ). The form chosen is J̃ = J₀(ρ) Rˡ ρ, where ρ is a flux surface label and l is an integer. The initial implementation is for a fixed conducting-wall boundary, but the technique can be extended to a free-boundary model.
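The Jacobian being controlled is that of the map from flux coordinates (ρ, θ) to cylindrical coordinates (R, Z). For a toy parametrization with concentric circular flux surfaces (an illustration only, not the paper's configuration), the Jacobian works out to exactly ρ, i.e. the prescribed form J₀(ρ) Rˡ ρ with J₀ = 1 and l = 0, which can be checked numerically:

```python
import math

R0 = 3.0  # major radius of a toy, concentric-circular "tokamak"

# Simple illustrative flux-coordinate parametrization: concentric
# circular flux surfaces labelled by rho.
def R(rho, theta):
    return R0 + rho * math.cos(theta)

def Z(rho, theta):
    return rho * math.sin(theta)

def jacobian(rho, theta, h=1e-6):
    """J = d(R,Z)/d(rho,theta) via central finite differences."""
    dR_drho = (R(rho + h, theta) - R(rho - h, theta)) / (2 * h)
    dR_dth = (R(rho, theta + h) - R(rho, theta - h)) / (2 * h)
    dZ_drho = (Z(rho + h, theta) - Z(rho - h, theta)) / (2 * h)
    dZ_dth = (Z(rho, theta + h) - Z(rho, theta - h)) / (2 * h)
    return dR_drho * dZ_dth - dR_dth * dZ_drho

print(jacobian(0.5, 1.0))  # ~0.5, i.e. equal to rho
```

Controlling this quantity matters because stability codes expect the (ρ, θ) grid to have a specific, well-behaved density of coordinate surfaces.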

  15. RFDR with Adiabatic Inversion Pulses: Application to Internuclear Distance Measurements

    International Nuclear Information System (INIS)

    Leppert, Joerg; Ohlenschlaeger, Oliver; Goerlach, Matthias; Ramachandran, Ramadurai

    2004-01-01

    In the context of the structural characterisation of biomolecular systems via MAS solid state NMR, the potential utility of homonuclear dipolar recoupling with adiabatic inversion pulses has been assessed via numerical simulations and experimental measurements. The results obtained suggest that it is possible to obtain reliable estimates of internuclear distances via an analysis of the initial cross-peak intensity buildup curves generated from two-dimensional adiabatic inversion pulse driven longitudinal magnetisation exchange experiments

  16. The role of the umbrella inversion mode in proton diffusion

    Science.gov (United States)

    Hassanali, Ali A.; Giberti, Federico; Sosso, Gabriele C.; Parrinello, Michele

    2014-04-01

    Here, using ab initio molecular dynamics (AIMD) simulations, we elucidate the role of the umbrella inversion mode of the hydronium in proton transfer (PT) in liquid water. The hydrophobic face of the hydronium oxygen experiences asymmetries in the solvent potential along the inversion coordinate and this has a rather drastic effect on the barrier for proton transfer. This behavior is coupled to the fluctuations of voids or cavities in the vicinity of the hydronium in the water network. The peculiar inversion mode can either trap or release the proton from different parts of the water network.

  17. Holocaust inversion and contemporary antisemitism.

    OpenAIRE

    Klaff, Lesley D

    2014-01-01

    One of the cruellest aspects of the new antisemitism is its perverse use of the Holocaust as a stick to beat 'the Jews'. This article explains the phenomenon of 'Holocaust Inversion', which involves an 'inversion of reality' (the Israelis are cast as the 'new' Nazis and the Palestinians as the 'new' Jews) and an 'inversion of morality' (the Holocaust is presented as a moral lesson for, or even a moral indictment of, 'the Jews'). Holocaust inversion is a form of soft-core Holocaust denial, yet...

  18. Inversion twinning in troilite

    Czech Academy of Sciences Publication Activity Database

    Skála, Roman; Císařová, I.; Drábek, M.

    2006-01-01

    Vol. 91, 5-6 (2006), pp. 917-921 ISSN 0003-004X R&D Projects: GA ČR GA205/02/1101 Institutional research plan: CEZ:AV0Z30130516 Keywords: troilite * absolute structure * crystallography * iron monosulfide * iron meteorites Subject RIV: DB - Geology; Mineralogy Impact factor: 1.977, year: 2006

  19. TU-C-17A-08: Improving IMRT Planning and Reducing Inter-Planner Variability Using the Stochastic Frontier Method: Validation Based On Clinical and Simulated Data

    Energy Technology Data Exchange (ETDEWEB)

    Gagne, MC; Archambault, L [Departement de Physique, Genie Physique et Optique, Universite Laval, Quebec, Quebec (Canada); CHU de Quebec, Quebec, Quebec (Canada); Centre de recherche sur le cancer, Quebec, Quebec (Canada); Tremblay, D; Varfalvy, N [Departement de Physique, Genie Physique et Optique, Universite Laval, Quebec, Quebec (Canada); CHU de Quebec, Quebec, Quebec (Canada)

    2014-06-15

    Purpose: Intensity-modulated radiation therapy always requires compromises between PTV coverage and organs at risk (OAR) sparing. We previously developed metrics that correlate doses to OARs with specific patients’ morphology using stochastic frontier analysis (SFA). Here, we aim to examine the validity of this approach using a large set of realistically simulated dosimetric and geometric data. Methods: SFA describes a set of treatment plans as an asymmetric distribution with respect to a frontier defining optimal plans. Eighty head and neck IMRT plans were used to establish a metric predicting the mean dose to the parotids as a function of simple geometric parameters. A database of 140 parotids was used as a basis distribution to simulate physically plausible data of geometry and dose. Distributions comprising between 20 and 5000 organs were simulated, and the SFA was applied to obtain new frontiers, which were compared to the original frontier. Results: It was possible to simulate distributions consistent with the original dataset. Below 160 organs, the SFA could not always describe distributions as asymmetric: a few cases showed a Gaussian or half-Gaussian distribution. To converge to a stable solution, the number of organs in a distribution must ideally be above 100, but in many cases stable parameters could be achieved with as few as 60 samples of organ data. The mean RMS error of the new frontiers was significantly reduced when additional organs were used. Conclusion: The number of organs in a distribution was shown to have an impact on the effectiveness of the model. It is always possible to obtain a frontier, but if the number of organs in the distribution is small (< 160), it may not represent the lowest dose achievable. These results will be used to determine the number of cases necessary to adapt the model to other organs.
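The frontier idea can be illustrated with a much-simplified "corrected OLS" stand-in for full maximum-likelihood stochastic frontier analysis: fit an average trend of dose versus a geometric parameter, then shift it down so it passes under every plan. All numbers below are synthetic and purely illustrative:

```python
import random

random.seed(0)

# Synthetic plan data: mean parotid dose grows with a geometric parameter
# (e.g. target-parotid proximity). Every plan sits at or above an optimal
# frontier, offset by a one-sided "inefficiency" term.
def simulate_plans(n):
    plans = []
    for _ in range(n):
        g = random.uniform(0.0, 1.0)            # geometric parameter
        frontier = 10.0 + 25.0 * g              # lowest achievable dose (Gy)
        inefficiency = abs(random.gauss(0, 3))  # one-sided suboptimality
        plans.append((g, frontier + inefficiency))
    return plans

def cols_frontier(plans):
    """'Corrected OLS': fit the average line, then shift the intercept
    down by the most negative residual so the line bounds all plans from
    below. A simplified stand-in for maximum-likelihood SFA."""
    n = len(plans)
    mg = sum(g for g, _ in plans) / n
    md = sum(d for _, d in plans) / n
    b = (sum((g - mg) * (d - md) for g, d in plans)
         / sum((g - mg) ** 2 for g, _ in plans))
    a = md - b * mg
    shift = min(d - (a + b * g) for g, d in plans)
    return a + shift, b

a, b = cols_frontier(simulate_plans(200))
print(f"frontier: dose = {a:.1f} + {b:.1f} * g")  # roughly 10 + 25 * g
```

The abstract's finding maps onto this sketch directly: with too few plans, the one-sided scatter above the line is not clearly asymmetric, and the recovered frontier becomes unstable.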

  20. Inverse feasibility problems of the inverse maximum flow problems

    Indian Academy of Sciences (India)

    A linear time method to decide if any inverse maximum flow (denoted General Inverse Maximum Flow problems (IMFG)) problem has solution is deduced. If IMFG does not have solution, methods to transform IMFG into a feasible problem are presented. The methods consist of modifying as little as possible the restrictions to ...

  1. Inverse feasibility problems of the inverse maximum flow problems

    Indian Academy of Sciences (India)

    Adrian Deaconu and Eleonor Ciurea, Department of Mathematics and Computer Science, Faculty of Mathematics and Informatics, Transilvania University of Brasov, Iuliu Maniu st. 50, Brasov, Romania. pp. 199–209.

  2. Relative plan robustness of step-and-shoot vs rotational intensity–modulated radiotherapy on repeat computed tomographic simulation for weight loss in head and neck cancer

    Energy Technology Data Exchange (ETDEWEB)

    Thomson, David J. [Department of Clinical Oncology, The Christie NHS Foundation Trust, Manchester (United Kingdom); The University of Manchester, Manchester Academic Health Science Centre, Institute of Cancer Sciences, Manchester (United Kingdom); Beasley, William J. [The University of Manchester, Manchester Academic Health Science Centre, Institute of Cancer Sciences, Manchester (United Kingdom); Christie Medical Physics and Engineering, The Christie NHS Foundation Trust, Manchester (United Kingdom); Garcez, Kate; Lee, Lip W.; Sykes, Andrew J. [Department of Clinical Oncology, The Christie NHS Foundation Trust, Manchester (United Kingdom); Rowbottom, Carl G. [The University of Manchester, Manchester Academic Health Science Centre, Institute of Cancer Sciences, Manchester (United Kingdom); Christie Medical Physics and Engineering, The Christie NHS Foundation Trust, Manchester (United Kingdom); Slevin, Nicholas J., E-mail: nick.slevin@christie.nhs.uk [Department of Clinical Oncology, The Christie NHS Foundation Trust, Manchester (United Kingdom); The University of Manchester, Manchester Academic Health Science Centre, Institute of Cancer Sciences, Manchester (United Kingdom)

    2016-07-01

    Introduction: Interfractional anatomical alterations may have a differential effect on the dose delivered by step-and-shoot intensity-modulated radiotherapy (IMRT) and volumetric-modulated arc therapy (VMAT). The increased degrees of freedom afforded by rotational delivery may increase plan robustness (measured by change in target volume coverage and doses to organs at risk [OARs]). However, this has not been evaluated for head and neck cancer. Materials and methods: A total of 10 patients who required repeat computed tomography (CT) simulation and replanning during head and neck IMRT were included. Step-and-shoot IMRT and VMAT plans were generated from the original planning scan. The initial and second CT