WorldWideScience

Sample records for simulated photograph technique

  1. Photographic and drafting techniques simplify method of producing engineering drawings

    Science.gov (United States)

    Provisor, H.

    1968-01-01

A combination of photographic and drafting techniques has been developed to simplify the preparation of three-dimensional and dimetric engineering drawings. Conventional photographs can be converted to line drawings by making copy negatives on high-contrast film.

  2. A histogram-based technique for rapid vector extraction from PIV photographs

    Science.gov (United States)

    Humphreys, William M., Jr.

    1991-01-01

    A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.
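The abstract does not give the algorithm's details, but one common image-plane formulation of this idea can be sketched as follows (the function and parameters below are illustrative assumptions, not taken from the paper): histogram all pairwise particle displacements within an interrogation region and take the histogram peak as the velocity estimate, avoiding Fourier transforms entirely.

```python
import numpy as np

def displacement_histogram_peak(centroids_t0, centroids_t1, bin_size=0.5, max_disp=20.0):
    """Estimate the dominant particle displacement in one interrogation
    region by histogramming all pairwise displacement vectors between
    the two exposures; the histogram peak is the velocity estimate."""
    disps = (centroids_t1[None, :, :] - centroids_t0[:, None, :]).reshape(-1, 2)
    # Keep only physically plausible displacements
    mask = np.all(np.abs(disps) < max_disp, axis=1)
    disps = disps[mask]
    edges = np.arange(-max_disp, max_disp + bin_size, bin_size)
    hist, xe, ye = np.histogram2d(disps[:, 0], disps[:, 1], bins=[edges, edges])
    ix, iy = np.unravel_index(np.argmax(hist), hist.shape)
    # Return the centre of the peak bin
    return (xe[ix] + bin_size / 2, ye[iy] + bin_size / 2)

# Synthetic check: particles uniformly shifted by (3.2, -1.7) pixels
rng = np.random.default_rng(0)
p0 = rng.uniform(0, 100, size=(50, 2))
p1 = p0 + np.array([3.2, -1.7])
print(displacement_histogram_peak(p0, p1))
```

The true pairs all fall in one bin, while random cross-pairings scatter thinly over the whole histogram, so the peak recovers the shift without any correlation transform.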

  3. New applications of photographic materials in science and technique

    International Nuclear Information System (INIS)

    Buschmann, H.T.; Deml, R.; Duville, R.; Philippaerts, H.; Bollen, R.; Ranz, E.

    1976-01-01

In spite of some disadvantages, photographic materials based on silver halides possess the outstanding feature of high sensitivity. Special photographic materials have therefore been developed again and again for new techniques, including information storage. This contribution reports on some special photographic materials and briefly discusses some applications. Materials are discussed in detail for holography, carrier-frequency photography, the production of masks for integrated circuits, the recording of equidensities, bubble chamber photography, and neutron radiography. (orig.)

  4. Photographic appraisal of crystal lattice growth technique

    Directory of Open Access Journals (Sweden)

    Kapoor D

    2005-01-01

The concept of creating mechanical retention for bonding through crystal growth was successfully achieved in the present study. Using polyacrylic acid sulphated with sulphuric acid as the etchant, abundant crystal growth was demonstrated. Keeping in view the obvious benefits of the crystal growth technique, the present SEM study aimed to observe and compare the changes brought about by different etching agents (phosphoric acid, polyacrylic acid, and sulphated polyacrylic acid) and to evaluate their advantages and disadvantages in an attempt to reduce the iatrogenic trauma caused by surface enamel alteration. Control and experimental groups comprised 24 and 30 premolars, respectively, for scanning electron microscopic appraisal of normal unetched and etched enamel surfaces, fracture sites, and finished surfaces. Compared with conventional phosphoric acid and the weaker polyacrylic acid, the investigation indicated that crystal growth treatment caused minimal iatrogenic trauma to the enamel surface, and surface alterations were restored to the original untreated condition to a large extent.

  5. Determination of rock fragmentation based on a photographic technique

    International Nuclear Information System (INIS)

    Dehgan Banadaki, M.M.; Majdi, A.; Raessi Gahrooei, D.

    2002-01-01

The paper presents a physical blasting model at laboratory scale along with a photographic approach to describing the size distribution of blasted rock materials. For this purpose, based on the Weibull probability distribution function, eight samples, each weighing 100 kg, were obtained. Four pictures of four different sections of each sample were taken. The pictures were then converted into graphic files, with the boundary of each piece of rock in the samples characterized. Errors caused by perspective were eliminated. The volume of each piece of the blasted rock material, and hence the sieve size each piece would pass through, were calculated. Finally, the original blasted rock size distribution was compared with that obtained from the photographic method. The paper concludes by presenting an approach to convert the results of the photographic technique into the size distribution obtained by sieve analysis, with sufficient verification.
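As a rough illustration of the photographic sizing step (a generic sketch under simplifying assumptions, not the authors' procedure): given the projected area of each digitised fragment, one can assign an equivalent spherical diameter and volume and accumulate a percent-passing curve against sieve sizes.

```python
import numpy as np

def passing_curve(areas_cm2, sieves_cm):
    """From projected fragment areas (digitised from a photograph),
    estimate each fragment's equivalent spherical diameter and volume,
    then the cumulative percent (by volume) passing each sieve size."""
    d = 2.0 * np.sqrt(np.asarray(areas_cm2) / np.pi)   # equivalent diameter
    v = (np.pi / 6.0) * d**3                           # equivalent volume
    total = v.sum()
    return [100.0 * v[d <= s].sum() / total for s in sieves_cm]

areas = [3.1, 12.6, 50.3, 0.8, 7.1]   # cm^2, hypothetical fragments
print(passing_curve(areas, sieves_cm=[2.0, 5.0, 10.0]))
```

The resulting curve is directly comparable to a conventional sieve analysis, which is how the paper's photographic and mechanical size distributions could be brought onto a common axis.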

  6. Best of Adobe Photoshop techniques and images from professional photographers

    CERN Document Server

    Hurter, Bill

    2006-01-01

Bill Hurter is the editor of "Rangefinder" magazine, the former editor of "Petersen's PhotoGraphic," and the author of "The Best of Wedding Photography," "Group Portrait Photography Handbook," "The Portrait Photographer's Guide to Posing," and "Portrait Photographer's Handbook." He lives in Santa Monica, California.

  7. Advanced imaging techniques II: using a compound microscope for photographing point-mount specimens

    Science.gov (United States)

Digital imaging technology has revolutionized the practice of photographing insects for scientific study. Described herein are lighting and mounting techniques designed for imaging micro-Hymenoptera. The techniques described here are applicable to all small insects, as well as other invertebrates. The ke...

  8. Airflow Simulation Techniques

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

The paper describes the development of airflow simulation in rooms. The research is, like other areas of flow research, influenced by the decreasing cost of computation, which seems to indicate an increased use of airflow simulation in the coming years.

  9. Simulation Techniques That Work.

    Science.gov (United States)

    Beland, Robert M.

    1983-01-01

    At the University of Florida, simulated experiences with disabled clients help bridge the gap between coursework and internships for recreation therapy students. Actors from the university's drama department act out the roles of handicapped persons, who are interviewed by therapy students. (PP)

  10. Photographic and video techniques used in the 1/5-scale Mark I boiling water reactor pressure suppression experiment

    International Nuclear Information System (INIS)

    Dixon, D.; Lord, D.

    1978-01-01

    The report provides a description of the techniques and equipment used for the photographic and video recordings of the air test series conducted on the 1/5 scale Mark I boiling water reactor (BWR) pressure suppression experimental facility at Lawrence Livermore Laboratory (LLL) between March 4, 1977, and May 12, 1977. Lighting and water filtering are discussed in the photographic system section and are also applicable to the video system. The appendices contain information from the photographic and video camera logs

  11. Design Techniques and Reservoir Simulation

    Directory of Open Access Journals (Sweden)

    Ahad Fereidooni

    2012-11-01

Enhanced oil recovery using nitrogen injection is a commonly applied method for pressure maintenance in conventional reservoirs. Numerical simulations can be used to predict reservoir performance in the course of the injection process; however, a detailed simulation might take up enormous computer processing time. In such cases, a simple statistical model may be a good approach to the preliminary prediction of the process without any application of numerical simulation. In the current work, seven rock/fluid reservoir properties are considered as screening parameters, and those having the most considerable effect on the process are determined using a combination of experimental design techniques and reservoir simulations. The statistical significance of the main effects and interactions of the screening parameters is then analyzed using statistical inference approaches. Finally, the influential parameters are employed to create a simple statistical model which allows the preliminary prediction of nitrogen injection, in terms of recovery factor, without resorting to numerical simulations.
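The screening idea can be illustrated with a minimal two-level full-factorial sketch (the factor names and the response function below are hypothetical stand-ins for the reservoir simulator): each factor's main effect is the difference between the mean response at its high and low levels.

```python
import itertools
import numpy as np

def main_effects(factors, response):
    """Two-level full-factorial screening: for each factor, the main
    effect is mean(response at +1) minus mean(response at -1)."""
    runs = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))
    y = np.array([response(dict(zip(factors, run))) for run in runs])
    return {f: y[runs[:, i] > 0].mean() - y[runs[:, i] < 0].mean()
            for i, f in enumerate(factors)}

# Hypothetical recovery-factor proxy: permeability matters most,
# porosity a little, depth not at all.
proxy = lambda x: 40 + 6 * x["perm"] + 1.5 * x["poro"] + 0 * x["depth"]
effects = main_effects(["perm", "poro", "depth"], proxy)
print(effects)  # perm: 12.0, poro: 3.0, depth: 0.0
```

Ranking the absolute effects identifies the influential parameters to keep in the simple statistical model; in practice each "response" evaluation would be one reservoir-simulation run rather than an analytic proxy.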

  12. Physical simulations using centrifuge techniques

    International Nuclear Information System (INIS)

    Sutherland, H.J.

    1981-01-01

Centrifuge techniques offer a means of physically simulating the long-term mechanical response of deep-ocean sediment to the emplacement of waste canisters and to the temperature gradients generated by them. Preliminary investigations of the scaling laws for the pertinent phenomena indicate that the time scaling will be consistent among them and equal to the scaling factor squared. This result implies that the technique will permit accelerated life testing of proposed configurations; i.e., long-term studies may be done in relatively short times. Existing centrifuges are presently being modified to permit scale-model testing. This testing will start next year.

  13. Study of two-phase boundary layer phenomena in boiling water by means of photographic techniques

    International Nuclear Information System (INIS)

    Molen, S.B. van der

    1976-01-01

The behaviour of bubbles in the boundary layer of a two-phase flow is important for the heat exchange between the heat production unit and the cooling medium. Theoretical knowledge of the forces on a bubble and of the interactions between molecules of different kinds is essential for understanding the phenomena. Photographic techniques are needed to investigate the bubble pattern that exists where Departure from Nucleate Boiling is found. (orig.)

  14. Optimizing of verification photographs by using the so-called tangential field technique

    International Nuclear Information System (INIS)

    Proske, H.; Merte, H.; Kratz, H.

    1991-01-01

When irradiating under high-voltage conditions, verification photographs prove difficult to take if the gantry position is not aligned to 0° or 180°, since the patient is being irradiated diagonally. Under these conditions it is extremely difficult to align the X-ray cartridge perpendicular to the central beam of the therapeutic radiation. This results in, amongst other things, misprojections, so that definite dimensions of portrayed organ structures become practically impossible to determine. This paper describes how we have solved these problems on our high-voltage units (tele-gamma cobalt unit and linear accelerator). By using simple accessories, the dimensions of organ structures shown on the verification photographs can be determined. We illustrate our method using the so-called tangential field technique for irradiating mammary carcinoma. (orig.)

  15. Multilevel techniques for Reservoir Simulation

    DEFF Research Database (Denmark)

    Christensen, Max la Cour

The subject of this thesis is the development, application and study of novel multilevel methods for the acceleration and improvement of reservoir simulation techniques. The motivation for addressing this topic is a need for more accurate predictions of porous media flow and the ability to carry...
• FAS (Full Approximation Scheme)
• Variational (Galerkin) upscaling
• Linear solvers and preconditioners
First, a nonlinear multigrid scheme in the form of the Full Approximation Scheme (FAS) is implemented and studied for a 3D three-phase compressible rock/fluids immiscible reservoir simulator... This is extended to include a hybrid strategy, in which FAS is combined with Newton's method to construct a multilevel nonlinear preconditioner. This method demonstrates high efficiency and robustness. Second, an improved IMPES-formulated reservoir simulator is implemented using a novel variational upscaling approach...

  16. Sources of uncertainty in individual monitoring for photographic,TL and OSL dosimetry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Max S.; Silva, Everton R.; Mauricio, Claudia L.P., E-mail: max.das.ferreira@gmail.com, E-mail: everton@ird.gov.br, E-mail: claudia@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

The identification of uncertainty sources and their quantification is essential to the quality of any dosimetric result. Even if uncertainties are not stated for the dose measurements reported monthly to the monitored radiation facilities, they need to be known. This study analyzes the influence of the different sources of uncertainty associated with the photographic, TL and OSL dosimetric techniques, considering the evaluation of occupational doses from whole-body exposure to photons. To identify the sources of uncertainty, a bibliographic review of documents dealing with the operational aspects of each technique and their associated uncertainties was conducted. In addition, technical visits to individual monitoring services were conducted to assist in this identification. The sources of uncertainty were categorized and their contributions expressed qualitatively. The calibration and traceability processes are the most important sources of uncertainty, regardless of the technique used. For photographic dosimetry, the remaining important uncertainty sources are energy and angular dependence, linearity of response, and variations in film processing. For TL and OSL, the key to good performance is the reproducibility of the thermal and optical cycles, respectively. For all three techniques, every procedure of the measurement process must be standardized, controlled and reproducible. Further studies can be performed to quantify the contributions of the sources of uncertainty. (author)

  17. Testing philosophy and simulation techniques

    International Nuclear Information System (INIS)

    Holtbecker, H.

    1977-01-01

This paper reviews past and present testing philosophies and simulation techniques in the field of structure loading and response studies. The main objective of experimental programmes in the past was to simulate a hypothetical energy release with explosives and to deduce the potential damage to a reactor from the measured damage to the model. This approach was continuously refined by improving the instrumentation of the models, by reproducing the structures as faithfully as possible, and by developing new explosive charges. This paper presents an analysis of the factors expected to influence the validity of the results, e.g. strain-rate effects and the use of water instead of sodium. More recently, the discussion of whole series of accidents in probabilistic accident analysis, and the intention to compare different reactor designs, have revealed the need to develop and validate computer codes. Consequently, experimental programmes have been started in which the primary aim is not to test a specific reactor but to validate codes. This paper shows the principal aspects of this approach and discusses first results. (Auth.)

  18. Visual air quality simulation techniques

    Science.gov (United States)

    Molenar, John V.; Malm, William C.; Johnson, Christopher E.

    Visual air quality is primarily a human perceptual phenomenon beginning with the transfer of image-forming information through an illuminated, scattering and absorbing atmosphere. Visibility, especially the visual appearance of industrial emissions or the degradation of a scenic view, is the principal atmospheric characteristic through which humans perceive air pollution, and is more sensitive to changing pollution levels than any other air pollution effect. Every attempt to quantify economic costs and benefits of air pollution has indicated that good visibility is a highly valued and desired environmental condition. Measurement programs can at best approximate the state of the ambient atmosphere at a few points in a scenic vista viewed by an observer. To fully understand the visual effect of various changes in the concentration and distribution of optically important atmospheric pollutants requires the use of aerosol and radiative transfer models. Communication of the output of these models to scientists, decision makers and the public is best done by applying modern image-processing systems to generate synthetic images representing the modeled air quality conditions. This combination of modeling techniques has been under development for the past 15 yr. Initially, visual air quality simulations were limited by a lack of computational power to simplified models depicting Gaussian plumes or uniform haze conditions. Recent explosive growth in low cost, high powered computer technology has allowed the development of sophisticated aerosol and radiative transfer models that incorporate realistic terrain, multiple scattering, non-uniform illumination, varying spatial distribution, concentration and optical properties of atmospheric constituents, and relative humidity effects on aerosol scattering properties. This paper discusses these improved models and image-processing techniques in detail. Results addressing uniform and non-uniform layered haze conditions in both

  19. Secondary side photographic techniques used in characterization of Surry steam generator

    International Nuclear Information System (INIS)

    Sinclair, R.B.

    1984-10-01

    Characterization of the generator's secondary side prior to destructive removal of tubing presents a significant challenge. Information must be obtained in a radioactive field (up to 15 R/h) throughout the tightly spaced bundle of steam generator tubes. This report discusses the various techniques employed, along with their respective advantages and disadvantages. The most successful approach to nondestructive secondary side characterization and documentation was through use of in-house developed pinhole cameras. These devices provided accurate photographic documentation of generator condition. They could be fabricated in geometries allowing access to all parts of the generator. Semi-remote operation coupled with large area coverage per investigation and short at-location times resulted in significant personnel exposure advantages. The fabrication and use of pinhole cameras for remote inspection is discussed in detail

  20. Synchronization Techniques in Parallel Discrete Event Simulation

    OpenAIRE

    Lindén, Jonatan

    2018-01-01

    Discrete event simulation is an important tool for evaluating system models in many fields of science and engineering. To improve the performance of large-scale discrete event simulations, several techniques to parallelize discrete event simulation have been developed. In parallel discrete event simulation, the work of a single discrete event simulation is distributed over multiple processing elements. A key challenge in parallel discrete event simulation is to ensure that causally dependent ...

  1. Determining Conjugate Points of Aerial Photograph Stereopairs Using the Separate Channel Mean Value Technique

    Directory of Open Access Journals (Sweden)

    Andri Hernandi

    2009-11-01

In the development of digital photogrammetric systems, the automatic image matching process plays an important role. Automatic image matching is used to find the conjugate points of an aerial photograph stereopair automatically. This matching technique makes a significant contribution, especially to the development of 3D photogrammetry, in the attempt to obtain exact and precise topographic information during stereo restitution. Two image matching approaches have so far been developed: the area-based approach for gray-level environments and the feature-based approach for natural-feature environments. This research implements area-based matching with the normalized cross correlation technique to obtain the correlation coefficient between the spectral values of the left image and its pair on the right. Based on previous research, the use of color images can increase the quality of matching; one such color image matching technique is known as Separate Channel Mean Value. To assess the performance of the technique, a number of sampling areas with different characteristics were chosen: heterogeneous, homogeneous, textured, shadowed, and contrasting. The results show that the highest similarity measure is obtained on the heterogeneous sample area at both reference and search image sizes, i.e. (11 pixels x 11 pixels) and (23 pixels x 23 pixels). In these areas the correlation coefficient reached more than 0.7 and the highest percentage of similarity measures was obtained. However, the average total similarity measure of conjugate images over the sampled image areas reaches only about 41.43% success. The technique therefore has weaknesses, and some treatment to overcome the problems is still needed.
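A minimal sketch of the matching score described above, assuming the "Separate Channel Mean Value" technique averages a normalized cross correlation coefficient computed independently per color channel (the function names here are illustrative, not from the paper):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation coefficient of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def separate_channel_score(left_patch, right_patch):
    """Correlate each RGB channel independently and average the three
    coefficients (assumed reading of 'separate channel mean value')."""
    return np.mean([ncc(left_patch[..., c], right_patch[..., c]) for c in range(3)])

# Synthetic check with an 11x11 reference patch
rng = np.random.default_rng(1)
left = rng.uniform(0, 255, size=(11, 11, 3))
noisy_match = left + rng.normal(0, 5, size=left.shape)     # conjugate candidate
score_match = separate_channel_score(left, noisy_match)    # near 1
score_rand = separate_channel_score(left, rng.uniform(0, 255, size=(11, 11, 3)))
print(score_match, score_rand)
```

In a full matcher the reference patch would be slid over the search window of the right image, and the location maximizing this score (e.g. above the 0.7 threshold mentioned in the abstract) would be taken as the conjugate point.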

  2. Urban Road Traffic Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Ana Maria Nicoleta Mocofan

    2011-09-01

For achieving a reliable traffic control system it is first necessary to establish a network parameter evaluation system and also a simulation system for the traffic-light plans. Over 40 years of history, computer-aided traffic simulation has developed from a small research field into a large-scale technology for traffic systems planning and development. In the following paper, the main road traffic modeling and simulation applications are presented, along with their utility, as well as the practical application of one of the models in a case study.

  3. New techniques to measure cliff change from historical oblique aerial photographs and structure-from-motion photogrammetry

    Science.gov (United States)

    Warrick, Jonathan; Ritchie, Andy; Adelman, Gabrielle; Adelman, Ken; Limber, Patrick W.

    2017-01-01

Oblique aerial photograph surveys are commonly used to document coastal landscapes. Here it is shown that adequate overlap may exist in these photographic records to develop topographic models with Structure-from-Motion (SfM) photogrammetric techniques. Using photographs of Fort Funston, California, from the California Coastal Records Project, the imagery was combined with ground control points in a four-dimensional analysis that produced topographic point clouds of the study area's cliffs for 5 years spanning 2002 to 2010. Uncertainty was assessed by comparing the point clouds with airborne LIDAR data, and these uncertainties were related to the number and spatial distribution of ground control points used in the SfM analyses. With six or more ground control points, the root mean squared errors between the SfM and LIDAR data were less than 0.30 m (minimum = 0.18 m), and the mean systematic error was less than 0.10 m. The SfM results had several benefits over traditional airborne LIDAR in that they included point coverage on vertical-to-overhanging sections of the cliff and resulted in 10-100 times greater point densities. Time series of the SfM results revealed topographic changes, including landslides, rock falls, and the erosion of landslide talus along the Fort Funston beach. Thus, it was concluded that SfM photogrammetric techniques with historical oblique photographs allow for the extraction of useful quantitative information for mapping coastal topography and measuring coastal change. The new techniques presented here are likely applicable to many photograph collections and problems in the earth sciences.
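The point-cloud-versus-LIDAR uncertainty assessment can be sketched generically as a nearest-neighbour comparison (this is not the authors' code; the nearest-neighbour formulation and function names are assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_error(sfm_points, lidar_points):
    """For each SfM point, find the nearest LIDAR point; report the
    root-mean-square nearest-neighbour distance (random error) and the
    mean signed vertical offset (systematic error)."""
    tree = cKDTree(lidar_points)
    dists, idx = tree.query(sfm_points)
    dz = sfm_points[:, 2] - lidar_points[idx, 2]
    return float(np.sqrt(np.mean(dists**2))), float(dz.mean())

# Synthetic check: a flat LIDAR grid and an SfM cloud floating 0.1 m above it
xx, yy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
lidar = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)])
sfm = lidar + np.array([0.0, 0.0, 0.1])
rmse, bias = cloud_error(sfm, lidar)
print(rmse, bias)
```

Separating the RMS distance from the mean signed offset mirrors the abstract's distinction between the sub-0.30 m root mean squared errors and the sub-0.10 m systematic error.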

  4. The Impact of Simulated Nature on Patient Outcomes: A Study of Photographic Sky Compositions.

    Science.gov (United States)

    Pati, Debajyoti; Freier, Patricia; O'Boyle, Michael; Amor, Cherif; Valipoor, Shabboo

    2016-01-01

To examine whether the incorporation of simulated nature, in the form of ceiling-mounted photographic sky compositions, influences patient outcomes. Previous studies have shown that most forms of nature exposure have a positive influence on patients; however, earlier studies have mostly focused on wall-hung nature representations. The emergence of simulated nature products has raised the question of the new products' effects on patient outcomes. A between-subject experimental design was adopted, in which outcomes from five inpatient rooms with a sky composition ceiling fixture were compared to corresponding outcomes in five identical rooms without the intervention. Data were collected from a total of 181 subjects on 11 outcomes, and independent-sample tests were performed to identify differences in mean outcomes. Significant positive outcomes were observed in environmental satisfaction and diastolic blood pressure (BP). Environmental satisfaction in the experimental group was 12.4% higher than in the control group. The direction of association for diastolic BP, nausea/indigestion medication, acute stress, anxiety, pain, and environmental satisfaction was consistent with the a priori hypothesis. A post hoc exploratory assessment involving patients who did not self-request additional pain and sleep medication showed confirmatory directions for all outcomes except systolic BP, and statistically significant outcomes for acute stress and anxiety: the acute stress and anxiety levels of the experimental-group subjects were 53.4% and 34.79% lower, respectively, than those of the control-group subjects. The salutogenic benefits of photographic sky compositions render them better than traditional ceiling tiles and offer an alternative to other nature interventions. © The Author(s) 2015.

  5. Comparison of anthropometry with photogrammetry based on a standardized clinical photographic technique using a cephalostat and chair.

    Science.gov (United States)

    Han, Kihwan; Kwon, Hyuk Joon; Choi, Tae Hyun; Kim, Jun Hyung; Son, Daegu

    2010-03-01

The aim of this study was to standardize clinical photogrammetric techniques and to compare anthropometry with photogrammetry. To standardize clinical photography, we developed a photographic cephalostat and chair and investigated the repeatability of the standardized clinical photogrammetric technique. Then, from 40 landmarks, a total of 96 anthropometric measurement items was obtained from 100 Koreans. Ninety-six photogrammetric measurements from the same subjects were also obtained from standardized clinical photographs using Adobe Photoshop version 7.0 (Adobe Systems Corporation, San Jose, CA, USA). The photogrammetric and anthropometric measurement data (mm, degrees) were then compared, and a coefficient was obtained by dividing the anthropometric measurements by the photogrammetric measurements. The repeatability of the standardized photography was high (p=0.463). Among the 96 measurement items, 44 were reliable; for these, the photogrammetric measurements did not differ from the anthropometric measurements. The remaining 52 items must be classified as unreliable. By developing a photographic cephalostat and chair, we have standardized clinical photogrammetric techniques. The reliable measurement items can be used as anthropometric measurements; for the unreliable items, applying a suitable coefficient to the photogrammetric measurement allows the anthropometric measurement to be obtained indirectly.

  6. Photographic simulation of off-axis blurring due to chromatic aberration in spectacle lenses.

    Science.gov (United States)

    Doroslovački, Pavle; Guyton, David L

    2015-02-01

    Spectacle lens materials of high refractive index (nd) tend to have high chromatic dispersion (low Abbé number [V]), which may contribute to visual blurring with oblique viewing. A patient who noted off-axis blurring with new high-refractive-index spectacle lenses prompted us to do a photographic simulation of the off-axis aberrations in 3 readily available spectacle lens materials, CR-39 (nd = 1.50), polyurethane (nd = 1.60), and polycarbonate (nd = 1.59). Both chromatic and monochromatic aberrations were found to cause off-axis image degradation. Chromatic aberration was more prominent in the higher-index materials (especially polycarbonate), whereas the lower-index CR-39 had more astigmatism of oblique incidence. It is important to consider off-axis aberrations when a patient complains of otherwise unexplained blurred vision with a new pair of spectacle lenses, especially given the increasing promotion of high-refractive-index materials with high chromatic dispersion. Copyright © 2015 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.
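The link between Abbe number and off-axis chromatic blur can be illustrated with Prentice's rule (a standard first-order optics relation, not a calculation from the paper): the prismatic effect at a point c centimetres off the optical centre of a lens of power F is P = cF prism dioptres, and the transverse chromatic aberration is roughly P divided by the Abbe number V.

```python
def transverse_chromatic_aberration(power_D, decentration_cm, abbe_V):
    """Prentice's rule gives the prismatic effect P = c * F (prism
    dioptres) at a point c centimetres off the optical centre; the
    chromatic blur between red and blue is roughly P / V, where V is
    the Abbe number of the lens material."""
    prism = abs(decentration_cm * power_D)
    return prism / abbe_V

# -5.00 D lens viewed 1 cm off-centre: CR-39 (V ~ 58) vs polycarbonate (V ~ 30)
for name, V in [("CR-39", 58.0), ("polycarbonate", 30.0)]:
    print(name, transverse_chromatic_aberration(-5.0, 1.0, V))
```

With these representative Abbe numbers the polycarbonate lens shows roughly twice the chromatic blur of CR-39 at the same off-axis point, consistent with the abstract's observation that chromatic aberration was more prominent in the higher-index materials.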

  7. Failure of anthropometry as a facial identification technique using high-quality photographs.

    Science.gov (United States)

    Kleinberg, Krista F; Vanezis, Peter; Burton, A Mike

    2007-07-01

    Anthropometry can be used in certain circumstances to facilitate comparison of a photograph of a suspect with that of the potential offender from surveillance footage. Experimental research was conducted to determine whether anthropometry has a place in forensic practice in confirming the identity of a suspect from a surveillance video. We examined an existing database of photographic lineups, where one video image was compared against 10 photographs, which has previously been used in psychological research. Target (1) and test (10) photos were of high quality, although taken with a different camera. The anthropometric landmarks of right and left ectocanthions, nasion, and stomion were chosen, and proportions and angle values between these landmarks were measured to compare target with test photos. Results indicate that these measurements failed to accurately identify targets. There was also no indication that any of the landmarks made a better comparison than another. It was concluded that, for these landmarks, this method does not generate the consistent results necessary for use as evidence in a court of law.

  8. Interfacial area measurements in two-phase bubbly flows. Pt.1. Comparison between the light attenuation technique and the photographic method

    International Nuclear Information System (INIS)

    Veteau, J.-M.; Charlot, Roland.

    1981-02-01

To measure specific area by a light attenuation technique in stationary bubbly flows, the main features of an optical design are given. This method, valid for bubble sizes between 0.5 and several millimeters, is compared with a photographic technique. The latter gives values systematically higher (by 15 to 25%) than the former. The measured specific areas range from 0.5 to 2 cm⁻¹. The multiple sources of error inherent in the photographic method are discussed.

  9. Approximation of a foreign object using x-rays, reference photographs and 3D reconstruction techniques.

    Science.gov (United States)

    Briggs, Matt; Shanmugam, Mohan

    2013-12-01

This case study describes how a 3D animation was created to approximate the depth and angle of a foreign object (a metal bar) that had become embedded in a patient's head. A pre-operative CT scan was not available, as the patient could not fit through the CT scanner; therefore a post-surgical CT scan, x-ray and photographic images were used. A surface render was made of the skull and imported into Blender (a 3D animation application). The metal bar itself was not available, but images of a similar object retrieved from the scene by the ambulance crew were used to recreate a 3D model. The x-ray images were then imported into Blender and used as background images in order to align the skull reconstruction and the metal bar at the correct depth and angle. A 3D animation was then created to fully illustrate the angle and depth of the metal bar in the skull.

  10. Fast simulation techniques for switching converters

    Science.gov (United States)

    King, Roger J.

    1987-01-01

    Techniques for simulating a switching converter are examined. The state equations for the equivalent circuits, which represent the switching converter, are presented and explained. The uses of the Newton-Raphson iteration, low ripple approximation, half-cycle symmetry, and discrete time equations to compute the interval durations are described. An example is presented in which these methods are illustrated by applying them to a parallel-loaded resonant inverter with three equivalent circuits for its continuous mode of operation.
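The Newton-Raphson computation of an interval duration can be sketched as follows: solve for the time at which a switching condition is met. The damped resonant-tank waveform and all parameter values below are illustrative, not the paper's circuit:

```python
import math

def newton_interval(f, dfdt, t0, tol=1e-10, max_iter=50):
    """Newton-Raphson solve f(t) = 0 for the duration of a switching interval."""
    t = t0
    for _ in range(max_iter):
        step = f(t) / dfdt(t)
        t -= step
        if abs(step) < tol:
            return t
    raise RuntimeError("Newton iteration did not converge")

# Damped oscillation of a resonant tank: v(t) = exp(-a*t) * cos(w*t).
a, w = 100.0, 2 * math.pi * 20e3
v  = lambda t: math.exp(-a * t) * math.cos(w * t)
dv = lambda t: -math.exp(-a * t) * (a * math.cos(w * t) + w * math.sin(w * t))

# The interval ends at the first zero crossing of v, a quarter period here;
# start Newton from a nearby guess.
t_switch = newton_interval(v, dv, t0=0.9 / (4 * 20e3))
```

In a full converter simulation this root-finding step is repeated once per switching interval, with the state equations of the active equivalent circuit supplying f and its derivative.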

  11. Visualization needs and techniques for astrophysical simulations

    International Nuclear Information System (INIS)

    Kapferer, W; Riser, T

    2008-01-01

    Numerical simulations have evolved continuously into an important field of astrophysics, on a par with theory and observation. Due to the enormous developments in computer science, in both hardware and software architecture, state-of-the-art simulations produce huge amounts of raw data of increasing complexity. In this paper some problems in the field of visualization in numerical astrophysics are discussed together with possible solutions. Commonly used visualization packages are presented, along with a newly developed approach to real-time visualization that incorporates shader programming to harness the computational power of modern graphics cards. With these techniques at hand, real-time visualizations help scientists to understand the relationships in the results of their numerical simulations. Furthermore, a fundamental problem in data analysis, i.e. capturing metadata on how a visualization was created, is highlighted.

  12. Psychological and physiological human responses to simulated and real environments: A comparison between Photographs, 360° Panoramas, and Virtual Reality.

    Science.gov (United States)

    Higuera-Trujillo, Juan Luis; López-Tarruella Maldonado, Juan; Llinares Millán, Carmen

    2017-11-01

    Psychological research into human factors frequently uses simulations to study the relationship between human behaviour and the environment. Their validity depends on their similarity to the physical environments they stand in for. This paper aims to validate three environmental-simulation display formats: photographs, 360° panoramas, and virtual reality. To do this we compared the psychological and physiological responses evoked by the simulated-environment set-ups with those from a physical environment set-up; we also assessed the users' sense of presence. Analyses show that 360° panoramas offer the results closest to reality according to the participants' psychological responses, and virtual reality according to the physiological responses. Correlations between the feeling of presence and physiological and other psychological responses were also observed. These results may be of interest to researchers using currently available environmental-simulation technologies to replicate the experience of physical environments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Tree Simulation Techniques for Integrated Safety Assessment

    International Nuclear Information System (INIS)

    Melendez Asensio, E.; Izquierdo Rocha, J.M.; Sanchez Perez, M.; Hortal Reymundo, J.; Perez Mulas, A.

    1999-01-01

    techniques are: (a) A unifying theory that should (i) establish the relationship among different approaches and, in particular, be able to recover the standard safety assessment approach as a particular case, (ii) identify implicit assumptions in present practice and (iii) establish a sound scientific reference for an ideal treatment in order to judge the relative importance of implicit and explicit assumptions. In addition, the theoretical developments help to identify the type of applications where the new developments will be a necessary requirement. (b) The capability for simulation of trees. By this we mean the techniques required to efficiently simulate all branches. Historically, algorithms able to do this were already implemented in earlier pioneering work for a discrete number of branches, while stochastic branching requires Monte Carlo techniques. (c) The capability to incorporate new types of branching, particularly operator actions. This paper briefly reviews these aspects and justifies in that frame our particular development, denoted here as the Integrated Safety Assessment methodology. In this method, the dynamics of the event is followed by transient simulation in tree form, building a Setpoint or Deterministic Dynamic Event Tree (DDET). When a setpoint that should trigger the actuation of a protection is crossed, the tree is opened into branches corresponding to the different functioning states of the protection device, and each branch is followed by the engineering simulator. One of these states is the nominal state, which, in the PSAs, is associated with the success criterion of the system

  14. Visualization techniques in plasma numerical simulations

    International Nuclear Information System (INIS)

    Kulhanek, P.; Smetana, M.

    2004-01-01

    Numerical simulations of plasma processes usually yield a huge amount of raw numerical data; information about electric and magnetic fields and about particle positions and velocities can typically be obtained. There are two major ways of elaborating these data. The first is plasma diagnostics: we can calculate average values, variances, correlations of variables, etc. These results may be directly comparable with experiments and serve as the typical quantitative output of plasma simulations. The second possibility is plasma visualization. The results are qualitative only, but serve as a vivid display of the phenomena in the plasma under study. Experience with visualizing electric and magnetic fields via the Line Integral Convolution (LIC) method is described in the first part of the paper. The LIC method serves for the visualization of vector fields in a two-dimensional section of the three-dimensional plasma; the field values need only be known at the points of a three-dimensional grid. The second part of the paper is devoted to visualization techniques for charged particle motion. A colour tint can be used to represent particle temperature, and the motion can be visualized by a trace fading away with distance from the particle. In this manner impressive animations of the particle motion can be achieved. (author)
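The LIC idea, i.e. smearing a noise texture along the streamlines of a 2D vector-field slice, can be sketched as follows. This is a deliberately naive Euler-step version for illustration, not the authors' implementation:

```python
import numpy as np

def lic(vx, vy, noise, length=10):
    """Minimal Line Integral Convolution on a 2D vector-field slice.

    vx, vy, noise: 2D arrays of the same shape. For each pixel a
    streamline is traced forward and backward with unit Euler steps,
    and the noise texture is averaged along it, so the output texture
    is correlated along field lines.
    """
    h, w = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for i in range(h):
        for j in range(w):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):   # trace both directions
                y, x = float(i), float(j)
                for _ in range(length):
                    yi, xi = int(round(y)), int(round(x))
                    if not (0 <= yi < h and 0 <= xi < w):
                        break
                    total += noise[yi, xi]
                    count += 1
                    u, v = vx[yi, xi], vy[yi, xi]
                    norm = np.hypot(u, v)
                    if norm == 0:
                        break
                    x += sign * u / norm
                    y += sign * v / norm
            out[i, j] = total / max(count, 1)
    return out
```

Production implementations use higher-order streamline integration, sub-pixel sampling and weighting kernels, but the structure is the same.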

  15. Photographic paper X-ray procedure - a simple technique for the visualisation of osseous norm variations and malformations

    International Nuclear Information System (INIS)

    Markert, K.; Wirth, I.; Reinhold-Richter, L.

    1983-01-01

    On the basis of osseous norm variations and malformations, a simple X-ray procedure by means of photographic paper which can be applied in every institute of pathology is demonstrated. The quality of the photographs permits the assessment of skeletal changes which are of diagnostic importance. (author)

  16. Fast, Accurate Memory Architecture Simulation Technique Using Memory Access Characteristics

    OpenAIRE

    小野, 貴継; 井上, 弘士; 村上, 和彰

    2007-01-01

    This paper proposes a fast and accurate memory architecture simulation technique. The first steps in designing a memory architecture commonly involve trace-driven simulation. However, as the design space expands, the evaluation time increases. A fast simulation can be achieved by reducing the trace size, but this reduces simulation accuracy. Our approach can reduce the simulation time while maintaining the accuracy of the simulation results. In order to evaluate validity of proposed techniq...

  17. optimal assembly line balancing using simulation techniques

    African Journals Online (AJOL)

    user

    Department of Mechanical Engineering ... perspective on how the business operates, and ... Process simulation allows management ... improvement and change since it would be a costly ... The work content performed on an assembly line.

  18. Real time simulation techniques in Taiwan - Maanshan compact simulator

    International Nuclear Information System (INIS)

    Liang, K.-S.; Chuang, Y.-M.; Ko, H.-T.

    2004-01-01

    Recognizing the demand and potential market for simulators in various industries, a special project for real-time simulation technology transfer was initiated in Taiwan in 1991. In this technology transfer program, the most advanced real-time dynamic modules for nuclear power simulation were introduced. Those modules can be divided into two categories: one is modeling-related, to capture the dynamic response of each system; the other is computer-related, to provide the special real-time computing environment and man-machine interface. The modeling-related modules consist of the thermodynamic module, the three-dimensional core neutronics module and the advanced balance-of-plant module. As planned in the project, the technology transfer team was to build a compact simulator for the Maanshan power plant before the end of the project to demonstrate the success of the technology transfer program. The compact simulator was designed to supplement the training given on the regular full-scope simulator already installed at the Maanshan plant. The feature of this compact simulator is its focus on providing know-why training through enhanced graphic displays. The potential users were identified as senior operators, instructors and nuclear engineers. In total, about 13 important systems are covered in the scope of the compact simulator, and multi-graphic displays from three color monitors mounted on the 10-foot compact panel help the user visualize detailed phenomena under scenarios of interest. (author)

  19. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  20. Comparison of radiographic technique by computer simulation

    International Nuclear Information System (INIS)

    Brochi, M.A.C.; Ghilardi Neto, T.

    1989-01-01

    A computational algorithm to compare radiographic techniques (kVp, mAs and filters) is developed, based on fixing the parameters that define the image, such as optical density and contrast. The results were then applied to a chest radiograph. (author) [pt

  1. Simulation of aluminium STIR casting technique

    International Nuclear Information System (INIS)

    Hafizal Yazid; Mohd Harun; Hanani Yazid; Abd Aziz Mohamed; Muhammad Rawi Muhammad Zain; Zaiton Selamat; Mohd Shariff Sattar; Muhamad Jalil; Ismail Mustapha; Razali Kasim

    2006-01-01

    In this paper, the objective is to determine the optimum impeller speed, correlated with holding time, to achieve a homogeneous reinforcement distribution for a particular set of experimental conditions. Attempts are made to simulate the flow behaviour of the liquid aluminium using the FLUENT software. Stepwise impeller speeds ranging from 50 to 300 rpm, with two impeller blade angles of 45 and 90 degrees with respect to the rotational plane, were used

  2. Acceleration techniques for dependability simulation. M.S. Thesis

    Science.gov (United States)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
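The random variate generation mentioned above is typically done by inverse-transform sampling; a minimal sketch for exponential inter-event times (the rate and sample count are arbitrary illustration values):

```python
import math
import random

def exponential_variate(rate, u=None):
    """Inverse-transform sampling of an exponential inter-event time.

    The CDF is F(t) = 1 - exp(-rate*t), so inverting F at a uniform
    draw u ~ U(0, 1) gives t = -ln(1 - u)/rate. Such draws supply the
    inter-arrival times in a process-based discrete event simulation.
    """
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

# Draw 10,000 inter-event times and compare the sample mean to 1/rate.
random.seed(42)
samples = [exponential_variate(2.0) for _ in range(10_000)]
mean = sum(samples) / len(samples)
```

The same inversion pattern works for any distribution with an invertible CDF; statistics gathering in the simulation then reduces to accumulating such samples per event type.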

  3. Oblique Photographs

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes photographs of marine mammals and sea turtles taken in the field. Most are lateral views of animals that are used to confirm species identity...

  4. Vertical Photographs

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes photographs of marine mammals and sea turtles taken with high resolution cameras mounted in airplanes, unmanned platforms or the bow of...

  5. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  6. Recent developments in numerical simulation techniques of thermal recovery processes

    Energy Technology Data Exchange (ETDEWEB)

    Tamim, M. [Bangladesh University of Engineering and Technology, Bangladesh (Bangladesh); Abou-Kassem, J.H. [Chemical and Petroleum Engineering Department, UAE University, Al-Ain 17555 (United Arab Emirates); Farouq Ali, S.M. [University of Alberta, Alberta (Canada)

    2000-05-01

    Numerical simulation of thermal processes (steam flooding, steam stimulation, SAGD, in-situ combustion, electrical heating, etc.) is an integral part of a thermal project design. The general tendency in the last 10 years has been to use commercial simulators. During the last decade, only a few new models have been reported in the literature. More work has been done to modify and refine solutions to existing problems to improve the efficiency of simulators. The paper discusses some of the recent developments in simulation techniques of thermal processes such as grid refinement, grid orientation, effect of temperature on relative permeability, mathematical models, and solution methods. The various aspects of simulation discussed here promote better understanding of the problems encountered in the simulation of thermal processes and will be of value to both simulator users and developers.

  7. Photographic Tourism Research: Literature Review

    OpenAIRE

    Virdee, Inderpal

    2017-01-01

    This study reviews the current photographic tourism literature to identify what fields within tourism have been studied by researchers, the contexts, the samples used, the sampling methods employed, the photographic methods and supporting methods used, the data analysis techniques applied and the countries studied. A set of 115 relevant academic articles were selected and assessed using content analysis. The findings showed that overall publications in the field of photographic tourism increa...

  8. Precision of a photogrammetric method to perform 3D wound measurements compared to standard 2D photographic techniques in the horse.

    Science.gov (United States)

    Labens, R; Blikslager, A

    2013-01-01

    Methods of 3D wound imaging in man play an important role in monitoring of healing and determination of the prognosis. Standard photographic assessments in equine wound management consist of 2D analyses, which provide little quantitative information on the wound bed. 3D imaging of equine wounds is feasible using principles of stereophotogrammetry. 3D measurements differ significantly and are more precise than results with standard 2D assessments. Repeated specialised photographic imaging of 4 clinical wounds left to heal by second intention was performed. The intraoperator variability in measurements due to imaging and 3D processing was compared to that of a standard 2D technique using descriptive statistics and multivariate repeated measures ANOVA. Using a custom made imaging system, 3D analyses were successfully performed. Area and circumference measurements were significantly different between imaging modalities. The intraoperator variability of 3D measurements was up to 2.8 times less than that of 2D results. On average, the maximum discrepancy between repeated measurements was 5.8% of the mean for 3D and 17.3% of the mean for 2D assessments. The intraoperator repeatability of 3D wound measurements based on principles of stereophotogrammetry is significantly increased compared to that of a standard 2D photographic technique indicating it may be a useful diagnostic and monitoring tool. The equine granulation bed plays an important role in equine wound healing. When compared to 2D analyses 3D monitoring of the equine wound bed allows superior quantitative characterisation, contributing to clinical and experimental investigations by offering potential new parameters. © 2012 EVJ Ltd.
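The repeatability comparison above reduces to a simple statistic; a minimal sketch (the readings below are invented for illustration, not data from the study):

```python
def max_discrepancy_pct(measurements):
    """Maximum discrepancy between repeated measurements, as % of the mean.

    This mirrors the repeatability summary quoted in the abstract: the
    spread (max - min) of one operator's repeated wound-area readings,
    expressed relative to their mean.
    """
    mean = sum(measurements) / len(measurements)
    return 100.0 * (max(measurements) - min(measurements)) / mean

# Hypothetical repeated area readings (cm^2) for one wound.
area_3d = [24.1, 24.6, 24.3]   # tight spread, as reported for 3D photogrammetry
area_2d = [22.0, 25.9, 24.4]   # wider spread, as reported for a 2D photograph
```

Lower percentages mean better intraoperator repeatability, which is the basis of the abstract's "5.8% vs 17.3% of the mean" comparison.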

  9. An analog simulation technique for distributed flow systems

    DEFF Research Database (Denmark)

    Jørgensen, Sten Bay; Kümmel, Mogens

    1973-01-01

    Simulation of distributed flow systems in chemical engineering has been applied more and more during the last decade as computer techniques have developed [1]. The applications have served the purpose of identification of process dynamics and parameter estimation as well as improving process and process control design. Although the conventional analog computer has been expanded with hybrid techniques and digital simulation languages have appeared, none of these has demonstrated superiority in simulating distributed flow systems in general [1]. Conventional analog techniques are expensive ... earlier [3]. This is an important extension since flow systems are frequently controlled through manipulation of the flow rate. Previously the technique has been applied with constant flows [4, 5]. Results demonstrating the new hardware are presented from simulation of a transportation lag and a double...

  10. Simulation of wind turbine wakes using the actuator line technique

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær; Mikkelsen, Robert Flemming; Henningson, Dan S.

    2015-01-01

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today largely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison of experimental results...

  11. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests
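The attenuation-law projection at the core of such a code can be sketched in a few lines. The voxel grid, coefficients and geometry below are invented for illustration; the actual code also handles CAD geometry, beam spectra and detector modelling:

```python
import numpy as np

def radiograph(mu, step):
    """Deterministic radiographic projection via the X-ray attenuation law.

    mu: 3D array of linear attenuation coefficients (1/cm) on a voxel
    grid; step: voxel size (cm) along the beam axis (axis 0). Parallel
    rays are cast along axis 0 and the transmitted fraction follows
    Beer-Lambert: I/I0 = exp(-sum(mu * step)) along each ray.
    """
    path_integral = mu.sum(axis=0) * step
    return np.exp(-path_integral)

# A 2 cm water-like block (mu = 0.2/cm) with a denser 1 cm inclusion.
mu = np.full((20, 32, 32), 0.2)
mu[5:15, 12:20, 12:20] = 0.5           # defect region, mu = 0.5/cm
image = radiograph(mu, step=0.1)       # transmitted fraction per pixel
contrast = image.max() - image.min()   # defect contrast in the projection
```

Because the projection is deterministic, the image is noise-free, exactly as described in the abstract; photon statistics enter only when a per-pixel variance model is added on top to compute CNR maps.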

  13. Application of the PRBS/FFT technique to digital simulations

    International Nuclear Information System (INIS)

    Hinds, H.W.

    1977-01-01

    This paper describes a method for obtaining a small-signal frequency response from a digital dynamic simulation. It employs a modified form of the PRBS/FFT technique, whereby a system is perturbed by a pseudo-random binary sequence and its response is analyzed using a fast Fourier transform-based program. Two applications of the technique are described; one involves a set of two coupled, second-order, ordinary differential equations; the other is a set of non-linear partial differential equations describing the thermohydraulic behaviour of water boiling in a fuel channel. (author)
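The PRBS/FFT idea can be sketched on a simple discrete-time system. The 7-bit LFSR, the first-order test system and all parameter values are illustrative, not from the paper:

```python
import numpy as np

def prbs(nbits=7, taps=(7, 6)):
    """Maximal-length PRBS (+/-1) from a linear feedback shift register."""
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        out = state[-1]
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
        seq.append(1.0 if out else -1.0)
    return np.array(seq)

# Perturb a first-order system y[n] = a*y[n-1] + (1-a)*u[n] with the PRBS
# and estimate its frequency response as Y(f)/U(f) over one full period.
a = 0.8
u = np.tile(prbs(), 8)               # several periods so transients die out
y = np.zeros_like(u)
for n in range(1, len(u)):
    y[n] = a * y[n - 1] + (1 - a) * u[n]

N = 127                              # one PRBS period
U = np.fft.rfft(u[-N:])
Y = np.fft.rfft(y[-N:])
H_est = Y / U                        # empirical small-signal frequency response
```

Analyzing an integer number of PRBS periods in steady state avoids spectral leakage, which is why the estimate matches the analytic response (1-a)/(1 - a·e^(-jω)) at the bin frequencies.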

  14. Application of simulation techniques in the probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    De Ruyter van Steveninck, J.L.

    1995-03-01

    Monte Carlo simulation is applied to a fracture mechanics model in order to assess the applicability of this simulation technique in probabilistic fracture mechanics. With the fracture mechanics model, brittle fracture of a steel vessel or pipe containing defects can be predicted, and with the Monte Carlo simulation the uncertainty in the predicted failures can be determined as well. Based on the variations in fracture toughness and defect dimensions, the distribution of the failure probability is determined. Attention is also paid to the impact of dependence between uncertain variables. Furthermore, the influence of the assumed distributions of the uncertain variables, and of non-destructive examination, on the failure probability is analysed. The Monte Carlo simulation results agree quite well with the results of other methods from probabilistic fracture mechanics. If an analytic expression can be found for the failure probability, it is possible to determine the variance of the failure probability in addition to an estimate of its value. It also appears that dependence between the uncertain variables has a large impact on the failure probability. The simulations further show that the failure probability depends strongly on the crack depth, and therefore on the distribution of the crack depth. 15 figs., 7 tabs., 12 refs
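The sampling scheme can be sketched as follows. The limit state K = Y·σ·√(πa) > K_IC is the standard brittle-fracture criterion, but the distributions and every parameter value below are invented for illustration:

```python
import math
import random

def failure_probability(n=100_000, stress=200.0, seed=1):
    """Monte Carlo estimate of a brittle-fracture probability.

    Failure occurs when the stress intensity factor
    K = Y * stress * sqrt(pi * a) exceeds the fracture toughness K_IC.
    Crack depth a (m) is sampled from an exponential distribution and
    K_IC (MPa*sqrt(m)) from a normal distribution; all values here are
    illustrative, not from the report.
    """
    rng = random.Random(seed)
    Y = 1.12                                # geometry factor, surface crack
    failures = 0
    for _ in range(n):
        a = rng.expovariate(100.0)          # mean crack depth 0.01 m
        k_ic = rng.gauss(60.0, 6.0)         # toughness scatter
        if Y * stress * math.sqrt(math.pi * a) > k_ic:
            failures += 1
    return failures / n
```

Repeating the estimate with correlated draws of a and K_IC, or with different input distributions, reproduces the kind of sensitivity study the abstract describes.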

  15. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach is described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e. a change of distribution type for the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of the assumed probability distributions. It was found that Monte Carlo simulation by a Latin Hypercube Sampling technique, together with subsequent sensitivity analyses, was useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. Changing the distribution types of the input parameters showed that the values calculated by the deterministic method fall around the 40th to 50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of the inputs, however, the values calculated by the deterministic method fall around the 85th percentile of the output distribution function. The sensitivity analysis also indicated that the front-end components are generally more sensitive than the back-end components, of which the uranium purchase cost is the most important factor of all. The discount rate also contributes substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
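Latin Hypercube Sampling, mentioned above, stratifies each input's range so that n samples cover all n equal-probability strata; a minimal pure-Python sketch (illustrative only):

```python
import random

def latin_hypercube(n_samples, n_vars, seed=0):
    """Latin Hypercube Sampling: one draw per equal-probability stratum.

    Returns an n_samples x n_vars matrix of values in (0, 1). Each
    column is a random permutation of the n strata with uniform jitter
    inside every stratum, so each variable's range is covered evenly
    with far fewer samples than plain Monte Carlo would need.
    """
    rng = random.Random(seed)
    columns = []
    for _ in range(n_vars):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        columns.append([(s + rng.random()) / n_samples for s in strata])
    return [[columns[j][i] for j in range(n_vars)]
            for i in range(n_samples)]
```

Mapping each column through the inverse CDF of the corresponding cost input (e.g. a lognormal uranium price) turns these uniform samples into the cost scenarios whose distribution and percentiles the study reports.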

  16. Application perspectives of simulation techniques CFD in nuclear power plants

    International Nuclear Information System (INIS)

    Galindo G, I. F.

    2013-10-01

    Scenario simulation in nuclear power plants is usually carried out with system codes that are based on lumped-parameter networks. However, situations exist in some components where the flow is predominantly 3-D, such as natural circulation, mixing and stratification phenomena. Computational fluid dynamics (CFD) simulation techniques have the potential to simulate these flows numerically. The use of CFD simulation spans many branches of engineering and continues to grow; however, its application to problems related to nuclear power plant safety is less developed, although it is accelerating quickly and is expected to play a more prominent role in future analyses. A main obstacle to general acceptance of CFD is that the simulations require very complete validation studies, which are sometimes not available. This article presents a general overview of the state of application of CFD methods in nuclear power plants and the problems associated with their routine application and acceptance, including the viewpoint of the regulatory authorities. Application examples in which CFD offers real benefits are reviewed, and two illustrative case studies of the application of CFD techniques are presented. The first case is a water vessel with a heat source in its interior, similar to the spent fuel pool of a nuclear power plant; the second is the boron dilution of a water volume entering a nuclear reactor. We conclude that CFD technology represents a very important opportunity to improve the understanding of phenomena with a strong 3-D component and to contribute to uncertainty reduction. (Author)

  17. 3D Digital Simulation of Minnan Temple Architecture Caisson's Craft Techniques

    Science.gov (United States)

    Lin, Y. C.; Wu, T. C.; Hsu, M. F.

    2013-07-01

    The caisson is one of the important representations of the Minnan (southern Fujian) temple architecture craft techniques and decorative aesthetics. The special component design and group building method present the architectural thinking and personal characteristics of the great carpenters of Minnan temple architecture. In the late Qing Dynasty, the appearance and style of the caissons of famous temples in Taiwan clearly presented the building techniques of the great carpenters. However, as the years went by, the caisson design and craft techniques were not fully passed on, which has been a great loss of cultural assets. Accordingly, taking the caisson of Fulong temple, a work by a well-known great carpenter in Tainan, as an example, this study obtained the thinking principles of the original design and the design method at the initial period of construction through interview records and the step of redrawing the "Tng-Ko" (a traditional design, stakeout and construction tool). We obtained the 3D point cloud model of the caisson of Fulong temple using 3D laser scanning technology, and established a 3D digital model of each component of the caisson. Based on the caisson assembly procedure obtained from the interview records, this study conducted a digital simulation of the caisson components to completely record and present the caisson design, construction and completion procedure. This model of preserving the craft techniques of the Minnan temple caisson using digital technology makes a specific contribution to the heritage of the craft techniques while providing an important reference for the digital preservation of human cultural assets.

  18. Technique for in situ leach simulation of uranium ores

    International Nuclear Information System (INIS)

    Grant, D.C.; Seidel, D.C.; Nichols, I.L.

    1985-01-01

    In situ uranium mining offers the advantages of minimal environmental disturbance, low capital and operating costs, and reduced mining development time. It is becoming an increasingly attractive mining method for the recovery of uranium from secondary ore deposits. In order to better understand the process, a laboratory technique was developed and used to study and simulate both the chemical and physical phenomena occurring in ore bodies during in situ leaching. The laboratory simulation technique has been used to determine effects of leaching variables on permeability, uranium recovery, and post-leach aquifer restoration. This report describes the simulation system and testing procedure in sufficient detail to allow the construction of the system, and to perform the desired leaching tests. With construction of such a system, in situ leaching of a given ore using various leach conditions can be evaluated relatively rapidly in the laboratory. Not only could optimum leach conditions be selected for existing ore bodies, but also exploitation of new ore bodies could be accelerated. 8 references, 8 figures, 2 tables

  19. A New Multiscale Technique for Time-Accurate Geophysics Simulations

    Science.gov (United States)

    Omelchenko, Y. A.; Karimabadi, H.

    2006-12-01

    Large-scale geophysics systems are frequently described by multiscale reactive flow models (e.g., wildfire and climate models, multiphase flows in porous rocks, etc.). Accurate and robust simulations of such systems by traditional time-stepping techniques face a formidable computational challenge. Explicit time integration suffers from global (CFL and accuracy) timestep restrictions due to inhomogeneous convective and diffusion processes, as well as closely coupled physical and chemical reactions. Application of adaptive mesh refinement (AMR) to such systems may not always be sufficient since its success critically depends on a careful choice of domain refinement strategy. On the other hand, implicit and timestep-splitting integrations may result in a considerable loss of accuracy when fast transients in the solution become important. To address this issue, we developed an alternative explicit approach to time-accurate integration of such systems: Discrete-Event Simulation (DES). DES enables asynchronous computation by automatically adjusting the CPU resources in accordance with local timescales. This is done by encapsulating flux-conservative updates of numerical variables in the form of events, whose execution and synchronization is explicitly controlled by imposing accuracy and causality constraints. As a result, at each time step DES self-adaptively updates only a fraction of the global system state, which eliminates unnecessary computation of inactive elements. DES can be naturally combined with various mesh generation techniques. The event-driven paradigm results in robust and fast simulation codes, which can be efficiently parallelized via a new preemptive event processing (PEP) technique. We discuss applications of this novel technology to time-dependent diffusion-advection-reaction and CFD models representative of various geophysics applications.
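
The event-driven idea can be sketched in a few lines. The toy below is an asynchronous explicit diffusion update in which each cell carries its own local stable timestep and a priority queue enforces causality, so fast cells fire many events while slow cells fire few; it is an illustrative simplification, not the authors' flux-conservative DES/PEP implementation:

```python
import heapq

def des_diffusion(u0, D, dx=1.0, t_end=1.0, cfl=0.4):
    """Asynchronous explicit diffusion on a 1-D grid (zero-flux ends).
    Cell i is updated only when its event fires, with its own stable
    timestep dt[i] = cfl*dx^2/D[i]; a heap keeps global time order."""
    u = list(u0)
    n = len(u)
    dt = [cfl * dx * dx / D[i] for i in range(n)]
    counts = [0] * n                          # events fired per cell
    heap = [(dt[i], i) for i in range(n)]     # (event time, cell id)
    heapq.heapify(heap)
    while heap:
        t, i = heapq.heappop(heap)
        if t > t_end:                         # heap is time-ordered: done
            break
        left = u[i - 1] if i > 0 else u[i]
        right = u[i + 1] if i < n - 1 else u[i]
        # explicit update over the cell's own elapsed interval dt[i]
        u[i] += D[i] * dt[i] / dx ** 2 * (left - 2.0 * u[i] + right)
        counts[i] += 1
        heapq.heappush(heap, (t + dt[i], i))  # schedule next event
    return u, counts

# step profile: slow-diffusing cells on the left, fast ones on the right
u, counts = des_diffusion([1.0] * 4 + [0.0] * 4, [1.0] * 4 + [50.0] * 4)
```

With these assumed diffusivities the fast cells fire roughly fifty times more events than the slow ones, which is exactly the self-adaptive CPU allocation the abstract describes.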

  20. Photographic materials

    International Nuclear Information System (INIS)

    1980-01-01

    Radiographic films based on silver halides are normally handled under red or orange safelights to prevent fogging due to their sensitivity to white light. The present invention relates to ultraviolet radiation sensitive material which can be handled under virtually white light without significant fogging. A photographic, chemically sensitised silver halide emulsion is described, containing 50-100 mole % of silver chloride, the higher the silver chloride content, the lower the visible light sensitivity. The remaining silver halide, if any, is silver bromide and/or silver iodide. The silver halide grains are grown in the presence of ammonia, an excess of chloride ions and tetraazaindene growth controller. Examples illustrating the invention are given. (U.K.)

  1. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method built from multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  2. [Preparation of simulate craniocerebral models via three dimensional printing technique].

    Science.gov (United States)

    Lan, Q; Chen, A L; Zhang, T; Zhu, Q; Xu, T

    2016-08-09

    Three dimensional (3D) printing technique was used to prepare simulated craniocerebral models, which were applied to preoperative planning and surgical simulation. The image data were collected from a PACS system. Image data of skull bone, brain tissue and tumors, cerebral arteries and aneurysms, and functional regions and related neural tracts of the brain were extracted from thin-slice scans (slice thickness 0.5 mm) of computed tomography (CT), magnetic resonance imaging (MRI, slice thickness 1 mm), computed tomography angiography (CTA), and functional magnetic resonance imaging (fMRI) data, respectively. MIMICS software was applied to reconstruct colored virtual models by identifying and differentiating tissues according to their gray scales. The colored virtual models were then submitted to a 3D printer, which produced life-sized craniocerebral models for surgical planning and surgical simulation. The 3D-printed craniocerebral models allowed neurosurgeons to perform complex procedures in specific clinical cases through detailed surgical planning. They offered great convenience for evaluating the size of the spatial fissure of the sellar region before surgery, which helped to optimize surgical approach planning. These 3D models also provided detailed information about the location of aneurysms and their parent arteries, which helped surgeons to choose appropriate aneurysm clips, as well as perform surgical simulation. The models further gave clear indications of the depth and extent of tumors and their relationship to eloquent cortical areas and adjacent neural tracts, which made it possible to avoid surgical damage to important neural structures. As a novel and promising technique, the application of 3D-printed craniocerebral models can improve surgical planning by converting virtual visualization into real life-sized models. It also contributes to functional anatomy study.

  3. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method built from multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
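
One common validation technique, comparing an emergent statistic of the model against a known analytical result, can be sketched with a minimal ABM of independent random walkers; the choice of model and statistic here is purely illustrative, not taken from the paper:

```python
import random

def simulate_walkers(n_agents=2000, steps=100, seed=1):
    """Minimal agent-based model: independent 1-D unit-step random
    walkers.  The emergent mean squared displacement (MSD) can be
    validated against the analytical result MSD = steps."""
    rng = random.Random(seed)
    pos = [0] * n_agents
    for _ in range(steps):
        for i in range(n_agents):
            pos[i] += rng.choice((-1, 1))     # each agent's local rule
    # emergent macro-level statistic to compare with theory
    return sum(p * p for p in pos) / n_agents

msd = simulate_walkers()  # theory predicts a value near 100
```

If the simulated MSD deviated systematically from the analytical prediction, the implementation (or the conceptual model) would be suspect; this is the simplest form of the statistical validation the abstract discusses.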

  4. Radiotracer technique for leakage detection under simulated conditions

    International Nuclear Information System (INIS)

    Yelgaonkar, V.N.; Sharma, V.K.; Tapase, A.S.

    2001-01-01

    Radiotracer techniques are often used to locate leaks in underground pipelines. An attempt was made to standardize radiotracer pulse migration in terms of minimum detectable limit. For this purpose a 6 inch diameter 1200 long steel pipe was used. Two leak rates viz. 10 litres per minute and 1 litre per minute with an accuracy of ± 10% were simulated. The experiments on this pipeline showed that this method could be used to locate a leak of the order of 1 litre per minute in a 6 inch diameter isolated underground pipeline. (author)

  5. Strategies in edge plasma simulation using adaptive dynamic nodalization techniques

    International Nuclear Information System (INIS)

    Kainz, A.; Weimann, G.; Kamelander, G.

    2003-01-01

    A wide span of steady-state and transient edge plasma processes simulation problems require accurate discretization techniques and can then be treated with Finite Element (FE) and Finite Volume (FV) methods. The software used here to meet these meshing requirements is a 2D finite element grid generator. It allows to produce adaptive unstructured grids taking into consideration the flux surface characteristics. To comply with the common mesh handling features of FE/FV packages, some options have been added to the basic generation tool. These enhancements include quadrilateral meshes without non-regular transition elements obtained by substituting them by transition constructions consisting of regular quadrilateral elements. Furthermore triangular grids can be created with one edge parallel to the magnetic field and modified by the basic adaptation/realignment techniques. Enhanced code operation properties and processing capabilities are expected. (author)

  6. Parallel pic plasma simulation through particle decomposition techniques

    International Nuclear Information System (INIS)

    Briguglio, S.; Vlad, G.; Di Martino, B.; Naples, Univ. 'Federico II'

    1998-02-01

    Particle-in-cell (PIC) codes are among the major candidates to yield a satisfactory description of the detail of kinetic effects, such as the resonant wave-particle interaction, relevant in determining the transport mechanism in magnetically confined plasmas. A significant improvement of the simulation performance of such codes can be expected from parallelization, e.g., by distributing the particle population among several parallel processors. Parallelization of a hybrid magnetohydrodynamic-gyrokinetic code has been accomplished within the High Performance Fortran (HPF) framework, and tested on the IBM SP2 parallel system, using a 'particle decomposition' technique. The adopted technique requires a moderate effort in porting the code to parallel form and results in intrinsic load balancing and modest interprocessor communication. The performance tests obtained confirm the hypothesis of high effectiveness of the strategy if targeted towards moderately parallel architectures. Optimal use of resources is also discussed with reference to a specific physics problem.

  7. CT simulation technique for craniospinal irradiation in supine position

    International Nuclear Information System (INIS)

    Lee, Suk; Kim, Yong Bae; Chu, Sung Sil; Suh, Chang Ok; Kwon, Soo Il

    2002-01-01

    In order to perform craniospinal irradiation (CSI) in the supine position on patients who are unable to lie in the prone position, a new simulation technique using a CT simulator was developed and its availability was evaluated. A CT simulator and a 3-D conformal treatment planning system were used to develop CSI in the supine position. The head and neck were immobilized with a thermoplastic mask in the supine position and the entire body was immobilized with a Vac-Loc. A volumetric image was then obtained using the CT simulator. In order to improve the reproducibility of the patients' setup, datum lines and points were marked on the head and the body. Virtual fluoroscopy was performed with the removal of visual obstacles such as the treatment table or the immobilization devices. After the virtual simulation, the treatment isocenters of each field were marked on the body and the immobilization devices at the conventional simulation room. Each treatment field was confirmed by comparing the fluoroscopy images with the digitally reconstructed radiography (DRR)/digitally composite radiography (DCR) images from the virtual simulation. The port verification films from the first treatment were also compared with the DRR/DCR images for a geometrical verification. CSI in the supine position was successfully performed in 9 patients. It required less than 20 minutes to construct the immobilization device and to obtain the whole body volumetric images. This made it possible to not only reduce the patients' inconvenience, but also to eliminate the position change variables during the long conventional simulation process. In addition, by obtaining the CT volumetric image, critical organs, such as the eyeballs and spinal cord, were better defined, and the accuracy of the port designs and shielding was improved. The difference between the DRRs and the portal films were less than 3 mm in the vertebral contour. CSI in the supine position is feasible in patients who cannot lie on

  8. CT simulation technique for craniospinal irradiation in supine position

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Suk; Kim, Yong Bae; Chu, Sung Sil; Suh, Chang Ok [Yonsei Cancer Center, College of Medicine, Yonsei University, Seoul (Korea, Republic of); Kwon, Soo Il [Kyonggi University, Seoul (Korea, Republic of)

    2002-06-15

    In order to perform craniospinal irradiation (CSI) in the supine position on patients who are unable to lie in the prone position, a new simulation technique using a CT simulator was developed and its availability was evaluated. A CT simulator and a 3-D conformal treatment planning system were used to develop CSI in the supine position. The head and neck were immobilized with a thermoplastic mask in the supine position and the entire body was immobilized with a Vac-Loc. A volumetric image was then obtained using the CT simulator. In order to improve the reproducibility of the patients' setup, datum lines and points were marked on the head and the body. Virtual fluoroscopy was performed with the removal of visual obstacles such as the treatment table or the immobilization devices. After the virtual simulation, the treatment isocenters of each field were marked on the body and the immobilization devices at the conventional simulation room. Each treatment field was confirmed by comparing the fluoroscopy images with the digitally reconstructed radiography (DRR)/digitally composite radiography (DCR) images from the virtual simulation. The port verification films from the first treatment were also compared with the DRR/DCR images for a geometrical verification. CSI in the supine position was successfully performed in 9 patients. It required less than 20 minutes to construct the immobilization device and to obtain the whole body volumetric images. This made it possible to not only reduce the patients' inconvenience, but also to eliminate the position change variables during the long conventional simulation process. In addition, by obtaining the CT volumetric image, critical organs, such as the eyeballs and spinal cord, were better defined, and the accuracy of the port designs and shielding was improved. The difference between the DRRs and the portal films were less than 3 mm in the vertebral contour. CSI in the supine position is feasible in patients who cannot

  9. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    Energy Technology Data Exchange (ETDEWEB)

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow

  10. A New Simulation Technique for Study of Collisionless Shocks: Self-Adaptive Simulations

    International Nuclear Information System (INIS)

    Karimabadi, H.; Omelchenko, Y.; Driscoll, J.; Krauss-Varban, D.; Fujimoto, R.; Perumalla, K.

    2005-01-01

    The traditional technique for simulating physical systems modeled by partial differential equations is by means of time-stepping methodology where the state of the system is updated at regular discrete time intervals. This method has inherent inefficiencies. In contrast to this methodology, we have developed a new asynchronous type of simulation based on a discrete-event-driven (as opposed to time-driven) approach, where the simulation state is updated on a 'need-to-be-done-only' basis. Here we report on this new technique, show an example of particle acceleration in a fast magnetosonic shockwave, and briefly discuss additional issues that we are addressing concerning algorithm development and parallel execution

  11. The development of damage identification methods for buildings with image recognition and machine learning techniques utilizing aerial photographs of the 2016 Kumamoto earthquake

    Science.gov (United States)

    Shohei, N.; Nakamura, H.; Fujiwara, H.; Naoichi, M.; Hiromitsu, T.

    2017-12-01

    It is important to obtain schematic information on the damage situation immediately after an earthquake by utilizing photographs shot from an airplane, for the investigation and decision-making of the authorities. In the case of the 2016 Kumamoto earthquake, we acquired more than 1,800 orthographic projection photographs adjacent to the damaged areas. These photos were taken between April 16th and 19th by airplanes; we then classified the damage to all buildings on a 4-level scale and organized the results as approximately 296,000 GIS records corresponding to the fundamental geospatial data published by the Geospatial Information Authority of Japan. These data were organized through the effort of hundreds of engineers; however, manual work alone is not considered practical for more extensive disasters such as the Nankai Trough earthquake. We have therefore been developing an automatic damage identification method utilizing image recognition and machine learning techniques. First, we extracted training data for more than 10,000 buildings, evenly distributed over the 4 damage grades. With these training data, we raster-scan the entire images over each scanning range and clip patch images that each represent a damage level. Using these patch images, we have been developing discriminant models in two ways. One is a model using a Support Vector Machine (SVM): a feature descriptor is extracted from each patch image, the descriptors are quantized into histograms by the Bag of Visual Words (BoVW) method, and the damage grades are then separated by the SVM. The other is a model using a multi-layered neural network: a multi-layered neural network is designed, patch images and damage levels based on visual judgement are input, and the learning parameters are then optimized with the error backpropagation method. Using both discriminant models, we will discriminate damage levels in each patch and then create the image that shows
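
The BoVW quantization step can be sketched as follows. This is a hypothetical miniature with synthetic two-dimensional "features" and a hand-made three-word codebook, and it substitutes a nearest-centroid classifier for the paper's SVM so the example stays self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # 3 toy "visual words"

def bovw_histogram(features, codebook):
    """Assign each local feature to its nearest codeword and return the
    normalized word-count histogram (the BoVW descriptor of a patch)."""
    d = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    hist = np.bincount(d.argmin(axis=1), minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def make_patch(word_mix):
    """Synthetic 'patch': 50 local features scattered around codewords
    (a stand-in for real descriptors extracted from aerial imagery)."""
    idx = rng.choice(len(codebook), size=50, p=word_mix)
    return codebook[idx] + rng.normal(0.0, 0.05, size=(50, 2))

# class prototypes: "intact" patches dominated by word 0, "collapsed" by word 1
train = {"intact": bovw_histogram(make_patch([0.8, 0.1, 0.1]), codebook),
         "collapsed": bovw_histogram(make_patch([0.1, 0.8, 0.1]), codebook)}

def classify(features):
    """Nearest-centroid decision on BoVW histograms (SVM stand-in)."""
    h = bovw_histogram(features, codebook)
    return min(train, key=lambda k: float(np.linalg.norm(train[k] - h)))
```

In the real pipeline the codebook would be learned by clustering descriptors from the training patches, and an SVM would separate the four damage grades in histogram space.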

  12. Using simulation-optimization techniques to improve multiphase aquifer remediation

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, S.; Pruess, K. [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    1995-03-01

    The T2VOC computer model for simulating the transport of organic chemical contaminants in non-isothermal multiphase systems has been coupled to the ITOUGH2 code, which solves parameter optimization problems. This allows one to use linear programming and simulated annealing techniques to solve groundwater management problems, i.e. the optimization of operations for multiphase aquifer remediation. A cost function has to be defined, containing the actual and hypothetical expenses of a cleanup operation, which depend - directly or indirectly - on the state variables calculated by T2VOC. Subsequently, the code iteratively determines a remediation strategy (e.g. pumping schedule) which minimizes, for instance, pumping and energy costs, the time for cleanup, and residual contamination. We present an illustrative sample problem to discuss potential applications of the code. The study shows that the techniques developed for estimating model parameters can be successfully applied to the solution of remediation management problems. The resulting optimum pumping scheme depends, however, on the formulation of the remediation goals and the relative weighting between individual terms of the cost function.
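
A minimal sketch of the simulated annealing machinery, applied to a hypothetical single-variable pumping-rate cost (a quadratic trade-off between residual contamination and energy cost, not the T2VOC-based cost function of the paper):

```python
import math
import random

def anneal(cost, x0, lo, hi, steps=20000, t0=1.0, seed=2):
    """Simulated annealing: Metropolis acceptance with a geometric
    cooling schedule, bounded random-walk proposals."""
    rng = random.Random(seed)
    x = best = x0
    fx = fbest = cost(x0)
    for k in range(steps):
        temp = t0 * (0.999 ** k)
        y = min(hi, max(lo, x + rng.gauss(0.0, 0.1 * (hi - lo))))  # neighbour
        fy = cost(y)
        # always accept downhill moves; accept uphill with Boltzmann probability
        if fy < fx or rng.random() < math.exp((fx - fy) / max(temp, 1e-12)):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# hypothetical cleanup cost: under-pumping leaves contamination,
# over-pumping wastes energy; the optimum pumping rate is q = 3
cost = lambda q: (q - 3.0) ** 2 + 1.0
q_opt, c_opt = anneal(cost, x0=0.5, lo=0.0, hi=10.0)
```

In the actual coupled T2VOC/ITOUGH2 setup, each cost evaluation is a full multiphase flow simulation, which is why the formulation and weighting of the cost terms matter so much.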

  13. Simulation error propagation for a dynamic rod worth measurement technique

    International Nuclear Information System (INIS)

    Kastanya, D.F.; Turinsky, P.J.

    1996-01-01

    The Krško nuclear station introduced the dynamic rod worth measurement (DRWM) technique, subsequently adapted by Westinghouse, for measuring pressurized water reactor rod worths. This technique has the potential for reduced test time and primary loop waste water versus alternatives. The measurement is performed starting from a slightly supercritical state with all rods out (ARO), driving a bank in at the maximum stepping rate, and recording the ex-core detector responses and bank position as a function of time. The static bank worth is obtained by (1) using the ex-core detector responses to obtain the core average flux, (2) using the core average flux in the inverse point-kinetics equations to obtain the dynamic bank worth, and (3) converting the dynamic bank worth to the static bank worth. In this data interpretation process, various calculated quantities obtained from a core simulator are utilized. This paper presents an analysis of the sensitivity of the deduced static bank worth to core simulator errors.
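
Step (2), recovering reactivity from the measured flux via inverse point kinetics, can be sketched with a single delayed-neutron group (the real DRWM analysis uses six precursor groups and simulator-derived corrections; the kinetics parameters below are illustrative):

```python
def inverse_point_kinetics(n, dt, beta=0.0065, lam=0.08, Lambda=2e-5):
    """One-delayed-group inverse point kinetics: given a flux trace n(t),
    integrate the precursor equation forward and solve the kinetics
    equation dn/dt = ((rho - beta)/Lambda)*n + lam*C for rho(t)."""
    C = beta * n[0] / (Lambda * lam)          # precursors in equilibrium at t=0
    rho = []
    for k in range(1, len(n)):
        dndt = (n[k] - n[k - 1]) / dt
        C += dt * (beta / Lambda * n[k - 1] - lam * C)  # dC/dt = beta/Lambda*n - lam*C
        rho.append(beta + Lambda / n[k] * (dndt - lam * C))
    return rho

# sanity check: a steady flux trace must yield zero reactivity
rho = inverse_point_kinetics([1.0] * 200, dt=0.01)
```

The steady-flux check above is the simplest consistency test; errors in the simulator-supplied quantities enter this chain and propagate into the deduced static worth, which is the sensitivity the paper analyzes.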

  14. Numerical techniques for large cosmological N-body simulations

    International Nuclear Information System (INIS)

    Efstathiou, G.; Davis, M.; Frenk, C.S.; White, S.D.M.

    1985-01-01

    We describe and compare techniques for carrying out large N-body simulations of the gravitational evolution of clustering in the fundamental cube of an infinite periodic universe. In particular, we consider both particle mesh (PM) codes and P³M codes in which a higher resolution force is obtained by direct summation of contributions from neighboring particles. We discuss the mesh-induced anisotropies in the forces calculated by these schemes, and the extent to which they can model the desired 1/r² particle-particle interaction. We also consider how transformation of the time variable can improve the efficiency with which the equations of motion are integrated. We present tests of the accuracy with which the resulting schemes conserve energy and are able to follow individual particle trajectories. We have implemented an algorithm which allows initial conditions to be set up to model any desired spectrum of linear growing mode density fluctuations. A number of tests demonstrate the power of this algorithm and delineate the conditions under which it is effective. We carry out several test simulations using a variety of techniques in order to show how the results are affected by dynamic range limitations in the force calculations, by boundary effects, by residual artificialities in the initial conditions, and by the number of particles employed. For most purposes cosmological simulations are limited by the resolution of their force calculation rather than by the number of particles they can employ. For this reason, while PM codes are quite adequate to study the evolution of structure on large scale, P³M methods are to be preferred, in spite of their greater cost and complexity, whenever the evolution of small-scale structure is important
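
The 1/r² particle-particle interaction and the energy-conservation tests mentioned above can be illustrated with a leapfrog (kick-drift-kick) integrator, the time-stepping scheme commonly used in such codes. A minimal sketch for a single test particle on a circular orbit around a fixed unit point mass (G = M = 1), not the periodic-cube setup of the paper:

```python
import math

def leapfrog_orbit(steps=6283, dt=0.001):
    """Kick-drift-kick leapfrog for a softening-free 1/r^2 force.
    For a circular orbit of radius 1 and speed 1 the exact specific
    energy is E = v^2/2 - 1/r = -0.5; leapfrog keeps the energy error
    bounded at O(dt^2)."""
    x, y, vx, vy = 1.0, 0.0, 0.0, 1.0

    def acc(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -x / r3, -y / r3              # a = -r / |r|^3

    ax, ay = acc(x, y)
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
        x += dt * vx; y += dt * vy                 # drift
        ax, ay = acc(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    energy = 0.5 * (vx * vx + vy * vy) - 1.0 / math.hypot(x, y)
    return x, y, energy

x, y, energy = leapfrog_orbit()
```

The bounded energy error of this symplectic scheme is what makes the energy-conservation tests described in the abstract a meaningful diagnostic of force-calculation accuracy.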

  15. Simulation of wind turbine wakes using the actuator line technique.

    Science.gov (United States)

    Sørensen, Jens N; Mikkelsen, Robert F; Henningson, Dan S; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J

    2015-02-28

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today largely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison of experimental results of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
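
The proper orthogonal decomposition applied to the wake snapshots reduces, in the snapshot method, to an SVD of the mean-subtracted snapshot matrix. A minimal sketch on synthetic data, where two imposed coherent structures stand in for real wake snapshots:

```python
import numpy as np

def pod_modes(snapshots, n_modes):
    """POD via the thin SVD of a snapshot matrix (each column is one
    flow snapshot).  Returns the leading spatial modes and the fraction
    of fluctuation energy they capture (squared singular values)."""
    X = snapshots - snapshots.mean(axis=1, keepdims=True)  # remove mean flow
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    energy = float((s[:n_modes] ** 2).sum() / (s ** 2).sum())
    return U[:, :n_modes], energy

# synthetic "wake": two coherent structures plus weak noise
rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0 * np.pi, 64)
t = np.linspace(0.0, 2.0 * np.pi, 40)
snaps = (np.outer(np.sin(x), np.cos(t))
         + 0.5 * np.outer(np.sin(2.0 * x), np.sin(t))
         + 0.01 * rng.normal(size=(64, 40)))

modes, energy = pod_modes(snaps, 2)
```

Because the synthetic field is essentially rank two, the two leading modes capture nearly all the fluctuation energy; in a real actuator-line LES the energy decay across modes reveals the dominant wake structures.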

  16. Documentation of the ground for the planned MERO-IKL oil pipeline using the remote sensing technique. Annex P-5: Aerial photographs of the Nelahozeves - national border segment

    International Nuclear Information System (INIS)

    1994-02-01

    The remote sensing method was employed to obtain complete photographic documentation of the planned route for the Ingolstadt-Kralupy-Litvinov pipeline; sites of potentially hazardous sources of soil or water pollution were identified. (J.B.). 83 figs

  17. Photographic materials

    International Nuclear Information System (INIS)

    Jamieson, P.B.

    1980-01-01

    Radiographic films based on silver halides are normally handled under red or orange safelights to prevent fogging due to their sensitivity to white light. The present invention relates to ultraviolet radiation sensitive material which can be handled under virtually white light without significant fogging. The film material is comprised of a base having at least one layer of a photographic silver halide emulsion and a yellow filter dye screening the emulsion from visible radiation. The silver halide emulsion contains 50-100 mole % of silver chloride, the higher the silver chloride content, the lower the visible light sensitivity. The nature and properties of the yellow filter dye are described. When recording an X-ray image, the film is loaded into the camera under white safelight conditions from which light of wavelength shorter than 400 nm is excluded. The film is in contact with one or more phosphor screens capable when struck by X-rays of emitting ultraviolet radiation, the screens having a peak ultraviolet emission within the wavelength range of 250-380 nm. After X-ray exposure, the film is removed and developed. Two examples illustrating the invention are given. (U.K.)

  18. Parallel Reservoir Simulations with Sparse Grid Techniques and Applications to Wormhole Propagation

    KAUST Repository

    Wu, Yuanqing

    2015-01-01

    the traditional simulation technique relying on the Darcy framework, we propose a new framework called Darcy-Brinkman-Forchheimer framework to simulate wormhole propagation. Furthermore, to process the large quantity of cells in the simulation grid and shorten

  19. A Monte Carlo simulation technique to determine the optimal portfolio

    Directory of Open Access Journals (Sweden)

    Hassan Ghodrati

    2014-03-01

    During the past few years, there have been several studies on portfolio management. One of the primary concerns on any stock market is to detect the risk associated with various assets. One of the recognized methods to measure, forecast, and manage the existing risk is Value at Risk (VaR), which has drawn much attention from financial institutions in recent years. VaR is a method for recognizing and evaluating risk which uses standard statistical techniques, and the method has been used increasingly in other fields. The present study measured the value at risk of 26 companies from the chemical industry on the Tehran Stock Exchange over the period 2009-2011 using the Monte Carlo simulation technique at the 95% confidence level. The variable used in the present study was the daily return resulting from daily stock price changes. Moreover, the optimal investment weight was determined for each selected stock using a hybrid Markowitz and Winker model. The results showed that the maximum loss would not exceed 1,259,432 Rials at the 95% confidence level on the following day.
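
The Monte Carlo VaR computation can be sketched for a single hypothetical asset (the drift and volatility below are illustrative, not values from the study):

```python
import numpy as np

def monte_carlo_var(mu, sigma, n_sims=100_000, conf=0.95, seed=7):
    """Single-asset Monte Carlo VaR: simulate daily returns and report
    the loss that will not be exceeded with probability `conf`.  Here
    returns are assumed normal; the real study estimates the return
    distribution from historical daily prices."""
    rng = np.random.default_rng(seed)
    returns = rng.normal(mu, sigma, n_sims)        # simulated daily returns
    # VaR is the loss at the (1 - conf) quantile of the return distribution
    return float(-np.quantile(returns, 1.0 - conf))

# hypothetical stock: 0.05% mean daily return, 2% daily volatility
var95 = monte_carlo_var(mu=0.0005, sigma=0.02)
```

For normal returns this should converge to the analytic value 1.645*sigma - mu; extending to a portfolio only requires simulating correlated returns (e.g. via a Cholesky factor of the covariance matrix) and applying the investment weights before taking the quantile.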

  20. Adobe Photoshop CC for photographers

    CERN Document Server

    Evening, Martin

    2014-01-01

    Adobe Photoshop for Photographers: 2014 Release, by Photoshop hall-of-famer and acclaimed digital imaging professional Martin Evening, has been fully updated to include detailed instruction for all of the updates to Photoshop CC 2014 on Adobe's Creative Cloud, including significant new features such as Focus Area selections, enhanced Content-Aware filling, and new Spin and Path blur gallery effects. This guide covers all the tools and techniques photographers and professional image editors need to know when using Photoshop, from workflow guidance to core skills to advanced techniques for professionals.

  1. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management; it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which

  2. Preflight screening techniques for centrifuge-simulated suborbital spaceflight.

    Science.gov (United States)

    Pattarini, James M; Blue, Rebecca S; Castleberry, Tarah L; Vanderploeg, James M

    2014-12-01

    Historically, space has been the venue of the healthy individual. With the advent of commercial spaceflight, we face the novel prospect of routinely exposing spaceflight participants (SFPs) with multiple comorbidities to the space environment. Preflight screening procedures must be developed to identify those individuals at increased risk during flight. We examined the responses of volunteers to centrifuge accelerations mimicking commercial suborbital spaceflight profiles to evaluate how potential SFPs might tolerate such forces. We evaluated our screening process for medical approval of subjects for centrifuge participation for applicability to commercial spaceflight operations. All registered subjects completed a medical questionnaire, physical examination, and electrocardiogram. Subjects with identified concerns, including cardiopulmonary disease, hypertension, and diabetes, were required to provide documentation of their conditions. There were 335 subjects who registered for the study, 124 who completed all prescreening, and 86 who participated in centrifuge trials. Due to prior medical history, five subjects were disqualified, most commonly for psychiatric reasons or uncontrolled medical conditions. Of the subjects approved, four individuals experienced abnormal physiological responses to centrifuge profiles, including one back strain and three anxiety reactions. The screening methods used were judged to be sufficient to identify individuals physically capable of tolerating simulated suborbital flight. Improved methods will be needed to identify susceptibility to anxiety reactions. While severe or uncontrolled disease was excluded, many subjects successfully participated in centrifuge trials despite medical histories of disease that are disqualifying under historical spaceflight screening regimes. Such screening techniques are applicable for use in future commercial spaceflight operations.

  3. Development of joining techniques for fabrication of fuel rod simulators

    International Nuclear Information System (INIS)

    Moorhead, A.J.; McCulloch, R.W.; Reed, R.W.; Woodhouse, J.J.

    1980-10-01

    Many of the safety-related thermal-hydraulic tests on nuclear reactors are conducted not in the reactor itself, but in mockup segments of a core that use resistance-heated fuel rod simulators (FRS) in place of the radioactive fuel rods. Laser welding and furnace brazing techniques are described for joining subassemblies for FRS that have survived up to 1000 h of steady-state operation at cladding temperatures of 700 to 1100°C and over 5000 thermal transients ranging from 10 to 100°C/s. A pulsed-laser welding procedure that includes the use of small-diameter filler wire is used to join one end of a resistance heating element of Pt-8 W, Fe-22 Cr-5.5 Al-0.5 Co, or 80 Ni-20 Cr (wt %) to a tubular conductor of an appropriate intermediate material. The other end of the heating element is laser welded to an end plug, which in turn is welded to a central conductor rod.

  4. Experiencing Photographs Qua Photographs: What's So Special about Them?

    Directory of Open Access Journals (Sweden)

    Jiri Benovsky

    2013-01-01

    Full Text Available Merely rhetorically, and answering in the negative, Kendall Walton has asked: "Isn't photography just another method people have of making pictures, one that merely uses different tools and materials; cameras, photosensitive paper, and darkroom equipment, rather than canvas, paint, and brushes? And don't the results differ only contingently and in degree, not fundamentally, from pictures of other kinds?" Contrary to Walton and others, I answer with a resounding "Yes" to Walton's questions in this article. It is a widely shared view that photographs are somehow special and that they fundamentally differ from hand-made pictures such as paintings, both from a phenomenological point of view (in the way we experience them) and from an epistemic point of view (since they are supposed to have a different, that is, greater, epistemic value than paintings, one that gives us privileged access to the world). I reject almost the totality of these claims and, as a consequence, there remains little difference between photographs and paintings. As we shall see, "photographs are always partly paintings," a claim that is true not only of retouched digital photographs but of all photographs, including traditional ones made using photosensitive film and development techniques.

  5. Glacier Photograph Collection

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Glacier Photograph Collection is a database of photographs of glaciers from around the world, some dating back to the mid-1850's, that provide an historical...

  6. Teaching Behavioral Modeling and Simulation Techniques for Power Electronics Courses

    Science.gov (United States)

    Abramovitz, A.

    2011-01-01

    This paper suggests a pedagogical approach to teaching the subject of behavioral modeling of switch-mode power electronics systems through simulation by general-purpose electronic circuit simulators. The methodology is oriented toward electrical engineering (EE) students at the undergraduate level, enrolled in courses such as "Power…

  7. Determine the feasibility of techniques for simulating coal dust explosions

    CSIR Research Space (South Africa)

    Kirsten, JT

    1994-07-01

    Full Text Available The primary objective of this work is to assess the feasibility of reliably simulating the coal dust explosion process taking place in the Kloppersbos tunnel with a computer model. Secondary objectives are to investigate the viability of simulating...

  8. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  9. A Simulation of AI Programming Techniques in BASIC.

    Science.gov (United States)

    Mandell, Alan

    1986-01-01

    Explains the functions of and the techniques employed in expert systems. Offers the program "The Periodic Table Expert," as a model for using artificial intelligence techniques in BASIC. Includes the program listing and directions for its use on: Tandy 1000, 1200, and 2000; IBM PC; PC Jr; TRS-80; and Apple computers. (ML)

  10. 360-degree videos: a new visualization technique for astrophysical simulations

    Science.gov (United States)

    Russell, Christopher M. P.

    2017-11-01

    360-degree videos are a new type of movie that renders over all 4π steradian. Video sharing sites such as YouTube now allow this unique content to be shared via virtual reality (VR) goggles, hand-held smartphones/tablets, and computers. Creating 360° videos from astrophysical simulations is not only a new way to view these simulations as you are immersed in them, but is also a way to create engaging content for outreach to the public. We present what we believe is the first 360° video of an astrophysical simulation: a hydrodynamics calculation of the central parsec of the Galactic centre. We also describe how to create such movies, and briefly comment on what new science can be extracted from astrophysical simulations using 360° videos.

  11. Monte Carlo simulation of tomography techniques using the platform Gate

    International Nuclear Information System (INIS)

    Barbouchi, Asma

    2007-01-01

    Simulations play a key role in functional imaging, with applications ranging from scanner design to scatter correction and protocol optimisation. GATE (Geant4 Application for Tomographic Emission) is a platform for Monte Carlo simulation. It is based on Geant4 to generate and track particles and to model geometry and physics processes. Explicit modelling of time includes detector motion, time of flight, and tracer kinetics. Interfaces to voxellised models and image reconstruction packages improve the integration of GATE in the global modelling cycle. In this work, Monte Carlo simulations are used to understand and optimise the gamma camera's performance. We study the effect of the distance between source and collimator, the diameter of the holes, and the thickness of the collimator on the spatial resolution, energy resolution and efficiency of the gamma camera. We also study the reduction of simulation time and implement a model of the left ventricle in GATE. (Author). 7 refs

  12. Validation techniques of agent based modelling for geospatial simulations

    OpenAIRE

    Darvishi, M.; Ahmadi, G.

    2014-01-01

    One of the most interesting aspects of modelling and simulation study is to describe the real world phenomena that have specific properties; especially those that are in large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases it is impossible. Therefore, Miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understand the world. Agent...

  13. Enamel dose calculation by electron paramagnetic resonance spectral simulation technique

    International Nuclear Information System (INIS)

    Dong Guofu; Cong Jianbo; Guo Linchao; Ning Jing; Xian Hong; Wang Changzhen; Wu Ke

    2011-01-01

    Objective: To optimize enamel electron paramagnetic resonance (EPR) spectral processing by using an EPR spectral simulation method, so as to improve the accuracy of enamel EPR dosimetry and reduce artificial error. Methods: Multi-component superimposed EPR powder-spectrum simulation software was developed to simulate EPR spectrum models of the background signal (BS) and the radiation-induced signal (RS) of irradiated enamel, respectively. The RS was extracted from the multi-component superimposed spectrum of irradiated enamel and its amplitude was calculated. A dose-response curve was then established for calculating the doses of a group of enamel samples. The estimated doses were compared with those calculated by the traditional method. Results: The BS was simulated as a powder spectrum of Gaussian line shape with the spectrum parameters g=2.0035 and Hpp=0.65-1.1 mT. The RS was also simulated as a powder spectrum, but with axisymmetric spectrum characteristics; its spectrum parameters were g⊥=2.0018, g∥=1.9965, and Hpp=0.335-0.4 mT. The amplitude of the RS had a linear response to radiation dose, with the regression equation y=240.74x+76724 (R²=0.9947). The expectation of the relative error of dose estimation was 0.13. Conclusions: The EPR simulation method has somewhat improved the accuracy and reliability of enamel EPR dose estimation. (authors)
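As a minimal illustration, the reported dose-response regression y = 240.74x + 76724 (RS amplitude y versus dose x) can be inverted to estimate dose from a measured amplitude. The amplitude value used below is an arbitrary example, not measured data.

```python
# Inverting the reported regression y = 240.74x + 76724 (RS amplitude y
# versus dose x) to estimate dose from a measured amplitude.
def dose_from_amplitude(y, slope=240.74, intercept=76724.0):
    return (y - intercept) / slope

# Arbitrary example amplitude (not measured data):
dose = dose_from_amplitude(100000.0)
```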

  14. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    Energy Technology Data Exchange (ETDEWEB)

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  15. Advancing botnet modeling techniques for military and security simulations

    Science.gov (United States)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, a bot army can be extremely large: it can comprise tens of thousands, if not millions, of compromised computers, or it can be as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.

  16. Simulation techniques for determining reliability and availability of technical systems

    International Nuclear Information System (INIS)

    Lindauer, E.

    1975-01-01

    The system is described in the form of a fault tree, with components representing partial functions of the system and connections that reproduce the logical structure of the system. Both have the states intact or failed; they are defined here as in the programme FESIVAR of the IRS. To simulate components according to the given probabilities, pseudo-random numbers are applied; these are numbers whose sequence is determined by the generating algorithm, but which for the given purpose sufficiently exhibit the behaviour of randomly successive numbers. This method of simulation is compared with deterministic methods. (HP/LH) [de
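The simulation idea can be sketched as follows. The three-component fault tree, its AND/OR gate structure, and the failure probabilities are illustrative assumptions, not the FESIVAR model.

```python
import random

# Illustrative fault-tree Monte Carlo: each component is intact or failed
# according to its failure probability (drawn with pseudo-random numbers),
# and the top event is evaluated from the component states.
def simulate_top_event(p_fail, n_trials=200_000, seed=42):
    rng = random.Random(seed)  # seeded pseudo-random sequence, reproducible
    failures = 0
    for _ in range(n_trials):
        a, b, c = (rng.random() < p for p in p_fail)
        # Top event = A AND (B OR C): an AND gate over an OR gate.
        if a and (b or c):
            failures += 1
    return failures / n_trials

p_top = simulate_top_event([0.1, 0.2, 0.3])
# Analytic value for this tree: 0.1 * (1 - 0.8 * 0.7) = 0.044
```

Comparing the simulated estimate against the analytic value is the standard check on such a deterministic-versus-stochastic comparison.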

  17. Development of a technique for inflight jet noise simulation. I, II

    Science.gov (United States)

    Clapper, W. S.; Stringas, E. J.; Mani, R.; Banerian, G.

    1976-01-01

    Several possible noise simulation techniques were evaluated, including closed circuit wind tunnels, free jets, rocket sleds and high speed trains. The free jet technique was selected for demonstration and verification. The first paper describes the selection and development of the technique and presents results for simulation and in-flight tests of the Learjet, F106, and Bertin Aerotrain. The second presents a theoretical study relating the two sets of noise signatures. It is concluded that the free jet simulation technique provides a satisfactory assessment of in-flight noise.

  18. Two dimensional numerical simulation of gas discharges: comparison between particle-in-cell and FCT techniques

    Energy Technology Data Exchange (ETDEWEB)

    Soria-Hoyo, C; Castellanos, A [Departamento de Electronica y Electromagnetismo, Facultad de Fisica, Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain); Pontiga, F [Departamento de Fisica Aplicada II, EUAT, Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail: cshoyo@us.es

    2008-10-21

    Two different numerical techniques have been applied to the numerical integration of equations modelling gas discharges: a finite-difference flux corrected transport (FD-FCT) technique and a particle-in-cell (PIC) technique. The PIC technique here implemented has been specifically designed for the simulation of 2D electrical discharges using cylindrical coordinates. The development and propagation of a streamer between two parallel electrodes has been used as a convenient test to compare the performance of both techniques. In particular, the phase velocity of the cathode directed streamer has been used to check the internal consistency of the numerical simulations. The results obtained from the two techniques are in reasonable agreement with each other, and both techniques have proved their ability to follow the high gradients of charge density and electric field present in this type of problems. Moreover, the streamer velocities predicted by the simulation are in accordance with the typical experimental values.

  19. Two dimensional numerical simulation of gas discharges: comparison between particle-in-cell and FCT techniques

    International Nuclear Information System (INIS)

    Soria-Hoyo, C; Castellanos, A; Pontiga, F

    2008-01-01

    Two different numerical techniques have been applied to the numerical integration of equations modelling gas discharges: a finite-difference flux corrected transport (FD-FCT) technique and a particle-in-cell (PIC) technique. The PIC technique here implemented has been specifically designed for the simulation of 2D electrical discharges using cylindrical coordinates. The development and propagation of a streamer between two parallel electrodes has been used as a convenient test to compare the performance of both techniques. In particular, the phase velocity of the cathode directed streamer has been used to check the internal consistency of the numerical simulations. The results obtained from the two techniques are in reasonable agreement with each other, and both techniques have proved their ability to follow the high gradients of charge density and electric field present in this type of problems. Moreover, the streamer velocities predicted by the simulation are in accordance with the typical experimental values.

  20. Simulation tools for industrial applications of phased array inspection techniques

    International Nuclear Information System (INIS)

    Mahaut, St.; Roy, O.; Chatillon, S.; Calmon, P.

    2001-01-01

    Ultrasonic phased array techniques have been developed at the French Atomic Energy Commission in order to improve defect characterization and adaptability to various inspection configurations (complex-geometry specimens). Such transducers allow 'standard' techniques (adjustable beam steering and focusing) or more 'advanced' techniques (for instance, self-focusing on defects). To estimate the performance of those techniques, models have been developed which compute the ultrasonic field radiated by an arbitrary phased array transducer through any complex specimen, and predict the ultrasonic response of various defects inspected with a known beam. Both modelling applications are gathered in the CIVA software, dedicated to NDT expertise. The use of these complementary models allows evaluation of the ability of a phased array to steer and focus the ultrasonic beam, and therefore its relevance for detecting and characterizing defects. These models are specifically developed to give accurate solutions to realistic inspection applications. This paper briefly describes the CIVA models and presents some applications dedicated to the inspection of complex specimens containing various defects, with a phased array used to steer and focus the beam. Defect detection and characterization performances are discussed for the various configurations. Some experimental validations of both models are also presented. (authors)

  1. Estimation of fracture aperture using simulation technique; Simulation wo mochiita fracture kaiko haba no suitei

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, T [Geological Survey of Japan, Tsukuba (Japan); Abe, M [Tohoku University, Sendai (Japan). Faculty of Engineering

    1996-10-01

    Characteristics of amplitude variation around fractures have been investigated using a simulation technique, with the fracture aperture varied. Four models were used. Model 1 was a fracture model having a horizontal fracture at Z=0. For model 2, the fracture was replaced by a group of small fractures. Model 3 had a borehole diameter extended at Z=0 in the shape of a wedge. Model 4 had a low-velocity layer at Z=0. The maximum amplitudes were compared for each depth and each model. For model 1, the amplitude became larger at the depth of the fracture and smaller above the fracture. For model 2, when the cross width D increased to 4 cm, the amplitude approached that of model 1. For model 3, with the extended borehole diameter, when the extension of the borehole diameter ranged between 1 cm and 2 cm, almost no change of amplitude was observed above and below the fracture. However, when the extension of the borehole diameter was 4 cm, the amplitude became smaller above the extended part of the borehole. 3 refs., 4 figs., 1 tab.

  2. Modeling and Simulation Techniques for Large-Scale Communications Modeling

    National Research Council Canada - National Science Library

    Webb, Steve

    1997-01-01

    .... Tests of random number generators were also developed and applied to CECOM models. It was found that synchronization of random number strings in simulations is easy to implement and can provide significant savings for making comparative studies. If synchronization is in place, then statistical experiment design can be used to provide information on the sensitivity of the output to input parameters. The report concludes with recommendations and an implementation plan.
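Synchronizing random number strings between compared configurations (common random numbers) can be sketched as follows. The two toy "system variants" and their cost functions below are hypothetical, used only to show the variance savings for comparative studies.

```python
import numpy as np

# Synchronized random number strings: feed the SAME simulated input stream
# to both system variants so that their difference has low variance.
rng = np.random.default_rng(1)
demand = rng.exponential(scale=10.0, size=50_000)     # shared input stream

cost_a = np.maximum(demand - 8.0, 0.0)                # variant A (hypothetical)
cost_b = np.maximum(demand - 10.0, 0.0)               # variant B (hypothetical)
diff_crn = cost_a - cost_b                            # common random numbers

# Same comparison with an independent stream for variant B:
demand_b = np.random.default_rng(2).exponential(scale=10.0, size=50_000)
diff_indep = cost_a - np.maximum(demand_b - 10.0, 0.0)
# diff_crn.var() is far smaller than diff_indep.var(), so far fewer
# replications are needed to resolve the difference between variants.
```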

  3. Assessing suturing techniques using a virtual reality surgical simulator.

    Science.gov (United States)

    Kazemi, Hamed; Rappel, James K; Poston, Timothy; Hai Lim, Beng; Burdet, Etienne; Leong Teo, Chee

    2010-09-01

    Advantages of virtual-reality simulators for surgical skill assessment and training include more training time, no risk to the patient, repeatable difficulty levels, and reliable feedback, without the resource demands and ethical issues of animal-based training. We tested this for a key subtask and showed a strong link between skill in the simulator and in reality. Suturing performance was assessed for four groups of participants, including experienced surgeons and naive subjects, on a custom-made virtual-reality simulator. Each subject performed the experiment 30 times, using five different types of needles to perform a standardized suture placement task. Traditional metrics of performance as well as new metrics enabled by our system were proposed, and the data indicate differences between trained and untrained performance. In all traditional parameters, such as time, number of attempts, and quantity of motion, the medical surgeons outperformed the other three groups, though the differences were not significant. However, motion smoothness, penetration and exit angles, tear-size areas, and orientation change differed statistically significantly between the trained group and the untrained group. This suggests that these parameters can be used in virtual microsurgery training.

  4. Measurement and Simulation Techniques For Piezoresistive Microcantilever Biosensor Applications

    Directory of Open Access Journals (Sweden)

    Aan Febriansyah

    2012-12-01

    Full Text Available Applications of microcantilevers as biosensors have been explored by many researchers for applications in medicine, biology, chemistry, and environmental monitoring. This research discusses the design of a measurement method and simulations for a piezoresistive microcantilever biosensor, consisting of the design of a Wheatstone bridge circuit as the object detector, simulation of the resonance frequency shift based on the Euler-Bernoulli beam equation, and microcantilever vibration simulation using COMSOL Multiphysics 3.5. The piezoresistive microcantilever used here is a Seiko Instrument Technology (Japan) product with a length of 110 μm, width of 50 μm, and thickness of 1 μm. The microcantilever mass is 12.815 ng, including the mass of the receptor. The sample object in this research is the bacterium E. coli. The mass of one bacterium is assumed to be 0.3 pg. Simulation results show that the mass of one bacterium will cause a deflection of 0.03053 nm and a resonance frequency of 118.90 kHz, while four bacteria will cause a deflection of 0.03054 nm and a resonance frequency of 118.68 kHz. These data indicate that increasing the bacteria mass increases the deflection and reduces the resonance frequency.
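A lumped-mass sketch shows why added mass lowers the resonance frequency: f = (1/2π)·√(k/m_eff), so f scales as 1/√m. This simple model is illustrative only; the numbers below reuse the paper's stated masses and frequency, but the model is not expected to reproduce the paper's COMSOL results.

```python
import math

# Lumped-mass model: f = (1/(2*pi)) * sqrt(k/m_eff), so adding mass dm
# shifts the resonance to f0 * sqrt(m_eff / (m_eff + dm)).
def frequency_shift(f0_hz, m_eff_kg, dm_kg):
    return f0_hz * (math.sqrt(m_eff_kg / (m_eff_kg + dm_kg)) - 1.0)

m_eff = 12.815e-12   # ~12.815 ng cantilever mass, in kg
dm = 0.3e-15         # ~0.3 pg, the assumed mass of one bacterium, in kg
shift = frequency_shift(118.9e3, m_eff, dm)   # small negative shift, in Hz
```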

  5. Filament winding technique, experiment and simulation analysis on tubular structure

    Science.gov (United States)

    Quanjin, Ma; Rejab, M. R. M.; Kaige, Jiang; Idris, M. S.; Harith, M. N.

    2018-04-01

    The filament winding process has emerged as one of the potential composite fabrication processes with lower costs. Filament-wound products include classic axisymmetric parts (pipes, rings, driveshafts, high-pressure vessels and storage tanks) and non-axisymmetric parts (prismatic non-round sections and pipe fittings). Since a 3-axis filament winding machine has been designed with an inexpensive control system, it is necessary to make a comparison between experiment and simulation on a tubular structure. The aim of this paper is to perform a dry winding experiment using the 3-axis filament winding machine and to simulate the winding process on the tubular structure using CADWIND software with 30°, 45°, and 60° winding angles. The main result indicates that the 3-axis filament winding machine can produce tubular structures with high winding pattern performance at different winding angles. The developed 3-axis winding machine still has weaknesses compared with the CADWIND simulation results for machines with more axes, regarding winding pattern, turnaround impact, process error, thickness, and friction impact. In conclusion, improvements and recommendations for the 3-axis filament winding machine follow from these comparison results, which give an intuitive understanding of its limitations and characteristics.

  6. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)

  7. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed derivations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these derivations are applied to the specific, previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal and no memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and of IIS over CIS for simulations of digital communication systems.
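The translation idea can be illustrated on a toy problem: estimating the small tail probability P(X > t) of a standard normal variable (a stand-in for a bit-error probability) by sampling from a distribution translated to the threshold and reweighting by the likelihood ratio. The setup below is a generic sketch, not the paper's system model.

```python
import math
import numpy as np

# Translation-based importance sampling: draw from N(t, 1) instead of
# N(0, 1) so that "errors" (x > t) are common, and reweight each sample by
# the likelihood ratio N(0,1)/N(t,1) = exp(-t*x + t^2/2).
def tail_prob_is(t, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=t, scale=1.0, size=n)      # translated (biased) draws
    w = np.exp(-t * x + 0.5 * t * t)              # importance weights
    return float(np.mean((x > t) * w))

t = 4.0
estimate = tail_prob_is(t)
exact = 0.5 * math.erfc(t / math.sqrt(2.0))       # closed form for comparison
```

A plain Monte Carlo estimate of the same probability (about 3×10⁻⁵) would need millions of samples to see even a handful of hits; the translated estimator resolves it with the same sample count at far lower variance.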

  8. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    Science.gov (United States)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue-critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution may be required to identify areas with potential for fatigue damage initiation. Early detection of fatigue-critical areas can drive a simplification of the problem size, leading to substantial improvements in solution time and model handling while allowing the critical areas to be processed in greater detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  9. Simulation of land mine detection processes using nuclear techniques

    International Nuclear Information System (INIS)

    Aziz, M.

    2005-01-01

    Computer models were designed to study the processes of land mine detection using nuclear techniques. Parameters that affect the detection were analyzed. Mines of different masses at different depths in the soil are considered using two types of sources, ²⁵²Cf and a 14 MeV neutron source. The capability to differentiate between mines and other objects such as concrete, iron, wood, aluminum, water and polyethylene was analyzed and studied.

  10. Dielectric properties of proteins from simulations: tools and techniques

    Science.gov (United States)

    Simonson, Thomas; Perahia, David

    1995-09-01

Tools and techniques to analyze the dielectric properties of proteins are described. Microscopic dielectric properties are determined by a susceptibility tensor of order 3n, where n is the number of protein atoms. For perturbing charges not too close to the protein, the dielectric relaxation free energy is directly related to the dipole-dipole correlation matrix of the unperturbed protein, or equivalently to the covariance matrix of its atomic displacements. These are straightforward to obtain from existing molecular dynamics packages such as CHARMM or X-PLOR. Macroscopic dielectric properties can be derived from the dipolar fluctuations of the protein, by idealizing the protein as one or more spherical media. The dipolar fluctuations are again directly related to the covariance matrix of the atomic displacements. An interesting consequence is that the quasiharmonic approximation, which by definition exactly reproduces this covariance matrix, gives the protein dielectric constant exactly. Finally a technique is reviewed to obtain normal or quasinormal modes of vibration of symmetric protein assemblies. Using elementary group theory, and eliminating the high-frequency modes of vibration of each monomer, the limiting step in terms of memory and computation is finding the normal modes of a single monomer, with the other monomers held fixed. This technique was used to study the dielectric properties of the Tobacco Mosaic Virus protein disk.
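The two quantities the abstract works with — the covariance matrix of atomic displacements and the dipolar fluctuations it determines — are straightforward to compute from a stored trajectory. A minimal sketch (data layout and function names are my own, not CHARMM or X-PLOR output):

```python
def total_dipole(frame, charges):
    # M = sum_a q_a * r_a for one trajectory frame (list of (x, y, z) tuples)
    return [sum(q * r[x] for q, r in zip(charges, frame)) for x in range(3)]

def dipole_fluctuation(traj, charges):
    # <|M - <M>|^2>, the dipolar fluctuation entering the macroscopic
    # dielectric estimate for the idealized spherical protein
    dipoles = [total_dipole(f, charges) for f in traj]
    mean = [sum(m[x] for m in dipoles) / len(dipoles) for x in range(3)]
    return sum(sum((m[x] - mean[x]) ** 2 for x in range(3))
               for m in dipoles) / len(dipoles)

def displacement_covariance(traj):
    # 3n x 3n covariance matrix of atomic displacements over the frames
    flat = [[c for atom in f for c in atom] for f in traj]
    n = len(flat[0])
    mean = [sum(f[i] for f in flat) / len(flat) for i in range(n)]
    return [[sum((f[i] - mean[i]) * (f[j] - mean[j]) for f in flat) / len(flat)
             for j in range(n)] for i in range(n)]
```

The dipolar fluctuation is a charge-weighted contraction of the same covariance matrix, which is why both microscopic and macroscopic analyses reduce to this one object.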

  11. Purpose compliant visual simulation: towards effective and selective methods and techniques of visualisation and simulation

    NARCIS (Netherlands)

    Daru, R.; Venemans, P.

    1998-01-01

    Visualisation, simulation and communication were always intimately interconnected. Visualisations and simulations impersonate existing or virtual realities. Without those tools it is arduous to communicate mental depictions about virtual objects and events. A communication model is presented to

  12. Variance reduction techniques in the simulation of Markov processes

    International Nuclear Information System (INIS)

    Lessi, O.

    1987-01-01

    We study a functional r of the stationary distribution of a homogeneous Markov chain. It is often difficult or impossible to perform the analytical calculation of r and so it is reasonable to estimate r by a simulation process. A consistent estimator r(n) of r is obtained with respect to a chain with a countable state space. Suitably modifying the estimator r(n) of r one obtains a new consistent estimator which has a smaller variance than r(n). The same is obtained in the case of finite state space
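In the countable-state case, the consistent estimator r(n) described above is a time average of f along one simulated trajectory of the chain. A bare-bones sketch (the variance-reducing modification of the abstract is not reproduced here):

```python
import random

def simulate_mean(P, f, n, rng, state=0):
    # time-average estimator r(n) of r = sum_i pi_i * f(i)
    # P: row-stochastic transition matrix as nested lists
    total = 0.0
    for _ in range(n):
        total += f(state)
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[state]):   # inverse-CDF step of one transition
            acc += p
            if u < acc:
                state = j
                break
    return total / n
```

For a two-state chain with P = [[0.9, 0.1], [0.2, 0.8]] the stationary distribution is (2/3, 1/3), so with f(i) = i the estimator converges to 1/3.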

  13. Simulation of California's Major Reservoirs Outflow Using Data Mining Technique

    Science.gov (United States)

    Yang, T.; Gao, X.; Sorooshian, S.

    2014-12-01

The reservoir's outflow is controlled by reservoir operators, unlike the upstream inflow, and it is the outflow rather than the inflow that matters most to downstream water users. In order to simulate the complicated reservoir operation and extract the outflow decision-making patterns for California's 12 major reservoirs, we build a data-driven, computer-based ("artificial intelligence") reservoir decision-making tool using a regression and classification tree approach, a well-developed statistical and graphical modeling methodology in the field of data mining. A shuffled cross-validation approach is also employed to extract the outflow decision-making patterns and rules based on the selected decision variables (inflow amount, precipitation, timing, water year type, etc.). To show the accuracy of the model, a verification study is carried out comparing the model-generated outflow decisions ("artificial intelligence" decisions) with those made by reservoir operators (human decisions). The simulation results show that the machine-generated outflow decisions are very similar to the real reservoir operators' decisions. This conclusion is based on statistical evaluations using the Nash-Sutcliffe test. The proposed model is able to detect the most influential variables and their weights when the reservoir operators make an outflow decision. While the proposed approach was first applied and tested on California's 12 major reservoirs, the method is universally adaptable to other reservoir systems.
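The Nash–Sutcliffe efficiency used in the verification compares the model error against the variance of the observations: NSE = 1 means a perfect match, and NSE = 0 means the model is no better than predicting the observed mean. A small sketch:

```python
def nash_sutcliffe(obs, sim):
    # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den
```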

  14. Hybrid simulation techniques applied to the earth's bow shock

    Science.gov (United States)

    Winske, D.; Leroy, M. M.

    1985-01-01

The application of a hybrid simulation model, in which the ions are treated as discrete particles and the electrons as a massless charge-neutralizing fluid, to the study of the earth's bow shock is discussed. The essentials of the numerical methods are described in detail: movement of the ions, solution of the electromagnetic fields and electron fluid equations, and imposition of appropriate boundary and initial conditions. Examples of results of calculations for perpendicular shocks are presented which demonstrate the need for a kinetic treatment of the ions to reproduce the correct ion dynamics and the corresponding shock structure. Results for oblique shocks are also presented to show how the magnetic field and ion motion differ from the perpendicular case.

  15. Drift simulation of MH370 debris using superensemble techniques

    Science.gov (United States)

    Jansen, Eric; Coppini, Giovanni; Pinardi, Nadia

    2016-07-01

    On 7 March 2014 (UTC), Malaysia Airlines flight 370 vanished without a trace. The aircraft is believed to have crashed in the southern Indian Ocean, but despite extensive search operations the location of the wreckage is still unknown. The first tangible evidence of the accident was discovered almost 17 months after the disappearance. On 29 July 2015, a small piece of the right wing of the aircraft was found washed up on the island of Réunion, approximately 4000 km from the assumed crash site. Since then a number of other parts have been found in Mozambique, South Africa and on Rodrigues Island. This paper presents a numerical simulation using high-resolution oceanographic and meteorological data to predict the movement of floating debris from the accident. Multiple model realisations are used with different starting locations and wind drag parameters. The model realisations are combined into a superensemble, adjusting the model weights to best represent the discovered debris. The superensemble is then used to predict the distribution of marine debris at various moments in time. This approach can be easily generalised to other drift simulations where observations are available to constrain unknown input parameters. The distribution at the time of the accident shows that the discovered debris most likely originated from the wide search area between 28 and 35° S. This partially overlaps with the current underwater search area, but extends further towards the north. Results at later times show that the most probable locations to discover washed-up debris are along the African east coast, especially in the area around Madagascar. The debris remaining at sea in 2016 is spread out over a wide area and its distribution changes only slowly.
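The superensemble step — reweighting the model realisations so that their combination best matches the discovered debris — can be sketched with likelihood-style weights. This is a simplified stand-in for the paper's procedure; the Gaussian weighting form and all names are assumptions.

```python
import math

def superensemble_weights(member_mse, sigma2=1.0):
    # weight each realisation by a Gaussian likelihood of its
    # mean-squared misfit to the observed debris locations
    raw = [math.exp(-m / (2.0 * sigma2)) for m in member_mse]
    s = sum(raw)
    return [w / s for w in raw]

def combine(predictions, weights):
    # weighted superensemble prediction at each output point
    return [sum(w * p[i] for w, p in zip(weights, predictions))
            for i in range(len(predictions[0]))]
```

A member that fits the observations much better than the others ends up dominating the combination, while members that merely disagree with the data are not discarded outright, only down-weighted.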

  16. Personnel photographic film dosimetry

    International Nuclear Information System (INIS)

    Keirim-Markus, I.B.

    1981-01-01

    Technology of personnel photographic film dosimetry (PPD) based on the photographic effect of ionizing radiation is described briefly. Kinds of roentgen films used in PPD method are enumerated, compositions of a developer and fixing agents for these films are given [ru

  17. Limits of validity of photon-in-cell simulation techniques

    International Nuclear Information System (INIS)

    Reitsma, A. J. W.; Jaroszynski, D. A.

    2008-01-01

A comparison is made between two reduced models for studying laser propagation in underdense plasma; namely, photon kinetic theory and the slowly varying envelope approximation. Photon kinetic theory is a wave-kinetic description of the electromagnetic field where the motion of quasiparticles in photon coordinate-wave number phase space is described by the ray-tracing equations. Numerically, photon kinetic theory is implemented with standard particle-in-cell techniques, which results in a so-called photon-in-cell code. For all the examples presented in this paper, the slowly varying envelope approximation is accurate, and therefore discrepancies indicate the failure of the photon kinetic approximation for these cases. Possible remedies for this failure are discussed at the end of the paper
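The ray-tracing equations that move the photon-kinetic quasiparticles through phase space follow from the dispersion relation ω² = ωp²(x) + c²k²: dx/dt = ∂ω/∂k = c²k/ω and dk/dt = −∂ω/∂x = −(dωp²/dx)/(2ω). A minimal forward-Euler sketch for one quasiparticle (the plasma profile and function names are illustrative):

```python
import math

def trace_ray(x, k, wp2, dwp2, c=1.0, dt=1e-3, steps=1000):
    # wp2(x): squared plasma frequency profile; dwp2(x): its derivative.
    # Forward-Euler integration of the ray-tracing equations.
    for _ in range(steps):
        w = math.sqrt(wp2(x) + (c * k) ** 2)
        x += dt * c * c * k / w        # dx/dt = c^2 k / omega
        k -= dt * dwp2(x) / (2.0 * w)  # dk/dt = -(dwp2/dx) / (2 omega)
    return x, k
```

For a static plasma profile the frequency ω is conserved along the ray, which gives a direct check of the integrator.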

  18. Visualization and simulation techniques for surgical simulators using actual patient's data.

    Science.gov (United States)

    Radetzky, Arne; Nürnberger, Andreas

    2002-11-01

Because of the increasing complexity of surgical interventions, research in surgical simulation has become more and more important over the last years. However, the simulation of tissue deformation is still a challenging problem, mainly due to the short response times that are required for real-time interaction. The demands on hardware and software are even greater if not only modeled human anatomy but the anatomy of actual patients is used. This is required if the surgical simulator is to be used as a training medium for expert surgeons rather than students. In this article, suitable visualization and simulation methods for surgical simulation utilizing actual patient's datasets are described. The advantages and disadvantages of direct and indirect volume rendering for the visualization are discussed, and a neuro-fuzzy system is described which can be used for the simulation of interactive tissue deformations. The neuro-fuzzy system makes it possible to define the deformation behavior based on a linguistic description of the tissue characteristics or to learn the dynamics by using measured data from real tissue. Furthermore, a simulator for minimally-invasive neurosurgical interventions is presented that utilizes the described visualization and simulation methods. The structure of the simulator is described in detail and the results of a system evaluation by an experienced neurosurgeon--a quantitative comparison between different methods of virtual endoscopy as well as a comparison between real brain images and virtual endoscopies--are given. The evaluation showed that the simulator provides higher realism of visualization and simulation than other currently available simulators. Copyright 2002 Elsevier Science B.V.

  19. Monte Carlo simulation techniques for predicting annual power production

    International Nuclear Information System (INIS)

    Cross, J.P.; Bulandr, P.J.

    1991-01-01

    As the owner and operator of a number of small to mid-sized hydroelectric sites, STS HydroPower has been faced with the need to accurately predict anticipated hydroelectric revenues over a period of years. The typical approach to this problem has been to look at each site from a mathematical deterministic perspective and evaluate the annual production from historic streamflows. Average annual production is simply taken to be the area under the flow duration curve defined by the operating and design characteristics of the selected turbines. Minimum annual production is taken to be a historic dry year scenario and maximum production is viewed as power generated under the most ideal of conditions. Such an approach creates two problems. First, in viewing the characteristics of a single site, it does not take into account the probability of such an event occurring. Second, in viewing all sites in a single organization's portfolio together, it does not reflect the varying flow conditions at the different sites. This paper attempts to address the first of these two concerns, that being the creation of a simulation model utilizing the Monte Carlo method at a single site. The result of the analysis is a picture of the production at the site that is both a better representation of anticipated conditions and defined probabilistically
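A Monte Carlo production model along these lines resamples the historic flow record to build many synthetic years, pushes each day's flow through the turbine operating range, and reads off probabilistic production levels instead of a single deterministic number. A hypothetical sketch — the head, efficiency and operating limits are illustrative, not STS HydroPower's figures:

```python
import random

def simulate_year(flows, rng, q_min, q_max, head=20.0, eff=0.85, days=365):
    # one synthetic year: resample the historic daily-flow record (m^3/s)
    energy_kwh = 0.0
    for _ in range(days):
        q = rng.choice(flows)
        q_use = 0.0 if q < q_min else min(q, q_max)   # turbine operating range
        p_kw = eff * 9.81 * q_use * head              # P = eta*rho*g*Q*H, in kW
        energy_kwh += p_kw * 24.0
    return energy_kwh

def production_percentiles(flows, q_min, q_max, n_years=500, seed=1):
    # probabilistic picture of annual production: dry, median and wet outcomes
    rng = random.Random(seed)
    years = sorted(simulate_year(flows, rng, q_min, q_max) for _ in range(n_years))
    return (years[int(0.05 * n_years)],
            years[n_years // 2],
            years[int(0.95 * n_years)])
```

The 5th/50th/95th percentiles replace the "dry year / average / ideal" triple of the deterministic approach with values that carry their probability of occurrence.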

  20. Simulated Annealing Technique for Routing in a Rectangular Mesh Network

    Directory of Open Access Journals (Sweden)

    Noraziah Adzhar

    2014-01-01

Full Text Available In the process of automatic design for printed circuit boards (PCBs), the phase following cell placement is routing. On the other hand, the routing process is a notoriously difficult problem, and even the simplest routing problem, which consists of a set of two-pin nets, is known to be NP-complete. In this research, our routing region is first tessellated into a uniform Nx×Ny array of square cells. The ultimate goal for a routing problem is to achieve complete automatic routing with minimal need for any manual intervention. Therefore, shortest paths for all connections need to be established. While classical Dijkstra’s algorithm guarantees to find the shortest path for a single net, each routed net forms an obstacle for later paths. This adds complexity to the routing of later nets and makes their paths longer than optimal, or sometimes impossible to complete. Today’s sequential routing often applies heuristic methods to further refine the solution. Through this process, all nets are rerouted in a different order to improve the quality of the routing. Because of this, we are motivated to apply simulated annealing, one of the metaheuristic methods, to our routing model to produce better candidate routing sequences.
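The metaheuristic step — reordering the nets to escape the order dependence of sequential routing — has the classic simulated-annealing skeleton below. The cost function stands in for total routed wirelength after rerouting in the given order; all names and parameters are illustrative.

```python
import math
import random

def anneal_order(nets, cost, t0=1.0, t_min=1e-3, alpha=0.9, moves=50, seed=0):
    # simulated annealing over the net routing order: propose a swap of two
    # nets, accept improvements always and degradations with Boltzmann
    # probability exp(-delta / T), then cool geometrically
    rng = random.Random(seed)
    order = list(nets)
    cur = cost(order)
    best, best_cost = list(order), cur
    t = t0
    while t > t_min:
        for _ in range(moves):
            i, j = rng.randrange(len(order)), rng.randrange(len(order))
            if i == j:
                continue
            order[i], order[j] = order[j], order[i]
            new = cost(order)
            if new <= cur or rng.random() < math.exp((cur - new) / t):
                cur = new
                if new < best_cost:
                    best, best_cost = list(order), new
            else:
                order[i], order[j] = order[j], order[i]   # reject: undo the swap
        t *= alpha
    return best, best_cost
```

With a toy cost that charges each net its size times its position in the queue, the optimum is the descending order, which the annealer finds reliably on small instances.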

  1. Agreement between radiographic and photographic trabecular patterns

    International Nuclear Information System (INIS)

    Korstjens, C.M.; Geraets, W.G.M.; Stelt, P.F. van der; Spruijt, R.J.; Mosekilde, L.

    1998-01-01

    Purpose: It has been hypothesized that photographs can facilitate the interpretation of the radiographic characteristics of trabecular bone. The reliability of these photographic and radiographic approaches has been determined, as have various agreements between the two approaches and their correlations with biomechanical characteristics. Material and Methods: Fourteen vertebral bodies were obtained at autopsy from 6 women and 8 men aged 22-76 years. Photographs (n=28) and radiographs (n=28) were taken of midsagittal slices from the third lumbar vertebra. The radiographs and photographs were digitized and the geometric properties of the trabecular architecture were then determined with a digital images analysis technique. Information on the compressive strength and ash density of the vertebral body was also available. Results: The geometric properties of both radiographs and photographs could be measured with a high degree of reliability (Cronbach's α>0.85). Agreement between the radiographic and photographic approaches was mediocre as only the radiographic measurements showed insignificant correlations (p<0.05) with the biomechanical characteristics. We suggest that optical phenomena may result in the significant correlations between the photographs and the biomechanical characteristics. Conclusion: For digital image processing, radiography offers a superior description of the architecture of trabecular bone to that offered by photography. (orig.)

  2. Agreement between radiographic and photographic trabecular patterns

    Energy Technology Data Exchange (ETDEWEB)

    Korstjens, C.M.; Geraets, W.G.M.; Stelt, P.F. van der [Dept. of Oral Radiology, Academic Centre for Dentistry, Amsterdam (Netherlands); Spruijt, R.J. [Div. of Psychosocial Research and Epidemiology, Netherlands Cancer Inst., Amsterdam (Netherlands); Mosekilde, L. [Dept. of Cell Biology, Univ. of Aarhus (Denmark)

    1998-11-01

Purpose: It has been hypothesized that photographs can facilitate the interpretation of the radiographic characteristics of trabecular bone. The reliability of these photographic and radiographic approaches has been determined, as have various agreements between the two approaches and their correlations with biomechanical characteristics. Material and Methods: Fourteen vertebral bodies were obtained at autopsy from 6 women and 8 men aged 22-76 years. Photographs (n=28) and radiographs (n=28) were taken of midsagittal slices from the third lumbar vertebra. The radiographs and photographs were digitized and the geometric properties of the trabecular architecture were then determined with a digital images analysis technique. Information on the compressive strength and ash density of the vertebral body was also available. Results: The geometric properties of both radiographs and photographs could be measured with a high degree of reliability (Cronbach's α>0.85). Agreement between the radiographic and photographic approaches was mediocre as only the radiographic measurements showed insignificant correlations (p<0.05) with the biomechanical characteristics. We suggest that optical phenomena may result in the significant correlations between the photographs and the biomechanical characteristics. Conclusion: For digital image processing, radiography offers a superior description of the architecture of trabecular bone to that offered by photography. (orig.)

  3. Shower library technique for fast simulation of showers in calorimeters of the H1 experiment

    International Nuclear Information System (INIS)

    Raičević, N.; Glazov, A.; Zhokin, A.

    2013-01-01

Fast simulation of showers in calorimeters is very important for particle physics analysis, since shower simulation typically takes a significant amount of the simulation time. At the same time, a simulation must reproduce experimental data in the best possible way. In this paper, a fast simulation of showers in two calorimeters of the H1 experiment is presented. High speed and good quality of shower simulation are achieved by using a shower library technique, in which the detector response is simulated using a collection of stored showers for different particle types and topologies. The library is created using the GEANT programme. The fast simulation based on the shower library is compared to the data collected by the H1 experiment
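The core of a shower library is a keyed store of pre-simulated showers: at simulation time, the detector response is drawn at random from the bin matching the incoming particle's type and energy, instead of running the full GEANT shower. A toy sketch — the binning scheme and class names are illustrative, not the H1 implementation:

```python
import bisect
import random

class ShowerLibrary:
    def __init__(self, energy_edges):
        self.edges = energy_edges        # ascending energy bin edges
        self.bins = {}                   # (particle, bin index) -> stored showers

    def _bin(self, energy):
        # index of the energy bin containing this energy
        return bisect.bisect_right(self.edges, energy) - 1

    def add(self, particle, energy, shower):
        # library creation phase: store a fully simulated shower
        key = (particle, self._bin(energy))
        self.bins.setdefault(key, []).append(shower)

    def sample(self, particle, energy, rng=random):
        # fast-simulation phase: reuse a stored shower from the same bin
        return rng.choice(self.bins[(particle, self._bin(energy))])
```

The speed-up comes from replacing a full shower development by a dictionary lookup plus one random draw; the quality depends on how finely the library is binned in particle type, energy and topology.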

  4. Simulation techniques for spatially evolving instabilities in compressible flow over a flat plate

    NARCIS (Netherlands)

    Wasistho, B.; Geurts, Bernardus J.; Kuerten, Johannes G.M.

    1997-01-01

    In this paper we present numerical techniques suitable for a direct numerical simulation in the spatial setting. We demonstrate the application to the simulation of compressible flat plate flow instabilities. We compare second and fourth order accurate spatial discretization schemes in combination

  5. Progress in the development of a video-based wind farm simulation technique

    OpenAIRE

    Robotham, AJ

    1992-01-01

    The progress in the development of a video-based wind farm simulation technique is reviewed. While improvements have been achieved in the quality of the composite picture created by combining computer generated animation sequences of wind turbines with background scenes of the wind farm site, extending the technique to include camera movements has proved troublesome.

  6. Individual external monitoring system for gamma and X ray evaluation of the individual dose equivalent 'HP(10)', utilizing a photographic dosimetry technique

    International Nuclear Information System (INIS)

    Santoro, Christiana; Filho, Joao Antonio

    2008-01-01

Full text: Individual monitoring evaluates external sources of ionizing radiation X, γ, β and n, to which workers are occupationally exposed, to ensure safe and acceptable radiological conditions in their places of employment. The dose received by workers should comply with the limits authorized by national regulatory organs. Nowadays, there are two radiometric unit systems, based on resolutions of the National Nuclear Energy Commission (NNEC) and the International Commission on Radiation Units and Measurements (ICRU). In the conventional (NNEC) system, the doses received by workers are evaluated through the individual dose Hx, where dosemeters used on the surface of the thorax are calibrated in terms of air kerma; in the recent (ICRU) system, the doses are evaluated through the individual dose equivalent HP(d), where dosemeters are calibrated in terms of dose from a phantom. The recent system improves the method of evaluation by taking into account the scattering and absorption of radiation in the human body. This work adapts a photographic dosimetry service to the recent ICRU publications, for the evaluation of individual monitoring as a function of the individual dose equivalent HP(10) of strongly penetrating radiation. For this, a methodology based on linear programming and the determination of calibration curves is used for the radiation qualities with wide (W) and narrow (N) spectra, as described by the International Organization for Standardization (ISO 4037-1, 1995). These calibration curves offer better accuracy in the determination of doses and energy, which will improve the quality of the service given to society. The results show that the values of individual dose equivalent, evaluated at intervals of 0.2 to 200 mSv, have uncertainties (10%) lower than those recommended by ICRP 75 for individual monitoring; therefore, the developed dose evaluation system complies with the new recommendations proposed by the international commissions. From what has been

  7. Parallel Reservoir Simulations with Sparse Grid Techniques and Applications to Wormhole Propagation

    KAUST Repository

    Wu, Yuanqing

    2015-09-08

In this work, two topics of reservoir simulations are discussed. The first topic is the two-phase compositional flow simulation in a hydrocarbon reservoir. The major obstacle that impedes the applicability of the simulation code is the long run time of the simulation procedure, and thus speeding up the simulation code is necessary. Two means are demonstrated to address the problem: parallelism in physical space and the application of sparse grids in parameter space. The parallel code can attain satisfactory scalability, and the sparse grids can remove the bottleneck of flash calculations. Instead of carrying out the flash calculation in each time step of the simulation, a sparse grid approximation of all possible results of the flash calculation is generated before the simulation. Then the constructed surrogate model is evaluated to approximate the flash calculation results during the simulation. The second topic is the wormhole propagation simulation in a carbonate reservoir. In this work, different from the traditional simulation technique relying on the Darcy framework, we propose a new framework, called the Darcy-Brinkman-Forchheimer (DBF) framework, to simulate wormhole propagation. Furthermore, to process the large quantity of cells in the simulation grid and shorten the long simulation time of the traditional serial code, standard domain-based parallelism is employed, using the Hypre multigrid library. In addition to that, a new technique called the “experimenting field approach” is introduced to set coefficients in the model equations. In the 2D dissolution experiments, different configurations of wormholes and a series of properties simulated by both frameworks are compared. We conclude that the numerical results of the DBF framework are more wormhole-like and more stable than those of the Darcy framework, which demonstrates the advantages of the DBF framework. The scalability of the parallel code is also evaluated, and good scalability can be achieved. Finally, a mixed
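The surrogate idea above — tabulating the expensive flash calculation before the run and interpolating during it — can be sketched in one parameter dimension, with plain linear interpolation standing in for the sparse-grid construction. Names are illustrative.

```python
import bisect

def build_table(flash, lo, hi, n):
    # pre-simulation phase: evaluate the expensive flash calculation on a grid
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return xs, [flash(x) for x in xs]

def surrogate(xs, ys, x):
    # during the simulation: a cheap linear interpolation replaces each
    # per-time-step flash call
    i = bisect.bisect_right(xs, x) - 1
    i = max(0, min(i, len(xs) - 2))
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])
```

The one-time cost of filling the table is amortized over every time step of the simulation; the sparse-grid version extends the same idea to several input parameters without the full tensor-product grid.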

  8. Simulation of sustainability aspects within the industrial environment and their implication on the simulation technique

    OpenAIRE

    Rabe, M.; Jäkel, F.-W.; Weinaug, H.

    2010-01-01

Simulation is a broadly accepted analytic instrument and planning tool. Today, industrial simulation is mainly applied for engineering and physical purposes and covers a short time horizon compared to intergenerational justice. In parallel, sustainability is gaining more importance for industrial planning because themes like global warming, child labour, and compliance with social and environmental standards have to be taken into account. Sustainability is characterized by comprehensively...

  9. Radiography using photographic paper

    International Nuclear Information System (INIS)

    Gromov, Yu.V.; Kapustin, V.I.; Volkova, T.G.

    1982-01-01

The possibility of radiographic control with application of ''Fototelegrafnaya BS'' photographic paper in conjunction with the ''Standart'' image intensifier as an X-ray image converter is studied. Investigations were made using steel samples 5 to 45 mm thick, with X-ray radiation energy varied from 80 to 240 keV. Specifications of the ''Standart'' type image intensifier and the photographic paper are given. It is shown that the photographic paper improves the sensitivity of the method to the detection of small defects. The method provides standard and panoramic radioscopy and conservation of objective documentation, and enables one to mechanize and automatize the process of photodevelopment. The application of the photographic paper is beneficial, its cost being six times lower than that of X-ray film

  10. Quality comparison between DEF-10 digital image from simulation technique and Computed Tomography (CR) technique in industrial radiography

    International Nuclear Information System (INIS)

    Siti Nur Syatirah Ismail

    2012-01-01

The study was conducted to compare the quality of DEF-10 digital images from the simulation and computed radiography (CR) techniques. The sample used is DEF-10 steel with a thickness of 15.28 mm. In this study, the sample is exposed to radiation from an X-ray machine (ISOVOLT Titan E) with specified parameters: the current and distance are fixed at 3 mA and 700 mm respectively, while the applied voltage varies at 140, 160, 180 and 200 kV. The exposure time is reduced at rates of 0, 20, 40, 60 and 80 % for each sample exposure. The digital image of the simulation is produced with the aRTist software, whereas the digital image of computed radiography is produced from an imaging plate. Both images were then compared qualitatively (sensitivity) and quantitatively (Signal-to-Noise Ratio, SNR; Basic Spatial Resolution, SRb; and LOP size) using the Isee software. Radiographic sensitivity is indicated by the Image Quality Indicator (IQI), i.e. the ability of the CR system and the aRTist software to identify a wire-type IQI when the exposure time is reduced by up to 80 % relative to the exposure chart (D7; ISOVOLT Titan E). The thinnest wire imaged by both the simulation and CR radiographs is wire number 7, rather than wire number 8 as required by the standard. In the quantitative comparison, this study shows that the SNR values decrease with reducing exposure time. SRb values increase for simulation and decrease for CR as the exposure time decreases, and good image quality can be achieved at 80 % reduced exposure time. High SNR and SRb values produced good image quality in the CR and simulation techniques respectively. (author)

  11. Low-mass molecular dynamics simulation: A simple and generic technique to enhance configurational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Pang, Yuan-Ping, E-mail: pang@mayo.edu

    2014-09-26

    Highlights: • Reducing atomic masses by 10-fold vastly improves sampling in MD simulations. • CLN025 folded in 4 of 10 × 0.5-μs MD simulations when masses were reduced by 10-fold. • CLN025 folded as early as 96.2 ns in 1 of the 4 simulations that captured folding. • CLN025 did not fold in 10 × 0.5-μs MD simulations when standard masses were used. • Low-mass MD simulation is a simple and generic sampling enhancement technique. - Abstract: CLN025 is one of the smallest fast-folding proteins. Until now it has not been reported that CLN025 can autonomously fold to its native conformation in a classical, all-atom, and isothermal–isobaric molecular dynamics (MD) simulation. This article reports the autonomous and repeated folding of CLN025 from a fully extended backbone conformation to its native conformation in explicit solvent in multiple 500-ns MD simulations at 277 K and 1 atm with the first folding event occurring as early as 66.1 ns. These simulations were accomplished by using AMBER forcefield derivatives with atomic masses reduced by 10-fold on Apple Mac Pros. By contrast, no folding event was observed when the simulations were repeated using the original AMBER forcefields of FF12SB and FF14SB. The results demonstrate that low-mass MD simulation is a simple and generic technique to enhance configurational sampling. This technique may propel autonomous folding of a wide range of miniature proteins in classical, all-atom, and isothermal–isobaric MD simulations performed on commodity computers—an important step forward in quantitative biology.
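The rationale for mass scaling is that equilibrium (configurational) averages do not depend on the atomic masses, while dynamical timescales scale as √m, so a 10-fold mass reduction packs roughly √10 ≈ 3.16 times more barrier-crossing dynamics into the same number of steps. The timescale effect can be checked on a harmonic oscillator integrated with velocity Verlet — a toy illustration of the principle, not the AMBER setup of the paper:

```python
import math

def verlet_period(mass, k=1.0, dt=0.001, x0=1.0):
    # velocity-Verlet integration of a harmonic oscillator; returns the
    # measured period (analytically 2*pi*sqrt(mass/k)) from two successive
    # upward zero crossings
    x, v, t = x0, 0.0, 0.0
    crossings = []
    while len(crossings) < 2:
        a = -k * x / mass
        x_new = x + v * dt + 0.5 * a * dt * dt
        a_new = -k * x_new / mass
        v += 0.5 * (a + a_new) * dt
        t += dt
        prev, x = x, x_new
        if prev < 0.0 <= x:
            crossings.append(t)
    return crossings[1] - crossings[0]
```

Dividing the period at the standard mass by the period at one-tenth the mass recovers the √10 speed-up that motivates the low-mass sampling enhancement.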

  12. Simulation into Reality: Some Effects of Simulation Techniques on Organizational Communication Students.

    Science.gov (United States)

    Allen, Richard K.

    In an attempt to discover improved classroom teaching methods, a class was turned into a business organization as a way of bringing life to the previously covered lectures and textual materials. The simulated games were an attempt to get people to work toward a common goal with all of the power plays, secret meetings, brainstorming, anger, and…

  13. Simulation of the fissureless technique for thoracoscopic segmentectomy using rapid prototyping.

    Science.gov (United States)

    Akiba, Tadashi; Nakada, Takeo; Inagaki, Takuya

    2015-01-01

    The fissureless lobectomy or anterior fissureless technique is a novel surgical technique, which avoids dissection of the lung parenchyma over the pulmonary artery during lobectomy by open thoracotomy approach or direct vision thoracoscopic surgery. This technique is indicated for fused lobes. We present two cases where thoracoscopic pulmonary segmentectomy was performed using the fissureless technique simulated by three-dimensional (3D) pulmonary models. The 3D model and rapid prototyping provided an accurate anatomical understanding of the operative field in both cases. We believe that the construction of these models is useful for thoracoscopic and other complicated surgeries of the chest.

  14. Simulation of the Fissureless Technique for Thoracoscopic Segmentectomy Using Rapid Prototyping

    Science.gov (United States)

    Nakada, Takeo; Inagaki, Takuya

    2014-01-01

    The fissureless lobectomy or anterior fissureless technique is a novel surgical technique, which avoids dissection of the lung parenchyma over the pulmonary artery during lobectomy by open thoracotomy approach or direct vision thoracoscopic surgery. This technique is indicated for fused lobes. We present two cases where thoracoscopic pulmonary segmentectomy was performed using the fissureless technique simulated by three-dimensional (3D) pulmonary models. The 3D model and rapid prototyping provided an accurate anatomical understanding of the operative field in both cases. We believe that the construction of these models is useful for thoracoscopic and other complicated surgeries of the chest. PMID:24633132

  15. A 3D technique for simulation of irregular electron treatment fields using a digital camera

    International Nuclear Information System (INIS)

    Bassalow, Roustem; Sidhu, Narinder P.

    2003-01-01

    Cerrobend inserts, which define electron field apertures, are manufactured at our institution using perspex templates. Contours are reproduced manually on these templates at the simulator from the field outlines drawn on the skin or mask of a patient. A previously reported technique for simulation of electron treatment fields uses a digital camera to eliminate the need for such templates. However, avoidance of the image distortions introduced by non-flat surfaces on which the electron field outlines were drawn could only be achieved by limiting the application of this technique to surfaces which were flat or near flat. We present a technique that employs a digital camera and allows simulation of electron treatment fields contoured on an anatomical surface of an arbitrary three-dimensional (3D) shape, such as that of the neck, extremities, face, or breast. The procedure is fast, accurate, and easy to perform

  16. An overview of uncertainty quantification techniques with application to oceanic and oil-spill simulations

    KAUST Repository

    Iskandarani, Mohamed; Wang, Shitao; Srinivasan, Ashwanth; Carlisle Thacker, W.; Winokur, Justin; Knio, Omar

    2016-01-01

    We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.
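The proxy paradigm described above can be sketched in a few lines: fit a cheap surrogate to a handful of runs of an expensive model, then Monte Carlo sample the surrogate instead of the model. The toy model, node placement and sample counts below are illustrative assumptions, not from the paper; a Lagrange interpolant stands in for the PC and GPR proxies.

```python
import random

def expensive_model(x):
    # Stand-in for a costly simulation run (hypothetical toy model).
    return (1.0 + x) ** 2

def lagrange_proxy(nodes, values):
    """Build a cheap polynomial proxy interpolating (nodes, values)."""
    def proxy(x):
        total = 0.0
        for i, xi in enumerate(nodes):
            w = 1.0
            for j, xj in enumerate(nodes):
                if i != j:
                    w *= (x - xj) / (xi - xj)
            total += w * values[i]
        return total
    return proxy

# A handful of expensive runs train the proxy...
nodes = [-1.0, -0.5, 0.0, 0.5, 1.0]
proxy = lagrange_proxy(nodes, [expensive_model(x) for x in nodes])

# ...which is then sampled cheaply to estimate output statistics.
rng = random.Random(0)
samples = [proxy(rng.uniform(-1.0, 1.0)) for _ in range(50_000)]
mean_estimate = sum(samples) / len(samples)
# Exact mean of (1 + x)^2 for x ~ U(-1, 1) is 4/3.
```

The 50,000 proxy evaluations cost almost nothing compared to 50,000 runs of the real model, which is exactly the saving the abstract reports.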

  18. An analytical simulation technique for cone-beam CT and pinhole SPECT

    International Nuclear Information System (INIS)

    Zhang Xuezhu; Qi Yujin

    2011-01-01

This study aimed to develop an efficient simulation technique that runs on an ordinary PC. The work involved deriving mathematical operators, generating analytic phantoms, and developing effective analytical projectors for cone-beam CT and pinhole SPECT imaging. Computer simulations based on the analytical projectors were developed using a ray-tracing method for cone-beam CT and a voxel-driven method with degrading blur for pinhole SPECT. The 3D Shepp-Logan, Jaszczak and Defrise phantoms were used for simulation evaluations and image reconstructions. The reconstructed images agreed well with the phantoms. The results show that the analytical simulation technique is an efficient tool for studying cone-beam CT and pinhole SPECT imaging. (authors)

  19. Analytical vs. Simulation Solution Techniques for Pulse Problems in Non-linear Stochastic Dynamics

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.

Advantages and disadvantages of available analytical and simulation techniques for pulse problems in non-linear stochastic dynamics are discussed. First, random pulse problems, both those which do and do not lead to Markov theory, are presented. Next, the analytical and analytically-numerical techniques are…

  20. Research on integrated simulation of fluid-structure system by computation science techniques

    International Nuclear Information System (INIS)

    Yamaguchi, Akira

    1996-01-01

At the Power Reactor and Nuclear Fuel Development Corporation, research on the integrated simulation of fluid-structure systems by computational science techniques has been carried out. Its aim is to replace the verification of plant systems, which has depended on large-scale experiments, with computational science techniques, thereby reducing development costs and optimizing FBR systems. For this purpose, it is necessary to establish the technology for integrally and accurately analyzing complicated phenomena (simulation technology), the technology for applying it to large-scale problems (speed-up technology), and the technology for assuring the reliability of the analysis results when simulation technology is used for the permission and approval of FBRs (verification technology). The simulation of fluid-structure interaction, heat-flow simulation in spaces of complicated shape, and the related technologies are explained. As applications of computational science techniques, the elucidation of phenomena by numerical experiment and numerical simulation as a substitute for tests are discussed. (K.I.)

  1. Development of a Car Racing Simulator Game Using Artificial Intelligence Techniques

    Directory of Open Access Journals (Sweden)

    Marvin T. Chan

    2015-01-01

This paper presents a car racing simulator game called Racer, in which the human player races a car against three game-controlled cars in a three-dimensional environment. The objective of the game is not to defeat the human player, but to provide the player with a challenging and enjoyable experience. To ensure that this objective can be accomplished, the game incorporates artificial intelligence (AI) techniques, which enable the cars to be controlled in a manner that mimics natural driving. The paper provides a brief history of AI techniques in games, presents the use of AI techniques in contemporary video games, and discusses the AI techniques that were implemented in the development of Racer. A comparison of the AI techniques implemented in the Unity platform with traditional AI search techniques is also included in the discussion.

  2. Modeling and numerical techniques for high-speed digital simulation of nuclear power plants

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1987-01-01

    Conventional computing methods are contrasted with newly developed high-speed and low-cost computing techniques for simulating normal and accidental transients in nuclear power plants. Six principles are formulated for cost-effective high-fidelity simulation with emphasis on modeling of transient two-phase flow coolant dynamics in nuclear reactors. Available computing architectures are characterized. It is shown that the combination of the newly developed modeling and computing principles with the use of existing special-purpose peripheral processors is capable of achieving low-cost and high-speed simulation with high-fidelity and outstanding user convenience, suitable for detailed reactor plant response analyses

  3. Virtual X-ray imaging techniques in an immersive casting simulation environment

    International Nuclear Information System (INIS)

    Li, Ning; Kim, Sung-Hee; Suh, Ji-Hyun; Cho, Sang-Hyun; Choi, Jung-Gil; Kim, Myoung-Hee

    2007-01-01

A computer code was developed to simulate radiographs of complex casting products in a CAVE™-like environment. The simulation is based on deterministic algorithms and ray-tracing techniques. The aim of this study is to examine CAD/CAE/CAM models at the design stage, to optimize the design, and to inspect predicted defective regions quickly, accurately and at small numerical expense. The present work discusses the algorithms for radiography simulation of the CAD/CAM model and proposes algorithmic solutions adapted from the ray-box intersection algorithm and the octree data structure specifically for radiographic simulation of the CAE model. The stereoscopic visualization of the full-size product in the immersive casting simulation environment, together with the virtual X-ray images of castings, provides an effective tool for the design and evaluation of foundry processes by engineers and metallurgists.
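The ray-box intersection the abstract refers to is commonly implemented with the slab method; a minimal sketch (not the authors' code) might look like:

```python
def ray_box_intersect(origin, direction, box_min, box_max):
    """Slab method: return (hit, t_near, t_far) for a ray vs. an axis-aligned box."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:           # ray parallel to slab and outside it
                return False, None, None
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far or t_far < 0.0:  # slabs disjoint, or box behind the ray
            return False, None, None
    return True, t_near, t_far
```

In a radiography simulation the chord length t_far − t_near through each box gives the path length used in the attenuation line integral along the ray.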

  4. Around the laboratories: Dubna: Physics results and progress on bubble chamber techniques; Stanford (SLAC): Operation of a very rapid cycling bubble chamber; Daresbury: Photographs of visitors to the Laboratory; Argonne: Charge exchange injection tests into the ZGS in preparation for a proposed Booster

    CERN Multimedia

    1969-01-01

    Around the laboratories: Dubna: Physics results and progress on bubble chamber techniques; Stanford (SLAC): Operation of a very rapid cycling bubble chamber; Daresbury: Photographs of visitors to the Laboratory; Argonne: Charge exchange injection tests into the ZGS in preparation for a proposed Booster

  5. Electro photographic materials

    International Nuclear Information System (INIS)

    Buzdugan, A.; Andries, A.; Iovu, M.

    2000-01-01

The invention relates to the creation of electrophotographic materials. The invention extends the material's photosensitivity into the infrared range of the spectrum. An electrophotographic material contains an electroconducting base, including a dielectric base 1, for example glass; an electroconducting layer 2, for example of Al, Ni or Cr; an injecting layer 3 consisting of amorphous indium phosphide; a vitreous layer 4 of the arsenic sulphide - antimony sulphide system; and a transporting layer 5 of arsenic sulphide or arsenic selenide

  6. Multilevel techniques lead to accurate numerical upscaling and scalable robust solvers for reservoir simulation

    DEFF Research Database (Denmark)

    Christensen, Max la Cour; Villa, Umberto; Vassilevski, Panayot

    2015-01-01

…approach is well suited for the solution of large problems coming from finite element discretizations of systems of partial differential equations. The AMGe technique from [10, 9] allows for the construction of operator-dependent coarse (upscaled) models and guarantees approximation properties of the coarse… …implementation of the reservoir simulator is demonstrated.

  7. The Generalized Multipole Technique for the Simulation of Low-Loss Electron Energy Loss Spectroscopy

    DEFF Research Database (Denmark)

    Kiewidt, Lars; Karamehmedovic, Mirza

    2018-01-01

In this study, we demonstrate the use of a Generalized Multipole Technique (GMT) to simulate low-loss Electron Energy Loss Spectroscopy (EELS) spectra of isolated spheroidal nanoparticles. The GMT provides certain properties, such as a semi-analytical description of the electromagnetic fields…

  8. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities.

    Science.gov (United States)

    Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin

    2016-06-30

    Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads' length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO₂ emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.
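The route-selection idea above can be illustrated with a bare-bones simulated annealing loop over candidate routes, using a weighted sum of travel time and road length as the cost. The routes, weights and cooling schedule below are invented placeholders, not the paper's parameters.

```python
import math
import random

# Hypothetical candidate routes as (length_km, average_speed_kmh).
ROUTES = [(12.0, 30.0), (15.0, 55.0), (9.0, 18.0), (14.0, 50.0)]

def cost(route, w_time=0.7, w_length=0.3):
    """Weighted-sum cost of travel time (hours) and normalised length."""
    length, speed = route
    return w_time * length / speed + w_length * length / 100.0

def anneal(routes, t0=1.0, cooling=0.95, steps=200, seed=1):
    rng = random.Random(seed)
    current = rng.choice(routes)
    best, temp = current, t0
    for _ in range(steps):
        candidate = rng.choice(routes)
        delta = cost(candidate) - cost(current)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current = candidate
        if cost(current) < cost(best):
            best = current
        temp *= cooling
    return best

best_route = anneal(ROUTES)
```

In the paper's setting the average speeds would come from city sensors via roadside units, and the cost would fold in fuel use and emissions; here the two-attribute weighted sum stands in for that richer objective.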

  9. Optimizing Availability of a Framework in Series Configuration Utilizing Markov Model and Monte Carlo Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-06-01

This research work is aimed at optimizing the availability of a framework comprising two units linked together in series configuration, utilizing a Markov model and Monte Carlo (MC) simulation techniques. In this article, effort has been made to develop a maintenance model that incorporates three distinct states for each unit, while taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, with as well as without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e. the Markov model. The results achieved are then validated with the help of MC simulation. In addition, MC simulation-based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov model.
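For the simplest special case (two independent single-state units with exponential failure and repair, unlike the three-state model of the paper), the Markov steady-state availability has a closed form that a Monte Carlo run can validate. The rates and horizon below are illustrative assumptions, not the authors' data.

```python
import random

def mc_series_availability(lam1, mu1, lam2, mu2, horizon=200_000.0, seed=7):
    """Monte Carlo availability of two independent repairable units in series."""
    rng = random.Random(seed)

    def up_intervals(lam, mu):
        t, intervals = 0.0, []
        while t < horizon:
            up = rng.expovariate(lam)        # time to failure
            intervals.append((t, min(t + up, horizon)))
            t += up + rng.expovariate(mu)    # plus repair time
        return intervals

    # The system is up only when BOTH units are up: sweep the interval overlap.
    a, b = up_intervals(lam1, mu1), up_intervals(lam2, mu2)
    overlap, i, j = 0.0, 0, 0
    while i < len(a) and j < len(b):
        lo, hi = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
        if hi > lo:
            overlap += hi - lo
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return overlap / horizon

lam1, mu1, lam2, mu2 = 0.01, 0.1, 0.02, 0.1   # illustrative failure/repair rates
analytic = (mu1 / (lam1 + mu1)) * (mu2 / (lam2 + mu2))   # Markov steady state
simulated = mc_series_availability(lam1, mu1, lam2, mu2)
```

The simulation side generalises directly to non-exponential distributions (swap `expovariate` for another sampler), which is the advantage over the Markov model that the abstract highlights.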

  10. Advanced particle-in-cell simulation techniques for modeling the Lockheed Martin Compact Fusion Reactor

    Science.gov (United States)

    Welch, Dale; Font, Gabriel; Mitchell, Robert; Rose, David

    2017-10-01

We report on particle-in-cell developments for the study of the Compact Fusion Reactor. Millisecond, two- and three-dimensional simulations (cubic-meter volume) of confinement and neutral beam heating of the magnetic confinement device require accurate representation of the complex orbits, near-perfect energy conservation, and significant computational power. In order to determine the initial plasma fill and neutral beam heating, these simulations include ionization, elastic and charge-exchange hydrogen reactions. To this end, we are pursuing fast electromagnetic kinetic modeling algorithms including two implicit techniques and a hybrid quasi-neutral algorithm with kinetic ions. The kinetic modeling includes use of the Poisson-corrected direct implicit, magnetic implicit, as well as second-order cloud-in-cell techniques. The hybrid algorithm, ignoring electron inertial effects, is two orders of magnitude faster than the kinetic approach but not as accurate with respect to confinement. The advantages and disadvantages of these techniques will be presented. Funded by Lockheed Martin.

  11. Wind Turbine Rotor Simulation via CFD Based Actuator Disc Technique Compared to Detailed Measurement

    Directory of Open Access Journals (Sweden)

    Esmail Mahmoodi

    2015-10-01

In this paper, a generalized Actuator Disc (AD) is used to model the wind turbine rotor of the MEXICO experiment, a collaborative European wind turbine project. The AD model, a combination of the CFD technique and User Defined Function codes (UDF), the so-called UDF/AD model, is used to simulate loads and performance of the rotor in three different wind speed tests. The modeling focuses on distributed force on the blade, and on thrust and power production of the rotor, as important design parameters of wind turbine rotors. A developed Blade Element Momentum (BEM) theory as a code-based numerical technique, as well as a full rotor simulation, both from the literature, are included in the results for comparison and discussion. The output of all techniques is compared to detailed measurements for validation, which leads to the final conclusions.

  12. Student perception of two different simulation techniques in oral and maxillofacial surgery undergraduate training.

    Science.gov (United States)

    Lund, Bodil; Fors, Uno; Sejersen, Ronny; Sallnäs, Eva-Lotta; Rosén, Annika

    2011-10-12

    Yearly surveys among the undergraduate students in oral and maxillofacial surgery at Karolinska Institutet have conveyed a wish for increased clinical training, and in particular, in surgical removal of mandibular third molars. Due to lack of resources, this kind of clinical supervision has so far not been possible to implement. One possible solution to this problem might be to introduce simulation into the curriculum. The purpose of this study was to investigate undergraduate students' perception of two different simulation methods for practicing clinical reasoning skills and technical skills in oral and maxillofacial surgery. Forty-seven students participating in the oral and maxillofacial surgery course at Karolinska Institutet during their final year were included. Three different oral surgery patient cases were created in a Virtual Patient (VP) Simulation system (Web-SP) and used for training clinical reasoning. A mandibular third molar surgery simulator with tactile feedback, providing hands on training in the bone removal and tooth sectioning in third molar surgery, was also tested. A seminar was performed using the combination of these two simulators where students' perception of the two different simulation methods was assessed by means of a questionnaire. The response rate was 91.5% (43/47). The students were positive to the VP cases, although they rated their possible improvement of clinical reasoning skills as moderate. The students' perception of improved technical skills after training in the mandibular third molar surgery simulator was rated high. The majority of the students agreed that both simulation techniques should be included in the curriculum and strongly agreed that it was a good idea to use the two simulators in concert. The importance of feedback from the senior experts during simulator training was emphasised. The two tested simulation methods were well accepted and most students agreed that the future curriculum would benefit from

  13. Evaluating a Bayesian approach to improve accuracy of individual photographic identification methods using ecological distribution data

    Directory of Open Access Journals (Sweden)

    Richard Stafford

    2011-04-01

Photographic identification of individual organisms can be possible from natural body markings. Data from photo-ID can be used to estimate important ecological and conservation metrics such as population sizes, home ranges or territories. However, poor-quality photographs or less well-studied individuals can result in a non-unique ID, potentially confounding several similar-looking individuals. Here we present a Bayesian approach that uses known data about previous sightings of individuals at specific sites as priors to help assess the problems of obtaining a non-unique ID. Using a simulation of individuals with different confidence of correct ID, we evaluate the accuracy of Bayesian-modified (posterior) probabilities. However, in most cases, the accuracy of identification decreases. Although this technique is unsuccessful, it does demonstrate the importance of computer simulations in testing such hypotheses in ecology.
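The prior-weighting idea being evaluated can be sketched as a one-step Bayes update: photo-match likelihoods are multiplied by site-based sighting priors and renormalised. All numbers below are invented for illustration; the abstract's point is that in simulation this update often made identification worse, not better.

```python
def photo_id_posterior(match_likelihood, site_prior):
    """One-step Bayes update: P(individual | photo, site) ∝ likelihood × prior.

    match_likelihood: {individual: P(photo features | individual)}
    site_prior:       {individual: P(individual sighted at this site)}
    """
    unnormalised = {ind: lik * site_prior.get(ind, 0.0)
                    for ind, lik in match_likelihood.items()}
    total = sum(unnormalised.values())
    if total == 0.0:
        raise ValueError("no candidate is both matched and expected at this site")
    return {ind: p / total for ind, p in unnormalised.items()}

# Two look-alike individuals: the photo alone is ambiguous (60/40), but
# individual "B" is rarely sighted at this site (numbers are invented).
posterior = photo_id_posterior({"A": 0.6, "B": 0.4}, {"A": 0.8, "B": 0.1})
```

The failure mode the study reports follows from the same arithmetic: when the true individual happens to be the one with the low site prior, the update confidently picks the wrong animal.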

  14. Does Taking Photographs Help?

    Science.gov (United States)

    Hand, Sarah

    2010-01-01

    Since many people tend to use photographs as memory anchors, this author decided she wanted to know whether the process of capturing and manipulating an image taken during a learning activity would act as a memory anchor for children's visual, auditory and kinaesthetic memories linked to their cognitive learning at the time. In plain English,…

  15. Cloud chamber photographs of the cosmic radiation

    CERN Document Server

    Rochester, George Dixon

    1952-01-01

    Cloud Chamber Photographs of the Cosmic Radiation focuses on cloud chamber and photographic emulsion wherein the tracks of individual subatomic particles of high energy are studied. The publication first offers information on the technical features of operation and electrons and cascade showers. Discussions focus on the relationship in time and space of counter-controlled tracks; techniques of internal control of the cloud chamber; cascade processes with artificially-produced electrons and photons; and nuclear interaction associated with an extensive shower. The manuscript then elaborates on

  16. Pyrite: A blender plugin for visualizing molecular dynamics simulations using industry-standard rendering techniques.

    Science.gov (United States)

    Rajendiran, Nivedita; Durrant, Jacob D

    2018-05-05

Molecular dynamics (MD) simulations provide critical insights into many biological mechanisms. Programs such as VMD, Chimera, and PyMOL can produce impressive simulation visualizations, but they lack many advanced rendering algorithms common in the film and video-game industries. In contrast, the modeling program Blender includes such algorithms but cannot import MD-simulation data. MD trajectories often require many gigabytes of memory/disk space, complicating Blender import. We present Pyrite, a Blender plugin that overcomes these limitations. Pyrite allows researchers to visualize MD simulations within Blender, with full access to Blender's cutting-edge rendering techniques. We expect Pyrite-generated images to appeal to students and non-specialists alike. A copy of the plugin is available at http://durrantlab.com/pyrite/, released under the terms of the GNU General Public License Version 3. © 2017 Wiley Periodicals, Inc.

  17. Validation of a low dose simulation technique for computed tomography images.

    Directory of Open Access Journals (Sweden)

    Daniela Muenzel

PURPOSE: Evaluation of a new software tool for generation of simulated low-dose computed tomography (CT) images from an original higher-dose scan. MATERIALS AND METHODS: Original CT scan data (100 mAs, 80 mAs, 60 mAs, 40 mAs, 20 mAs, 10 mAs; 100 kV) of a swine were acquired (approved by the regional governmental commission for animal protection). Simulations of CT acquisition with a lower dose (simulated 10-80 mAs) were calculated using a low-dose simulation algorithm. The simulations were compared to the originals of the same dose level with regard to density values and image noise. Four radiologists assessed the realistic visual appearance of the simulated images. RESULTS: Image characteristics of simulated low-dose scans were similar to the originals. Mean overall discrepancy of image noise and CT values was -1.2% (range -9% to 3.2%) and -0.2% (range -8.2% to 3.2%), respectively, p>0.05. Confidence intervals of discrepancies ranged between 0.9-10.2 HU (noise) and 1.9-13.4 HU (CT values), without significant differences (p>0.05). Subjective observer evaluation of image appearance showed no visually detectable difference. CONCLUSION: Simulated low-dose images showed excellent agreement with the originals concerning image noise, CT density values, and subjective assessment of visual appearance. An authentic low-dose simulation opens up opportunities with regard to staff education, protocol optimization and introduction of new techniques.
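A common way to simulate a lower-mAs acquisition, which may or may not match the evaluated tool's algorithm, is to inject zero-mean noise into the projection data so that the total noise variance scales as 1/mAs. Below is a deliberately simplified sketch (uncorrelated Gaussian noise, invented parameters); realistic tools shape the noise by the detector and bowtie-filter response.

```python
import math
import random

def simulate_low_dose(projection, mas_ref, mas_low, sigma_ref, seed=42):
    """Degrade projection data so total noise variance scales as 1/mAs."""
    if not 0.0 < mas_low <= mas_ref:
        raise ValueError("mas_low must lie in (0, mas_ref]")
    # Required extra std: sigma_low^2 = sigma_ref^2 * mas_ref / mas_low.
    sigma_extra = sigma_ref * math.sqrt(mas_ref / mas_low - 1.0)
    rng = random.Random(seed)
    return [v + rng.gauss(0.0, sigma_extra) for v in projection]

# A 100 mAs acquisition degraded to a simulated 25 mAs one: noise std doubles.
original = [100.0] * 10_000
low_dose = simulate_low_dose(original, mas_ref=100.0, mas_low=25.0, sigma_ref=5.0)
```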

  18. Simulation of white light generation and near light bullets using a novel numerical technique

    Science.gov (United States)

    Zia, Haider

    2018-01-01

An accurate and efficient simulation has been devised, employing a new numerical technique to simulate the derivative generalised non-linear Schrödinger equation in all three spatial dimensions and time. The simulation models all pertinent effects, such as self-steepening and plasma, for the non-linear propagation of ultrafast optical radiation in bulk material. Simulation results are compared to published experimental spectral data of an example ytterbium aluminum garnet system at 3.1 μm radiation and fit to within a factor of 5. The simulation shows that there is a stability point near the end of the 2 mm crystal where a quasi-light bullet (spatio-temporal soliton) is present. Within this region, the pulse is collimated at a reduced diameter (by a factor of ∼2) and a near temporal soliton exists at the spatial center. The temporal intensity within this stable region is compressed by a factor of ∼4 compared to the input. This study shows that the simulation highlights new physical phenomena, based on the interplay of various linear, non-linear and plasma effects, that go beyond the experiment and is thus integral to achieving accurate designs of white-light generation systems for optical applications. An adaptive error-reduction algorithm tailor-made for this simulation is also presented in the appendix.

  19. ECR plasma photographs as a plasma diagnostic

    Energy Technology Data Exchange (ETDEWEB)

    Racz, R; Biri, S; Palinkas, J [Institute of Nuclear Research (ATOMKI), H-4026 Debrecen, Bem ter 18/c (Hungary)

    2011-04-15

Low, medium or highly charged ions delivered by electron cyclotron resonance (ECR) ion sources are all produced in the ECR plasma. In order to study such plasmas, high-resolution visible-light plasma photographs were taken at the ATOMKI ECR ion source. An 8-megapixel digital camera was used to photograph plasmas made from He, methane, N, O, Ne, Ar, Kr and Xe gases and from their mixtures. The analysis of the photo series gave much qualitative and some valuable physical information on the nature of ECR plasmas. A comparison was made between the plasma photos and computer simulations, and conclusions were drawn regarding the cold electron component of the plasma. The warm electron component of a similar simulation was compared with photos of the X-rays emitted by plasma ions. While the simulations are in good agreement with the photos, a significant difference was found between the spatial distributions of the cold and warm electrons.

  20. Geotagging Photographs in Student Fieldwork

    Science.gov (United States)

    Welsh, Katharine E.; France, Derek; Whalley, W. Brian; Park, Julian R.

    2012-01-01

    This resource paper provides guidance for staff and students on the potential educational benefits, limitations and applications of geotagging photographs. It also offers practical advice for geotagging photographs in a range of fieldwork settings and reviews three free smartphone applications (apps) for geotagging photographs (Flickr, Evernote…

  1. Crack growth simulation for plural crack using hexahedral mesh generation technique

    International Nuclear Information System (INIS)

    Orita, Y; Wada, Y; Kikuchi, M

    2010-01-01

This paper describes a surface crack growth simulation using a new mesh generation technique. The generated mesh consists entirely of hexahedral elements, which are suitable for the analysis of fracture mechanics parameters, i.e. the stress intensity factor. The advantages of a hexahedral mesh are good analysis accuracy and fewer degrees of freedom than a tetrahedral mesh. In this study, a plural crack growth simulation is computed using the hexahedral mesh and the distribution of the stress intensity factor is investigated.

  2. Selecting the Most Economic Project under Uncertainty Using Bootstrap Technique and Fuzzy Simulation

    Directory of Open Access Journals (Sweden)

    Kamran Shahanaghi

    2012-01-01

This article relaxes the assumption of a pre-determined membership function for a fuzzy set, which is a basic assumption in this area, and proposes a hybrid technique to select the most economic project among alternative projects under fuzzy interest-rate conditions. Net present worth (NPW) is used as the economic indicator. The article challenges the assumption that large sample sizes are available for determining the membership function and shows that some other techniques may have less accuracy. To give a robust solution, bootstrapping and fuzzy simulation are suggested, and a numerical example is given and analyzed.
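The bootstrap side of the proposal can be sketched by resampling a small historical interest-rate sample with replacement and propagating each resample through the NPW formula, yielding an empirical NPW distribution per project without assuming any membership function. The cashflows and rates below are invented examples, not the article's data.

```python
import random

def npw(cashflows, rate):
    """Net present worth of cashflows[t] received at end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def bootstrap_npw(cashflows, observed_rates, n_boot=5_000, seed=3):
    """Resample the scarce rate sample with replacement; return NPW replicates."""
    rng = random.Random(seed)
    replicates = []
    for _ in range(n_boot):
        resample = [rng.choice(observed_rates) for _ in observed_rates]
        replicates.append(npw(cashflows, sum(resample) / len(resample)))
    return replicates

observed_rates = [0.05, 0.06, 0.07, 0.08, 0.09]   # small historical sample
project_a = [-1000.0, 400.0, 400.0, 400.0]        # invented cashflows
project_b = [-1000.0, 0.0, 0.0, 1300.0]
reps_a = bootstrap_npw(project_a, observed_rates)
reps_b = bootstrap_npw(project_b, observed_rates)
mean_a, mean_b = sum(reps_a) / len(reps_a), sum(reps_b) / len(reps_b)
```

Comparing the replicate distributions, rather than a single point NPW, is what makes the selection robust to the rate uncertainty.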

  3. Timesaving techniques for decision of electron-molecule collisions in Monte Carlo simulation of electrical discharges

    International Nuclear Information System (INIS)

    Sugawara, Hirotake; Mori, Naoki; Sakai, Yosuke; Suda, Yoshiyuki

    2007-01-01

Techniques to reduce the computational load of determining electron-molecule collisions in Monte Carlo simulations of electrical discharges are presented. By enhancing the detection efficiency of the no-collision case in the decision scheme for collisional events, the frequency of access to the time-consuming subroutines that calculate the electron collision cross sections of the gas molecules, needed to obtain the collision probability, can be decreased. A benchmark test and an estimation evaluating the present techniques have shown a practical timesaving efficiency
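A standard way to detect the no-collision case cheaply, in the spirit described above, is to compare the random number against a precomputed upper bound on the collision probability and call the expensive cross-section routine only when a collision is still possible. The cross-section function below is a made-up stand-in, not the paper's model.

```python
import random

def total_collision_prob(energy):
    """Expensive cross-section evaluation (made-up stand-in)."""
    total_collision_prob.calls += 1
    return 0.05 + 0.03 * energy / (1.0 + energy)   # always below P_MAX here
total_collision_prob.calls = 0

P_MAX = 0.1   # precomputed upper bound on the per-step collision probability

def step_collides(energy, rng):
    r = rng.random()
    if r >= P_MAX:        # cheap no-collision detection: most steps exit here
        return False
    return r < total_collision_prob(energy)   # expensive call only when needed

rng = random.Random(5)
collisions = sum(step_collides(1.0, rng) for _ in range(100_000))
expensive_calls = total_collision_prob.calls
```

Because the same uniform draw is reused for both tests, the sampled collision statistics are identical to always evaluating the cross sections, yet the expensive routine runs only for roughly the fraction P_MAX of steps.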

  4. Photograph of the Month

    Science.gov (United States)

    2015-04-01

For dykes, magma flow direction can be deciphered from various fabrics in the chilled margin (Correa-Gomez et al., 2001, JSG 23, 1415). This photograph shows part of the chilled margin of an approximately N-S trending dyke at Kharghar Hills, Mumbai, Maharashtra, India. The section is sub-vertical. The elongated grooves indicate flow of magma through a fault, and the tapered grooves (arrows) indicate the flow direction: towards the pointed end of the groove. The magma flowed towards the north in this case. Such fabrics of wall-magma interaction in the Deccan volcanic province prove that the dykes were injected along fault planes. 19° 2′ 22.3″ N, 73° 3′ 28.7″ E. Photograph Ayan Achyuta Misra, Mumbai, India.

  5. Application of data assimilation technique for flow field simulation for Kaiga site using TAPM model

    International Nuclear Information System (INIS)

    Shrivastava, R.; Oza, R.B.; Puranik, V.D.; Hegde, M.N.; Kushwaha, H.S.

    2008-01-01

Data assimilation techniques are becoming popular nowadays for obtaining realistic flow field simulations for the site under consideration. The present paper describes a data assimilation technique for flow field simulation for the Kaiga site using the air pollution model (TAPM) developed by CSIRO, Australia. The TAPM model was run for the Kaiga site for a period of one month (November 2004) using the analysed meteorological data supplied with the model for the Central Asian (CAS) region, and the model solutions were nudged with the observed wind speed and wind direction data available for the site. The model was run with 4 nested grids with grid spacings of 30 km, 10 km, 3 km and 1 km, respectively. The model-generated results, with and without nudging, are statistically compared with the observations. (author)
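Nudging of this kind, often called Newtonian relaxation, adds a term G·(obs − x) to the model tendency wherever observations exist. A toy Euler-integrated sketch with invented numbers (not TAPM's scheme):

```python
def nudge_run(model_tendency, observations, x0, gain, dt, steps):
    """Euler integration of dx/dt = f(x) + gain * (obs - x) where data exist."""
    x, trajectory = x0, [x0]
    for k in range(steps):
        dx = model_tendency(x)
        if k in observations:                  # nudge toward the observed value
            dx += gain * (observations[k] - x)
        x += dx * dt
        trajectory.append(x)
    return trajectory

# Toy "wind speed" model relaxing to 10 m/s; sparse observations say 6 m/s.
tendency = lambda x: 0.5 * (10.0 - x)
obs = {k: 6.0 for k in range(0, 200, 10)}
free_run = nudge_run(tendency, {}, x0=10.0, gain=0.0, dt=0.1, steps=200)
nudged_run = nudge_run(tendency, obs, x0=10.0, gain=5.0, dt=0.1, steps=200)
```

The free run stays at the model's own equilibrium, while the nudged run is pulled toward the observations at every observation step, which is the qualitative effect of assimilating the site's wind data into TAPM.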

  6. Optimisation of 12 MeV electron beam simulation using variance reduction technique

    International Nuclear Information System (INIS)

    Jayamani, J; Aziz, M Z Abdul; Termizi, N A S Mohd; Kamarulzaman, F N Mohd

    2017-01-01

    Monte Carlo (MC) simulation of electron beam radiotherapy requires long computation times. An algorithm called the variance reduction technique (VRT) was implemented in MC to shorten this duration. This work focused on optimisation of the VRT parameters, namely electron range rejection and particle history. The EGSnrc MC source code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model with non-VRT parameters. The validated MC model simulation was repeated applying the VRT parameter (electron range rejection) controlled by global electron cut-off energies of 1, 2 and 5 MeV using 20 × 10⁷ particle histories. Range rejection at 5 MeV generated the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. The 5 MeV electron range rejection was therefore used in the particle history analysis, which ranged from 7.5 × 10⁷ to 20 × 10⁷ histories. In this study, with the 5 MeV electron cut-off and 10 × 10⁷ particle histories, the simulation was four times faster than the non-VRT calculation, with 1% deviation. Proper understanding and use of VRT can significantly reduce MC electron beam calculation times while preserving accuracy. (paper)
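
EGSnrc's actual range-rejection implementation is more involved, but the idea named above (terminate histories that cannot leave their current region) can be sketched as follows; the `csda_range_cm` lookup here is a hypothetical stand-in for a tabulated residual-range function.

```python
def terminate_history(energy_mev, dist_to_boundary_cm, ecut_mev, csda_range_cm):
    """Electron range rejection (sketch): terminate a history, depositing
    its energy locally, when the electron is below the global cut-off
    energy AND its residual range is too short to reach the nearest
    region boundary.  csda_range_cm maps energy [MeV] -> range [cm]."""
    return (energy_mev < ecut_mev
            and csda_range_cm(energy_mev) < dist_to_boundary_cm)
```

A higher cut-off (5 MeV in the study above) terminates more histories early, which is why it yields the largest time saving, at the price of approximating any bremsstrahlung the rejected electrons might have produced.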

  7. Assessment of Robotic Patient Simulators for Training in Manual Physical Therapy Examination Techniques

    Science.gov (United States)

    Ishikawa, Shun; Okamoto, Shogo; Isogai, Kaoru; Akiyama, Yasuhiro; Yanagihara, Naomi; Yamada, Yoji

    2015-01-01

    Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard. PMID:25923719

  8. Assessment of robotic patient simulators for training in manual physical therapy examination techniques.

    Directory of Open Access Journals (Sweden)

    Shun Ishikawa

    Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard.

  10. Computer aided photographic engineering

    Science.gov (United States)

    Hixson, Jeffrey A.; Rieckhoff, Tom

    1988-01-01

    High speed photography is an excellent source of engineering data but only provides a two-dimensional representation of a three-dimensional event. Multiple cameras can be used to provide data for the third dimension but camera locations are not always available. A solution to this problem is to overlay three-dimensional CAD/CAM models of the hardware being tested onto a film or photographic image, allowing the engineer to measure surface distances, relative motions between components, and surface variations.

  11. A Dynamic Operation Permission Technique Based on an MFM Model and Numerical Simulation

    International Nuclear Information System (INIS)

    Akio, Gofuku; Masahiro, Yonemura

    2011-01-01

    It is important to support operator activities in an abnormal plant situation, where many counter actions are taken in a relatively short time. The authors proposed a technique called dynamic operation permission to decrease human errors, without suppressing the creative ideas of operators coping with an abnormal plant situation, by checking whether a counter action taken is consistent with the emergency operation procedure. If the counter action is inconsistent, a dynamic operation permission system warns the operators. It also explains how and why the counter action is inconsistent and what influence it will have on future plant behavior, using a qualitative influence inference technique based on an MFM (Multilevel Flow Modeling) model. However, the previous dynamic operation permission is not able to explain quantitative effects on future plant behavior. Moreover, many possible influence paths are derived, because qualitative reasoning does not give a solution when positive and negative influences are propagated to the same node. This study extends dynamic operation permission by combining qualitative reasoning with a numerical simulation technique. The qualitative reasoning based on an MFM model of the plant derives all possible influence propagation paths. A numerical simulation then predicts future plant behavior in the case of taking a counter action. Influence propagations that do not coincide with the simulation results are excluded from the possible influence paths. The extended technique is implemented in a dynamic operation permission system for an oil refinery plant. An MFM model and a static numerical simulator were developed. The results of dynamic operation permission for some abnormal plant situations show improvements in the accuracy of dynamic operation permission and in the quality of the explanation of the effects of the counter action taken
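
The ambiguity that motivates the extension can be shown with a minimal sketch of qualitative sign combination (illustrative only, not the authors' MFM implementation):

```python
def combine_influences(signs):
    """Combine qualitative influences (+1 = increase, -1 = decrease)
    arriving at one node of the influence graph.  When opposite signs
    meet, qualitative reasoning alone cannot decide the outcome; this is
    the ambiguous case the extended technique resolves by comparing the
    candidate paths against a numerical simulation."""
    s = set(signs)
    if s == {+1}:
        return +1
    if s == {-1}:
        return -1
    return None  # ambiguous: defer to the numerical simulator
```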

  12. Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems

    OpenAIRE

    Herrera, Manuel; Meniconi, Silvia; Alvisi, Stefano; Izquierdo, Joaquin

    2018-01-01

    This document is intended to be a presentation of the Special Issue “Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems”. The final aim of this Special Issue is to propose a suitable framework supporting insightful hydraulic mechanisms to aid the decision-making processes of water utility managers and practitioners. Its 18 peer-reviewed articles present as varied topics as: water distribution system design, optimization of network perf...

  13. New techniques and results for worldline simulations of lattice field theories

    Science.gov (United States)

    Giuliani, Mario; Orasch, Oliver; Gattringer, Christof

    2018-03-01

    We use the complex φ⁴ field at finite density as a model system for developing further techniques based on worldline formulations of lattice field theories. More specifically we: 1) Discuss new variants of the worm algorithm for updating the φ⁴ theory and related systems with site weights. 2) Explore the possibility of canonical simulations in the worldline formulation. 3) Study the connection of 2-particle condensation at low temperature to scattering parameters of the theory.

  14. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities

    Directory of Open Access Journals (Sweden)

    Hayder Amer

    2016-06-01

    Vehicular traffic congestion is a significant problem that arises in many cities, due to the increasing number of vehicles driving on city roads of limited capacity. Vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoiding traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work is the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the road lengths) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution (TOPSIS) and the Dijkstra algorithm. The weighted sum and TOPSIS methods are used to formulate different attributes in the simulated annealing cost function. For the Sheffield scenario, simulation results show that the improved simulated annealing TOPSIS method improves traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO2 emissions as compared to the other algorithms; similar performance patterns were achieved for the Birmingham test scenario.
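
A minimal sketch of two ingredients named above, a weighted-sum route cost over the two attributes and the simulated-annealing acceptance rule, might look like this; the weights and the cost form are hypothetical, not the paper's exact formulation.

```python
import math
import random

def route_cost(lengths_km, speeds_kmh, w_time=0.5, w_dist=0.5):
    """Weighted-sum route cost over the two attributes in the abstract:
    segment lengths and average travel speeds (weights are hypothetical)."""
    travel_time_h = sum(length / speed
                        for length, speed in zip(lengths_km, speeds_kmh))
    distance_km = sum(lengths_km)
    return w_time * travel_time_h + w_dist * distance_km

def metropolis_accept(delta_cost, temperature, rng):
    """Simulated-annealing acceptance rule: always accept a cheaper route,
    and accept a costlier one with probability exp(-delta/T), which lets
    the search escape locally optimal routes at high temperature."""
    if delta_cost <= 0:
        return True
    return rng.random() < math.exp(-delta_cost / temperature)
```

In a full implementation the annealing loop would perturb the current route (for example, swap an intermediate junction), evaluate `route_cost` with live speed data, and call `metropolis_accept` while the temperature is gradually lowered.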

  15. Space Geodetic Technique Co-location in Space: Simulation Results for the GRASP Mission

    Science.gov (United States)

    Kuzmicz-Cieslak, M.; Pavlis, E. C.

    2011-12-01

    The Global Geodetic Observing System (GGOS) places very stringent requirements on the accuracy and stability of future realizations of the International Terrestrial Reference Frame (ITRF): an origin definition at 1 mm or better at epoch and a temporal stability on the order of 0.1 mm/y, with similar numbers for the scale (0.1 ppb) and orientation components. These goals were derived from the requirements of the Earth science problems that are currently the international community's highest priority. None of the geodetic positioning techniques can achieve this goal alone. This is due in part to the non-observability of certain attributes from a single technique. Another limitation is imposed by the extent and uniformity of the tracking network and by the schedule of observational availability and the number of suitable targets. The final limitation derives from the difficulty of "tying" the reference points of each technique at the same site to an accuracy that will support the GGOS goals. The future GGOS network will decisively address the ground-segment and, to a certain extent, the space-segment requirements. The JPL-proposed multi-technique mission GRASP (Geodetic Reference Antenna in Space) attempts to resolve the accurate tie between techniques using their co-location in space, onboard a well-designed spacecraft equipped with GNSS receivers, an SLR retroreflector array, a VLBI beacon and a DORIS system. Using the anticipated system performance for all four techniques at the time the GGOS network is completed (ca. 2020), we generated a number of simulated data sets for the development of a TRF. Our simulation studies examine the degree to which GRASP can improve the inter-technique "tie" issue compared to the classical approach, and the likely modus operandi for such a mission. The success of the examined scenarios is judged by the quality of the origin and scale definition of the resulting TRF.

  16. Loading pattern optimization by multi-objective simulated annealing with screening technique

    International Nuclear Information System (INIS)

    Tong, K. P.; Hyun, C. L.; Hyung, K. J.; Chang, H. K.

    2006-01-01

    This paper presents a new multi-objective function which is made up of the main objective term as well as penalty terms related to the constraints. All the terms are represented in the same functional form, and the coefficient of each term is normalized so that each term has equal weighting in the subsequent simulated annealing optimization calculations. The screening technique introduced in previous work is also adopted in order to save computer time in the 3-D neutronics evaluation of trial loading patterns. For a numerical test of the new multi-objective function in loading pattern optimization, the optimum loading patterns for the initial core and the cycle 7 reload PWR core of Yonggwang Unit 4 are calculated by the simulated annealing algorithm with the screening technique. A total of 10 optimum loading patterns were obtained for the initial core through 10 independent simulated annealing optimization runs. For the cycle 7 reload core, one optimum loading pattern has been obtained from a single simulated annealing optimization run. More SA optimization runs will be conducted to obtain optimum loading patterns for the cycle 7 reload core, and the results will be presented in future work. (authors)

  17. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    Science.gov (United States)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flooding is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent or mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of the full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, if a registered cell is flooded, register its surrounding cells. The time for this additional process is kept small by checking only cells at the wet and dry interface, and the computation time is reduced by skipping the non-flooded area. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented with 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes two to ten times faster while giving the same results as the simulation without the ADU method.
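
The two-step updating algorithm described above can be sketched as follows (illustrative Python; the cell indexing, wet threshold and `neighbours` helper are hypothetical, not the authors' implementation):

```python
def adu_step(active, depth, neighbours, wet_threshold=1e-4):
    """One Automatic Domain Updating pass, following the algorithm in the
    abstract: whenever a registered cell is flooded, register its
    neighbouring cells.  Only cells already at the wet/dry interface are
    ever inspected, so dry areas cost nothing."""
    newly_registered = set()
    for cell in active:
        if depth.get(cell, 0.0) > wet_threshold:       # cell is wet
            for nb in neighbours(cell):
                if nb not in active:
                    newly_registered.add(nb)           # extend the domain
    return active | newly_registered
```

On a toy one-dimensional channel the active set grows one cell ahead of the advancing flood front, which is exactly the behaviour that lets the solver skip the still-dry cells.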

  18. Effect of Simulation Techniques and Lecture Method on Students' Academic Performance in Mafoni Day Secondary School Maiduguri, Borno State, Nigeria

    Science.gov (United States)

    Bello, Sulaiman; Ibi, Mustapha Baba; Bukar, Ibrahim Bulama

    2016-01-01

    The study examined the effect of simulation technique and lecture method on students' academic performance in Mafoni Day Secondary School, Maiduguri. The study used both simulation technique and lecture methods of teaching at the basic level of education in the teaching/learning environment. The study aimed at determining the best predictor among…

  19. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    Science.gov (United States)

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US), among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, the recent advances in CT imaging technique and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  20. Atmospheric Pre-Corrected Differential Absorption Techniques to Retrieve Columnar Water Vapor: Theory and Simulations

    Science.gov (United States)

    Borel, Christoph C.; Schlaepfer, Daniel

    1996-01-01

    Two different approaches exist to retrieve columnar water vapor from imaging spectrometer data: (1) differential absorption techniques based on (a) the Narrow/Wide (N/W) ratio between overlapping spectrally wide and narrow channels, or (b) the Continuum Interpolated Band Ratio (CIBR) between a measurement channel and the weighted sum of two reference channels; and (2) non-linear fitting techniques based on spectral radiative transfer calculations. The advantage of the first approach is computational speed; of the second, improved retrieval accuracy. Our goal was to improve the accuracy of the first technique using physics based on radiative transfer. Using a modified version of the Duntley equation, we derived an "Atmospheric Pre-corrected Differential Absorption" (APDA) technique and described an iterative scheme to retrieve water vapor on a pixel-by-pixel basis. Next we compared both the CIBR and the APDA using the Duntley equation for MODTRAN3-computed irradiances, transmissions and path radiance (using the DISORT option). This simulation showed that the CIBR is very sensitive to reflectance effects and that the APDA performs much better. An extensive data set was created with the radiative transfer code 6S over 379 different ground reflectance spectra. The calculated relative water vapor error was reduced significantly for the APDA: about 8% of the 379 spectra (vs. over 35% for the CIBR) had a relative water vapor error of greater than +5%. The APDA has been applied to 1991 and 1995 AVIRIS scenes, which visually demonstrate the improvement over the CIBR technique.
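
Assuming the ratio forms implied above, the CIBR and its atmospherically pre-corrected counterpart can be sketched as follows. The equal reference-channel weights are placeholders; in practice they come from interpolating the two reference wavelengths to the measurement wavelength.

```python
def cibr(l_m, l_r1, l_r2, w1=0.5, w2=0.5):
    """Continuum Interpolated Band Ratio: radiance in the water-vapor
    measurement channel over the weighted sum of two reference channels."""
    return l_m / (w1 * l_r1 + w2 * l_r2)

def apda(l_m, l_r1, l_r2, lp_m, lp_r1, lp_r2, w1=0.5, w2=0.5):
    """Atmospheric Pre-corrected Differential Absorption ratio: the same
    ratio after subtracting the atmospheric path radiance lp_* from each
    channel, which removes most of the reflectance sensitivity of the CIBR."""
    return (l_m - lp_m) / (w1 * (l_r1 - lp_r1) + w2 * (l_r2 - lp_r2))
```

Subtracting the path radiance before forming the ratio is the whole pre-correction: over dark surfaces the path radiance dominates the at-sensor signal, which is why the uncorrected CIBR is so sensitive to reflectance.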

  1. A research on applications of qualitative reasoning techniques in Human Acts Simulation Program

    International Nuclear Information System (INIS)

    Far, B.H.

    1992-04-01

    The Human Acts Simulation Program (HASP) is a ten-year research project of the Computing and Information Systems Center of JAERI. The goal of HASP is to develop programs for an advanced intelligent robot to accomplish multiple instructions (for instance, related to surveillance, inspection and maintenance) in nuclear power plants. Some recent artificial intelligence techniques can contribute to this project. This report introduces some original contributions concerning the application of Qualitative Reasoning (QR) techniques in HASP. The focus is on the knowledge-intensive tasks, including model-based reasoning, analytic learning, fault diagnosis and functional reasoning. The multi-level extended qualitative modeling for Skill-Rule-Knowledge (S-R-K) based reasoning, which includes the coordination and timing of events, and the Qualitative Sensitivity Analysis (QSA), Subjective Qualitative Fault Diagnosis (SQFD) and Qualitative Function Formation (QFF) techniques are introduced. (author) 123 refs

  2. A multiobserver study of the effects of including point-of-care patient photographs with portable radiography: a means to detect wrong-patient errors.

    Science.gov (United States)

    Tridandapani, Srini; Ramamurthy, Senthil; Provenzale, James; Obuchowski, Nancy A; Evanoff, Michael G; Bhatti, Pamela

    2014-08-01

    To evaluate whether the presence of facial photographs obtained at the point of care of portable radiography leads to increased detection of wrong-patient errors. In this institutional review board-approved study, 166 radiograph-photograph combinations were obtained from 30 patients. Consecutive radiographs from the same patients resulted in 83 unique pairs (ie, a new radiograph and a prior, comparison radiograph) for interpretation. To simulate wrong-patient errors, mismatched pairs were generated by pairing radiographs from different patients chosen randomly from the sample. Ninety radiologists each interpreted a unique randomly chosen set of 10 radiographic pairs, containing up to 10% mismatches (ie, error pairs). Radiologists were randomly assigned to interpret radiographs with or without photographs. The number of mismatches identified and the interpretation times were recorded. Ninety radiologists with 21 ± 10 (mean ± standard deviation) years of experience were recruited to participate in this observer study. With the introduction of photographs, the proportion of errors detected increased from 31% (9 of 29) to 77% (23 of 30; P = .006). The odds ratio for detection of error with photographs to detection without photographs was 7.3 (95% confidence interval: 2.29-23.18). Observer qualifications, training, or practice in cardiothoracic radiology did not influence sensitivity for error detection. There was no significant difference in interpretation time between studies without photographs and those with photographs (60 ± 22 vs. 61 ± 25 seconds; P = .77). In this observer study, facial photographs obtained simultaneously with portable chest radiographs increased the identification of wrong-patient errors without a substantial increase in interpretation time. This technique offers a potential means of increasing patient safety through correct patient identification. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
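
The reported odds ratio of 7.3 follows directly from the detection counts in the abstract (23 of 30 errors detected with photographs vs. 9 of 29 without); a quick check:

```python
def detection_odds_ratio(detected_with, total_with, detected_without, total_without):
    """Odds ratio comparing error detection with vs. without photographs:
    the odds of detection in each arm, then their ratio."""
    odds_with = detected_with / (total_with - detected_with)
    odds_without = detected_without / (total_without - detected_without)
    return odds_with / odds_without
```

With the study's counts this gives (23/7) / (9/20) ≈ 7.3, matching the value reported above.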

  3. A review on computational fluid dynamic simulation techniques for Darrieus vertical axis wind turbines

    International Nuclear Information System (INIS)

    Ghasemian, Masoud; Ashrafi, Z. Najafian; Sedaghat, Ahmad

    2017-01-01

    Highlights: • A review of CFD simulation techniques for Darrieus wind turbines is provided. • Recommendations and guidelines toward reliable and accurate simulations are presented. • Different advances in CFD simulation of Darrieus wind turbines are addressed. - Abstract: Global warming threats, policies supporting renewable energies, and the desire for clean smart cities are the major drivers of most recent research on developing small wind turbines for urban environments. VAWTs (vertical axis wind turbines) are the most appealing for energy harvesting in the urban environment. This is attributed to the structural simplicity, wind direction independence, absence of a yaw mechanism, tolerance of highly turbulent winds, cost effectiveness, easier maintenance, and lower noise emission of VAWTs. This paper reviews recently published work on CFD (computational fluid dynamics) simulations of Darrieus VAWTs. Recommendations and guidelines are presented for turbulence modeling, spatial and temporal discretization, numerical schemes and algorithms, and computational domain size. The operating and geometrical parameters such as tip speed ratio, wind speed, solidity, blade number and blade shape are fully investigated. The purpose is to address different advances in simulation areas such as blade profile modification and optimization, wind turbine performance augmentation using guide vanes, wind turbine wake interaction in wind farms, wind turbine aerodynamic noise reduction, dynamic stall control, self-starting characteristics, and the effects of unsteady and skewed wind conditions.

  4. Simulating the x-ray image contrast to setup techniques with desired flaw detectability

    Science.gov (United States)

    Koshti, Ajay M.

    2015-04-01

    The paper provides simulation data for previous work by the author on developing a model for estimating the detectability of crack-like flaws in radiography. The methodology was developed to help in implementation of the NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing detector resolution, and the applicability of ASTM E 2737 resolution requirements to the model is also discussed. The paper describes a model for simulating the detector resolution. A computer calculator application, discussed here, also performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs calculating the x-ray flaw size parameter and image contrast for varying input parameters, such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source size, and detector sensitivity and resolution, are given as 3D surfaces. These results demonstrate the effect of the input parameters on the flaw size parameter and the simulated image contrast of the crack, and they demonstrate the utility of the flaw size parameter model in setting up x-ray techniques that provide the desired flaw detectability in radiography. The method is applicable to film radiography, computed radiography, and digital radiography.

  5. 3D DIGITAL SIMULATION OF MINNAN TEMPLE ARCHITECTURE CAISSON'S CRAFT TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Y. C. Lin

    2013-07-01

    The caisson is one of the important representations of the craft techniques and decorative aesthetics of Minnan (southern Fujian) temple architecture. The special component design and group building method present the architectural thinking and personal characteristics of the great carpenters of Minnan temple architecture. In the late Qing Dynasty, the appearance and style of the caissons of famous temples in Taiwan clearly presented the building techniques of the great carpenters. However, as the years went by, the caisson design and craft techniques were not fully passed down, which has been a great loss of cultural assets. Accordingly, taking the caisson of Fulong temple, a work by a well-known great carpenter in Tainan, as an example, this study recovered the thinking principles of the original design and the design method at the initial period of construction through interview records and by redrawing the "Tng-Ko" (a traditional design, stakeout and construction tool). We obtained the 3D point cloud model of the caisson of Fulong temple using 3D laser scanning technology and established a 3D digital model of each component of the caisson. Based on the caisson assembly procedure obtained from the interview records, this study conducted a digital simulation of the caisson assembly to completely record and present the caisson design, construction and completion procedure. This model of preserving the craft techniques of the Minnan temple caisson using digital technology makes a specific contribution to the heritage of the craft techniques while providing an important reference for the digital preservation of human cultural assets.

  6. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique comprised a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce an analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.

  7. Numerical simulation of 3D unsteady flow in a rotating pump by dynamic mesh technique

    International Nuclear Information System (INIS)

    Huang, S; Guo, J; Yang, F X

    2013-01-01

    In this paper, numerical simulations of unsteady flow in three kinds of typical rotating pump (roots blower, roto-jet pump and centrifugal pump) were performed using the three-dimensional Dynamic Mesh technique. In the unsteady simulation, all the computational domains were set as stationary in one inertial reference frame. The motions of the solid boundaries were defined by the Profile file of the FLUENT commercial code, in which the rotational orientation and speed of the rotors were specified. Three methods (Spring-based Smoothing, Dynamic Layering and Local Re-meshing) were used to achieve mesh deformation and re-meshing. The unsteady solutions of the flow field and pressure distribution were obtained. After a start-up stage, the flow parameters exhibit time-periodic behaviour corresponding to the blade passing frequency of the rotor. This work shows that the Dynamic Mesh technique can achieve numerical simulation of three-dimensional unsteady flow fields in various kinds of rotating pumps and has strong versatility and broad application prospects.

  8. Simulating GPS radio signal to synchronize network--a new technique for redundant timing.

    Science.gov (United States)

    Shan, Qingxiao; Jun, Yang; Le Floch, Jean-Michel; Fan, Yaohui; Ivanov, Eugene N; Tobar, Michael E

    2014-07-01

    Currently, many distributed systems such as 3G mobile communications and power systems are time-synchronized with a Global Positioning System (GPS) signal. If there is a GPS failure, it is difficult to realize redundant timing, and thus time-synchronized devices may fail. In this work, we develop time transfer by simulating GPS signals, which requires no extra modification to the original GPS-synchronized devices. This is achieved by applying a simplified GPS simulator for synchronization purposes only. Navigation data are calculated based on a pre-assigned time at a fixed position. Pseudo-range data, which describe the distance change between the space vehicle (SV) and users, are calculated. Because real-time simulation requires heavy-duty computations, we use self-developed software optimized on a PC to generate data, and save the data onto memory disks while the simulator is operating. The radio signal generation is similar to that of an SV at an initial position, and the frequency synthesis of the simulator is locked to a pre-assigned time. A filtering group technique is used to simulate the signal transmission delay corresponding to the SV displacement. Each SV generates a digital baseband signal, where a unique identifying code is added to the signal and up-converted to generate the output radio signal at the center frequency of 1575.42 MHz (L1 band). A prototype with a field-programmable gate array (FPGA) has been built and experiments have been conducted to prove that we can realize time transfer. The prototype has been applied to a CDMA network for a three-month-long experiment. Its precision has been verified and can meet the requirements of most telecommunication systems.
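
At the heart of the simulator's navigation-data step is the pseudo-range: the SV-to-user geometric distance plus clock terms, converted to a signal delay. The sketch below uses rounded, illustrative constants (orbit radius, Earth radius) and ignores the relativistic and atmospheric corrections a real simulator must model:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def pseudo_range(sv_pos, user_pos, clock_bias_s=0.0):
    """Geometric SV-to-user distance plus a receiver clock-bias term."""
    return math.dist(sv_pos, user_pos) + C * clock_bias_s

# Illustrative geometry: SV directly overhead a user on the surface
sv = (0.0, 0.0, 26_560_000.0)   # ~GPS orbit radius from Earth's centre, m
user = (0.0, 0.0, 6_371_000.0)  # ~Earth's surface, m

rho = pseudo_range(sv, user)
delay_s = rho / C  # signal transmission delay the simulator must reproduce
```

The resulting delay of roughly 70 ms is what the filtering group technique mentioned above has to track as the simulated SV moves.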

  9. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and the computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from the intensive computational requirements of detailed modeling investigations of real-world reservoirs. This paper presents the application of a massively parallel computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating the flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance.

  10. Application of simulation techniques for accident management training in nuclear power plants

    International Nuclear Information System (INIS)

    2003-05-01

    core. These capabilities include the optimized use of design margins as well as complementary measures for the prevention of accident progression, its monitoring, and the mitigation of severe accidents. Finally, level 5 includes off-site emergency response measures, the objective of which is to mitigate the radiological consequences of significant releases of radioactive material. Accident management is defined in the IAEA Safety Report on Development and Implementation of Accident Management Programmes in Nuclear Power Plants. The IAEA definitions are in line with the definitions of severe accident management in OECD/NEA documents as given, for example. This report describes simulation techniques used in the training of personnel involved in accident management of NPPs. This concerns both the plant personnel and the persons involved in the management of off-site releases. The report pertains to light water reactors (LWRs) and pressurized heavy water reactors (PHWRs), but it can equally be applied to power reactors of other types. The report is intended for use by experts responsible for planning, developing, executing or supervising the training of personnel involved in the implementation of AMPs in NPPs. It concentrates on existing techniques, but future prospects are also discussed. Various simulation techniques are considered, from incorporating graphical interfaces into existing severe accident codes to full-scope replica simulators. Both preventive and mitigative accident management measures, different training levels and different target personnel groups are taken into account. Based on the available information compiled worldwide, present views on the applicability of simulation techniques for the training of personnel involved in accident management are provided in this report. Apart from the introduction, this report consists of four sections and three appendices. In Section 2, specific aspects of accident management are summarized. 
Basic approaches in the

  11. eLearning techniques supporting problem based learning in clinical simulation.

    Science.gov (United States)

    Docherty, Charles; Hoy, Derek; Topp, Helena; Trinder, Kathryn

    2005-08-01

    This paper details the results of the first phase of a project using eLearning to support students' learning within a simulated environment. The locus was a purpose-built clinical simulation laboratory (CSL) where the School's philosophy of problem based learning (PBL) was challenged by lecturers using traditional teaching methods. The response was a student-centred, problem based approach to the acquisition of clinical skills that used high-quality learning objects embedded within web pages, substituting for lecturers providing instruction and demonstration. This encouraged student nurses to explore, analyse and make decisions within the safety of a clinical simulation. Learning was facilitated through network communications and reflection on video performances of self and others. Evaluations were positive, with students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that eLearning techniques can help students acquire clinical skills in the safety of a simulated environment within the context of a problem based learning curriculum.

  12. A NEW TECHNIQUE FOR THE PHOTOSPHERIC DRIVING OF NON-POTENTIAL SOLAR CORONAL MAGNETIC FIELD SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Weinzierl, Marion; Yeates, Anthony R. [Department of Mathematical Sciences, Durham University, South Road, Durham DH1 3LE (United Kingdom); Mackay, Duncan H. [School of Mathematics and Statistics, University of St. Andrews, North Haugh, St. Andrews, Fife KY16 9SS (United Kingdom); Henney, Carl J.; Arge, C. Nick, E-mail: marion.weinzierl@durham.ac.uk [Air Force Research Lab/Space Vehicles Directorate, 3550 Aberdeen Avenue SE, Kirtland AFB, NM (United States)

    2016-05-20

    In this paper, we develop a new technique for driving global non-potential simulations of the Sun’s coronal magnetic field solely from sequences of radial magnetic maps of the solar photosphere. A primary challenge to driving such global simulations is that the required horizontal electric field cannot be uniquely determined from such maps. We show that an “inductive” electric field solution similar to that used by previous authors successfully reproduces specific features of the coronal field evolution in both single and multiple bipole simulations. For these cases, the true solution is known because the electric field was generated from a surface flux-transport model. The match for these cases is further improved by including the non-inductive electric field contribution from surface differential rotation. Then, using this reconstruction method for the electric field, we show that a coronal non-potential simulation can be successfully driven from a sequence of ADAPT maps of the photospheric radial field, without including additional physical observations which are not routinely available.

  13. The Life of Digital Photographs

    DEFF Research Database (Denmark)

    Larsen, Jonas

    )mobilities of things, practice approaches to photography and multi sited ethnography, this talk discusses and empirically track the life (the conception, birth, transformative years, ageing and death) travel, detours, makeovers and destinations of (analogue and digital) photographs in our present network societies. So...... we can understand the life of more-than representational photographs, and then I turn to my ethnographies to flesh out empirically the life of tourist photographs...

  14. Simulation technique for slurries interacting with moving parts and deformable solids with applications

    Science.gov (United States)

    Mutabaruka, Patrick; Kamrin, Ken

    2018-04-01

    A numerical method for particle-laden fluids interacting with a deformable solid domain and mobile rigid parts is proposed and implemented in a full engineering system. The fluid domain is modeled with a lattice Boltzmann representation, the particles and rigid parts are modeled with a discrete element representation, and the deformable solid domain is modeled using a Lagrangian mesh. The main issue of this work, since separately each of these methods is a mature tool, is to develop coupling and model-reduction approaches in order to efficiently simulate coupled problems of this nature, as in various geological and engineering applications. The lattice Boltzmann method incorporates a large eddy simulation technique using the Smagorinsky turbulence model. The discrete element method incorporates spherical and polyhedral particles for stiff contact interactions. A neo-Hookean hyperelastic model is used for the deformable solid. We provide a detailed description of how to couple the three solvers within a unified algorithm. The technique we propose for rubber modeling/coupling exploits a simplification that prevents having to solve a finite-element problem at each time step. We also developed a technique to reduce the domain size of the full system by replacing certain zones with quasi-analytic solutions, which act as effective boundary conditions for the lattice Boltzmann method. The major ingredients of the routine are separately validated. To demonstrate the coupled method in full, we simulate slurry flows in two kinds of piston valve geometries. The dynamics of the valve and slurry are studied and reported over a large range of input parameters.
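
The lattice Boltzmann core of such a coupled solver is compact. The following is a minimal D2Q9 BGK collide-and-stream loop on a periodic box, without the Smagorinsky closure, the DEM particles, or the hyperelastic solid described above; the relaxation time and initial condition are arbitrary illustrative choices:

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
tau = 0.8  # BGK relaxation time (illustrative)

def equilibrium(rho, u):
    cu = np.einsum('qd,xyd->xyq', c, u)
    usq = np.sum(u * u, axis=-1)[..., None]
    return rho[..., None] * w * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

nx = ny = 32
rho = np.ones((nx, ny))
u = np.zeros((nx, ny, 2))
u[..., 0] = 0.05 * np.sin(2*np.pi*np.arange(ny)/ny)  # small shear wave

f = equilibrium(rho, u)
for _ in range(100):
    rho = f.sum(axis=-1)
    u = np.einsum('xyq,qd->xyd', f, c) / rho[..., None]
    f += -(f - equilibrium(rho, u)) / tau        # BGK collision
    for q, (cx, cy) in enumerate(c):             # streaming, periodic box
        f[..., q] = np.roll(np.roll(f[..., q], cx, axis=0), cy, axis=1)
```

Because the BGK collision preserves the density and momentum moments and periodic streaming merely permutes populations, total mass is conserved to round-off over the run; the coupling work in the paper layers turbulence modeling, moving boundaries and solid feedback on top of this kernel.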

  15. Lattice Boltzmann flow simulations with applications of reduced order modeling techniques

    KAUST Repository

    Brown, Donald

    2014-01-01

    With the recent interest in shale gas, an understanding of the flow mechanisms at the pore scale and beyond is necessary, which has attracted a lot of interest from both industry and academia. One of the suggested algorithms to help understand flow in such reservoirs is the Lattice Boltzmann Method (LBM). The primary advantage of LBM is its ability to approximate complicated geometries with simple algorithmic modifications. In this work, we use LBM to simulate the flow in a porous medium. More specifically, we use LBM to simulate a Brinkman type flow. The Brinkman law allows us to integrate fast free-flow and slow-flow porous regions. However, due to the many scales involved and the complex heterogeneities of the rock microstructure, the simulation times can be long, even with the speed advantage of using an explicit time stepping method. The problem is two-fold: the computational grid must be able to resolve all scales, and the calculation requires a steady state solution, implying a large number of timesteps. To help reduce the computational complexity and total simulation times, we use model reduction techniques to reduce the dimension of the system. In this approach, we are able to describe the dynamics of the flow by using a lower dimensional subspace. In this work, we utilize the Proper Orthogonal Decomposition (POD) technique to compute the dominant modes of the flow and project the solution onto them (a lower dimensional subspace) to arrive at an approximation of the full system at a lowered computational cost. We present a few proof-of-concept examples of the flow field and the corresponding reduced model flow field.
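
The POD step itself reduces to a singular value decomposition of a snapshot matrix. A minimal sketch with synthetic low-rank "snapshots" (not an actual LBM flow field):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "snapshots" that genuinely live in a low-dimensional subspace
n_dof, n_snap, rank = 400, 60, 5
modes_true = rng.normal(size=(n_dof, rank))
snapshots = modes_true @ rng.normal(size=(rank, n_snap))  # columns = states

# POD basis: dominant left singular vectors of the snapshot matrix
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :rank]

# Reduce a new state to rank coordinates, then lift back to full dimension
x_new = modes_true @ rng.normal(size=rank)
x_rec = basis @ (basis.T @ x_new)
rel_err = np.linalg.norm(x_new - x_rec) / np.linalg.norm(x_new)
```

In the Brinkman setting, the columns would be LBM fields saved during time stepping, and the number of retained modes would be chosen from the singular value decay rather than known in advance.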

  16. Why use case studies rather than simulation-gaming techniques or library research?

    Science.gov (United States)

    Mcdonald, S. W.

    1981-01-01

    Methods that present a student with a more challenging and true-to-life situation (needing to conduct research in a problem-solving context, and not thinking about organization or format until the research and thinking are complete) are investigated. Simulation-gaming techniques, which attempt to teach initiative and creativity rather than library research skills, are used for this purpose. However, it is shown that case studies provide the greatest opportunities to engage students in problem-solving situations in which they develop skills as researchers and writers.

  17. Structural investigation and simulation of acoustic properties of some tellurite glasses using artificial intelligence technique

    International Nuclear Information System (INIS)

    Gaafar, M.S.; Abdeen, Mostafa A.M.; Marzouk, S.Y.

    2011-01-01

    Research highlights: → Simulation of the acoustic properties of some tellurite glasses using one of the artificial intelligence techniques (artificial neural network). → The glass network is strengthened by enhancing the linkage of Te-O chains. The tellurite network also approaches homogenization because of the uniform distribution of Nb5+ ions among the Te-O chains, though some of the tellurium-oxide polyhedra still link each other in edge sharing. → Excellent agreement between the measured values and the predicted values was obtained for over 50 different tellurite glass compositions. → The model we designed gives better agreement compared with the Makishima and Mackenzie model. - Abstract: The developments in the field of industry raise the need for simulating the acoustic properties of glass materials before melting the raw material oxides. In this paper, we simulate the acoustic properties of some tellurite glasses using one of the artificial intelligence techniques (the artificial neural network). The artificial neural network (ANN) technique is introduced in the current study to simulate and predict important parameters such as density, longitudinal and shear ultrasonic velocities, and elastic moduli (longitudinal and shear moduli). The ANN results were found to be in good agreement with the experimentally measured parameters. The presented ANN model is then used to predict the acoustic properties of some new tellurite glasses. For this purpose, four glass systems xNb2O5-(1-x)TeO2, 0.1PbO-xNb2O5-(0.9-x)TeO2, 0.2PbO-xNb2O5-(0.8-x)TeO2 and 0.05Bi2O3-xNb2O5-(0.95-x)TeO2 were prepared using the melt quenching technique. The results of ultrasonic velocities and elastic moduli showed that the addition of Nb2O5 as a network modifier provides oxygen ions to change [TeO4] tbps into [TeO3] tps.
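
As an illustration of the idea (not the paper's trained network or data), a one-hidden-layer tanh network can be fitted to a smooth synthetic property curve; here the output weights are obtained by least squares, a cheap stand-in for full backpropagation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical smooth property curve (illustrative, NOT the paper's data):
# think of y as a normalized acoustic property versus modifier fraction x
x = np.linspace(0.0, 1.0, 64)[:, None]
y = 0.8*x - 0.5*x**2

# One hidden tanh layer with random weights; output layer fitted by least
# squares (a cheap stand-in for full backpropagation training)
W1 = rng.normal(0.0, 2.0, (1, 16))
b1 = rng.normal(0.0, 1.0, 16)
H = np.tanh(x @ W1 + b1)                     # hidden activations, (64, 16)
Hb = np.column_stack([H, np.ones(len(x))])   # append an output bias column
w_out, *_ = np.linalg.lstsq(Hb, y, rcond=None)

rmse = float(np.sqrt(np.mean((Hb @ w_out - y)**2)))
```

The same network shape, with composition fractions as inputs and measured densities and velocities as targets, is the kind of mapping the paper's ANN learns across its 50+ glass compositions.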

  19. Photographic Portraits: Narrative and Memory

    Directory of Open Access Journals (Sweden)

    Brian Roberts

    2011-05-01

    Full Text Available This article is a more general "companion" to the subsequent, Brian ROBERTS (2011 "Interpreting Photographic Portraits: Autobiography, Time Perspectives and Two School Photographs". The article seeks to add to the growing awareness of the importance of visual materials and methods in qualitative social research and to give an introduction to the "photographic self image"—self-portraits and portraits. It focuses on time and memory, including the experiential associations (in consciousness and the senses that the self engenders, thus linking the "visual" (photographic and "auto/biographical". The article attempts to "map" a field—the use of portraiture within social science—drawing on narrative and biographical research, on one side, and photographic portraiture, on the other. In supporting the use of photography in qualitative research it points to the need for researchers to have a greater knowledge of photographic (and art criticism and cognisance of photographic practices. The article does not intend to give a definitive account of photographic portraiture or prescribe in detail how it may be used within social science. It is an initial overview of the development and issues within the area of photographic portraiture and an exploration of relevant methodological issues when images of individuals are employed within social science—so that "portraiture" is better understood and developed within biographical and narrative research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs110263

  20. Evaluation of convergence behavior of metamodeling techniques for bridging scales in multi-scale multimaterial simulation

    International Nuclear Information System (INIS)

    Sen, Oishik; Davis, Sean; Jacobs, Gustaaf; Udaykumar, H.S.

    2015-01-01

    The effectiveness of several metamodeling techniques, viz. the Polynomial Stochastic Collocation method, the Adaptive Stochastic Collocation method, a Radial Basis Function Neural Network, a Kriging method and a Dynamic Kriging (DKG) method, is evaluated. This is done with the express purpose of using metamodels to bridge scales between micro- and macro-scale models in a multi-scale multimaterial simulation. The rate of convergence of the error when the metamodels are used to reconstruct hypersurfaces of known functions is studied. For a sufficiently large number of training points, the Stochastic Collocation methods generally converge faster than the other metamodeling techniques, while the DKG method converges faster when the number of input points is less than 100 in a two-dimensional parameter space. Because the input points correspond to computationally expensive micro/meso-scale computations, the DKG is favored for bridging scales in a multi-scale solver.
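
The evaluation protocol described (train a metamodel on scattered samples of a known function, then measure reconstruction error as the training set grows) can be sketched with the simplest of the listed families, a Gaussian RBF interpolant. The test function, shape parameter, and point counts are illustrative assumptions:

```python
import numpy as np

def rbf_fit_predict(Xtr, ytr, Xte, eps=4.0):
    # Gaussian RBF interpolant: solve for weights, evaluate at test points
    d2 = ((Xtr[:, None, :] - Xtr[None, :, :])**2).sum(-1)
    A = np.exp(-eps * d2) + 1e-10 * np.eye(len(Xtr))  # jitter for conditioning
    wts = np.linalg.solve(A, ytr)
    d2te = ((Xte[:, None, :] - Xtr[None, :, :])**2).sum(-1)
    return np.exp(-eps * d2te) @ wts

# "Known hypersurface" to reconstruct, as in the convergence study
f = lambda X: np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])

rng = np.random.default_rng(3)
Xte = rng.uniform(0.0, 1.0, (500, 2))

errs = []
for n in (20, 80, 320):  # growing sets of expensive "micro-scale" samples
    Xtr = rng.uniform(0.0, 1.0, (n, 2))
    pred = rbf_fit_predict(Xtr, f(Xtr), Xte)
    errs.append(float(np.sqrt(np.mean((pred - f(Xte))**2))))
```

Plotting such error sequences against the number of training points, for each metamodel family, is precisely how the convergence rates in the study are compared.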

  1. Atmospheric pre-corrected differential absorption techniques to retrieve columnar water vapor: Theory and simulations

    Energy Technology Data Exchange (ETDEWEB)

    Borel, C.C.; Schlaepfer, D.

    1996-03-01

    Two different approaches exist to retrieve columnar water vapor from imaging spectrometer data: (1) differential absorption techniques based on (a) the Narrow-Wide (N/W) ratio between overlapping spectrally wide and narrow channels, or (b) the Continuum Interpolated Band Ratio (CIBR) between a measurement channel and the weighted sum of two reference channels; and (2) non-linear fitting techniques which are based on spectral radiative transfer calculations. The advantage of the first approach is computational speed, and of the second, improved retrieval accuracy. Our goal was to improve the accuracy of the first technique using physics based on radiative transfer. Using a modified version of the Duntley equation, we derived an "Atmospheric Pre-corrected Differential Absorption" (APDA) technique and described an iterative scheme to retrieve water vapor on a pixel-by-pixel basis. Next we compared both the CIBR and the APDA using the Duntley equation with MODTRAN3-computed irradiances, transmissions and path radiance (using the DISORT option). This simulation showed that the CIBR is very sensitive to reflectance effects and that the APDA performs much better. An extensive data set was created with the radiative transfer code 6S over 379 different ground reflectance spectra. The calculated relative water vapor error was reduced significantly for the APDA. The APDA technique had about 8% (vs. over 35% for the CIBR) of the 379 spectra with a relative water vapor error of greater than ±5%. The APDA has been applied to 1991 and 1995 AVIRIS scenes which visually demonstrate the improvement over the CIBR technique.
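
The CIBR and its atmospherically pre-corrected variant are simple channel arithmetic. In the sketch below, the wavelengths are typical illustrative choices (a 940 nm water vapor band bracketed by two window channels), and the APDA path-radiance terms are taken as given rather than from the paper's iterative MODTRAN-based scheme:

```python
# Illustrative channel set (nm): a water vapor measurement channel at 940
# bracketed by two atmospheric-window reference channels
lam_r1, lam_m, lam_r2 = 865.0, 940.0, 1040.0

# Linear-interpolation weights placing the continuum at the measurement band
w1 = (lam_r2 - lam_m) / (lam_r2 - lam_r1)
w2 = (lam_m - lam_r1) / (lam_r2 - lam_r1)

def cibr(L_r1, L_m, L_r2):
    """Continuum Interpolated Band Ratio: measurement radiance over the
    wavelength-weighted sum of the two reference radiances."""
    return L_m / (w1 * L_r1 + w2 * L_r2)

def apda(L_r1, L_m, L_r2, Lp_r1, Lp_m, Lp_r2):
    """Same ratio after subtracting estimated atmospheric path radiance Lp
    from each channel (one step of the iterative APDA scheme)."""
    return (L_m - Lp_m) / (w1 * (L_r1 - Lp_r1) + w2 * (L_r2 - Lp_r2))
```

With a flat spectrum and no absorption the ratio is 1 by construction (w1 + w2 = 1); water vapor absorption pulls it below 1, and a calibration curve then maps the ratio to columnar water vapor. The APDA's pre-correction is what removes the reflectance sensitivity the simulations exposed in the plain CIBR.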

  2. Method of making stepped photographic density standards of radiographic photographs

    International Nuclear Information System (INIS)

    Borovin, I.V.; Kondina, M.A.

    1987-01-01

    In industrial radiography practice the need often arises for a prompt evaluation of the photographic density of an x-ray film. A method of making stepped photographic density standards for industrial radiography by contact printing from a negative is described. The method is intended for industrial radiation flaw detection laboratories not having specialized sensitometric equipment

  3. Physical simulation technique on the behaviour of oil spills in grease ice under wave actions

    International Nuclear Information System (INIS)

    Li, Z.; Hollebone, B.; Fingas, M.; Fieldhouse, B.

    2008-01-01

    Light or medium oil spilled on ice tends to rise to and remain on the surface of unconsolidated frazil or grease ice. This study sought a new system for studying oil emulsions in grease ice under experimental conditions. A physical simulation technique was designed to test the effect of wave energy on the spilled oil-grease ice emulsion. The newly developed test system is able to perform simulation tests with wave, wave-ice, wave-oil and wave-ice-oil combinations. This paper presented the design concept of the developed test system and introduced the experimental verification of its capabilities in terms of temperature control, wave-making and grease ice-making. The key feature of the technique is a mini wave flume which derives its wave-making power from an oscillator in a chemical laboratory. Video cameras record the wave action in the flume in order to obtain wave parameters. The wave-making capability tests in this study were used to determine the relation of wave height, length and frequency to oscillator power transfer, oscillator frequency and the water depth of the flume. 16 refs., 10 figs

  4. The simulation of Typhoon-induced coastal inundation in Busan, South Korea applying the downscaling technique

    Science.gov (United States)

    Jang, Dongmin; Park, Junghyun; Yuk, Jin-Hee; Joh, MinSu

    2017-04-01

    Due to typhoons, the south coastal cities of South Korea, including Busan, are very vulnerable to surge, waves and the corresponding coastal inundation, and are affected every year. In 2016, South Korea suffered tremendous damage from typhoon 'Chaba', which developed to the northeast of Guam on Sep. 28 and had a maximum 10-minute sustained wind speed of about 50 m/s, a 1-minute sustained wind speed of 75 m/s and a minimum central pressure of 905 hPa. As 'Chaba', the strongest typhoon since 'Maemi' in 2003, hit South Korea on Oct. 5, it caused massive economic and casualty damage to Ulsan, Gyeongju and Busan. In particular, the damage from typhoon-induced coastal inundation in Busan, where many high-rise buildings and residential areas are concentrated near the coast, was serious. The coastal inundation could be affected more by strong wind-induced waves than by surge. In fact, it was observed that the surge height was about 1 m on average and the significant wave height was about 8 m in the coastal sea near Busan on Oct. 5 due to 'Chaba'. Even though the typhoon-induced surge elevated the sea level, the typhoon-induced long-period waves, with wave periods of more than 15 s, could play a more important role in the inundation. The present work simulated the coastal inundation induced by 'Chaba' in Busan, South Korea, considering the effects of typhoon-induced surge and waves. For the 'Chaba' hindcast, the high-resolution Weather Research and Forecasting model (WRF) was applied using reanalysis data produced by NCEP (FNL, 0.25 degree) for the boundary and initial conditions, and was validated against observations of wind speed, direction and pressure. The typhoon-induced coastal inundation was simulated by an unstructured-grid model, the Finite Volume Community Ocean Model (FVCOM), which is a fully current-wave coupled model. To simulate the wave-induced inundation, a one-way downscaling technique over multiple domains was applied. Firstly, a mother domain including the Korean peninsula was

  5. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    Science.gov (United States)

    Lee, Hanbong

    2016-01-01

    Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently NASA and American Airlines are jointly developing a decision-support tool called the Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improving the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast-time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.

  6. Mining MaNGA for Merging Galaxies: A New Imaging and Kinematic Technique from Hydrodynamical Simulations

    Science.gov (United States)

    Nevin, Becky; Comerford, Julia M.; Blecha, Laura

    2018-06-01

    Merging galaxies play a key role in galaxy evolution, and progress in our understanding of galaxy evolution is slowed by the difficulty of making accurate galaxy merger identifications. Mergers are typically identified using imaging alone, which has its limitations and biases. With the growing popularity of integral field spectroscopy (IFS), it is now possible to use kinematic signatures to improve galaxy merger identifications. I use GADGET-3 hydrodynamical simulations of merging galaxies with the radiative transfer code SUNRISE, the latter of which enables me to apply the same analysis to simulations and observations. From the simulated galaxies, I have developed the first merging galaxy classification scheme that is based on both kinematics and imaging. Utilizing a Linear Discriminant Analysis tool, I have determined which kinematic and imaging predictors are most useful for identifying mergers across various merger parameters (such as orientation, mass ratio, gas fraction, and merger stage). I will discuss the strengths and limitations of the classification technique and then my initial results from applying the classification to the >10,000 observed galaxies in the MaNGA (Mapping Nearby Galaxies at Apache Point) IFS survey. Through accurate identification of merging galaxies in the MaNGA survey, I will advance our understanding of supermassive black hole growth in galaxy mergers and other open questions related to galaxy evolution.
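
The classification step can be sketched with a two-class Fisher discriminant built from scratch; the two features and class separations below are toy stand-ins, not the actual MaNGA imaging and kinematic predictors:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in features (NOT the MaNGA predictors): one imaging and one
# kinematic summary statistic per galaxy, for two classes
n = 300
mergers = rng.normal([1.0, 0.8], 0.6, (n, 2))
isolated = rng.normal([0.0, 0.0], 0.6, (n, 2))
X = np.vstack([mergers, isolated])
y = np.array([1]*n + [0]*n)

# Fisher LDA: project onto Sw^-1 (mu1 - mu0), threshold at the midpoint
mu1, mu0 = mergers.mean(0), isolated.mean(0)
Sw = np.cov(mergers.T) + np.cov(isolated.T)   # pooled within-class scatter
wvec = np.linalg.solve(Sw, mu1 - mu0)
threshold = wvec @ (mu1 + mu0) / 2.0

pred = (X @ wvec > threshold).astype(int)
accuracy = float((pred == y).mean())
```

The magnitudes of the entries of the discriminant vector (on standardized features) are what single out the most useful predictors, which is how the merger parameter dependences described above are assessed.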

  7. The relationship of superficial cerebral veins with meningiomas by simulation craniotomy technique

    International Nuclear Information System (INIS)

    Zhao Hongwei; Gong Xiangyang

    2012-01-01

    Objective: To assess the value of the simulation craniotomy (SC) technique in evaluating superficial cerebral veins (SCVs) and their relationship with convexity, parasagittal, and falcine meningiomas. Methods: Forty-nine consecutive patients with convexity, parasagittal, and falcine meningiomas underwent the SC technique and three-dimensional contrast-enhanced MR venography (3D CE MRV) in a prospective study. The numbers of SCVs (diameter > 1 mm) within 2 cm of the tumor margin detected by the two techniques were compared with the paired t test. Furthermore, the 49 cases were divided into groups according to the largest tumor diameter, position, and dural enhancement. The image quality of the SC technique in the different groups was analyzed by the Wilcoxon test in order to identify influencing factors. Results: The number of SCVs within 2 cm of the tumor margin on SC was 4.4 ± 1.9, significantly less than that on 3D CE MRV (5.1 ± 2.7) (t=3.131, P<0.05). The relationship between the meningiomas and the SCVs was demonstrated well on SC in the majority of cases, with an image quality score of 2.5 ± 0.7. The image quality score of the 12 patients with obvious dural enhancement was 1.5 ± 0.5, significantly lower than that of the 37 patients without dural enhancement (2.8 ± 0.3) (Z=-3.093, P<0.05). The image quality score of the 18 patients with tumors larger than 4 cm in diameter (2.2 ± 0.9) was significantly lower than that of the 31 patients with smaller tumors (2.7 ± 0.5) (Z=-2.057, P<0.05). The image quality scores of the convexity group (n=10) and the parasagittal and falcine group (n=39) were 2.2 ± 0.9 and 2.6 ± 0.6, respectively, with no significant difference between the location groups (Z=-0.604, P>0.05). Conclusions: Simulation craniotomy can accurately display SCVs while avoiding the influence of deep cerebral veins and skull veins. This simple technique can provide useful information about the SCVs and their relationships with cortical structures and tumors for preoperative surgical planning.

  8. Simulation and prediction for energy dissipaters and stilling basins design using artificial intelligence technique

    Directory of Open Access Journals (Sweden)

    Mostafa Ahmed Moawad Abdeen

    2015-12-01

    Full Text Available Water flowing at large velocities can cause considerable damage to channels whose beds are composed of natural earth materials. Several stilling basins and energy-dissipating devices have been designed in conjunction with spillways and outlet works to avoid damage to canal structures. In addition, many experimental and traditional numerical studies have been performed to investigate the accurate design of these stilling basins and energy dissipaters. The current study is aimed at introducing artificial intelligence techniques as a new modeling tool for predicting the accurate design of stilling basins. Specifically, artificial neural networks (ANNs) are utilized in the current study in conjunction with experimental data to predict the length of the hydraulic jump occurring in spillways, so that the stilling basin dimensions can be designed for adequate energy dissipation. The current study shows, in a detailed fashion, the development process of different ANN models to accurately predict the hydraulic jump lengths obtained from different experimental studies. The results of implementing these models show that the ANN technique was very successful in simulating the hydraulic jump characteristics occurring in stilling basins. Therefore, it can be safely utilized in the design of these basins, as ANN modeling involves minimal computational and financial effort compared with experimental work and traditional numerical techniques such as finite differences or finite elements.
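    A small stand-in for the kind of ANN described above: a one-hidden-layer network trained by plain gradient descent on synthetic hydraulic-jump data generated from the classical relations (sequent depth y2 = y1/2·(sqrt(1+8·Fr²) − 1), jump length L ≈ 6.9·(y2 − y1)). None of this is the study's experimental data; it only demonstrates the modeling approach.

```python
import numpy as np

# Synthetic hydraulic-jump dataset from textbook relations plus noise
rng = np.random.default_rng(2)
y1 = rng.uniform(0.02, 0.10, 400)            # upstream depth, m
fr = rng.uniform(2.0, 9.0, 400)              # upstream Froude number
y2 = 0.5 * y1 * (np.sqrt(1.0 + 8.0 * fr**2) - 1.0)
L = 6.9 * (y2 - y1) + rng.normal(0.0, 0.01, 400)

X = np.column_stack([y1, fr])
X = (X - X.mean(0)) / X.std(0)               # standardize inputs
t = ((L - L.mean()) / L.std()).reshape(-1, 1)  # standardize target

# One-hidden-layer tanh network trained with full-batch gradient descent
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    out = h @ W2 + b2                        # linear output
    err = out - t
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h**2)         # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = float(np.sqrt(np.mean((out - t) ** 2)))  # fit error, standardized units
```

The same train/evaluate structure carries over directly when the synthetic rows are replaced by measured (y1, Fr, L) triples from flume experiments.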

  9. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration

    Science.gov (United States)

    Hariri, Ali

    utility simulation software. On the other hand, EMT simulation tools provide high accuracy and visibility over a wide bandwidth of frequencies at the expense of larger processing and memory requirements, limited network size, and long simulation time. Therefore, there is a gap in simulation tools and techniques that can efficiently and effectively identify potential PV impact. New planning simulation tools are needed in order to accommodate the simulation requirements of new integrated technologies in the electric grid. The dissertation at hand starts by identifying some of the potential impacts that are caused by high PV penetration. A phasor-based quasi-static time series (QSTS) analysis tool is developed in order to study the slow dynamics that are caused by the variations in the PV generation that lead to voltage fluctuations. Moreover, some EMT simulations are performed in order to study the impacts of PV systems on the electric network harmonic levels. These studies provide insights into the type and duration of certain impacts, as well as the conditions that may lead to adverse phenomena. In addition, these studies present an idea about the type of simulation tools that are sufficient for each type of study. After identifying some of the potential impacts, certain planning tools and techniques are proposed. The potential PV impacts may cause certain utilities to refrain from integrating PV systems into their networks. However, each electric network has a certain limit beyond which the impacts become substantial and may adversely interfere with the system operation and the equipment along the feeder; this limit is referred to as the hosting limit (or hosting capacity). Therefore, it is important for utilities to identify the PV hosting limit on a specific electric network in order to safely and confidently integrate the maximum possible PV systems. In the following dissertation, two approaches have been proposed for identifying the hosting limit: 1. Analytical

  10. Full scope simulator of a nuclear power plant control room using 3D stereo virtual reality techniques for operators training

    International Nuclear Information System (INIS)

    Aghina, Mauricio A.C.; Mol, Antonio Carlos A.; Almeida, Adino Americo A.; Pereira, Claudio M.N.A.; Varela, Thiago F.B.

    2007-01-01

    Practical training of nuclear power plant operators is partially performed by means of simulators. Usually these simulators are physical copies of the original control room, requiring a large space in a facility and also being very expensive. Accordingly, the proposal of this paper is to use virtual reality techniques to design a full-scope control room simulator, in order to reduce costs and the use of physical space. (author)

  11. A simulation technique for 3D MR-guided acoustic radiation force imaging

    International Nuclear Information System (INIS)

    Payne, Allison; Bever, Josh de; Farrer, Alexis; Coats, Brittany; Parker, Dennis L.; Christensen, Douglas A.

    2015-01-01

    Purpose: In magnetic resonance-guided focused ultrasound (MRgFUS) therapies, the in situ characterization of the focal spot location and quality is critical. MR acoustic radiation force imaging (MR-ARFI) is a technique that measures the tissue displacement caused by the radiation force exerted by the ultrasound beam. This work presents a new technique to model the displacements caused by the radiation force of an ultrasound beam in a homogeneous tissue model. Methods: When a steady-state point-source force acts internally in an infinite homogeneous medium, the displacement of the material in all directions is given by the Somigliana elastostatic tensor. The radiation force field, which is caused by absorption and reflection of the incident ultrasound intensity pattern, is spatially distributed, and the tensor formulation takes the form of a convolution of a 3D Green’s function with the force field. The dynamic accumulation of MR phase during the ultrasound pulse can be theoretically accounted for through a time-of-arrival weighting of the Green’s function. This theoretical model was evaluated experimentally in gelatin phantoms of varied stiffness (125-, 175-, and 250-bloom). The acoustic and mechanical properties of the phantoms used as parameters of the model were measured using independent techniques. Displacements at focal depths of 30 and 45 mm in the phantoms were measured by a 3D spin echo MR-ARFI segmented-EPI sequence. Results: The simulated displacements agreed with the MR-ARFI measured displacements for all bloom values and focal depths, with a normalized RMS difference of 0.055 (range 0.028–0.12). The displacement magnitude decreased and the displacement pattern broadened with increased bloom value for both focal depths, as predicted by the theory. Conclusions: A new technique that models the displacements caused by the radiation force of an ultrasound beam in a homogeneous tissue model has been rigorously validated through comparison
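    The convolution step can be sketched numerically. This is a deliberately simplified scalar version (an isotropic 1/(4πμr) kernel convolved with a Gaussian focal force field), not the paper's full Somigliana tensor with time-of-arrival weighting; grid size, shear modulus, and force amplitude are illustrative.

```python
import numpy as np

# Scalar sketch: displacement ~ 3D convolution of a Green's-function kernel
# G(r) = 1/(4*pi*mu*r) with a Gaussian radiation-force distribution.
n = 32
mu = 2.0e3                      # assumed shear modulus, Pa
dx = 1.0e-3                     # grid spacing, m
ax = (np.arange(n) - n // 2) * dx
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
r = np.sqrt(X**2 + Y**2 + Z**2)
r[r == 0] = dx / 2              # regularize the singularity at r = 0
G = 1.0 / (4 * np.pi * mu * r)

# Gaussian force field (N/m^3) centred on the grid, mimicking a focal spot
F = 1.0e4 * np.exp(-(X**2 + Y**2 + Z**2) / (2 * (1.5e-3) ** 2))

# Linear (zero-padded) convolution via FFT, then crop back to the grid
m = 2 * n
u = np.fft.irfftn(np.fft.rfftn(G, (m, m, m)) * np.fft.rfftn(F, (m, m, m)), (m, m, m))
u = u[n // 2 : n // 2 + n, n // 2 : n // 2 + n, n // 2 : n // 2 + n] * dx**3

peak = np.unravel_index(np.argmax(u), u.shape)  # displacement peaks at the focus
```

The tensor formulation replaces the scalar `G` with the appropriate Green's-function component per displacement direction, but the FFT-convolution structure is the same.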

  12. Real-time surgical simulation for deformable soft-tissue objects with a tumour using Boundary Element techniques

    Science.gov (United States)

    Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.

    2009-08-01

    A virtual-reality real-time simulation of surgical operations that incorporates a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.

  13. Real-time surgical simulation for deformable soft-tissue objects with a tumour using Boundary Element techniques

    International Nuclear Information System (INIS)

    Wang, P; Becker, A A; Jones, I A; Glover, A T; Benford, S D; Vloeberghs, M

    2009-01-01

    A virtual-reality real-time simulation of surgical operations that incorporates a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.

  14. Experience with the Large Eddy Simulation (LES) Technique for the Modelling of Premixed and Non-premixed Combustion

    OpenAIRE

    Malalasekera, W; Ibrahim, SS; Masri, AR; Gubba, SR; Sadasivuni, SK

    2013-01-01

    Compared to RANS-based combustion modelling, the Large Eddy Simulation (LES) technique has recently emerged as a more accurate and very adaptable technique for handling complex turbulent interactions in combustion modelling problems. In this paper, the application of LES-based combustion modelling techniques and the validation of models in non-premixed and premixed situations are considered. Two well-defined experimental configurations where high-quality data are available for validation is...

  15. Verification of time-delay interferometry techniques using the University of Florida LISA interferometry simulator

    Energy Technology Data Exchange (ETDEWEB)

    Mitryk, Shawn J; Wand, Vinzenz; Mueller, Guido, E-mail: smitryk@phys.ufl.ed, E-mail: mueller@phys.ufl.ed [Department of Physics, University of Florida, PO Box 118440, Gainesville, FL 32611-8440 (United States)

    2010-04-21

    The Laser Interferometer Space Antenna (LISA) is a cooperative NASA/ESA mission proposed to directly measure gravitational waves (GW) in the frequency range from 30 μHz to 1 Hz with an optimal strain sensitivity of 10⁻²¹/√Hz at 3 mHz. LISA will utilize a modified Michelson interferometer to measure length changes of 40 pm/√Hz between drag-free proof masses located on three separate spacecraft (SC) separated by a distance of 5 Gm. The University of Florida has developed a hardware-in-the-loop simulator of the LISA constellation to verify the laser noise cancellation technique known as time-delay interferometry (TDI). We replicate the frequency stabilization of the laser on the local SC and the phase-locking of the lasers on the far SC. The laser photodetector beatnotes are electronically delayed, Doppler shifted, and combined with a mock GW signal to simulate the laser link between the SC. The beatnotes are also measured with a LISA-like phasemeter, and the data are used to extract the laser phase and residual phase-lock loop noise in post-processing through TDI. This uncovers the GW modulation signal buried under the laser noise. The results are then compared to the requirements defined by the LISA science collaboration.
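    TDI in miniature can be demonstrated with synthetic data: two round-trip beat notes share the same laser phase noise, and the first-generation Michelson X combination cancels that noise exactly even for unequal (constant) arm delays, leaving only the injected mock GW signal. Integer-sample delays and all amplitudes here are illustrative, not the simulator's parameters.

```python
import numpy as np

rng = np.random.default_rng(3)
N, d1, d2 = 20_000, 40, 47           # samples; round-trip delays of arms 1 and 2
p = rng.normal(0.0, 1.0, N)          # common laser phase noise
gw = 1e-3 * np.sin(2 * np.pi * 0.005 * np.arange(N))  # mock GW signal in arm 1

def delay(x, k):
    """Delay a signal by k samples (zero-padded at the start)."""
    y = np.zeros_like(x)
    y[k:] = x[:-k]
    return y

s1 = delay(p, d1) - p + gw           # arm-1 round-trip beat note
s2 = delay(p, d2) - p                # arm-2 round-trip beat note

# First-generation Michelson X: (1 - D2) s1 - (1 - D1) s2, where Di delays
# by the arm-i round trip. All p(t) terms cancel algebraically.
Xc = (s1 - delay(s1, d2)) - (s2 - delay(s2, d1))
residual = Xc[d1 + d2:]              # skip the zero-padding start-up transient
```

The residual retains the GW term `gw(t) - gw(t - d2)` at the 1e-3 level, three orders of magnitude below the raw beat-note noise.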

  16. MATLAB Simulation of UPQC for Power Quality Mitigation Using an Ant Colony Based Fuzzy Control Technique.

    Science.gov (United States)

    Kumarasabapathy, N; Manoharan, P S

    2015-01-01

    This paper proposes a fuzzy logic based new control scheme for the Unified Power Quality Conditioner (UPQC) for minimizing the voltage sag and total harmonic distortion in the distribution system and consequently improving the power quality. UPQC is a recent power electronic module which guarantees better power quality mitigation, as it has both series-active and shunt-active power filters (APFs). The fuzzy logic controller has recently attracted a great deal of attention and conceptually possesses the quality of simplicity in tackling complex systems with vagueness and ambiguity. In this research, the fuzzy logic controller is utilized for the generation of the reference signal controlling the UPQC. To enable this, a systematic approach for creating the fuzzy membership functions is carried out by using an ant colony optimization technique for optimal fuzzy logic control. An exhaustive simulation study using MATLAB/Simulink is carried out to investigate and demonstrate the performance of the proposed fuzzy logic controller, and the simulation results are compared with a PI controller in terms of its performance in improving the power quality by minimizing the voltage sag and total harmonic distortion.
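    The ant-colony optimization of the membership functions is beyond a short sketch, but the fuzzy inference core itself is compact: triangular memberships over the voltage error, singleton rule consequents, and centroid defuzzification. All shapes and breakpoints below are illustrative placeholders, not the paper's optimized values.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_command(err):
    """Map a voltage error (valid range [-2, 2] p.u. here) to a compensation command."""
    # Memberships for the error: negative, zero, positive
    m_neg  = tri(err, -2.0, -1.0, 0.0)
    m_zero = tri(err, -1.0, 0.0, 1.0)
    m_pos  = tri(err, 0.0, 1.0, 2.0)
    weights = np.array([m_neg, m_zero, m_pos])
    outputs = np.array([-1.0, 0.0, 1.0])   # singleton consequents: inject -1/0/+1 p.u.
    return float((weights * outputs).sum() / weights.sum())  # centroid defuzzification
```

A reference-signal generator would evaluate `fuzzy_command` on the measured error each control cycle; the optimization stage then tunes the triangle breakpoints instead of hand-picking them.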

  17. Modeling and Simulation of Voids in Composite Tape Winding Process Based on Domain Superposition Technique

    Science.gov (United States)

    Deng, Bo; Shi, Yaoyao

    2017-11-01

    Tape winding technology is an effective way to fabricate rotationally symmetric composite products. Nevertheless, some unavoidable defects seriously influence the performance of winding products. One of the crucial ways to assess the quality of fiber-reinforced composite products is to examine their void content, and significant improvement in a product's mechanical properties can be achieved by minimizing void defects. Two methods were applied in this study, finite element analysis and experimental testing, to investigate the mechanism of void formation in the composite tape winding process. Based on the theories of interlayer intimate contact and the Domain Superposition Technique (DST), a three-dimensional model of prepreg tape voids was built in SolidWorks. The ABAQUS simulation software was then used to simulate how the void content changes with pressure and temperature. Finally, a series of experiments was performed to determine the accuracy of the model-based predictions. The results showed that the model is effective for predicting the void content in the composite tape winding process.

  18. MATLAB Simulation of UPQC for Power Quality Mitigation Using an Ant Colony Based Fuzzy Control Technique

    Directory of Open Access Journals (Sweden)

    N. Kumarasabapathy

    2015-01-01

    Full Text Available This paper proposes a fuzzy logic based new control scheme for the Unified Power Quality Conditioner (UPQC) for minimizing the voltage sag and total harmonic distortion in the distribution system and consequently improving the power quality. UPQC is a recent power electronic module which guarantees better power quality mitigation, as it has both series-active and shunt-active power filters (APFs). The fuzzy logic controller has recently attracted a great deal of attention and conceptually possesses the quality of simplicity in tackling complex systems with vagueness and ambiguity. In this research, the fuzzy logic controller is utilized for the generation of the reference signal controlling the UPQC. To enable this, a systematic approach for creating the fuzzy membership functions is carried out by using an ant colony optimization technique for optimal fuzzy logic control. An exhaustive simulation study using MATLAB/Simulink is carried out to investigate and demonstrate the performance of the proposed fuzzy logic controller, and the simulation results are compared with a PI controller in terms of its performance in improving the power quality by minimizing the voltage sag and total harmonic distortion.

  19. Bioavailability and distribution of ceria nanoparticles in simulated aquatic ecosystems, quantification with a radiotracer technique

    International Nuclear Information System (INIS)

    Zhang Zhiyong; Zhang Peng; He Xiao; Ma Yuhui; Lu Kai; Zhao Yuliang

    2014-01-01

    Although the presence of manufactured nanoparticles in the aquatic environment is still largely undocumented, their release could certainly occur in the future, particularly via municipal treatment plant effluents of cities supporting nano-industries. To get an initial estimate of the environmental behavior of nanomaterials, we investigated the distribution and accumulation of ceria nanoparticles in simulated aquatic ecosystems which included aquatic plant, shellfish, fish, water, and sediment using a radiotracer technique. Radioactive ceria (¹⁴¹CeO₂) nanoparticles with a diameter of ca. 7 nm were synthesized by a precipitation method and added to the simulated aquatic ecosystems. The results indicate that the concentration of ceria nanoparticles in water decreased to a steady-state value after 3 days; meanwhile, the concentrations of ceria nanoparticles in the aquatic plant and sediment increased to their highest values. The distribution and accumulation characteristics of ceria nanoparticles in various aquatic organisms were different. Ceratophyllum demersum showed a high ability to accumulate ceria nanoparticles from water. (authors)
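    The water-to-plant/sediment partitioning described above can be caricatured with a first-order compartment model integrated by explicit Euler steps. The rate constants and compartment structure are invented for illustration; the study's actual kinetics come from the ¹⁴¹Ce measurements.

```python
# First-order compartment sketch: tracer leaves the water column to the
# sediment (one-way sink) and exchanges with the aquatic plant.
k_ws, k_wp, k_pw = 1.2, 0.8, 0.2   # assumed rate constants, 1/day
dt, days = 0.01, 10.0
n_steps = int(days / dt)

water, plant, sediment = 1.0, 0.0, 0.0   # relative tracer activity
hist = []
for _ in range(n_steps):
    d_ws = k_ws * water * dt       # water -> sediment
    d_wp = k_wp * water * dt       # water -> plant
    d_pw = k_pw * plant * dt       # plant -> water
    water += d_pw - d_ws - d_wp
    plant += d_wp - d_pw
    sediment += d_ws
    hist.append(water)
```

Total activity is conserved by construction, the water concentration declines toward a small value, and the plant and sediment compartments accumulate the tracer, mirroring the qualitative behavior reported in the abstract.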

  20. Accelerating all-atom MD simulations of lipids using a modified virtual-sites technique

    DEFF Research Database (Denmark)

    Loubet, Bastien; Kopec, Wojciech; Khandelia, Himanshu

    2014-01-01

    We present two new implementations of the virtual-sites technique which completely suppress the degrees of freedom of the hydrogen atoms in a lipid bilayer, allowing for an increased time step of 5 fs in all-atom simulations with the CHARMM36 force field. One of our approaches uses the derivation … of the virtual sites used in GROMACS, while the other uses a new definition of the virtual sites of the CH2 groups. Our method is tested on DPPC (no unsaturated chain), POPC (one unsaturated chain), and DOPC (two unsaturated chains) lipid bilayers. We calculate various physical properties of the membrane … of our simulations with and without virtual sites and explain the differences and similarities observed. The best agreement is obtained for the original GROMACS virtual sites on the DOPC bilayer, where we obtain an area per lipid of 67.3 ± 0.3 Å² without virtual sites and 67.6 ± 0.3 Å² with virtual sites...

  1. Use of stimulus-response techniques for simulated diagnostics in the human esophagus

    International Nuclear Information System (INIS)

    Rodriguez, I.; Gonzalez, Y.; Valdes, L.; Alfonso, J.A.; Estevez, E.

    2003-01-01

    This work presents a simulation study of the gammagraphic studies of the human esophagus carried out in the Department of Nuclear Medicine of the 'Celestino Hernandez Robau' Hospital of Santa Clara. For the investigation, tubular reactors were used, and stimulus-response techniques were applied with a radioactive tracer of metastable technetium-99 at an activity of 1 mCi and several flow rates. The residence time distribution curves were obtained, which follow an equation of the type Y = A + B exp(-exp((x-C)/D) - (x-C)/D + 1). Optimization studies of the tracer dose given to the patients were also carried out, from 1 mCi (the activity used in clinical studies) down to 0.5 mCi, and the influence on the obtained residence time distributions was analyzed. The possibility of lowering the dose while retaining a clear signal was confirmed. A simulation of the attenuation of the radiation that takes place in the patient, caused by the tissue interposed between the analyzed organ and the detection equipment, was also carried out, using paraffin to simulate tissue. The intensity of the radiation was found to be almost independent of the thickness for the assayed doses. Finally, a more complex mathematical model was found that reproduces the diagnostic curves obtained in these studies, with the model coefficients correlated with the most important physical parameters of the system, giving it practical and useful value, since the error between its predictions and the experimental values does not exceed 5%. (Author)
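    The basic quantity extracted from such a pulse (stimulus-response) tracer experiment is the mean residence time: normalize the detector response C(t) to E(t) = C(t)/∫C dt, then t_mean = ∫ t·E(t) dt. The curve below is a synthetic two-tanks-in-series response with a known mean of 2τ = 24 s, not hospital data.

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal-rule integral of sampled y(x)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

t = np.linspace(0.0, 150.0, 1501)        # s
tau = 12.0
C = (t / tau**2) * np.exp(-t / tau)      # gamma-shaped tracer response, mean 2*tau
E = C / trapezoid(C, t)                  # normalized residence-time distribution
t_mean = trapezoid(t * E, t)             # first moment -> mean residence time
```

Higher moments of E(t) (variance, skew) are computed the same way and are what the fitted model coefficients A-D are correlated against.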

  2. The influence of wheelchair propulsion technique on upper extremity muscle demand: a simulation study.

    Science.gov (United States)

    Rankin, Jeffery W; Kwarciak, Andrew M; Richter, W Mark; Neptune, Richard R

    2012-11-01

    The majority of manual wheelchair users will experience upper extremity injuries or pain, in part due to the high force requirements, repetitive motion and extreme joint postures associated with wheelchair propulsion. Recent studies have identified cadence, contact angle and peak force as important factors for reducing upper extremity demand during propulsion. However, studies often make comparisons between populations (e.g., able-bodied vs. paraplegic) or do not investigate specific measures of upper extremity demand. The purpose of this study was to use a musculoskeletal model and forward dynamics simulations of wheelchair propulsion to investigate how altering cadence, peak force and contact angle influence individual muscle demand. Forward dynamics simulations of wheelchair propulsion were generated to emulate group-averaged experimental data during four conditions: 1) self-selected propulsion technique, and while 2) minimizing cadence, 3) maximizing contact angle, and 4) minimizing peak force using biofeedback. Simulations were used to determine individual muscle mechanical power and stress as measures of muscle demand. Minimizing peak force and cadence had the lowest muscle power requirements. However, minimizing peak force increased cadence and recovery power, while minimizing cadence increased average muscle stress. Maximizing contact angle increased muscle stress and had the highest muscle power requirements. Minimizing cadence appears to have the most potential for reducing muscle demand and fatigue, which could decrease upper extremity injuries and pain. However, altering any of these variables to extreme values appears to be less effective; instead small to moderate changes may better reduce overall muscle demand. Copyright © 2012 Elsevier Ltd. All rights reserved.
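    The two per-muscle demand measures named above reduce to simple formulas: mechanical power is muscle force times contraction velocity, and muscle stress is force divided by physiological cross-sectional area (PCSA). The numbers below are illustrative only, not values from the simulations.

```python
# Illustrative per-muscle demand measures
force = 120.0      # N, instantaneous muscle force
velocity = 0.15    # m/s, shortening velocity
pcsa = 8.0e-4      # m^2, physiological cross-sectional area

power = force * velocity   # mechanical power, W
stress = force / pcsa      # muscle stress, Pa
```

In the study these are evaluated per muscle over the propulsion cycle, so a condition that lowers average power can still raise average stress, which is why the two measures can rank the biofeedback conditions differently.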

  3. "Photographing money" task pricing

    Science.gov (United States)

    Jia, Zhongxiang

    2018-05-01

    "Photographing money" [1]is a self-service model under the mobile Internet. The task pricing is reasonable, related to the success of the commodity inspection. First of all, we analyzed the position of the mission and the membership, and introduced the factor of membership density, considering the influence of the number of members around the mission on the pricing. Multivariate regression of task location and membership density using MATLAB to establish the mathematical model of task pricing. At the same time, we can see from the life experience that membership reputation and the intensity of the task will also affect the pricing, and the data of the task success point is more reliable. Therefore, the successful point of the task is selected, and its reputation, task density, membership density and Multiple regression of task positions, according to which a nhew task pricing program. Finally, an objective evaluation is given of the advantages and disadvantages of the established model and solution method, and the improved method is pointed out.

  4. Martin Parr in Mexico: Does Photographic Style Translate?

    Directory of Open Access Journals (Sweden)

    Timothy R. Gleason

    2011-11-01

    Full Text Available This study analyzes Martin Parr’s 2006 photobook, Mexico. Parr is a British documentary photographer best known for a direct photographic style that reflects upon “Englishness.” Mexico is his attempt to understand this foreign country via his camera. Mexico, as a research subject, is not a problem to solve but an opportunity to understand a photographer’s work. Parr’s Mexico photography (technique, photographic content, and interest in globalization, economics, and culture) is compared to his previous work to explain how Parr uses fashion and icons to represent a culture or class. This article argues that Parr’s primary subjects, heads/hats, food, and Christs, are photographed without excessive aesthetic pretensions so that the thrust of Parr’s message about globalization can be more evident: Mexico maintains many of its traditions and icons while adopting American brands.

  5. Improvements in televisual and photographic inspections of AGRs

    International Nuclear Information System (INIS)

    Hayter, R.; Wadsworth, A.

    1988-01-01

    The visual inspection techniques and equipment used at AGR power stations have been improved and updated in the light of need and technological advance, with new equipment being developed and introduced where necessary. Specifically, this report covers the development and use of: a short TRIUMPH-compatible photographic camera, 600 mm long x 75 mm diameter, taking 50 shots on 35 mm film; a 240 mm diameter photographic pod taking high-quality 70 mm format photographs of large in-reactor volumes; and a photographic camera of cross section 37 x 17 mm for the inspection of helically wound AGR boilers, along with the subsequent development of this latter device into a state-of-the-art TV inspection camera. (author)

  6. Study of flow characteristics in a secondary clarifier by numerical simulation and radioisotope tracer technique

    International Nuclear Information System (INIS)

    Kim, H.S.; Shin, M.S.; Jang, D.S.; Jung, S.H.; Jin, J.H.

    2005-01-01

    Numerical simulation in 2-D rectangular coordinates and an experimental study have been performed to determine the flow characteristics and concentration distribution of a large-scale rectangular final clarifier at a wastewater treatment facility located in Busan, South Korea. The purpose of the numerical calculation is to verify the data measured experimentally by the radioisotope tracer technique and, further, to understand the important physical features occurring in a large-scale clarifier, which in many cases cannot be captured with a limited number of experimental data points. To this end, a comprehensive computer program was built around Patankar's SIMPLE algorithm, with special emphasis on the parametric evaluation of the various phenomenological models. Calculation results are successfully evaluated against experimental data obtained by the radioisotope tracer method. A detailed comparison is made between the calculated residence time distribution (RTD) curves and measurements inside the clarifier as well as at the outlet. Furthermore, the calculations reproduce the well-known characteristics of clarifier flow, such as the waterfall phenomenon at the front end of the clarifier, the bottom density current in the settling zone, and the upward flow in the withdrawal zone. It is therefore believed that the flow calculation program and the radioisotope measurement technique employed in this study show high potential as a complementary tool to experiment in this area.

  7. Determination of true coincidence correction factors using Monte-Carlo simulation techniques

    Directory of Open Access Journals (Sweden)

    Chionis Dionysios A.

    2014-01-01

    Full Text Available The aim of this work is the numerical calculation of true coincidence correction factors by means of Monte Carlo simulation techniques. For this purpose, the Monte Carlo computer code PENELOPE was used, and the main program PENMAIN was modified to include the effect of the true coincidence phenomenon. The modified main program was used to determine the full-energy-peak efficiency of an XtRa Ge detector with a relative efficiency of 104%, and the results obtained for the 1173 keV and 1332 keV photons of 60Co were found to be consistent with the respective experimental ones. The true coincidence correction factors were calculated as the ratio of the full-energy-peak efficiencies determined from the original main program PENMAIN and the modified main program PENMAIN. The developed technique was applied to 57Co, 88Y, and 134Cs and to two source-to-detector geometries. The results obtained were compared with true coincidence correction factors calculated with the "TrueCoinc" program, and the relative bias was found to be less than 2%, 4%, and 8% for 57Co, 88Y, and 134Cs, respectively.
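    A toy Monte Carlo shows where the correction factor comes from: in a two-photon cascade, a gamma-1 full-energy event is lost from its peak whenever gamma-2 deposits any energy in the detector at the same time (summing-out), so the correction is the ratio of the true to the apparent peak efficiency. The efficiencies below are made-up numbers, not the XtRa detector's values.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1_000_000
eps1_peak = 0.05     # assumed full-energy-peak efficiency for gamma 1
eps2_total = 0.12    # assumed total efficiency for the coincident gamma 2

g1_in_peak = rng.random(N) < eps1_peak      # gamma 1 deposits its full energy
g2_detected = rng.random(N) < eps2_total    # gamma 2 deposits anything at all

apparent = np.mean(g1_in_peak & ~g2_detected)   # measured (summed-out) peak efficiency
ctc = eps1_peak / apparent                      # true coincidence correction factor
# Analytically, ctc -> 1 / (1 - eps2_total) for this simple cascade
```

The full PENELOPE-based calculation replaces the two Bernoulli draws with tracked photon histories, but the correction factor is extracted as the same efficiency ratio.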

  8. Comparisons of Particle Tracking Techniques and Galerkin Finite Element Methods in Flow Simulations on Watershed Scales

    Science.gov (United States)

    Shih, D.; Yeh, G.

    2009-12-01

    This paper applies two numerical approximations, the particle tracking technique and the Galerkin finite element method, to solve the diffusive wave equation in both one-dimensional and two-dimensional flow simulations. The finite element method is one of the most commonly used approaches for numerical problems. It can obtain accurate solutions, but calculation times may be rather extensive. The particle tracking technique, using either single-velocity or average-velocity tracks to efficiently perform advective transport, can use larger time-step sizes than the finite element method and thereby significantly save computational time. Comparisons of the alternative approximations are examined in this poster. We adapt the model WASH123D to carry out this work. WASH123D, an integrated multimedia, multi-process, physics-based computational model suitable for various spatial and temporal scales, was first developed by Yeh et al. in 1998. The model has evolved in design capability and flexibility, and has been used for model calibrations and validations over the course of many years. In order to deliver a local hydrological model for Taiwan, the Taiwan Typhoon and Flood Research Institute (TTFRI) is working with Prof. Yeh to develop the next version of WASH123D. The work of our preliminary cooperation is also sketched in this poster.

  9. A Simulation Technique for Three-Dimensional Mechanical Systems Using Universal Software Systems of Analysis

    Directory of Open Access Journals (Sweden)

    V. A. Trudonoshin

    2015-01-01

    Full Text Available The article proposes a technique for developing mathematical models (MM) of elements of three-dimensional (3D) mechanical systems for universal simulation software systems that automatically generate the MM of a system from the MMs of its elements and their connections. The technique is based on the MM of a 3D body. Linear and angular velocities are used as the main phase variables (unknowns) in the MM of the system, and linear and angular displacements as additional ones, the latter being defined by normalized quaternions, which have computational advantages over rotation angles. The paper considers the equations of dynamics and the formulas for transforming from the global coordinate system to the local one and vice versa. A spherical movable joint is presented as an example of an interaction element between bodies. The paper shows the MM equivalent circuits of a body and of a spherical joint. Such a representation, as an equivalent circuit, makes it possible to obtain the topological equations of the system automatically. Various options for building the equations of the joint and advice for their practical use are given.
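    The quaternion-based orientation handling mentioned above rests on one operation: rotating a vector with a normalized quaternion via the Hamilton product, q p q*. A minimal sketch (generic quaternion algebra, not the article's specific model equations):

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotate(v, axis, angle):
    """Rotate vector v by `angle` about `axis` using a normalized quaternion."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    p = np.concatenate([[0.0], v])          # embed v as a pure quaternion
    return quat_mul(quat_mul(q, p), q_conj)[1:]
```

Unlike rotation angles, the quaternion representation has no gimbal-lock singularity, and keeping it normalized (one constraint on four numbers) is cheaper and numerically more robust than re-orthogonalizing a rotation matrix, which is the computational advantage the abstract alludes to.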

  10. Applications Of Monte Carlo Radiation Transport Simulation Techniques For Predicting Single Event Effects In Microelectronics

    International Nuclear Information System (INIS)

    Warren, Kevin; Reed, Robert; Weller, Robert; Mendenhall, Marcus; Sierawski, Brian; Schrimpf, Ronald

    2011-01-01

    MRED (Monte Carlo Radiative Energy Deposition) is Vanderbilt University's Geant4 application for simulating radiation events in semiconductors. Geant4 comprises the best available computational physics models for the transport of radiation through matter. In addition to the basic radiation transport physics contained in the Geant4 core, MRED can track energy loss in tetrahedral geometric objects, includes a cross-section biasing and track weighting technique for variance reduction, and provides additional features relevant to semiconductor device applications. The crucial element in predicting Single Event Upset (SEU) parameters using radiation transport software is the creation of a dosimetry model that accurately approximates the net collected charge at transistor contacts as a function of deposited energy. The dosimetry technique described here is the multiple sensitive volume (MSV) model. It is shown to be a reasonable approximation of the charge collection process, and its parameters can be calibrated to experimental measurements of SEU cross sections. The MSV model, within the framework of MRED, is examined for heavy ion and high-energy proton SEU measurements of a static random access memory.
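    As a rough illustration of the multiple-sensitive-volume idea, the collected charge can be modeled as an efficiency-weighted sum of the energy deposited in nested volumes. The volume weights and energy deposits below are hypothetical; only the silicon pair-creation energy (about 3.6 eV per electron-hole pair) is a physical constant.

```python
E_PER_PAIR_EV = 3.6            # silicon: ~3.6 eV deposited per e-h pair
ELECTRON_CHARGE_FC = 1.602e-4  # elementary charge in femtocoulombs

def collected_charge_fc(deposits_mev, efficiencies):
    """MSV-style estimate: each sensitive volume i contributes its
    deposited energy E_i weighted by a charge-collection efficiency
    alpha_i; the weighted energy is converted to charge in fC."""
    pairs = sum(alpha * e_mev * 1.0e6 / E_PER_PAIR_EV
                for alpha, e_mev in zip(efficiencies, deposits_mev))
    return pairs * ELECTRON_CHARGE_FC

# Hypothetical two-volume case: an inner volume collecting fully and an
# outer volume collecting 40% of its deposited charge.
q_fc = collected_charge_fc(deposits_mev=[0.5, 0.2], efficiencies=[1.0, 0.4])
```

    In a calibration against measured SEU cross sections, the efficiencies (and the volumes themselves) would be the fitted parameters.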

  11. A new and efficient transient noise analysis technique for simulation of CCD image sensors or particle detectors

    International Nuclear Information System (INIS)

    Bolcato, P.; Jarron, P.; Poujois, R.

    1993-01-01

    CCD image sensors and switched-capacitor circuits used for particle detectors have an inherent noise level that limits the resolution of the detector. A new noise simulation technique for these devices is presented that has been implemented in the circuit simulator ELDO. The approach is particularly useful for noise simulation in analog sampling circuits. Simulations have been compared with experimental results for a 1.5 μm CMOS current-mode amplifier designed for high-rate particle detectors. (R.P.) 5 refs., 7 figs

  12. Digitization of natural objects with micro CT and photographs.

    Science.gov (United States)

    Ijiri, Takashi; Todo, Hideki; Hirabayashi, Akira; Kohiyama, Kenji; Dobashi, Yoshinori

    2018-01-01

    In this paper, we present a three-dimensional (3D) digitization technique for natural objects, such as insects and plants. The key idea is to combine X-ray computed tomography (CT) and photographs to obtain both complicated 3D shapes and surface textures of target specimens. We measure a specimen by using an X-ray CT device and a digital camera to obtain a CT volumetric image (volume) and multiple photographs. We then reconstruct a 3D model by segmenting the CT volume and generate a texture by projecting the photographs onto the model. To achieve this reconstruction, we introduce a technique for estimating a camera position for each photograph. We also present techniques for merging multiple textures generated from multiple photographs and recovering missing texture areas caused by occlusion. We illustrate the feasibility of our 3D digitization technique by digitizing 3D textured models of insects and flowers. The combination of X-ray CT and a digital camera makes it possible to successfully digitize specimens with complicated 3D structures accurately and allows us to browse both surface colors and internal structures.

  13. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system, which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, complete automation would mean plugging the operational deliverable into the simulator, pressing the start button, executing the test procedure, accumulating and analyzing the data, scoring the results, and reporting the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation together with test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and is currently being used to test some POCC/MOC software deliveries. When this capability is fully operational, it should greatly reduce the time necessary

  14. Effects of radiation on photographic film. A study

    International Nuclear Information System (INIS)

    Dutton, D.M.

    1971-01-01

    This study of the effects of radiation on photographic film is related to the Nevada Test Site's underground nuclear testing program, which has been active since implementation of the Limited Test Ban Treaty of 1963. Residual radioactivity, which has accidentally been released on several tests, adversely affects the photographic film used in test data acquisition. The report defines this problem in terms of radiation-caused image degradation, radiation/matter interactions, types of radiation released by accidental venting, and the photographic effects of gamma and x radiation. Techniques and experimental findings are documented that may be useful in recovering information from radiation-fogged film. Techniques discussed include processing methods, shielding, image enhancement techniques, and operational handling of potentially irradiated film. (U.S.)

  15. Efficient CT simulation of the four-field technique for conformal radiotherapy of prostate carcinoma

    International Nuclear Information System (INIS)

    Valicenti, Richard K.; Waterman, Frank M.; Croce, Raymond J.; Corn, Benjamin; Suntharalingam, Nagalingam; Curran, Walter J.

    1997-01-01

    Purpose: Conformal radiotherapy of prostate carcinoma relies on contouring of individual CT slices for target and normal tissue localization. This process can be very time consuming. In the present report, we describe a method to localize pelvic anatomy more efficiently, directly from digitally reconstructed radiographs (DRRs). Materials and Methods: Ten patients with prostate carcinoma underwent CT simulation (spiral mode at 3 mm separation) for conformal four-field 'box' radiotherapy. The bulbous urethra and bladder were opacified with iodinated contrast media. On lateral and anteroposterior DRRs, the volume of interest (VOI) was restricted to a 1.0-1.5 cm tissue thickness to optimize digital radiograph reconstruction of the prostate and seminal vesicles. By removing unessential voxel elements, this method provided direct visualization of those structures. For comparison, the targets for each patient were also obtained by contouring CT axial slices. Results: The method was considered successful if the target structures were readily visualized and geometrically corresponded to those generated by contouring axial images. The targets in 9 of 10 patients were reliable representations of the CT-contoured volumes. One patient showed an 18 mm variation due to the lack of bladder opacification. Using VOIs to generate thin-tissue DRRs, the time required for target and normal tissue localization was on average less than 5 min. Conclusion: In CT simulation of the four-field irradiation technique for prostate carcinoma, thin-tissue DRRs allowed efficient and accurate target localization without requiring individual axial image contouring. This method may facilitate positioning of the beam isocenter and provide reliable conformal radiotherapy

  16. Comparative evaluation of a two stroke compressed natural gas mixer design using simulation and experimental techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ramasamy, D.; Bakar, R.A.; Rahim, M.F.; Noor, M.M. [Malaysia Pahang Univ., Pahang (Malaysia). Automotive Focus Group

    2008-07-01

    A study was conducted in which a two-stroke engine was converted for bi-fuel use, notably compressed natural gas (CNG) and gasoline. The excessive by-products generated by two-stroke engine combustion can be attributed to the inefficient combustion process. This prototype uniflow-type single-cylinder engine was equipped with a bi-fuel conversion system. A dedicated mixer was also developed to meter the gaseous fuel through the engine intake system. It was designed to meet air and fuel requirements similar to those of its gasoline counterpart. The mixer was modeled to obtain the optimum orifice diameter using three different throat sizes of 14, 16 and 18 mm. A standard computational fluid dynamics (CFD) software package was used to simulate the flow. A pressure reading was obtained during the prototype test. The pressure drop across the venturi was shown to be an important parameter, as it determines the actual fuel-air ratio in the running engine. Good agreement between the CFD outputs and the experimental outputs was recorded. The experimental technique validated the pressure distribution predicted by CFD for the effects of the three insert rings in the CNG mixer. The simulation exercise can be used to predict the amount of CNG consumed by the engine. It was concluded that the 14 mm throat ring was best suited for the CNG mixer because it provided the best suction. Testing the mixer on a real engine will clear any remaining doubts as to whether the throat can function at high engine speeds. 5 refs., 3 tabs., 8 figs.
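    The role of the venturi pressure drop can be sketched with a simple Bernoulli estimate (assuming incompressible, loss-free flow); the air density, flow rate and inlet diameter below are illustrative values, not the paper's data.

```python
from math import pi

def venturi_pressure_drop(rho, q_flow, d_inlet, d_throat):
    """Bernoulli estimate of the static pressure drop (Pa) between the
    inlet and the throat of a venturi for volumetric flow q_flow (m3/s):
    dp = 0.5 * rho * (v_throat**2 - v_inlet**2)."""
    a_in = pi * d_inlet ** 2 / 4.0
    a_th = pi * d_throat ** 2 / 4.0
    v_in, v_th = q_flow / a_in, q_flow / a_th
    return 0.5 * rho * (v_th ** 2 - v_in ** 2)

# A smaller throat gives a larger depression and hence stronger gas
# suction, consistent with the 14 mm ring performing best.
dp_14 = venturi_pressure_drop(rho=1.2, q_flow=0.005, d_inlet=0.030, d_throat=0.014)
dp_16 = venturi_pressure_drop(rho=1.2, q_flow=0.005, d_inlet=0.030, d_throat=0.016)
```

    The depression at the throat is what draws the gaseous fuel into the air stream, so it sets the effective fuel-air ratio at a given engine speed.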

  17. Adobe Photoshop CS6 for photographers

    CERN Document Server

    Evening, Martin

    2012-01-01

    Renowned photographer and Photoshop hall-of-famer Martin Evening returns with his comprehensive guide to Photoshop. This acclaimed work covers everything from the core aspects of working in Photoshop to advanced techniques for refined workflows and professional results. Using concise advice, clear instruction and real-world examples, this essential guide will give you the skills, regardless of your experience, to create professional-quality results. A robust accompanying website features sample images, tutorial videos, bonus chapters and a plethora of extra resources. Quite simply, this is

  18. Adobe Photoshop CS5 for photographers

    CERN Document Server

    Evening, Martin

    2010-01-01

    With the new edition of this proven bestseller, Photoshop users can master the power of Photoshop CS5 with internationally renowned photographer and Photoshop hall-of-famer Martin Evening by their side.  In this acclaimed reference work, Martin covers everything from the core aspects of working in Photoshop to advanced techniques for professional results. Subjects covered include organizing a digital workflow, improving creativity, output, automating Photoshop, and using Camera RAW. The style of the book is extremely clear, with real examples, diagrams, illustrations, and step-by-step ex

  19. Adobe Photoshop Elements 11 for photographers

    CERN Document Server

    Andrews, Philip

    2013-01-01

    To coincide with some of the biggest changes in Photoshop Elements for years, Philip Andrews completely revises his bestselling title to include all the new features of this release. See how the new interface works alongside new tools, techniques and workflows to make editing, enhancing and sharing your pictures easier than ever. And as always, he introduces the changed and improved features with colorful illustrations and the clear step-by-step instruction that has made his books the go-to titles for photographers the world over. In this edition Andrews highlights followi

  20. Development and validation of predictive simulation model of multi-layer repair welding process by temper bead technique

    International Nuclear Information System (INIS)

    Okano, Shigetaka; Miyasaka, Fumikazu; Mochizuki, Masahito; Tanaka, Manabu

    2015-01-01

    Stress corrosion cracking (SCC) has recently been observed in the nickel-base alloy weld metal of dissimilar pipe joints used in pressurized water reactors (PWR). The temper bead technique has been developed as a repair procedure against SCC applicable in cases where post-weld heat treatment (PWHT) is difficult to carry out. It is essential, however, to pass the property and performance qualification test, which confirms the effect of tempering on the mechanical properties of repair welds, before the temper bead technique is actually used in practice. The appropriate welding procedure conditions for the temper bead technique are therefore determined on the basis of the property and performance qualification testing. This is necessary for certifying the structural soundness and reliability of repair welds, but at present it takes a great deal of work and time. It is therefore desirable to establish reasonable alternatives for qualifying the property and performance of repair welds. In this study, mathematical modeling and numerical simulation procedures were developed for predicting the weld bead configuration and temperature distribution during a multi-layer repair welding process by the temper bead technique. In the developed simulation technique, the characteristics of the heat source in temper bead welding are calculated from the weld heat input conditions through an arc plasma simulation, and the weld bead configuration and temperature distribution during temper bead welding are then calculated from the obtained heat source characteristics through a coupled analysis of bead surface shape and thermal conduction. The simulation results were compared with experimental results under the same welding heat input conditions. The bead surface shape and temperature distribution, such as the Ac1 lines, were in good agreement between simulation and experiment. It was concluded that the developed simulation technique has the potential to become useful for

  1. Modeling and simulation of PEM fuel cell's flow channels using CFD techniques

    International Nuclear Information System (INIS)

    Cunha, Edgar F.; Andrade, Alexandre B.; Robalinho, Eric; Bejarano, Martha L.M.; Linardi, Marcelo; Cekinski, Efraim

    2007-01-01

    Fuel cells are among the most important devices for obtaining electrical energy from hydrogen. The Proton Exchange Membrane Fuel Cell (PEMFC) consists of two important parts: the Membrane Electrode Assembly (MEA), where the reactions occur, and the flow field plates. The plates have many functions in a fuel cell: they distribute the reactant gases (hydrogen and air or oxygen), conduct electrical current, remove heat and water from the electrodes and make the cell robust. The cost of the bipolar plates accounts for up to 45% of the total stack cost. Computational Fluid Dynamics (CFD) is a very useful tool for simulating the hydrogen and oxygen gas flow channels, reducing the cost of bipolar plate production and optimizing mass transport. Two types of flow channels were studied. The first was a commercial plate by ELECTROCELL and the other was entirely designed at the Programa de Celula a Combustivel (IPEN/CNEN-SP); the experimental data were compared with the modelling results. Optimum values for each set of variables were obtained and model verification was carried out in order to show the feasibility of this technique for improving fuel cell efficiency. (author)

  2. Analysis and simulation of wireless signal propagation applying geostatistical interpolation techniques

    Science.gov (United States)

    Kolyaie, S.; Yaghooti, M.; Majidi, G.

    2011-12-01

    This paper is part of an ongoing research effort to examine the capability of geostatistical analysis for mobile network coverage prediction, simulation and tuning. Mobile network coverage predictions are used to find network coverage gaps and areas with poor serviceability. They are essential data for engineering and management in order to make better decisions regarding rollout, planning and optimisation of mobile networks. The objective of this research is to evaluate different interpolation techniques for coverage prediction. In the method presented here, raw data collected from drive testing a sample of roads in the study area are analysed and various continuous surfaces are created using different interpolation methods. Two general interpolation methods are used in this paper with different variables: first, Inverse Distance Weighting (IDW) with various powers and numbers of neighbours, and second, ordinary kriging with Gaussian, spherical, circular and exponential semivariogram models with different numbers of neighbours. For the comparison of results, we have used check points coming from the same drive test data. Prediction values for the check points are extracted from each surface and the differences from the actual values are computed. The output of this research helps in finding an optimised and accurate model for coverage prediction.
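    A minimal sketch of the first of the two interpolation methods, IDW, applied to hypothetical drive-test samples (the sample coordinates, signal levels, power and neighbour count below are illustrative, not the paper's data):

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2, k=4):
    """Inverse Distance Weighting: each query point takes the weighted
    mean of its k nearest samples, with weight = 1 / distance**power."""
    out = []
    for q in np.atleast_2d(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() == 0.0:                 # query hits a sample exactly
            out.append(z_known[d.argmin()])
            continue
        idx = np.argsort(d)[:k]
        w = 1.0 / d[idx] ** power
        out.append(np.sum(w * z_known[idx]) / w.sum())
    return np.array(out)

# Hypothetical drive-test samples: (x, y) positions and signal in dBm.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
rssi = np.array([-70.0, -80.0, -75.0, -85.0])
est = idw(pts, rssi, [[0.5, 0.5]])
```

    Raising `power` makes the surface hug the nearest samples more tightly, which is exactly the tuning knob the paper varies alongside the neighbour count.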

  3. Modeling and simulation of PEM fuel cell's flow channels using CFD techniques

    Energy Technology Data Exchange (ETDEWEB)

    Cunha, Edgar F.; Andrade, Alexandre B.; Robalinho, Eric; Bejarano, Martha L.M.; Linardi, Marcelo [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mails: efcunha@ipen.br; abodart@ipen.br; eric@ipen.br; mmora@ipen.br; mlinardi@ipen.br; Cekinski, Efraim [Instituto de Pesquisas Tecnologicas (IPT-SP), Sao Paulo, SP (Brazil)]. E-mail: cekinski@ipt.br

    2007-07-01

    Fuel cells are one of the most important devices to obtain electrical energy from hydrogen. The Proton Exchange Membrane Fuel Cell (PEMFC) consists of two important parts: the Membrane Electrode Assembly (MEA), where the reactions occur, and the flow field plates. The plates have many functions in a fuel cell: distribute reactant gases (hydrogen and air or oxygen), conduct electrical current, remove heat and water from the electrodes and make the cell robust. The cost of the bipolar plates corresponds up to 45% of the total stack costs. The Computational Fluid Dynamic (CFD) is a very useful tool to simulate hydrogen and oxygen gases flow channels, to reduce the costs of bipolar plates production and to optimize mass transport. Two types of flow channels were studied. The first type was a commercial plate by ELECTROCELL and the other was entirely projected at Programa de Celula a Combustivel (IPEN/CNEN-SP) and the experimental data were compared with modelling results. Optimum values for each set of variables were obtained and the models verification was carried out in order to show the feasibility of this technique to improve fuel cell efficiency. (author)

  4. Assessing Uncertainty in Deep Learning Techniques that Identify Atmospheric Rivers in Climate Simulations

    Science.gov (United States)

    Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.

    2017-12-01

    Atmospheric rivers (ARs) can be the difference between California facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture which transport water vapor outside the tropics. When they make landfall, they release the vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have been proven effective in identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare several CNN architectures, tuned with different hyperparameters and training schemes. We compare two-layer, three-layer, four-layer, and sixteen-layer CNNs' ability to recognize ARs in Community Atmosphere Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with everyday images (e.g. benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital in future scientific CNNs, which likely will not have access to large labelled training datasets. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.

  5. Thermoreflectance temperature imaging of integrated circuits: calibration technique and quantitative comparison with integrated sensors and simulations

    International Nuclear Information System (INIS)

    Tessier, G; Polignano, M-L; Pavageau, S; Filloy, C; Fournier, D; Cerutti, F; Mica, I

    2006-01-01

    Camera-based thermoreflectance microscopy is a unique tool for high-spatial-resolution thermal imaging of working integrated circuits. However, a calibration is necessary to obtain quantitative temperatures on the complex surface of integrated circuits. The spatial and temperature resolutions reached by thermoreflectance are excellent (360 nm and 2.5 × 10⁻² K in 1 min here), but the precision is more difficult to assess, notably due to the lack of comparable thermal techniques at submicron scales. We propose here a Peltier-element control of the whole package temperature in order to obtain calibration coefficients simultaneously on several materials visible on the surface of the circuit. Under high magnifications, movements associated with thermal expansion are corrected using a piezoelectric displacement and a software image shift. This calibration method has been validated by comparison with temperatures measured using integrated thermistors and diodes, and by a finite-volume simulation. We show that thermoreflectance measurements agree within a precision of ±2.3% with the on-chip sensor measurements. The diode temperature is found to underestimate the actual temperature of the active area by almost 70% due to the thermal contact of the diode with the substrate, acting as a heat sink
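    The calibration step rests on the linear thermoreflectance relation ΔR/R = κ·ΔT. A minimal sketch of fitting κ from Peltier-controlled temperature steps and then inverting a measured reflectance change; the coefficient and step values below are synthetic, noiseless illustration data, not the paper's measurements.

```python
import numpy as np

def calibrate_kappa(dT, dR_over_R):
    """Least-squares slope of dR/R = kappa * dT through the origin,
    fitted from package temperature steps set by a Peltier element."""
    dT = np.asarray(dT, dtype=float)
    y = np.asarray(dR_over_R, dtype=float)
    return float(dT @ y / (dT @ dT))

# Synthetic calibration run with a hypothetical coefficient.
kappa_true = 2.0e-4                       # 1/K, a typical order of magnitude
dT_steps = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
kappa = calibrate_kappa(dT_steps, kappa_true * dT_steps)

# Inverting an observed relative reflectance change into a temperature rise:
dT_measured = 3.0e-4 / kappa
```

    Because κ differs between the materials on the die surface, the same fit must be repeated per material, which is why the whole-package Peltier control (calibrating all materials simultaneously) is convenient.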

  6. Dynamics of fibres in a turbulent flow field - A particle-level simulation technique

    International Nuclear Information System (INIS)

    Sasic, Srdjan; Almstedt, Alf-Erik

    2010-01-01

    A particle-level simulation technique has been developed for modelling the flow of fibres in a turbulent flow field. A single fibre is conceived here as a chain of segments, enabling the model fibre to have all the degrees of freedom (translation, rotation, bending and twisting) needed to realistically reproduce the dynamics of real fibres. Equations of motion are solved for each segment, accounting for the interaction forces with the fluid, the contact forces with other fibres and the forces that maintain the integrity of the fibre. The motion of the fluid is resolved as a combination of 3D mean flow velocities obtained from a CFD code and fluctuating turbulent velocities derived from the Langevin equation. A case of homogeneous turbulence is treated in this paper. The results obtained show that fibre flocs in air-fibre flows can be created even when attractive forces are not present. In such a case, contacts between fibres, the properties of an individual fibre (such as flexibility and equilibrium shape) and the properties of the flow of the carrying fluid are shown to govern the physics behind the formation and break-up of fibre flocs. Highly irregular fibre shapes and stiff fibres lead to strong flocculation. The modelling framework applied in this work aims to make possible a numerical model applicable to the design of industrial-scale processes involving the transport of fibres by air.
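    The fluctuating turbulent velocity obtained from the Langevin equation can be sketched as an Ornstein-Uhlenbeck update for the velocity seen by a fibre segment; the Lagrangian time scale, turbulence intensity and step size below are arbitrary illustration values, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_step(u, dt, t_lagr, sigma):
    """One step of the Langevin model for the fluctuating fluid velocity
    in homogeneous turbulence: exponential decorrelation over the
    Lagrangian time scale t_lagr, plus a random forcing chosen so the
    stationary variance of each component stays at sigma**2."""
    a = np.exp(-dt / t_lagr)
    return a * u + sigma * np.sqrt(1.0 - a * a) * rng.standard_normal(u.shape)

u = np.zeros(3)
history = []
for _ in range(20000):
    u = langevin_step(u, dt=1e-3, t_lagr=0.05, sigma=0.3)
    history.append(u.copy())
# Discard the spin-up and check the stationary variance per component.
var = np.var(np.array(history)[5000:], axis=0)
```

    In the full model this fluctuating velocity would be added to the 3D mean flow from the CFD code before evaluating the fluid forces on each segment.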

  7. Monte Carlo simulation and gaussian broaden techniques for full energy peak of characteristic X-ray in EDXRF

    International Nuclear Information System (INIS)

    Li Zhe; Liu Min; Shi Rui; Wu Xuemei; Tuo Xianguo

    2012-01-01

    Background: Non-standard analysis (NSA) is one of the most important development directions of energy-dispersive X-ray fluorescence (EDXRF). Purpose: The NSA technique is mainly based on Monte Carlo (MC) simulation and full-energy-peak broadening, which are studied preliminarily in this paper. Methods: An MC model was established for a Si-PIN-based EDXRF setup, and flux spectra were obtained for an iron ore sample. The flux spectra were then broadened using Gaussian broadening parameters calculated by a new method proposed in this paper, and the broadened spectra were compared with measured energy spectra. Results: The MC method can be used to simulate EDXRF measurements and can automatically correct the matrix effects among elements. Peak intensities can be obtained accurately by using the proposed Gaussian broadening technique. Conclusions: This study provides a key technique for EDXRF to achieve advanced NSA technology. (authors)
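    The broadening step can be sketched as smearing each channel of a simulated stick-like flux spectrum with a Gaussian detector response; the energy axis, line position and FWHM below are illustrative values (a constant FWHM stands in for the paper's energy-dependent broadening parameters).

```python
import numpy as np

def gaussian_broaden(energies, counts, fwhm_of):
    """Smear a simulated (stick-like) flux spectrum: each channel is
    spread over a Gaussian whose FWHM may depend on energy, mimicking
    the detector response; total counts are preserved."""
    e = np.asarray(energies, dtype=float)
    out = np.zeros_like(e)
    for e_i, c_i in zip(e, counts):
        if c_i == 0.0:
            continue
        sigma = fwhm_of(e_i) / 2.3548  # FWHM -> standard deviation
        g = np.exp(-0.5 * ((e - e_i) / sigma) ** 2)
        out += c_i * g / g.sum()       # normalize to preserve counts
    return out

e_axis = np.linspace(5.0, 8.0, 301)             # keV, hypothetical axis
raw = np.zeros_like(e_axis)
raw[np.argmin(np.abs(e_axis - 6.40))] = 1000.0  # Fe K-alpha stick
broadened = gaussian_broaden(e_axis, raw, lambda e_kev: 0.16)
```

    Comparing spectra broadened this way against measured spectra is what allows the broadening parameters (here the FWHM function) to be fitted.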

  8. A Simple Ensemble Simulation Technique for Assessment of Future Variations in Specific High-Impact Weather Events

    Science.gov (United States)

    Taniguchi, Kenji

    2018-04-01

    To investigate future variations in high-impact weather events, numerous samples are required. For a detailed assessment in a specific region, a high spatial resolution is also required. A simple ensemble simulation technique is proposed in this paper. In the proposed technique, new ensemble members are generated from one basic state vector and two perturbation vectors, obtained from lagged-average forecasting simulations. Sensitivity experiments with different numbers of ensemble members, different simulation lengths, and different perturbation magnitudes were performed. An application to a global warming study was also implemented for a typhoon event. Ensemble-mean results and ensemble spreads of total precipitation and atmospheric conditions showed similar characteristics across the sensitivity experiments. The frequencies of the maximum total and hourly precipitation also showed similar distributions. These results indicate the robustness of the proposed technique. On the other hand, considerable ensemble spread was found in each ensemble experiment. In addition, the results of the application to a global warming study showed possible future variations. These results indicate that the proposed technique is useful for investigating various meteorological phenomena and the impacts of global warming. The results of the ensemble simulations also enable the stochastic evaluation of differences in high-impact weather events. In addition, the impact of a spectral nudging technique was also examined. The tracks of a typhoon were quite different between cases with and without spectral nudging; however, the ranges of the tracks among ensemble members were comparable. This indicates that spectral nudging does not necessarily suppress ensemble spread.
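    The member-generation idea can be sketched as forming linear combinations of one basic state vector with two perturbation vectors. The exact construction and magnitudes in the paper may differ; the state values and perturbations below are hypothetical illustration data.

```python
import numpy as np

def make_ensemble(basic, p1, p2, magnitudes):
    """Generate ensemble members from one basic state vector and two
    perturbation vectors (e.g. taken from lagged-average forecasts):
    member = basic + a * p1 + b * p2 for every signed magnitude pair."""
    members = []
    for a in magnitudes:
        for b in magnitudes:
            members.append(basic + a * p1 + b * p2)
    return np.array(members)

# Hypothetical two-component state (e.g. a temperature and a pressure).
basic = np.array([300.0, 1000.0])
p1 = np.array([0.5, -1.0])
p2 = np.array([-0.2, 0.8])
ens = make_ensemble(basic, p1, p2, magnitudes=[-1.0, 0.0, 1.0])
```

    With symmetric magnitudes the ensemble mean reproduces the basic state, while the spread is controlled entirely by the chosen perturbation magnitudes, which is the knob varied in the sensitivity experiments.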

  9. Helped positioning by using a simulation tool for qualification of PWR vessel examination technique

    International Nuclear Information System (INIS)

    Lasserre, Frederic; Pasquier, Thierry; Haiat, Guillaume; Calmon, Pierre; Leberre, Stephane; Lutsen, Mickael

    2006-01-01

    INTERCONTROLE has been performing the examination of all PWR vessels in France from the inside, using UT techniques, since 1975. The in-service inspection machine (MIS) features several tools equipped with focused transducers; each tool is dedicated to one specific area of the vessel. In the core region, the very first millimeters below the cladding-base metal interface have to be inspected with accuracy because of the under-cladding crack-type defects (perpendicular to the inner surface) likely to be found there. The technique used up to now was qualified according to the RSE-M code in 1998. It is based on a set of 63 angle L-wave transducers specifically designed for the detection of defect tip diffraction echoes in the first 25 millimeters of through-wall thickness. The analysis methods for defect characterization are based on a global integration of various cladding-induced phenomena. The technique, the procedure and the analysis methods were qualified for a given limited volume. The new qualification in process in France requires that INTERCONTROLE find solutions for increasing the accuracy of the analysis, in a larger qualification volume than before, while remaining in close compliance with the RSE-M code. A new computer-assisted analysis tool for the characterization, sizing and positioning of defects is part of the improvements currently in progress or already completed. This tool is the result of a thesis commissioned from the CEA (Atomic Energy Commission), now implemented in the CIVAMIS software (developed on a CIVA-based system). The updated version of CIVAMIS including this characterization tool and the RSE-M qualification of the new analysis method (with validation on mock-ups) is now qualified. Despite the larger qualification volume, the results obtained (mentioned in the present paper) fulfill the customer's requirements thanks to the amount of data, information and knowledge available today. The ability to simulate the cladding in terms

  10. Neural correlates for perception of companion animal photographs.

    Science.gov (United States)

    Hayama, Sara; Chang, Linda; Gumus, Kazim; King, George R; Ernst, Thomas

    2016-05-01

    Anthrozoological neuroscience, which we propose as the use of neuroscience techniques to study human-animal interaction, may help to elucidate the mechanisms underlying the associated psychological, physiological, and other purported health effects. This preliminary study investigates the neural response to animal photographs in pet owners and non-pet owners, and both attraction and attachment to companion animals as modulators of human perception of companion animal photographs. Thirty male participants, 15 "Pet Owners" (PO) and 15 "Non-Pet Owners" (NPO), viewed photographs of companion animals during functional MRI (fMRI) scans at 3 T and provided ratings of attraction to the animal species represented in the photographs. Fourteen subjects additionally submitted and viewed personal pet photographs during fMRI scans, and completed the Lexington Attachment to Pets Scale (LAPS). PO exhibited greater activation than NPO during the viewing of animal photographs in areas of the insula, and frontal and occipital cortices. Moreover, ratings of attraction to animals correlated positively with neural activation in the cingulate gyrus, precentral gyrus, inferior parietal lobule, and superior temporal gyrus during the viewing of representative photographs. For subjects with household pets, scores on the LAPS correlated positively with neural activation during the viewing of owned pet photographs in the precuneus, cuneus, and superior parietal lobule. Our preliminary findings suggest that human perception of companion animals involves the visual attention network, which may be modulated at the neural level by subjective experiences of attraction or attachment to animals. Our understanding of human-animal interactions through anthrozoological neuroscience may better direct therapeutic applications, such as animal-assisted therapy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Laboratory EXAFS using photographic method

    International Nuclear Information System (INIS)

    Joshi, S K; Gaur, A; Johari, A; Shrivastava, B D

    2009-01-01

    Laboratory EXAFS facilities have long been in use. However, EXAFS data analysis has not yet been reported for spectra recorded photographically. Although our laboratory has reported various studies employing X-ray spectrographs with photographic registration of EXAFS spectra, the data have never been analyzed using the Fourier transformation method and fitting with standards. This paper reports the study of copper metal EXAFS spectra at the K-edge recorded photographically employing a 400 mm curved mica crystal Cauchois type spectrograph with a 0.5 kW tungsten target X-ray tube. The data, obtained in digital form with the help of a microphotometer, have been processed using the EXAFS data analysis programs Athena and Artemis. The experimental data for copper metal foil have been fitted with theoretical standards. The results have been compared with those obtained from another laboratory EXAFS setup employing a 12 kW Rigaku rotating anode, a Johansson-type spectrometer with a Si(311) monochromator crystal and a scintillation counter. The results have also been compared with those obtained from SSRL. The parameters obtained for the first two shells from the photographic method are comparable with those obtained from the other two methods. The present work shows that the photographic method of registering EXAFS spectra in a laboratory setup using fixed-target X-ray tubes can also be used for obtaining structural information, at least for the first two coordination shells.
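
    As a hedged illustration of the digitization step described above (not the authors' actual pipeline), microphotometer intensity readings from the developed film can be converted to optical density, which under a simplified linear-response assumption tracks the X-ray absorption at each point along the spectrum:

```python
import math

def optical_density(i_incident, i_transmitted):
    """Optical density D = log10(I0/I) of the developed film.

    Simplifying assumption (not from the paper): the film response is
    linear, so D is proportional to the X-ray absorption mu(E)*t at
    the corresponding energy.
    """
    return math.log10(i_incident / i_transmitted)

# Hypothetical microphotometer readings: (incident, transmitted) intensity pairs
readings = [(100.0, 80.0), (100.0, 50.0), (100.0, 10.0)]
densities = [optical_density(i0, i) for i0, i in readings]
```

    In practice the film's characteristic (H-D) curve has to be calibrated before such densities are handed to Athena/Artemis for background removal and Fourier analysis.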

  12. Laboratory EXAFS using photographic method

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, S K [Government College, Badnawar (Dhar)-454660 (India); Gaur, A; Johari, A; Shrivastava, B D, E-mail: joshisantoshk@yahoo.co [School of Studies in Physics, Vikram University, Ujjain-456010 (India)

    2009-11-15

    Laboratory EXAFS facilities have long been in use. However, EXAFS data analysis has not yet been reported for spectra recorded photographically. Although our laboratory has reported various studies employing X-ray spectrographs with photographic registration of EXAFS spectra, the data have never been analyzed using the Fourier transformation method and fitting with standards. This paper reports the study of copper metal EXAFS spectra at the K-edge recorded photographically employing a 400 mm curved mica crystal Cauchois type spectrograph with a 0.5 kW tungsten target X-ray tube. The data, obtained in digital form with the help of a microphotometer, have been processed using the EXAFS data analysis programs Athena and Artemis. The experimental data for copper metal foil have been fitted with theoretical standards. The results have been compared with those obtained from another laboratory EXAFS setup employing a 12 kW Rigaku rotating anode, a Johansson-type spectrometer with a Si(311) monochromator crystal and a scintillation counter. The results have also been compared with those obtained from SSRL. The parameters obtained for the first two shells from the photographic method are comparable with those obtained from the other two methods. The present work shows that the photographic method of registering EXAFS spectra in a laboratory setup using fixed-target X-ray tubes can also be used for obtaining structural information, at least for the first two coordination shells.

  13. Solving optimisation problems in metal forming using Finite Element simulation and metamodelling techniques

    NARCIS (Netherlands)

    Bonte, M.H.A.; van den Boogaard, Antonius H.; Huetink, Han

    2005-01-01

    During the last decades, Finite Element (FEM) simulations of metal forming processes have become important tools for designing feasible production processes. In more recent years, several authors recognised the potential of coupling FEM simulations to mathematical optimisation algorithms to design

  14. Application of a Cycle Jump Technique for Acceleration of Fatigue Crack Growth Simulation

    DEFF Research Database (Denmark)

    Moslemian, Ramin; Berggreen, Christian; Karlsson, A.M.

    2010-01-01

    A method for accelerated simulation of fatigue crack growth in a bimaterial interface is proposed. To simulate fatigue crack growth in a bimaterial interface a routine is developed in the commercial finite element code ANSYS and a method to accelerate the simulation is implemented. The proposed m...... of the simulation show that with fair accuracy, using the cycle jump method, more than 70% reduction in computation time can be achieved....
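
    The cycle-jump idea can be sketched generically (an illustrative scheme, not the ANSYS routine developed in the paper): simulate a few load cycles explicitly, estimate the damage growth rate per cycle, then extrapolate linearly over a block of skipped cycles:

```python
def cycle_jump(damage, grow_one_cycle, n_total, n_sim=3, jump=100):
    """Generic cycle-jump acceleration sketch.

    Explicitly simulates n_sim cycles to estimate the damage rate dD/dN,
    then extrapolates linearly over a block of `jump` skipped cycles.
    `damage` = 1.0 is taken here as complete failure. All names and the
    linear extrapolation are illustrative assumptions.
    """
    n = 0
    while n < n_total and damage < 1.0:
        # explicit simulation of a few cycles to estimate the rate
        start = damage
        for _ in range(n_sim):
            damage = grow_one_cycle(damage)
            n += 1
        rate = (damage - start) / n_sim
        # linear extrapolation over the jumped cycles
        skip = min(jump, n_total - n)
        damage += rate * skip
        n += skip
    return damage, n
```

    For slowly varying damage rates, the scheme trades a small extrapolation error for skipping the vast majority of cycles, which is where the reported >70% reduction in computation time comes from.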

  15. Innovations in surgery simulation: a review of past, current and future techniques

    OpenAIRE

    Badash, Ido; Burtt, Karen; Solorzano, Carlos A.; Carey, Joseph N.

    2016-01-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become int...

  16. The simulated early learning of cervical spine manipulation technique utilising mannequins.

    Science.gov (United States)

    Chapman, Peter D; Stomski, Norman J; Losco, Barrett; Walker, Bruce F

    2015-01-01

    Trivial pain or minor soreness commonly follows neck manipulation and has been estimated at one in three treatments. In addition, rare catastrophic events can occur. Some of these incidents have been ascribed to poor technique where the neck is rotated too far. The aims of this study were to design an instrument to measure competency of neck manipulation in beginning students when using a simulation mannequin, and then examine the suitability of using a simulation mannequin to teach the early psychomotor skills for neck chiropractic manipulative therapy. We developed an initial set of questionnaire items and then used an expert panel to assess an instrument for neck manipulation competency among chiropractic students. The study sample comprised all 41 fourth year 2014 chiropractic students at Murdoch University. Students were randomly allocated into either a usual learning or mannequin group. All participants crossed over to undertake the alternative learning method after four weeks. A chi-square test was used to examine differences between groups in the proportion of students achieving an overall pass mark at baseline, four weeks, and eight weeks. This study was conducted between January and March 2014. We successfully developed an instrument of measurement to assess neck manipulation competency in chiropractic students. We then randomised 41 participants to first undertake either "usual learning" (n = 19) or "mannequin learning" (n = 22) for early neck manipulation training. There were no significant differences between groups in the overall pass rate at baseline (χ(2) = 0.10, p = 0.75), four weeks (χ(2) = 0.40, p = 0.53), and eight weeks (χ(2) = 0.07, p = 0.79). This study demonstrates that the use of a mannequin does not affect the manipulation competency grades of early learning students at short term follow up. Our findings have potentially important safety implications as the results indicate that students could initially
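
    The between-group comparison reported above uses Pearson's chi-square on pass/fail counts; for a 2x2 table it reduces to a closed form. A minimal sketch with made-up counts (the abstract does not give the per-group pass/fail breakdown):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]], e.g. rows = learning method
    (usual vs mannequin), columns = pass vs fail counts."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 12/19 usual-learning passes vs 15/22 mannequin passes
stat = chi2_2x2(12, 7, 15, 7)
```

    Small statistics like those reported (all below 0.5) correspond to p-values far above 0.05, consistent with the study's finding of no difference between learning methods.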

  17. New technique of identifying the hierarchy of dynamic domains in proteins using a method of molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Yesylevskyy S. O.

    2010-04-01

    Full Text Available Aim. Despite a large number of existing domain identification techniques, there is no universally accepted method that identifies the hierarchy of dynamic domains using the data of molecular dynamics (MD) simulations. The goal of this work is to develop such a technique. Methods. The dynamic domains are identified by eliminating systematic motions from MD trajectories recursively in a model-free manner. Results. A technique called Hierarchical Domain-Wise Alignment (HDWA), which identifies hierarchically organized dynamic domains in proteins from MD trajectories, has been developed. Conclusion. A new method of domain identification in proteins is proposed.

  18. Employ Simulation Techniques. Second Edition. Module C-5 of Category C--Instructional Execution. Professional Teacher Education Module Series.

    Science.gov (United States)

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    One of a series of performance-based teacher education learning packages focusing upon specific professional competencies of vocational teachers, this learning module deals with employing simulation techniques. It consists of an introduction and four learning experiences. Covered in the first learning experience are various types of simulation…

  19. Fun and Games with Photoshop: Using Image Editors To Change Photographic Meaning.

    Science.gov (United States)

    Croft, Richard S.

    The introduction of techniques for digitizing photographic images, as well as the subsequent development of powerful image-editing software, has both broadened the possibilities of altering photographs and brought the means for doing so within the reach of many. This article is an informal review of the ways image-editing software can be used to…

  20. Who is that masked educator? Deconstructing the teaching and learning processes of an innovative humanistic simulation technique.

    Science.gov (United States)

    McAllister, Margaret; Searl, Kerry Reid; Davis, Susan

    2013-12-01

    Simulation learning in nursing has long made use of mannequins, standardized actors and role play to allow students opportunity to practice technical body-care skills and interventions. Even though numerous strategies have been developed to mimic or amplify clinical situations, a common problem that is difficult to overcome in even the most well-executed simulation experiences, is that students may realize the setting is artificial and fail to fully engage, remember or apply the learning. Another problem is that students may learn technical competence but remain uncertain about communicating with the person. Since communication capabilities are imperative in human service work, simulation learning that only achieves technical competence in students is not fully effective for the needs of nursing education. Furthermore, while simulation learning is a burgeoning space for innovative practices, it has been criticized for the absence of a basis in theory. It is within this context that an innovative simulation learning experience named "Mask-Ed (KRS simulation)", has been deconstructed and the active learning components examined. Establishing a theoretical basis for creative teaching and learning practices provides an understanding of how, why and when simulation learning has been effective and it may help to distinguish aspects of the experience that could be improved. Three conceptual theoretical fields help explain the power of this simulation technique: Vygotskian sociocultural learning theory, applied theatre and embodiment. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Student’s Perceptions on Simulation as Part of Experiential Learning in Approaches, Methods, and Techniques (AMT) Course

    Directory of Open Access Journals (Sweden)

    Marselina Karina Purnomo

    2017-03-01

    Full Text Available Simulation is a part of Experiential Learning which represents certain real-life events. In this study, simulation is used as a learning activity in the Approaches, Methods, and Techniques (AMT) course, one of the courses in the English Language Education Study Program (ELESP) of Sanata Dharma University. Since simulation represents real-life events, it encourages students to apply the approaches, methods, and techniques being studied as they would in a real classroom. Several experts state that students are able to draw on their personal experiences through simulation, which is believed to create meaningful learning in class. This study aimed to discover ELESP students’ perceptions of simulation as a part of Experiential Learning in the AMT course. From the findings, it could be inferred that students agreed that simulation in class was important for their learning because it created meaningful learning in class.  DOI: https://doi.org/10.24071/llt.2017.200104

  2. Preserving local writers, genealogy, photographs, newspapers, and related materials

    CERN Document Server

    Smallwood, Carol

    2012-01-01

    Preserving Local Writers, Genealogy, Photographs, Newspapers, and Related Materials draws on the practical knowledge of archivists, preservationists, librarians, and others who share the goal of making local history accessible to future generations. Anyone who plans to start a local history project or preserve important historical materials will find plenty of tips, techniques, sample documents, project ideas, and inspiration in its pages.

  3. Development of techniques for joining fuel rod simulators to test assemblies

    International Nuclear Information System (INIS)

    Moorhead, A.J.; Reed, R.W.

    1980-01-01

    A unique tubular electrode carrier is described for gas tungsten-arc welding small-diameter nuclear fuel rod simulators to the tubesheet of a test assembly. Both the close-packed geometry of the array of simulators and the extension of coaxial electrical conductors from each simulator hindered access to the weld joint. Consequently, a conventional gas tungsten-arc torch could not be used. Two seven-rod assemblies that were mockups of the simulator-to-tubesheet joint area were welded and successfully tested. Modified versions of the electrode carrier for brazing electrical leads to the upper ends of the fuel pin simulators are also described. Satisfactory brazes have been made on both single-rod mockups and an array of 25 simulators by using the modified electrode carrier and a filler metal with a composition of 71.5 Ag-28 Cu-0.5 Ni

  4. Innovations in surgery simulation: a review of past, current and future techniques.

    Science.gov (United States)

    Badash, Ido; Burtt, Karen; Solorzano, Carlos A; Carey, Joseph N

    2016-12-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon's skill set, decrease hospital costs, and improve patient outcomes.

  5. Pilot study: evaluation of the use of the convergent interview technique in understanding the perception of surgical design and simulation.

    Science.gov (United States)

    Logan, Heather; Wolfaardt, Johan; Boulanger, Pierre; Hodgetts, Bill; Seikaly, Hadi

    2013-06-19

    It is important to understand the perceived value of surgical design and simulation (SDS) amongst surgeons, as this will influence its implementation in clinical settings. The purpose of the present study was to examine the application of the convergent interview technique in the field of surgical design and simulation and evaluate whether the technique would uncover new perceptions of virtual surgical planning (VSP) and medical models not discovered by other qualitative case-based techniques. Five surgeons were asked to participate in the study. Each participant was interviewed following the convergent interview technique. After each interview, the interviewer interpreted the information by seeking agreements and disagreements among the interviewees in order to understand the key concepts in the field of SDS. Fifteen important issues were extracted from the convergent interviews. In general, the convergent interview was an effective technique in collecting information about the perception of clinicians. The study identified three areas where the technique could be improved upon for future studies in the SDS field.

  6. A note on photographing otoliths

    African Journals Online (AJOL)

    The sagittal otoliths of fishes have in recent years become important taxonomic aids. For this purpose high quality photographs have become essential to illustrate such fine structures as the cristae and the colliculi which are often useful in distinguishing between closely related species. The method described below proved ...

  7. Globes, Maps, Photographs: Geographic Tools.

    Science.gov (United States)

    McDermott, Paul D.; And Others

    This compilation of reprinted articles that originally appeared in the Journal of Geography from September 1969 through the May 1970 issues, is intended to help teachers use globes, maps, and photographs with skill and understanding. The articles were designed with several objectives in mind: 1) to provide information regarding the design,…

  8. Adobe Photoshop CS5 for Photographers The Ultimate Workshop

    CERN Document Server

    Evening, Martin

    2010-01-01

    If you already have a good knowledge of Adobe Photoshop and are looking to advance your skills, Adobe Photoshop CS5 for Photographers: The Ultimate Workshop is the book you've been waiting for.  Renowned photographers Martin Evening and Jeff Schewe impart their Photoshop tips and workflow, showing you how to use a vast array of rarely seen advanced Photoshop techniques.  Whether the subject is serious retouching work, weird and wonderful compositions, or planning a shoot before you've even picked up a camera, you can be sure that the advice is based on years of practical experience.

  9. Path integral molecular dynamics within the grand canonical-like adaptive resolution technique: Simulation of liquid water

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Animesh, E-mail: animesh@zedat.fu-berlin.de; Delle Site, Luigi, E-mail: dellesite@fu-berlin.de [Institute for Mathematics, Freie Universität Berlin, Berlin (Germany)

    2015-09-07

    Quantum effects due to the spatial delocalization of light atoms are treated in molecular simulation via the path integral technique. Among several methods, Path Integral (PI) Molecular Dynamics (MD) is nowadays a powerful tool to investigate properties induced by spatial delocalization of atoms; however, computationally this technique is very demanding. The above mentioned limitation implies the restriction of PIMD applications to relatively small systems and short time scales. One of the possible solutions to overcome size and time limitation is to introduce PIMD algorithms into the Adaptive Resolution Simulation Scheme (AdResS). AdResS requires a relatively small region treated at path integral level and embeds it into a large molecular reservoir consisting of generic spherical coarse grained molecules. It was previously shown that the realization of the idea above, at a simple level, produced reasonable results for toy systems or simple/test systems like liquid parahydrogen. Encouraged by previous results, in this paper, we show the simulation of liquid water at room conditions where AdResS, in its latest and more accurate Grand-Canonical-like version (GC-AdResS), is merged with two of the most relevant PIMD techniques available in the literature. The comparison of our results with those reported in the literature and/or with those obtained from full PIMD simulations shows a highly satisfactory agreement.

  10. 3D micro-crack propagation simulation at enamel/adhesive interface using FE submodeling and element death techniques.

    Science.gov (United States)

    Liu, Heng-Liang; Lin, Chun-Li; Sun, Ming-Tsung; Chang, Yen-Hsiang

    2010-06-01

    This study investigates micro-crack propagation at the enamel/adhesive interface using finite element (FE) submodeling and element death techniques. A three-dimensional (3D) FE macro-model of the enamel/adhesive/ceramic subjected to shear bond testing was generated and analyzed. A 3D micro-model with interfacial bonding structure was constructed at the upper enamel/adhesive interface where the stress concentration was found from the macro-model results. The morphology of this interfacial bonding structure (i.e., resin tag) was assigned based on resin tag geometry and enamel rod arrangement from a scanning electron microscopy micrograph. The boundary conditions for the micro-model were determined from the macro-model results. A custom iterative code combined with the element death technique was used to calculate the micro-crack propagation. Parallel experiments were performed to validate this FE simulation. The stress concentration within the adhesive occurred mainly at the upper corner near the enamel/adhesive interface and the resin tag base. A simulated fracture path was found at the resin tag base along the enamel/adhesive interface. A morphological observation of the fracture patterns obtained from in vitro testing corresponded with the simulation results. This study shows that the FE submodeling and element death techniques could be used to simulate the 3D micro-stress pattern and the crack propagation noted at the enamel/adhesive interface.
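
    The element-death loop can be illustrated on a drastically simplified parallel-element model (a toy sketch, not the authors' 3D FE micro-model): elements sharing a load equally are deactivated when overstressed, the load redistributes over the survivors, and the iteration repeats until the "crack" arrests or runs through:

```python
def element_death(strengths, load):
    """Toy element-death iteration: parallel elements share a fixed load
    equally; any element whose stress exceeds its strength is deactivated
    (its stiffness 'killed'), the load redistributes, and the loop repeats
    until no further elements fail or all have failed."""
    alive = [True] * len(strengths)
    while True:
        n_live = sum(alive)
        if n_live == 0:
            return alive  # complete fracture: crack ran through
        stress = load / n_live  # equal load sharing among live elements
        failed = [i for i, ok in enumerate(alive) if ok and strengths[i] < stress]
        if not failed:
            return alive  # equilibrium reached: crack arrested
        for i in failed:
            alive[i] = False
```

    In the actual FE setting the redistribution comes from re-solving the submodel rather than equal sharing, but the kill-redistribute-recheck structure of the custom iterative code is the same.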

  11. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
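
    For context, the crude Monte Carlo baseline that the proposed hazard-rate-twisting IS estimator improves on can be sketched as follows; its relative error degrades badly as the threshold grows, which is exactly the rare-event problem the paper addresses:

```python
import math
import random

def ccdf_lognormal_sum_mc(mus, sigmas, threshold, n_samples=20000, seed=1):
    """Crude Monte Carlo estimate of the CCDF P(sum_i X_i > threshold)
    for independent Log-normal X_i = exp(N(mu_i, sigma_i^2)).

    This is the naive baseline estimator, not the paper's IS scheme;
    the parameter names are illustrative.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        total = sum(math.exp(rng.gauss(mu, sd)) for mu, sd in zip(mus, sigmas))
        if total > threshold:
            hits += 1
    return hits / n_samples
```

    For a target probability p, the crude estimator needs on the order of 1/p samples just to observe a single hit, which is what motivates replacing it with an importance-sampling change of measure.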

  12. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  13. Redirection of Metabolic Hydrogen by Inhibiting Methanogenesis in the Rumen Simulation Technique (RUSITEC)

    Science.gov (United States)

    Guyader, Jessie; Ungerfeld, Emilio M.; Beauchemin, Karen A.

    2017-01-01

    A decrease in methanogenesis is expected to improve ruminant performance by allocating rumen metabolic hydrogen ([2H]) to more energy-rendering fermentation pathways for the animal. However, decreases in methane (CH4) emissions of up to 30% are not always linked with greater performance. Therefore, the aim of this study was to understand the fate of [2H] when CH4 production in the rumen is inhibited by known methanogenesis inhibitors (nitrate, NIT; 3-nitrooxypropanol, NOP; anthraquinone, AQ) in comparison with a control treatment (CON) with the Rumen Simulation Technique (RUSITEC). Measurements started after 1 week adaptation. Substrate disappearance was not modified by methanogenesis inhibitors. Nitrate mostly seemed to decrease [2H] availability by acting as an electron acceptor competing with methanogenesis. As a consequence, NIT decreased CH4 production (−75%), dissolved dihydrogen (H2) concentration (−30%) and the percentages of reduced volatile fatty acids (butyrate, isobutyrate, valerate, isovalerate, caproate and heptanoate) except propionate, but increased acetate molar percentage, ethanol concentration and the efficiency of microbial nitrogen synthesis (+14%) without affecting gaseous H2. Nitrooxypropanol decreased methanogenesis (−75%) while increasing both gaseous and dissolved H2 concentrations (+81% and +24%, respectively). Moreover, NOP decreased acetate and isovalerate molar percentages and increased butyrate, valerate, caproate and heptanoate molar percentages as well as n-propanol and ammonium concentrations. Methanogenesis inhibition with AQ (−26%) was associated with higher gaseous H2 production (+70%) but lower dissolved H2 concentration (−76%), evidencing a lack of relationship between the two H2 forms. Anthraquinone increased ammonium concentration, caproate and heptanoate molar percentages but decreased acetate and isobutyrate molar percentages, total microbial nitrogen production and efficiency of microbial protein synthesis (

  14. Forest Vegetation Simulator translocation techniques with the Bureau of Land Management's Forest Vegetation Information system database

    Science.gov (United States)

    Timothy A. Bottomley

    2008-01-01

    The BLM uses a database, called the Forest Vegetation Information System (FORVIS), to store, retrieve, and analyze forest resource information on a majority of their forested lands. FORVIS also has the capability of easily transferring appropriate data electronically into Forest Vegetation Simulator (FVS) for simulation runs. Only minor additional data inputs or...

  15. Experimental simulation techniques for the evaluation of structural changes in metals and alloys

    International Nuclear Information System (INIS)

    Lucki, Georgi

    1992-01-01

    In this work, high-dose irradiation in a nuclear reactor was simulated using a cyclotron in order to study mechanical, electric, magnetic and structural changes in materials. Results of such simulations for portland cement, binary alloys and stainless steels are presented and discussed. 15 refs., 11 figs., 1 tab

  16. Performance of medical residents in sterile techniques during central vein catheterization: randomized trial of efficacy of simulation-based training.

    Science.gov (United States)

    Khouli, Hassan; Jahnes, Katherine; Shapiro, Janet; Rose, Keith; Mathew, Joseph; Gohil, Amit; Han, Qifa; Sotelo, Andre; Jones, James; Aqeel, Adnan; Eden, Edward; Fried, Ethan

    2011-01-01

    Catheter-related bloodstream infection (CRBSI) is a preventable cause of a potentially lethal ICU infection. The optimal method to teach health-care providers correct sterile techniques during central vein catheterization (CVC) remains unclear. We randomly assigned second- and third-year internal medicine residents trained by a traditional apprenticeship model to simulation-based plus video training or video training alone from December 2007 to January 2008, with a follow-up period to examine CRBSI ending in July 2009. During the follow-up period, a simulation-based training program in sterile techniques during CVC was implemented in the medical ICU (MICU). A surgical ICU (SICU) where no residents received study interventions was used for comparison. The primary outcome measures were median residents' scores in sterile techniques and rates of CRBSI per 1,000 catheter-days. Of the 47 enrolled residents, 24 were randomly assigned to the simulation-based plus video training group and 23 to the video training group. Median baseline scores in both groups were equally poor: 12.5 to 13 (52%-54%) out of a maximum score of 24 (P = .95; median difference, 0; 95% CI, 0.2-2.0). After training, the median score was significantly higher for the simulation-based plus video training group: 22 (92%) vs 18 (75%) for the video training group (P ...). Simulation-based training in sterile techniques during CVC is superior to traditional training or video training alone and is associated with a decreased rate of CRBSI. Simulation-based training in CVC should be routinely used to reduce iatrogenic risk. ClinicalTrials.gov; No.: NCT00612131; URL: clinicaltrials.gov.

  17. Nuclear power plant human computer interface design incorporating console simulation, operations personnel, and formal evaluation techniques

    International Nuclear Information System (INIS)

    Chavez, C.; Edwards, R.M.; Goldberg, J.H.

    1993-01-01

    New CRT-based information displays which enhance the human machine interface are playing a very important role and are being increasingly used in control rooms since they present a higher degree of flexibility compared to conventional hardwired instrumentation. To prototype a new console configuration and information display system at the Experimental Breeder Reactor II (EBR-II), an iterative process of console simulation and evaluation involving operations personnel is being pursued. Entire panels including selector switches and information displays are simulated and driven by plant dynamical simulations with realistic responses that reproduce the actual cognitive and physical environment. Careful analysis and formal evaluation of operator interaction while using the simulated console will be conducted to determine underlying principles for effective control console design for this particular group of operation personnel. Additional iterations of design, simulation, and evaluation will then be conducted as necessary

  18. Cultural influences on Facebook photographs.

    Science.gov (United States)

    Huang, Chih-Mao; Park, Denise

    2013-01-01

    Prior research in social psychology indicates that East Asians from collectivistic and interdependent sociocultural systems are more sensitive to contextual information than Westerners, whereas Westerners with individualistic and independent representation have a tendency to process focal and discrete attributes of the environment. Here we have demonstrated that such systematic cultural variations can also be observed in cyberspace, focusing on self-presentation of photographs on Facebook, the most popular worldwide online social network site. We examined cultural differences in face/frame ratios for Facebook profile photographs in two studies. For Study 1, 200 digital profile face photographs of active Facebook users were randomly selected from native and immigrant Taiwanese and Americans. For Study 2, 312 Facebook profiles of undergraduate students of six public universities in East Asia (Hong Kong, Singapore, and Taiwan) and the United States (California and Texas) were randomly selected. Overall, the two studies clearly showed that East Asian Facebook users are more likely to deemphasize their faces compared to Americans. Specifically, East Asians living in Hong Kong, Singapore, and Taiwan exhibited a predilection for context inclusiveness in their profile photographs, whereas Americans tended to prioritize their focal face at the expense of the background. Moreover, East Asian Facebook users had lower intensity of facial expression than Americans on their photographs. These results demonstrate marked cultural differences in context-inclusive styles versus object-focused styles between East Asian and American Facebook users. Our findings extend previous findings from the real world to cyberspace, and provide a novel approach to investigate cognition and behaviors across cultures by using Facebook as a data collection platform.
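
    The face/frame ratio used as the dependent measure above is simple to operationalize once a face bounding box is available. A minimal sketch (the study's exact measurement protocol is not given in the abstract; the box-based definition here is an assumption):

```python
def face_frame_ratio(face_box, image_size):
    """Face/frame ratio: area of the detected face bounding box divided
    by the total photo area. face_box = (x, y, width, height) in pixels;
    image_size = (width, height). Smaller values indicate a more
    context-inclusive (deemphasized-face) composition."""
    _, _, fw, fh = face_box
    iw, ih = image_size
    return (fw * fh) / (iw * ih)
```

    Under this measure, the study's finding corresponds to East Asian profiles yielding systematically smaller ratios than American profiles.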

  19. Cultural influences on Facebook photographs

    Science.gov (United States)

    Huang, Chih-Mao; Park, Denise

    2012-01-01

    Prior research in social psychology indicates that East Asians from collectivistic and interdependent sociocultural systems are more sensitive to contextual information than Westerners, whereas Westerners with individualistic and independent representation have a tendency to process focal and discrete attributes of the environment. Here we have demonstrated that such systematic cultural variations can also be observed in cyberspace, focusing on self-presentation of photographs on Facebook, the most popular worldwide online social network site. We examined cultural differences in face/frame ratios for Facebook profile photographs in two studies. For Study 1, 200 digital profile face photographs of active Facebook users were randomly selected from native and immigrant Taiwanese and Americans. For Study 2, 312 Facebook profiles of undergraduate students of six public universities in East Asia (Hong Kong, Singapore, and Taiwan) and the United States (California and Texas) were randomly selected. Overall, the two studies clearly showed that East Asian Facebook users are more likely to deemphasize their faces compared to Americans. Specifically, East Asians living in Hong Kong, Singapore, and Taiwan exhibited a predilection for context inclusiveness in their profile photographs, whereas Americans tended to prioritize their focal face at the expense of the background. Moreover, East Asian Facebook users had lower intensity of facial expression than Americans on their photographs. These results demonstrate marked cultural differences in context-inclusive styles versus object-focused styles between East Asian and American Facebook users. Our findings extend previous findings from the real world to cyberspace, and provide a novel approach to investigate cognition and behaviors across cultures by using Facebook as a data collection platform. PMID:22468606

  20. ARIADNE, a Photographic LAr TPC at the CERN Neutrino Platform

    CERN Document Server

    Mavrokoridis, K; Nessi, M; Roberts, A; Smith, N A; Touramanis, C; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2016-01-01

    This letter of intent describes a novel and innovative two-phase LAr TPC with photographic capabilities as an attractive alternative readout method to the currently accepted segmented THGEMs which will require many thousands of charge readout channels for kton-scale two-phase TPCs. These colossal LAr TPCs will be used for the future long-baseline-neutrino-oscillation experiments. Optical readout also presents many other clear advantages over current readout techniques such as ease of scalability, upgrade, installation and maintenance, and cost effectiveness. This technology has already been demonstrated at the Liverpool LAr facility with the photographic capturing of cosmic muon tracks and single gammas using a 40-litre prototype. We have now secured ERC funding to develop this further with the ARIADNE programme. ARIADNE will be a 1-ton two-phase LAr TPC utilizing THGEM and EMCCD camera readouts in order to photograph interactions, allowing for track reconstruction and particle identification. We are request...

  1. Heavenly bodies the photographer's guide to astrophotography

    CERN Document Server

    Krages, Esq, Bert P

    2003-01-01

    Detailing the photographic equipment and astronomical instruments needed to capture celestial images, this guide shows how astrophotography can be accessible to all photographers. Included is a detailed introduction to basic astronomy with information on mapping the sky, locating celestial bodies, and planning an expedition to photograph astronomical phenomena. Photographers learn how to determine the color sensitivity of various films and achieve the best possible exposure, how to ensure a captivating composition, and how commercially processed prints can support their artistic vision. Whethe

  2. Wind Energy System Time-domain (WEST) analyzers using hybrid simulation techniques

    Science.gov (United States)

    Hoffman, J. A.

    1979-01-01

    Two stand-alone analyzers constructed for real time simulation of the complex dynamic characteristics of horizontal-axis wind energy systems are described. Mathematical models for an aeroelastic rotor, including nonlinear aerodynamic and elastic loads, are implemented with high speed digital and analog circuitry. Models for elastic supports, a power train, a control system, and a rotor gimbal system are also included. Limited correlation efforts show good comparisons between results produced by the analyzers and results produced by a large digital simulation. The digital simulation results correlate well with test data.

  3. 3D Photographs in Cultural Heritage

    Science.gov (United States)

    Schuhr, W.; Lee, J. D.; Kiel, St.

    2013-07-01

    This paper on providing "oo-information" (= objective object-information) on cultural monuments and sites, based on 3D photographs, is also a contribution of CIPA task group 3 to the 2013 CIPA Symposium in Strasbourg. To stimulate interest in 3D photography among scientists as well as amateurs, 3D masterpieces are presented. It is shown that, owing to their high documentary value ("near reality"), 3D photographs support, e.g., the recording, visualization, interpretation, preservation and restoration of architectural and archaeological objects. This includes samples of excavation documentation, 3D coordinate calculation, 3D photographs applied for virtual-museum purposes and as educational tools, and 3D photography used for spatial structure enhancement, which in particular holds for inscriptions and rock art. This paper is also an invitation to participate in a systematic survey of existing international archives of 3D photographs; in this respect, first results toward defining an optimum digitization rate for analog stereo views are reported. It is more than overdue that, in addition to access to international archives of 3D photography, the available 3D photography data should appear in a global GIS (cloud) system such as, e.g., Google Earth. This contribution also deals with exposing new 3D photographs to document monuments of importance for Cultural Heritage, including the use of 3D and single-lens cameras on a 10 m telescopic staff for extremely low earth-based airborne 3D photography as well as for "underwater staff photography", and reports on the use of captive-balloon and drone platforms for 3D photography in Cultural Heritage. It should be emphasized that the still underestimated 3D effect on real objects even allows, e.g., the spatial perception of extremely small scratches as well as of nuances in color differences.

  4. 31 CFR 91.10 - Photographs.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Photographs. 91.10 Section 91.10 Money and Finance: Treasury Regulations Relating to Money and Finance REGULATIONS GOVERNING CONDUCT IN OR ON THE BUREAU OF THE MINT BUILDINGS AND GROUNDS § 91.10 Photographs. The taking of photographs on...

  5. 22 CFR 51.26 - Photographs.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Photographs. 51.26 Section 51.26 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Application § 51.26 Photographs. The applicant must submit with his or her application photographs as prescribed by the Department that are a...

  6. Parameter identification using optimization techniques in the continuous simulation programs FORSIM and MACKSIM

    International Nuclear Information System (INIS)

    Carver, M.B.; Austin, C.F.; Ross, N.E.

    1980-02-01

    This report discusses the mechanics of automated parameter identification in simulation packages, and reviews available integration and optimization algorithms and their interaction within the recently developed optimization options in the FORSIM and MACKSIM simulation packages. In the MACKSIM mass-action chemical kinetics simulation package, the form and structure of the ordinary differential equations involved is known, so the implementation of an optimizing option is relatively straightforward. FORSIM, however, is designed to integrate ordinary and partial differential equations of arbitrary definition. As the form of the equations is not known in advance, the design of the optimizing option is more intricate, but the philosophy could be applied to most simulation packages. In either case, however, the invocation of the optimizing interface is simple and user-oriented. Full details for the use of the optimizing mode for each program are given; specific applications are used as examples. (O.T.)
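The parameter-identification loop described above, an optimizer repeatedly driving a simulation and comparing its output to data, can be sketched in a few lines. The decay model, the Euler integrator and the golden-section search below are illustrative stand-ins, not the FORSIM/MACKSIM implementation:

```python
# Sketch of parameter identification: fit the decay constant k so that a
# simulated trajectory matches "observed" data (here generated synthetically).

def simulate(k, y0=1.0, dt=0.01, steps=200):
    """Integrate dy/dt = -k*y with explicit Euler; return the sampled trajectory."""
    y, traj = y0, []
    for _ in range(steps):
        y += dt * (-k * y)
        traj.append(y)
    return traj

def sse(k, observed):
    """Sum of squared errors between simulation and observations."""
    return sum((s - o) ** 2 for s, o in zip(simulate(k), observed))

def identify(observed, lo=0.0, hi=5.0, iters=60):
    """Golden-section search for the k that minimizes the misfit."""
    g = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    for _ in range(iters):
        if sse(c, observed) < sse(d, observed):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

observed = simulate(1.7)   # synthetic "measurements" with true k = 1.7
k_est = identify(observed)
```

Any simulation package exposing a run-and-return-residual interface can be wrapped this way, which is essentially what an optimizing option in a simulation code provides.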

  7. A Novel Technique for Running the NASA Legacy Code LAPIN Synchronously With Simulations Developed Using Simulink

    Science.gov (United States)

    Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.

    2012-01-01

    This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded in the FORTRAN 77 programming language (The Portland Group, Lake Oswego, OR) to run in a command shell similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation environment that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.
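The lock-step coupling with periodic data exchange that the report describes can be illustrated, in miniature, with two threads exchanging values through queues once per time step. The dynamics and feedback law below are hypothetical; the real method couples separate LAPIN and Simulink processes:

```python
from queue import Queue
from threading import Thread

# Toy lock-step coupling: a "legacy" model and a "graphical" model run in
# separate threads and exchange one value per simulated time step.
to_legacy, to_gui = Queue(), Queue()

def legacy_model(steps):
    x = 1.0
    for _ in range(steps):
        u = to_legacy.get()   # block until the other simulation sends input
        x = 0.9 * x + u       # hypothetical legacy dynamics
        to_gui.put(x)         # publish the new state

def gui_model(steps, results):
    u = 0.1
    for _ in range(steps):
        to_legacy.put(u)      # send the control input
        x = to_gui.get()      # receive the legacy state
        u = 0.1 * x           # hypothetical feedback law
        results.append(x)

results = []
t = Thread(target=legacy_model, args=(5,))
t.start()
gui_model(5, results)
t.join()
```

The blocking queue reads force the two tasks to advance in strict alternation, which is the essence of synchronous inter-task communication without modifying either model's internal loop.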

  8. Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models

    National Research Council Canada - National Science Library

    Rodriguez, June F

    2008-01-01

    .... More specifically, investigating how to accurately aggregate hierarchical lower-level (higher resolution) models into the next higher-level in order to reduce the complexity of the overall simulation model...

  9. Site characterization and validation - equipment design and techniques used in single borehole hydraulic testing, simulated drift experiment and crosshole testing

    International Nuclear Information System (INIS)

    Holmes, D.C.; Sehlstedt, M.

    1991-10-01

    This report describes the equipment and techniques used to investigate the variation of hydrogeological parameters within a fractured crystalline rock mass. The testing program was performed during stage 3 of the site characterization and validation programme at the Stripa mine in Sweden. This programme used a multidisciplinary approach, combining geophysical, geological and hydrogeological methods, to determine how groundwater moved through the rock mass. The hydrogeological work package involved three components. Firstly, novel single borehole techniques (focused packer testing) were used to determine the distribution of hydraulic conductivity and head along individual boreholes. Secondly, water was abstracted from boreholes which were drilled to simulate a tunnel (simulated drift experiment). Locations and magnitudes of flows were measured together with pressure responses at various points in the SCV rock mass. Thirdly, small scale crosshole tests, involving detailed interference testing, were used to determine the variability of hydrogeological parameters within previously identified, significant flow zones. (au)

  10. New technique for global solar radiation forecasting by simulated annealing and genetic algorithms using

    International Nuclear Information System (INIS)

    Tolabi, H.B.; Ayob, S.M.

    2014-01-01

    In this paper, a novel approach based on simulated annealing algorithm as a meta-heuristic method is implemented in MATLAB software to estimate the monthly average daily global solar radiation on a horizontal surface for six different climate cities of Iran. A search method based on genetic algorithm is applied to accelerate problem solving. Results show that simulated annealing based on genetic algorithm search is a suitable method to find the global solar radiation. (author)
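A minimal simulated-annealing loop of the kind used in such meta-heuristic estimation can be sketched on a toy one-dimensional objective. The objective function, cooling rate and step size here are illustrative assumptions, not the authors' model:

```python
import math, random

random.seed(42)

def anneal(f, x0, t0=1.0, cooling=0.995, steps=4000, step_size=0.5):
    """Minimal simulated annealing: always accept downhill moves, accept
    uphill moves with probability exp(-delta/T), and cool T geometrically."""
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + random.uniform(-step_size, step_size)
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

# Toy objective with several local minima; the global minimum lies near x = 2.
obj = lambda x: (x - 2) ** 2 + 0.3 * math.sin(8 * x)
x_best, f_best = anneal(obj, x0=-5.0)
```

The early high-temperature phase lets the search escape local minima, which is what a genetic-algorithm search would accelerate by proposing better candidate moves.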

  11. Computer simulation of stochastic processes through model-sampling (Monte Carlo) techniques.

    Science.gov (United States)

    Sheppard, C W.

    1969-03-01

    A simple Monte Carlo simulation program is outlined which can be used for the investigation of random-walk problems, for example in diffusion, or the movement of tracers in the blood circulation. The results given by the simulation are compared with those predicted by well-established theory, and it is shown how the model can be expanded to deal with drift, and with reflexion from or adsorption at a boundary.
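The random-walk model with drift and a reflecting boundary mentioned in the abstract can be sketched as follows; the step size, drift and barrier values are arbitrary illustrations:

```python
import random

random.seed(0)

def walk(steps=1000, drift=0.05, step=1.0, barrier=50.0):
    """1-D random walk with drift and a reflecting barrier at +barrier.
    Returns the final position of one walker."""
    x = 0.0
    for _ in range(steps):
        x += drift + random.choice((-step, step))
        if x > barrier:            # reflect off the upper boundary
            x = 2 * barrier - x
    return x

# Ensemble mean: the drift pushes walkers toward the barrier (0.05 * 1000 = 50),
# while reflection keeps the population below it.
mean = sum(walk() for _ in range(2000)) / 2000
```

Swapping the reflection line for `return x` when the barrier is crossed would model adsorption at the boundary instead.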

  12. Simulation Techniques and Prosthetic Approach Towards Biologically Efficient Artificial Sense Organs- An Overview

    OpenAIRE

    Neogi, Biswarup; Ghosal, Soumya; Mukherjee, Soumyajit; Das, Achintya; Tibarewala, D. N.

    2011-01-01

    An overview of the applications of control theory to prosthetic sense organs, including the senses of vision, taste and odor, is presented in this paper. Simulation has nowadays become a centre of research in the field of prosthesis. There have been various successful applications of prosthetic organs for patients with dysfunctional natural biological organs. Simulation aspects and control modeling are indispensable for knowing system performance, and to generate an original a...

  13. Improvement in visibility of simulated lung nodules on computed radiography (CR) chest images by use of temporal subtraction technique

    International Nuclear Information System (INIS)

    Oda, Nobuhiro; Fujimoto, Keiji; Murakami, Seiichi; Katsuragawa, Shigehiko; Doi, Kunio; Nakata, Hajime

    1999-01-01

    A temporal subtraction image obtained by subtraction of a previous image from a current one can enhance interval change on chest images. In this study, we compared the visibility of simulated lung nodules on CR images with and without temporal subtraction. Chest phantom images without and with simulated nodules were obtained as previous and current images, respectively, by a CR system. Then, subtraction images were produced with an iterative image warping technique. Twelve simulated nodules were attached on various locations of the chest phantom. The diameter of nodules having a CT number of 47 ranged from 3 mm to 10 mm. Seven radiologists subjectively evaluated the visibility of simulated nodules on CR images with and without temporal subtraction using a three-point rating scale (0: invisible, +1: questionable, +2:visible). The minimum diameter of simulated nodules visible at a frequency greater than 50% was 4 mm on the CR images with temporal subtraction and 6 mm on those without. Our results indicated that the subtraction images clearly improved the visibility of simulated nodules. (author)
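Leaving aside the iterative image-warping step, the core of temporal subtraction is a pixel-wise difference between the current and previous images, so that only interval change survives. A minimal sketch with a synthetic "nodule":

```python
# Temporal subtraction sketch: subtracting the previous image from the current
# one cancels the unchanged anatomy and leaves only the interval change.
W, H = 64, 64

def blank(val=100):
    """Uniform synthetic 'chest image' as a list of pixel rows."""
    return [[val] * W for _ in range(H)]

previous = blank()
current = blank()
for y in range(30, 34):            # add a small simulated nodule (4x4 pixels)
    for x in range(30, 34):
        current[y][x] += 40

difference = [[current[y][x] - previous[y][x] for x in range(W)]
              for y in range(H)]

# Everything but the nodule cancels out in the subtraction image.
changed = sum(1 for row in difference for v in row if v != 0)
```

In practice the previous image must first be warped onto the current one; without that registration step, normal anatomy would leave subtraction artifacts that mask small nodules.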

  14. Ant colony method to control variance reduction techniques in the Monte Carlo simulation of clinical electron linear accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Pareja, S. [Servicio de Radiofisica Hospitalaria, Hospital Regional Universitario 'Carlos Haya', Avda. Carlos Haya, s/n, E-29010 Malaga (Spain)], E-mail: garciapareja@gmail.com; Vilches, M. [Servicio de Fisica y Proteccion Radiologica, Hospital Regional Universitario 'Virgen de las Nieves', Avda. de las Fuerzas Armadas, 2, E-18014 Granada (Spain)]; Lallena, A.M. [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, E-18071 Granada (Spain)]

    2007-09-21

    The ant colony method is used to control the application of variance reduction techniques to the simulation of clinical electron linear accelerators of use in cancer therapy. In particular, splitting and Russian roulette, two standard variance reduction methods, are considered. The approach can be applied to any accelerator in a straightforward way and permits, in addition, to investigate the 'hot' regions of the accelerator, an information which is basic to develop a source model for this therapy tool.
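Splitting and Russian roulette, the two variance-reduction methods the ant-colony scheme controls, can be sketched as weight manipulations that preserve the expected value. The survival probability and splitting factor below are illustrative choices:

```python
import random

random.seed(1)

def russian_roulette(weight, survival=0.5):
    """Kill a low-importance particle with probability 1 - survival; a
    survivor carries weight / survival so the estimator stays unbiased."""
    if random.random() < survival:
        return weight / survival   # survives with boosted weight
    return 0.0                     # absorbed

def split(weight, n=4):
    """Split an important particle into n copies sharing the same weight."""
    return [weight / n] * n

# Unbiasedness check: the mean surviving weight equals the original weight.
trials = [russian_roulette(1.0) for _ in range(100000)]
mean_weight = sum(trials) / len(trials)
```

Splitting is applied in the "hot" regions that feed the detector, roulette in unimportant ones; tuning those factors per region is exactly the knob the ant-colony method adjusts.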

  15. Ant colony method to control variance reduction techniques in the Monte Carlo simulation of clinical electron linear accelerators

    International Nuclear Information System (INIS)

    Garcia-Pareja, S.; Vilches, M.; Lallena, A.M.

    2007-01-01

    The ant colony method is used to control the application of variance reduction techniques to the simulation of clinical electron linear accelerators of use in cancer therapy. In particular, splitting and Russian roulette, two standard variance reduction methods, are considered. The approach can be applied to any accelerator in a straightforward way and permits, in addition, to investigate the 'hot' regions of the accelerator, an information which is basic to develop a source model for this therapy tool

  16. Simulation of the Schroedinger equation on SHAC

    International Nuclear Information System (INIS)

    Stewart, A.

    1976-01-01

    A simulation of the Schroedinger wave equation for the hydrogen atom, on SHAC, a simple homogeneous analogue computer primarily intended for use in schools, is described. Due to the incorporation of FET switches very high speed switching from initial conditions to compute modes is possible. The techniques employed in the multiplier and divider are discussed and the flow diagram for the Schroedinger program shown. Results and photographs are discussed. (U.K.)

  17. Technique for Simulation of Black Sea Circulation with Increased Resolution in the Area of the IO RAS Polygon

    Science.gov (United States)

    Gusev, A. V.; Zalesny, V. B.; Fomin, V. V.

    2017-11-01

    A numerical technique is presented for simulating the hydrophysical fields of the Black Sea on a variable-step grid with refinement in the area of IO RAS polygon. Model primitive equations are written in spherical coordinates with an arbitrary arrangement of poles. In order to increase the horizontal resolution of the coastal zone in the area of the IO RAS polygon in the northeastern part of the sea near Gelendzhik, one of the poles is placed at a land point (38.35° E, 44.75° N). The model horizontal resolution varies from 150 m in the area of the IO RAS polygon to 4.6 km in the southwestern part of the Black Sea. The numerical technique makes it possible to simulate a large-scale structure of Black Sea circulation as well as the meso- and submesoscale dynamics of the coastal zone. In order to compute the atmospheric forcing, the results of the regional climate model WRF with a resolution of about 10 km in space and 1 h in time are used. In order to demonstrate the technique, Black Sea hydrophysical fields for 2011-2012 and a passive tracer transport representing self-cleaning of Gelendzhik Bay in July 2012 are simulated.

  18. High-precision numerical simulation with autoadaptative grid technique in nonlinear thermal diffusion

    International Nuclear Information System (INIS)

    Chambarel, A.; Pumborios, M.

    1992-01-01

    This paper reports that many engineering problems concern the determination of a steady-state solution in cases with strong thermal gradients, and that results obtained using the finite-element technique are sometimes inaccurate, particularly for nonlinear problems with unadapted meshes. Building on previous results in linear problems, we propose an autoadaptive technique for nonlinear cases that uses quasi-Newtonian iterations to reevaluate an interpolation error estimation. The authors perfected an automatic refinement technique to solve the nonlinear thermal problem of temperature calculation in a cast-iron cylinder head of a diesel engine

  19. Dose point kernel simulation for monoenergetic electrons and radionuclides using Monte Carlo techniques.

    Science.gov (United States)

    Wu, J; Liu, Y L; Chang, S J; Chao, M M; Tsai, S Y; Huang, D E

    2012-11-01

    Monte Carlo (MC) simulation has been commonly used in the dose evaluation of radiation accidents and for medical purposes. The accuracy of simulated results is affected by the particle-tracking algorithm, cross-sectional database, random number generator and statistical error. The differences among MC simulation software packages must be validated. This study simulated the dose point kernel (DPK) and the cellular S-values of monoenergetic electrons ranging from 0.01 to 2 MeV and the radionuclides of (90)Y, (177)Lu and (103m)Rh, using Fluktuierende Kaskade (FLUKA) and the Monte Carlo N-Particle Transport Code Version 5 (MCNP5). A 6-μm-radius cell model consisting of the cell surface, cytoplasm and cell nucleus was constructed for cellular S-value calculation. The mean absolute percentage errors (MAPEs) of the scaled DPKs, simulated using FLUKA and MCNP5, were 7.92, 9.64, 4.62, 3.71 and 3.84 % for 0.01, 0.1, 0.5, 1 and 2 MeV, respectively. For the three radionuclides, the MAPEs of the scaled DPKs were within 5 %. The maximum deviations of S(N←N), S(N←Cy) and S(N←CS) for electron energies larger than 10 keV were 6.63, 6.77 and 5.24 %, respectively. The deviations for the self-absorbed S-values and cross-dose S-values of the three radionuclides were within 4 %. On the basis of the results of this study, it was concluded that the simulation results are consistent between FLUKA and MCNP5. However, there is a minor inconsistency in the low energy range. The DPK and the cellular S-value should be used as quality assurance tools before the MC simulation results are adopted as the gold standard.
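The mean absolute percentage error (MAPE) used above to compare the scaled DPKs is a simple statistic; a sketch with made-up kernel values (not the paper's data):

```python
def mape(reference, test):
    """Mean absolute percentage error between two sets of values, in %."""
    return 100.0 / len(reference) * sum(
        abs((r - t) / r) for r, t in zip(reference, test))

# Hypothetical scaled-DPK samples from two codes (illustrative numbers only).
code_a = [0.90, 1.10, 1.30, 1.00]
code_b = [0.99, 1.10, 1.17, 1.05]
err = mape(code_a, code_b)   # percentage disagreement between the two codes
```

Because each term is normalized by the reference value, MAPE weights all radial bins of a kernel equally regardless of their absolute dose.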

  20. Photographs of the southern heavens

    CERN Document Server

    West, R M

    1975-01-01

    The 1 m Schmidt telescope of the European Southern Observatory (ESO) has been used for a number of sky surveys. In particular a main task has been an examination of the southern night sky between declinations of -20 degrees and -90 degrees. This exercise is known as the ESO(B) Survey (or Quick Blue Survey); some of the more interesting results are shown and are briefly discussed. The photographic plates used were Kodak IIa-O. There are photographs of the two Magellanic Clouds and of the galaxy NGC 1313 and the globular cluster NGC 6752. A spectrogram of our Galaxy for wavelengths in the band 3900 Å to 4900 Å, taken with the telescope's large objective prism, is also shown. (0 refs).

  1. A Novel Temporal Bone Simulation Model Using 3D Printing Techniques.

    Science.gov (United States)

    Mowry, Sarah E; Jammal, Hachem; Myer, Charles; Solares, Clementino Arturo; Weinberger, Paul

    2015-09-01

    An inexpensive temporal bone model for use in a temporal bone dissection laboratory setting can be made using a commercially available, consumer-grade 3D printer. Several models for a simulated temporal bone have been described but use commercial-grade printers and materials to produce these models. The goal of this project was to produce a plastic simulated temporal bone on an inexpensive 3D printer that recreates the visual and haptic experience associated with drilling a human temporal bone. Images from a high-resolution CT of a normal temporal bone were converted into stereolithography files via commercially available software, with image conversion and print settings adjusted to achieve optimal print quality. The temporal bone model was printed using acrylonitrile butadiene styrene (ABS) plastic filament on a MakerBot 2x 3D printer. Simulated temporal bones were drilled by seven expert temporal bone surgeons, who assessed the fidelity of the model as compared with a human cadaveric temporal bone. Using a four-point scale, the simulated bones were assessed for haptic experience and recreation of the temporal bone anatomy. The created model was felt to be an accurate representation of a human temporal bone. All raters felt strongly that this would be a good training model for junior residents or for simulating difficult surgical anatomy. Material cost for each model was $1.92. A realistic, inexpensive, and easily reproducible temporal bone model can be created on a consumer-grade desktop 3D printer.

  2. A hybrid method for flood simulation in small catchments combining hydrodynamic and hydrological techniques

    Science.gov (United States)

    Bellos, Vasilis; Tsakiris, George

    2016-09-01

    The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically-based two-dimensional hydrodynamic model and the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested at a small catchment in a suburb of Athens-Greece for a storm event which occurred in February 2013. The catchment is divided into three friction zones and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results from the implementation of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation at substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulations, which lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
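The unit-hydrograph half of such a hybrid method amounts to a discrete convolution of effective rainfall pulses with the unit-hydrograph ordinates; a sketch with illustrative ordinates (not the catchment's actual values):

```python
def convolve_uh(rainfall, unit_hydrograph):
    """Discrete convolution of effective rainfall with unit-hydrograph
    ordinates yields the direct-runoff hydrograph at the outlet."""
    n = len(rainfall) + len(unit_hydrograph) - 1
    q = [0.0] * n
    for i, p in enumerate(rainfall):
        for j, u in enumerate(unit_hydrograph):
            q[i + j] += p * u
    return q

# Illustrative unit-hydrograph ordinates (runoff per unit of effective rain
# per time step) and two effective-rainfall pulses.
uh = [0.0, 2.0, 5.0, 3.0, 1.0, 0.0]
rain = [1.0, 0.5]
runoff = convolve_uh(rain, uh)
```

The hydrodynamic model's role in the hybrid scheme is to supply these ordinates for each friction zone, after which each new storm only costs a cheap convolution rather than a full 2-D simulation.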

  3. Simulation and experimental tests of a real-time DPWM technique ...

    African Journals Online (AJOL)

    This control strategy is a simple and an easy technique generating the same switching ..... Inverter”, Energy Conversion Congress and Exposition (ECCE ), IEEE, ... Minimize the Switching Loss”, Innovative Smart Grid Technologies (ISGT Asia), ...

  4. Advanced Numerical Integration Techniques for High-Fidelity SDE Spacecraft Simulation

    Data.gov (United States)

    National Aeronautics and Space Administration — Classic numerical integration techniques, such as the ones at the heart of several NASA GSFC analysis tools, are known to work well for deterministic differential...

  5. An alternative technique for simulating volumetric cylindrical sources in the Morse code utilization

    International Nuclear Information System (INIS)

    Vieira, W.J.; Mendonca, A.G.

    1985-01-01

    In the solution of deep-penetration problems using the Monte Carlo method, calculation techniques and strategies are used in order to increase the particle population in the regions of interest. A common procedure is the coupling of bidimensional calculations, with (r,z) discrete ordinates transformed into source data, and tridimensional Monte Carlo calculations. An alternative technique for this procedure is presented. This alternative proved effective when applied to a sample problem. (F.E.) [pt

  6. 3D DIGITAL SIMULATION OF MINNAN TEMPLE ARCHITECTURE CAISSON'S CRAFT TECHNIQUES

    OpenAIRE

    Y. C. Lin; T. C. Wu; M. F. Hsu

    2013-01-01

    Caisson is one of the important representations of the Minnan (southern Fujian) temple architecture craft techniques and decorative aesthetics. The special component design and group building method present the architectural thinking and personal characteristics of the great carpenters of Minnan temple architecture. In the late Qing Dynasty, the appearance and style of caissons of famous temples in Taiwan apparently presented the building techniques of the great carpenters. However, as the y...

  7. A graphical simulator for teaching basic and advanced MR imaging techniques

    DEFF Research Database (Denmark)

    Hanson, Lars G

    2007-01-01

    Teaching of magnetic resonance (MR) imaging techniques typically involves considerable handwaving, literally, to explain concepts such as resonance, rotating frames, dephasing, refocusing, sequences, and imaging. A proper understanding of MR contrast and imaging techniques is crucial for radiolog...... be visualized in an intuitive way. The cross-platform software is primarily designed for use in lectures, but is also useful for self studies and student assignments. Movies available at http://radiographics.rsnajnls.org/cgi/content/full/e27/DC1 ....

  8. A Survey on Modeling and Simulation of MEMS Switches and Its Application in Power Gating Techniques

    OpenAIRE

    Pramod Kumar M.P; A.S. Augustine Fletcher

    2014-01-01

    Large numbers of techniques have been developed to reduce the leakage power, including supply voltage scaling, varying threshold voltages, smaller logic banks, etc. Power gating is a technique which is used to reduce the static power when the sleep transistor is in the off condition. Micro-Electro-Mechanical System (MEMS) switches have properties that are very close to an ideal switch, with infinite off-resistance due to an air gap and low on-resistance due to the ohmic metal to m...

  9. Computed simulation of radiographies of pipes - validation of techniques for wall thickness measurements

    International Nuclear Information System (INIS)

    Bellon, C.; Tillack, G.R.; Nockemann, C.; Wenzel, L.

    1995-01-01

    A macroscopic model of radiographic NDE methods and applications is given. A computer-aided approach for determination of wall thickness from radiographs is presented, guaranteeing high accuracy and reproducibility of wall thickness determination by means of projection radiography. The algorithm was applied to computed simulations of radiographies. The simulation thus offers an effective means for testing such automated wall thickness determination as a function of imaging conditions, pipe geometries, coatings, and media tracking, and likewise is a tool for validation and optimization of the method. (orig.) [de

  10. Flight test techniques for validating simulated nuclear electromagnetic pulse aircraft responses

    Science.gov (United States)

    Winebarger, R. M.; Neely, W. R., Jr.

    1984-01-01

    An attempt has been made to determine the effects of nuclear EM pulses (NEMPs) on aircraft systems, using a highly instrumented NASA F-106B to document the simulated NEMP environment at the Kirtland Air Force Base's Vertically Polarized Dipole test facility. Several test positions were selected so that aircraft orientation relative to the test facility would be the same in flight as when on the stationary dielectric stand, in order to validate the dielectric stand's use in flight configuration simulations. Attention is given to the flight test portions of the documentation program.

  11. Quasi-monte carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using

  12. Monte Carlo particle simulation and finite-element techniques for tandem mirror transport

    International Nuclear Information System (INIS)

    Rognlien, T.D.; Cohen, B.I.; Matsuda, Y.; Stewart, J.J. Jr.

    1987-01-01

    A description is given of numerical methods used in the study of axial transport in tandem mirrors owing to Coulomb collisions and rf diffusion. The methods are Monte Carlo particle simulations and direct solution to the Fokker-Planck equations by finite-element expansion. (author)

  13. 360-degree videos: a new visualization technique for astrophysical simulations, applied to the Galactic Center

    Science.gov (United States)

    Russell, Christopher

    2018-01-01

    360-degree videos are a new type of movie that renders over all 4π steradians. Video-sharing sites such as YouTube now allow this unique content to be shared via virtual reality (VR) goggles, hand-held smartphones/tablets, and computers. Creating 360-degree videos from astrophysical simulations not only provides a new way to view these simulations, due to their immersive nature, but also yields engaging content for outreach to the public. We present our 360-degree video of an astrophysical simulation of the Galactic center: a hydrodynamics calculation of the colliding and accreting winds of the 30 Wolf-Rayet stars orbiting within the central parsec. Viewing the movie, which renders column density, from the location of the supermassive black hole gives a unique and immersive perspective of the shocked wind material inspiraling and tidally stretching as it plummets toward the black hole. We also describe how to create such movies, discuss what type of content does and does not look appealing in 360-degree format, and briefly comment on what new science can be extracted from astrophysical simulations using 360-degree videos.

  14. Multispectral Terrain Background Simulation Techniques For Use In Airborne Sensor Evaluation

    Science.gov (United States)

    Weinberg, Michael; Wohlers, Ronald; Conant, John; Powers, Edward

    1988-08-01

    A background simulation code developed at Aerodyne Research, Inc., called AERIE is designed to reflect the major sources of clutter that are of concern to staring and scanning sensors of the type being considered for various airborne threat warning (both aircraft and missiles) sensors. The code is a first principles model that could be used to produce a consistent image of the terrain for various spectral bands, i.e., provide the proper scene correlation both spectrally and spatially. The code utilizes both topographic and cultural features to model terrain, typically from DMA data, with a statistical overlay of the critical underlying surface properties (reflectance, emittance, and thermal factors) to simulate the resulting texture in the scene. Strong solar scattering from water surfaces is included with allowance for wind driven surface roughness. Clouds can be superimposed on the scene using physical cloud models and an analytical representation of the reflectivity obtained from scattering off spherical particles. The scene generator is augmented by collateral codes that allow for the generation of images at finer resolution. These codes provide interpolation of the basic DMA databases using fractal procedures that preserve the high frequency power spectral density behavior of the original scene. Scenes are presented illustrating variations in altitude, radiance, resolution, material, thermal factors, and emissivities. The basic models utilized for simulation of the various scene components and various "engineering level" approximations are incorporated to reduce the computational complexity of the simulation.
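    The fractal interpolation the collateral codes use to refine DMA terrain while preserving its high-frequency power spectral density can be illustrated with a one-dimensional midpoint-displacement refinement. This generic sketch is a stand-in for the idea, not the AERIE implementation; all names and parameters are ours:

    ```python
    import random

    def fractal_refine(profile, hurst=0.8, sigma=1.0, seed=0):
        """One level of midpoint-displacement refinement of a terrain profile.

        Inserts a midpoint between each pair of samples, displaced by Gaussian
        noise whose standard deviation is scaled by 2**(-hurst), which roughly
        preserves a power-law (fractal) spectrum across the added octave.
        """
        rng = random.Random(seed)
        step_sigma = sigma * 2.0 ** (-hurst)
        refined = []
        for a, b in zip(profile, profile[1:]):
            refined.append(a)
            refined.append(0.5 * (a + b) + rng.gauss(0.0, step_sigma))
        refined.append(profile[-1])
        return refined
    ```

    Calling this repeatedly doubles the resolution at each level while the original coarse samples remain fixed.
    
    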

  15. Refinement of homology-based protein structures by molecular dynamics simulation techniques

    NARCIS (Netherlands)

    Fan, H; Mark, AE

    The use of classical molecular dynamics simulations, performed in explicit water, for the refinement of structural models of proteins generated ab initio or based on homology has been investigated. The study involved a test set of 15 proteins that were previously used by Baker and coworkers to

  16. Monte Carlo particle simulation and finite-element techniques for tandem mirror transport

    International Nuclear Information System (INIS)

    Rognlien, T.D.; Cohen, B.I.; Matsuda, Y.; Stewart, J.J. Jr.

    1985-12-01

    A description is given of numerical methods used in the study of axial transport in tandem mirrors owing to Coulomb collisions and rf diffusion. The methods are Monte Carlo particle simulations and direct solution to the Fokker-Planck equations by finite-element expansion. 11 refs

  17. Synchrotron and Simulations Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    International Nuclear Information System (INIS)

    Chianelli, R.

    2005-01-01

    Development of synchrotron techniques for determining the structure of disordered, amorphous and surface materials has exploded over the past twenty years, due to the increasing availability of high-flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists, through interaction with the effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as "surface compounds". One class comprises materials like MoS(2-x)C(x), widely used petroleum catalysts employed to improve the environmental properties of transportation fuels; these compounds may be viewed as "sulfide-supported carbides" in their catalytically active states. The second class of "surface compounds" is the "Maya Blue" pigments, based on technology created by the ancient Maya. These compounds are organic/inorganic "surface complexes" consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques, as described in this report

  18. Integrated Stamping Simulation Using State Of The Art Techniques To Fulfill Quality Assessment Requirements

    International Nuclear Information System (INIS)

    Ling, David; Lambriks, Marc; El Khaldi, Fouad

    2005-01-01

    The last few years have seen the use of stamping simulation evolve to the extent that it is now a mainstream activity and a core part of the press-tool engineering process. With new requirements for the use of challenging materials like dual-phase/complex-phase steel, VHSS and aluminum, together with more stringent quality expectations and shorter development cycles, there is a need to assess panel quality in a wider context before committing to tool manufacture. The integrated approach from ESI Group allows early up-front feasibility assessment, geometry and process optimization, and detailed process validation, all within one system. Rapid die-design and quick forming-simulation modules play an essential role in the early stages of the process. A seamless connection between simulation and geometry is a vital characteristic, with accurate simulation being used to validate and fine-tune the process in order to assess final component quality in unprecedented detail, utilizing some of the most accurate material models available today. The combination of the distributed memory processing (DMP) solver with new cost-effective cluster-based compute servers provides a practical solution to the problems of 'one million element' model sizes, and more sophisticated modeling methodologies become realistic for the first time. It is no longer sufficient to merely focus on the draw die; forming simulation must now consider the entire die line-up. Typically, around half of forming issues arise from the draw die, so the time has now come to address the other half as well. This paper will discuss how the PAM-STAMP 2G™ integrated solution is successfully used to deliver a positive business impact by providing virtual panel-quality assessment, tolerance control, and springback compensation. The paper will also discuss how other forming processes can be accurately modeled using the new modules

  19. MCNP simulations of a new time-resolved Compton scattering imaging technique

    International Nuclear Information System (INIS)

    Ilan, Y.

    2004-01-01

    Medical images of human tissue can be produced using Computed Tomography (CT), Positron Emission Tomography (PET), Ultrasound or Magnetic Resonance Imaging (MRI). In all of the above techniques, in order to obtain a three-dimensional (3D) image, one has to rotate or move the source, the detectors, or the scanned target. This procedure is complicated and time-consuming, and it increases the cost and weight of the scanning equipment. Time-resolved optical tomography has been suggested as an alternative to these conventional methods. This technique employs near-infrared (NIR) light and fast time-resolved detectors to obtain a 3D image of the scanned target. However, due to the limited penetration of NIR light into tissue, the application of this technique is limited to soft tissue such as the female breast or the premature infant brain

  20. XRF analysis to identify historical photographic processes: The case of some Interguglielmi Jr.’s images from the Palermo Municipal Archive

    International Nuclear Information System (INIS)

    Modica, A.; Alberghina, M.F.; Brai, M.; Bruno, M.; Di Bella, M.; Fontana, D.; Tranchina, L.

    2017-01-01

    In the early period, even though professional photographers worked with similar techniques and products, their artistic and commercial aims determined different choices and led them to follow different, often personal, recipes. For this reason, identification of the technique through the date and name of the photographer, or through visual features like the colour, tonality and surface of the image layer, often needs further investigation to be proved. Chemical characterization, carried out in a non- or micro-destructive way, can be crucial in providing useful information about the original composition, degradation processes and realization technique, in obtaining an indirect dating of the photograph, and/or in choosing the most appropriate conservation treatment. In our case, X-ray fluorescence (XRF) analysis was used to confirm the chemical composition of eleven historical photographs, dated between the end of the 19th century and the beginning of the 20th, shot in Palermo (Sicily) by a renowned photographer of the time and pasted on their original cardboards. The elemental identification, obtained with a non-destructive approach, provided important information to distinguish among different photographic techniques in terms of the distribution and characterization of chemical-element markers in the photographic surface. - Highlights: • Overview of the photographic processes used in the early 20th century. • X-ray fluorescence used to characterize photographs made by different techniques. • Diagnostic and conservative approach in photographic material restoration. • Non-invasive approach to studying photographic materials.

  1. Simulation and experimental tests of a real-time DPWM technique ...

    African Journals Online (AJOL)

    This control strategy is a simple and an easy technique generating the same switching pattern as space vector modulation with less switching losses and reduced total harmonic distortion. The main motivation of the present paper is that the DPWM is not largely and deeply investigated and can present a serious alternative ...

  2. Simulation of limiting dilution technique in determination of immunocompetent cells frequency in irradiated cell cultures

    International Nuclear Information System (INIS)

    Martini Filho, R.J.; Barlette, V.E.; Goes, E.G.; Covas, D.T.; Orellana, M.

    2001-01-01

    Limiting dilution assay (LDA) dose-response data have been used to detect immunocompetent T-cells in microcultures. In this work, LDA frequency estimates were obtained using χ2 minimization for irradiated cells in the range of 500 to 1,500 cGy. (author)
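    The single-hit Poisson model that underlies such dose-response fitting can be sketched as follows; the grid-search minimizer and every name in it are illustrative assumptions, not taken from the paper:

    ```python
    import math

    def estimate_frequency(cells_per_well, wells, negative_wells, f_grid):
        """Estimate the responding-cell frequency f by chi-square minimization.

        Single-hit Poisson model: the expected fraction of negative
        (non-responding) wells seeded with n cells is exp(-f * n).
        """
        best_f, best_chi2 = None, float("inf")
        for f in f_grid:
            chi2 = 0.0
            for n, w, neg in zip(cells_per_well, wells, negative_wells):
                p = math.exp(-f * n)                 # expected negative fraction
                expected = w * p                     # expected negative wells
                var = max(w * p * (1.0 - p), 1e-9)   # binomial variance
                chi2 += (neg - expected) ** 2 / var
            if chi2 < best_chi2:
                best_f, best_chi2 = f, chi2
        return best_f
    ```

    With well-chosen cell doses the minimum of the chi-square surface recovers the underlying frequency.
    
    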

  3. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  4. Monte Carlo simulation for scanning technique with scattering foil free electron beam: A proof of concept study.

    Directory of Open Access Journals (Sweden)

    Wonmo Sung

    This study investigated the potential of a newly proposed scattering foil free (SFF) electron beam scanning technique for the treatment of skin cancer on irregular patient surfaces using Monte Carlo (MC) simulation. After benchmarking of the MC simulations, we removed the scattering foil to generate SFF electron beams. Cylindrical and spherical phantoms with 1 cm boluses were generated and the target volume was defined from the surface to 5 mm depth. The SFF scanning technique with 6 MeV electrons was simulated using those phantoms. For comparison, volumetric modulated arc therapy (VMAT) plans were also generated with two full arcs and 6 MV photon beams. When the scanning resolution resulted in a larger separation between beams than the field size, the plan qualities were worsened. In the cylindrical phantom with a radius of 10 cm, the conformity indices, homogeneity indices and body mean doses of the SFF plans (scanning resolution = 1°) vs. VMAT plans were 1.04 vs. 1.54, 1.10 vs. 1.12 and 5 Gy vs. 14 Gy, respectively. Those of the spherical phantom were 1.04 vs. 1.83, 1.08 vs. 1.09 and 7 Gy vs. 26 Gy, respectively. The proposed SFF plans showed superior dose distributions compared to the VMAT plans.

  5. Monte Carlo simulation for scanning technique with scattering foil free electron beam: A proof of concept study.

    Science.gov (United States)

    Sung, Wonmo; Park, Jong In; Kim, Jung-In; Carlson, Joel; Ye, Sung-Joon; Park, Jong Min

    2017-01-01

    This study investigated the potential of a newly proposed scattering foil free (SFF) electron beam scanning technique for the treatment of skin cancer on the irregular patient surfaces using Monte Carlo (MC) simulation. After benchmarking of the MC simulations, we removed the scattering foil to generate SFF electron beams. Cylindrical and spherical phantoms with 1 cm boluses were generated and the target volume was defined from the surface to 5 mm depth. The SFF scanning technique with 6 MeV electrons was simulated using those phantoms. For comparison, volumetric modulated arc therapy (VMAT) plans were also generated with two full arcs and 6 MV photon beams. When the scanning resolution resulted in a larger separation between beams than the field size, the plan qualities were worsened. In the cylindrical phantom with a radius of 10 cm, the conformity indices, homogeneity indices and body mean doses of the SFF plans (scanning resolution = 1°) vs. VMAT plans were 1.04 vs. 1.54, 1.10 vs. 1.12 and 5 Gy vs. 14 Gy, respectively. Those of the spherical phantom were 1.04 vs. 1.83, 1.08 vs. 1.09 and 7 Gy vs. 26 Gy, respectively. The proposed SFF plans showed superior dose distributions compared to the VMAT plans.
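    The plan-quality metrics quoted above can be computed from a voxelized dose grid roughly as below. Several conventions exist for both indices and the paper does not state which it uses, so these particular definitions (an RTOG-style conformity index and a max/min homogeneity index) are assumptions for illustration only:

    ```python
    def conformity_index(dose, in_target, rx_dose):
        """RTOG-style CI: (voxels receiving >= prescription) / (target voxels).
        `dose` is a flat per-voxel dose list, `in_target` parallel booleans."""
        v_rx = sum(1 for d in dose if d >= rx_dose)
        v_target = sum(1 for t in in_target if t)
        return v_rx / v_target

    def homogeneity_index(dose, in_target):
        """Simple HI: maximum over minimum dose inside the target volume."""
        target_doses = [d for d, t in zip(dose, in_target) if t]
        return max(target_doses) / min(target_doses)
    ```

    Values near 1 for both indices indicate a tightly conformal, uniform target dose, which is the comparison being made between the SFF and VMAT plans.
    
    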

  6. The corner rounding modeling technique in SPICE simulations for deeply scaled MOSFETs

    International Nuclear Information System (INIS)

    Sun Wei; Yang Dake

    2013-01-01

    This paper presents a novel poly (PC) and active (RX) corner rounding modeling approach for SPICE simulations. A set of specially designed structures was used for measurement data collection. PC and RX corner rounding equations were derived based on the assumption that the corner rounding area is a fragment of a circle. The equations were then modified to reflect the gouging effect of physical silicon wafers, and the modified general equations were implemented in the SPICE model to enable it to describe the corner rounding effect. The good fit between the SPICE model simulation results and the silicon data demonstrated in this paper shows that the proposed corner rounding model is practical and accurate. (semiconductor devices)
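    Under the abstract's assumption that the rounded corner is a fragment of a circle, the layout area lost at each 90° corner has a simple closed form. The helpers below are a hypothetical illustration of that geometric term only; the paper's modified equations, including the gouging correction, are not reproduced:

    ```python
    import math

    def corner_loss_area(r):
        """Area removed from a sharp 90-degree layout corner when it prints
        as a quarter-circle of radius r: the r x r corner square minus the
        quarter-disc that remains."""
        return r * r * (1.0 - math.pi / 4.0)

    def effective_gate_area(width, length, r, n_corners=4):
        """Drawn rectangle area corrected for n rounded corners."""
        return width * length - n_corners * corner_loss_area(r)
    ```

    Feeding such an effective area (or an equivalent width/length shift) into the device model is one way a SPICE model can account for corner rounding in deeply scaled layouts.
    
    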

  7. Comparison of Different Measurement Techniques and a CFD Simulation in Complex Terrain

    International Nuclear Information System (INIS)

    Schulz, Christoph; Lutz, Thorsten; Hofsäß, Martin; Anger, Jan; Wen Cheng, Po; Rautenberg, Alexander; Bange, Jens

    2016-01-01

    This paper deals with a comparison of measured data and a simulation for a complex-terrain test site in southern Germany. Lidar, met-mast and unmanned aerial vehicle (UAV) measurements of wind speed and direction, and Computational Fluid Dynamics (CFD) data, are compared with each other. The site is characterised regarding its flow features and its suitability as a wind turbine test field. A Delayed Detached-Eddy Simulation (DES) was employed, using measurement data to generate generic turbulent inflow. Good agreement of the wind profiles between the different approaches was reached. The terrain slope leads to a speed-up and a change of turbulence intensity, as well as to flow-angle variations. (paper)

  8. Simulation of radionuclide chemistry and sorption characteristics in the geosphere by artificial intelligence technique

    International Nuclear Information System (INIS)

    Liu Shangjyh; National Tsing Hua Univ., Hsinchu; Wang Shigang; Ho Liwei

    1988-01-01

    An expert system operated on a personal computer is employed to simulate the chemistry and sorption phenomena of radionuclides in the geosphere. The system handles both qualitative and quantitative analyses, primarily for the actinides and fission products. It also incorporates databases of several groundwater and rock types with mineral and chemical compositions, the distribution coefficients of nuclides for minerals, etc. The decision rule base enables the system to carry out reasoning procedures to predict the solubility-limiting phase, solute species, oxidation states and possible complex formations of radionuclides, as well as to calculate the distribution coefficients and retardation factors in a geological formation, provided that the essential groundwater and host-rock information is available. It is concluded that this artificial-intelligence device provides a vehicle for accumulating developed human knowledge and serves as a tool not only for simulating the complicated behaviour of radionuclides in the geosphere, but also for instructional or educational purposes in this field. (orig.)
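    The retardation factors such a rule base derives from distribution coefficients conventionally follow the classical porous-media transport relation R = 1 + (ρb/n)·Kd. A minimal sketch, assuming that standard relation (the variable names are ours, not the system's):

    ```python
    def retardation_factor(kd, bulk_density, porosity):
        """Classical retardation factor R = 1 + (rho_b / n) * Kd:
        how much more slowly a sorbing nuclide migrates than the
        groundwater carrying it.

        kd           -- distribution coefficient (mL/g)
        bulk_density -- dry bulk density of the rock (g/mL)
        porosity     -- effective porosity (dimensionless)
        """
        return 1.0 + (bulk_density / porosity) * kd
    ```

    A non-sorbing tracer (Kd = 0) gives R = 1, i.e. it moves with the water, while strongly sorbing actinides can have R in the thousands.
    
    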

  9. The Technique of Special-Effects Cinematography.

    Science.gov (United States)

    Fielding, Raymond

    The author describes the many techniques used to produce cinematic effects that would be too costly, too difficult, too time-consuming, too dangerous, or simply impossible to achieve with conventional photographic techniques. He points out that these techniques are available not only for 35 millimeter work but also to the 16 mm. photographer who…

  10. Dosimetric study of prostate brachytherapy using techniques of Monte-Carlo simulation, experimental measurements and comparison with a treatment plan

    International Nuclear Information System (INIS)

    Teles, Pedro; Barros, Silvia; Vaz, Pedro; Goncalves, Isabel; Facure, Alessandro; Rosa, Luiz da; Santos, Maira; Pereira Junior, Pedro Paulo; Zankl, Maria

    2013-01-01

    Prostate brachytherapy is a radiotherapy technique which consists in inserting a number of radioactive seeds (usually containing the radionuclides 125I, 241Am or 103Pd) in or around prostate tumor tissue. The main objective of this technique is to maximize the radiation dose to the tumor and minimize it in healthy tissues and organs, in order to reduce morbidity. The absorbed-dose distribution in the prostate using this technique is usually non-homogeneous and time dependent. Various parameters, such as the type of seed, the attenuation interactions between seeds, their geometrical arrangement within the prostate, the actual geometry of the seeds, and swelling of the prostate gland after implantation, greatly influence the absorbed dose in the prostate and surrounding areas. Quantification of these parameters is therefore extremely important for dose optimization and for improving conventional treatment plans, which in many cases do not fully take them into account. Monte Carlo techniques allow these parameters to be studied quickly and effectively. In this work, we used the program MCNPX and a generic voxel phantom (GOLEM) in which different geometric arrangements of seeds containing 125I (Amersham Health model 6711) were simulated in prostates of different sizes, in order to quantify some of these parameters. The computational model was validated using a cubic RW3-type prostate phantom, consisting of tissue-equivalent material, and thermoluminescent dosimeters. Finally, to have a term of comparison with a real treatment plan, we simulated a treatment plan used in a hospital in Rio de Janeiro, with exactly the same parameters, in our computational model. The results obtained in our study seem to indicate that the parameters described above may be a source of uncertainty in the correct evaluation of the dose in real treatment plans. The use of Monte Carlo techniques can serve as a complementary
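    As a purely illustrative sketch of the geometric superposition at issue, a toy point-source dose estimate over a seed arrangement might look like the following. This is emphatically not the TG-43 formalism used clinically, nor the authors' MCNPX model; inter-seed attenuation and anisotropy, which the full Monte Carlo transport captures, are reduced here to a single hypothetical exponential term:

    ```python
    import math

    def seed_dose(point, seeds, strength, mu=0.0):
        """Toy dose estimate at `point` from a list of seed positions:
        inverse-square superposition with optional exponential attenuation
        (attenuation coefficient `mu` per unit length, a crude stand-in
        for tissue and inter-seed absorption)."""
        total = 0.0
        for s in seeds:
            r2 = sum((p - q) ** 2 for p, q in zip(point, s))
            r = math.sqrt(r2)
            total += strength * math.exp(-mu * r) / r2
        return total
    ```

    Even this crude model makes visible how the arrangement of seeds shapes the dose map; the paper's point is that the neglected terms are exactly where real plans accumulate uncertainty.
    
    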

  11. A Computer Program to Model Passive Acoustic Antisubmarine Search Using Monte Carlo Simulation Techniques.

    Science.gov (United States)

    1983-09-01

    ...duplicate a continuous function on a digital computer, and thus the machine representation of the GMA is only a close approximation of the continuous...error process. Thus, the manner in which the GMA process is digitally replicated has an effect on the results of the simulation. The parameterization of...

  12. FABRICATION OF TISSUE-SIMULATIVE PHANTOMS AND CAPILLARIES AND THEIR INVESTIGATION BY OPTICAL COHERENCE TOMOGRAPHY TECHNIQUES

    Directory of Open Access Journals (Sweden)

    A. V. Bykov

    2013-03-01

    Methods of fabricating tissue-simulating phantoms and capillaries from PVC-plastisol and silicone, for use as test objects in optical coherence tomography (OCT) and for emulating skin and capillaries, are considered. Comparative characteristics of these materials and recommendations for their application are given. Examples of phantom visualization by the optical coherence tomography method are given. The possibility of using information from B-scans for refractive index evaluation is shown.

  13. The FADE mass-stat: A technique for inserting or deleting particles in molecular dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Borg, Matthew K., E-mail: matthew.borg@strath.ac.uk [Department of Mechanical and Aerospace Engineering, University of Strathclyde, Glasgow G1 1XJ (United Kingdom); Lockerby, Duncan A., E-mail: duncan.lockerby@warwick.ac.uk [School of Engineering, University of Warwick, Coventry CV4 7AL (United Kingdom); Reese, Jason M., E-mail: jason.reese@ed.ac.uk [School of Engineering, University of Edinburgh, Edinburgh EH9 3JL (United Kingdom)

    2014-02-21

    The emergence of new applications of molecular dynamics (MD) simulation calls for the development of mass-statting procedures that insert or delete particles on-the-fly. In this paper we present a new mass-stat which we term FADE, because it gradually “fades-in” (inserts) or “fades-out” (deletes) molecules over a short relaxation period within a MD simulation. FADE applies a time-weighted relaxation to the intermolecular pair forces between the inserting/deleting molecule and any neighbouring molecules. The weighting function we propose in this paper is a piece-wise polynomial that can be described entirely by two parameters: the relaxation time scale and the order of the polynomial. FADE inherently conserves overall system momentum independent of the form of the weighting function. We demonstrate various simulations of insertions of atomic argon, polyatomic TIP4P water, polymer strands, and C60 Buckminsterfullerene molecules. We propose FADE parameters and a maximum density variation per insertion-instance that restricts spurious potential energy changes entering the system within desired tolerances. We also demonstrate in this paper that FADE compares very well to an existing insertion algorithm called USHER, in terms of accuracy, insertion rate (in dense fluids), and computational efficiency. The USHER algorithm is applicable to monatomic and water molecules only, but we demonstrate that FADE can be generally applied to various forms and sizes of molecules, such as polymeric molecules of long aspect ratio, and spherical carbon fullerenes with hollow interiors.
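    A two-parameter weighting of the kind described could, for instance, take the following shape. The paper's exact piecewise polynomial is not reproduced here, so this ramp is only a hedged illustration of a function controlled by a relaxation time scale and a polynomial order:

    ```python
    def fade_weight(t, tau, order=3):
        """Scale factor applied to the pair forces of a molecule being
        inserted: 0 before the fade starts, 1 once fully inserted, and a
        monotone polynomial ramp in between.  A deletion ("fade-out")
        would use 1 - fade_weight(t, tau, order)."""
        if t <= 0.0:
            return 0.0
        if t >= tau:
            return 1.0
        return (t / tau) ** order
    ```

    Because the weight multiplies pair forces symmetrically on both interacting molecules, momentum conservation holds regardless of the ramp's exact form, which matches the paper's observation that FADE conserves system momentum independent of the weighting function.
    
    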

  14. The FADE mass-stat: A technique for inserting or deleting particles in molecular dynamics simulations

    International Nuclear Information System (INIS)

    Borg, Matthew K.; Lockerby, Duncan A.; Reese, Jason M.

    2014-01-01

    The emergence of new applications of molecular dynamics (MD) simulation calls for the development of mass-statting procedures that insert or delete particles on-the-fly. In this paper we present a new mass-stat which we term FADE, because it gradually “fades-in” (inserts) or “fades-out” (deletes) molecules over a short relaxation period within a MD simulation. FADE applies a time-weighted relaxation to the intermolecular pair forces between the inserting/deleting molecule and any neighbouring molecules. The weighting function we propose in this paper is a piece-wise polynomial that can be described entirely by two parameters: the relaxation time scale and the order of the polynomial. FADE inherently conserves overall system momentum independent of the form of the weighting function. We demonstrate various simulations of insertions of atomic argon, polyatomic TIP4P water, polymer strands, and C60 Buckminsterfullerene molecules. We propose FADE parameters and a maximum density variation per insertion-instance that restricts spurious potential energy changes entering the system within desired tolerances. We also demonstrate in this paper that FADE compares very well to an existing insertion algorithm called USHER, in terms of accuracy, insertion rate (in dense fluids), and computational efficiency. The USHER algorithm is applicable to monatomic and water molecules only, but we demonstrate that FADE can be generally applied to various forms and sizes of molecules, such as polymeric molecules of long aspect ratio, and spherical carbon fullerenes with hollow interiors

  15. Hot air impingement on a flat plate using Large Eddy Simulation (LES) technique

    Science.gov (United States)

    Plengsa-ard, C.; Kaewbumrung, M.

    2018-01-01

    Hot gas jets impinging on a flat plate generate very high heat transfer coefficients in the impingement zone. The magnitude of the heat transfer prediction near the stagnation point is important, and accurate heat-flux distributions are needed. This research studies the heat transfer and flow field resulting from a single hot air jet impinging on a wall. The simulation is carried out using the computational fluid dynamics (CFD) commercial code FLUENT. A Large Eddy Simulation (LES) approach with a subgrid-scale Smagorinsky-Lilly model is presented. The classical Werner-Wengle wall model is used to compute the predicted results of velocity and temperature near walls. The Smagorinsky constant in the turbulence model is set to 0.1 and is kept constant throughout the investigation. Hot gas jet impingement on a flat plate with a constant surface temperature is chosen to validate the predicted heat-flux results against experimental data. The jet Reynolds number is equal to 20,000, with a fixed jet-to-plate spacing of H/D = 2.0. The Nusselt number on the impingement surface is calculated. As predicted by the wall model, the instantaneous computed Nusselt number agrees fairly well with experimental data. The largest values of the calculated Nusselt number occur near the stagnation point and decrease monotonically in the wall-jet region. Contour plots of instantaneous wall heat flux on the flat plate are also captured by the LES simulation.
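    The Smagorinsky-Lilly closure named above computes a subgrid eddy viscosity from the resolved strain-rate tensor, νt = (Cs Δ)² |S| with |S| = √(2 Sij Sij). A minimal sketch of that formula, evaluated at one grid point (this is the textbook model, not FLUENT's implementation):

    ```python
    import math

    def smagorinsky_nu_t(grad_u, delta, cs=0.1):
        """Subgrid eddy viscosity nu_t = (Cs * Delta)^2 * |S|, where
        |S| = sqrt(2 * S_ij * S_ij) and S_ij = 0.5*(du_i/dx_j + du_j/dx_i).

        grad_u -- 3x3 resolved velocity-gradient tensor, grad_u[i][j] = du_i/dx_j
        delta  -- filter width (e.g. cube root of the cell volume)
        cs     -- Smagorinsky constant (0.1 in the study above)
        """
        s2 = 0.0
        for i in range(3):
            for j in range(3):
                sij = 0.5 * (grad_u[i][j] + grad_u[j][i])
                s2 += 2.0 * sij * sij
        return (cs * delta) ** 2 * math.sqrt(s2)
    ```

    For a pure shear du/dy = g this reduces to νt = (Cs Δ)² g, which shows directly how the fixed constant Cs = 0.1 scales the subgrid dissipation everywhere in the flow.
    
    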

  16. CFD simulation of the pulsed neutron activation technique for water flow measurements

    International Nuclear Information System (INIS)

    Mattsson, H.; Nordlund, A.

    2005-01-01

    A pulsed neutron activation (PNA) flowmeter uses induced radioactivity to measure water flow in pipes. The water in the pipe is bombarded with neutron pulses, thus introducing activity into the pipe. The activity is then transported and mixed with the flow. Gamma radiation emitted from the activity is measured with one or two detectors downstream from the activation point, and the average velocity of the water is calculated from the time-resolved detector signal. The CFD program FLUENT was used to simulate the transport and mixing of the activity induced in the pipe, with the turbulence of the flow described by the k-ε model. Some parameters affecting a PNA measurement were investigated. From the calculations it was possible to quantify how much the average initial velocity of the activity differs from the average velocity of the water. Results also show that activity initially produced far from the wall has a substantial effect on the detector signal; to accurately simulate the detector signal it is necessary to include activity produced in a large part of the pipe. The results also indicate that the collimation of the detectors has a significant impact on the data and should be included when evaluating simulated data. (authors)
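    One standard way to extract an average velocity from two time-resolved detector signals is the transit-time cross-correlation sketched below. This is a generic illustration of the measurement principle, not the evaluation procedure used in the paper:

    ```python
    def transit_time_velocity(sig_up, sig_down, dt, spacing):
        """Average velocity = detector spacing / transit time, where the
        transit time is the lag that maximizes the cross-correlation
        between the upstream and downstream detector signals.

        sig_up, sig_down -- equal-length sampled detector signals
        dt               -- sampling interval
        spacing          -- axial distance between the two detectors
        """
        n = len(sig_up)
        best_lag, best_corr = None, float("-inf")
        for lag in range(1, n):  # lag 0 excluded: zero transit time
            c = sum(sig_up[i] * sig_down[i + lag] for i in range(n - lag))
            if c > best_corr:
                best_lag, best_corr = lag, c
        return spacing / (best_lag * dt)
    ```

    The paper's point is that the activity's initial velocity profile, wall-region production, and detector collimation all distort these signals, which is exactly what the CFD simulation quantifies.
    
    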

  17. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios, E-mail: junhankim@email.arizona.edu [Department of Astronomy and Steward Observatory, University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States)

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
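    A simplified form of such a variability-aware comparison averages the Gaussian likelihood of the data over time-resolved simulation snapshots, so that model variability broadens the effective likelihood instead of being ignored. This sketch is our own illustration of the idea, not the authors' estimator:

    ```python
    import math

    def variability_aware_log_likelihood(data, sigma_data, model_snapshots):
        """Log-likelihood that marginalizes over simulation variability by
        averaging the Gaussian likelihood of the data over time-resolved
        model snapshots (up to a constant normalization).

        data            -- observed quantities (e.g. visibility amplitudes)
        sigma_data      -- per-point measurement uncertainties
        model_snapshots -- list of model predictions, one per simulation frame
        """
        total = 0.0
        for snap in model_snapshots:
            chi2 = sum((d - m) ** 2 / s ** 2
                       for d, m, s in zip(data, snap, sigma_data))
            total += math.exp(-0.5 * chi2)
        return math.log(total / len(model_snapshots))
    ```

    A time-independent analysis would instead fit a single averaged image, which is the case the authors show leads to offset parameters with artificially small uncertainties.
    
    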

  18. Using eLearning techniques to support problem-based learning within a clinical simulation laboratory.

    Science.gov (United States)

    Docherty, Charles; Hoy, Derek; Topp, Helena; Trinder, Kathryn

    2004-01-01

    This paper details the results of the first phase of a project that used eLearning to support students' learning within a simulated environment. The locus was a purpose-built Clinical Simulation Laboratory (CSL) where the School's newly adopted philosophy of Problem Based Learning (PBL) was challenged through lecturers reverting to traditional teaching methods. The solution, a student-centred, problem-based approach to the acquisition of clinical skills, was developed using learning objects embedded within web pages that substituted for lecturers providing instruction and demonstration. This allowed lecturers to retain their facilitator role, and encouraged students to explore, analyse and make decisions within the safety of a clinical simulation. Learning was enhanced through network communications and reflection on video performances of self and others. Evaluations were positive, with students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that an eLearning approach can support PBL in delivering a student-centred learning experience.

  19. Real-Time Simulation Technique of a Microgrid Model for DER Penetration

    Directory of Open Access Journals (Sweden)

    Konstantina Mentesidi

    2014-12-01

    Full Text Available Comprehensive analysis of Distributed Energy Resources (DER) integration requires tools that provide computational power and flexibility. In this context, throughout this paper PHIL simulations are performed to emulate the energy management system of a real microgrid including a diesel synchronous machine and inverter-based sources. Moreover, conventional frequency and voltage droops were incorporated into the respective inverters. The results were verified at the real microgrid installation on the premises of the Centre for Renewable Energy Sources (CRES). This research work is divided into two steps: (a) real-time RSCAD/RTDS and Power Hardware-in-the-Loop (PHIL) simulations, in which the diesel generator's active power droop control is evaluated, the battery inverter's droop curves are simulated, and the load sharing for parallel operation of the system's generation units is examined; and (b) microgrid experiments, during which various tests were executed concerning the diesel generator and the battery inverters in order to examine their dynamic operation within the LV islanded power system.
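
    The conventional frequency droop mentioned above, and the resulting steady-state load sharing between parallel units, can be sketched as follows. The droop gains and load figures are illustrative, not the CRES values.

```python
def droop_frequency(p, f0=50.0, p_ref=0.0, kp=0.02):
    """Frequency droop characteristic: f = f0 - kp * (P - P_ref)."""
    return f0 - kp * (p - p_ref)

def share_load(p_total, droop_gains):
    """Steady-state sharing: at a common frequency deviation df, each unit
    supplies dP_i = df / kp_i, so shares are inverse to the droop gains."""
    inverse_gains = [1.0 / k for k in droop_gains]
    total = sum(inverse_gains)
    return [p_total * w / total for w in inverse_gains]

# A stiffer-droop unit (e.g. the diesel set) and a softer-droop battery
# inverter sharing a 90 kW island load.
shares = share_load(90.0, [0.02, 0.04])
```

    The consistency check is that both units, each loaded with its own share, settle at the same frequency, which is exactly the condition that fixes the sharing ratio.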

  20. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Full Text Available Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time-consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (the Single Machine Infinite Bus system and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) values. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS produce the same result as the IDEAL values starting from a sample size of 100, so about 100 samples of the random variable generated using the LHS method are enough to produce reasonable results for practical purposes in small signal stability application. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, which signifies the robustness of LHS over SRS. A 100-sample LHS run produces the same result as the conventional method with a 50000-point sample; this reduced sample size gives LHS a computational speed advantage (about six times) over the conventional method.
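
    A minimal LHS implementation, with a toy response function standing in for the eigenvalue analysis, shows the variance reduction over SRS that the abstract describes. The response function and sample sizes are invented for the demonstration.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Latin Hypercube Sample of n points in [0,1)^d: one point per stratum
    in each coordinate, with strata paired randomly across dimensions."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # jitter within strata
    for j in range(d):
        rng.shuffle(u[:, j])                               # decouple dimensions
    return u

rng = np.random.default_rng(1)
n = 100
response = lambda x: x[:, 0] ** 2 + x[:, 1]    # toy response; true mean = 1/3 + 1/2

# Repeat the estimate 200 times with each sampling scheme.
est_lhs = [response(latin_hypercube(n, 2, rng)).mean() for _ in range(200)]
est_srs = [response(rng.random((n, 2))).mean() for _ in range(200)]
```

    For this near-additive response the stratification removes almost all of the estimator variance, so the spread of the LHS estimates is orders of magnitude below that of SRS at the same sample size.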

  1. Damage Atlas for Photographic materials

    Directory of Open Access Journals (Sweden)

    Kristel Van Camp

    2010-11-01

    Full Text Available The conservation of photographic documents may require preventive or curative interventions, a choice guided by their state of preservation; a better knowledge of the types of deterioration is therefore crucial. The atlas presented here attempts to classify them according to their specific characteristics and degree of severity. The different types of damage are illustrated and described with a precise terminology, and for each of them the author proposes the intervention that seems most appropriate. The atlas is addressed to everyone concerned with photography, whether in conservation or in the artistic field, in museums or in archives. In order to rescue a damaged photographic object, preventive or conservative actions are needed. Knowing the specific characteristics of different types of damage is crucial, and a damage atlas can provide these characteristics: with the atlas the damage can be recognised and appropriate actions can be taken. This damage atlas offers a first attempt at such a characterisation in the field of photography. It contains images and the necessary information about damage on photographic material. The atlas, with special annotations about the terminology and the grade of the damage, is meant for everybody who works with photographic material, in museums as well as in archives.

  2. Modeling and simulation of defects detection in conductive multi-layered pieces by the eddy current technique

    International Nuclear Information System (INIS)

    Bennoud, S; Zergoug, M

    2015-01-01

    It has been shown that the eddy current method is one of the most effective techniques for the detection and characterization of surface and near-surface defects in conductive media, especially in aluminum alloys. It is one of the most widely applied methods in industries that require maximum reliability and security (aerospace, aeronautics, nuclear, etc.). In this study, a code to solve electromagnetic problems by the finite element method is developed. The suggested model can simulate the probe response to the presence of a defect hidden in a multi-layered or riveted aluminum alloy structure. The developed code is based on the three-dimensional discretization of Maxwell's equations in harmonic mode by the finite element method, using combined potential formulations. This enables the results to be interpreted, presented in graphical form, and used in simulations for various applications.

  3. A fitting algorithm based on simulated annealing techniques for efficiency calibration of HPGe detectors using different mathematical functions

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, S. [Servicio de Radioisotopos, Centro de Investigacion, Tecnologia e Innovacion (CITIUS), Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail: shurtado@us.es; Garcia-Leon, M. [Departamento de Fisica Atomica, Molecular y Nuclear, Facultad de Fisica, Universidad de Sevilla, Aptd. 1065, 41080 Sevilla (Spain); Garcia-Tenorio, R. [Departamento de Fisica Aplicada II, E.T.S.A. Universidad de Sevilla, Avda, Reina Mercedes 2, 41012 Sevilla (Spain)

    2008-09-11

    In this work several mathematical functions are compared in order to perform the full-energy peak efficiency calibration of HPGe detectors, using a 126 cm{sup 3} HPGe coaxial detector and gamma-ray energies ranging from 36 to 1460 keV. Statistical tests and Monte Carlo simulations were used to study the performance of the fitting curve equations. Furthermore, the fitting of these complex functional forms to experimental data is a non-linear multi-parameter minimization problem. In gamma-ray spectrometry, non-linear least-squares fitting algorithms (Levenberg-Marquardt method) usually provide fast convergence while minimizing {chi}{sub R}{sup 2}; however, they sometimes reach only local minima. In order to overcome that shortcoming, a hybrid algorithm based on simulated annealing (HSA) techniques is proposed. Additionally, a new function is suggested that models the efficiency curve of germanium detectors in gamma-ray spectrometry.
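
    A plain simulated-annealing minimizer (without the hybrid refinement stage, and with an assumed log-polynomial efficiency form rather than the paper's new function) can be sketched as follows. The calibration energies and the parameter values used to generate the synthetic data are illustrative.

```python
import math
import random

random.seed(42)

# Assumed efficiency model: ln(eff) = a + b*ln(E) + c*ln(E)^2, fitted to
# synthetic "calibration" points generated from known parameters.
energies = [46.5, 59.5, 88.0, 122.1, 661.7, 1173.2, 1332.5]   # keV, illustrative
a0, b0, c0 = -1.0, 0.8, -0.15
data = [math.exp(a0 + b0 * math.log(e) + c0 * math.log(e) ** 2) for e in energies]

def chi2(p):
    """Sum of squared residuals of the efficiency model against the data."""
    a, b, c = p
    return sum((d - math.exp(a + b * math.log(e) + c * math.log(e) ** 2)) ** 2
               for e, d in zip(energies, data))

def anneal(p, t0=1.0, cooling=0.995, steps=20000):
    """Simulated annealing: Gaussian proposal steps, Metropolis acceptance."""
    best, best_cost = list(p), chi2(p)
    cur, cur_cost, temp = list(p), best_cost, t0
    for _ in range(steps):
        cand = [x + random.gauss(0.0, 0.02) for x in cur]
        cost = chi2(cand)
        if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / temp):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand, cost
        temp *= cooling
    return best, best_cost

fit, cost = anneal([0.0, 0.0, 0.0])
```

    The hybrid scheme of the paper would hand `fit` to a local least-squares routine for final polishing; here the annealer alone already reduces the misfit far below its starting value.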

  4. Improving image quality in Electrical Impedance Tomography (EIT) using a Projection Error Propagation-based Regularization (PEPR) technique: A simulation study

    Directory of Open Access Journals (Sweden)

    Tushar Kanti Bera

    2011-03-01

    Full Text Available A Projection Error Propagation-based Regularization (PEPR) method is proposed to improve reconstructed image quality in Electrical Impedance Tomography (EIT). A projection error is produced by the misfit between the calculated and measured data in the reconstruction process. The variation of the projection error is integrated with the response matrix at each iteration, and the reconstruction is carried out in EIDORS. The PEPR method is studied with simulated boundary data for different inhomogeneity geometries. Simulation results demonstrate that the PEPR technique improves image reconstruction precision in EIDORS and hence can be successfully implemented to increase the reconstruction accuracy in EIT. doi:10.5617/jeb.158. J Electr Bioimp, vol. 2, pp. 2-12, 2011
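
    The idea of tying the regularization strength to the projection error can be sketched on a toy linear inverse problem (not EIDORS, and not a real EIT forward model): the damping term in each regularized update is scaled by the current data misfit, so it relaxes as the reconstruction converges.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ill-conditioned linear problem standing in for the EIT forward map:
# later columns are strongly attenuated, mimicking poorly sensed regions.
J = rng.normal(size=(40, 20)) @ np.diag(1.0 / (1.0 + np.arange(20)) ** 2)
x_true = rng.normal(size=20)
v_meas = J @ x_true + 0.01 * rng.normal(size=40)

x = np.zeros(20)
for _ in range(30):
    residual = v_meas - J @ x
    lam = residual @ residual            # misfit-driven regularization parameter
    dx = np.linalg.solve(J.T @ J + lam * np.eye(20), J.T @ residual)
    x += dx                              # damped Gauss-Newton style update
```

    Early iterations are heavily damped (large misfit, large `lam`); as the residual shrinks the updates approach the undamped least-squares step, which is the qualitative behaviour the PEPR scheme exploits.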

  5. A Novel Idea for Optimizing Condition-Based Maintenance Using Genetic Algorithms and Continuous Event Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-01-01

    Full Text Available Effective maintenance strategies are of utmost significance in systems engineering because of their direct link with the financial aspects and safety of plant operation. Where the state of a system, for instance its level of deterioration, can be continuously observed, a strategy of condition-based maintenance (CBM) may be adopted, wherein upkeep of the system is carried out progressively on the basis of its monitored state. In this article, a multicomponent system that is kept under continuous observation is considered, and a Genetic Algorithm (GA) technique is utilized to decide the optimal deterioration stage at which preventive maintenance should be carried out. The system is configured as a multiobjective problem aimed at optimizing two desired objectives, namely profitability and availability. For realism, a prognostic model portraying the evolution of the deteriorating system is employed, based on continuous event simulation techniques. In this regard, Monte Carlo (MC) simulation has been selected, as it can take into account a wide range of probable outcomes and thereby help in reducing uncertainty. The benefits offered by this simulation technique are fully utilized to represent the various elements of a deteriorating system working under a stressed environment. The proposed synergic model (GA and MC) is considered more effective due to its "drop-by-drop" approach, which successfully drives the search process towards the best optimal solutions.
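
    A minimal sketch of the GA-plus-Monte-Carlo coupling: a GA searches for a preventive-maintenance deterioration threshold, and each candidate's fitness is estimated by Monte Carlo simulation of a deteriorating component. The one-component wear model and all cost figures are invented for the illustration.

```python
import random

random.seed(3)

def mc_profit(threshold, n_runs=50, days=200):
    """Monte Carlo estimate of profit for a preventive-maintenance threshold
    on a [0, 1] deterioration scale (cost figures are illustrative)."""
    total = 0.0
    for _ in range(n_runs):
        wear, profit = 0.0, 0.0
        for _day in range(days):
            wear += random.expovariate(50.0)   # random daily deterioration
            if wear >= 1.0:                    # breakdown: expensive repair
                profit -= 50.0
                wear = 0.0
            elif wear >= threshold:            # planned preventive maintenance
                profit -= 5.0
                wear = 0.0
            else:
                profit += 1.0                  # a normal production day
        total += profit
    return total / n_runs

def genetic_search(pop_size=12, generations=10):
    """Minimal real-coded GA: truncation selection, blend crossover, mutation."""
    pop = [random.uniform(0.1, 0.95) for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=mc_profit, reverse=True)[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = (a + b) / 2.0 + random.gauss(0.0, 0.02)
            children.append(min(0.95, max(0.1, child)))
        pop = parents + children
    return max(pop, key=mc_profit)

best_threshold = genetic_search()
```

    Maintaining too early wastes maintenance cost, maintaining too late risks breakdowns; the GA settles on a threshold that trades the two off under the simulated uncertainty.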

  6. X-ray photographic apparatus

    International Nuclear Information System (INIS)

    1977-01-01

    The X-ray photographic system is designed for medical applications. Two detectors are used for surveys in different planes, and produce electrical signals which are supplied to a comparator. The electron beams are examined according to a system of reference time steps. The apparatus includes a light source and a photo-detector and enables a reference signal to be produced against which the detected signals are compared. The beam source is formed from an electron gun, an extractor electrode and an anode; beam then passes through a collimator. (G.C.)

  7. Specification for personal photographic dosemeters

    International Nuclear Information System (INIS)

    1981-01-01

    The 1981 British/International Standard Specification, prepared under the direction of the Nuclear Engineering Standards Committee and TC85 of the International Organization for Standardization, is described for personal photographic dosemeters. The Standard specifies classification, characteristics and test procedures to determine absorbed doses due to X or gamma radiations (energy less than 3 MeV) and absorbed doses due to beta radiation (max. energy 0.6 to 3 MeV), whether or not accompanied by X, gamma or bremsstrahlung photon radiation. The Standard is particularly applicable to dosemeters intended to be carried on the chest or wrist. (U.K.)

  8. Tsunami Simulators in Physical Modelling Laboratories - From Concept to Proven Technique

    Science.gov (United States)

    Allsop, W.; Chandler, I.; Rossetto, T.; McGovern, D.; Petrone, C.; Robinson, D.

    2016-12-01

    Before 2004, there was little public awareness around Indian Ocean coasts of the potential size and effects of tsunamis. Even in 2011, the scale and extent of devastation by the Japan East Coast Tsunami was unexpected. There were very few engineering tools to assess the onshore impacts of tsunamis, and no agreement on robust methods to predict forces on coastal defences, buildings or related infrastructure. Modelling generally used substantial simplifications: either solitary waves (far too short in duration) or dam breaks (unrealistic and/or uncontrolled wave forms). This presentation will describe research from the EPI-centre, HYDRALAB IV, URBANWAVES and CRUST projects over the last 10 years that has developed and refined pneumatic Tsunami Simulators for the hydraulic laboratory. These unique devices have been used to model generic elevated and N-wave tsunamis up to and over simple shorelines and at example defences. They have reproduced full-duration tsunamis, including the Mercator trace from 2004 at 1:50 scale. Engineering-scale models subjected to those tsunamis have been used to measure wave run-up on simple slopes, forces on idealised sea defences, and pressures/forces on buildings. This presentation will describe how these pneumatic Tsunami Simulators work, demonstrate how they have generated tsunami waves longer than the facility within which they operate, and highlight research results from the three generations of Tsunami Simulator. Of direct relevance to engineers and modellers are measurements of wave run-up levels and comparisons with theoretical predictions. Recent measurements of forces on individual buildings have been generalized by separate experiments on groups of buildings (up to 4 rows), which show that the greatest forces can act on the landward (not seaward) buildings. Continuing research in the 70 m long, 4 m wide Fast Flow Facility on tsunami defence structures has also measured forces on buildings in the lee of a failed defence wall.

  9. Wideband simulation of earthquake ground motion by a spectrum-matching, multiple-pulse technique

    International Nuclear Information System (INIS)

    Gusev, A.; Pavlov, V.

    2006-04-01

    To simulate earthquake ground motion, we combine a multiple-point stochastic earthquake fault model and a suite of Green functions. Conceptually, our source model generalizes the classic one of Haskell (1966). At any time instant, slip occurs over a narrow strip that sweeps the fault area at a (spatially variable) velocity. This behavior defines the seismic signal at lower frequencies (LF) and describes directivity effects. The high-frequency (HF) behavior of the source signal is defined by the local slip history, assumed to be a short segment of pulsed noise. For calculations, this model is discretized as a grid of point subsources. The subsource moment-rate time histories, in their LF part, are smooth pulses whose duration equals the rise time; in their HF part, they are segments of non-Gaussian noise of similar duration. The spectral content of the subsource time histories is adjusted so that the summary far-field signal follows a certain predetermined spectral scaling law. The results of the simulation depend on random seeds and on the particular values of parameters such as: stress drop; average and dispersion parameters for rupture velocity; rupture nucleation point; slip zone width/rise time; a wavenumber-spectrum parameter defining the final slip function; the degrees of non-Gaussianity for the random slip rate in time and for the random final slip in space; and more. To calculate ground motion at a site, Green functions are calculated for each subsource-site pair, then convolved with the subsource time functions and finally summed over subsources. The original Green function calculator for a layered, weakly inelastic medium is of the discrete-wavenumber kind, with no intrinsic limitations with respect to layer thickness or bandwidth. The simulation package can generate example motions, or be used to study uncertainties of the predicted motion.
As a test, realistic analogues of recorded motions in the epicentral zone of the 1994 Northridge, California earthquake were synthesized, and related uncertainties were
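
    The subsource summation described above reduces to "convolve and sum". The sketch below uses random stand-ins for the Green functions and a simple sweeping rupture front; every numerical value is invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n_sub, n_t = 16, 256     # subsources along the fault, samples per trace

# Illustrative stand-ins: one damped-noise impulse response (Green function)
# per subsource-site pair, and a rupture front sweeping the subsource grid.
greens = rng.normal(size=(n_sub, n_t)) * np.exp(-0.05 * np.arange(n_t))
rupture_delay = 3 * np.arange(n_sub)     # arrival of the slip strip, in samples

def moment_rate(delay, width=20, n=n_t):
    """Smooth LF slip pulse of duration `width` starting at `delay`."""
    s = np.zeros(n)
    tau = np.arange(width)
    s[delay:delay + width] = np.sin(np.pi * tau / width) ** 2
    return s

# Site seismogram: convolve each subsource time function with its Green
# function, then sum over subsources.
seismogram = np.zeros(2 * n_t - 1)
for i in range(n_sub):
    seismogram += np.convolve(moment_rate(rupture_delay[i]), greens[i])
```

    In the real method the Green functions come from the discrete-wavenumber calculator and the pulses carry the adjusted HF noise; the summation structure, and hence the directivity produced by the delays, is the same.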

  10. Bridging the scales in atmospheric composition simulations using a nudging technique

    Science.gov (United States)

    D'Isidoro, Massimo; Maurizi, Alberto; Russo, Felicita; Tampieri, Francesco

    2010-05-01

    Studying the interaction between climate and anthropogenic activities, specifically those concentrated in megacities/hot spots, requires the description of processes over a very wide range of scales, from the local scale, where anthropogenic emissions are concentrated, to the global scale, where we are interested in studying the impact of these sources. Describing all the processes at all scales within the same numerical implementation is not feasible because of limited computer resources; therefore, different phenomena are studied by means of different numerical models that cover different ranges of scales. The exchange of information from small to large scales is highly non-trivial though of high interest. In fact, uncertainties in large-scale simulations are expected to receive large contributions from the most polluted areas, where the highly inhomogeneous distribution of sources, combined with the intrinsic non-linearity of the processes involved, can generate non-negligible departures between coarse- and fine-scale simulations. In this work a new method is proposed and investigated in a case study (August 2009) using the BOLCHEM model. Monthly simulations at coarse (0.5°, European domain, run A) and fine (0.1°, Central Mediterranean domain, run B) horizontal resolution are performed, using the coarse resolution as boundary condition for the fine one. Then another coarse-resolution run (run C) is performed, in which the high-resolution fields remapped onto the coarse grid are used to nudge the concentrations over the Po Valley area. The nudging is applied to all gas and aerosol species of BOLCHEM. Averaged concentrations and variances over the Po Valley and other selected areas are computed for O3 and PM. It is observed that although the variance of run B is markedly larger than that of run A, the variance of run C is smaller, because the remapping procedure removes a large portion of variance from the run B fields. Mean concentrations show some differences depending on species: in general mean
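
    Nudging of this kind is Newtonian relaxation of the coarse-grid field toward the remapped fine-grid field. A one-variable sketch, with an assumed one-hour relaxation time and illustrative concentration values:

```python
def nudge(coarse, fine_remapped, k, dt):
    """One Newtonian-relaxation step: dc/dt includes a term k * (fine - coarse),
    pulling the coarse field toward the remapped fine field."""
    return coarse + k * dt * (fine_remapped - coarse)

c = 40.0          # coarse-run concentration over the nudged area (illustrative)
target = 55.0     # remapped fine-run value over the same area
for _ in range(100):
    c = nudge(c, target, k=1.0 / 3600.0, dt=360.0)   # 1 h relaxation time
```

    Each step closes a fixed fraction `k * dt` of the gap, so after many steps the coarse value converges exponentially to the fine-grid target, while the grid-scale variance of the fine run has already been removed by the remapping.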

  11. Pricing Survivor Forwards and Swaps in Incomplete Markets Using Simulation Techniques

    DEFF Research Database (Denmark)

    Boyer, M. Martin; Favaro, Amélie; Stentoft, Lars

    2012-01-01

    This article considers how to manage longevity risk using longevity derivative products. We review the potential counterparties that naturally have exposure to this type of risk and we provide details on two very simple products, the survivor forward and the survivor swap, that can be used to trade this type of risk. We then discuss how such products can be priced using a simulation-based approach that has been shown to be successful in pricing financial derivatives. To illustrate the flexibility of the approach we price survivor forwards and swaps using the simple dynamics of the Lee...
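
    Simulation-based pricing of a survivor forward can be sketched as follows. The mortality dynamics below are an assumed random walk with drift in the log hazard, in the spirit of (but not calibrated to) the Lee-Carter family, and all rates are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate log-hazard paths and the realized survivor index S(T) per path.
n_paths, horizon = 20000, 10          # paths, years to maturity
mu0, drift, sigma = 0.02, -0.01, 0.05
log_mu = np.log(mu0) + np.cumsum(
    drift + sigma * rng.normal(size=(n_paths, horizon)), axis=1)
survival = np.exp(-np.exp(log_mu).sum(axis=1))   # S(T) for each path

r = 0.03
discount = np.exp(-r * horizon)

# Survivor forward: at maturity, pay a fixed rate K and receive realized S(T).
# The fair K makes the contract worth zero at inception (risk-neutral pricing
# with independent mortality assumed for the sketch).
K_fair = survival.mean()
value = discount * (survival.mean() - K_fair)
```

    A survivor swap is then a portfolio of such forwards, one per payment date, and the same simulated paths price every leg.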

  12. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion, a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error E (the difference between experiments and simulations) but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be
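
    The threshold-based comparison of |S-D| against |Threshold-S| can be sketched with a hand-rolled Welch t statistic. The shear-stress samples and the threshold value below are invented for the illustration, not the FDA nozzle data.

```python
import math

def t_statistic(xs, ys):
    """Welch's t statistic for two independent samples."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Illustrative viscous shear-stress samples (Pa): simulations (S), experiments
# (D), and an assumed hemolysis threshold well above both.
threshold = 600.0
sim = [412.0, 405.0, 420.0, 398.0, 415.0]
exp = [391.0, 402.0, 385.0, 397.0, 388.0]

comparison_error = [abs(s - d) for s, d in zip(sim, exp)]   # |S - D|
margin = [abs(threshold - s) for s in sim]                  # |Threshold - S|
t = t_statistic(comparison_error, margin)
```

    A strongly negative t says the model-experiment disagreement is small compared with the distance to the safety threshold, which is the sense in which the model can be acceptable even when S and D differ statistically.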

  13. Simulation technique on combustion of solid propellant; Kotai suishin`yaku nensho no simyureshon gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Iida, Akihide.; Bazaki, Hakobu.; Douke, Kiyotaka. [Asahi Chemical Industry Corp., Tokyo (Japan). Oita Plant

    1999-04-30

    The burning area of a propellant grain is one of the most important parameters in the design of solid rocket performance. However, it has been difficult to calculate the burning area precisely and quickly by geometrical methods, since most propellant configurations are complicated. In the present study, a simulation system was developed that applies a "particle chasing method" to compute the burning area transition, and the reliability of its computations was checked. The discrepancy between calculations by the geometrical method and by the system was found to be less than 1%. (author)
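
    For a geometry simple enough to check analytically, a hollow cylindrical grain burning outward from its port, a particle-based burning-area calculation can be sketched as below. The grain dimensions and particle count are illustrative; real grains need particles chased along genuinely varying surface normals.

```python
import math

# Hollow cylindrical grain: the port surface regresses radially outward.
r_inner, r_outer, length = 0.05, 0.20, 1.0   # metres, illustrative
n_particles = 720

def burning_area(web):
    """Chase each surface particle outward along its (radial) normal by the
    burned web distance; each particle carries one arc element of the port."""
    r = r_inner + web
    if r >= r_outer:
        return 0.0                            # grain burned out
    arc_element = 2.0 * math.pi * r / n_particles
    return n_particles * arc_element * length  # sum of the carried elements

areas = [burning_area(0.01 * i) for i in range(16)]
```

    Here the particle sum reproduces the exact cylinder area 2πrL at every web step; the value of the method is that the same particle bookkeeping still works when the geometry is too complicated for a closed-form area.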

  14. Decision Simulation Technique (DST) as a scanning tool for exploring and explicating sustainability issues in transport decision making

    DEFF Research Database (Denmark)

    Jeppesen, Sara Lise

    2009-01-01

    This paper places focus on explicit consideration of sustainability issues in transport decision making by presenting and using a developed "Decision Simulation Technique" (DST). This technique can be used by an analyst to 'scan' a transport planning problem with regard to what in DST terms is called a sustainability strategy. This scanning can serve the purpose of informing a group of decision makers before they actually have to deal with, for example, the choice among a number of alternatives that have all been formulated as being relevant. The main focus of the paper is to illustrate how...

  15. Growth of CdTe on Si(100) surface by ionized cluster beam technique: Experimental and molecular dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Araghi, Houshang, E-mail: araghi@aut.ac.ir [Department of Physics, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Zabihi, Zabiholah [Department of Physics, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Nayebi, Payman [Department of Physics, College of Technical and Engineering, Saveh Branch, Islamic Azad University, Saveh (Iran, Islamic Republic of); Ehsani, Mohammad Mahdi [Department of Physics, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of)

    2016-10-15

    II–VI semiconductor CdTe was grown on the Si(100) substrate surface by the ionized cluster beam (ICB) technique. In the ICB method, when vapors of solid materials such as CdTe were ejected through a nozzle of a heated crucible into a vacuum region, nanoclusters were created by an adiabatic expansion phenomenon. The clusters thus obtained were partially ionized by electron bombardment and then accelerated onto the silicon substrate at 473 K by high potentials. The cluster size was determined using a retarding field energy analyzer. The results of X-ray diffraction measurements indicate the cubic zinc blende (ZB) crystalline structure of the CdTe thin film on the silicon substrate. The CdTe thin film prepared by the ICB method had high crystalline quality. The microscopic processes involved in the ICB deposition technique, such as impact and coalescence processes, have been studied in detail by molecular dynamics (MD) simulation.

  16. Simulating the performance of adaptive optics techniques on FSO communications through the atmosphere

    Science.gov (United States)

    Martínez, Noelia; Rodríguez Ramos, Luis Fernando; Sodnik, Zoran

    2017-08-01

    The Optical Ground Station (OGS), installed at the Teide Observatory since 1995, was built as part of ESA's efforts in the research field of satellite optical communications, to test laser telecommunication terminals on board satellites in Low Earth Orbit and Geostationary Orbit. As long as one side of the link is located on the Earth, the laser beam (on either the uplink or the downlink) has to contend with atmospheric turbulence. Within the framework of designing an Adaptive Optics (AO) system to improve the performance of the free-space optical (FSO) communications at the OGS, turbulence conditions for the uplink and downlink have been simulated within the OOMAO (Object-Oriented Matlab Adaptive Optics) Toolbox, as has the possible use of a Laser Guide Star to measure the wavefront in this context. Simulations have been carried out by reducing available atmospheric profiles from both night-time and day-time measurements and by taking possible seasonal changes into account. An AO proposal to reduce atmospheric aberrations and thereby improve FSO link performance is presented and analysed in this paper.

  17. Conventional QT Variability Measurement vs. Template Matching Techniques: Comparison of Performance Using Simulated and Real ECG

    Science.gov (United States)

    Baumert, Mathias; Starc, Vito; Porta, Alberto

    2012-01-01

    Increased beat-to-beat variability in the QT interval (QTV) of ECG has been associated with increased risk for sudden cardiac death, but its measurement is technically challenging and currently not standardized. The aim of this study was to investigate the performance of commonly used beat-to-beat QT interval measurement algorithms. Three different methods (conventional, template stretching and template time shifting) were subjected to simulated data featuring typical ECG recording issues (broadband noise, baseline wander, amplitude modulation) and real short-term ECG of patients before and after infusion of sotalol, a QT interval prolonging drug. Among the three algorithms, the conventional algorithm was most susceptible to noise whereas the template time shifting algorithm showed superior overall performance on simulated and real ECG. None of the algorithms was able to detect increased beat-to-beat QT interval variability after sotalol infusion despite marked prolongation of the average QT interval. The QTV estimates of all three algorithms were inversely correlated with the amplitude of the T wave. In conclusion, template matching algorithms, in particular the time shifting algorithm, are recommended for beat-to-beat variability measurement of QT interval in body surface ECG. Recording noise, T wave amplitude and the beat-rejection strategy are important factors of QTV measurement and require further investigation. PMID:22860030
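
    A minimal version of the template time-shifting idea: each beat's QT change is estimated as the lag that best aligns its T wave with a fixed template. The synthetic T waves, sampling rate, and shift range below are invented for the illustration, not real ECG.

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 1000                                   # Hz
t = np.arange(0.0, 0.4, 1.0 / fs)           # one repolarization window

def t_wave(shift_s):
    """Synthetic T wave; shift_s delays the wave, mimicking QT lengthening."""
    return np.exp(-0.5 * ((t - 0.2 - shift_s) / 0.04) ** 2)

template = t_wave(0.0)
true_shifts = rng.uniform(-0.01, 0.01, size=50)   # beat-to-beat shifts, seconds

def estimate_shift(beat, template, fs, max_lag=30):
    """Template time shifting: pick the lag maximizing the match score."""
    lags = range(-max_lag, max_lag + 1)
    scores = [np.dot(np.roll(template, k), beat) for k in lags]
    return lags[int(np.argmax(scores))] / fs

est = [estimate_shift(t_wave(s), template, fs) for s in true_shifts]
```

    Because the whole template is slid rather than individual fiducial points being re-detected, the estimate is limited mainly by the sampling interval, which is one reason this class of algorithm tolerates noise better than the conventional approach.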

  18. Conventional QT variability measurement vs. template matching techniques: comparison of performance using simulated and real ECG.

    Directory of Open Access Journals (Sweden)

    Mathias Baumert

    Full Text Available Increased beat-to-beat variability in the QT interval (QTV) of ECG has been associated with increased risk for sudden cardiac death, but its measurement is technically challenging and currently not standardized. The aim of this study was to investigate the performance of commonly used beat-to-beat QT interval measurement algorithms. Three different methods (conventional, template stretching and template time shifting) were subjected to simulated data featuring typical ECG recording issues (broadband noise, baseline wander, amplitude modulation) and real short-term ECG of patients before and after infusion of sotalol, a QT interval prolonging drug. Among the three algorithms, the conventional algorithm was most susceptible to noise whereas the template time shifting algorithm showed superior overall performance on simulated and real ECG. None of the algorithms was able to detect increased beat-to-beat QT interval variability after sotalol infusion despite marked prolongation of the average QT interval. The QTV estimates of all three algorithms were inversely correlated with the amplitude of the T wave. In conclusion, template matching algorithms, in particular the time shifting algorithm, are recommended for beat-to-beat variability measurement of QT interval in body surface ECG. Recording noise, T wave amplitude and the beat-rejection strategy are important factors of QTV measurement and require further investigation.

  19. Numerical Simulation of Molten Flow in Directed Energy Deposition Using an Iterative Geometry Technique

    Science.gov (United States)

    Vincent, Timothy J.; Rumpfkeil, Markus P.; Chaudhary, Anil

    2018-06-01

    The complex, multi-faceted physics of laser-based additive metals processing tends to demand high-fidelity models and costly simulation tools to provide predictions accurate enough to aid in selecting process parameters. Of particular difficulty is the accurate determination of melt pool shape and size, which are useful for predicting lack-of-fusion, as this typically requires an adequate treatment of thermal and fluid flow. In this article we describe a novel numerical simulation tool which aims to achieve a balance between accuracy and cost. This is accomplished by making simplifying assumptions regarding the behavior of the gas-liquid interface for processes with a moderate energy density, such as Laser Engineered Net Shaping (LENS). The details of the implementation, which is based on the solver simpleFoam of the well-known software suite OpenFOAM, are given here and the tool is verified and validated for a LENS process involving Ti-6Al-4V. The results indicate that the new tool predicts width and height of a deposited track to engineering accuracy levels.

  1. Simulation model of harmonics reduction technique using shunt active filter by cascade multilevel inverter method

    Science.gov (United States)

    Andreh, Angga Muhamad; Subiyanto, Sunardiyo, Said

    2017-01-01

    With the growth of non-linear loads in industrial applications and distribution systems, harmonic compensation has become important. Harmonic pollution is a pressing problem for improving power quality. The main contributions of this study are the modeling approach used to design a shunt active filter and the application of the cascade multilevel inverter topology to improve the quality of electrical power. Here, the shunt active filter eliminates the dominant harmonic components by injecting currents opposite to the harmonic components of the system. The filter was designed in a shunt configuration using the cascaded multilevel inverter method, controlled by a PID controller and SPWM. With this shunt active filter, the harmonic current can be reduced so that the source current waveform is approximately sinusoidal. Design and simulation were conducted using Power Simulator (PSIM) software. The filter's performance was evaluated on the IEEE four-bus test system, where its installation reduced the current THD from 28.68% to 3.09%. With this result, the active filter can be applied as an effective method to reduce harmonics.
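
    The THD figures quoted above follow the usual definition: the RMS of the harmonic content divided by the RMS of the fundamental. A minimal sketch (the function name and input convention are ours):

```python
import numpy as np

def thd(harmonic_rms):
    """Total harmonic distortion from per-harmonic RMS magnitudes,
    ordered [fundamental, 2nd, 3rd, ...]."""
    fundamental = harmonic_rms[0]
    harmonics = np.asarray(harmonic_rms[1:], dtype=float)
    return np.sqrt(np.sum(harmonics ** 2)) / fundamental
```

    For example, a source current with a 10 A fundamental and 2 A / 1 A harmonics has THD = sqrt(5)/10, about 22.4%; an ideal shunt active filter injecting opposite harmonic currents drives the numerator, and hence the THD, toward zero.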

  2. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States); Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States); Collis, Jon [Colorado School of Mines, Golden, CO (United States)

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
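
    Of the four methods, the simple output ratio calibration is the easiest to make concrete. A minimal sketch (our own illustration, not NREL's implementation): a single multiplier forces the model's annual total to match the utility bills.

```python
def output_ratio_calibrate(predicted_monthly, measured_monthly):
    """Scale the model's monthly predictions by one ratio so the annual
    total matches the measured (synthetic utility) total."""
    ratio = sum(measured_monthly) / sum(predicted_monthly)
    return [p * ratio for p in predicted_monthly]
```

    This cheap approach preserves the model's monthly shape, which is why the study also evaluates optimization-based methods that can adjust individual inputs.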

  4. A Photographer From Ankara: Osman Darcan

    Directory of Open Access Journals (Sweden)

    Gülseren Mungan Yavuztürk

    2015-12-01

    Full Text Available This work introduces Osman Darcan, an important name in the history of Ankara photography studios. Darcan followed in the footsteps of famous Austrian photographer Othmar Pferschy, whom he met in Istanbul, to go on to create his own valuable work. On leaving the Public Press Authority Photo Film Center, where he worked as a newsreel photographer and film operator, in 1943 he began taking photographs at the Tatbikat Theater at the Ankara State Conservatoire, where he continued as the photographer for the State Theater until the end of his life. At the same time, this master photographer took the pictures of a select coterie of Ankara’s leading individuals and well-known performers at a studio he opened on Anafartalar Caddesi. In both these roles, his photographs evoke admiration thanks to Darcan’s professional abilities and level of artistry.

  5. Comparison between fluorimetry and oximetry techniques to measure photosynthesis in the diatom Skeletonema costatum cultivated under simulated seasonal conditions.

    Science.gov (United States)

    Lefebvre, Sébastien; Mouget, Jean-Luc; Loret, Pascale; Rosa, Philippe; Tremblin, Gérard

    2007-02-01

    This study compares two techniques for measuring photosynthesis in the ubiquitous diatom Skeletonema costatum: classical oximetry and modulated fluorimetry. Microalgae in semi-continuous cultures were exposed to five environmental conditions simulating a seasonal effect, with co-varying temperature, photoperiod and incident light. Photosynthesis was assessed by measurements of the gross rate of oxygen evolution (P(B)) and the electron transport rate (ETR). The two techniques were linearly related within seasonal treatments along the course of the P/E curves. The light saturation intensity parameters (Ek and Ek(ETR)) and the maximum electron transport rate increased significantly with the progression of the season, while the maximum light utilization efficiency for ETR (alpha(ETR)) was constant. By contrast, the maximum gross oxygen photosynthetic capacity (Pmax(B)) and the maximum light utilization efficiency for P(B) (alpha(B)) increased from the December to the May treatment but decreased from the May to the July treatment. Both techniques showed clear photoacclimation in the microalgae over the course of the season, as illustrated by changes in the photosynthetic parameters. The relationship between the two techniques changed when high temperature, photoperiod and incident light were combined, possibly due to an overestimation of the PAR-averaged chlorophyll-specific absorption cross-section. Despite this change, our results illustrate the strong suitability of in vivo chlorophyll fluorimetry for estimating primary production in the field.

  6. Uncertainty estimates of a GRACE inversion modelling technique over Greenland using a simulation

    Science.gov (United States)

    Bonin, Jennifer; Chambers, Don

    2013-07-01

    The low spatial resolution of GRACE causes leakage, where signals in one location spread out into nearby regions. Because of this leakage, simple techniques such as basin averages may yield an incorrect estimate of the true mass change in a region. A fairly simple least squares inversion technique can be used to localize mass changes more specifically into a pre-determined set of basins of uniform internal mass distribution. However, the accuracy of these higher resolution basin mass amplitudes has not been determined, nor is it known how the distribution of the chosen basins affects the results. We use a simple 'truth' model over Greenland as an example case to estimate the uncertainties of this inversion method and expose those design parameters which may result in an incorrect high-resolution mass distribution. We determine that an appropriate level of smoothing (300-400 km) and process noise (0.30 cm² of water) yields the best results. The trends of the Greenland internal basins and Iceland can be reasonably estimated with this method, with average systematic errors of 3.5 cm yr⁻¹ per basin. The largest mass losses found from GRACE RL04 occur in the coastal northwest (-19.9 and -33.0 cm yr⁻¹) and southeast (-24.2 and -27.9 cm yr⁻¹), with small mass gains (+1.4 to +7.7 cm yr⁻¹) found across the northern interior. Acceleration of mass change is measurable at the 95 per cent confidence level in four northwestern basins, but not elsewhere in Greenland. Due to an insufficiently detailed distribution of basins across internal Canada, the trend estimates of Baffin and Ellesmere Islands are expected to be incorrect due to systematic errors caused by the inversion technique.
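
    The least squares inversion described above can be sketched in a toy one-dimensional setting (the Gaussian leakage operator, grid, and amplitudes below are invented for illustration): each basin's unit mass appears in the observations as a smoothed bump, and the basin amplitudes are recovered by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
obs_x = np.linspace(0.0, 10.0, 25)             # observation grid
centers = np.array([2.0, 5.0, 8.0])            # basin centers
# Leakage operator: unit mass in a basin appears as a smoothed Gaussian bump
G = np.exp(-0.5 * ((obs_x[:, None] - centers[None, :]) / 1.5) ** 2)
m_true = np.array([3.0, -1.0, 2.0])            # true basin amplitudes
d = G @ m_true + 0.01 * rng.standard_normal(obs_x.size)   # noisy observations
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)  # inverted basin amplitudes
```

    With well-separated basins the inversion is well conditioned; the systematic errors noted in the abstract arise when the chosen basins are too few or overlap too strongly relative to the smoothing radius.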

  7. Photographs of quantized vortex lines in rotating superfluid helium

    International Nuclear Information System (INIS)

    Williams, G.A.

    1974-01-01

    The spatial positions of quantized vortex lines in rotating He II have been determined using a photographic technique. Electrons are trapped on the vortices and then extracted through the liquid surface and accelerated into a phosphor screen. The light from the phosphor is transmitted to room temperature with coherent fiber optics and photographed with an image intensifier camera. Photographs taken with pure ⁴He at T = 0.3 K were complete blurs. These blurs are attributed to nonequilibrium motion of the vortices, arising from the lack of normal fluid damping at this temperature. To resolve the individual vortex lines it was found necessary to add ³He to the ⁴He sample to damp the vortex motion. Photographs are presented for ³He concentrations up to 1.6 percent. The number of vortices visible varies linearly with rotation speed, but is only about one-half the number expected from theory. The vortex lines in the apparatus were not observed to form a stable array.

  8. Monitoring engineering structures by the comparison of similar photographs

    International Nuclear Information System (INIS)

    Jones, A.

    1976-12-01

    A commonly used method of monitoring engineering structures is to compare similar photographs taken at different times. The initial part of this note deals with commercially available equipment, known as a comparascope, which enables differences between photographs to be rapidly (and reliably) detected. A series of practical tests is described in which it is established that a change in dimensions of 0.05 mm can be detected between photographs. For typical camera systems, this will usually correspond to detectable displacements of the order of several mm in object space. Perhaps the most serious disadvantage of the technique is that alterations in camera attitude between photographs can cause changes in the recorded image which mask genuine movements in the structure. The changes caused by a given shift in camera attitude are, therefore, investigated theoretically. Since it is desirable that the changes are small enough to go undetected in the comparison, the established detection limit of the comparascope is included in the investigation to specify how accurately the camera attitude must be controlled for a given set of experimental circumstances. As a result, it appears that a special purpose camera mounting will nearly always be required if structural differences as small as several mm are to be reliably detected. Hand-held cameras should only be used for relatively coarse monitoring tasks. (author)
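
    The conversion from the 0.05 mm image-plane detection limit to a displacement in object space follows directly from the camera magnification. A minimal sketch under a thin-lens approximation (the function name and example numbers are ours, not the note's):

```python
def detectable_object_displacement(image_limit_mm, focal_mm, distance_mm):
    """Smallest object-space movement detectable for a given image-plane
    detection limit, using the thin-lens magnification m = f / (D - f)."""
    magnification = focal_mm / (distance_mm - focal_mm)
    return image_limit_mm / magnification
```

    For instance, a 50 mm lens at about 5 m gives a magnification of 1/100, so a 0.05 mm image-plane limit corresponds to 5 mm in object space, consistent with the "several mm" figure quoted in the abstract.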

  9. Track photographing in 8-m streamer chamber

    International Nuclear Information System (INIS)

    Anisimova, N.Z.; Davidenko, V.A.; Kantserov, V.A.; Rybakov, V.G.; Somov, S.V.

    1981-01-01

    A system for obtaining data from a streamer chamber intended for measuring muon polarization is described, including the optical scheme for photographing tracks in the chamber. Photography is complicated by the chamber's large dimensions and modular structure, as well as by streamer brightness that is insufficient for direct photography. The system was tested over a long period in a physics experiment, during which more than 100,000 photographs were taken [ru

  10. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...
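
    As a taste of the FTS models the book surveys, here is a minimal sketch of Chen's classical first-order method (the interval count, names, and mean-of-midpoints defuzzification follow the standard textbook formulation, not any specific model from the book): values are fuzzified into intervals, first-order transition groups are collected, and each forecast is the mean midpoint of the observed successor intervals.

```python
import numpy as np

def chen_fts_forecast(series, k=5):
    """One-step-ahead forecasts with Chen's first-order fuzzy time series:
    partition the range into k intervals, collect fuzzy logical relationship
    groups, and defuzzify as the mean of successor-interval midpoints."""
    lo, hi = min(series), max(series)
    edges = np.linspace(lo, hi, k + 1)
    mids = (edges[:-1] + edges[1:]) / 2.0
    states = np.clip(np.searchsorted(edges, series, side="right") - 1, 0, k - 1)
    groups = {}
    for a, b in zip(states[:-1], states[1:]):
        groups.setdefault(int(a), set()).add(int(b))
    # Forecast for t+1 given the state at t (falls back to the own midpoint)
    return [float(np.mean([mids[j] for j in sorted(groups.get(int(s), {int(s)}))]))
            for s in states]
```

    Hybrid variants of the kind the book describes replace the defuzzification or group-building step with an ANN or particle-swarm-tuned component.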

  11. Simulation on the Effect of Bottle Wall Thickness Distribution using Blow Moulding Technique

    International Nuclear Information System (INIS)

    Suraya, S; Azman, M D; Fatchurrohman, N; Jaafar, A A; Yusoff, A R

    2016-01-01

    The aim of this study is to assess the deformation behavior of a polymeric material during a blow moulding process. Transient computations of a two-dimensional model of a PP bottle were performed using the ANSYS Polyflow computer code to predict the wall thickness distribution at four different parison diameters: 8 mm, 10 mm, 18 mm, and 20 mm. Effects on the final wall thickness and the time step are studied. The simulated data show that the inflation performance degrades with increasing parison diameter. It is concluded that the blow moulding process using the 10 mm parison successfully meets the product processing requirements. Factors that contribute to the variation in deformation behaviour of the plastic during the manufacturing process are discussed. (paper)

  12. Complex Langevin simulation of QCD at finite density and low temperature using the deformation technique

    Science.gov (United States)

    Nagata, Keitaro; Nishimura, Jun; Shimasaki, Shinji

    2018-03-01

    We study QCD at finite density and low temperature by using the complex Langevin method. We employ gauge cooling to control the unitarity norm and introduce a deformation parameter in the Dirac operator to avoid the singular-drift problem. The reliability of the obtained results is judged by the probability distribution of the magnitude of the drift term. By making extrapolations with respect to the deformation parameter using only the reliable results, we obtain results for the original system. We perform simulations on a 4³ × 8 lattice and show that our method works well even in the region where the reweighting method fails due to the severe sign problem. As a result we observe a delayed onset of the baryon number density as compared with the phase-quenched model, which is a clear sign of the Silver Blaze phenomenon.

  13. A dynamic mesh refinement technique for Lattice Boltzmann simulations on octree-like grids

    KAUST Repository

    Neumann, Philipp

    2012-04-27

    In this contribution, we present our new adaptive Lattice Boltzmann implementation within the Peano framework, with special focus on nanoscale particle transport problems. With the continuum hypothesis not holding anymore on these small scales, new physical effects - such as Brownian fluctuations - need to be incorporated. We explain the overall layout of the application, including memory layout and access, and briefly review the adaptive algorithm. The scheme is validated by different benchmark computations in two and three dimensions. An extension to dynamically changing grids and a spatially adaptive approach to fluctuating hydrodynamics, allowing for the thermalisation of the fluid in particular regions of interest, is proposed. Both dynamic adaptivity and adaptive fluctuating hydrodynamics are validated separately in simulations of particle transport problems. The application of this scheme to an oscillating particle in a nanopore illustrates the importance of Brownian fluctuations in such setups. © 2012 Springer-Verlag.

  14. Velocity-Resolved LES (VR-LES) technique for simulating turbulent transport of high Schmidt number passive scalars

    Science.gov (United States)

    Verma, Siddhartha; Blanquart, Guillaume; P. K. Yeung Collaboration

    2011-11-01

    Accurate simulation of high Schmidt number scalar transport in turbulent flows is essential to studying pollutant dispersion, weather, and several oceanic phenomena. Batchelor's theory governs scalar transport in such flows, but requires further validation at high Schmidt and high Reynolds numbers. To this end, we use a new approach with the velocity field fully resolved, but the scalar field only partially resolved. The grid used is fine enough to resolve scales up to the viscous-convective subrange where the decaying slope of the scalar spectrum becomes constant. This places the cutoff wavenumber between the Kolmogorov scale and the Batchelor scale. The subgrid scale terms, which affect transport at the supergrid scales, are modeled under the assumption that velocity fluctuations are negligible beyond this cutoff wavenumber. To ascertain the validity of this technique, we performed a priori testing on existing DNS data. This Velocity-Resolved LES (VR-LES) technique significantly reduces the computational cost of turbulent simulations of high Schmidt number scalars, and yet provides valuable information of the scalar spectrum in the viscous-convective subrange.
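
    The resolved/subgrid split at a cutoff wavenumber can be illustrated with a sharp spectral filter in one dimension (a generic illustration of the cutoff idea, not the VR-LES solver itself):

```python
import numpy as np

def spectral_cutoff(field, k_cut):
    """Sharp spectral low-pass filter: retain Fourier modes with wavenumber
    <= k_cut and zero the rest, mimicking a partially resolved scalar field."""
    fh = np.fft.rfft(field)
    fh[k_cut + 1:] = 0.0
    return np.fft.irfft(fh, n=field.size)
```

    Modes above the cutoff play the role of the subgrid scalar field whose effect on the resolved scales must then be modeled.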

  15. Developing close combat behaviors for simulated soldiers using genetic programming techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Pryor, Richard J.; Schaller, Mark J.

    2003-10-01

    Genetic programming is a powerful methodology for automatically producing solutions to problems in a variety of domains. It has been used successfully to develop behaviors for RoboCup soccer players and simple combat agents. We will attempt to use genetic programming to solve a problem in the domain of strategic combat, keeping in mind the end goal of developing sophisticated behaviors for compound defense and infiltration. The simplified problem at hand is that of two armed agents in a small room, containing obstacles, fighting against each other for survival. The base case and three changes are considered: a memory of positions using stacks, context-dependent genetic programming, and strongly typed genetic programming. Our work demonstrates slight improvements from the first two techniques, and no significant improvement from the last.

  16. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    Science.gov (United States)

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

    This paper analyses the spectral efficiency of Optical Code Division Multiple Access (OCDMA) system using Importance Sampling (IS) technique. We consider three configurations of OCDMA system namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH) that exploits the Fiber Bragg Gratings (FBG) based encoder/decoder. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of OCDMA system with coherent source is higher than the incoherent case. We demonstrate also that DS-OCDMA outperforms both others in terms of spectral efficiency in all conditions.

  17. Photographic quality assurance in diagnostic radiology, nuclear medicine, and radiation therapy. Volume II. Photographic processing, quality assurance, and the evaluation of photographic materials. Final report

    International Nuclear Information System (INIS)

    Gray, J.E.; Vucich, J.J.

    1977-03-01

    Contents: Sensitometers, densitometers, and testing equipment; Pitfalls of the photographic (and radiographic) process; Evaluation and optimization of photographic processes; Quality assurance; Odds 'n' ends

  18. Determination of the overall migration from silicone baking moulds into simulants and food using ¹H-NMR techniques.

    Science.gov (United States)

    Helling, Ruediger; Mieth, Anja; Altmann, Stefan; Simat, Thomas Joachim

    2009-03-01

    Different silicone baking moulds (37 samples) were characterized with respect to potential migrating substances using ¹H-NMR, RP-HPLC-UV/ELSD and GC techniques. In all cases cyclic organosiloxane oligomers with the formula [Si(CH₃)₂-O]n were identified (n = 6 ... 50). Additionally, linear, partly hydroxyl-terminated organosiloxanes HO-[Si(CH₃)₂-O]n-H (n = 7 ... 20) were found in 13 samples. No substances other than siloxanes could be detected, meaning the migrants mainly consist of organopolysiloxanes. Based on this knowledge, a ¹H-NMR quantification method for siloxanes was established for the analysis of both simulants and foodstuffs. Validation of the ¹H-NMR method gave suitable performance characteristics: limit of detection 8.7 mg kg⁻¹ oil, coefficient of variation 7.8% (at a level of 1.0 mg kg⁻¹ food). Migration studies were carried out with simulants (olive oil, isooctane, ethanol (95%), Tenax) as well as preparation of different cakes. From the 1st to 10th experiment, siloxane migration into cakes only slightly decreased, with a significant dependence on fat content. Migration never exceeded a level of 21 mg kg⁻¹ (3 mg dm⁻²) and was, therefore, well below the overall migration limit of 60 mg kg⁻¹ (10 mg dm⁻²). However, migration behaviour into simulants differed completely from these results.
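
    The quantification step rests on standard internal-standard NMR arithmetic: signal integrals scale with proton count and concentration. A minimal sketch (the function and numbers are our illustration, not the paper's validated method):

```python
def nmr_concentration(i_analyte, n_analyte, i_standard, n_standard, c_standard):
    """Analyte concentration from integral ratios against an internal standard:
    c_a = (I_a / n_a) / (I_s / n_s) * c_s, where n is the number of protons
    contributing to each signal."""
    return (i_analyte / n_analyte) / (i_standard / n_standard) * c_standard
```

    Each dimethylsiloxane repeat unit contributes six equivalent methyl protons, which is what makes a single ¹H signal a convenient handle for total siloxane migration.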

  19. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery

    Directory of Open Access Journals (Sweden)

    Hideyuki Suenaga

    2016-01-01

    Conclusion: The implementation of computer-assisted preoperative simulation for the positioning and fixation of the plate in a 2-stage orthognathic procedure, combining the maxillary distraction technique with mandibular setback surgery, yielded good results.

  20. Objective mapping of observed sub-surface mesoscale cold core eddy in the Bay of Bengal by stochastic inverse technique with tomographically simulated travel times

    Digital Repository Service at National Institute of Oceanography (India)

    Murty, T.V.R.; Rao, M.M.M.; Sadhuram, Y.; Sridevi, B.; Maneesha, K.; SujithKumar, S.; Prasanna, P.L.; Murthy, K.S.R.

    of Bengal during the south-west monsoon season and explore the possibility of reconstructing the acoustic profile of the eddy by the Stochastic Inverse Technique. A simulation experiment on forward and inverse problems for the observed sound velocity perturbation field has...

  1. Electron Irradiation of Conjunctival Lymphoma-Monte Carlo Simulation of the Minute Dose Distribution and Technique Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo, E-mail: lorenzo.brualla@uni-due.de [NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Essen (Germany); Zaragoza, Francisco J.; Sempau, Josep [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Barcelona (Spain); Wittig, Andrea [Department of Radiation Oncology, University Hospital Giessen and Marburg, Philipps-University Marburg, Marburg (Germany); Sauerwein, Wolfgang [NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Essen (Germany)

    2012-07-15

    Purpose: External beam radiotherapy is the only conservative curative approach for Stage I non-Hodgkin lymphomas of the conjunctiva. The target volume is geometrically complex because it includes the eyeball and lid conjunctiva. Furthermore, the target volume is adjacent to radiosensitive structures, including the lens, lacrimal glands, cornea, retina, and papilla. The radiotherapy planning and optimization requires accurate calculation of the dose in these anatomical structures that are much smaller than the structures traditionally considered in radiotherapy. Neither conventional treatment planning systems nor dosimetric measurements can reliably determine the dose distribution in these small irradiated volumes. Methods and Materials: The Monte Carlo simulations of a Varian Clinac 2100 C/D and human eye were performed using the PENELOPE and PENEASYLINAC codes. Dose distributions and dose volume histograms were calculated for the bulbar conjunctiva, cornea, lens, retina, papilla, lacrimal gland, and anterior and posterior hemispheres. Results: The simulation results allow selection of the most appropriate treatment setup configuration, which is an electron beam energy of 6 MeV with additional bolus and collimation by a cerrobend block with a central cylindrical hole of 3.0 cm diameter and central cylindrical rod of 1.0 cm diameter. Conclusions: Monte Carlo simulation is a useful method to calculate the minute dose distribution in ocular tissue and to optimize the electron irradiation technique in highly critical structures. Using a voxelized eye phantom based on patient computed tomography images, the dose distribution can be estimated with a standard statistical uncertainty of less than 2.4% in 3 min using a computing cluster with 30 cores, which makes this planning technique clinically relevant.
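
    The quoted "standard statistical uncertainty" is the usual Monte Carlo standard error, which can be estimated from independent batches of histories (a generic sketch, not the PENELOPE estimator itself):

```python
import numpy as np

def mc_mean_and_uncertainty(batch_means):
    """Monte Carlo estimate and its standard statistical uncertainty
    (standard error of the mean) from independent batch means."""
    b = np.asarray(batch_means, dtype=float)
    return b.mean(), b.std(ddof=1) / np.sqrt(b.size)
```

    Because the uncertainty shrinks as 1/sqrt(N), halving the statistical error of a dose tally requires roughly four times as many simulated histories.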

  2. Development of an interpretive simulation tool for the proton radiography technique

    Energy Technology Data Exchange (ETDEWEB)

    Levy, M. C., E-mail: levymc@stanford.edu [Clarendon Laboratory, University of Oxford, Parks Road, Oxford OX1 3PU (United Kingdom); Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Ryutov, D. D.; Wilks, S. C.; Ross, J. S.; Huntington, C. M.; Fiuza, F.; Martinez, D. A.; Park, H.-S. [Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Kugland, N. L. [Lam Research Corporation, 4400 Cushing Parkway, Fremont, California 94538 (United States); Baring, M. G. [Department of Physics and Astronomy, Rice University, Houston, Texas 77005 (United States)

    2015-03-15

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool’s numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field “primitives” is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10⁸ particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ∼10 mm³. Insights derived from this application show that the tool can support understanding of HED plasmas.

  3. Comparison of rheological evaluation techniques and turbulent flow prediction of a simulated nuclear waste melter slurry

    International Nuclear Information System (INIS)

    Carleson, T.E.; Hart, R.E.; Drown, D.C.; Peterson, M.E.

    1987-03-01

    An experimental study was performed on a simulated nuclear waste slurry containing the type of waste sludge and glass-forming chemicals that will be converted to a stable glass in a high-temperature furnace. The rheological properties of the slurry must be determined in order to design the transport and mixing systems. The rheological parameters for the slurry were determined by a variety of viscometers including a rotational viscometer, a capillary tube viscometer, and a pipe flow apparatus. Experiments revealed the absence of wall slip and sufficient non-Newtonian behavior to require adjustments of the results. The slurry was characterized as a yield pseudoplastic fluid. Different rheological constants were obtained for all three viscometers. Predictions of the shear stress as a function of shear rate showed good agreement between the constants determined by the rotational viscometer and the pipe loop apparatus. Laminar and turbulent flows in the pipe loop correlated closely with a recent theoretical model. 16 refs., 16 figs., 5 tabs

  4. Development of an interpretive simulation tool for the proton radiography technique

    Science.gov (United States)

    Levy, M. C.; Ryutov, D. D.; Wilks, S. C.; Ross, J. S.; Huntington, C. M.; Fiuza, F.; Martinez, D. A.; Kugland, N. L.; Baring, M. G.; Park, H.-S.

    2015-03-01

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field "primitives" is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10⁸ particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ∼10 mm³. Insights derived from this application show that the tool can support understanding of HED plasmas.

  5. Optimal design of minimum mean-square error noise reduction algorithms using the simulated annealing technique.

    Science.gov (United States)

    Bai, Mingsian R; Hsieh, Ping-Ju; Hur, Kur-Nan

    2009-02-01

    The performance of the minimum mean-square error noise reduction (MMSE-NR) algorithm in conjunction with time-recursive averaging (TRA) for noise estimation is found to be very sensitive to the choice of two recursion parameters. To address this problem in a more systematic manner, this paper proposes an optimization method to efficiently search for the optimal parameters of the MMSE-TRA-NR algorithms. The objective function is based on a regression model, whereas the optimization process is carried out with the simulated annealing algorithm, which is well suited for problems with many local optima. Another NR algorithm proposed in the paper employs linear prediction coding as a preprocessor for extracting the correlated portion of human speech. Objective and subjective tests were undertaken to compare the optimized MMSE-TRA-NR algorithm with several conventional NR algorithms. The results of the subjective tests were processed using analysis of variance to assess statistical significance. A post hoc test, Tukey's Honestly Significant Difference, was conducted to further assess the pairwise differences between the NR algorithms.
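
    A generic simulated-annealing search over two bounded parameters looks as follows. The stand-in quadratic objective below is purely illustrative; the paper's actual objective is a regression model of NR performance:

```python
import math
import random

def simulated_annealing(objective, x0, step=0.05, T0=1.0, cooling=0.95, iters=500, seed=1):
    """Minimize `objective` over a parameter vector in [0, 1]^d, accepting
    uphill moves with probability exp(-delta/T) so the search can escape
    local optima; the temperature T is cooled geometrically."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    best, fbest = list(x), fx
    T = T0
    for _ in range(iters):
        cand = [min(1.0, max(0.0, xi + rng.uniform(-step, step))) for xi in x]
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc            # accept the move
            if fx < fbest:
                best, fbest = list(x), fx
        T *= cooling
    return best, fbest

# Stand-in objective: pretend the optimal recursion parameters are (0.98, 0.8).
obj = lambda p: (p[0] - 0.98) ** 2 + (p[1] - 0.8) ** 2
params, score = simulated_annealing(obj, [0.5, 0.5])
```

    The early high-temperature phase behaves like a random walk over the parameter box; the late low-temperature phase is nearly greedy descent.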

  6. A data-based technique for monitoring of wound rotor induction machines: A simulation study

    KAUST Repository

    Harrou, Fouzi

    2016-05-09

    Detecting faults in induction machines is crucial for their safe operation. The aim of this paper is to present a statistical fault detection methodology for the detection of faults in three-phase wound rotor induction machines (WRIM). The proposed fault detection approach is based on principal component analysis (PCA). However, conventional PCA-based detection indices, such as the T2 and Q statistics, are not well suited to detect small faults because these indices only use information from the most recent available samples. Detection of small faults is one of the most crucial and challenging tasks in the area of fault detection and diagnosis. In this paper, a new statistical system monitoring strategy is proposed for detecting changes resulting from small shifts in several variables associated with WRIM. The proposed approach combines PCA modeling with the exponentially weighted moving average (EWMA) control scheme. In the proposed approach, the EWMA control scheme is applied to the ignored principal components to detect the presence of faults. The performance of the proposed method is compared with those of the traditional PCA-based fault detection indices. The simulation results clearly show the effectiveness of the proposed method over the conventional ones, especially in the presence of faults with small magnitudes.
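
    The EWMA control scheme at the heart of the approach can be sketched on a single residual-score sequence (standing in for a retained principal-component score). The smoothing constant, control-limit width, and the synthetic shifted data below are illustrative, not the paper's settings:

```python
def ewma_chart(x, lam=0.2, L=3.0, mu0=0.0, sigma0=1.0):
    """EWMA control chart: z_t = lam*x_t + (1-lam)*z_{t-1}; sample t is
    flagged when |z_t - mu0| exceeds L times the (time-varying) standard
    deviation of the EWMA statistic. Smoothing makes the chart sensitive
    to small sustained shifts that a Shewhart-type index would miss."""
    z, alarms = mu0, []
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1.0 - lam) * z
        # standard time-varying variance of the EWMA statistic
        sigma_z = sigma0 * (lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t))) ** 0.5
        alarms.append(abs(z - mu0) > L * sigma_z)
    return alarms

# In-control scores followed by a small sustained shift (hypothetical data):
scores = [0.1, -0.2, 0.05, -0.1, 0.15] + [1.5] * 10
flags = ewma_chart(scores)
```

    The chart stays quiet on the in-control samples and latches on a few samples after the shift begins, which is the behavior exploited for small-fault detection.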

  7. Evaluation and simulation of event building techniques for a detector at the LHC

    CERN Document Server

    Spiwoks, R

    1995-01-01

    The main objectives of future experiments at the Large Hadron Collider are the search for the Higgs boson (or bosons), the verification of the Standard Model and the search beyond the Standard Model in a new energy range up to a few TeV. These experiments will have to cope with unprecedented high data rates and will need event building systems which can offer a bandwidth of 1 to 100GB/s and which can assemble events from 100 to 1000 readout memories at rates of 1 to 100kHz. This work investigates the feasibility of parallel event building systems using commercially available high speed interconnects and switches. Studies are performed by building a small-scale prototype and by modelling this prototype and realistic architectures with discrete-event simulations. The prototype is based on the HiPPI standard and uses commercially available VME-HiPPI interfaces and a HiPPI switch together with modular and scalable software. The setup operates successfully as a parallel event building system of limited size in...

  8. "Photographers Are the Devil": An Essay in the Historiography of Photographing Schools

    Science.gov (United States)

    Hardcastle, John

    2013-01-01

    Today, the use of photographs in publications and exhibitions is commonplace, but this was not always so. This article shows how photographs of certain schools that have had lasting impact on design stand in ambiguous relationships to the buildings themselves. Photographs function as part of the design process; they record details of construction…

  9. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    Science.gov (United States)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland.
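
    One of the simplest variance reduction techniques in this family is survival biasing (implicit capture): instead of killing absorbed photons, each history carries a statistical weight. The slab-transmission example below is a deliberately trivial illustration of the idea, not code from the paper; for a bare slab the weighted estimator is exact:

```python
import math
import random

def transmission_analog(mu, t, n, seed=0):
    """Analog Monte Carlo: sample an exponential free path for each photon
    and count the fraction that crosses a slab of thickness t
    (attenuation coefficient mu)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if -math.log(1.0 - rng.random()) / mu > t)
    return hits / n

def transmission_survival_biased(mu, t, n):
    """Survival biasing: carry the survival probability exp(-mu*t) as a
    weight rather than sampling absorption. For this simple slab every
    history contributes the same weight, so the variance is zero."""
    w = math.exp(-mu * t)
    return sum(w for _ in range(n)) / n

exact = math.exp(-2.0)                              # mu = 1 cm^-1, t = 2 cm
analog = transmission_analog(1.0, 2.0, 20000)
biased = transmission_survival_biased(1.0, 2.0, 20000)
```

    In a realistic code the same weighting idea is combined with forced interactions and directional biasing toward the detector, which is where the orders-of-magnitude speedups come from.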

  10. 3D dose imaging for arc therapy techniques by means of Fricke gel dosimetry and dedicated Monte Carlo simulations

    International Nuclear Information System (INIS)

    Valente, Mauro; Castellano, Gustavo; Sosa, Carlos

    2008-01-01

    Full text: Radiotherapy is one of the most effective techniques for tumour treatment and control. During the last years, significant developments were made regarding both irradiation technology and techniques. However, accurate 3D dosimetric techniques are currently not commercially available. Due to their intrinsic characteristics, traditional dosimetric techniques like ionisation chambers, film dosimetry or TLD do not offer proper continuous 3D dose mapping. The possibility of using ferrous sulphate (Fricke) dosimeters suitably fixed to a gel matrix, along with dedicated optical analysis methods based on light transmission measurements for 3D absorbed dose imaging in tissue-equivalent materials, has become of great interest in radiotherapy. Since Gore et al. showed in 1984 that the oxidation of ferrous ions to ferric ions still happens even when the ferrous sulphate solution is fixed to a gelatine matrix, important efforts have been dedicated to developing and improving real continuous 3D dosimetric systems based on the Fricke solution. The purpose of this work is to investigate the capability and suitability of Fricke gel dosimetry for arc therapy irradiations. The dosimetric system is mainly composed of Fricke gel dosimeters, suitably shaped in the form of thin layers and optically analysed by means of visible light transmission measurements, acquiring sample images just before and after irradiation by means of a commercial flatbed-like scanner. Image acquisition, conversion to matrices and further analysis are accomplished by means of dedicated software, which includes suitable algorithms for calculating optical density differences and converting them to absorbed dose. Dedicated subroutines allow 3D dose imaging reconstruction from single-layer information, by means of computed tomography-like algorithms. Also, dedicated Monte Carlo (PENELOPE) subroutines have been adapted in order to achieve accurate simulation of arc therapy irradiation techniques
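
    The per-pixel readout step described above (optical density difference converted to absorbed dose) can be sketched as follows, assuming a linear dose response; the intensities and sensitivity value are illustrative, not calibration data from the study:

```python
import math

def dose_from_transmission(I_before, I_after, sensitivity):
    """Fricke-gel optical readout (sketch): the optical-density difference
    between pre- and post-irradiation scans is assumed proportional to dose:
        dOD = log10(I_before / I_after),   D = dOD / sensitivity  [Gy]
    where `sensitivity` is the calibration slope in OD per Gy."""
    d_od = math.log10(I_before / I_after)
    return d_od / sensitivity

# Transmitted intensities at one pixel before/after irradiation; the
# sensitivity (OD/Gy) would come from a calibration curve in practice.
dose = dose_from_transmission(200.0, 150.0, sensitivity=0.012)
```

    Applying this mapping to every pixel of every gel layer yields the per-layer dose maps that the tomography-like reconstruction then assembles into 3D.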

  11. Improving Conductivity Image Quality Using Block Matrix-based Multiple Regularization (BMMR) Technique in EIT: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Tushar Kanti Bera

    2011-06-01

    Full Text Available A Block Matrix based Multiple Regularization (BMMR) technique is proposed for improving conductivity image quality in EIT. The response matrix (JTJ) has been partitioned into several sub-block matrices and the highest eigenvalue of each sub-block matrix has been chosen as the regularization parameter for the nodes contained in that sub-block. Simulated boundary data are generated for a circular domain with circular inhomogeneity and the conductivity images are reconstructed with a Model Based Iterative Image Reconstruction (MoBIIR) algorithm. Conductivity images are reconstructed with the BMMR technique and the results are compared with the Single-step Tikhonov Regularization (STR) and modified Levenberg-Marquardt Regularization (LMR) methods. It is observed that the BMMR technique reduces the projection error and solution error and improves the conductivity reconstruction in EIT. Results show that the BMMR method also improves the image contrast and inhomogeneity conductivity profile and hence the reconstructed image quality is enhanced. doi:10.5617/jeb.170 J Electr Bioimp, vol. 2, pp. 33-47, 2011
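
    The core BMMR step (one regularization parameter per diagonal sub-block of J^T J, taken as that sub-block's largest eigenvalue) can be sketched with a toy matrix; the block size and the 4x4 values below are hypothetical:

```python
def power_iteration(A, iters=200):
    """Largest eigenvalue of a small symmetric matrix via power iteration."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
        lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam

def block_regularization_params(JtJ, block_size):
    """BMMR idea (sketch): partition the response matrix J^T J into diagonal
    sub-blocks and use each sub-block's largest eigenvalue as the
    regularization parameter for the nodes that sub-block covers."""
    n = len(JtJ)
    params = []
    for start in range(0, n, block_size):
        idx = range(start, min(start + block_size, n))
        sub = [[JtJ[i][j] for j in idx] for i in idx]
        params.append(power_iteration(sub))
    return params

# Toy 4x4 symmetric response matrix split into two 2x2 blocks:
JtJ = [[4.0, 1.0, 0.0, 0.0],
       [1.0, 3.0, 0.0, 0.0],
       [0.0, 0.0, 2.0, 0.5],
       [0.0, 0.0, 0.5, 1.0]]
lams = block_regularization_params(JtJ, block_size=2)
```

    Each entry of `lams` would then regularize the update for the nodes in its block, instead of a single global Tikhonov parameter as in STR.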

  12. "Planaltina in the Hole of Aluminum": production and consumption of pinhole photographs

    Directory of Open Access Journals (Sweden)

    Juliana Soares Mendes

    2015-08-01

    Full Text Available This article analyzes a photographic exhibition consisting of 15 images created with the pinhole technique, which stimulates critical thinking about photojournalism practice and consumption. The exhibition, on the internet (www.fosfoto.com) and at the Artistic and Historic Museum of Planaltina (Brazilian Federal District), took place in May 2009. Participants were asked to interpret the photographs and rewrite temporary captions. The 1,860 proposed captions indicate the public's interest in participating, discussing and interpreting the pictures.

  13. 8 CFR 333.2 - Attachment of photographs to documents.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Attachment of photographs to documents. 333... PHOTOGRAPHS § 333.2 Attachment of photographs to documents. A signed photograph of the applicant must be... portion of the photograph in such a manner as not to obscure the features of the applicant. [56 FR 50495...

  14. Looking for an old aerial photograph

    Science.gov (United States)

    ,

    1997-01-01

    Attempts to photograph the surface of the Earth date from the 1800's, when photographers attached cameras to balloons, kites, and even pigeons. Today, aerial photographs and satellite images are commonplace. The rate of acquiring aerial photographs and satellite images has increased rapidly in recent years. Views of the Earth obtained from aircraft or satellites have become valuable tools to Government resource planners and managers, land-use experts, environmentalists, engineers, scientists, and a wide variety of other users. Many people want historical aerial photographs for business or personal reasons. They may want to locate the boundaries of an old farm or a piece of family property. Or they may want a photograph as a record of changes in their neighborhood, or as a gift. The U.S. Geological Survey (USGS) maintains the Earth Science Information Centers (ESICs) to sell aerial photographs, remotely sensed images from satellites, a wide array of digital geographic and cartographic data, as well as the Bureau's well-known maps. Declassified photographs from early spy satellites were recently added to the ESIC offerings of historical images. Using the Aerial Photography Summary Record System database, ESIC researchers can help customers find imagery in the collections of other Federal agencies and, in some cases, those of private companies that specialize in esoteric products.

  15. Stephen Marc: Photographer for Our Time

    Science.gov (United States)

    Smith, Toni M. Shorter

    2012-01-01

    It is said that "a picture is worth a thousand words" as visual images can express complex and multilayered ideas. Sometimes photographic imagery is so strong and resonant of certain success, struggles, or events that it becomes key to a community or generation. As historic records, photographs are uniquely able to present not only success and…

  16. 36 CFR 702.4 - Photographs.

    Science.gov (United States)

    2010-07-01

    ... 702.4 Parks, Forests, and Public Property LIBRARY OF CONGRESS CONDUCT ON LIBRARY PREMISES § 702.4 Photographs. (a) The policy set out herein applies to all individuals who are photographing Library of... conditions may include provision for a fee for services rendered consistent with the Library's policies and...

  17. Imposed Work of Breathing for Flow Meters with In-Line versus Flow-Through Technique during Simulated Neonatal Breathing.

    Directory of Open Access Journals (Sweden)

    Snorri Donaldsson

    Full Text Available The ability to determine airflow during nasal CPAP (NCPAP) treatment without adding dead space or resistance would be useful when investigating the physiologic effects of different NCPAP systems on breathing. The aim of this study was to investigate the effect on pressure stability of different flow measuring devices at the in-line and flow-through positions, using simulated neonatal breathing. Six different flow measuring devices were evaluated by recording pressure changes and imposed work of breathing for breaths with 16 and 32 ml tidal volumes. The tests were performed initially with the devices in the in-line position and with 5 and 10 L/min using the flow-through technique, without CPAP. The flow meters were then subsequently tested with an Infant Flow CPAP system at 3, 5 and 8 cm H2O pressure using the flow-through technique. The quality of the recorded signals was compared graphically. The resistance of the measuring devices generated pressure swings and imposed work of breathing. With bias flow, the resistance also generated CPAP pressure. Three of the devices had low resistance and generated no changes in pressure stability or CPAP pressure. The two devices intended for neonatal use had the highest measured resistance. The importance of pressure stability and increased work of breathing during non-invasive respiratory support is insufficiently studied. Clinical trials using the flow-through technique have not focused on pressure stability. Our results indicate that a flow-through technique might be a way forward in obtaining sufficiently high signal quality without the added effects of rebreathing and increased work of breathing. The results should stimulate further research and the development of equipment for dynamic flow measurements in neonates.

  18. Imposed Work of Breathing for Flow Meters with In-Line versus Flow-Through Technique during Simulated Neonatal Breathing.

    Science.gov (United States)

    Donaldsson, Snorri; Falk, Markus; Jonsson, Baldvin; Drevhammar, Thomas

    2015-01-01

    The ability to determine airflow during nasal CPAP (NCPAP) treatment without adding dead space or resistance would be useful when investigating the physiologic effects of different NCPAP systems on breathing. The aim of this study was to investigate the effect on pressure stability of different flow measuring devices at the in-line and flow-through positions, using simulated neonatal breathing. Six different flow measuring devices were evaluated by recording pressure changes and imposed work of breathing for breaths with 16 and 32 ml tidal volumes. The tests were performed initially with the devices in the in-line position and with 5 and 10 L/min using the flow-through technique, without CPAP. The flow meters were then subsequently tested with an Infant Flow CPAP system at 3, 5 and 8 cm H2O pressure using the flow-through technique. The quality of the recorded signals was compared graphically. The resistance of the measuring devices generated pressure swings and imposed work of breathing. With bias flow, the resistance also generated CPAP pressure. Three of the devices had low resistance and generated no changes in pressure stability or CPAP pressure. The two devices intended for neonatal use had the highest measured resistance. The importance of pressure stability and increased work of breathing during non-invasive respiratory support is insufficiently studied. Clinical trials using the flow-through technique have not focused on pressure stability. Our results indicate that a flow-through technique might be a way forward in obtaining sufficiently high signal quality without the added effects of rebreathing and increased work of breathing. The results should stimulate further research and the development of equipment for dynamic flow measurements in neonates.

  19. Application of a local linearization technique for the solution of a system of stiff differential equations associated with the simulation of a magnetic bearing assembly

    Science.gov (United States)

    Kibler, K. S.; Mcdaniel, G. A.

    1981-01-01

    A digital local linearization technique was used to solve a system of stiff differential equations which simulate a magnetic bearing assembly. The results prove the technique to be accurate, stable, and efficient when compared to a general purpose variable order Adams method with a stiff option.
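
    The essence of local linearization for stiff systems can be shown on a scalar problem: linearize f about the current state and advance the resulting linear ODE exactly. The stiff test problem and step size below are illustrative, not the magnetic bearing model from the report:

```python
import math

def local_linearization_step(f, dfdx, x, h):
    """One local-linearization step for x' = f(x): linearize f about x and
    advance the linear ODE exactly,
        x_{n+1} = x + (exp(J*h) - 1)/J * f(x),   J = f'(x)  (J != 0).
    Treating the linearized part exactly is what keeps the scheme stable
    on stiff problems where explicit Euler would blow up."""
    J = dfdx(x)
    if abs(J) < 1e-12:
        return x + h * f(x)            # degenerate case: plain Euler step
    return x + (math.exp(J * h) - 1.0) / J * f(x)

# Stiff linear test problem x' = -1000*x, x(0) = 1, with a step size far
# beyond explicit Euler's stability limit (h > 2/1000):
f = lambda x: -1000.0 * x
dfdx = lambda x: -1000.0
h, x_ll, x_euler = 0.01, 1.0, 1.0
for _ in range(5):
    x_ll = local_linearization_step(f, dfdx, x_ll, h)
    x_euler = x_euler + h * f(x_euler)   # explicit Euler, for comparison
```

    On this linear problem the local-linearization step reproduces the exact decay exp(-1000*t), while explicit Euler multiplies the state by -9 each step and diverges.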

  20. Numerical models: Detailing and simulation techniques aimed at comparison with experimental data, support to test result interpretation

    International Nuclear Information System (INIS)

    Lin Chiwen

    2001-01-01

    This part of the presentation discusses the modelling details required and the simulation techniques available for analyses, facilitating the comparison with the experimental data and providing support for interpretation of the test results. It is organised to cover the following topics: analysis inputs; basic modelling requirements for the reactor coolant system; methods applicable for the reactor cooling system; consideration of damping values and integration time steps; typical analytic models used for analysis of the reactor pressure vessel and internals; hydrodynamic mass and fluid damping for the internals analysis; impact elements for fuel analysis; and the PEI theorem and its applications. The intention of these topics is to identify the key parameters associated with models of analysis and analytical methods. This should provide a proper basis for useful comparison with the test results

  1. A correction scheme for thermal conductivity measurement using the comparative cut-bar technique based on 3D numerical simulation

    International Nuclear Information System (INIS)

    Xing, Changhu; Folsom, Charles; Jensen, Colby; Ban, Heng; Marshall, Douglas W

    2014-01-01

    As an important factor affecting the accuracy of thermal conductivity measurement, systematic (bias) error in the guarded comparative axial heat flow (cut-bar) method was mostly neglected in previous research. This bias is primarily due to the thermal conductivity mismatch between the sample and the meter bars (reference), which is common for a sample of unknown thermal conductivity. A correction scheme, based on finite element simulation of the measurement system, was proposed to reduce the magnitude of the overall measurement uncertainty. This scheme was experimentally validated by applying corrections to four types of sample measurements in which the specimen thermal conductivity is much smaller than, slightly smaller than, equal to and much larger than that of the meter bar. As an alternative to the optimum guarding technique proposed before, the correction scheme can be used to minimize the uncertainty contribution from the measurement system under non-optimal guarding conditions. It is especially necessary for large thermal conductivity mismatches between the sample and meter bars. (paper)
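
    The underlying comparative cut-bar estimate, with a multiplicative correction factor of the kind the simulation would supply, can be sketched as follows. The meter-bar conductivity, temperature drops, and correction value are hypothetical, not the paper's data:

```python
def cut_bar_conductivity(k_meter, dT_meter, dT_sample, L_meter, L_sample, correction=1.0):
    """Comparative cut-bar estimate: equate the 1-D axial heat flux through
    the meter bars and the sample (same cross-section),
        q = k_m * dT_m / L_m = k_s * dT_s / L_s,
    then apply a simulation-derived factor for the systematic bias:
        k_s = k_m * (dT_m / dT_s) * (L_s / L_m) * correction."""
    return k_meter * (dT_meter / dT_sample) * (L_sample / L_meter) * correction

# Hypothetical reading: meter bars (k = 19.7 W/m/K) see a 5 K drop and the
# sample a 12 K drop over equal lengths; an FE-derived correction of 0.96
# (illustrative) accounts for the conductivity-mismatch bias.
k_sample = cut_bar_conductivity(19.7, 5.0, 12.0, 0.05, 0.05, correction=0.96)
```

    With `correction=1.0` this reduces to the ideal 1-D formula; the finite element model supplies the deviation from 1 as a function of the mismatch.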

  2. Study of the radioactivity of rocks by the photographic method

    Energy Technology Data Exchange (ETDEWEB)

    Picciotto, E E

    1949-08-16

    The use of photographic plates, and especially of the new Ilford and Kodak plates, in nuclear physics is briefly described. In particular, the application of these methods to the study of the radioactivity of rocks is discussed. In a series of studies made by the authors, the photographic plates were placed in close contact with a thin, highly polished sheet of the rock sample and then developed under specified conditions. This method was used to determine the concentration of U and Th in two radioactive rock samples and the results are given. The samples were then reduced to powder form and the concentrations were again determined. Work on dissolved samples has not yet been completed. In conclusion, the relative merits of these different techniques are indicated.

  3. Development of an immobilisation technique by cementation for non-radioactive simulated liquid waste, from Mo-99 production process

    International Nuclear Information System (INIS)

    Arva, E A; Marabini, S G; Varani, J L

    2012-01-01

    The Argentine Atomic Energy Commission (CNEA) is responsible for developing a radioactive waste management programme. This programme provides for the strictly safe, environmentally sound and efficient management of radioactive waste from different sources. Since 1985, CNEA has been producing Mo-99 commercially for medical use. In this process two types of liquid waste are produced. One of them has high alkaline (NaOH 3.5 M) and aluminate content. Since Mo-99 production started, such liquid waste has been stored in specially designed containers during production and, after a decay period, in smaller containers under interim storage conditions. As this waste is still a liquid, development of an immobilisation technique is required. Immobilisation of radioactive liquid waste by cementation is a frequently used technique, and is studied in the present work using Mo-99 non-radioactive simulated liquid waste. In this second stage, a full-scale (200 litre drum) cementation test using simulated non-radioactive waste was carried out. The test included 'tuning up' the process with the BEBA 201 mixing machine (the same machine that will be used with real waste in the future), and construction of a specially designed temperature sensor for measuring the maximum temperature (at five different positions, four inside the drum and one outside) and the time elapsed after mixing of all components. Finally, standard specimens (IRAM 1622) were made for mechanical resistance tests after cement setting at 28 days. The results show temperature values not above 40 °C, with the maximum reached 12 hours after component mixing, and a compression strength of 14 MPa. Such values are compatible with a waste immobilisation process by cementation (author)

  4. Airflow and air quality simulations over the western mountainous region with a four-dimensional data assimilation technique

    Science.gov (United States)

    Yamada, Tetsuji; Kao, Chih-Yue; Bunker, Susan

    We apply a three-dimensional meteorological model with a four-dimensional data assimilation (4-DDA) technique to simulate diurnal variations of wind, temperature, water vapor, and turbulence in a region extending from the west coast to east of the Rockies and from northern Mexico to Wyoming. The wind data taken during the 1985 SCENES ( Subregional Cooperative Electric Utility, Dept. of Defense, National Park Service, and Environmental Protection Agency Study on Visibility) field experiments are successfully assimilated into the model through the 4-DDA technique by 'nudging' the modeled winds toward the observed winds. The modeled winds and turbulence fields are then used in a Lagrangian random-particle statistical model to investigate how pollutants from potential sources are transported and diffused. Finally, we calculate the ground concentrations through a kernel density estimator. Two scenarios in different weather patterns are investigated with simulation periods up to 6 days. One is associated with the evolution of a surface cold front and the other under a high-pressure stagnant condition. In the frontal case, the impact of air-mass movement on the ground concentrations of pollutants released from the Los Angeles area is well depicted by the model. Also, the pollutants produced from Los Angeles can be transported to the Grand Canyon area within 24 h. However, if we use only the data that were obtained from the regular NWS rawinsonde network, whose temporal and spatial resolutions are coarser than those of the special network, the plume goes north-northeast and never reaches the Grand Canyon area. In the stagnant case, the pollutants meander around the source area and can have significant impact on local air quality.
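
    The 'nudging' form of 4-DDA is Newtonian relaxation: a forcing term proportional to the observation-minus-model difference is added to the tendency equation. A scalar sketch (the relaxation coefficient, time step, and wind values are illustrative, not the SCENES configuration):

```python
def nudge(u_model, u_obs, G, dt):
    """Newtonian relaxation ('nudging') used in 4-D data assimilation:
    add a forcing term G*(u_obs - u) to the model tendency, relaxing the
    modeled wind toward the observed wind with timescale 1/G while the
    model dynamics continue to act."""
    return u_model + dt * G * (u_obs - u_model)

# Relax a 10 m/s modeled wind toward a 6 m/s observation over 4 hours of
# 1-minute steps, with a 1-hour relaxation timescale (G = 1/3600 s^-1):
u, u_obs, G, dt = 10.0, 6.0, 1.0 / 3600.0, 60.0
for _ in range(240):
    u = nudge(u, u_obs, G, dt)
```

    The residual model-observation difference decays geometrically by (1 - G*dt) per step, so after a few relaxation timescales the modeled wind tracks the observations without being overwritten outright.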

  5. Teaching aseptic technique for central venous access under ultrasound guidance: a randomized trial comparing didactic training alone to didactic plus simulation-based training.

    Science.gov (United States)

    Latif, Rana K; Bautista, Alexander F; Memon, Saima B; Smith, Elizabeth A; Wang, Chenxi; Wadhwa, Anupama; Carter, Mary B; Akca, Ozan

    2012-03-01

    Our goal was to determine whether simulation combined with didactic training improves sterile technique during ultrasound (US)-guided central venous catheter (CVC) insertion compared with didactic training alone among novices. We hypothesized that novices who receive combined didactic and simulation-based training would perform similarly to experienced residents in aseptic technique, knowledge, and perception of comfort during US-guided CVC insertion on a simulator. Seventy-two subjects were enrolled in a randomized, controlled trial of an educational intervention. Fifty-four novices were randomized into either the didactic group or the simulation combined with didactic group. Both groups received didactic training but the simulation combined with didactic group also received simulation-based CVC insertion training. Both groups were tested by demonstrating US-guided CVC insertion on a simulator. Aseptic technique was scored on 8 steps as "yes/no" and also using a 7-point Likert scale with 7 being "excellent technique" by a rater blinded to subject randomization. After initial testing, the didactic group was offered simulation-based training and retesting. Both groups also took a pre- and posttraining test of knowledge and rated their comfort with US and CVC insertion pre- and posttraining on a 5-point Likert scale. Subsequently, 18 experienced residents also took the test of knowledge, rated their comfort level, and were scored while performing aseptic US-guided CVC insertion using a simulator. The simulation combined with didactic group achieved a 167% (95% confidence interval [CI] 133%-167%) incremental increase in yes/no scores and 115% (CI 112%-127%) incremental increase in Likert scale ratings on aseptic technique compared with novices in the didactic group. Compared with experienced residents, simulation combined with didactic trained novices achieved an increase in aseptic scores with a 33.3% (CI 16.7%-50%) increase in yes/no ratings and a 20% (CI 13

  6. X-radiography using photographic papers

    International Nuclear Information System (INIS)

    Nitiyaporn, W.

    1986-01-01

    The objective of this research is to study the possibility of using photographic paper, available on the market, for x-ray radiography instead of x-ray film, which is more expensive and more complicated to develop. This research concerned (1) the method and the limitations of x-ray radiography using 3 types of photographic paper, namely F2, F3 and F4, distributed by the Kodak Company, with 3 different kinds of intensifying screens produced by the Phillips Company, Toshiba Company and Picker Company to increase photographic efficiency; (2) the correction factors between these 3 types of photographic paper and the intensifying screens; (3) the most suitable combination of photographic paper and intensifying screen; (4) a comparison of photographic paper and x-ray film in x-ray radiography regarding quality, cost and development. From the research, it was found that (1) the combination of the intensifying screen from the Picker Company and Kodak photographic paper No. F4, coated with silver bromide with a small admixture of silver iodide, resulted in higher sensitivity and more contrast than the other combinations; (2) photographic papers had more limitations than x-ray film in the sense that they could be used with iron test pieces no thicker than 3 cm at an x-ray energy of 220 kVp; (3) photographic papers gave almost the same degree of contrast and sensitivity as x-ray film when used with thin test specimens. For instance, the smallest wire No. 12 of DIN 62 FE could be seen on the photographic paper at 220 kVp, while it could be seen on the x-ray film at 200 kVp. The exposure of photographic paper was in the vicinity of that of x-ray film when used with thin test specimens. Photographic paper produced sharpness, density and contrast, and also details of the picture, close to those given by x-ray film. It is concluded that if the test specimens are thin, photographic papers

  7. Use of human patient simulation and the situation awareness global assessment technique in practical trauma skills assessment.

    Science.gov (United States)

    Hogan, Michael P; Pace, David E; Hapgood, Joanne; Boone, Darrell C

    2006-11-01

    Situation awareness (SA) is defined as the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future. This construct is vital to decision making in intense, dynamic environments. It has been used in aviation as it relates to pilot performance, but has not been applied to medical education. The most widely used objective tool for measuring trainee SA is the Situation Awareness Global Assessment Technique (SAGAT). The purpose of this study was to design and validate SAGAT for the assessment of practical trauma skills, and to compare SAGAT results to traditional checklist-style scoring. Using the Human Patient Simulator, we designed SAGAT for practical trauma skills assessment based on Advanced Trauma Life Support objectives. Sixteen subjects (four staff surgeons, four senior residents, four junior residents, and four medical students) participated in three scenarios each. They were assessed using SAGAT and traditional checklist assessment. A questionnaire was used to assess possible confounding factors in attaining SA and overall trainee satisfaction. SAGAT was found to show significant differences by level of training (analysis of variance), lending statistical support to construct validity. SAGAT was likewise found to display reliability (Cronbach's alpha 0.767), and significant scoring correlation with traditional checklist performance measures (Pearson's coefficient 0.806). The questionnaire revealed no confounding factors and universal satisfaction with the human patient simulator and SAGAT. SAGAT is a valid, reliable assessment tool for trauma trainees in the dynamic clinical environment created by human patient simulation. Information provided by SAGAT could provide specific feedback, direct individualized teaching, and support curriculum change. Introduction of SAGAT could improve the current assessment model for practical trauma education.
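
    The internal-consistency statistic reported above, Cronbach's alpha, is computed from item variances and the variance of the total score. A minimal sketch with hypothetical item-score columns (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item,
    one entry per subject):
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Three hypothetical item-score columns for four subjects:
scores = [[4, 5, 3, 2], [4, 4, 3, 1], [5, 5, 2, 2]]
alpha = cronbach_alpha(scores)
```

    Higher alpha means the items move together across subjects; values such as the 0.767 reported above are commonly read as acceptable reliability for a multi-item assessment.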

  8. Development of Quality Assessment Techniques for Large Eddy Simulation of Propulsion and Power Systems in Complex Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Lacaze, Guilhem [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Oefelein, Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-03-01

    Large-eddy-simulation (LES) is quickly becoming a method of choice for studying complex thermo-physics in a wide range of propulsion and power systems. It provides a means to study coupled turbulent combustion and flow processes in parameter spaces that are unattainable using direct-numerical-simulation (DNS), with a degree of fidelity that can be far more accurate than conventional engineering methods such as the Reynolds-averaged Navier-Stokes (RANS) approximation. However, development of predictive LES is complicated by the complex interdependence of different types of errors arising from numerical methods, algorithms, models, and boundary conditions. At the same time, control of accuracy has become a critical aspect in the development of predictive LES for design. The objective of this project is to create a framework of metrics aimed at quantifying the quality and accuracy of state-of-the-art LES in a manner that addresses the myriad of competing interdependencies. In a typical simulation cycle, only 20% of the computational time is actually usable; the rest is spent in case preparation, assessment, and validation because of the lack of guidelines. This work increases confidence in the accuracy of a given solution while minimizing the time required to obtain it. The approach facilitates control of the tradeoffs between cost, accuracy, and uncertainties as a function of fidelity and methods employed. The analysis is coupled with advanced uncertainty quantification techniques employed to estimate confidence in model predictions and to calibrate model parameters. This work has provided positive consequences for the accuracy of the results delivered by LES and will soon have a broad impact on research supported both by the DOE and elsewhere.

  9. Shoulder strengthening exercises adapted to specific shoulder pathologies can be selected using new simulation techniques: a pilot study.

    Science.gov (United States)

    Charbonnier, Caecilia; Lädermann, Alexandre; Kevelham, Bart; Chagué, Sylvain; Hoffmeyer, Pierre; Holzer, Nicolas

    2018-02-01

    Shoulder strength training exercises represent a major component of rehabilitation protocols designed for conservative or postsurgical management of shoulder pathologies. Numerous methods are described for exercising each shoulder muscle or muscle group. Limited information is available to assess potential deleterious effects of individual methods with respect to specific shoulder pathologies. Thus, the goal of this pilot study was to use a patient-specific 3D measurement technique coupling medical imaging and optical motion capture for evaluation of a set of shoulder strength training exercises regarding glenohumeral, labral and subacromial compression, as well as elongation of the rotator cuff muscles. One volunteer underwent magnetic resonance imaging (MRI) and motion capture of the shoulder. Motion data from the volunteer were recorded during three passive rehabilitation exercises and twenty-nine strengthening exercises targeting eleven of the most frequently trained shoulder muscles or muscle groups and using four different techniques when available. For each exercise, glenohumeral and labral compression, subacromial space height and rotator cuff muscles elongation were measured on the entire range of motion. Significant differences in glenohumeral, subacromial and labral compressions were observed between sets of exercises targeting individual shoulder muscles. Muscle lengths computed by simulation compared to MRI measurements showed differences of 0-5%. This study represents the first screening of shoulder strengthening exercises to identify potential deleterious effects on the shoulder joint. Motion capture combined with medical imaging allows for reliable assessment of glenohumeral, labral and subacromial compression, as well as muscle-tendon elongation during shoulder strength training exercises.

  10. Spacelab Life Science-1 Mission Onboard Photograph

    Science.gov (United States)

    1995-01-01

    Spacelab Life Science-1 (SLS-1) was the first Spacelab mission dedicated solely to life sciences. The main purpose of the SLS-1 mission was to study the mechanisms, magnitudes, and time courses of certain physiological changes that occur during space flight, to investigate the consequences of the body's adaptation to microgravity and readjustment to Earth's gravity, and bring the benefits back home to Earth. The mission was designed to explore the responses of the heart, lungs, blood vessels, kidneys, and hormone-secreting glands to microgravity and related body fluid shifts; examine the causes of space motion sickness; and study changes in the muscles, bones, and cells. This photograph shows astronaut Rhea Seddon conducting an inflight study of the Cardiovascular Deconditioning experiment by breathing into the cardiovascular rebreathing unit. This experiment focused on the deconditioning of the heart and lungs and changes in cardiopulmonary function that occur upon return to Earth. By using noninvasive techniques of prolonged expiration and rebreathing, investigators can determine the amount of blood pumped out of the heart (cardiac output), the ease with which blood flows through all the vessels (total peripheral resistance), oxygen used and carbon dioxide released by the body, and lung function and volume changes. SLS-1 was launched aboard the Space Shuttle Orbiter Columbia (STS-40) on June 5, 1991.

  11. Systematic study of the effects of scaling techniques in numerical simulations with application to enhanced geothermal systems

    Science.gov (United States)

    Heinze, Thomas; Jansen, Gunnar; Galvan, Boris; Miller, Stephen A.

    2016-04-01

    Numerical modeling is a well-established tool in rock mechanics studies investigating a wide range of problems. Especially for estimating the seismic risk of geothermal energy plants, a realistic rock-mechanical model is needed. To simulate a time-evolving system, two different approaches must be distinguished: implicit methods for solving linear equations are unconditionally stable, while explicit methods are limited by the time step. However, explicit methods are often preferred because of their limited memory demand, their scalability in parallel computing, and the simple implementation of complex boundary conditions. In numerical modeling of explicit elastoplastic dynamics the time step is limited by the rock density. Mass scaling techniques, which artificially increase the rock density by several orders of magnitude, can be used to overcome this limit and significantly reduce computation time. In the context of geothermal energy this is of great interest because in a coupled hydro-mechanical model the time step of the mechanical part is significantly smaller than that of the fluid flow. Mass scaling can also be combined with time scaling, which increases the rate of physical processes, assuming that the processes are rate independent. While often used, the effect of mass and time scaling and how it may influence the numerical results is rarely mentioned in publications, and choosing the right scaling technique is typically performed by trial and error. Scaling techniques are also often used in commercial software packages, hidden from the untrained user. To our knowledge, no systematic studies have addressed how mass scaling might affect the numerical results. In this work, we present results from an extensive and systematic study of the influence of mass and time scaling on the behavior of a variety of rock-mechanical models. We employ a finite difference scheme to model uniaxial and biaxial compression experiments using different mass and time scaling factors, and with physical models
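
    The time-step limit that motivates mass scaling follows from the CFL condition for explicit elastodynamics, dt <= h/c with elastic wave speed c = sqrt(E/rho): multiplying the density by a factor s^2 raises the stable step by s. A minimal sketch, using assumed granite-like parameters rather than the paper's models:

```python
import math

def stable_dt(h, E, rho, safety=0.9):
    """CFL-type stable time step for explicit 1D elastodynamics:
    dt <= safety * h / c, with elastic wave speed c = sqrt(E / rho)."""
    return safety * h / math.sqrt(E / rho)

# Assumed granite-like values (illustrative only):
# 1 cm element, E = 50 GPa, rho = 2700 kg/m^3
h, E, rho = 0.01, 50e9, 2700.0
dt_physical = stable_dt(h, E, rho)
dt_scaled = stable_dt(h, E, rho * 1.0e4)  # mass scaling: density x 10^4
print(dt_physical, dt_scaled, dt_scaled / dt_physical)  # step grows by sqrt(1e4) = 100
```
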

  12. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  13. The Photoshop CS4 Companion for Photographers

    CERN Document Server

    Story, Derrick

    2009-01-01

    "Derrick shows that Photoshop can be friendly as well as powerful. In part, he does that by focusing photographers on the essential steps of an efficient workflow. With this guide in hand, you'll quickly learn how to leverage Photoshop CS4's features to organize and improve your pictures."-- John Nack, Principal Product Manager, Adobe Photoshop & BridgeMany photographers -- even the pros -- feel overwhelmed by all the editing options Photoshop provides. The Photoshop CS4 Companion for Photographers pares it down to only the tools you'll need most often, and shows you how to use those tools as

  14. New ISO standard - personnel photographic film dosemeters

    International Nuclear Information System (INIS)

    Brabec, D.

    1980-01-01

    The ISO Standard 1757 ''Personnel Photographic Film Dosemeters'', issued in June 1980, is briefly described. UVVVR's own dosemeter developed for use in the national film dosimetry service in Czechoslovakia is evaluated in relation to this ISO Standard. (author)

  15. A Simulation Based Analysis of Motor Unit Number Index (MUNIX) Technique Using Motoneuron Pool and Surface Electromyogram Models

    Science.gov (United States)

    Li, Xiaoyan; Rymer, William Zev; Zhou, Ping

    2013-01-01

    Motor unit number index (MUNIX) measurement has recently attracted increasing attention as a tool to evaluate the progression of motoneuron diseases. In our current study, the sensitivity of the MUNIX technique to changes in motoneuron and muscle properties was explored by a simulation approach utilizing variations on published motoneuron pool and surface electromyogram (EMG) models. Our simulation results indicate that, when motoneuron pool and muscle parameters are kept unchanged and the input motor unit numbers to the model are varied, MUNIX estimates can appropriately characterize changes in motor unit numbers. Such MUNIX estimates are not sensitive to the different motor unit recruitment and rate coding strategies used in the model. Furthermore, alterations in motor unit control properties do not have a significant effect on the MUNIX estimates. Neither adjustment of the motor unit recruitment range nor reduction of the motor unit firing rates jeopardizes the MUNIX estimates. The MUNIX estimates closely correlate with the maximum M wave amplitude. However, if the amplitude of each motor unit action potential is reduced rather than simply reducing the motor unit number, then MUNIX estimates substantially underestimate the motor unit numbers in the muscle. These findings suggest that the current MUNIX definition is most suitable for motoneuron diseases that demonstrate secondary evidence of muscle fiber reinnervation. In this regard, when MUNIX is applied, it is of much importance to examine a parallel measurement of the motor unit size index (MUSIX), defined as the ratio of the maximum M wave amplitude to the MUNIX. However, there are potential limitations in the application of the MUNIX method in atrophied muscle, where it is unclear whether the atrophy is accompanied by loss of motor units or loss of muscle fiber size. PMID:22514208
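
    The MUSIX relation stated in the abstract (maximum M-wave amplitude divided by MUNIX) can be illustrated with hypothetical numbers, showing why the parallel index helps distinguish loss of motor units from loss of fiber size; all values below are illustrative assumptions, not simulation outputs from the paper:

```python
def musix(cmap_max_mv, munix):
    """Motor unit size index: maximum M-wave (CMAP) amplitude / MUNIX."""
    return cmap_max_mv / munix

# Hypothetical baseline: 10 mV maximum CMAP, MUNIX of 200
base = musix(10.0, 200)            # 0.05 mV per index unit

# Loss of motor units: CMAP and MUNIX fall together -> MUSIX unchanged
after_unit_loss = musix(5.0, 100)  # still 0.05

# Smaller motor unit potentials (fiber atrophy): CMAP falls, MUNIX does not
after_atrophy = musix(5.0, 200)    # 0.025 -> flags reduced unit size
print(base, after_unit_loss, after_atrophy)
```
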

  16. A new technique to characterize CT scanner bow-tie filter attenuation and applications in human cadaver dosimetry simulations

    Science.gov (United States)

    Li, Xinhua; Shi, Jim Q.; Zhang, Da; Singh, Sarabjeet; Padole, Atul; Otrakji, Alexi; Kalra, Mannudeep K.; Xu, X. George; Liu, Bob

    2015-01-01

    Purpose: To present a noninvasive technique for directly measuring the CT bow-tie filter attenuation with a linear array x-ray detector. Methods: A scintillator based x-ray detector of 384 pixels, 307 mm active length, and fast data acquisition (model X-Scan 0.8c4-307, Detection Technology, FI-91100 Ii, Finland) was used to simultaneously detect radiation levels across a scan field-of-view. The sampling time was as short as 0.24 ms. To measure the body bow-tie attenuation on a GE Lightspeed Pro 16 CT scanner, the x-ray tube was parked at the 12 o’clock position, and the detector was centered in the scan field at the isocenter height. Two radiation exposures were made with and without the bow-tie in the beam path. Each readout signal was corrected for the detector background offset and signal-level related nonlinear gain, and the ratio of the two exposures gave the bow-tie attenuation. The results were used in the geant4 based simulations of the point doses measured using six thimble chambers placed in a human cadaver with abdomen/pelvis CT scans at 100 or 120 kV, helical pitch at 1.375, constant or variable tube current, and distinct x-ray tube starting angles. Results: Absolute attenuation was measured with the body bow-tie scanned at 80–140 kV. For 24 doses measured in six organs of the cadaver, the median or maximum difference between the simulation results and the measurements on the CT scanner was 8.9% or 25.9%, respectively. Conclusions: The described method allows fast and accurate bow-tie filter characterization. PMID:26520720
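
    The attenuation measurement described above reduces to an offset- and gain-corrected ratio of the two exposures. A minimal sketch with hypothetical 5-pixel readouts (the detector counts and correction vectors are assumed for illustration, not taken from the paper):

```python
import numpy as np

def bowtie_attenuation(raw_with, raw_without, dark, gain):
    """Per-pixel bow-tie attenuation from two exposures on a linear
    detector array: subtract the dark offset, divide by per-pixel gain,
    then take the ratio of the with-filter to without-filter signals."""
    corrected_with = (np.asarray(raw_with, float) - dark) / gain
    corrected_without = (np.asarray(raw_without, float) - dark) / gain
    return corrected_with / corrected_without

# Hypothetical 5-pixel readouts (arbitrary counts, illustrative only)
dark = np.full(5, 10.0)
gain = np.ones(5)
no_filter = np.full(5, 1010.0)
with_filter = np.array([310.0, 610.0, 1010.0, 610.0, 310.0])  # thicker at edges
print(bowtie_attenuation(with_filter, no_filter, dark, gain))
```
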

  17. Lessons learned from accident simulation exercises and their implications for operation of the IPSN Centre Technique de Crise

    International Nuclear Information System (INIS)

    Manesse, D.; Ney, J.; Crabol, B.; Ginot, P.

    1990-01-01

    The Centre Technique de Crise (CTC) of the Institut de Protection et de Surete Nucleaire (IPSN) has an important role to play in the event of an accident at a nuclear installation of Electricite de France (EdF), concerning diagnosis of the situation and forecasting its evolution. For this purpose the CTC is organized into various groups; only the group responsible for the evaluation of the radiological consequences is considered in the present paper. Since the beginning of the eighties, numerous simulations of nuclear accidents have been organized both by the public authorities and by the nuclear operators. These exercises, of growing complexity, are distinguished according to the type of installation concerned, the scenario (with and without a simulator), the equipment involved, the participants (local and national officials), the accident phase used (at the time of the accident or post-accident), the use of actual or pre-determined meteorological conditions, etc. Different combinations are imposed as a function of the specific aims of each exercise. Numerous lessons have been drawn progressively from these very varied exercises for the operation of the CTC and, in particular, of the Radiological Consequences Group. The principal lessons concern: development of calculation and mapping tools, specific liaison with the national meteorological services, modification of the centre's facilities, composition of the team and definition of the role of each of its members, improved liaison with the Site Evaluation Group, and the provision of appropriate documentation. The need for continuous training of duty teams in the form of presentations and exercises has also been confirmed

  18. Eutectic-based wafer-level-packaging technique for piezoresistive MEMS accelerometers and bond characterization using molecular dynamics simulations

    Science.gov (United States)

    Aono, T.; Kazama, A.; Okada, R.; Iwasaki, T.; Isono, Y.

    2018-03-01

    We developed a eutectic-based wafer-level-packaging (WLP) technique for piezoresistive micro-electromechanical systems (MEMS) accelerometers on the basis of molecular dynamics analyses and shear tests of WLP accelerometers. The bonding conditions were experimentally and analytically determined to realize a high shear strength without solder material atoms diffusing to adhesion layers. Molecular dynamics (MD) simulations and energy dispersive x-ray (EDX) spectrometry done after the shear tests clarified the eutectic reaction of the solder materials used in this research. Energy relaxation calculations in MD showed that the diffusion of solder material atoms into the adhesive layer was promoted at a higher temperature. Tensile creep MD simulations also suggested that the local potential energy in a solder material model determined the fracture points of the model. These numerical results were supported by the shear tests and EDX analyses for WLP accelerometers. Consequently, a bonding load of 9.8 kN and temperature of 300 °C were found to be rational conditions because the shear strength was sufficient to endure the polishing process after the WLP process and there was little diffusion of solder material atoms to the adhesion layer. Also, eutectic-bonding-based WLP was effective for controlling the attenuation of the accelerometers by determining the thickness of electroplated solder materials that played the role of a cavity between the accelerometers and lids. If the gap distance between the two was less than 6.2 µm, the signal gains for x- and z-axis acceleration were less than 20 dB even at the resonance frequency due to air-damping.

  19. Special photographic emulsions for high LET dosimetry

    International Nuclear Information System (INIS)

    Katz, R.

    1978-12-01

    The purpose of these investigations into photographic emulsion dosimetry is to attempt to use the photographic emulsion to mimic the response of human tissues to high LET radiations. The program therefore requires that a systematic understanding of the response of mammalian cells to ionizing radiations be achieved. We have been concerned with differences in RBE and in radiation response to both high and low LET radiations, and with the interrelationship between observations made with these different radiations

  20. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  1. Alterations in aerobic energy expenditure and neuromuscular function during a simulated cross-country skiathlon with the skating technique.

    Science.gov (United States)

    Fabre, Nicolas; Mourot, Laurent; Zoppirolli, Chiara; Andersson, Erik; Willis, Sarah J; Holmberg, Hans-Christer

    2015-04-01

    Here, we tested the hypothesis that aerobic energy expenditure (AEE) is higher during a simulated 6-km (2 loops of 3-km each) "skiathlon" than during skating only on a treadmill and attempted to link any such increase to biomechanical and neuromuscular responses. Six elite male cross-country skiers performed two pre-testing time-trials (TT) to determine their best performances and to choose an appropriate submaximal speed for collection of physiological, biomechanical and neuromuscular data during two experimental sessions (exp). Each skier used, in randomized order, either the classical (CL) or skating technique (SK) for the first 3-km loop, followed by transition to the skating technique for the second 3-km loop. Respiratory parameters were recorded continuously. The EMG activity of the triceps brachii (TBr) and vastus lateralis (VLa) muscles during isometric contractions performed when the skiers were stationary (i.e., just before the first loop, during the transition, and after the second loop); their corresponding activity during dynamic contractions; and pole and plantar forces during the second loop were recorded. During the second 3-km of the TT, skating speed was significantly higher for the SK-SK than CL-SK. During this second loop, AEE was also higher (+1.5%) for CL-SKexp than SK-SKexp, in association with higher VLa EMG activity during both isometric and dynamic contractions, despite no differences in plantar or pole forces, poling times or cycle rates. Although the underlying mechanism remains unclear, during a skiathlon, the transition between the sections of classical skiing and skating alters skating performance (i.e., skiing speed), AEE and neuromuscular function. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Determinants of a simulated cross-country skiing sprint competition using V2 skating technique on roller skis.

    Science.gov (United States)

    Mikkola, Jussi; Laaksonen, Marko; Holmberg, Hans-Christer; Vesterinen, Ville; Nummela, Ari

    2010-04-01

    The present study investigated the performance-predicting factors of a simulated cross-country (XC) skiing sprint competition on roller skis, on a slow surface. Sixteen elite male XC skiers performed a simulated sprint competition (4 x 850 m heats with a 20-minute recovery) using the V2 skating technique on an indoor tartan track. Heat velocities, oxygen consumption, and peak lactate were measured during or after the heats. Maximal skiing velocity was measured by performing a 30-m speed test. Explosive and maximal force production in the upper body was determined by bench press (BP). Subjects also performed a maximal anaerobic skiing test (MAST) and a 2 x 2-km double poling (DP) test. The maximal velocity of MAST (VMAST) and the velocities at 3 (V3), 5 (V5), and 7 (V7) mmol/L lactate levels in MAST were determined. In the 2 x 2-km test, DP economy (VO2SUBDP) and maximal 2-km DP velocity (VDP2KM) were determined. The best single performance-predicting factors for sprint performance were VDP2KM (r = 0.73, p < 0.01), V7 (r = 0.70, p < 0.01), and VO2SUBDP (r = -0.70, p < 0.01). Faster skiers in the sprint simulation had a higher absolute VO2 (L/min) (p < 0.05-0.01) during sprint heats, and higher anaerobic skiing power (VMAST, p < 0.05) and better anaerobic skiing economy (V3, V5, V7, p < 0.05-0.001) than slower skiers. Faster skiers were also stronger in BP, with regard to both absolute (p < 0.01) and relative (p < 0.05) values. In addition, anaerobic characteristics seem to be of importance at the beginning of an XC skiing sprint competition, whereas aerobic characteristics become more important as the competition progresses. This study indicates that sprint skiers should emphasize sport-specific upper body training, and training skiing economy at high speeds.

  3. Evaluation of SLAR and simulated thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    Science.gov (United States)

    Hoffer, R. M.; Dean, M. E.; Knowlton, D. J.; Latty, R. S.

    1982-01-01

    Kershaw County, South Carolina was selected as the study site for analyzing simulated thematic mapper MSS data and dual-polarized X-band synthetic aperture radar (SAR) data. The impact of the improved spatial and spectral characteristics of the LANDSAT D thematic mapper data on computer-aided analysis for forest cover type mapping was examined, as was the value of synthetic aperture radar data for differentiating forest and other cover types. The utility of pattern recognition techniques for analyzing SAR data was assessed. Topics covered include: (1) collection of TMS and reference data; (2) reformatting, geometric and radiometric rectification, and spatial resolution degradation of TMS data; (3) development of training statistics and test data sets; (4) evaluation of different numbers and combinations of wavelength bands on classification performance; (5) comparison among three classification algorithms; and (6) the effectiveness of the principal component transformation in data analysis. The collection, digitization, reformatting, and geometric adjustment of SAR data are also discussed. Image interpretation results and classification results are presented.

  4. A Combined Methodology for Landslide Risk Mitigation in Basilicata Region by Using LIDAR Technique and Rockfall Simulation

    Directory of Open Access Journals (Sweden)

    G. Colangelo

    2011-01-01

    Rockfalls represent a significant geohazard along the SS18 road of the Basilicata Region, Italy. The management of these rockfall hazards and the mitigation of the risk require innovative approaches and technologies. This paper discusses a hazard assessment strategy and risk mitigation for rockfalls in a section of the SS18, along the coast of Maratea, using the LIDAR technique and spatial modelling. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results of the simulations were used to define the intervention actions and engineering strategy for the mitigation of the phenomena. Within two months, 260 linear meters of high-energy rockfall barriers for impact energies up to 3000 kJ were installed. After that, according to the road authority, the SS18 road was reopened in a safe condition. The results represent a valid cognitive support for choosing the most appropriate technical solution for slope strengthening and an example of good practice for the cooperation between innovative technologies and field emergency management.

  5. Improved Titanium Billet Inspection Sensitivity through Optimized Phased Array Design, Part I: Design Technique, Modeling and Simulation

    International Nuclear Information System (INIS)

    Lupien, Vincent; Hassan, Waled; Dumas, Philippe

    2006-01-01

    Reductions in the beam diameter and pulse duration of focused ultrasound for titanium inspections are believed to result in a signal-to-noise ratio improvement for embedded defect detection. It has been inferred from this result that detection limits could be extended to smaller defects through a larger diameter, higher frequency transducer resulting in a reduced beamwidth and pulse duration. Using Continuum Probe Designer TM (Pat. Pending), a transducer array was developed for full coverage inspection of 8 inch titanium billets. The main challenge in realizing a large aperture phased array transducer for billet inspection is ensuring that the number of elements remains within the budget allotted by the driving electronics. The optimization technique implemented by Continuum Probe Designer TM yields an array with twice the aperture but the same number of elements as existing phased arrays for the same application. The unequal area element design was successfully manufactured and validated both numerically and experimentally. Part I of this two-part series presents the design, simulation and modeling steps, while Part II presents the experimental validation and comparative study to multizone

  6. Thick-foils activation technique for neutron spectrum unfolding with the MINUIT routine-Comparison with GEANT4 simulations

    Science.gov (United States)

    Vagena, E.; Theodorou, K.; Stoulos, S.

    2018-04-01

    Neutron activation technique has been applied using a proposed set of twelve thick metal foils (Au, As, Cd, In, Ir, Er, Mn, Ni, Se, Sm, W, Zn) for off-site measurements to obtain the neutron spectrum over a wide energy range (from thermal up to a few MeV) in intense mixed neutron-gamma fields such as those around medical linacs. The unfolding procedure takes into account the activation rates measured using thirteen (n,γ) and two (n,p) reactions without imposing a guess solution spectrum. The MINUIT minimization routine unfolds a neutron spectrum that is dominated by fast neutrons (70%) peaking at 0.3 MeV, while the thermal peak corresponds to 15% of the total neutron fluence, equal to the epithermal-resonance area. The comparison of the unfolded neutron spectrum against the one simulated with the GEANT4 Monte Carlo code shows reasonable agreement within the measurement uncertainties. Therefore, the proposed set of thick activation foils could be a useful tool for determining low-flux neutron spectra in intense mixed fields.
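
    The unfolding step amounts to inverting a foil response matrix against measured activation rates. A minimal least-squares sketch follows (the paper minimizes the same chi-square objective with the MINUIT routine; the 3-group response matrix and flux fractions below are made up for illustration, not real cross sections):

```python
import numpy as np

def unfold_flux(response, rates):
    """Least-squares spectrum unfolding: solve A @ phi ≈ R for the group
    fluxes phi in the chi-square sense, clipping any negative groups
    to zero."""
    phi, *_ = np.linalg.lstsq(np.asarray(response, float),
                              np.asarray(rates, float), rcond=None)
    return np.clip(phi, 0.0, None)

# Made-up 3-group response matrix (rate per unit group flux, per foil);
# rows mimic thermal-, resonance-, and threshold-dominated reactions.
A = np.array([[5.0, 0.5, 0.1],
              [0.5, 3.0, 0.4],
              [0.1, 0.2, 2.0],
              [1.0, 1.0, 1.0]])
phi_true = np.array([0.15, 0.15, 0.70])  # thermal / epithermal / fast fractions
rates = A @ phi_true
print(unfold_flux(A, rates))  # recovers phi_true for noiseless rates
```
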

  7. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Dempsey, J. Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Antoun, Bonnie R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2014-05-01

    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.
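
    Segregating aleatory from epistemic uncertainty is commonly done with a nested double-loop propagation: sweep the epistemic interval in an outer loop and sample the aleatory variability inside. A minimal sketch under assumed toy parameters (the response model, bias interval, and strength distribution are illustrative, not the report's):

```python
import random

def double_loop(bias_lo, bias_hi, n_outer=50, n_inner=2000, seed=1):
    """Nested (double-loop) propagation: the epistemic parameter is swept
    over its interval (outer loop) while aleatory variability is sampled
    probabilistically (inner loop). Returns an epistemic interval on the
    mean response rather than a single collapsed distribution."""
    rng = random.Random(seed)
    means = []
    for k in range(n_outer):
        bias = bias_lo + (bias_hi - bias_lo) * k / (n_outer - 1)
        # Toy response model: failure pressure = bias * aleatory strength
        total = sum(bias * rng.gauss(100.0, 10.0) for _ in range(n_inner))
        means.append(total / n_inner)
    return min(means), max(means)

lo, hi = double_loop(0.9, 1.1)
print(lo, hi)  # band roughly spanning 90 to 110 for these toy inputs
```
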

  8. 44 CFR 15.12 - Photographs and other depictions.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Photographs and other... NATIONAL EMERGENCY TRAINING CENTER § 15.12 Photographs and other depictions. (a) Photographs and other depictions at Mt. Weather. We prohibit taking photographs and making notes, sketches, or diagrams of...

  9. 8 CFR 1236.5 - Fingerprints and photographs.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Fingerprints and photographs. 1236.5... ORDERED REMOVED Detention of Aliens Prior to Order of Removal § 1236.5 Fingerprints and photographs. Every... photographed. Such fingerprints and photographs shall be made available to Federal, State, and local law...

  10. 8 CFR 333.1 - Description of required photographs.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Description of required photographs. 333.1 Section 333.1 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY NATIONALITY REGULATIONS PHOTOGRAPHS § 333.1 Description of required photographs. (a) Every applicant required to furnish photographs of...

  11. PaintShop Photo Pro X3 For Photographers

    CERN Document Server

    McMahon, Ken

    2010-01-01

    If you are a digital photographer who's new to PaintShop Photo Pro or digital imaging in general, or have recently upgraded to the all-new version X3, this is the book for you! Packed with full color images to provide inspiration and easy to follow, step-by-step projects, you'll learn the ins and outs of this fantastic program in no time so you can start correcting and editing your images to create stunning works of art. Whether you want to learn or refresh yourself on the basics, such as effective cropping or simple color correction, or move on to more sophisticated techniques like creating s

  12. Photographic film dosimetry for high-energy accelerator radiation

    International Nuclear Information System (INIS)

    Komochkov, M.M.; Salatskaya, M.I.

    1981-01-01

    A technique for personnel photographic film dosimetry (PPFDN) of wide-energy-spectrum neutrons, intended for measuring the effect of accelerator radiation on personnel, is described. Procedures for data measurement and processing, as well as corrections for the hadron contribution, are presented. It is noted that the PPFDN method permits measurement of a neutron dose equivalent for personnel in the range from 0.01-0.02 up to 100 rem, provided the relativistic neutron contribution to the total dose does not exceed 5%. The upper limit of the measured dose is reduced several times for a greater contribution of relativistic neutrons to the total dose [ru

  13. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist
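
    The core technique the book teaches, driving a stochastic model with computer-generated random numbers, can be illustrated with a short sketch (not taken from the book; all names are illustrative). Inverse-transform sampling turns uniform random numbers into exponential inter-arrival times, which in turn simulate a Poisson arrival process:

```python
import math
import random

def exponential_sample(rate, u):
    """Inverse-transform sampling: if U ~ Uniform(0,1),
    then -ln(1-U)/rate is Exponential(rate)-distributed."""
    return -math.log(1.0 - u) / rate

def simulate_poisson_arrivals(rate, horizon, rng):
    """Generate arrival times of a Poisson process on (0, horizon]
    by accumulating exponential inter-arrival gaps."""
    t, arrivals = 0.0, []
    while True:
        t += exponential_sample(rate, rng.random())
        if t > horizon:
            return arrivals
        arrivals.append(t)

rng = random.Random(42)
arrivals = simulate_poisson_arrivals(rate=2.0, horizon=1000.0, rng=rng)
# Expected number of arrivals is approximately rate * horizon = 2000
print(len(arrivals))
```

    Replacing the exponential sampler with any other inverse CDF yields the same pattern for other distributions, which is the generality the text exploits.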

  14. Numerical simulation of flow induced by a pitched blade turbine. Comparison of the sliding mesh technique and an averaged source term method

    Energy Technology Data Exchange (ETDEWEB)

    Majander, E.O.J.; Manninen, M.T. [VTT Energy, Espoo (Finland)

    1996-12-31

    The flow induced by a pitched blade turbine was simulated using the sliding mesh technique. The detailed geometry of the turbine was modelled in a computational mesh rotating with the turbine and the geometry of the reactor including baffles was modelled in a stationary co-ordinate system. Effects of grid density were investigated. Turbulence was modelled by using the standard k-{epsilon} model. Results were compared to experimental observations. Velocity components were found to be in good agreement with the measured values throughout the tank. Averaged source terms were calculated from the sliding mesh simulations in order to investigate the reliability of the source term approach. The flow field in the tank was then simulated in a simple grid using these source terms. Agreement with the results of the sliding mesh simulations was good. Commercial CFD-code FLUENT was used in all simulations. (author)

  15. Numerical simulation of flow induced by a pitched blade turbine. Comparison of the sliding mesh technique and an averaged source term method

    Energy Technology Data Exchange (ETDEWEB)

    Majander, E.O.J.; Manninen, M.T. [VTT Energy, Espoo (Finland)

    1997-12-31

    The flow induced by a pitched blade turbine was simulated using the sliding mesh technique. The detailed geometry of the turbine was modelled in a computational mesh rotating with the turbine and the geometry of the reactor including baffles was modelled in a stationary co-ordinate system. Effects of grid density were investigated. Turbulence was modelled by using the standard k-{epsilon} model. Results were compared to experimental observations. Velocity components were found to be in good agreement with the measured values throughout the tank. Averaged source terms were calculated from the sliding mesh simulations in order to investigate the reliability of the source term approach. The flow field in the tank was then simulated in a simple grid using these source terms. Agreement with the results of the sliding mesh simulations was good. Commercial CFD-code FLUENT was used in all simulations. (author)

  16. Digital techniques in simulation, communication and control. Proceedings of the IMACS European meeting held at University of Patras, Patras, Greece, July 9-12, 1984

    Energy Technology Data Exchange (ETDEWEB)

    Tzafestas, S G

    1985-01-01

    The book contains 90 papers which are classified in the following five parts: Modelling and simulation; Digital signal processing and 2-D system design; Information and communication systems; Control systems; and Applications (robotics, industrial and miscellaneous applications). The volume reflects the state of the art of the field of digital techniques. (Auth.).

  17. Simultaneous determination of free calcium, magnesium, sodium and potassium ion concentrations in simulated milk ultrafiltrate and reconstituted skim milk using the Donnan Membrane Technique

    NARCIS (Netherlands)

    Gao, R.; Temminghoff, E.J.M.; Leeuwen, van H.P.; Valenberg, van H.J.F.; Eisner, M.D.; Boekel, van M.A.J.S.

    2009-01-01

    This study focused on determination of free Ca2+, Mg2+, Na+ and K+ concentrations in a series of CaCl2 solutions, simulated milk ultrafiltrate and reconstituted skim milk using a recently developed Donnan Membrane Technique (DMT). A calcium ion selective electrode was used to compare the DMT

  18. Photographic program of a BWR for ALARA

    International Nuclear Information System (INIS)

    Dodd, A.M.; Parry, J.O.

    1984-01-01

    High radiation areas have often been photographed in commercial nuclear plants to identify radiation sources and equipment so workers having to go into the areas become familiar with them prior to entering. This helps minimize the workers' time in a high radiation area and is useful as a visual aid in training. Previous problems encountered in using this type of file included indexing, storing for long term use and reproducing photos for use in the field. At WNP-2, a program has been adopted from Aerojet of Idaho where negatives from photographs are mounted on computer aperture cards. The cards are coded to identify the equipment, physical location in the plant, reference drawings and other data. The aperture cards are reproduced using a process called Diazo. The information is put in a data file that can be sorted by any field on the card. A paper copy of the photo can be made in seconds on a machine similar to a dry silver copier, then mounted for training or maintenance purposes. The cost of duplicating the aperture cards and/or the paper copies is a fraction of that for reproducing color glossies. The computer data file provides cross-referencing to correlate the equipment with the photograph. The results are low cost, easy storage and easy access to the photograph file. Using this program, several thousand photographs can easily be stored and used

  19. Dose-response effects of dietary pequi oil on fermentation characteristics and microbial population using a rumen simulation technique (Rusitec).

    Science.gov (United States)

    Duarte, Andrea Camacho; Durmic, Zoey; Vercoe, Philip E; Chaves, Alexandre V

    2017-12-01

    The effect of increasing the concentration of commercial pequi (Caryocar brasiliense) oil on fermentation characteristics and the abundance of methanogens and fibrolytic bacteria was evaluated using the rumen simulation technique (Rusitec). In vitro incubation was performed over 15 days using a basal diet consisting of ryegrass, maize silage and concentrate in equal proportions. Treatments consisted of a control diet (no pequi oil inclusion, 0 g/kg DM), pequi dose 1 (45 g/kg DM), and pequi dose 2 (91 g/kg DM). After a 7 day adaptation period, samples for fermentation parameters (total gas, methane, and VFA production) were taken on a daily basis. Quantitative real-time PCR (q-PCR) was used to evaluate the abundance of the main rumen cellulolytic bacteria, as well as the abundance of methanogens. Supplementation with pequi oil did not reduce overall methane production (P = 0.97); however, a tendency (P = 0.06) to decrease the proportion of methane in overall microbial gas was observed. Increasing addition of pequi oil was associated with a linear decrease (P < 0.01) in dry matter disappearance of maize silage. The abundance of total methanogens was unchanged by the addition of pequi oil, but numbers of those belonging to Methanomassiliicoccaceae decreased in liquid-associated microbes (LAM) samples (P < 0.01) and solid-associated microbes (SAM) samples (P = 0.09), while Methanobrevibacter spp. increased (P < 0.01) only in SAM samples. Fibrobacter succinogenes decreased (P < 0.01) in both LAM and SAM samples when substrates were supplemented with pequi oil. In conclusion, pequi oil was ineffective in mitigating methane emissions and had some adverse effects on digestibility and selected fibrolytic bacteria. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. The effect of starch, inulin, and degradable protein on ruminal fermentation and microbial growth in rumen simulation technique

    Directory of Open Access Journals (Sweden)

    Xiang H. Zhao

    2014-03-01

    A rumen simulation technique apparatus with eight 800 mL fermentation vessels was used to investigate the effects of rumen degradable protein (RDP) level and non-fibre carbohydrate (NFC) type on ruminal fermentation, microbial growth, and populations of ruminal cellulolytic bacteria. Treatments consisted of two NFC types (starch and inulin) supplemented with 0 g/d (low RDP) or 1.56 g/d (high RDP) sodium caseinate. No significant differences existed among dietary treatments in the apparent disappearance of dietary nutrients except for dietary N, which increased with increased dietary RDP (P<0.001). Compared with starch, inulin treatments reduced the molar proportion of acetate (P<0.001), the acetate:propionate ratio (P<0.001), and methane production (P=0.006), but increased the butyrate proportion (P<0.001). Increased dietary RDP led to increases in production of total volatile fatty acid (P=0.014) and methane (P=0.050), various measures of N (P≤0.046), and 16S rDNA copy numbers of Ruminococcus flavefaciens (P≤0.010). Non-fibre carbohydrate source did not affect daily microbial N flow regardless of dietary RDP, but ammonia N production was lower for inulin than for starch treatments under high RDP conditions (P<0.001). Compared with starch treatments, inulin depressed the copy numbers of Fibrobacter succinogenes in the solid fraction (P=0.023) and R. flavefaciens in the liquid (P=0.017) and solid fractions (P=0.007), but it increased the carboxymethylcellulase activity in the solid fraction (P=0.045). Current results suggest that starch and inulin differ in ruminal volatile fatty acid fermentation but have similar effects on ruminal digestion and microbial synthesis in vitro, although inulin suppressed the growth of partial ruminal cellulolytic bacteria.

  1. Effects of monolaurin on ruminal methanogens and selected bacterial species from cattle, as determined with the rumen simulation technique.

    Science.gov (United States)

    Klevenhusen, Fenja; Meile, Leo; Kreuzer, Michael; Soliva, Carla R

    2011-10-01

    Before being able to implement effective ruminal methane mitigation strategies via feed supplementation, the assessment of side effects on ruminal fermentation and rumen microbial populations is indispensable. In this respect we investigated the effects of monolaurin, a methane-mitigating lipid, on methanogens and important carbohydrate-degrading bacteria present in ruminal fluid of dairy cattle in continuous culture employing the rumen simulation technique. In six experimental runs, each lasting 10 days, four diets with different carbohydrate composition, based on hay, maize, wheat and a maize-wheat mixture, either remained non-supplemented or were supplemented with monolaurin and incubated in a ruminal-fluid buffer mixture. Incubation liquid samples from days 6 to 10 of incubation were analyzed with relative quantitative polymerase chain reaction (qPCR) of 16S rRNA genes to assess monolaurin-induced shifts in specific rumen microbial populations relative to the corresponding non-supplemented diets. Monolaurin completely inhibited Fibrobacter succinogenes in all diets, while the response of the other cellulolytic bacteria varied depending on the diet. Megasphaera elsdenii remained unaffected by monolaurin in the two diets containing maize, but was slightly stimulated by monolaurin with the wheat diet and strongly with the hay diet. The supply of monolaurin suppressed Methanomicrobiales below the detection limit with all diets, whereas relative 16S rRNA gene copy numbers of Methanobacteriales increased 7-fold with monolaurin in the case of the hay diet. Total Archaea were decreased by up to over 90%, but this was significant only for the wheat-containing diets. Thus, monolaurin exerted variable effects, mediated by unknown mechanisms, on important ruminal microbes involved in carbohydrate degradation, along with its suppression of methane formation. The applicability of monolaurin for methane mitigation in ruminants thus depends on the extent to which adverse

  2. Reduction of radiation-induced xerostomia in nasopharyngeal carcinoma using CT simulation with laser patient marking and three-field irradiation technique

    International Nuclear Information System (INIS)

    Nishioka, Takeshi; Shirato, Hiroki; Arimoto, Takuro; Kaneko, Masanori; Kitahara, Toshihiro; Oomori, Keiichi; Yasuda, Motoyuki; Fukuda, Satoshi; Inuyama, Yukio; Miyasaka, Kazuo

    1997-01-01

    Purpose: Tumor control and reduction of postirradiation xerostomia in patients with nasopharyngeal carcinoma (NPC) using the three-field irradiation technique based on CT simulation with laser patient marking was investigated. Methods and Materials: Seventy-eight patients with NPC were consecutively treated between 1983 and 1993. In 33 patients treated before 1987, target volume was determined using a conventional x-ray simulator with reference to CT images, and the primary site was treated by the conventional parallel-opposed two-field technique (Group I). In 45 patients treated from 1987, target volume was determined using a CT simulator slice by slice, the treatment field was projected onto the patient's skin by a laser beam projector mounted on a C-arm, and the primary site was irradiated by a three-field (anterior and bilateral) technique (Group II). In Group II, the shape of each field was determined using a beam's eye view to reduce the dose to the bilateral parotid glands. The three-field technique reduced the dose to the superficial lobe of the parotid gland to about two-thirds of the dose given by the two-field technique. Radiation-induced xerostomia was evaluated by clinical symptoms and radioisotope sialography. Results: The 5-year survival rate and disease-free survival rate were 46.6 and 31.2% in Group I, and 46.8 and 46.5% in Group II. A large variation in the volume of the parotid glands was demonstrated, ranging from 9 cm³ to 61 cm³ among patients treated with CT simulation. Forty percent of the patients in Group II showed no or mild xerostomia, whereas all of the patients in Group I showed moderate to severe xerostomia (p < 0.01). The radioisotope sialography study showed that the mean secretion ratio by acid stimulation was improved from 3.8% in Group I to 15.2% in Group II (p < 0.01). 
Conclusions: CT simulation was useful to determine the size and shape of each field to reduce the dose to the parotid gland, of which size varies

  3. Performance Accuracy of Hand-on-needle versus Hand-on-syringe Technique for Ultrasound-guided Regional Anesthesia Simulation for Emergency Medicine Residents

    Directory of Open Access Journals (Sweden)

    Brian Johnson

    2014-09-01

    Introduction: Ultrasound-guided nerve blocks (UGNBs) are increasingly used in emergency care. The hand-on-syringe (HS) needle technique is ideally suited to the emergency department setting because it allows a single operator to perform the block without assistance. The HS technique is assumed to provide less exact needle control than the alternative two-operator hand-on-needle (HN) technique; however, this assumption has never been directly tested. The primary objective of this study was to compare accuracy of needle targeting under ultrasound guidance by emergency medicine (EM) residents using HN and HS techniques on a standardized gelatinous simulation model. Methods: This prospective, randomized study evaluated task performance. We compared needle targeting accuracy using the HN and HS techniques. Each participant performed a set of structured needling maneuvers (both simple and difficult) on a standardized partial-task simulator. We evaluated time to task completion, needle visualization during advancement, and accuracy of needle tip at targeting. Resident technique preference was assessed using a post-task survey. Results: We evaluated 60 tasks performed by 10 EM residents. There was no significant difference in time to complete the simple model (HN vs. HS, 18 seconds vs. 18 seconds, p=0.93), time to complete the difficult model (HN vs. HS, 56 seconds vs. 50 seconds, p=0.63), needle visualization, or needle tip targeting accuracy. Most residents (60%) preferred the HS technique. Conclusion: For EM residents learning UGNBs, the HN technique was not associated with superior needle control. Our results suggest that the single-operator HS technique provides equivalent needle control when compared to the two-operator HN technique. [West J Emerg Med. 2014;15(6):641-646]

  4. Nobels: Nobel laureates photographed by Peter Badge

    CERN Document Server

    2008-01-01

    A unique photographic record of all living Nobel laureates. In this handsome coffee-table book, photographer Peter Badge captures the likeness of every living Nobel laureate in a lasting black-and-white image -- more than 300 striking portraits in all. Brief biographical sketches accompanying the large-scale photographs pay homage to each laureate's singular contribution to science, literature or world peace. Bringing readers face-to-face with Nelson Mandela, Jimmy Carter, the Dalai Lama, James Watson, Gabriel García Márquez, Toni Morrison, Rita Levi-Montalcini, Linda Buck, and Paul Samuelson among many others, NOBELS offers an intimate and compelling look at well-known honorees as well as lesser-known recipients. A fascinating word/image tableau.

  5. A Relational Ecology of Photographic Practices

    Directory of Open Access Journals (Sweden)

    Jacqui Knight

    2017-11-01

    This paper proposes a relational history of media artifacts, which decentralizes the dominance of the photographer or filmmaker as the absolute author of the work. It adds an alternative account for understanding the creative process and the subsequent study of media forms by discussing film and photographic practices as the reciprocal affective relationship between the maker, their intentions, materials, technologies, non-human agents and the environment. By reorganizing the anthropocentrism of art-historical narratives, which typically exclude corporeality and materiality as drivers of human history, we are able to discuss the complex dynamic meshwork of determinants that bring photographic artifacts into existence: the lived, animate, vital materialism at once emergent and mixing different causalities and temporalities.

  6. Investigation of gas discharge by schlieren method and interferometry with automated processing of schlieren photographs and interferograms

    International Nuclear Information System (INIS)

    Gerasimova, V.I.; Dushin, L.A.; Privezentsev, V.S.; Taran, V.S.

    1974-01-01

    The principles are clarified of two optical plasma diagnostics techniques, viz., the interferometric method permitting the determination of electron density and the schlieren method determining the gradient of electron density. Both techniques in combination were used in investigating the plasma in a hydrogen hollow-cathode spark discharge. In the schlieren technique, a pulsed xenon laser, in the interference technique a helium-neon laser were used as the light sources. Schlieren photographs were processed automatically using an electronic computer. A detailed description is presented of the equipment for the automatic photograph evaluation. (A.K.)

  7. Development of Curriculum of Learning through Photograph

    Science.gov (United States)

    Suzuki, Keiko; Aoki, Naokazu; Kobayashi, Hiroyuki

    A curriculum for integrated learning using the power of photography in junior high school was constructed and tried out in the class "Seminar for Photographic Expression" of the integrated learning programme at a junior high school. The core of the curriculum is viewing photographs and self-expression through photography. Comparison of questionnaire results from before and after the class suggests that the curriculum brings about increases in self-esteem, empathy, and motivation for learning. This educational effect in essence fosters the ability to live a self-sufficient life. On the basis of these results, curricula that can be conducted by anyone at any junior high school are proposed.

  8. Organisation of a laboratory of photographic dosimetry

    International Nuclear Information System (INIS)

    Soudain, Georges

    1961-01-01

    After a recall of the main properties of photographic dosimetry, the author describes the principle of this method and comments on the issue of the chromatic sensitivity of photographic emulsions. He discusses the calibration process for gamma radiation, X rays, and thermal neutrons, and describes how fast-neutron dosimetry is performed. In the next part, he describes the organisation of the photographic dosimetry laboratory, which has to prepare and distribute dosimeters, to collect and exploit them, and to prepare a publication of results. These different missions and tasks are described.

  9. Salvaging and Conserving Water Damaged Photographic Materials

    Science.gov (United States)

    Suzuki, Ryuji

    Degradation of water-damaged photographic materials is discussed; the most vulnerable elements are the gelatin layers and the silver image. A simple and inexpensive chemical treatment is proposed, consisting of a bath containing a gelatin-protecting biocide and a silver-image-protecting agent. These ingredients were selected from among those used in the manufacture of silver halide photographic emulsions or processing chemicals. Experiments confirmed that this treatment significantly reduced oxidative attack on the silver image and bacterial degradation of the gelatin layers. The treated material was also stable under an intense light-fading test. A method of hardening gelatin to suppress swelling is also discussed.

  10. PREFACE: Workshop Photograph and Program

    Science.gov (United States)

    2011-07-01

    Workshop photograph
    Workshop Program
    Sunday 28 March 2010
    19:00-21:00 Reception at Okura Frontier Hotel Tsukuba (Buffet style dinner with drink)
    Monday 29 March 2010
    Introduction (Chair: André Rubbia (ETH Zurich))
    09:00 Welcome address (05') Atsuto Suzuki (KEK)
    09:05 Message from CERN on neutrino physics (10') Sergio Bertolucci (CERN)
    09:15 Message from FNAL on neutrino physics (10') Young Kee Kim (FNAL)
    09:25 Message from KEK on neutrino physics (10') Koichiro Nishikawa (KEK)
    09:35 Introductory remark on GLA2010 (10') Takuya Hasegawa (KEK)
    Special session (Chair: Koichiro Nishikawa (KEK))
    09:45 The ICARUS Liquid Argon TPC (45') Carlo Rubbia (CERN)
    10:30-11:00 Coffee break
    Main goals of Giant Liquid Argon Charge Imaging Experiments I (Chair: Takashi Kobayashi (KEK))
    11:00 Results from massive underground detectors (non accelerator) (30') Takaaki Kajita (ICRR, U. of Tokyo)
    11:30 Present long baseline neutrino experiments (30') Chang Kee Jung (SUNY Stony Brook)
    12:00-12:10 Workshop picture
    12:10-14:00 Lunch break
    Main goals of Giant Liquid Argon Charge Imaging Experiments II (Chair: Takashi Kobayashi (KEK))
    14:00 Physics goals of the next generation massive underground experiments (30') David Wark (Imperial College London)
    14:30 Near detectors for long baseline neutrino experiments (20') Tsuyoshi Nakaya (Kyoto U.)
    Lessons on Liquid Argon Charge Imaging technology from ongoing developments (Chair: Chang Kee Jung (SUNY Stony Brook))
    14:50 WARP (30') Claudio Montanari (U. of Pavia)
    15:20 ArDM (30') Alberto Marchionni (ETH Zurich)
    15:50 From ArgoNeuT to MicroBooNE (30') Bonnie Fleming (Yale U.)
    16:20 250L (30') Takasumi Maruyama (KEK)
    16:50 The DEAP/CLEAN project (20') Mark Boulay (Queen's U.)
    17:10-17:40 Coffee break
    Lessons from Xe based Liquids Imaging detectors (Chair: Flavio Cavanna (U. of L'Aquilla))
    17:30 MEG (20') Satoshi Mihara (KEK)
    17:50 The XENON project (20') Elena Aprile (Columbia U.)
    18:10 XMASS (20') Hiroyuki Sekiya (ICRR, U. of Tokyo)
    Studies on physics performance (Chair

  11. Uniform sources of ionizing radiation of extended area from radiotoned photographic film

    International Nuclear Information System (INIS)

    Thackray, M.

    1978-01-01

    The technique of toning photographic films, that have been uniformly exposed and developed, with radionuclides to provide uniform sources of ionizing radiation of extended area and their uses in radiography are discussed. The suitability of various radionuclides for uniform-plane sources is considered. (U.K.)

  12. The use of neutron activation to detect a photographic image under a painting

    International Nuclear Information System (INIS)

    Wall, T.; Bird, R.

    1980-01-01

    Neutron activation followed by autoradiography has been used in a number of studies of oil paintings to reveal brush technique, overpainting, pigment types and other information. The facilities of the Australian Atomic Energy Commission Research Establishment have been used to investigate whether there is a photographic image underlying a painting by the Swedish born artist Carl Magnus Oscar Fristrom (1856-1919)

  13. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery.

    Science.gov (United States)

    Suenaga, Hideyuki; Taniguchi, Asako; Yonenaga, Kazumichi; Hoshi, Kazuto; Takato, Tsuyoshi

    2016-01-01

    Computer-assisted preoperative simulation surgery is employed to plan and interact with 3D images during the orthognathic procedure. It is useful for positioning and fixation of the maxilla by a plate. We report a case of maxillary retrusion due to a bilateral cleft lip and palate, in which a 2-stage orthognathic procedure (maxillary advancement by distraction technique and mandibular setback surgery) was performed following computer-assisted preoperative simulation planning to achieve the positioning and fixation of the plate. A high accuracy was achieved in the present case. A 21-year-old male patient presented to our department with a complaint of maxillary retrusion following bilateral cleft lip and palate. Computer-assisted preoperative simulation with a 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery was planned. The preoperative planning of the procedure resulted in good aesthetic outcomes. The error of the maxillary position was less than 1 mm. The implementation of the computer-assisted preoperative simulation for the positioning and fixation of the plate in the 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery yielded good results. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  14. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becchetti, M; Tian, X; Segars, P; Samei, E [Clinical Imaging Physics Group, Department of Radiology, Duke University Me, Durham, NC (United States)

    2015-06-15

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed up of the process was attained by using a low number of photon histories with kernel denoising of the projections from the scattered photons. These FBP reconstructed images were validated against those that were acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images attained using projection kernel smoothing were attained with 3 orders of magnitude less computation time compared to a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time than is possible with standard simulation approaches.
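
    The kernel-denoising idea described in the abstract, smoothing noisy scatter projections so far fewer photon histories are needed, can be illustrated on a 1-D projection with a simple box kernel. This is an illustrative sketch, not the authors' GPU code; all names and values are assumptions:

```python
import random

def smooth_projection(values, radius=2):
    """Apply a box kernel to a 1-D projection: each sample is
    replaced by the mean of its neighbourhood, shrinking the
    photon-counting noise at the cost of some blurring."""
    n = len(values)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out

rng = random.Random(0)
clean = [1.0] * 50                                  # idealized flat projection
noisy = [v + rng.gauss(0.0, 0.3) for v in clean]    # few-history MC noise
smoothed = smooth_projection(noisy, radius=3)

# Squared error against the true value drops after smoothing
err_noisy = sum((v - 1.0) ** 2 for v in noisy)
err_smooth = sum((v - 1.0) ** 2 for v in smoothed)
print(err_smooth < err_noisy)
```

    Averaging a window of k independent noisy samples cuts the noise variance roughly by a factor of k, which is why far fewer photon histories suffice once the projections are denoised.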

  15. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    International Nuclear Information System (INIS)

    Becchetti, M; Tian, X; Segars, P; Samei, E

    2015-01-01

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed up of the process was attained by using a low number of photon histories with kernel denoising of the projections from the scattered photons. These FBP reconstructed images were validated against those that were acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images attained using projection kernel smoothing were attained with 3 orders of magnitude less computation time compared to a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time than is possible with standard simulation approaches

  16. On the reverse. Some notes on photographic images from the Warburg Institute Photographic Collection

    Directory of Open Access Journals (Sweden)

    Katia Mazzucco

    2012-10-01

    Full Text Available How can the visual and textual data about an image – the image of a work of art – on recto and verso of a picture be interpreted? An analogical-art-documentary photograph represents a palimpsest to be considered layer by layer. The examples discussed in this article, which refer to both Aby Warburg himself and the first nucleus of the Warburg Institute Photographic Collection, contribute to effectively outline elements of the debate around the question of the photographic reproduction of the work of art as well as of the position of photography in relation to the perception of the work of art.

  17. Photographing magnetic fields in superconductors

    International Nuclear Information System (INIS)

    Harrison, R.B.; Wright, L.S.

    Magneto-optic techniques coupled with high-speed photography are being used to study the destruction of superconductivity by a magnetic field. The phenomenon of superconductivity will be introduced with emphasis placed on the properties of type I and type II superconductors in a magnetic field. The Faraday effect and its application to the study of the penetration of magnetic fields into these superconductors will be described; the relative effectiveness of some types of paramagnetic glass will be demonstrated. A number of cinefilms will be shown to illustrate the versatility of the magneto-optic method for observing flux motion and patterns. The analysis of data obtained from a high speed film (10,200 fps) of a flux jump in Nb-Zr will be presented and discussed

  18. Cryptography Would Reveal Alterations In Photographs

    Science.gov (United States)

    Friedman, Gary L.

    1995-01-01

    Public-key decryption method proposed to guarantee authenticity of photographic images represented in form of digital files. In method, digital camera generates original data from image in standard public format; also produces coded signature to verify standard-format image data. Scheme also helps protect against other forms of lying, such as attaching false captions.

  19. The Social Effects of War Photographs

    Directory of Open Access Journals (Sweden)

    Gökhan DEMİREL

    2017-07-01

    Full Text Available In today’s world, knowledge is increasingly conveyed through visual representation. Messages sent through various sources, such as newspapers, television and the internet, lead people to form opinions about various topics. In this context, photography is one of the most powerful sources of information. Moreover, its visual power and capacity for nonverbal communication make it a perfect tool for propaganda. These days, photographs showing war themes are used more often than in the past. War photographs serve as a tool for showing the realities of war to the world, even to those who turn their backs on massacres; after all, the image of a dead body has a shocking effect on the viewer. This study examines war photographs through the example of the photograph of Aylan Kurdi, which became the “icon” of the migration caused by the Syrian civil war. From a critical point of view, it considers how the photograph was assessed and understood in light of the environment and conditions at the time it was taken, the prevailing values and beliefs, and world events of that period.

  20. Overview of workshop on 'Evaluation of simulation techniques for radiation damage in the bulk of fusion first wall materials'

    International Nuclear Information System (INIS)

    Leffers, T.; Singh, B.N.; Green, W.V.; Victoria, M.

    1984-05-01

    The main points and the main conclusions of a workshop held June 27-30 1983 at Interlaken, Switzerland, are reported. There was general agreement among the participants that ideal simulation, providing unambiguous information about the behaviour of the first wall material, is at present out of reach. In this situation the route to follow is to use the existing simulation facilities in a concerted effort to understand the damage accumulation processes and thereby create the background for prediction or appropriate simulation of the behaviour of the first wall material. (Auth.)

  1. Overview of Workshop on Evaluation of Simulation Techniques for Radiation Damage in the Bulk of Fusion First Wall Materials

    DEFF Research Database (Denmark)

    Leffers, Torben; Singh, Bachu Narain; Green, W.V.

    1984-01-01

    The main points and the main conclusions of a workshop held June 27–30 1983 at Interlaken, Switzerland, are reported. There was general agreement among the participants that ideal simulation, providing unambiguous information about the behaviour of the first wall material, is at present out of reach. In this situation the route to follow is to use the existing simulation facilities in a concerted effort to understand the damage accumulation processes and thereby create the background for prediction or appropriate simulation of the behaviour of the first wall material.

  2. Monte Carlo Simulation of the Time-Of-Flight Technique for the Measurement of Neutron Cross-section in the Pohang Neutron Facility

    Energy Technology Data Exchange (ETDEWEB)

    An, So Hyun; Lee, Young Ouk; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Young Seok [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2007-10-15

    It is essential that neutron cross sections are measured precisely for many areas of research and technology. In Korea, these experiments have been performed at the Pohang Neutron Facility (PNF), a pulsed neutron facility based on a 100 MeV electron linear accelerator. At the PNF, neutron energy spectra have been measured for different water levels inside the moderator and compared with the results of MCNPX calculations. The optimum size of the water moderator was determined on the basis of these results. In this study, Monte Carlo simulations of the TOF technique were performed and neutron energy spectra were calculated to predict the measurements.
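
    The TOF technique infers neutron energy from the flight time over a known path. A minimal non-relativistic sketch (the flight path and timing values below are illustrative, not those of the PNF):

```python
import math

M_N = 1.674927e-27   # neutron rest mass, kg
EV = 1.602177e-19    # joules per electronvolt

def neutron_energy_ev(flight_path_m, time_of_flight_s):
    """Non-relativistic neutron kinetic energy E = (1/2) m (L/t)^2, in eV."""
    v = flight_path_m / time_of_flight_s
    return 0.5 * M_N * v ** 2 / EV

# A 1 eV neutron covers a 10 m flight path in roughly 723 microseconds.
print(round(neutron_energy_ev(10.0, 723e-6), 2))  # → 1.0
```

    The quadratic dependence on 1/t is why energy resolution degrades rapidly at high energies, where flight times become very short.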

  3. An interactive program for digitization of seabed photographs

    Digital Repository Service at National Institute of Oceanography (India)

    Ramprasad, T.; Sharma, R.

    A program for digitization of seabed photographs to compute coverage and abundance of polymetallic nodules is developed. Since the objects in a seabed photograph are partially covered by a thin sediment layer, the automatic scanning devices may...

  4. Time perception of action photographs is more precise than that of still photographs.

    Science.gov (United States)

    Moscatelli, Alessandro; Polito, Laura; Lacquaniti, Francesco

    2011-04-01

    A photograph of an action contains implicit information about the depicted motion. Previous studies using either psychophysics or neuroimaging suggested that the neural processing of implied-motion images shares some features of real-motion processing. According to the hypothesis that the target depicted in photographs with implied motion is mentally represented as continuing in motion, such photographs should be processed by the brain similarly to the individual frames of a running movie. In order to decode the functional significance of a movie, we must be able to estimate the duration of each frame and the time interval between successive frames as precisely as possible. Therefore, under naturalistic conditions, one would expect the precision of time-duration estimates to be higher for action pictures than for still pictures. To test this prediction, we asked human observers to compare the variable duration of test photographs with the reference duration of their scrambled version. We found that, as expected, the duration of photographs with implied motion was discriminated better than the duration of photographs without implied motion. We also found that the average reaction time for the discrimination of photographs with implied motion was longer than that for photographs without implied motion, suggesting that the processing of implied motion involves longer and/or slower neural routes to compute time duration. This longer processing may depend on the engagement of two visual systems in parallel, one for processing form and the other for processing implied motion. The perceptual decision about time duration would occur after the convergence of signals from these two pathways.

  5. Lunar orbiter photographic atlas of the near side of the Moon

    CERN Document Server

    Byrne, Charles

    2005-01-01

    In 1967, Lunar Orbiter Mission 4 sent back to Earth a superb series of photographs of the surface of the Moon. Using 21st-century computer techniques, Charles Byrne - previously System Engineer of the Apollo Program for Lunar Orbiter Photography - has removed the scanning artifacts and transmission imperfections to produce a most comprehensive and beautifully detailed set of images of the lunar surface. To help practical astronomers, all the photographs are systematically related to an Earth-based view. The book has been organized to make it easy for astronomers to use, enabling ground-based images and views to be compared with the Orbiter photographs. Every astronomer - amateur and professional - who is interested in the Moon will want this book in his library!

  6. Black-and-white photographic chemistry: A reference

    Science.gov (United States)

    Walker, E. D. (Compiler)

    1986-01-01

    This work is intended as a reference of black-and-white photographic chemistry. Included is a basic history of the photographic processes and a complete description of all chemicals used, formulas for the development and fixation process, and associated formulas such as cleaners, hardeners, and toners. The work contains a complete glossary of photographic terms, a trouble-shooting section listing causes and effects regarding photographic film and papers, and various conversion charts.

  7. The application of neural networks integrated with genetic algorithm and simulated annealing for the simulation of rare earths separation processes by the solvent extraction technique using EHEHPA agent

    International Nuclear Information System (INIS)

    Tran Ngoc Ha; Pham Thi Hong Ha

    2003-01-01

    In the present work, a neural network has been used for mathematically modeling the equilibrium data of a mixture of two rare earth elements, namely Nd and Pr, with PC88A agent. A thermo-genetic algorithm, based on the ideas of the genetic algorithm and the simulated annealing algorithm, has been used in the training procedure of the neural networks, giving better results than the traditional modeling approach. The neural network modeling the experimental data is then used in a computer program to simulate the solvent extraction process of the two elements Nd and Pr. Based on this computer program, various optional schemes for the separation of Nd and Pr have been investigated and proposed. (author)
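
    As a rough illustration of the simulated-annealing ingredient of the thermo-genetic training above (the objective function here is a toy stand-in, not the authors' network loss):

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=2000, seed=1):
    """Minimize `objective` by simulated annealing.

    Worse moves are accepted with probability exp(-dE/T), and the
    temperature T decays geometrically, so the search explores early
    and settles into a minimum as T -> 0.
    """
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy stand-in for a network-training loss: minimum value 1.0 at x = 2.
best, fbest = simulated_annealing(lambda x: (x - 2.0) ** 2 + 1.0, x0=-5.0)
print(abs(best - 2.0) < 0.2)
```

    In the hybrid scheme the abstract describes, such an annealing acceptance rule would perturb the network weights proposed by the genetic operators rather than a single scalar.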

  8. Phase recording for formation of holographic optical elements on silver-halide photographic emulsions

    Science.gov (United States)

    Ganzherli, Nina M.; Gulyaev, Sergey N.; Maurer, Irina A.; Chernykh, Dmitrii F.

    2009-05-01

    Holographic fabrication methods of regular and nonregular relief-phase structures on silver-halide photographic emulsions are considered. Methods of gelatin photodestruction under short-wave ultra-violet radiation and chemical hardening with the help of dichromated solutions were used as a technique for surface relief formation. The developed techniques permitted us to study specimens of holographic diffusers and microlens rasters with small absorption and high light efficiency.

  9. A Qualitative Analysis of the Neutron Population in Fresh and Spent Fuel Assemblies during Simulated Interrogation using the Differential Die-Away Technique

    International Nuclear Information System (INIS)

    Lundkvist, Niklas; Goodsell, Alison V.; Grape, Sophie; Hendricks, John S.; Henzl, Vladimir; Swinhoe, Martyn T.; Tobin, Stephen J.

    2015-01-01

    Monte Carlo simulations were performed for the differential die-away (DDA) technique to analyse the time-dependent behaviour of the neutron population in fresh and spent nuclear fuel assemblies as part of the Next Generation Safeguards Initiative Spent Fuel (NGSI-SF) Project. Simulations were performed to investigate both a possibly portable and a permanent DDA instrument. Taking advantage of a custom-made modification to the MCNPX code, the variation in the neutron population, simultaneously in time and space, was examined. The motivation for this research was to improve the design of the DDA instrument, as it is being considered for possible deployment at the Central Storage of Spent Nuclear Fuel and Encapsulation Plant in Sweden (Clab), as well as to assist in the interpretation of both simulated and measured signals.

  10. Analyzing Forest Inventory Data from Geo-Located Photographs

    Science.gov (United States)

    Toivanen, Timo; Tergujeff, Renne; Andersson, Kaj; Molinier, Matthieu; Häme, Tuomas

    2015-04-01

    Forests are widely monitored using a variety of remote sensing data and techniques. Remote sensing offers benefits compared to traditional in-situ forest inventories made by experts. One of the main benefits is that the number of ground reference plots can be significantly reduced. Remote sensing of forests can provide reduced costs and time requirement compared to full forest inventories. The availability of ground reference data has been a bottleneck in remote sensing analysis over wide forested areas, as the acquisition of this data is an expensive and slow process. In this paper we present a tool for estimating forest inventory data from geo-located photographs. The tool can be used to estimate in-situ forest inventory data including estimated biomass, tree species, tree height and diameter. The collected in-situ forest measurements can be utilized as a ground reference material for spaceborne or airborne remote sensing data analysis. The GPS based location information with measured forest data makes it possible to introduce measurements easily as in-situ reference data. The central projection geometry of digital photographs allows the use of the relascope principle [1] to measure the basal area of stems per area unit, a variable very closely associated with tree biomass. Relascope is applied all over the world for forest inventory. Experiments with independent ground reference data have shown that in-situ data analysed from photographs can be utilised as reference data for satellite image analysis. The concept was validated by comparing mobile measurements with 54 independent ground reference plots from the Hyytiälä forest research station in Finland [2]. Citizen scientists could provide the manpower for analysing photographs from forests on a global level and support researchers working on tasks related to forests. This low-cost solution can also increase the coverage of forest management plans, particularly in regions where possibilities to invest on
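
    The relascope (angle-count) principle cited above can be sketched as follows: a tree is tallied when its stem subtends at least the gauge angle, and the basal area per hectare is the basal area factor (BAF) times the tally. The tree list and BAF below are hypothetical:

```python
import math

def basal_area_per_ha(trees, baf=1.0):
    """Angle-count (relascope) estimate of basal area (m2/ha).

    A tree is tallied when its diameter subtends at least the gauge
    angle alpha, i.e. when distance <= dbh / (2*sin(alpha/2)), where
    the basal area factor is BAF = 10000 * sin(alpha/2)**2.
    The estimate is then BAF * tally.
    """
    half_sine = math.sqrt(baf / 10000.0)
    tally = sum(1 for dbh_m, dist_m in trees
                if dist_m <= dbh_m / (2.0 * half_sine))
    return baf * tally

# Hypothetical plot: (dbh in m, distance in m) pairs around the camera point.
trees = [(0.30, 10.0), (0.20, 12.0), (0.40, 18.0), (0.25, 14.0)]
print(basal_area_per_ha(trees, baf=1.0))  # → 2.0
```

    The central-projection geometry of a photograph preserves subtended angles, which is why the same counting rule can be applied to stems in a geo-located image.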

  11. 3D PHOTOGRAPHS IN CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    W. Schuhr

    2013-07-01

    Full Text Available This paper on providing "oo-information" (= objective object-information) on cultural monuments and sites, based on 3D photographs, is also a contribution of CIPA task group 3 to the 2013 CIPA Symposium in Strasbourg. To stimulate interest in 3D photography among scientists as well as amateurs, 3D masterpieces are presented. It is shown by example that, due to their high documentary value ("near reality"), 3D photographs support, e.g., the recording, visualization, interpretation, preservation and restoration of architectural and archaeological objects. This includes samples of excavation documentation, 3D coordinate calculation, and 3D photographs applied for virtual museum purposes and as educational tools. 3D photography is further used for spatial structure enhancement, which holds in particular for inscriptions and rock art. This paper is also an invitation to participate in a systematic survey of existing international archives of 3D photographs; in this respect, first results on defining an optimum digitization rate for analog stereo views are reported. It is more than overdue that, in addition to access to international archives for 3D photography, the available 3D photography data should appear in a global GIS (cloud) system such as, e.g., Google Earth. This contribution also deals with exposing new 3D photographs to document monuments of importance for Cultural Heritage, including the use of 3D and single-lens cameras on a 10 m telescopic staff for extremely low earth-based airborne 3D photography, as well as for "underwater staff photography". The use of captive balloon and drone platforms for 3D photography in Cultural Heritage is also reported. It should be emphasized that the still underestimated 3D effect on real objects even allows, e.g., the spatial perception of extremely small scratches as well as of nuances in

  12. 8 CFR 236.5 - Fingerprints and photographs.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Fingerprints and photographs. 236.5 Section... to Order of Removal § 236.5 Fingerprints and photographs. Every alien 14 years of age or older... photographs shall be made available to Federal, State, and local law enforcement agencies upon request to the...

  13. The aesthetic appeal of depth of field in photographs

    NARCIS (Netherlands)

    Zhang, T.; Nefs, H.T.; Redi, J.; Heynderickx, I.E.J.

    2014-01-01

    We report here how depth of field (DOF) affects the aesthetic appeal of photographs for different content categories. 339 photographs spanning eight categories were selected from Flickr, Google+, and personal collections. First, we classified the 339 photographs into three levels of depth of field:

  14. Using Photographs to Integrate Liberal Arts Learning in Business Education

    Science.gov (United States)

    Madden, Laura T.; Smith, Anne D.

    2015-01-01

    The inclusion of photographic approaches in the business classroom can incorporate missing elements of liberal education into business education, which were highlighted in a recent Carnegie study of undergraduate business education. Building on photographic methods in social science research, we identify three categories of photographic approaches…

  15. 7 CFR 500.9 - Photographs for news or advertising.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Photographs for news or advertising. 500.9 Section 500... for news or advertising. Photographs for news purposes may be taken at the USNA without prior permission. Photographs for advertising and other commercial purposes may be taken, but only with the prior...

  16. Images of the Great Depression: A Photographic Essay.

    Science.gov (United States)

    Stevens, Robert L.; Fogel, Jared A.

    2001-01-01

    Provides background information on the Farm Security Administration (FSA) and the photographic section of the FSA. Identifies six photographers and features three photographers (Walker Evans, Dorothea Lange, and Ben Shahn) who were recruited to document farm conditions. Discusses using FSA photos in the classroom and provides lesson plans to help…

  17. Recognition Memory for Movement in Photographs: A Developmental Study.

    Science.gov (United States)

    Futterweit, Lorelle R.; Beilin, Harry

    1994-01-01

    Investigated whether children's recognition memory for movement in photographs is distorted forward in the direction of implied motion. When asked whether the second photograph was the same as or different from the first, subjects made more errors for test photographs showing the action slightly forward in time, compared with slightly backward in…

  18. Catalyst volumetric fraction simulation in a riser of a cold flow pilot unit with aid of transmission gamma technique

    International Nuclear Information System (INIS)

    Santos, Kamylla A.L. dos; Lima Filho, Hilario J.B. de; Benachour, Mohand; Dantas, Carlos C.; Santos, Valdemir A. dos

    2013-01-01

    The radial profile of the catalyst volume fraction was obtained in the riser of a cold-flow pilot unit of a Fluid Catalytic Cracking (FCC) unit and used to adjust the catalyst inlet conditions in a Computational Fluid Dynamics (CFD) simulation program. The riser of the Cold Flow Pilot Unit (CFPU) is 6.0 m high with an inner diameter of 0.097 m. A γ-radiation source of Am-241 and a NaI(Tl) detector, with lead shielding, were installed on a steel support that maintains the source-detector-riser geometry and allows the source-to-detector distance and the radial position in a given cross section of the riser to be varied. The simulation of the radial profile of the catalyst volume fraction used: Fluent software, version 12.0; the preprocessor GAMBIT, version 2.3.16; an Eulerian approach; a structured mesh with 60000 cells; the k-ε turbulence model; and the kinetic theory of granular flow (KTGF) to describe the solid phase. Comparison of the simulated and experimental radial profiles of the catalyst volume fraction in the CFPU riser identified needed adjustments to the simulated catalyst input, with consequent validation of the proposed simulation model. (author)
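
    The transmission measurement described above rests on Beer-Lambert attenuation along the source-detector chord. A minimal sketch, with hypothetical attenuation and density values (not the paper's calibration): the empty-riser count rate cancels the gas and wall contributions, leaving the solids fraction in the exponent.

```python
import math

def solids_fraction(i_gas, i_meas, mu_mass, rho_solid, path_m):
    """Solids volume fraction from gamma transmission (Beer-Lambert).

    With the gas/wall contribution cancelled by the empty-riser
    reference count rate i_gas:
        I = i_gas * exp(-mu_mass * rho_solid * eps * L)
    so  eps = ln(i_gas / I) / (mu_mass * rho_solid * L).
    """
    return math.log(i_gas / i_meas) / (mu_mass * rho_solid * path_m)

# Hypothetical values: 0.0095 m2/kg mass attenuation coefficient,
# 1400 kg/m3 catalyst density, 0.097 m chord (riser inner diameter).
MU, RHO, D = 0.0095, 1400.0, 0.097

eps_true = 0.05
i = 10000.0 * math.exp(-MU * RHO * eps_true * D)  # simulated count rate
print(round(solids_fraction(10000.0, i, MU, RHO, D), 3))  # → 0.05
```

    Scanning the source-detector pair across chords at several radial positions yields the radial profile compared against the CFD prediction.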

  19. Quantification of marine macro-debris abundance around Vancouver Island, Canada, based on archived aerial photographs processed by projective transformation.

    Science.gov (United States)

    Kataoka, Tomoya; Murray, Cathryn Clarke; Isobe, Atsuhiko

    2017-09-12

    The abundance of marine macro-debris was quantified with high spatial resolution by applying an image processing technique to archived shoreline aerial photographs taken over Vancouver Island, Canada. The photographs taken from an airplane at oblique angles were processed by projective transformation for georeferencing, where five reference points were defined by comparing aerial photographs with satellite images of Google Earth. Thereafter, pixels of marine debris were extracted based on their color differences from the background beaches. The debris abundance can be evaluated by the ratio of an area covered by marine debris to that of the beach (percent cover). The horizontal distribution of percent cover of marine debris was successfully computed from 167 aerial photographs and was significantly related to offshore Ekman flows and winds (leeway drift and Stokes drift). Therefore, the estimated percent cover is useful information to determine priority sites for mitigating adverse impacts across broad areas. Copyright © 2017 Elsevier Ltd. All rights reserved.
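
    Once the photograph is georeferenced and classified, the percent-cover metric reduces to a ratio of pixel counts. A toy sketch with a hypothetical mask pair (not the paper's classification method):

```python
import numpy as np

def percent_cover(debris_mask, beach_mask):
    """Percent cover: debris pixels as a share of beach pixels."""
    beach = np.count_nonzero(beach_mask)
    debris = np.count_nonzero(debris_mask & beach_mask)
    return 100.0 * debris / beach

# Toy 4x4 scene: the left three columns are beach, two pixels are debris.
beach = np.zeros((4, 4), dtype=bool)
beach[:, :3] = True
debris = np.zeros((4, 4), dtype=bool)
debris[0, 0] = debris[2, 1] = True
print(round(percent_cover(debris, beach), 2))  # → 16.67
```

    Restricting the debris count to pixels inside the beach mask keeps water and vegetation pixels from inflating the ratio.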

  20. Oscilloscope trace photograph digitizing system (TRACE)

    International Nuclear Information System (INIS)

    Richards, M.; Dabbs, R.D.

    1977-10-01

    The digitizing system allows digitization of photographs or sketches of waveforms; the computer is then used to reduce and analyze the data. The software allows for alignment, calibration, removal of baselines, removal of unwanted points and addition of new points, which makes for a fairly versatile system as far as data reduction and manipulation are concerned. System considerations are introduced first to orient the potential user to the process of digitizing information. The start-up and actual commands for TRACE are discussed. Detailed descriptions of each subroutine and program section are also provided. Three general examples of typical photographs are included. A partial listing of FAWTEK is made available. Once suitable arrays containing the data are arranged, ''GO FA'' (active FAWTEK) may be issued and many mathematical operations performed to further analyze the data

  1. The Cambridge photographic atlas of galaxies

    CERN Document Server

    König, Michael

    2017-01-01

    Galaxies - the Milky Way's siblings - offer a surprising variety of forms and colours. Displaying symmetrical spiral arms, glowing red nebulae or diffuse halos, even the image of a galaxy can reveal much about its construction. All galaxies consist of gas, dust and stars, but the effects of gravity, dark matter and the interaction of star formation and stellar explosions all influence their appearances. This volume showcases more than 250 of the most beautiful galaxies within an amateur's reach and uses them to explain current astrophysical research. It features fantastic photographs, unique insights into our knowledge, tips on astrophotography and essential facts and figures based on the latest science. From the Andromeda Galaxy to galaxy clusters and gravitational lenses, the nature of galaxies is revealed through these stunning amateur photographs. This well illustrated reference atlas deserves a place on the bookshelves of astronomical imagers, observers and armchair enthusiasts.

  2. Use of simulation techniques to detect changes in property value after the accident at Three Mile Island

    International Nuclear Information System (INIS)

    Downing, R.H.; Gamble, H.B.

    1982-01-01

    The purpose of this simulation is to predict the sale price of properties in various geographical cells after the accident and compare them with the actual sales. Tests for significance were made in twenty cells. Several were significantly higher and only those north of the plant were significantly lower

  3. Validation of techniques for simulating long range dispersal and deposition of atmospheric pollutants based upon measurements after the Chernobyl accident

    International Nuclear Information System (INIS)

    Tveten, U.

    1987-02-01

    Problem specifications and a time schedule for an international study of computerized simulation of transfrontier atmospheric contamination are presented. Started on the initiative of the Nordic Liaison Committee for Atomic Energy, the study will be based on international measurements after the Chernobyl accident

  4. Design and Simulation of Control Technique for Permanent Magnet Synchronous Motor Using Space Vector Pulse Width Modulation

    Science.gov (United States)

    Khan, Mansoor; Yong, Wang; Mustafa, Ehtasham

    2017-07-01

    After the rapid advancement in the field of power electronics devices and drives over the last few decades, different kinds of Pulse Width Modulation (PWM) techniques have been brought to the market, with applications ranging from industrial appliances to military equipment, including home appliances. A very common application of PWM is the three-phase voltage source inverter, used to convert DC to AC, for example to supply power to a house in case of electricity failure, usually called an Uninterruptible Power Supply. In this paper the Space Vector Pulse Width Modulation (SVPWM) technique is discussed and analysed under the control technique named Field Oriented Control. The working and implementation of this technique have been studied by implementing it on a three-phase bridge inverter. The technique is used to control a Permanent Magnet Synchronous Motor. The drive system is successfully implemented in MATLAB/Simulink using mathematical equations and algorithms to achieve satisfactory results. A PI-type controller is used to tune the parameters of the motor, i.e. torque and current.
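
    The core SVPWM step, finding the sector of the reference voltage vector and the dwell times of the two adjacent active vectors, can be sketched as follows (a textbook formulation; the numeric values are illustrative, not taken from the paper):

```python
import math

def svpwm_times(v_ref, theta_deg, v_dc, t_s):
    """Sector and dwell times for space-vector PWM.

    theta_deg is the reference-vector angle (0-360 deg). Within a
    sector, with th the angle into the sector and modulation index
    m = sqrt(3) * v_ref / v_dc:
        T1 = Ts * m * sin(60deg - th)
        T2 = Ts * m * sin(th)
        T0 = Ts - T1 - T2   (zero-vector time)
    """
    sector = int(theta_deg // 60) % 6 + 1
    th = math.radians(theta_deg - (sector - 1) * 60.0)
    m = math.sqrt(3.0) * v_ref / v_dc
    t1 = t_s * m * math.sin(math.radians(60.0) - th)
    t2 = t_s * m * math.sin(th)
    t0 = t_s - t1 - t2
    return sector, t1, t2, t0

sector, t1, t2, t0 = svpwm_times(v_ref=100.0, theta_deg=100.0, v_dc=400.0, t_s=1e-4)
print(sector)                            # reference vector lies in sector 2
print(abs(t1 + t2 + t0 - 1e-4) < 1e-12)  # dwell times fill the switching period
```

    In a field-oriented drive, v_ref and theta_deg would come from the inverse Park transform of the PI current-controller outputs, recomputed every switching period.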

  5. Investigation of Photographic Image Quality Estimators

    Science.gov (United States)

    1980-04-01

    Biberman (1973) describes acutance as being "expressed in terms of the mean square of the gradient of ... density (in a photographic image)". The procedure is to measure the density difference ΔD for each interval from the (smoothed) microdensitometer trace (calibrated in density units) and then compute the gradient. Rotation effects were also examined; the conditions included a medium-contrast, variable-aspect target and a 250-millisecond shutter speed.

  6. Front Cover Photograph & Interview for FREEYE Magazine

    OpenAIRE

    Murray, Matthew

    2003-01-01

    Matthew Murray Front Cover Photograph & Interview for FREEYE Magazine - Dutch Quarterly For Exceptional International Photography, Holland. The article focuses on Murray's practice: his personal work, commissioned work, advertising, gallery and exhibition work, along with his methodology. It looks at Murray's inspirations and how they feed into his personal projects, and how this personal work feeds into shooting above-the-line advertising campaigns. Murray's work blurs the lines between pers...

  7. Perfection of the individual photographic emulsion dosimeter

    International Nuclear Information System (INIS)

    Soudain, G.

    1960-01-01

    A photographic dosimeter making possible the measurement of γ radiation doses from 10 mr up to 800 r by means of 3 emulsion bands of varying sensitivity stuck to the same support is described. The dosimeter also has a zone for marking and a test film insensitive to radiation. It requires a photometric measurement by diffuse reflection and makes it possible to measure doses with an accuracy of 20 per cent. (author) [fr

  8. An agent-based simulation combined with group decision-making technique for improving the performance of an emergency department

    Directory of Open Access Journals (Sweden)

    M. Yousefi

    Full Text Available This study presents an agent-based simulation model of an emergency department. In a traditional approach, a supervisor (or a manager allocates the resources (receptionist, nurses, doctors, etc. to different sections based on personal experience or by using decision-support tools. In this study, each staff agent took part in the process of allocating resources based on their observation in their respective sections, which gave the system the advantage of utilizing all the available human resources during the workday by being allocated to a different section. In this simulation, unlike previous studies, all staff agents took part in the decision-making process to re-allocate the resources in the emergency department. The simulation modeled the behavior of patients, receptionists, triage nurses, emergency room nurses and doctors. Patients were able to decide whether to stay in the system or leave the department at any stage of treatment. In order to evaluate the performance of this approach, 6 different scenarios were introduced. In each scenario, various key performance indicators were investigated before and after applying the group decision-making. The outputs of each simulation were the number of deaths, the number of patients who left the emergency department without being attended, length of stay, waiting time and the total number of patients discharged from the emergency department. Applying the self-organizing approach in the simulation showed average decreases of 12.7% and 14.4% in total waiting time and in the number of patients who left without being seen, respectively. The results showed an average increase of 11.5% in the total number of patients discharged from the emergency department.

  9. Tracking Protests Using Geotagged Flickr Photographs.

    Directory of Open Access Journals (Sweden)

    Merve Alanyali

    Full Text Available Recent years have witnessed waves of protests sweeping across countries and continents, in some cases resulting in political and governmental change. Much media attention has been focused on the increasing usage of social media to coordinate and provide instantly available reports on these protests. Here, we investigate whether it is possible to identify protest outbreaks through quantitative analysis of activity on the photo sharing site Flickr. We analyse 25 million photos uploaded to Flickr in 2013 across 244 countries and regions, and determine for each week in each country and region what proportion of the photographs are tagged with the word "protest" in 34 different languages. We find that higher proportions of "protest"-tagged photographs in a given country and region in a given week correspond to greater numbers of reports of protests in that country and region and week in the newspaper The Guardian. Our findings underline the potential value of photographs uploaded to the Internet as a source of global, cheap and rapidly available measurements of human behaviour in the real world.

  9. Application perspectives of CFD simulation techniques in nuclear power plants; Perspectivas de aplicacion de tecnicas de modelado CFD en plantas nucleoelectricas

    Energy Technology Data Exchange (ETDEWEB)

    Galindo G, I. F., E-mail: igalindo@iie.org.mx [Instituto de Investigaciones Electricas, Reforma No. 113, Col. Palmira, 62490 Cuernavaca, Morelos (Mexico)

    2013-10-15

    Scenario simulation in nuclear power plants is usually carried out with system codes based on lumped-parameter networks. However, situations exist in some components where the flow is predominantly 3-D, such as natural circulation, mixing and stratification phenomena. Computational fluid dynamics (CFD) simulation techniques have the potential to simulate these flows numerically. The use of CFD simulation embraces many branches of engineering and continues to grow; however, its application to problems related to the safety of nuclear power plants is less developed, although it is accelerating quickly and is expected to play a more prominent role in future analyses. A main obstacle to general acceptance of CFD is that the simulations require very complete validation studies, which are sometimes not available. This article presents a general overview of the state of application of CFD methods in nuclear power plants and the problems associated with their routine application and acceptance, including the viewpoint of the regulatory authorities. Application examples in which CFD offers real benefits are reviewed, and two illustrative study cases of the application of CFD techniques are presented. The first is the case of a water vessel with a heat source in its interior, similar to the spent fuel pool of a nuclear power plant; the second is the case of the dilution of boron in a water volume entering a nuclear reactor. We can conclude that CFD technology represents a very important opportunity to improve the understanding of phenomena with a strong 3-D component and to contribute to the reduction of uncertainty. (Author)
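As a minimal illustration of the kind of 3-D field problem the first study case concerns (a water vessel with an internal heat source), the sketch below solves steady heat diffusion by Jacobi iteration on a coarse 2-D grid. This is only the discretization idea; a real CFD analysis of the pool would also solve the momentum and turbulence equations, and the grid size, source strength and boundary conditions here are arbitrary assumptions.

```python
import numpy as np

n = 20
t = np.zeros((n, n))          # temperature field, walls held at t = 0
source = np.zeros((n, n))
source[n // 2, n // 2] = 1.0  # point heat source at the centre of the vessel

# Jacobi sweeps: each interior cell is updated from the average of its four
# neighbours plus the local source term.
for _ in range(500):
    t[1:-1, 1:-1] = 0.25 * (t[2:, 1:-1] + t[:-2, 1:-1] +
                            t[1:-1, 2:] + t[1:-1, :-2] + source[1:-1, 1:-1])

print(t[n // 2, n // 2] > t[1, 1])  # True: temperature peaks near the source
```

The validation concern raised in the abstract applies even at this level: before trusting any such field, the discretization and convergence must be checked against experiments or analytical solutions.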

  11. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.; Campos, L.L.; Perez, H.E.B.

    1996-01-01

    The personal dosemeter system of IPEN is based on film dosimetry. Personal doses at IPEN are mainly due to X or gamma radiation. The use of personal photographic dosemeters involves two steps: first, data acquisition, including evaluation with respect to the calibration quantity; and second, interpretation of the data in terms of effective dose. The effective dose was calculated using artificial intelligence techniques by means of a neural network. The training of the neural network was performed using the readings of optical density as a function of incident energy and exposure taken from the calibration curve. The output obtained in routine use is the mean effective energy and the effective dose. (author)
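The mapping the abstract describes, from a set of film optical-density readings to an effective energy and an effective dose, can be sketched as a small feed-forward network. The layer sizes and weights below are illustrative placeholders, not the calibrated network from the study, which was trained on IPEN's calibration-curve data.

```python
import numpy as np

def forward(densities, w1, b1, w2, b2):
    """Two-layer network: tanh hidden layer, linear output
    (effective energy, effective dose)."""
    hidden = np.tanh(densities @ w1 + b1)
    return hidden @ w2 + b2

# Illustrative random weights standing in for a trained network:
# 4 filter readings -> 6 hidden units -> 2 outputs.
rng = np.random.default_rng(0)
w1 = rng.normal(size=(4, 6)) * 0.5
b1 = np.zeros(6)
w2 = rng.normal(size=(6, 2)) * 0.5
b2 = np.zeros(2)

densities = np.array([0.8, 0.5, 0.3, 0.2])  # optical densities under 4 filters
out = forward(densities, w1, b1, w2, b2)    # -> [effective energy, dose]
print(out.shape)  # (2,)
```

In practice the weights would be fitted so that the network reproduces the calibration curve: optical density versus incident energy and exposure.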

  12. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.

    1994-01-01

    A new methodology for equivalent dose calculation has been developed at IPEN-CNEN/SP to be applied at the Photographic Dosimetry Laboratory, using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, from the irradiation performed to obtain the calibration curve up to the optical density readings. The training of the neural network was performed by taking the readings of optical density from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation. (author)

  13. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.; Campos, L.L.

    1994-01-01

    A new methodology for equivalent dose calculation has been developed at IPEN-CNEN/SP to be applied at the Photographic Dosimetry Laboratory, using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, from the irradiation performed to obtain the calibration curve up to the optical density readings. The training of the neural network was performed by taking readings of optical density from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation.

  14. Painterly rendered portraits from photographs using a knowledge-based approach

    Science.gov (United States)

    DiPaola, Steve

    2007-02-01

    Portrait artists using oils, acrylics or pastels use a specific but open human-vision methodology to create a painterly portrait of a live sitter. When they must use a photograph as the source, artists augment their process, since photographs have: different focusing - everything is in focus or focused in vertical planes; value clumping - the camera darkens the shadows and lightens the bright areas; as well as color and perspective distortion. In general, the artistic methodology attempts the following: from the photograph, the painting must 'simplify, compose and leave out what's irrelevant, emphasizing what's important'. While seemingly a qualitative goal, artists use known techniques such as relying on source tone over color to indirect into a semantic color temperature model, using brush and tonal "sharpness" to create a center of interest, and using lost and found edges to move the viewer's gaze through the image towards the center of interest, as well as other techniques to filter and emphasize. Our work attempts to create a knowledge domain of the portrait-painting process and incorporate this knowledge into a multi-space parameterized system that can create an array of NPR painterly rendering outputs by analyzing the photographic input, informed by the semantic knowledge rules.
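One of the rules mentioned above, deriving a warm/cool "color temperature" choice from source tone rather than source color, can be sketched as a simple lookup. The thresholds and band names below are invented for illustration; the paper's actual knowledge rules are richer and parameterized.

```python
def tone_to_temperature(luminance):
    """Map a 0-1 source luminance to a coarse semantic temperature band,
    standing in for the tone-driven color temperature rule."""
    if not 0.0 <= luminance <= 1.0:
        raise ValueError("luminance must be in [0, 1]")
    if luminance < 0.33:
        return "cool-shadow"      # dark tones painted with cool colors
    if luminance < 0.66:
        return "neutral-midtone"
    return "warm-highlight"       # bright tones painted with warm colors

print(tone_to_temperature(0.8))   # warm-highlight
```

A full system would apply such rules per region, modulated by the center of interest and edge treatment the abstract describes.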

  15. Applications of simulation technique on debris-flow hazard zone delineation: a case study in Hualien County, Taiwan

    Directory of Open Access Journals (Sweden)

    S. M. Hsu

    2010-03-01

    Full Text Available Debris flows pose severe hazards to communities in mountainous areas, often resulting in the loss of life and property. Helping debris-flow-prone communities delineate potential hazard zones provides local authorities with useful information for developing emergency plans and disaster management policies. In 2003, the Soil and Water Conservation Bureau of Taiwan proposed an empirical model to delineate hazard zones for all creeks with debris-flow potential (1420 in total) and utilized the model to help establish a hazard prevention system. However, the model does not fully consider the hydrologic and physiographic conditions of a given creek in simulation. The objective of this study is to propose new approaches that can improve the accuracy of hazard zone delineation and simulate hazard zones in response to different rainfall intensities. In this study, FLO-2D, a two-dimensional commercial model that is physically based and takes into account the momentum and energy conservation of the flow, was used to simulate debris-flow inundated areas.

    Sensitivity analysis with the model was conducted to determine the main parameters that affect the debris flow simulation. Results indicate that the roughness coefficient, yield stress and volumetric sediment concentration dominate the computed results. To improve the accuracy of the model, the study examined the performance of the rainfall-runoff model of FLO-2D as compared with that of the HSPF (Hydrological Simulation Program - Fortran) model, and the proper values of the significant parameters were then evaluated through the calibration process. Results reveal that the HSPF model performs better than the FLO-2D model at peak flow and during the flow recession period, and that the volumetric sediment concentration and yield stress can be estimated from the channel slope. The validation of the model for simulating debris-flow hazard zones has been confirmed by a comparison with field evidence from historical debris flows.
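The parameter screening described above can be illustrated with a one-at-a-time sensitivity sweep: perturb each parameter in turn and record the relative change in the model output. The surrogate function and parameter values below are invented stand-ins, not FLO-2D itself, whose parameters of interest are the roughness coefficient n, yield stress, and volumetric sediment concentration.

```python
def inundation_area(params):
    """Toy surrogate: output grows with sediment concentration (Cv) and
    yield stress (tau_y), shrinks with roughness (n). Purely illustrative."""
    return 100 * params["Cv"] * params["tau_y"] / params["n"]

base = {"n": 0.1, "tau_y": 2.0, "Cv": 0.45}
base_out = inundation_area(base)

# One-at-a-time screening: +10% on each parameter, relative output change.
sensitivity = {}
for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.10
    sensitivity[name] = (inundation_area(perturbed) - base_out) / base_out

print(round(sensitivity["Cv"], 2))  # 0.1: +10% in Cv gives +10% output here
print(sensitivity["n"] < 0)         # True: more roughness, smaller output
```

In the study, the analogous ranking (roughness, yield stress and sediment concentration dominating) came from runs of the full FLO-2D model, not a closed-form surrogate.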

  16. Determination of output factor for 6 MV small photon beam: comparison between Monte Carlo simulation technique and microDiamond detector

    International Nuclear Information System (INIS)

    Krongkietlearts, K; Tangboonduangjit, P; Paisangittisakul, N

    2016-01-01

    In order to improve the quality of life of cancer patients, radiation techniques are constantly evolving. In particular, two modern techniques, intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), are quite promising. They employ many small beams (beamlets) with various intensities to deliver the intended radiation dose to the tumor with minimal dose to the nearby normal tissue. The study investigates whether the microDiamond detector (PTW), a synthetic single-crystal diamond detector, is suitable for small-field output factor measurement. The results were compared with those measured by the stereotactic field detector (SFD) and with Monte Carlo simulation (EGSnrc/BEAMnrc/DOSXYZ). The calibration of the Monte Carlo simulation was done using the percentage depth dose and dose profile measured by the photon field detector (PFD) for a 10×10 cm² field at 100 cm SSD. The values obtained from the calculations and measurements are consistent, differing by no more than 1%. The output factors obtained from the microDiamond detector were compared with those of the SFD and the Monte Carlo simulation; the results demonstrate a percentage difference of less than 2%. (paper)
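The output factor being compared above is simply the detector reading for a given field size divided by the reading for the 10×10 cm² reference field, measured under the same depth and SSD conditions. The numbers below are illustrative, not the measured data from the study.

```python
# Illustrative detector readings (arbitrary units) per field size in cm.
readings = {
    (10, 10): 1.000,   # reference field
    (3, 3):   0.862,
    (1, 1):   0.674,
}

def output_factor(field, reference=(10, 10)):
    """Output factor: reading for the field of interest normalized to the
    reference-field reading."""
    return readings[field] / readings[reference]

print(round(output_factor((3, 3)), 3))  # 0.862
```

For the very small fields studied here, the detector's own perturbation of the beam matters, which is why the microDiamond readings are benchmarked against the SFD and Monte Carlo.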

  17. Cognitive environment simulation: An artificial intelligence system for human performance assessment: Cognitive reliability analysis technique: [Technical report, May 1986-June 1987]

    International Nuclear Information System (INIS)

    Woods, D.D.; Roth, E.M.

    1987-11-01

    This report documents the results of Phase II of a three phase research program to develop and validate improved methods to model the cognitive behavior of nuclear power plant (NPP) personnel. In Phase II a dynamic simulation capability for modeling how people form intentions to act in NPP emergency situations was developed based on techniques from artificial intelligence. This modeling tool, Cognitive Environment Simulation or CES, simulates the cognitive processes that determine situation assessment and intention formation. It can be used to investigate analytically what situations and factors lead to intention failures, what actions follow from intention failures (e.g., errors of omission, errors of commission, common mode errors), the ability to recover from errors or additional machine failures, and the effects of changes in the NPP person-machine system. The Cognitive Reliability Assessment Technique (or CREATE) was also developed in Phase II to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. 34 refs., 7 figs., 1 tab

  18. Evaluation of photographs supporting an FFQ developed for adolescents.

    Science.gov (United States)

    Brito, Alessandra Page; Guimarães, Celso Pereira; Pereira, Rosangela Alves

    2014-01-01

    To evaluate the validity of food photographs used to support the reporting of food intake with an FFQ designed for adolescents from Rio de Janeiro, Brazil. A set of ninety-five food photographs was developed. The evaluation process consisted of the recognition of foods and portions in the pictures. In the identification of foods (ninety-five photographs) and typical portions (twelve photographs), the adolescents were asked to answer a structured questionnaire about the food photographs. The identification of the portion size of amorphous foods (forty-three photographs) was performed using three different portion sizes of actual preparations. The proportions (and 95% confidence intervals) of adolescents who correctly identified the foods and portion sizes in each photograph were estimated. A public school in Niterói, Rio de Janeiro State, Brazil. Sixty-two adolescents between 11·0 and 18·9 years old, randomly selected. At least 90% of the adolescents correctly identified the food in ninety-two photographs, and the food in the three remaining photographs was recognized by 80-89% of the adolescents. At least 98% of the adolescents correctly identified the eleven typical or natural portions in the food photographs. For amorphous foods, at least 70% of the adolescents correctly identified the portion size in the photographs of thirty-one foods; for the remaining photographs, the portion size was correctly recognized by 50-69% of the adolescents for eight foods and by less than 50% for four foods. The analysed photographs are appropriate visual aids for the reporting of food consumption by adolescents.
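The proportions with 95% confidence intervals reported above can be computed with a standard normal-approximation interval for a binomial proportion. The counts below are illustrative (e.g. 56 of the 62 adolescents answering correctly), not figures taken from the study.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion,
    clipped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

p, lo, hi = proportion_ci(56, 62)   # hypothetical: 56 of 62 correct
print(round(p, 3))  # 0.903
```

With n = 62 and proportions near 1, the normal approximation is rough; a Wilson or exact (Clopper-Pearson) interval would be a more careful choice for the extreme photographs.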

  19. Evaluation of a scatter correction technique for single photon transmission measurements in PET by means of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wegmann, K.; Brix, G.

    2000-01-01

    Purpose: Single photon transmission (SPT) measurements offer a new approach for the determination of attenuation correction factors (ACF) in PET. The aim of the present work was to evaluate a scatter correction algorithm proposed by C. Watson by means of Monte Carlo simulations. Methods: SPT measurements with a Cs-137 point source were simulated for a whole-body PET scanner (ECAT EXACT HR+) in both the 2D and 3D mode. To examine the scatter fraction (SF) in the transmission data, the detected photons were classified as unscattered or scattered. The simulated data were used to determine (i) the spatial distribution of the SFs, (ii) an ACF sinogram from all detected events (ACF_tot), (iii) an ACF sinogram from the unscattered events only (ACF_unscattered), and (iv) a corrected sinogram ACF_cor = (ACF_tot)^(1+κ) according to the Watson algorithm. In addition, density images were reconstructed in order to quantitatively evaluate linear attenuation coefficients. Results: A high correlation was found between the SF and the ACF_tot sinograms. For the cylinder and the EEC phantom, similar correction factors κ were estimated. The determined values resulted in an accurate scatter correction in both the 2D and 3D mode. (orig.) [de
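The scatter-correction scaling evaluated above is a power law applied element-wise to the ACF sinogram: ACF_cor = (ACF_tot)^(1+κ), with κ a scanner- and mode-dependent constant. The sinogram values and the κ below are illustrative only.

```python
import numpy as np

def watson_correct(acf_tot, kappa):
    """Apply the power-law scatter correction to an ACF sinogram:
    ACF_cor = ACF_tot ** (1 + kappa), element-wise."""
    return np.power(acf_tot, 1.0 + kappa)

acf_tot = np.array([[1.0, 2.0],
                    [4.0, 8.0]])          # toy 2x2 ACF sinogram
acf_cor = watson_correct(acf_tot, kappa=0.5)
print(round(float(acf_cor[1, 1]), 3))     # 22.627, i.e. 8**1.5
```

Because ACF values are >= 1, a positive κ always increases the correction, which is the intended compensation for scattered transmission events that make the object look less attenuating than it is.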

  20. Reduction technique of drop voltage and power losses to improve power quality using ETAP Power Station simulation model

    Science.gov (United States)

    Satrio, Reza Indra; Subiyanto

    2018-03-01

    The growth of electric loads has a direct impact on power distribution systems. Voltage drop and power losses are among the important issues in power distribution systems. This paper presents a modelling approach used to restructure the electrical network configuration, reduce voltage drop, reduce power losses, and add a new distribution transformer to enhance the reliability of the power distribution system. The network restructuring was aimed at analysing and investigating the electric loads of a distribution transformer. Measurements of real voltage and real current were performed twice for each consumer, once in the morning period and once in the night (peak load) period. The design and simulation were conducted using the ETAP Power Station software. Based on the simulation results and the real measurements, the percentage of voltage drop and the total power losses did not comply with SPLN (Standard PLN) 72:1987. After a new distribution transformer was added and the electricity network configuration was restructured, the simulation showed that the voltage drop could be reduced from 1.3 % - 31.3 % to 8.1 % - 9.6 % and the power losses from 646.7 watt to 233.29 watt. The results showed that restructuring the electricity network configuration and adding a new distribution transformer can be applied as an effective method to reduce voltage drop and power losses.
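The two quantities the study reports can be estimated with textbook feeder formulas: the approximate voltage drop ΔV = I·(R·cosφ + X·sinφ)·L and the resistive loss P = I²·R·L. The conductor impedances, load current and nominal voltage below are illustrative assumptions, not the network analysed in the paper (which was modelled in ETAP).

```python
import math

def drop_voltage_pct(i_amp, r_ohm_km, x_ohm_km, length_km, pf, v_nominal):
    """Approximate percentage voltage drop on a radial feeder:
    dV = I * (R*cos(phi) + X*sin(phi)) * L."""
    phi = math.acos(pf)
    dv = i_amp * (r_ohm_km * math.cos(phi) + x_ohm_km * math.sin(phi)) * length_km
    return 100.0 * dv / v_nominal

def line_loss_w(i_amp, r_ohm_km, length_km):
    """Resistive power loss in one conductor: P = I^2 * R * L."""
    return i_amp ** 2 * r_ohm_km * length_km

# Hypothetical feeder: 60 A load, 0.6 + j0.35 ohm/km conductor, 1.2 km,
# power factor 0.85, 230 V nominal.
pct = drop_voltage_pct(i_amp=60, r_ohm_km=0.6, x_ohm_km=0.35,
                       length_km=1.2, pf=0.85, v_nominal=230)
loss = line_loss_w(60, 0.6, 1.2)
print(round(pct, 1))
print(round(loss, 1))  # 2592.0 W
```

Shortening feeder runs (restructuring) or splitting load onto a new transformer reduces both I and L in these formulas, which is why the interventions in the study cut both the voltage drop and the losses.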