Sample records for simulated photograph technique

  1. Rugoscopy: Human identification by computer-assisted photographic superimposition technique. (United States)

    Mohammed, Rezwana Begum; Patil, Rajendra G; Pammi, V R; Sandya, M Pavana; Kalyan, Siva V; Anitha, A


    Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, the palatal rugae may be considered as an alternative source of material. The present study was done to evaluate the individualistic nature and use of palatal rugae patterns for personal identification, and also to test the efficiency of computerized software for forensic identification by photographic superimposition of palatal photographs obtained from casts. Two sets of alginate impressions were made from the upper arches of 100 individuals (50 males and 50 females) with a one-month interval in between, and the casts were poured. All the teeth except the incisors were removed to ensure that only the palate could be used in the identification process. In one set of casts, the palatal rugae were highlighted with a graphite pencil. All 200 casts were randomly numbered and then photographed with a 10.1-megapixel Kodak digital camera using a standardized method. Using computerized software, the digital photographs of the casts without highlighted rugae were overlapped with transparent images of the casts with highlighted rugae, in order to identify the pairs by superimposition. The incisors were retained and used as landmarks to determine the magnification required to bring the two sets of photographs to the same size, so that the images could be superimposed exactly. Overlapping the digital photographs of the highlighted palatal rugae over the normal set without highlighting resulted in 100% positive identification. 
This study showed that utilization of palatal photographs
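    The scale-then-overlay procedure this record describes can be sketched in a few lines of Python. This is a minimal illustration under assumed inputs, not the authors' software: the incisor landmark coordinates and the 50% blending weight are hypothetical.

```python
import math

def magnification_factor(landmarks_a, landmarks_b):
    """Scale factor that brings photo B to the size of photo A, using the
    distance between two incisor landmarks identified in each photo."""
    (ax1, ay1), (ax2, ay2) = landmarks_a
    (bx1, by1), (bx2, by2) = landmarks_b
    dist_a = math.hypot(ax2 - ax1, ay2 - ay1)
    dist_b = math.hypot(bx2 - bx1, by2 - by1)
    return dist_a / dist_b

def superimpose(base, overlay, alpha=0.5):
    """Alpha-blend two equally sized grayscale images (nested lists, 0-255),
    i.e. lay a semi-transparent copy of one cast photo over the other."""
    return [[round((1 - alpha) * b + alpha * o)
             for b, o in zip(brow, orow)]
            for brow, orow in zip(base, overlay)]
```

    In practice the factor returned by `magnification_factor` would drive an image resize before blending; a matched pair then shows coinciding rugae ridges in the superimposed result.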

  2. Best of Adobe Photoshop techniques and images from professional photographers

    CERN Document Server

    Hurter, Bill


    Bill Hurter is the editor of "Rangefinder" magazine, the former editor of "Petersen's PhotoGraphic," and the author of "The Best of Wedding Photography," "Group Portrait Photography Handbook," "The Portrait Photographer's Guide to Posing," and "Portrait Photographer's Handbook." He lives in Santa Monica, California.

  3. Airflow Simulation Techniques

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    The paper describes the development of airflow simulation in rooms. The research is, like other areas of flow research, influenced by the decreasing cost of computation, which seems to indicate an increased use of airflow simulation in the coming years....

  4. Chebyshev-based technique for automated restoration of digital copies of faded photographic prints (United States)

    Uchaev, Dmitry V.; Uchaev, Denis V.; Malinnikov, Vasiliy A.


    We present a technique for the automated restoration of digital images obtained from faded photographic prints. The proposed defading technique uses our earlier proposed image contrast enhancement algorithm, which is based on a contrast measure of images in the Chebyshev moment transform domain. Experimental results demonstrate some advantages of the technique compared to other widely used image enhancement methods.

  5. Comparative analysis of photograph-based clinical goniometry to standard techniques. (United States)

    Crasto, Jared A; Sayari, Arash J; Gray, Robert R-L; Askari, Morad


    Assessment of joint range of motion (ROM) is an accepted evaluation of disability as well as an indicator of recovery from musculoskeletal injuries. Many goniometric techniques have been described to measure ROM, with variable validity due to inter-rater reliability. In this report, we assessed the validity and inter-rater reliability of photograph-based goniometry in the measurement of ROM and compared it to two other commonly used techniques. We examined three methods for measuring ROM in the upper extremity: manual goniometry (MG), visual estimation (VE), and photograph-based goniometry (PBG). Eight motions of the upper extremity were measured in 69 participants at an academic medical center. We found visual estimation and photograph-based goniometry to be clinically valid when tested against manual goniometry (r avg. 0.58, range 0.28 to 0.87). Photograph-based measurements afforded a satisfactory degree of inter-rater reliability (ICC avg. 0.77, range 0.28 to 0.96). Our study supports photograph-based goniometry as the new standard goniometric technique: it has been clinically validated and is performed with greater consistency and better inter-rater reliability than manual goniometry. It also allows better documentation of measurements and potential incorporation into medical records, in direct contrast to visual estimation.
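    The core computation behind photograph-based goniometry, recovering a joint angle from landmark coordinates marked on a photograph, can be sketched as follows. The landmark names are hypothetical and this is not the study's measurement protocol, only the underlying geometry.

```python
import math

def joint_angle(proximal, joint, distal):
    """Angle in degrees at `joint` between the rays toward `proximal` and
    `distal`, from 2-D landmark coordinates clicked on a photograph."""
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

    A flexed elbow photographed in the plane of motion would be measured by marking the shoulder, elbow, and wrist and passing those three pixel coordinates to `joint_angle`.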

  6. Rugoscopy: Human identification by computer-assisted photographic superimposition technique


    Mohammed, Rezwana Begum; Patil, Rajendra G.; Pammi, V. R.; Sandya, M. Pavana; Kalyan, Siva V.; Anitha, A.


    Background: Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, the palatal rugae may be considered as an alternative source of material. ...

  7. Microprocessor Simulation: A Training Technique. (United States)

    Oscarson, David J.


    Describes the design and application of a microprocessor simulation using BASIC for formal training of technicians and managers and as a management tool. Illustrates the utility of the modular approach for the instruction and practice of decision-making techniques. (SK)

  8. Multilevel techniques for Reservoir Simulation

    DEFF Research Database (Denmark)

    Christensen, Max la Cour

    The subject of this thesis is the development, application and study of novel multilevel methods for the acceleration and improvement of reservoir simulation techniques. The motivation for addressing this topic is a need for more accurate predictions of porous media flow and the ability to carry...... based on element-based Algebraic Multigrid (AMGe). In particular, an advanced AMGe technique with guaranteed approximation properties is used to construct a coarse multilevel hierarchy of Raviart-Thomas and L2 spaces for the Galerkin coarsening of a mixed formulation of the reservoir simulation...... equations. By experimentation it is found that the AMGe based upscaling technique provided very accurate results while reducing the computational time proportionally to the reduction in degrees of freedom. Furthermore, it is demonstrated that the AMGe coarse spaces (interpolation operators) can be used...

  9. Sources of uncertainty in individual monitoring for photographic,TL and OSL dosimetry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Max S.; Silva, Everton R.; Mauricio, Claudia L.P. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]


    The identification of uncertainty sources and their quantification is essential to the quality of any dosimetric result. Even when uncertainties are not stated for the dose measurements in the monthly dose report sent to the monitored radiation facilities, they need to be known. This study aims to analyze the influence of different sources of uncertainty associated with the photographic, TL and OSL dosimetric techniques, considering the evaluation of occupational doses from whole-body exposure to photons. To identify the sources of uncertainty, a bibliographic review was conducted of documents that address the operational aspects of each technique and the uncertainties associated with them. In addition, technical visits to individual monitoring services were conducted to assist in this identification. The sources of uncertainty were categorized and their contributions were expressed qualitatively. Calibration and traceability are the most important sources of uncertainty, regardless of the technique used. For photographic dosimetry, the remaining important uncertainty sources are energy and angular dependence, linearity of response, and variations in film processing. For TL and OSL, the key requirement for good performance is the reproducibility of the thermal and optical cycles, respectively. For all three techniques, every procedure in the measurement process must be standardized, controlled and reproducible. Further studies can be performed to quantify the contribution of each source of uncertainty. (author)
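    Once the individual uncertainty components are quantified, the standard way to combine them (the quadrature rule of the GUM) is straightforward; a minimal sketch, with invented component values in the example:

```python
import math

def combined_standard_uncertainty(components):
    """Combine independent standard-uncertainty components in quadrature,
    as prescribed by the GUM for uncorrelated inputs."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2.0):
    """Expanded uncertainty with coverage factor k (k = 2 gives roughly a
    95% coverage interval for a normal distribution)."""
    return k * combined_standard_uncertainty(components)
```

    For example, hypothetical components of 3% (calibration) and 4% (energy dependence) combine to a 5% standard uncertainty and a 10% expanded uncertainty at k = 2.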

  10. Simulation and analysis of natural rain in a wind tunnel via digital image processing techniques (United States)

    Aaron, K. M.; Hernan, M.; Parikh, P.; Sarohia, V.; Gharib, M.


    It is desired to simulate natural rain in a wind tunnel in order to investigate its influence on the aerodynamic characteristics of aircraft. Rain simulation nozzles have been developed and tested at JPL. Pulsed laser sheet illumination is used to photograph the droplets in the moving airstream. Digital image processing techniques are applied to these photographs for calculation of rain statistics to evaluate the performance of the nozzles. It is found that fixed hypodermic type nozzles inject too much water to simulate natural rain conditions. A modification uses two aerodynamic spinners to flex a tube in a pseudo-random fashion to distribute the water over a larger area.

  11. Techniques for Binary Black Hole Simulations (United States)

    Baker, John G.


    Recent advances in techniques for numerical simulation of black hole systems have enabled dramatic progress in astrophysical applications. Our approach to these simulations, which includes new gauge conditions for moving punctures, AMR, and specific tools for analyzing black hole simulations, has been applied to a variety of black hole configurations, typically resulting in simulations lasting several orbits. I will discuss these techniques, what we've learned in applications, and outline some areas for further development.

  12. Validation of image analysis techniques to measure skin aging features from facial photographs. (United States)

    Hamer, M A; Jacobs, L C; Lall, J S; Wollstein, A; Hollestein, L M; Rae, A R; Gossage, K W; Hofman, A; Liu, F; Kayser, M; Nijsten, T; Gunn, D A


    Accurate measurement of the extent to which skin has aged is crucial for skin aging research. Image analysis offers a quick and consistent approach for quantifying skin aging features from photographs, but is prone to technical bias and requires proper validation. Facial photographs of 75 male and 75 female North-European participants, randomly selected from the Rotterdam Study, were graded by two physicians using photonumeric scales for wrinkles (full face, forehead, crow's feet, nasolabial fold and upper lip), pigmented spots and telangiectasia. Image analysis measurements of the same features were optimized using the photonumeric grades from 50 participants, then compared to photonumeric grading in the 100 remaining participants, stratified by sex. The inter-rater reliability of the photonumeric grades was good to excellent (intraclass correlation coefficients 0.65-0.93). Correlations between the digital measures and the photonumeric grading were moderate to excellent for all the wrinkle comparisons (Spearman's ρ = 0.52-0.89) bar the upper lip wrinkles in the men (fair, ρ = 0.30). Correlations were moderate to good for pigmented spots and telangiectasia (ρ = 0.60-0.75). These comparisons demonstrate that all the image analysis measures, bar the upper lip measure in the men, are suitable for use in skin aging research and highlight areas of improvement for future refinements of the techniques. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons.
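    The Spearman correlations reported above compare image-analysis scores against photonumeric grades. A minimal rank-correlation implementation (Pearson correlation of average ranks, so ties are handled) looks like the following; it is a generic sketch, not the study's statistical pipeline:

```python
def _ranks(values):
    """Average 1-based ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

    Feeding in a digital wrinkle measure and the corresponding photonumeric grades per participant yields the kind of ρ values quoted in the abstract.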

  13. Using a Concept Mapping Tool with a Photograph Association Technique (CoMPAT) to Elicit Children's Ideas about Microbial Activity (United States)

    Byrne, Jenny; Grace, Marcus


    Concept mapping is a technique used to provide a visual representation of an individual's ideas about a concept or set of related concepts. This paper describes a concept mapping tool using a photograph association technique (CoMPAT) that is considered to be a novel way of eliciting children's ideas. What children at 11 years of age know about…

  14. Determine Conjugate Points of an Aerial Photograph Stereopair using Separate Channel Mean Value Technique

    Directory of Open Access Journals (Sweden)

    Andri Hernandi


    Full Text Available In the development of digital photogrammetric systems, the automatic image matching process plays an important role. Automatic image matching is used to find the conjugate points of an aerial photograph stereopair automatically. This matching technique makes a significant contribution, especially to the development of 3D photogrammetry, in obtaining exact and precise topographic information during stereo restitution. Two image matching methods have been developed so far: the area-based approach for the gray-level environment and the feature-based approach for the natural-feature environment. This research implements area-based matching with the normalized cross-correlation technique to obtain the correlation coefficient between the spectral values of the left image and its pair on the right. Based on previous research, the use of color images can increase the quality of matching. One color image matching technique is known as Separate Channel Mean Value. To assess the performance of the technique, a number of sample areas with different characteristics were chosen: heterogeneous, homogeneous, textured, shadowed, and contrasting. The results show that the highest similarity measure is obtained in the heterogeneous sample area at reference and search image sizes of (11 pixels x 11 pixels) and (23 pixels x 23 pixels). In these areas the correlation coefficient reached more than 0.7 and the highest percentage of similarity was obtained. However, the average total similarity measure of conjugate images over all sample areas reached only about 41.43% success. The technique therefore has weaknesses, and further treatment is still needed to overcome these problems.
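    The area-based matching at the core of this record, normalized cross-correlation between a reference patch from the left image and sliding windows of the right image, can be sketched as follows. A single-channel version is shown for brevity; the Separate Channel Mean Value variant would, as its name suggests, presumably combine per-channel coefficients, which is not reproduced here.

```python
def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches
    (nested lists of pixel values); 1.0 means a perfect match."""
    n = len(a) * len(a[0])
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    ma, mb = sum(fa) / n, sum(fb) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    da = sum((x - ma) ** 2 for x in fa) ** 0.5
    db = sum((y - mb) ** 2 for y in fb) ** 0.5
    return num / (da * db) if da and db else 0.0

def best_match(left_patch, right_image):
    """Slide the left-image reference patch over the right image and return
    (row, col, coefficient) of the highest-correlation window."""
    ph, pw = len(left_patch), len(left_patch[0])
    ih, iw = len(right_image), len(right_image[0])
    best = (0, 0, -2.0)
    for r in range(ih - ph + 1):
        for c in range(iw - pw + 1):
            window = [row[c:c + pw] for row in right_image[r:r + ph]]
            score = ncc(left_patch, window)
            if score > best[2]:
                best = (r, c, score)
    return best
```

    A conjugate point is accepted when the best coefficient exceeds a threshold such as the 0.7 mentioned in the abstract.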

  15. Retinal Image Simulation of Subjective Refraction Techniques. (United States)

    Perches, Sara; Collados, M Victoria; Ares, Jorge


    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., refraction guided by the patient's responses) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements various subjective-refraction techniques in a virtual manner--including the Jackson Cross-Cylinder test (JCC)--all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed on multifocal-contact-lens wearers. The results reveal the software's usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain deeper insight into, and to improve, existing refraction techniques, and it can be used for simulated training.

  16. New techniques to measure cliff change from historical oblique aerial photographs and structure-from-motion photogrammetry (United States)

    Warrick, Jonathan; Ritchie, Andy; Adelman, Gabrielle; Adelman, Ken; Limber, Patrick W.


    Oblique aerial photograph surveys are commonly used to document coastal landscapes. Here it is shown that adequate overlap may exist in these photographic records to develop topographic models with Structure-from-Motion (SfM) photogrammetric techniques. Photographs of Fort Funston, California, from the California Coastal Records Project were combined with ground control points in a four-dimensional analysis that produced topographic point clouds of the study area’s cliffs for 5 years spanning 2002 to 2010. Uncertainty was assessed by comparing the point clouds with airborne LIDAR data, and these uncertainties were related to the number and spatial distribution of ground control points used in the SfM analyses. With six or more ground control points, the root mean squared errors between the SfM and LIDAR data were less than 0.30 m (minimum = 0.18 m), and the mean systematic error was less than 0.10 m. The SfM results had several benefits over traditional airborne LIDAR in that they included point coverage on vertical-to-overhanging sections of the cliff and resulted in 10–100 times greater point densities. Time series of the SfM results revealed topographic changes, including landslides, rock falls, and the erosion of landslide talus along the Fort Funston beach. Thus, it was concluded that SfM photogrammetric techniques with historical oblique photographs allow the extraction of useful quantitative information for mapping coastal topography and measuring coastal change. The new techniques presented here are likely applicable to many photograph collections and problems in the earth sciences.
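    The two error figures quoted in this record, root mean squared error and mean systematic error (bias) of SfM elevations against LIDAR, follow from standard definitions. A sketch with made-up co-located elevation samples:

```python
import math

def error_stats(sfm_z, lidar_z):
    """RMSE and mean systematic error (bias) between co-located elevation
    samples from two surveys, e.g. SfM vs. airborne LIDAR."""
    diffs = [s - l for s, l in zip(sfm_z, lidar_z)]
    n = len(diffs)
    bias = sum(diffs) / n          # mean signed difference
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return rmse, bias
```

    A low bias with a larger RMSE, as reported above, indicates random scatter between the surveys rather than a systematic vertical offset.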

  17. The Impact of Simulated Nature on Patient Outcomes: A Study of Photographic Sky Compositions. (United States)

    Pati, Debajyoti; Freier, Patricia; O'Boyle, Michael; Amor, Cherif; Valipoor, Shabboo


    To examine whether incorporation of simulated nature, in the form of ceiling-mounted photographic sky compositions, influences patient outcomes. Previous studies have shown that most forms of nature exposure have a positive influence on patients. However, earlier studies have mostly focused on wall-hung nature representations. The emergence of simulated nature products has raised the question of the new products' effects on patient outcomes. A between-subjects experimental design was adopted, in which outcomes from five inpatient rooms with a sky-composition ceiling fixture were compared to corresponding outcomes in five identical rooms without the intervention. Data were collected from a total of 181 subjects on 11 outcomes. Independent-sample tests were performed to identify differences in mean outcomes. Significant positive outcomes were observed in environmental satisfaction and diastolic blood pressure (BP). Environmental satisfaction in the experimental group was 12.4% higher than in the control group. The directions of association for diastolic BP, nausea/indigestion medication, acute stress, anxiety, pain, and environmental satisfaction were consistent with the a priori hypothesis. A post hoc exploratory assessment involving patients who did not self-request additional pain and sleep medication demonstrated confirmatory directions for all outcomes except systolic BP, and statistically significant outcomes for acute stress and anxiety: the acute stress and anxiety levels of the experimental group were 53.4% and 34.79% lower, respectively, than those of the control group. The salutogenic benefits of photographic sky compositions render them better than traditional ceiling tiles and offer an alternative to other nature interventions. © The Author(s) 2015.

  18. A Methodological Intercomparison of Topographic and Aerial Photographic Habitat Survey Techniques (United States)

    Bangen, S. G.; Wheaton, J. M.; Bouwes, N.


    A severe decline in Columbia River salmonid populations and the subsequent Federal listing of subpopulations have mandated both the monitoring of populations and the evaluation of the status of available habitat. Numerous field and analytical methods exist to assist in quantifying the abundance and quality of in-stream habitat for salmonids. These methods range from field 'stick and tape' surveys to spatially explicit topographic and aerial photographic surveys from a mix of ground-based and remotely sensed airborne platforms. Although several previous studies have assessed the quality of specific individual survey methods, an intercomparison of competing techniques across a diverse range of habitat conditions (wadeable headwater channels to non-wadeable mainstem channels) has not yet been undertaken. In this study, we seek to evaluate the relative quality (i.e. accuracy, precision, extent) of habitat metrics and inventories derived from an array of ground-based and remotely sensed surveys of varying degrees of sophistication, and to quantify the effort and cost of conducting the surveys. Over the summer of 2010, seven sample reaches of varying habitat complexity were surveyed in the Lemhi River Basin, Idaho, USA. Complete topographic surveys were attempted at each site using rtkGPS, total station, ground-based LiDAR and traditional airborne LiDAR. Separate high-spatial-resolution aerial imagery surveys were acquired using a tethered blimp, a UAV, and a traditional fixed-wing aircraft. Here we also developed a relatively simple methodology for deriving bathymetry from aerial imagery that could be readily employed by in-stream habitat monitoring programs. The quality of bathymetric maps derived from aerial imagery was compared with rtkGPS topographic data. 
The results are helpful for understanding the strengths and weaknesses of different approaches in specific conditions, and how a hybrid of data acquisition methods can be used to build a more complete

  19. Enhanced sampling techniques in biomolecular simulations. (United States)

    Spiwok, Vojtech; Sucur, Zoran; Hosek, Petr


    Biomolecular simulations are routinely used in biochemistry and molecular biology research; however, they often fail to match expectations of their impact on the pharmaceutical and biotech industries. This is caused by the fact that a vast amount of computer time is required to simulate even short episodes from the life of biomolecules. Several approaches have been developed to overcome this obstacle, including the application of massively parallel and special-purpose computers or non-conventional hardware. Methodological approaches are represented by coarse-grained models and enhanced sampling techniques. These techniques can show how the studied system behaves over long time scales on the basis of relatively short simulations. This review presents an overview of new simulation approaches, the theory behind enhanced sampling methods, and success stories of their application with a direct impact on biotechnology or drug design. Copyright © 2014 Elsevier Inc. All rights reserved.
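    One enhanced sampling technique, metadynamics, can be illustrated with a toy 1-D example: a Langevin walker on the double-well potential V(x) = (x² − 1)² periodically deposits repulsive Gaussians at visited positions, so previously sampled states are discouraged and barrier crossings occur sooner. All parameters below are arbitrary; this is a sketch of the idea, not a production scheme (no well-tempering, no reweighting).

```python
import math
import random

def metadynamics_1d(steps=6000, dt=0.01, temp=1.0,
                    hill_height=0.5, hill_width=0.3, stride=50, seed=1):
    """Toy metadynamics on V(x) = (x^2 - 1)^2, starting in the left well.
    Returns whether the walker reached the right well, and the hill count."""
    rng = random.Random(seed)
    hills = []  # centers of deposited repulsive Gaussians

    def bias_force(x):
        # -dV_bias/dx for a sum of Gaussians: pushes away from old centers
        return sum(hill_height * (x - c) / hill_width ** 2
                   * math.exp(-(x - c) ** 2 / (2 * hill_width ** 2))
                   for c in hills)

    x = -1.0                       # left-well minimum
    reached_right_well = False
    for step in range(steps):
        force = -4 * x * (x * x - 1) + bias_force(x)   # -dV/dx + bias
        noise = math.sqrt(2 * temp * dt) * rng.gauss(0, 1)
        x += force * dt + noise    # overdamped Langevin step
        if step % stride == 0:
            hills.append(x)        # deposit a hill at the current position
        if x > 0.8:
            reached_right_well = True
    return reached_right_well, len(hills)
```

    The accumulated bias is what lets a short simulation visit states a plain trajectory would reach only after much longer runs.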

  20. Pedestrian flow simulation validation and verification techniques


    Dridi, Mohamed H.


    For the verification and validation of microscopic simulation models of pedestrian flow, we have performed experiments for different kinds of facilities and sites where most conflicts and congestion happen, e.g. corridors, narrow passages, and crosswalks. To establish the validity of the model, the experimental conditions and simulation results should be compared with video recordings carried out under the same conditions as in real life, e.g. pedestrian flux and density distributions. The strategy in this techniqu...

  1. Comparison of anthropometry with photogrammetry based on a standardized clinical photographic technique using a cephalostat and chair. (United States)

    Han, Kihwan; Kwon, Hyuk Joon; Choi, Tae Hyun; Kim, Jun Hyung; Son, Daegu


    The aim of this study was to standardize clinical photogrammetric techniques and to compare anthropometry with photogrammetry. To standardize clinical photography, we have developed a photographic cephalostat and chair. We investigated the repeatability of the standardized clinical photogrammetric technique. Then, with 40 landmarks, a total of 96 anthropometric measurement items was obtained from 100 Koreans. Ninety-six photogrammetric measurements from the same subjects were also obtained from standardized clinical photographs using Adobe Photoshop version 7.0 (Adobe Systems Corporation, San Jose, CA, USA). The photogrammetric and anthropometric measurement data (mm, degrees) were then compared. A coefficient was obtained by dividing the anthropometric measurements by the photogrammetric measurements. The repeatability of the standardized photography was statistically significantly high (p=0.463). Among the 96 measurement items, 44 items were reliable; for these items the photogrammetric measurements were not different from the anthropometric measurements. The remaining 52 items must be classified as unreliable. By developing a photographic cephalostat and chair, we have standardized clinical photogrammetric techniques. The reliable measurement items can be used as anthropometric measurements. For unreliable measurement items, applying a suitable coefficient to the photogrammetric measurement allows the anthropometric measurement to be obtained indirectly.

  2. Semi-Automated Classification of Gray Scale Aerial Photographs using Geographic Object Based Image Analysis (GEOBIA) Technique (United States)

    Harb Rabia, Ahmed; Terribile, Fabio


    Aerial photography is an important source of high-resolution remotely sensed data. Before 1970, aerial photographs were the only remote sensing data source for land use and land cover classification. Using these old aerial photographs improves the final output of land use and land cover change detection. However, classic techniques of aerial photograph classification, like manual interpretation or on-screen digitization, require great experience, long processing times and vast effort. A new technique needs to be developed in order to reduce processing time and effort and to give better results. Geographic object based image analysis (GEOBIA) is a newly developed area of Geographic Information Science and remote sensing in which images are automatically segmented into objects of similar spectral, temporal and spatial characteristics. Unlike pixel-based techniques, GEOBIA deals with object properties such as texture, square fit, roundness and many others that can improve classification results. The GEOBIA technique can be divided into two main steps: segmentation and classification. Segmentation groups adjacent pixels into objects of similar spectral and spatial characteristics; classification assigns classes to the generated objects based on the characteristics of each individual object. This study aimed to use the GEOBIA technique to develop a novel approach for land use and land cover classification of aerial photographs that saves time and effort and gives improved results. Aerial photographs from 1954 of Valle Telesina in Italy were used in this study. Images were rectified and georeferenced in ArcMap using topographic maps, and then processed in eCognition software to generate a land use and land cover map of 1954. A decision tree rule set was developed in eCognition to classify the images, and finally nine general land use and land cover classes were recognized in the study area (forest, trees stripes, agricultural
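    The segmentation step of GEOBIA, grouping adjacent pixels of similar value into objects, can be illustrated with a simple seeded region-growing sketch. This is not eCognition's multiresolution segmentation; the 4-connectivity and tolerance are arbitrary choices for the illustration.

```python
from collections import deque

def segment(image, tol=10):
    """Label 4-connected pixels whose gray values stay within `tol` of the
    region's seed value; returns (label map, number of regions)."""
    h, w = len(image), len(image[0])
    labels = [[-1] * w for _ in range(h)]
    n = 0
    for sr in range(h):
        for sc in range(w):
            if labels[sr][sc] != -1:
                continue                     # already assigned to an object
            seed = image[sr][sc]
            queue = deque([(sr, sc)])
            labels[sr][sc] = n
            while queue:                     # breadth-first region growing
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < h and 0 <= nc < w
                            and labels[nr][nc] == -1
                            and abs(image[nr][nc] - seed) <= tol):
                        labels[nr][nc] = n
                        queue.append((nr, nc))
            n += 1
    return labels, n
```

    In the classification step, each labeled object would then be assigned a class from its aggregate properties (mean tone, texture, shape), rather than classifying pixels individually.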

  3. Pedestrian Flow Simulation Validation and Verification Techniques

    CERN Document Server

    Dridi, Mohamed H


    For the verification and validation of microscopic simulation models of pedestrian flow, we have performed experiments for different kinds of facilities and sites where most conflicts and congestion happen, e.g. corridors, narrow passages, and crosswalks. To establish the validity of the model, the experimental conditions and simulation results should be compared with video recordings carried out under the same conditions as in real life, e.g. pedestrian flux and density distributions. The strategy in this technique is to achieve a certain amount of accuracy required in the simulation model. This method is good at detecting the critical points in pedestrian walking areas. For the calibration of suitable models we use the results obtained from analyzing video recordings of the Hajj in 2009, and these results can be used to check the design of sections of pedestrian facilities and exits. As practical examples, we present the simulation of pilgrim streams on the Jamarat Bridge. The objectives of this study are twofold: first, to show th...

  4. Cochlear implant simulator for surgical technique analysis (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.


    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  5. Photographic simulation of off-axis blurring due to chromatic aberration in spectacle lenses. (United States)

    Doroslovački, Pavle; Guyton, David L


    Spectacle lens materials of high refractive index (nd) tend to have high chromatic dispersion (low Abbé number [V]), which may contribute to visual blurring with oblique viewing. A patient who noted off-axis blurring with new high-refractive-index spectacle lenses prompted us to do a photographic simulation of the off-axis aberrations in 3 readily available spectacle lens materials, CR-39 (nd = 1.50), polyurethane (nd = 1.60), and polycarbonate (nd = 1.59). Both chromatic and monochromatic aberrations were found to cause off-axis image degradation. Chromatic aberration was more prominent in the higher-index materials (especially polycarbonate), whereas the lower-index CR-39 had more astigmatism of oblique incidence. It is important to consider off-axis aberrations when a patient complains of otherwise unexplained blurred vision with a new pair of spectacle lenses, especially given the increasing promotion of high-refractive-index materials with high chromatic dispersion. Copyright © 2015 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.
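    The off-axis chromatic blur discussed in this record is commonly estimated as the Prentice's-rule prism divided by the material's Abbe number. A sketch follows, assuming a hypothetical -5.00 D lens viewed 1 cm off-axis and approximate Abbe values; the photographic simulation in the paper is more elaborate.

```python
def transverse_chromatic_aberration(power_d, decentration_cm, abbe_v):
    """Off-axis chromatic blur in prism diopters: the Prentice's-rule prism
    (decentration in cm times lens power in diopters) divided by the Abbe
    number V of the lens material. Lower V means more chromatic dispersion."""
    return decentration_cm * power_d / abbe_v

# Hypothetical -5.00 D lens viewed 1 cm off-axis, approximate Abbe numbers:
cr39 = transverse_chromatic_aberration(-5.0, 1.0, 58)   # CR-39, V ~ 58
poly = transverse_chromatic_aberration(-5.0, 1.0, 30)   # polycarbonate, V ~ 30
```

    The lower Abbe number of polycarbonate roughly doubles the chromatic blur relative to CR-39 at the same power and gaze angle, consistent with the record's observation that chromatic aberration was more prominent in the higher-index materials.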

  6. Fast simulation techniques for switching converters (United States)

    King, Roger J.


    Techniques for simulating a switching converter are examined. The state equations for the equivalent circuits, which represent the switching converter, are presented and explained. The uses of the Newton-Raphson iteration, low ripple approximation, half-cycle symmetry, and discrete time equations to compute the interval durations are described. An example is presented in which these methods are illustrated by applying them to a parallel-loaded resonant inverter with three equivalent circuits for its continuous mode of operation.
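The interval-duration calculation described above reduces to root-finding on a switching condition. As a hedged illustration (not the paper's code; the damped sinusoid, parameter values, and tolerance are our own assumptions), a Newton-Raphson step in Python might look like:

```python
import math

def newton_raphson(f, df, t0, tol=1e-12, max_iter=50):
    """Generic Newton-Raphson root finder for a scalar equation f(t) = 0."""
    t = t0
    for _ in range(max_iter):
        step = f(t) / df(t)
        t -= step
        if abs(step) < tol:
            return t
    raise RuntimeError("Newton-Raphson did not converge")

# Hypothetical damped resonant current i(t) = exp(-a t) sin(w t + phi);
# the interval duration is the time of its first zero crossing.
a, w, phi = 100.0, 2 * math.pi * 50e3, 0.3
i = lambda t: math.exp(-a * t) * math.sin(w * t + phi)
di = lambda t: math.exp(-a * t) * (w * math.cos(w * t + phi) - a * math.sin(w * t + phi))

t_switch = newton_raphson(i, di, t0=8e-6)  # close to (pi - phi) / w
```

Quadratic convergence makes this attractive when each converter half-cycle requires solving such a transcendental equation many times.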

  7. Significance of stellar magnetic field measurements obtained with the photographic technique: the spurious magnetic field of the supergiant Canopus

    Energy Technology Data Exchange (ETDEWEB)

    Stift, M.J.


    A number of straightforward statistical tests are proposed, aimed at establishing the significance of stellar magnetic field measurements obtained with the photographic technique of a previous author. The power of these methods is illustrated with three Ap stars for which the presence of a magnetic field can be established at an exceedingly high significance level. Application to the supergiant Canopus on the contrary gives no support to the claims for the detection of a kilogauss surface field. Further investigations involving classical scaling and an error analysis of the MSHIFT correlation method favour the view that instrumental instabilities are at the origin of the apparent magnetic variations in Canopus.

  8. A technique for developing and photographing ridge impressions on decomposed water-soaked fingers. (United States)

    Keating, D M; Miller, J J


    One of the most challenging tasks confronting a crime laboratory technician is the fingerprinting and subsequent identification of an unknown homicide or drowning victim whose fingers have been subjected to a long period of exposure to water and the effects of decomposition. If the fingers have not been exposed to the erosive effects of water and decomposition for a long period of time, they may be allowed to dry, and suitable impressions are often obtainable. In other cases the fingers may have to be removed, with the permission of the Medical Examiner's Office, and processed by the crime laboratory in an attempt to develop ridge structure suitable for inked impressions or an exact photographic copy of the individual's fingers. In extreme cases the effects of water and decomposition make the fragile ridge structure appear nonexistent to the naked eye. The procedure used in this case report combines cyanoacrylate vapor, commonly called "super glue fuming," with the ninhydrin process to develop fragile ridge structure into discernible ridges that can easily be seen and photographed for the purpose of identifying the individual.

  9. High-resolution imaging of hypervelocity metal jets using advanced high-speed photographic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, L.L.; Muelder, S.A.


    It is now possible to obtain high resolution sequential photographs of the initial formation and evolution of hypervelocity metal jets formed by shaped charge devices fired in air. Researchers have been frustrated by the high velocity of the jet material and the luminous sheath of hot gases cloaking the jet, which made detailed observation of the jet body extremely difficult. The camera system that provides the photographs is a large format multi-frame electro-optic camera, referred to as an IC camera (IC stands for image converter), that utilizes electro-optic shuttering, monochromatic pulsed laser illumination and bandpass filtering to provide sequential pictures (in 3D if desired) with minimal degradation due to luminous air shocks or motion blur. The large format (75 mm image plane), short exposure (15 ns minimum), ruby laser illumination and bandpass filtering (monochromatic illumination while excluding extraneous light) produce clear, sharp images of the detailed surface structure of a metal shaped charge jet during early jet formation, elongation of the jet body, jet tip evolution and subsequent particulation (breakup) of the jet body. By utilizing the new camera system in conjunction with the more traditional rotating mirror high speed cameras, pulsed radiography, and electrical sensors, a maximum amount of often unique data can be extracted from a single experiment. This paper was intended primarily as an oral presentation. For continuity and simplicity in these proceedings, the authors have chosen to concentrate on the development of the IC camera system and its impact on the photography of high-speed shaped charge jets.

  10. A novel optic nerve photograph alignment and subtraction technique for the detection of structural progression in glaucoma. (United States)

    Marlow, Elizabeth D; McGlynn, Margaret M; Radcliffe, Nathan M


    To highlight changing features over time within a single static image through the auto-alignment and subtraction of serial optic nerve photographs. Subtraction maps were generated from auto-aligned (EyeIC, Narbeth, PA) baseline and follow-up images using Adobe Photoshop software. They demonstrated progressive retinal nerve fibre layer (RNFL) defects, optic disc haemorrhage (DH), neuroretinal rim loss (RL) and peripapillary atrophy (PPA). A masked glaucoma specialist identified features of progression on subtraction map first, then assessed feature strength by comparison with original images using alternation flicker. Control images with no progression and parallax-only images (as determined by flicker) were included. Eighty eyes of 67 patients were used to generate subtraction maps that detected glaucoma progression in 87% of DH (n = 28, sensitivity (Se) 82%, specificity (Sp) 98%) and 84% of PPA (n = 30, Se 80%, Sp 98%) cases. The lowest rate of detection was seen with RL at 67% (n = 31, Se 65%, Sp 100%). The subtraction technique was most sensitive for detecting parallax (n = 39, Se 98%, Sp 94%). Features of glaucoma progression appeared equally strong in flicker and subtraction images, but parallax was often enhanced on subtraction maps. Among control images selected for absence of features of glaucomatous change (n = 9) in original flicker images, no features were detected on subtraction maps. Auto-alignment and subtraction of serial optic nerve photographs reliably detects features of glaucoma progression with a single static image. Parallax identification may also be facilitated. Auto-alignment and subtraction of serial optic nerve photographs may prove especially useful in education and printed publications when dynamic imaging is not feasible. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
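Once the auto-alignment step has been done, the core of a subtraction map is a thresholded pixel-wise difference. A minimal numpy sketch with an entirely synthetic image pair and an arbitrary threshold (the study itself used EyeIC alignment and Photoshop):

```python
import numpy as np

def subtraction_map(baseline, follow_up, threshold=30):
    """Pixel-wise difference of two pre-aligned grayscale photographs;
    pixels whose absolute change exceeds `threshold` are flagged, which is
    how a subtraction map highlights progressive defects in one image."""
    diff = follow_up.astype(np.int16) - baseline.astype(np.int16)
    return diff, np.abs(diff) > threshold

# Toy pair: a dark defect appears in the follow-up image.
rng = np.random.default_rng(0)
base = rng.integers(100, 120, size=(64, 64))
follow = base.copy()
follow[20:30, 20:30] -= 60  # simulated progressive RNFL defect
diff, changed = subtraction_map(base, follow)
```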

  11. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.


    This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.


  12. The photoload sampling technique: estimating surface fuel loadings from downward-looking photographs of synthetic fuelbeds (United States)

    Robert E. Keane; Laura J. Dickinson


    Fire managers need better estimates of fuel loading so they can more accurately predict the potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common surface fuel components (1 hr, 10 hr...

  13. Expansion techniques for collisionless stellar dynamical simulations

    Energy Technology Data Exchange (ETDEWEB)

    Meiron, Yohai [Kavli Institute for Astronomy and Astrophysics at Peking University, Beijing 100871 (China); Li, Baile; Holley-Bockelmann, Kelly [Department of Physics and Astronomy, Vanderbilt University, Nashville, TN 37235 (United States); Spurzem, Rainer, E-mail: [National Astronomical Observatories of China, Chinese Academy of Sciences, Beijing 100012 (China)


    We present graphics processing unit (GPU) implementations of two fast force calculation methods based on series expansions of the Poisson equation. One method is the self-consistent field (SCF) method, which is a Fourier-like expansion of the density field in some basis set; the other method is the multipole expansion (MEX) method, which is a Taylor-like expansion of the Green's function. MEX, which has been advocated in the past, has not gained as much popularity as SCF. Both are particle-field methods and optimized for collisionless galactic dynamics, but while SCF is a 'pure' expansion, MEX is an expansion in just the angular part; thus, MEX is capable of capturing radial structure easily, while SCF needs a large number of radial terms. We show that despite the expansion bias, these methods are more accurate than direct techniques for the same number of particles. The performance of our GPU code, which we call ETICS, is profiled and compared to a CPU implementation. On the tested GPU hardware, a full force calculation for one million particles took ∼0.1 s (depending on expansion cutoff), making simulations with as many as 10^8 particles fast for a comparatively small number of nodes.

  14. Modelling laser speckle photographs of decayed teeth by applying a digital image information technique (United States)

    Ansari, M. Z.; da Silva, L. C.; da Silva, J. V. P.; Deana, A. M.


    We report on the application of a digital image model to assess early carious lesions on teeth. Lesions in the early stages of decay were illuminated with a laser and laser speckle images were obtained. Due to the differences in optical properties between healthy and carious tissue, the two regions produce different scatter patterns. The digital image information technique allowed us to produce colour-coded 3D surface plots of the intensity information in the speckle images, where the height (on the z-axis) and the colour in the rendering correlate with the intensity of a pixel in the image. The quantitative changes in colour component density enhance the contrast between decayed and sound tissue, and visualization of the carious lesions becomes significantly more evident. The proposed technique may therefore be adopted in the early diagnosis of carious lesions.
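The intensity-as-height rendering described above can be sketched numerically. The following Python sketch builds the surface data behind such a plot and adds a windowed speckle-contrast map; the contrast statistic is our own illustrative addition, not taken from the paper:

```python
import numpy as np

def intensity_surface(img):
    """x, y grids and z = pixel intensity: the data behind a colour-coded
    3D surface plot in which height tracks speckle intensity."""
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    return x, y, img.astype(float)

def local_speckle_contrast(img, k=5):
    """Windowed speckle contrast C = sigma / mean, a common speckle
    statistic (our illustrative addition) that likewise separates
    strongly scattering regions from uniform ones."""
    img = img.astype(float)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            win = padded[r:r + k, c:c + k]
            out[r, c] = win.std() / (win.mean() + 1e-12)
    return out

# Toy speckle frame: uniform "sound" half, strongly varying "carious" half.
rng = np.random.default_rng(0)
frame = np.full((20, 20), 100.0)
frame[:, 10:] = rng.uniform(20.0, 180.0, size=(20, 10))
x, y, z = intensity_surface(frame)
contrast = local_speckle_contrast(frame)
```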

  15. Techniques and Simulation Models in Risk Management

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE


    In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then moves to practical reality, providing simulation models for a broad range of inherent risks specific to any organization and simulating those models using the informatics instrument @Risk (Palisade). The reason behind this research lies in the need for simulation models that will allow the person in charge of decision making in risk management to adopt new corporate strategies answering current needs. The results of the research are two simulation models specific to risk management. The first model simulates net profit together with the impact that could be generated by a series of inherent risk factors such as the loss of key colleagues, a drop in selling prices, a drop in sales volume, retrofitting, and so on. The second simulation model is associated with the IT field, analysing 10 informatics threats in order to evaluate the potential financial loss.
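A net-profit Monte Carlo of the kind described for the first model can be sketched in a few lines. All distributions, prices, and probabilities below are illustrative assumptions, not figures from the study (which used @Risk):

```python
import random

def simulate_net_profit(n_trials=10_000, seed=42):
    """Monte Carlo net-profit simulation: draw risk factors, compute
    profit per trial, and report the mean and a 5th-percentile figure."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        price = rng.triangular(8.0, 12.0, 10.0)         # selling-price risk
        volume = rng.gauss(10_000, 1_500)               # sales-volume risk
        unit_cost = rng.triangular(5.0, 7.5, 6.0)       # cost risk
        one_off = 50_000 if rng.random() < 0.05 else 0  # e.g. key staff loss
        profits.append((price - unit_cost) * volume - one_off)
    profits.sort()
    mean = sum(profits) / n_trials
    p5 = profits[int(0.05 * n_trials)]  # 5th-percentile "value at risk"
    return mean, p5

mean_profit, profit_p5 = simulate_net_profit()
```

The sorted trial outcomes also give the full profit distribution, which is what a decision maker would inspect in practice.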

  16. Techniques in micromagnetic simulation and analysis (United States)

    Kumar, D.; Adeyeye, A. O.


    Advances in nanofabrication now allow us to manipulate magnetic material at micro- and nanoscales. As the steps of design, modelling and simulation typically precede that of fabrication, these improvements have also granted a significant boost to the methods of micromagnetic simulations (MSs) and analyses. The increased availability of massive computational resources has been another major contributing factor. Magnetization dynamics at micro- and nanoscale is described by the Landau-Lifshitz-Gilbert (LLG) equation, which is an ordinary differential equation (ODE) in time. Several finite difference method (FDM) and finite element method (FEM) based LLG solvers are now widely used to solve different kinds of micromagnetic problems. In this review, we present a few patterns in the ways MSs are being used in the pursuit of new physics. An important objective of this review is to allow one to make a well-informed decision on the details of simulation and analysis procedures needed to accomplish a given task using computational micromagnetics. We also examine the effect of different simulation parameters to underscore and extend some best practices. Lastly, we examine different methods of micromagnetic analyses which are used to process simulation results in order to extract physically meaningful and valuable information.
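The LLG time stepping that underlies such solvers can be illustrated for a single macrospin. The following is a hedged sketch in dimensionless units (parameter values are arbitrary), not a substitute for an FDM/FEM micromagnetic code:

```python
import numpy as np

def llg_rhs(m, h, alpha=0.1, gamma=1.0):
    """Right-hand side of the Landau-Lifshitz-Gilbert ODE in
    dimensionless units (parameter values are illustrative)."""
    mxh = np.cross(m, h)
    return -gamma / (1 + alpha**2) * (mxh + alpha * np.cross(m, mxh))

def integrate_llg(m0, h, dt=0.01, steps=5000):
    """Heun (predictor-corrector) time stepping of a single macrospin,
    renormalizing |m| = 1 each step as micromagnetic solvers commonly do."""
    m = np.asarray(m0, float)
    m /= np.linalg.norm(m)
    for _ in range(steps):
        k1 = llg_rhs(m, h)
        k2 = llg_rhs(m + dt * k1, h)
        m = m + 0.5 * dt * (k1 + k2)
        m /= np.linalg.norm(m)
    return m

# A tilted moment precesses about the applied field and relaxes toward it.
m_final = integrate_llg([1.0, 0.0, 0.1], h=np.array([0.0, 0.0, 1.0]))
```

A full solver evaluates the same right-hand side per cell, with an effective field that adds exchange, anisotropy and demagnetizing contributions.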

  17. Simulation techniques in hyperthermia treatment planning

    NARCIS (Netherlands)

    M.M. Paulides (Maarten); J.C. Stauffer; E. Neufeld; P.F. MacCarini (Paolo); A. Kyriakou (Adamos); R.A.M. Canters (Richard); S. Diederich (Sven); J. Bakker (Jan); G.C. van Rhoon (Gerard)


    Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44 °C, significantly enhances radiotherapy and chemotherapy effectiveness [1]. Driven by the developments in computational techniques and computing power, personalised hyperthermia treatment...

  18. Development of facial aging simulation system combined with three-dimensional shape prediction from facial photographs (United States)

    Nagata, Takeshi; Matsuzaki, Kazutoshi; Taniguchi, Kei; Ogawa, Yoshinori; Imaizumi, Kazuhiko


    3D facial aging changes over periods of more than 10 years in the same individuals have been measured at the National Research Institute of Police Science. We performed machine learning using these measured data as training data and developed a system that converts an input 2D face image into a 3D face model and simulates aging. Here, we report on the processing and accuracy of our system.

  19. Traffic Simulations on Parallel Computers Using Domain Decomposition Techniques (United States)


    Large scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic...

  20. An analog simulation technique for distributed flow systems

    DEFF Research Database (Denmark)

    Jørgensen, Sten Bay; Kümmel, Mogens


    Simulation of distributed flow systems in chemical engineering has been applied more and more during the last decade as computer techniques have developed [1]. The applications have served the purpose of identification of process dynamics and parameter estimation as well as improving process... and process control design. Although the conventional analog computer has been expanded with hybrid techniques and digital simulation languages have appeared, none of these has demonstrated superiority in simulating distributed flow systems in general [1]. Conventional analog techniques are expensive..., especially when flow forcing and nonlinearities are simulated. Digital methods on the other hand are time consuming. The purpose of this application note is to describe the hardware for the analog principle proposed by [2, 3]. Using this hardware flow forcing is readily simulated, which was not feasible...

  1. Acceleration techniques for dependability simulation. M.S. Thesis (United States)

    Barnette, James David


    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
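The variate-generation and statistics-gathering machinery mentioned above can be illustrated with a toy single-server queue; the model and parameters below are our own illustrative assumptions, far simpler than the thesis's process-based dependability models:

```python
import math
import random

def exp_variate(rng, rate):
    """Exponential variate via the inverse-transform method, a classic
    random variate generation technique in discrete event simulation."""
    return -math.log(1.0 - rng.random()) / rate

def simulate_mm1(arrival_rate=0.8, service_rate=1.0, n_customers=20_000, seed=1):
    """Minimal single-server (M/M/1) queue, advanced event by event.
    Returns the mean time a customer spends in the system; theory
    predicts 1 / (service_rate - arrival_rate)."""
    rng = random.Random(seed)
    t = server_free_at = total_sojourn = 0.0
    for _ in range(n_customers):
        t += exp_variate(rng, arrival_rate)   # next arrival time
        start = max(t, server_free_at)        # queue if the server is busy
        server_free_at = start + exp_variate(rng, service_rate)
        total_sojourn += server_free_at - t
    return total_sojourn / n_customers

mean_sojourn = simulate_mm1()
```

For these parameters the theoretical mean sojourn time is 5.0, so the need for long runs to get statistically significant results is easy to demonstrate by varying `n_customers`.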

  2. Semi-Analytic Techniques for Fast MATLAB Simulations


    Borio, Daniele; Cano, Eduardo


    Semi-analytic techniques are a powerful tool for the analysis of complex systems. In the semi-analytic framework, the knowledge of the system under analysis is exploited to reduce the computational load and complexity that full Monte Carlo simulations would require. In this way, the strengths of both analytical and Monte Carlo methods are effectively combined. The main goal of this chapter is to provide a general overview of semi-analytic techniques for the simulation of communications sys...

  3. HADES, A Code for Simulating a Variety of Radiographic Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Aufderheide, M B; Henderson, G; von Wittenau, A; Slone, D M; Barty, A; Martz, Jr., H E


    It is often useful to simulate radiographic images in order to optimize imaging trade-offs and to test tomographic techniques. HADES is a code that simulates radiography using ray tracing techniques. Although originally developed to simulate X-Ray transmission radiography, HADES has grown to simulate neutron radiography over a wide range of energy, proton radiography in the 1 MeV to 100 GeV range, and recently phase contrast radiography using X-Rays in the keV energy range. HADES can simulate parallel-ray or cone-beam radiography through a variety of mesh types, as well as through collections of geometric objects. HADES was originally developed for nondestructive evaluation (NDE) applications, but could be a useful tool for simulation of portal imaging, proton therapy imaging, and synchrotron studies of tissue. In this paper we describe HADES' current capabilities and discuss plans for a major revision of the code.

  4. Simulation technique for hard-disk models in two dimensions

    DEFF Research Database (Denmark)

    Fraser, Diane P.; Zuckermann, Martin J.; Mouritsen, Ole G.


    A method is presented for studying hard-disk systems by Monte Carlo computer-simulation techniques within the NpT ensemble. The method is based on the Voronoi tessellation, which is dynamically maintained during the simulation. By an analysis of the Voronoi statistics, a quantity is identified that...
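A stripped-down version of hard-disk Monte Carlo, without the Voronoi bookkeeping and in the simpler NVT ensemble rather than NpT, can be sketched as follows (all parameters are illustrative):

```python
import random

def overlaps(pos, i, L, d=1.0):
    """True if disk i (diameter d) overlaps any other disk, using
    periodic boundaries via the minimum-image convention."""
    xi, yi = pos[i]
    for j, (xj, yj) in enumerate(pos):
        if j == i:
            continue
        dx = (xi - xj + L / 2) % L - L / 2
        dy = (yi - yj + L / 2) % L - L / 2
        if dx * dx + dy * dy < d * d:
            return True
    return False

def mc_sweeps(pos, L, sweeps=100, delta=0.2, seed=0):
    """Plain constant-NVT Metropolis moves for hard disks: a trial
    displacement is accepted iff it creates no overlap (for hard cores
    the Boltzmann factor is 0 or 1, so no energy evaluation is needed)."""
    rng = random.Random(seed)
    n = len(pos)
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        old = pos[i]
        pos[i] = ((old[0] + rng.uniform(-delta, delta)) % L,
                  (old[1] + rng.uniform(-delta, delta)) % L)
        if overlaps(pos, i, L):
            pos[i] = old  # reject the move
    return pos

# Start from a dilute, overlap-free square lattice of 36 disks.
L = 12.0
pos = [(1.0 + 2.0 * i, 1.0 + 2.0 * j) for i in range(6) for j in range(6)]
pos = mc_sweeps(pos, L)
```

An NpT version would add volume-change trial moves; the Voronoi tessellation in the paper serves to analyse the resulting configurations.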

  5. Simulation of wind turbine wakes using the actuator line technique

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær; Mikkelsen, Robert Flemming; Henningson, Dan S.


    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today largely used for studying basic features of wakes as well as for making performance...

  6. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit


    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...


    Directory of Open Access Journals (Sweden)

    Imam Santoso


    Extraction of silver(I) from black/white printing photographic waste has been studied by the emulsion liquid membrane technique. The emulsion composition at the membrane phase was kerosene as solvent, sorbitan monooleate (Span 80) as surfactant and dimethyldioctadecylammonium bromide as carrier, with HNO3 as the internal phase. The optimum conditions obtained were: ratio of internal phase volume to membrane phase volume 1:1; surfactant concentration 2% (v/v); emulsification time 20 seconds; emulsification stirring rate 1100 rpm; emulsion rest time 3 seconds; ratio of emulsion volume to external phase volume 1:5; emulsion contact rate 500 rpm; emulsion contact time 40 seconds; concentration of silver thiosulfate as external phase 100 ppm; pH of external phase 3; and pH of internal phase 1. These optimum conditions were applied to silver(I) extraction from black/white printing photographic waste, giving an average extraction of 77.33%, with an average of 56.06% of the silver(I) in the internal phase and 22.66% in the external phase. The effect of matrix ions decreased the silver(I) extraction percentage from an average of 96.37% to an average of 77.33%. Keywords: photographic waste, silver extraction

  8. IFSAR Simulation Using the Shooting and Bouncing Ray Technique (United States)

    Houshmand, Bijan; Bhalla, Rajan; Ling, Hao


    Interferometric Synthetic Aperture Radar (IFSAR) is a technique that allows an automated way to carry out terrain mapping. IFSAR is carried out by first generating a SAR image pair from two antennas that are spatially separated. The phase difference between the SAR image pair is proportional to the topography. After registering the SAR images, the difference in phase in each pixel is extracted to generate an interferogram. Since the phase can only be measured within 2π radians, phase unwrapping is carried out to extract the absolute phase for each pixel, which is proportional to the local height. While the IFSAR algorithm is typically applied to measurement data, it is useful to develop an IFSAR simulator to gain a better understanding of the IFSAR technique. The IFSAR simulator can be used in choosing system parameters, experimenting with processing procedures and mission planning. In this paper we present an IFSAR simulation methodology to simulate the interferogram based on the shooting and bouncing ray (SBR) technique. SBR is a standard ray-tracing technique used to simulate scattering from large, complex targets. SBR is carried out by shooting rays at the target or scene. At the exit point of each ray, a ray-tube integration is done to find its contribution to the total field. A fast algorithm has been developed for the SBR for simulating SAR images of complex targets. In the IFSAR simulation, we build upon the fast SAR simulation technique. Given the antenna pair configuration, radar system parameters and the geometrical description of the scene, we first simulate two SAR images, one from each antenna. After post-processing the two SAR images, we generate an interferogram. Phase unwrapping is then performed on the interferogram to arrive at the desired terrain map. We will present results from the SBR-based IFSAR simulator, including terrain map reconstruction of urban environments. The reconstruction will be compared to the ground truth to...
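The interferogram-then-unwrap pipeline described above can be sketched on a synthetic 1D terrain profile. The height-to-phase scale factor below is an arbitrary assumption, and numpy's simple 1D unwrapper stands in for the far harder 2D problem:

```python
import numpy as np

def interferogram(phase1, phase2):
    """Wrapped phase difference of a co-registered SAR image pair,
    confined to (-pi, pi] as in a measured interferogram."""
    return np.angle(np.exp(1j * (phase1 - phase2)))

# Synthetic example: a linear terrain ramp produces a phase difference
# proportional to height (the scale factor k is an illustrative assumption).
height = np.linspace(0.0, 30.0, 256)   # metres, 1D terrain profile
k = 0.8                                # radians per metre
wrapped = interferogram(k * height, np.zeros_like(height))

# Phase unwrapping recovers the absolute phase, hence the height.
unwrapped = np.unwrap(wrapped)
recovered_height = unwrapped / k
```

Real 2D unwrapping must handle noise and residues, which is why it is a processing stage worth experimenting with in a simulator.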

  9. Traffic simulations on parallel computers using domain decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Tentner, A.M.


    Large scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128 node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study is presented that utilizes a scalable test network consisting of square grids, which addresses the performance penalty introduced by the additional iteration loop.

  10. Simulation of OFDM technique for wireless communication systems (United States)

    Bloul, Albe; Mohseni, Saeed; Alhasson, Bader; Ayad, Mustafa; Matin, M. A.


    Orthogonal Frequency Division Multiplexing (OFDM) is a modulation technique for transmitting baseband radio signals over fiber (RoF). Combining the OFDM modulation technique and radio over fiber technology will improve future wireless communication. This technique can be implemented using a laser and photodetector as optical modulator and demodulator. OFDM uses multiple sub-carriers to transmit low data rate streams in parallel, using Quadrature Amplitude Modulation (QAM) or Phase Shift Keying (PSK). In this paper we compare the power spectrum and signal constellation of transmitted and received signals in RoF using Matlab and OptiSystem simulation software.
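The IFFT-based multiplexing described above can be sketched with numpy instead of Matlab/OptiSystem. The sub-carrier count, cyclic prefix length, and ideal (noiseless) channel below are illustrative assumptions:

```python
import numpy as np

def ofdm_modulate(symbols, n_sub=64, cp=16):
    """Map QAM/PSK symbols onto orthogonal sub-carriers with an IFFT and
    prepend a cyclic prefix (one OFDM symbol per row)."""
    blocks = symbols.reshape(-1, n_sub)
    time = np.fft.ifft(blocks, axis=1)
    return np.hstack([time[:, -cp:], time])   # cyclic prefix + payload

def ofdm_demodulate(rx, n_sub=64, cp=16):
    """Strip the cyclic prefix and FFT back to sub-carrier symbols."""
    payload = rx[:, cp:]
    return np.fft.fft(payload, axis=1)

# QPSK constellation round-trip over an ideal channel.
rng = np.random.default_rng(7)
bits = rng.integers(0, 2, size=(2, 128))
qpsk = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)
tx = ofdm_modulate(qpsk)
recovered = ofdm_demodulate(tx)
```

Adding channel noise and plotting `recovered` would reproduce the constellation comparison discussed in the paper.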

  11. A numerical technique to simulate display pixels based on electrowetting

    NARCIS (Netherlands)

    Roghair, I.; Musterd, M.; van den Ende, Henricus T.M.; Kleijn, C.; Kleijn, C.; Kreutzer, M.T.; Mugele, Friedrich Gunther


    We present a numerical simulation technique to calculate the deformation of interfaces between a conductive and non-conductive fluid as well as the motion of liquid–liquid–solid three-phase contact lines under the influence of externally applied electric fields in electrowetting configuration. The

  12. Use of Simulation Techniques in Determining the Fleet ...

    African Journals Online (AJOL)

    Goldfields Ltd., a gold mine in Ghana, which will enable the mine to meet its waste stripping and ore production targets. The use of simulation techniques as a tool in the modeling, formulation and testing of several models in the ore and waste mining operations of the mine are demonstrated. The results obtained from the ...

  13. Vertical Photographs (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes of photographs of marine mammals and sea turtles taken with high resolution cameras mounted in airplanes, unmanned platforms or the bow of...

  14. Oblique Photographs (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes photographs of marine mammals and sea turtles taken in the field. Most are lateral views of animals that are used to confirm species identity...

  15. Fusion of cone-beam CT and 3D photographic images for soft tissue simulation in maxillofacial surgery (United States)

    Chung, Soyoung; Kim, Joojin; Hong, Helen


    During maxillofacial surgery, prediction of the facial outcome after surgery is a main concern for both surgeons and patients. However, registration of facial CBCT images and 3D photographic images is difficult because regions around the eyes and mouth are affected by facial expressions, and registration is slow due to the dense clouds of points on the surfaces. Therefore, we propose a framework for the fusion of facial CBCT images and 3D photographs based on skin segmentation and two-stage surface registration. Our method is composed of three major steps. First, to obtain a CBCT skin surface for registration with the 3D photographic surface, the skin is automatically segmented from the CBCT images and the skin surface is generated by surface modeling. Second, to roughly align the scale and orientation of the CBCT skin surface and the 3D photographic surface, point-based registration is performed with four corresponding landmarks located around the mouth. Finally, to merge the two surfaces, Gaussian-weight-based surface registration is performed within a narrow band of the 3D photographic surface.
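The coarse, landmark-based alignment stage described above is essentially a least-squares similarity fit. A hedged sketch using Umeyama's classical method, with synthetic landmarks standing in for the four mouth landmarks (the paper does not specify its solver):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares scale + rotation + translation aligning `src`
    landmarks to `dst` (Umeyama's method); with four corresponding
    landmarks this gives a coarse rigid-plus-scale alignment."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / src_c.var(0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Verify on a synthetic 4-landmark set with a known transform.
rng = np.random.default_rng(3)
src = rng.normal(size=(4, 3))
true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
true_R *= np.sign(np.linalg.det(true_R))   # keep a proper rotation
dst = 1.7 * src @ true_R.T + np.array([2.0, -1.0, 0.5])
s, R, t = similarity_transform(src, dst)
```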

  16. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail:; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel


    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
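The deterministic core of such a simulator, a ray integral of the X-ray attenuation law, can be sketched for the simplest parallel-beam geometry (axis-aligned rays through a voxel grid; the paper's code handles general CAD geometries, spectra and detectors):

```python
import numpy as np

def radiograph(mu, voxel_size=0.1, i0=1.0):
    """Parallel-beam radiographic projection of a voxel volume of linear
    attenuation coefficients `mu` (1/cm), applying the attenuation law
    I = I0 * exp(-sum(mu * dl)) along rays cast down axis 0. Deterministic,
    so like the paper's code the image contains no photon noise."""
    path_integral = mu.sum(axis=0) * voxel_size
    return i0 * np.exp(-path_integral)

# Homogeneous block with a denser inclusion.
mu = np.full((50, 32, 32), 0.2)     # background attenuation, 1/cm
mu[10:40, 12:20, 12:20] = 1.0       # denser inclusion ("defect")
image = radiograph(mu)
```

Since the image is noise-free, a contrast-to-noise ratio estimate would come from adding a detector noise model to this ideal projection.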

  17. Study and simulation of OFDM multiplexing techniques for a ...

    African Journals Online (AJOL)

    Simulation of this model revealed that, for the same SNR of 20 dB, the ACO-OFDM technique (with a BER of 0.0083) is less sensitive to noise than the DCO-OFDM technique (whose BER is 0.3413). It was also observed that, for the same SNR, the implementation of the DFT for the generation of ...

  18. Dataflow Integration and Simulation Techniques for DSP System Design Tools (United States)


    ABSTRACT. Title of dissertation: DATAFLOW INTEGRATION AND SIMULATION TECHNIQUES FOR DSP SYSTEM DESIGN TOOLS. Chia-Jui Hsu, Doctor of Philosophy, 2007. Hardware and software synthesis using dataflow models of computation is widespread in electronic design automation (EDA) tools for digital signal processing (DSP) systems, with implementations targeting PDSPs or other types of embedded processors, or Verilog/VHDL implementations on FPGAs.

  19. The use of visual interactive simulation techniques for production scheduling

    Directory of Open Access Journals (Sweden)

    W.H. Swan


    During the last decade visual interactive simulation has become established as a useful new tool for solving real-life problems. It offers the Operational Research professional the opportunity to make a beneficial impact on important new decision-making areas of business and industry. As an example, this paper discusses its application to the scheduling of production on batch chemical plants, which to date has remained largely a manual activity. Two different approaches are introduced, and it is concluded that while discrete-event simulation is most useful as an aid to learning at a time of change, bar chart simulation is preferred for day-to-day scheduling. The technique has been implemented on a number of plants and has led to significant improvements in their performance. Some areas for further development are identified.

  20. Dispersion analysis techniques within the space vehicle dynamics simulation program (United States)

    Snow, L. S.; Kuhn, A. E.


    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post-processor was examined in detail, and simulation techniques relevant to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single-error-source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
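
    The RSS combination performed by the LEA processor can be illustrated with a small sketch (the deviation values are invented for illustration; the real processor operates on SVDS trajectory data):

```python
import numpy as np

# Sketch of an LEA-style root-sum-square (RSS) combination: each row of
# `deviations` is the trajectory deviation produced by simulating one 3-sigma
# error source in isolation. Illustrative numbers, not SVDS output.
deviations = np.array([
    [10.0, -2.0],   # error source 1: [position dev (m), velocity dev (m/s)]
    [-4.0,  1.0],   # error source 2
    [ 3.0,  0.5],   # error source 3
])

# RSS of each deviation component over all single-error-source cases
rss = np.sqrt((deviations**2).sum(axis=0))

# Covariance matrix of the combined deviations, assuming independent sources
cov = deviations.T @ deviations
```

    The diagonal of `cov` reproduces the squared RSS values, which is the hand-calculation check (A) described in the abstract.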

  1. Photographic filters (United States)

    Rodigues, Jose Eduardo; Santosdealmeida, Wagner


    Some of the main aspects of photographic filters are examined and presented as a reference for researchers and students of remote sensing. A wide range of information about filters, including their basic fundamentals, classification, and main types, is presented. The theme cannot be exhausted in this or any other single publication because of its great complexity, profound theoretical basis, and dynamic technological development. The subject does not deal only with filter applications in remote sensing: as much as possible, additional information about the use of these products in other professional areas, such as pictorial photography, photographic processing, and optical engineering, was included.

  2. Simulation of wind turbine wakes using the actuator line technique (United States)

    Sørensen, Jens N.; Mikkelsen, Robert F.; Henningson, Dan S.; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J.


    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today widely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison with experimental results of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. PMID:25583862

  3. Simulation and data reconstruction for NDT phased array techniques. (United States)

    Chatillon, S; de Roumilly, L; Porre, J; Poidevin, C; Calmon, P


    Phased array techniques are now widely employed for industrial NDT applications in various contexts. Phased arrays offer great adaptability to the inspection configuration, and the application of suitable delay laws makes it possible to optimize detection and characterization performance by taking into account the component geometry, the material characteristics, and the aim of the inspection. In addition, the amount of potential information obtained from the inspection is in general greatly enhanced. This is the case when the employed method involves sequences of shots (sectorial scanning, multiple-depth focusing, etc.) or when the signals received on the different channels are stored. Finally, electronic commutation makes higher acquisition rates possible. Given these advantages, it is clear that an optimal use of such techniques requires the application of simulation-based algorithms at the different stages of the inspection process: when designing the probe, by optimizing the number and characteristics of the elements; when conceiving the inspection method, by selecting suitable sequences of shots, computing optimized delay laws and evaluating the performance of the inspection in terms of zone coverage or flaw detection capabilities; and when analysing the results, by applying simulation-aided visualization and data reconstruction algorithms. For many years the CEA (French Atomic Energy Commission) has been heavily involved in the development of such simulation-based phased array tools. In this paper, we present recent advances in this activity and show different examples of application carried out in complex situations.
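
    As an illustration of the delay laws such simulation tools compute, the following sketch derives a focusing delay law for a simple linear array in a homogeneous medium (the geometry and wave speed are assumed for the example; real tools also handle interfaces and complex component geometry):

```python
import numpy as np

# Focusing delay law for a linear phased array: delays are chosen so that
# wavefronts from all elements arrive at the focal point in phase.
# Illustrative geometry and material, not a CEA/CIVA computation.
c = 5900.0                      # longitudinal wave speed in steel, m/s
pitch = 0.6e-3                  # element pitch, m
n_elem = 16
elem_x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch  # element centres, m
focus = np.array([0.0, 20e-3])  # focal point (x, depth), m

# Time of flight from each element to the focal point
tof = np.hypot(elem_x - focus[0], focus[1]) / c

# Fire the farthest elements first: delay = max(tof) - tof
delays = tof.max() - tof
```

    For an on-axis focus the law is symmetric, with the central elements fired last; a sectorial scan is obtained by recomputing `delays` for a sequence of focal points.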

  4. 3D Digital Simulation of Minnan Temple Architecture Caisson's Craft Techniques (United States)

    Lin, Y. C.; Wu, T. C.; Hsu, M. F.


    Caisson is one of the important representations of the Minnan (southern Fujian) temple architecture craft techniques and decorative aesthetics. The special component design and group building method present the architectural thinking and personal characteristics of the great carpenters of Minnan temple architecture. In the late Qing Dynasty, the appearance and style of the caissons of famous temples in Taiwan clearly presented the building techniques of the great carpenters. However, as the years went by, the caisson design and craft techniques were not fully inherited, which has been a great loss of cultural assets. Accordingly, taking the caisson of Fulong temple, a work by a well-known great carpenter in Tainan, as an example, this study obtained the thinking principles of the original design and the design method at the initial period of construction through interview records and by redrawing the "Tng-Ko" (a traditional design, stakeout and construction tool). We obtained the 3D point cloud model of the caisson of Fulong temple using 3D laser scanning technology, and established a 3D digital model of each component of the caisson. Based on the caisson assembly procedure obtained from the interview records, this study conducted a digital simulation of the caisson assembly to completely record and present the caisson design, construction and completion procedure. This model of preserving the craft techniques of the Minnan temple caisson using digital technology makes a specific contribution to the heritage of the craft techniques while providing an important reference for the digital preservation of human cultural assets.

  5. X-ray optics simulation using Gaussian superposition technique. (United States)

    Idir, Mourad; Cywiak, Moisés; Morales, Arquímedes; Modi, Mohammed H


    We present an efficient method to perform x-ray optics simulation with highly or partially coherent x-ray sources using a Gaussian superposition technique. In a previous paper, we demonstrated that full characterization of optical systems, diffractive and geometric, is possible by using the Fresnel Gaussian Shape Invariant (FGSI) previously reported in the literature. The complex amplitude distribution in the object plane is represented by a linear superposition of complex Gaussian wavelets and then propagated through the optical system by means of the referred Gaussian invariant. This allows ray tracing through the optical system and, at the same time, calculating with high precision the complex wave-amplitude distribution at any plane of observation. The technique can be applied in a wide spectral range where the Fresnel diffraction integral applies, including visible light, x-rays, acoustic waves, etc. We describe the technique and include some computer simulations as illustrative examples for x-ray optical components. We also show that this method can be used to study partial or total coherence illumination problems. © 2011 Optical Society of America
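
    The underlying idea, each Gaussian wavelet propagating analytically under Fresnel diffraction, can be sketched in 1D with the complex beam parameter (this is a simplified stand-in for the FGSI formulation, with assumed wavelength and waist):

```python
import numpy as np

# Sketch of the Gaussian-superposition idea: the field is written as a sum of
# Gaussian wavelets, each of which propagates analytically under Fresnel
# diffraction via the 1D complex beam parameter q(z) = q0 + z, q0 = -i*z_R.
# Simplified stand-in for the FGSI formulation; wavelength and waist assumed.
lam = 1e-10                  # wavelength, m (hard X-rays)
k = 2 * np.pi / lam
w0 = 5e-6                    # wavelet waist, m
z_R = np.pi * w0**2 / lam    # Rayleigh range of one wavelet

def wavelet(x, x0, z):
    """One Gaussian wavelet centred at x0, propagated a distance z."""
    q0 = -1j * z_R
    q = q0 + z
    # sqrt(q0/q) keeps the amplitude consistent with free-space spreading;
    # the quadratic phase encodes both envelope and wavefront curvature.
    return np.sqrt(q0 / q) * np.exp(1j * k * (x - x0)**2 / (2 * q))

x = np.linspace(-50e-6, 50e-6, 1001)
centres = np.linspace(-20e-6, 20e-6, 9)   # 9-wavelet superposition
field = sum(wavelet(x, x0, z=0.5) for x0 in centres)
intensity = np.abs(field)**2
```

    At z = 0 each wavelet reduces to exp(-(x-x0)²/w0²), and at any z the propagated field is obtained by summing the analytically propagated wavelets, with no numerical diffraction integral.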

  6. Validation techniques of agent based modelling for geospatial simulations (United States)

    Darvishi, M.; Ahmadi, G.


    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, such models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge in ABMS, however, is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the search for appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  7. Numerical simulation of electron beam welding and instrumental technique

    Energy Technology Data Exchange (ETDEWEB)

    Carin, M.; Rogeon, P.; Carron, D.; Le Masson, P.; Couedel, D. [Universite de Bretagne Sud, Centre de Recherche, Lab. d' Etudes Thermiques Energetique et Environnement, 56 - Lorient (France)


    In the present work, thermal cycles measured with thermocouples embedded in specimens are employed to validate a numerical thermo-metallurgical model of an electron beam welding process. The implemented instrumentation techniques aim to reduce the perturbations induced by the sensors in place. The numerical model is based on the definition of a heat source term linked to the keyhole geometry predicted by a pressure-balance model using the FEMLAB code. The heat source term is used by the thermo-metallurgical simulation carried out with the finite element code SYSWELD. Kinetic parameters are extracted from dilatometric experiments performed under welding austenitization conditions at constant cooling rates. (authors)


    Energy Technology Data Exchange (ETDEWEB)

    Louis J. Durlofsky; Khalid Aziz


    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow

  9. Simulated annealing technique to design minimum cost exchanger

    Directory of Open Access Journals (Sweden)

    Khalfe Nadeem M.


    Owing to the wide use of heat exchangers in industrial processes, their cost minimization is an important target for both designers and users. Traditional design approaches are based on iterative procedures which gradually change the design and geometric parameters to satisfy a given heat duty and constraints. Although well proven, this kind of approach is time consuming and may not lead to a cost-effective design, as no cost criteria are explicitly accounted for. The present study explores the use of a nontraditional optimization technique, simulated annealing (SA), for the design optimization of shell-and-tube heat exchangers from an economic point of view. The optimization procedure involves the selection of the major geometric parameters such as tube diameter, tube length, baffle spacing, number of tube passes, tube layout, type of head, baffle cut, etc., and the minimization of total annual cost is taken as the design target. The presented simulated annealing technique is simple in concept, has few parameters, and is easy to implement. Furthermore, the SA algorithm finds good-quality solutions quickly, giving the designer more degrees of freedom in the final choice than traditional methods. The methodology takes into account the geometric and operational constraints typically recommended by design codes. Three case studies are presented to demonstrate the effectiveness and accuracy of the proposed algorithm. The SA approach is able to reduce the total cost of the heat exchanger compared with the cost obtained by a previously reported genetic algorithm (GA) approach.
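
    A minimal SA sketch on a toy cost function illustrates the acceptance rule and cooling schedule (the toy function stands in for the exchanger's total annual cost; the real study optimizes discrete geometric parameters under design-code constraints):

```python
import math
import random

# Minimal simulated annealing sketch. The multimodal toy cost below is a
# stand-in for the exchanger's total annual cost over its design variables.
random.seed(1)

def cost(x):
    """Toy multimodal objective (always >= 0)."""
    return (x - 2.0)**2 + 3.0 * math.sin(5.0 * x) + 3.0

def anneal(x, T=10.0, cooling=0.95, steps_per_T=50, T_min=1e-3):
    best_x, best_c = x, cost(x)
    while T > T_min:
        for _ in range(steps_per_T):
            cand = x + random.uniform(-0.5, 0.5)
            delta = cost(cand) - cost(x)
            # Metropolis rule: accept improvements always, and worse moves
            # with probability exp(-delta/T), which lets SA escape local minima
            if delta < 0 or random.random() < math.exp(-delta / T):
                x = cand
            if cost(x) < best_c:
                best_x, best_c = x, cost(x)
        T *= cooling           # geometric cooling schedule
    return best_x, best_c

best_x, best_c = anneal(x=10.0)
```

    For the exchanger problem the candidate move would instead perturb one discrete design variable (e.g. pick a neighbouring standard tube diameter) and reject candidates that violate the design-code constraints.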


    Directory of Open Access Journals (Sweden)

    A.G.A. Rahman


    Suspension is part of the automotive system, providing both vehicle control and passenger comfort. The knuckle is an important part of the suspension system, constantly encountering cyclic loads that subject it to fatigue failure. This paper presents an evaluation of the fatigue characteristics of a knuckle using multibody simulation (MBS) techniques. The load time history extracted from the MBS is used for stress analysis, with an actual road profile of road bumps used as the MBS input, so the stress fluctuations for the fatigue simulations reflect the road profile. The strain-life method is utilized to assess the fatigue life. The instantaneous stress distributions and maximum principal stress are used for fatigue life predictions, and a mesh sensitivity analysis has been performed. The results show that the steering link in the knuckle is the region most susceptible to fatigue failure. The number of times the knuckle can manage a road bump at 40 km/h is determined to be approximately 371, with a 50% certainty of survival. The proposed method of using the loading time history extracted from MBS simulation for fatigue life estimation is found to be very promising for the accurate evaluation of the performance of suspension system components.


  12. Photography and photographic-photometry of the solar aureole (United States)

    Deepak, A.; Adams, R. R.


    This paper describes procedures for taking solar aureole photographs with conventional small format (35- or 70-mm) cameras and films and discusses photographic-photometry techniques for obtaining accurate solar aureole radiance measurements from these photographs, the measurements being used for retrieving atmospheric aerosol characteristics. The photographic data reduction techniques discussed here include sensitometry, densitometry, off-axis illumination distribution measurement, photogrammetry and photometry relations, etc. Comparison tests show that photographic measurements of the solar aureole radiance agree well with simultaneous photoelectric measurements.

  13. Evaluating drilling and suctioning technique in a mastoidectomy simulator. (United States)

    Sewell, Christopher; Morris, Dan; Blevins, Nikolas H; Barbagli, Federico; Salisbury, Kenneth


    This paper presents several new metrics related to bone removal and suctioning technique in the context of a mastoidectomy simulator. The expertise with which decisions are made as to which regions of bone to remove and which to leave intact is evaluated by building a Naïve Bayes classifier using training data from known experts and novices. Since the bone voxel mesh is very large, and many voxels are always either removed or not removed regardless of expertise, the mutual information was calculated for each voxel and only the most informative voxels were used for the classifier. Leave-one-out cross-validation showed a high correlation of the calculated expert probabilities with scores assigned by instructors. Additional metrics described in this paper include those for assessing smoothness of drill strokes, proper drill burr selection, sufficiency of suctioning, two-handed tool coordination, and application of appropriate force and velocity magnitudes as functions of distance from critical structures.
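
    The voxel-selection and classification pipeline can be sketched on synthetic data (the voxel counts, removal probabilities and class sizes below are invented; the simulator works on a much larger mesh):

```python
import numpy as np

# Sketch of the voxel-selection + Naive Bayes idea: each trainee is a binary
# vector over voxels (1 = removed). Voxels whose removal carries the most
# mutual information with the expert/novice label are kept, and a Bernoulli
# Naive Bayes classifier is trained on those. Synthetic data only.
rng = np.random.default_rng(0)
n_vox = 200

def trainee(is_expert):
    """Voxels 0-19 are informative; the rest are removed ~50/50 by everyone."""
    p = np.full(n_vox, 0.5)
    p[:20] = 0.9 if is_expert else 0.1
    return (rng.random(n_vox) < p).astype(int)

X = np.array([trainee(True) for _ in range(30)] + [trainee(False) for _ in range(30)])
y = np.array([1] * 30 + [0] * 30)

def mutual_information(x, y):
    """MI (nats) between two binary variables, from empirical counts."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            pab = np.mean((x == a) & (y == b))
            if pab > 0:
                mi += pab * np.log(pab / (np.mean(x == a) * np.mean(y == b)))
    return mi

mi = np.array([mutual_information(X[:, j], y) for j in range(n_vox)])
keep = np.argsort(mi)[-20:]              # most informative voxels only

# Bernoulli Naive Bayes with Laplace smoothing on the kept voxels
theta = {c: (X[y == c][:, keep].sum(0) + 1) / ((y == c).sum() + 2) for c in (0, 1)}

def p_expert(x):
    logp = {c: np.sum(x[keep] * np.log(theta[c]) + (1 - x[keep]) * np.log(1 - theta[c]))
            for c in (0, 1)}
    return 1.0 / (1.0 + np.exp(logp[0] - logp[1]))

prob = p_expert(trainee(True))
```

    The MI filter is what makes this tractable on a large voxel mesh: uninformative voxels never enter the classifier at all.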

  14. Analytical decoupling techniques for fully implicit reservoir simulation (United States)

    Qiao, Changhe; Wu, Shuhong; Xu, Jinchao; Zhang, Chen-Song


    This paper examines linear algebraic solvers for a given general purpose compositional simulator. In particular, the decoupling stage of the constraint pressure residual (CPR) preconditioner for linear systems arising from the fully implicit scheme is evaluated. An asymptotic analysis of the convergence behavior is given when Δt approaches zero. Based on this analysis, we propose an analytical decoupling technique, from which the pressure equation is directly related to an elliptic equation and can be solved efficiently. We show that this method ensures good convergence behavior of the algebraic solvers in a two-stage CPR-type preconditioner. We also propose a semi-analytical decoupling strategy that combines the analytical method and alternate block factorization method. Numerical experiments demonstrate the superior performance of the analytical and semi-analytical decoupling methods compared to existing methods.

  15. Limitations of 14 MeV neutron simulation techniques (United States)

    Kley, W.; Bishop, G. R.; Sinha, A.


    A D-T fusion cycle produces five times more neutrons per unit of energy released than a fission cycle, with about twice the damage energy and the capability to produce ten times more hydrogen, helium and transmutation products than fission neutrons. These effects determine, together with other parameters, the lifetime of the construction materials for low plasma-density fusion reactors (tokamak, tandem-mirror, etc.), which require a first wall. For the economic feasibility of fusion power reactors, the first wall and blanket materials must withstand a dose approaching 300 to 400 dpa. Arguments are presented that demonstrate that today's simulation techniques using existing fission reactors and charged-particle beams are excellent tools for studying the underlying basic physical phenomena of the evolving damage structures, but are not sufficient to provide a valid technological database for the design of economic fusion power reactors. It is shown that an optimized spallation neutron source based on a continuous beam of 600 MeV, 6 mA protons is suitable for simulating first-wall conditions. Comparing it with FMIT, the 35 MeV, 100 mA D⁺-Li neutron source, we arrive at the following figure of merit: FM = (dpa·volume)_EURAC / (dpa·volume)_FMIT = 111, reflecting the fact that the proton beam generates about 100 times more neutrons than the deuteron beam in FMIT for the same beam power.

  16. A Monte Carlo simulation technique to determine the optimal portfolio

    Directory of Open Access Journals (Sweden)

    Hassan Ghodrati


    During the past few years, there have been several studies of portfolio management. One of the primary concerns on any stock market is to assess the risk associated with various assets. One recognized method for measuring, forecasting, and managing risk is Value at Risk (VaR), which has drawn much attention from financial institutions in recent years. VaR is a method for recognizing and evaluating risk that uses standard statistical techniques, and it has increasingly been applied in other fields as well. The present study measured the value at risk of 26 companies from the chemical industry in the Tehran Stock Exchange over the period 2009-2011 using the Monte Carlo simulation technique at a 95% confidence level. The variable used in the present study was the daily return resulting from daily stock price changes. Moreover, the optimal investment weight for each selected stock was determined using a hybrid model combining the Markowitz and Winker models. The results showed that, at the 95% confidence level, the maximum one-day loss would not exceed 1,259,432 Rials.
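
    A minimal Monte Carlo VaR computation for a single position can be sketched as follows (the return distribution and portfolio value are assumed for illustration, not taken from the study's data):

```python
import numpy as np

# Monte Carlo VaR sketch: fit a normal to historical daily returns, simulate
# one-day-ahead scenarios, and read the 95% VaR off the simulated loss
# distribution. All figures are illustrative stand-ins.
rng = np.random.default_rng(42)

hist_returns = rng.normal(0.0005, 0.02, 750)   # stand-in for 2009-2011 daily returns
mu, sigma = hist_returns.mean(), hist_returns.std()

portfolio_value = 1_000_000
n_sims = 100_000
sim_returns = rng.normal(mu, sigma, n_sims)    # one-day-ahead scenarios
losses = -portfolio_value * sim_returns

# 95% VaR: the loss not exceeded in 95% of scenarios
var_95 = np.quantile(losses, 0.95)
```

    For a multi-asset portfolio, the same scheme draws correlated return vectors and applies the portfolio weights (e.g. Markowitz-style weights) before taking the loss quantile.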

  17. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.


    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management; it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which

  18. Parallel pic plasma simulation through particle decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Briguglio, S.; Vlad, G. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Energia; Di Martino, B. [Wien Univ. (Austria). Inst. for Software Tecnology and Parallel Systems]|[Naples, Univ. `Federico II` (Italy). Dipt. di Informatica e Sistemistica


    Particle-in-cell (PIC) codes are among the major candidates to yield a satisfactory description of the details of kinetic effects, such as the resonant wave-particle interaction, relevant in determining the transport mechanisms in magnetically confined plasmas. A significant improvement of the simulation performance of such codes can be expected from parallelization, e.g., by distributing the particle population among several parallel processors. Parallelization of a hybrid magnetohydrodynamic-gyrokinetic code has been accomplished within the High Performance Fortran (HPF) framework, and tested on the IBM SP2 parallel system, using a `particle decomposition` technique. The adopted technique requires a moderate effort in porting the code to parallel form and results in intrinsic load balancing and modest interprocessor communication. The performance tests confirm the hypothesis of high effectiveness of the strategy when targeted towards moderately parallel architectures. Optimal use of resources is also discussed with reference to a specific physics problem.
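
    The particle-decomposition strategy can be illustrated with a serial stand-in: split the particles across "processes", deposit onto replicated grid copies, then reduce (the grid size, particle count and nearest-grid-point deposition are assumptions for the example, not details of the HPF code):

```python
import numpy as np

# Sketch of "particle decomposition": the particle population is split evenly
# across processes, every process keeps a full (replicated) copy of the grid,
# deposits charge for its own particles, and the partial grids are summed --
# the interprocessor reduction step. Serial stand-in for the HPF/MPI version.
rng = np.random.default_rng(7)
n_particles, n_cells, n_procs = 10_000, 64, 4
positions = rng.random(n_particles)            # particle positions in [0, 1)

def deposit(chunk):
    """Nearest-grid-point charge deposition onto a private grid copy."""
    grid = np.zeros(n_cells)
    np.add.at(grid, (chunk * n_cells).astype(int), 1.0)
    return grid

# Each "process" deposits its own chunk of particles...
partial_grids = [deposit(chunk) for chunk in np.array_split(positions, n_procs)]
# ...then the replicated grids are reduced (an all-reduce in the real code)
grid = np.sum(partial_grids, axis=0)
```

    Load balancing is intrinsic because each process owns the same number of particles regardless of where they move, at the cost of replicating the grid and reducing it every deposition step.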

  19. Petascale Kinetic Simulations in Space Sciences: New Simulations and Data Discovery Techniques and Physics Results (United States)

    Karimabadi, Homa


    Recent advances in simulation technology and hardware are enabling breakthrough science in which many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and the magnetic reconnection process, the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to so-called space weather, where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk consists of three parts: (a) an overview of a new multi-scale simulation technique in which each computational grid is updated based on its own unique timestep; (b) a presentation of a new approach to data analysis that we refer to as physics mining, which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets; and (c) a presentation of several recent discoveries in studies of space plasmas, including the role of vortex formation and resulting turbulence in magnetized plasmas.

  20. Adobe Photoshop CC for photographers

    CERN Document Server

    Evening, Martin


    Adobe Photoshop for Photographers 2014 Release by Photoshop hall-of-famer and acclaimed digital imaging professional Martin Evening has been fully updated to include detailed instruction for all of the updates to Photoshop CC 2014 on Adobe's Creative Cloud, including significant new features, such as Focus Area selections, enhanced Content-Aware filling, and new Spin and Path blur gallery effects. This guide covers all the tools and techniques photographers and professional image editors need to know when using Photoshop, from workflow guidance to core skills to advanced techniques for profess

  1. Induction of a transient acidosis in the rumen simulation technique. (United States)

    Eger, M; Riede, S; Breves, G


    Feeding high-concentrate diets to cattle results in enhanced production of short-chain fatty acids by the micro-organisms in the rumen. Excessive fermentation can result in subclinical or clinical rumen acidosis, characterized by low pH, alterations in the microbial community and lactate production. Here, we provide an in vitro model of a severe rumen acidosis. A transient acidosis was induced in the rumen simulation technique by lowering bicarbonate, dihydrogen phosphate and hydrogen phosphate concentrations in the artificial saliva while providing a concentrate-to-forage ratio of 70:30. The experiment consisted of an equilibration period of 7 days, a first control period of 5 days, the acidosis period of 5 days and a second control period of 5 days. During acidosis induction, pH decreased stepwise until it ranged below 5.0 on the last day of acidosis (day 17). This was accompanied by an increase in lactate production, reaching 11.3 mM at day 17. The daily production of acetate, propionate and butyrate was reduced at the end of the acidosis period. Gas production (methane and carbon dioxide) and NH3-N concentration reached a minimum 2 days after terminating the acidosis challenge. While the initial pH was restored 1 day after acidosis, alterations in the aforementioned fermentation parameters lasted longer. However, by the end of the experiment, all parameters had recovered. An acidosis-induced alteration in the microbial communities of bacteria and archaea was revealed by single-strand conformation polymorphism. For bacteria, the pre-acidotic community was re-established within 5 days; for archaea, it was not. This study provides an in vitro model of a transient rumen acidosis, including biochemical and microbial changes, which might be used for testing feeding strategies or feed additives influencing rumen acidosis. Journal of Animal Physiology and Animal Nutrition © 2017 Blackwell Verlag GmbH.

  2. Teaching Behavioral Modeling and Simulation Techniques for Power Electronics Courses (United States)

    Abramovitz, A.


    This paper suggests a pedagogical approach to teaching the subject of behavioral modeling of switch-mode power electronics systems through simulation by general-purpose electronic circuit simulators. The methodology is oriented toward electrical engineering (EE) students at the undergraduate level, enrolled in courses such as "Power…

  3. Determine the feasibility of techniques for simulating coal dust explosions

    CSIR Research Space (South Africa)

    Kirsten, JT


    Full Text Available The primary objective of this work is to assess the feasibility of reliably simulating the coal dust explosion process taking place in the Kloppersbos tunnel with a computer model. Secondary objectives are to investigate the viability of simulating...

  4. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard


    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  5. ATLAS trigger simulation with legacy code using virtualization techniques

    CERN Document Server

    Galster, G; The ATLAS collaboration; Wiedenmann, W


Several scenarios, both present and future, require re-simulation of the trigger response in ATLAS. While the software for detector response simulation and event reconstruction is allowed to change and improve, the trigger response simulation has to reflect the conditions under which the data were taken. This poses a massive maintenance and data preservation problem. Several strategies have been considered and a proof-of-concept model using CernVM has been developed. While virtualization with CernVM elegantly solves several aspects of the data preservation problem, the low maturity of contextualization, as well as incompatibilities in the currently used data format, introduces new challenges. In these proceedings, these challenges, their current solutions and the proof-of-concept model for precise trigger simulation are discussed.

  6. 360-degree videos: a new visualization technique for astrophysical simulations (United States)

    Russell, Christopher M. P.


360-degree videos are a new type of movie that renders over the full 4π steradians. Video sharing sites such as YouTube now allow this unique content to be shared via virtual reality (VR) goggles, hand-held smartphones/tablets, and computers. Creating 360° videos from astrophysical simulations is not only a new way to view these simulations, as you are immersed in them, but is also a way to create engaging content for outreach to the public. We present what we believe is the first 360° video of an astrophysical simulation: a hydrodynamics calculation of the central parsec of the Galactic centre. We also describe how to create such movies, and briefly comment on what new science can be extracted from astrophysical simulations using 360° videos.
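
As a hedged illustration (not from the record above), the core of rendering a 360° frame is the equirectangular mapping from each pixel to a view direction on the sphere; the function name and conventions below are assumptions for this sketch:

```python
import math

def pixel_to_direction(x, y, width, height):
    """Map a pixel in an equirectangular 360-degree frame to a unit
    view direction. Longitude spans [-pi, pi) across the width and
    latitude spans [pi/2, -pi/2] down the height."""
    lon = (x / width) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (y / height) * math.pi
    # Spherical to Cartesian: +z is taken as the forward view direction.
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))
```

A renderer traces one ray per pixel along these directions, so the finished frame covers the whole sphere around the chosen camera position inside the simulation volume.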

  7. An Initialization Technique for the Waveform-Relaxation Circuit Simulation


    Habib, S. E.-D.; Al-Karim, G. J.


This paper reports the development of the Cairo University Waveform Relaxation (CUWORX) simulator. In order to accelerate the convergence of the waveform relaxation (WR) in the presence of logic feedback, CUWORX is initialized via a logic simulator. This logic initialization scheme is shown to be highly effective for digital synchronous circuits. Additionally, this logic initialization scheme fully preserves the multi-rate properties of the WR algorithm.
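
The WR iteration itself can be sketched on a toy system: each subsystem is integrated over the whole time window while the other subsystem's waveform is held fixed, and the sweeps are repeated until the waveforms stop changing. This is a minimal Gauss-Seidel WR sketch on two coupled ODEs, not the CUWORX implementation:

```python
import numpy as np

# Gauss-Seidel waveform relaxation on a two-subsystem ODE:
#   x' = -x + 0.5*y,   y' = -y + 0.5*x,   x(0)=1, y(0)=0.
h, T = 0.05, 5.0
n = int(T / h)

x = np.ones(n + 1)    # initial waveform guesses (constant in time)
y = np.zeros(n + 1)
for sweep in range(40):
    for k in range(n):                       # integrate x, y frozen
        x[k + 1] = x[k] + h * (-x[k] + 0.5 * y[k])
    for k in range(n):                       # integrate y, updated x
        y[k + 1] = y[k] + h * (-y[k] + 0.5 * x[k])

# Reference: the same explicit-Euler scheme applied to the fully
# coupled system; WR converges to this discrete solution.
xr, yr = np.ones(n + 1), np.zeros(n + 1)
for k in range(n):
    xr[k + 1] = xr[k] + h * (-xr[k] + 0.5 * yr[k])
    yr[k + 1] = yr[k] + h * (-yr[k] + 0.5 * xr[k])
```

A good initial waveform guess (here just the constant initial condition, in CUWORX a logic-simulator trace) is exactly what cuts the number of sweeps needed.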


    Directory of Open Access Journals (Sweden)

    N. A. Degotinsky


Full Text Available Subject of Research. We studied a method of estimating the distance to an object on the basis of a single defocused photograph of it. The method is based on analyzing image defocus at the contour points corresponding to the borders of photographed objects. It is assumed that the brightness drop in a non-defocused image of a border can be modeled by an ideal step function, the Heaviside function. Method. The contours corresponding to local maxima of the brightness gradient are detected in the initial image to be analyzed and recorded for further analysis. Then the initial image is subjected to additional defocusing by a Gaussian filter with a dispersion parameter whose value is defined in advance. The ratios of local gradient values for the initial and additionally defocused images are then calculated at the contour points, and the defocus values of the initial image at the points of object borders are estimated from these ratios. A sparse map of relative remoteness is built from these estimates for the border points of photographed objects, and a dense depth map of relative distances is then calculated using a special interpolation technique. Main Results. The efficiency of the described technique is illustrated with the results of distance estimation in photographs of real environments. Practical Relevance. In contrast to the widely applied stereo techniques and distance measurement methods that analyze sets of defocused images, the investigated approach can deal with a single photograph acquired in a standard way, without any additional conditions or limitations. If the relative remoteness of objects may be estimated instead of their absolute distances, no special calibration of the camera is needed, and pictures taken previously in diverse situations can be analyzed with the considered technique.
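
The gradient-ratio step of this method can be demonstrated in one dimension: for a step edge defocused by a Gaussian of unknown sigma, re-blurring with a known sigma1 and taking the ratio of peak gradients gives r = sqrt(sigma² + sigma1²)/sigma, so sigma = sigma1/sqrt(r² - 1). This is a sketch under those Gaussian-blur assumptions, not the paper's implementation:

```python
import numpy as np

def gauss_kernel(sigma):
    r = int(4 * sigma)
    x = np.arange(-r, r + 1, dtype=float)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

# A step edge blurred with an unknown sigma stands in for an object
# border in the defocused photograph.
sigma_true = 3.0
edge = np.zeros(400)
edge[200:] = 1.0
image = np.convolve(edge, gauss_kernel(sigma_true), mode="same")

# Re-blur with a known sigma1 and compare peak gradients at the edge.
sigma1 = 2.0
reblurred = np.convolve(image, gauss_kernel(sigma1), mode="same")
r = np.gradient(image).max() / np.gradient(reblurred).max()
sigma_est = sigma1 / np.sqrt(r**2 - 1.0)
```

In a 2-D image the same ratio is evaluated at every detected contour point, which yields the sparse defocus (relative-distance) map described above.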

  9. Simulation and Aerodynamic Analysis of the Flow Around the Sailplane Using CFD Techniques

    Directory of Open Access Journals (Sweden)

    Sebastian Marian ZAHARIA


Full Text Available In this paper, the analysis and simulation process using the CFD technique is described, together with the phenomena that show up in aerospace engineering practice, directing the simulation studies to the air flow around a sailplane. Analysis and aerodynamic simulation using Computational Fluid Dynamics (CFD) techniques are well established as instruments in the development process of an aeronautical product. Fluid flow simulation techniques help engineers understand the physical phenomena that take place in the product design from its prototype phase, and at the same time allow the optimization of an aeronautical product's performance against certain design criteria.

  10. Advancing botnet modeling techniques for military and security simulations (United States)

    Banks, Sheila B.; Stytz, Martin R.


Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, a bot army can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or it can be as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
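
As a hedged sketch of the compartment-model idea behind MSEIR-style propagation (the rates, class interpretations, and step size below are illustrative assumptions, not the paper's model):

```python
# Forward-Euler integration of an MSEIR compartment model, read here as:
# M (initially patched/immune hosts) -> S (susceptible) -> E (compromised
# but dormant) -> I (active bot) -> R (cleaned and patched).
delta, beta, eps, gamma = 0.05, 0.6, 0.4, 0.1   # assumed rate constants
N = 100_000.0
M, S, E, I, R = 10_000.0, 89_990.0, 0.0, 10.0, 0.0
dt, steps = 0.1, 2000
history = []
for _ in range(steps):
    dM = -delta * M                    # immunity wanes into S
    dS = delta * M - beta * S * I / N  # new compromises
    dE = beta * S * I / N - eps * E    # dormant bots activate
    dI = eps * E - gamma * I           # active bots get cleaned
    dR = gamma * I
    M += dt * dM; S += dt * dS; E += dt * dE; I += dt * dI; R += dt * dR
    history.append(I)
```

The derivatives sum to zero, so the host population is conserved; with these rates the active-bot count rises to a peak and then decays, the qualitative behavior such botnet models are meant to reproduce.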

  11. Experiencing Photographs Qua Photographs: What's So Special about Them?

    Directory of Open Access Journals (Sweden)

    Jiri Benovsky


Full Text Available Merely rhetorically, and answering in the negative, Kendall Walton has asked: "Isn't photography just another method people have of making pictures, one that merely uses different tools and materials; cameras, photosensitive paper, and darkroom equipment, rather than canvas, paint, and brushes? And don't the results differ only contingently and in degree, not fundamentally, from pictures of other kinds?" Contrary to Walton and others, I answer with a resounding "Yes" to Walton's questions in this article. It is a widely shared view that photographs are somehow special and that they fundamentally differ from hand-made pictures such as paintings, both from a phenomenological point of view (in the way we experience them) and from an epistemic point of view (since they are supposed to have a different, that is, greater, epistemic value than paintings, one that gives us privileged access to the world). I reject almost all of these claims and, as a consequence, little difference remains between photographs and paintings. As we shall see, "photographs are always partly paintings," a claim that is true not only of retouched digital photographs but of all photographs, including traditional ones made using photosensitive film and development techniques.

  12. Simulating tidal turbines with multi-scale mesh optimisation techniques

    NARCIS (Netherlands)

    Abolghasemi, M.A.; Piggott, M.D.; Spinneken, J; Viré, A.C.; Cotter, CJ; Crammond, S.


    Embedding tidal turbines within simulations of realistic large-scale tidal flows is a highly multi-scale problem that poses significant computational challenges. Here this problem is tackled using actuator disc momentum (ADM) theory and Reynolds-averaged Navier–Stokes (RANS) with, for the first
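
As a hedged aside, the 1-D actuator disc momentum (ADM) relation underlying such turbine parameterizations can be sketched as follows; the function names and numeric values are illustrative assumptions, not the RANS implementation of the record above:

```python
import math

def axial_induction(ct):
    """Axial induction factor a from thrust coefficient Ct via the
    1-D actuator disc momentum relation Ct = 4a(1-a), valid on the
    lightly loaded branch (a <= 0.5, i.e. Ct <= 1)."""
    if not 0.0 <= ct <= 1.0:
        raise ValueError("momentum-theory branch requires 0 <= Ct <= 1")
    return 0.5 * (1.0 - math.sqrt(1.0 - ct))

def disc_thrust(ct, rho, area, u_inf):
    """Thrust force the disc exerts on the flow: T = 0.5*rho*A*Ct*U^2."""
    return 0.5 * rho * area * ct * u_inf**2
```

In an embedded simulation this thrust is applied as a momentum sink over the cells covered by the disc, which is what makes locally refined meshes around each turbine attractive.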

  13. Enhanced sampling techniques in molecular dynamics simulations of biological systems. (United States)

    Bernardi, Rafael C; Melo, Marcelo C R; Schulten, Klaus


Molecular dynamics has emerged as an important research methodology covering systems to the level of millions of atoms. However, insufficient sampling often limits its application. The limitation is due to rough energy landscapes, with many local minima separated by high-energy barriers, which govern the biomolecular motion. In the past few decades methods have been developed that address the sampling problem, such as replica-exchange molecular dynamics, metadynamics and simulated annealing. Here we present an overview of these sampling methods in an attempt to shed light on which should be selected depending on the type of system property studied. Enhanced sampling methods have been employed for a broad range of biological systems and the choice of a suitable method is connected to biological and physical characteristics of the system, in particular system size. While metadynamics and replica-exchange molecular dynamics are the most adopted sampling methods to study biomolecular dynamics, simulated annealing is well suited to characterize very flexible systems. For a long time the use of annealing methods was restricted to simulations of small proteins; however, a variant of the method, generalized simulated annealing, can be employed at a relatively low computational cost for large macromolecular complexes. Molecular dynamics trajectories frequently do not reach all relevant conformational substates, for example those connected with biological function, a problem that can be addressed by employing enhanced sampling algorithms. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
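
The replica-exchange idea mentioned above rests on a simple Metropolis criterion for swapping configurations between replicas at different temperatures; this is a minimal sketch with assumed stand-in energies, not an MD implementation:

```python
import math
import random

def exchange_probability(beta_i, beta_j, e_i, e_j):
    """Metropolis acceptance probability for swapping configurations
    between replicas at inverse temperatures beta_i, beta_j with
    potential energies e_i, e_j."""
    return min(1.0, math.exp((beta_i - beta_j) * (e_i - e_j)))

# Toy usage: attempt swaps between a cold and a hot replica. The
# Gaussian "energies" stand in for values a real run would take from MD.
random.seed(0)
beta_cold, beta_hot = 2.0, 0.5
accepted = 0
for _ in range(1000):
    e_cold = random.gauss(1.0, 0.3)
    e_hot = random.gauss(3.0, 1.0)
    if random.random() < exchange_probability(beta_cold, beta_hot, e_cold, e_hot):
        accepted += 1
```

Swaps are accepted with certainty whenever the colder replica would inherit the lower energy, which is how high-temperature replicas help the cold one cross barriers.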

  14. Estimation of fracture aperture using simulation technique; Simulation wo mochiita fracture kaiko haba no suitei

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, T. [Geological Survey of Japan, Tsukuba (Japan); Abe, M. [Tohoku University, Sendai (Japan). Faculty of Engineering


Characteristics of the amplitude variation around fractures were investigated using a simulation technique, for cases in which the fracture aperture was changed. Four models were used. The model-1 was a fracture model having a horizontal fracture at Z=0. For the model-2, the fracture was replaced by a group of small fractures. The model-3 had a borehole diameter extended at Z=0 in the shape of a wedge. The model-4 had a low-velocity layer at Z=0. The maximum amplitudes were compared across depths and models. For the model-1, the amplitude became larger at the depth of the fracture and smaller above it. For the model-2, when the cross width D increased to 4 cm, the amplitude approached that of the model-1. For the model-3 with the extended borehole diameter, when the extension ranged between 1 cm and 2 cm, hardly any change of amplitude was observed above and below the fracture. However, when the extension of the borehole diameter was 4 cm, the amplitude became smaller above the extended part of the borehole. 3 refs., 4 figs., 1 tab.

  15. Assembly line balancing using simulation technique in a garment ...

    African Journals Online (AJOL)

    The typical problems facing with garment manufacturing are: short product cycle for fashion articles, long production lead time, bottlenecking, and low productivity. To alleviate the problems, different types of line balancing techniques have been used for many years in the garment industry. However, garment industries ...

  16. Application of the numerical modelling techniques to the simulation ...

    African Journals Online (AJOL)

    The aquifer was modelled by the application of Finite Element Method (F.E.M), with appropriate initial and boundary conditions. The matrix solver technique adopted for the F.E.M. was that of the Conjugate Gradient Method. After the steady state calibration and transient verification, the model was used to predict the effect of ...
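
As an illustration of the solver named above, here is a minimal Conjugate Gradient implementation for the symmetric positive-definite systems an FEM discretization produces; the small test matrix is an assumed stand-in, not the aquifer model's system:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Small SPD test system resembling a 1-D Laplacian stencil.
A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = conjugate_gradient(A, b)
```

CG needs only matrix-vector products, which is why it scales to the large sparse systems that arise after FEM assembly.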

  17. Skeletal response to simulated weightlessness - A comparison of suspension techniques (United States)

    Wronski, T. J.; Morey-Holton, E. R.


    Comparisons are made of the skeletal response of rats subjected to simulated weightlessness by back or tail suspension. In comparison to pair-fed control rats, back-suspended rats failed to gain weight whereas tail-suspended rats exhibited normal weight gain. Quantitative bone histomorphometry revealed marked skeletal abnormalities in the proximal tibial metaphysis of back-suspended rats. Loss of trabecular bone mass in these animals was due to a combination of depressed longitudinal bone growth, decreased bone formation, and increased bone resorption. In contrast, the proximal tibia of tail-suspended rats was relatively normal by these histologic criteria. However, a significant reduction trabecular bone volume occurred during 2 weeks of tail suspension, possibly due to a transient inhibition of bone formation. The findings indicate that tail suspension may be a more appropriate model for evaluating the effects of simulated weightlessness on skeletal homeostasis.

  18. Photographic fixative poisoning (United States)

    Photographic fixatives are chemicals used to develop photographs. This article discusses poisoning from swallowing such chemicals. This article is for information only. DO NOT use it to treat or manage an ...

  19. Glacier Photograph Collection (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Glacier Photograph Collection is a database of photographs of glaciers from around the world, some dating back to the mid-1850's, that provide an historical...

  20. Measurement and Simulation Techniques For Piezoresistive Microcantilever Biosensor Applications

    Directory of Open Access Journals (Sweden)

    Aan Febriansyah


Full Text Available Applications of microcantilevers as biosensors have been explored by many researchers for applications in medicine, biology, chemistry, and environmental monitoring. This research discusses the design of a measurement method and simulations for a piezoresistive microcantilever biosensor, consisting of the design of a Wheatstone bridge circuit as the object detector, simulation of the resonance frequency shift based on the Euler-Bernoulli beam equation, and microcantilever vibration simulation using COMSOL Multiphysics 3.5. The piezoresistive microcantilever used here is a Seiko Instrument Technology (Japan) product with a length of 110 µm, width of 50 µm, and thickness of 1 µm. The microcantilever mass is 12.815 ng, including the mass of the receptor. The sample object in this research is the bacterium E. coli, with the mass of one bacterium assumed to be 0.3 pg. Simulation results show that the mass of one bacterium causes a deflection of 0.03053 nm and a resonance frequency of 118.90 kHz, while four bacteria cause a deflection of 0.03054 nm and a resonance frequency of 118.68 kHz. These data indicate that increasing the bacterial mass increases the deflection and reduces the resonance frequency.
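
The mass-sensing principle can be sketched with a lumped single-mode model, f = (1/2π)·sqrt(k/m_eff): added analyte mass lowers the resonance frequency, with a relative shift of roughly -Δm/(2·m_eff) for small Δm. This simplified model is an assumption of this sketch and will not reproduce the record's COMSOL numbers, which depend on the full beam geometry:

```python
import math

def resonance_frequency(k, m_eff):
    """Resonance of a lumped spring-mass oscillator, in Hz."""
    return math.sqrt(k / m_eff) / (2.0 * math.pi)

m_eff = 12.815e-12                       # 12.815 ng in kg (from the record)
f0 = 118.90e3                            # unloaded resonance in Hz (from the record)
k = (2.0 * math.pi * f0) ** 2 * m_eff    # spring constant implied by f0

dm = 4 * 0.3e-15                         # four E. coli cells at 0.3 pg each, in kg
f_loaded = resonance_frequency(k, m_eff + dm)
shift = f_loaded - f0                    # negative: frequency drops
```

For Δm much smaller than m_eff, the computed shift matches the first-order estimate -f0·Δm/(2·m_eff) closely, which is the relation used to convert a measured frequency shift back into captured mass.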

  1. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yi; Wang, Peng; Goel, Lalit [Nanyang Technological University, School of Electrical and Electronics Engineering, Block S1, Nanyang Avenue, Singapore 639798 (Singapore); Billinton, Roy; Karki, Rajesh [Department of Electrical Engineering, University of Saskatchewan, Saskatoon (Canada)


    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)
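
The basic building block of such Monte Carlo reliability studies is sampling the up/down cycle of a repairable component; this is a minimal sketch with assumed exponential failure and repair times, not the paper's multi-state market model:

```python
import numpy as np

# Sequential Monte Carlo sampling of one repairable two-state component.
rng = np.random.default_rng(42)
mttf, mttr = 1000.0, 50.0        # mean time to failure / repair, hours (assumed)
cycles = 20_000

up = rng.exponential(mttf, cycles)      # sampled operating durations
down = rng.exponential(mttr, cycles)    # sampled repair durations
unavailability = down.sum() / (up.sum() + down.sum())

# Closed-form steady-state unavailability for comparison.
analytic = mttr / (mttf + mttr)
```

The sampled estimate converges to MTTR/(MTTF+MTTR); reliability network equivalents and pseudo-sequential sampling, as in the paper, are ways of getting the same answers for whole systems with far fewer sampled transitions.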

  2. Simulating New Drop Test Vehicles and Test Techniques for the Orion CEV Parachute Assembly System (United States)

    Morris, Aaron L.; Fraire, Usbaldo, Jr.; Bledsoe, Kristin J.; Ray, Eric; Moore, Jim W.; Olson, Leah M.


    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project is engaged in a multi-year design and test campaign to qualify a parachute recovery system for human use on the Orion Spacecraft. Test and simulation techniques have evolved concurrently to keep up with the demands of a challenging and complex system. The primary simulations used for preflight predictions and post-test data reconstructions are Decelerator System Simulation (DSS), Decelerator System Simulation Application (DSSA), and Drop Test Vehicle Simulation (DTV-SIM). The goal of this paper is to provide a roadmap to future programs on the test technique challenges and obstacles involved in executing a large-scale, multi-year parachute test program. A focus on flight simulation modeling and correlation to test techniques executed to obtain parachute performance parameters are presented.

  3. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation (United States)

    Veltri, M.


    This work presents the theory and a numerical validation study in support to a novel method for a priori identification of fatigue critical regions, with the aim to accelerate durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue critical areas can drive a simplification of the problem size, leading to sensible improvement in solution time and model handling while allowing processing of the critical areas in higher detail. The proposed technique is applied to a real life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  4. Editing soft shadows in a digital photograph. (United States)

    Mohan, Ankit; Tumblin, Jack; Choudhury, Prasun


    This technique for modeling, editing, and rendering shadow edges in a photograph or a synthetic image lets users separate the shadow from the rest of the image and make arbitrary adjustments to its position, sharpness, and intensity.

  5. Photographing Turkey Run: A Guide to Nature Photography


    Shepardson, Daniel P.


    Photographing Turkey Run: A Guide to Nature Photography was written to be used in conjunction with Daniel P. Shepardson’s A Place Called Turkey Run: A Celebration of Indiana’s Second State Park in Photographs and Words. This guide contains tips and techniques designed to provide a basic understanding of how to photograph nature and improve one’s photography skills.

  6. Simulation of California's Major Reservoirs Outflow Using Data Mining Technique (United States)

    Yang, T.; Gao, X.; Sorooshian, S.


The reservoir's outflow is controlled by reservoir operators, unlike the upstream inflow, and for downstream water users the outflow matters more than the inflow. In order to simulate the complicated reservoir operation and extract the outflow decision-making patterns for California's 12 major reservoirs, we built a data-driven, computer-based ("artificially intelligent") reservoir decision-making tool, using a decision regression and classification tree approach. This is a well-developed statistical and graphical modeling methodology in the field of data mining. A shuffled cross-validation approach is also employed to extract the outflow decision-making patterns and rules based on the selected decision variables (inflow amount, precipitation, timing, water year type, etc.). To show the accuracy of the model, a verification study is carried out comparing the model-generated outflow decisions ("artificially intelligent" decisions) with those made by reservoir operators (human decisions). The simulation results show that the machine-generated outflow decisions are very similar to the real reservoir operators' decisions. This conclusion is based on statistical evaluation using the Nash-Sutcliffe efficiency. The proposed model is able to detect the most influential variables and their weights when reservoir operators make an outflow decision. While the proposed approach was first applied and tested on California's 12 major reservoirs, the method is universally adaptable to other reservoir systems.
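
The Nash-Sutcliffe score used for the verification above has a simple closed form: NSE = 1 - Σ(obs-sim)²/Σ(obs-mean)². A minimal sketch, with illustrative outflow values that are not from the study:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect match, 0 means the
    simulation is no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

outflow_obs = [120.0, 135.0, 150.0, 110.0, 95.0]   # illustrative values
outflow_sim = [118.0, 138.0, 146.0, 113.0, 99.0]
score = nash_sutcliffe(outflow_obs, outflow_sim)
```

Scores close to 1 are what justify the claim that the tree-generated decisions track the human operators' decisions.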

  7. Image processing of galaxy photographs (United States)

    Arp, H.; Lorre, J.


    New computer techniques for analyzing and processing photographic images of galaxies are presented, with interesting scientific findings gleaned from the processed photographic data. Discovery and enhancement of very faint and low-contrast nebulous features, improved resolution of near-limit detail in nebulous and stellar images, and relative colors of a group of nebulosities in the field are attained by the methods. Digital algorithms, nonlinear pattern-recognition filters, linear convolution filters, plate averaging and contrast enhancement techniques, and an atmospheric deconvolution technique are described. New detail is revealed in images of NGC 7331, Stephan's Quintet, Seyfert's Sextet, and the jet in M87, via processes of addition of plates, star removal, contrast enhancement, standard deviation filtering, and computer ratioing to bring out qualitative color differences.
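
The benefit of the plate-averaging step can be sketched numerically: averaging N exposures of the same field suppresses zero-mean noise by roughly sqrt(N) while preserving faint structure. The synthetic "plates" below are an assumption of this sketch, not the processed galaxy data:

```python
import numpy as np

# Stack 25 noisy synthetic exposures of the same faint feature.
rng = np.random.default_rng(0)
truth = np.zeros((64, 64))
truth[30:34, 30:34] = 0.5            # a faint, low-contrast feature
plates = [truth + rng.normal(0.0, 1.0, truth.shape) for _ in range(25)]
stacked = np.mean(plates, axis=0)

single_noise = np.std(plates[0] - truth)   # ~1.0 per exposure
stacked_noise = np.std(stacked - truth)    # ~1/sqrt(25) of that
```

A feature invisible against unit noise in a single plate stands out after stacking, which is the principle behind the faint-nebulosity enhancements reported above.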

  8. Cross-section adjustment techniques for BWR adaptive simulation (United States)

    Jessee, Matthew Anderson

    Computational capability has been developed to adjust multi-group neutron cross-sections to improve the fidelity of boiling water reactor (BWR) modeling and simulation. The method involves propagating multi-group neutron cross-section uncertainties through BWR computational models to evaluate uncertainties in key core attributes such as core k-effective, nodal power distributions, thermal margins, and in-core detector readings. Uncertainty-based inverse theory methods are then employed to adjust multi-group cross-sections to minimize the disagreement between BWR modeling predictions and measured plant data. For this work, measured plant data were virtually simulated in the form of perturbed 3-D nodal power distributions with discrepancies with predictions of the same order of magnitude as expected from plant data. Using the simulated plant data, multi-group cross-section adjustment reduces the error in core k-effective to less than 0.2% and the RMS error in nodal power to 4% (i.e. the noise level of the in-core instrumentation). To ensure that the adapted BWR model predictions are robust, Tikhonov regularization is utilized to control the magnitude of the cross-section adjustment. In contrast to few-group cross-section adjustment, which was the focus of previous research on BWR adaptive simulation, multigroup cross-section adjustment allows for future fuel cycle design optimization to include the determination of optimal fresh fuel assembly designs using the adjusted multi-group cross-sections. The major focus of this work is to efficiently propagate multi-group neutron cross-section uncertainty through BWR lattice physics calculations. Basic neutron cross-section uncertainties are provided in the form of multi-group cross-section covariance matrices. For energy groups in the resolved resonance energy range, the cross-section uncertainties are computed using an infinitely-dilute approximation of the neutron flux. In order to accurately account for spatial and
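
The Tikhonov-regularized adjustment mentioned above can be sketched as damped least squares, minimizing ||Ax - b||² + λ||x||², so the normal equations become (AᵀA + λI)x = Aᵀb. The tiny sensitivity matrix below is an illustrative assumption, not a real cross-section sensitivity:

```python
import numpy as np

def tikhonov_adjust(A, b, lam):
    """Regularized least squares: minimize ||A x - b||^2 + lam*||x||^2,
    solved via the normal equations (A^T A + lam*I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy sensitivity matrix mapping cross-section adjustments (x) to
# observed discrepancies (b); numbers are illustrative only.
A = np.array([[1.0, 0.5], [0.2, 1.0], [0.3, 0.1]])
b = np.array([0.8, 0.4, 0.2])
x_free = tikhonov_adjust(A, b, 0.0)     # unregularized fit
x_damped = tikhonov_adjust(A, b, 1.0)   # lam shrinks the adjustment
```

Increasing λ trades a slightly worse data fit for a smaller adjustment norm, which is exactly how the regularization keeps the adapted cross-sections physically plausible.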

  9. Drift simulation of MH370 debris using superensemble techniques (United States)

    Jansen, Eric; Coppini, Giovanni; Pinardi, Nadia


    On 7 March 2014 (UTC), Malaysia Airlines flight 370 vanished without a trace. The aircraft is believed to have crashed in the southern Indian Ocean, but despite extensive search operations the location of the wreckage is still unknown. The first tangible evidence of the accident was discovered almost 17 months after the disappearance. On 29 July 2015, a small piece of the right wing of the aircraft was found washed up on the island of Réunion, approximately 4000 km from the assumed crash site. Since then a number of other parts have been found in Mozambique, South Africa and on Rodrigues Island. This paper presents a numerical simulation using high-resolution oceanographic and meteorological data to predict the movement of floating debris from the accident. Multiple model realisations are used with different starting locations and wind drag parameters. The model realisations are combined into a superensemble, adjusting the model weights to best represent the discovered debris. The superensemble is then used to predict the distribution of marine debris at various moments in time. This approach can be easily generalised to other drift simulations where observations are available to constrain unknown input parameters. The distribution at the time of the accident shows that the discovered debris most likely originated from the wide search area between 28 and 35° S. This partially overlaps with the current underwater search area, but extends further towards the north. Results at later times show that the most probable locations to discover washed-up debris are along the African east coast, especially in the area around Madagascar. The debris remaining at sea in 2016 is spread out over a wide area and its distribution changes only slowly.
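
The superensemble step, fitting weights for the individual model realisations against observations and then predicting with the weighted combination, can be sketched with least squares; the synthetic member predictions and weights below are assumptions of this sketch:

```python
import numpy as np

# Fit superensemble weights by least squares against observations.
rng = np.random.default_rng(1)
n_obs, n_members = 50, 3
members = rng.normal(size=(n_obs, n_members))          # member predictions
true_w = np.array([0.6, 0.3, 0.1])                     # hidden "real" weights
obs = members @ true_w + rng.normal(0.0, 0.01, n_obs)  # noisy observations

weights, *_ = np.linalg.lstsq(members, obs, rcond=None)
combined = members @ weights                           # superensemble output
```

With enough observations the fitted weights recover the members that best explain the debris findings, and the weighted combination outperforms any single realisation.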

  10. Wind Turbine Rotor Simulation via CFD Based Actuator Disc Technique Compared to Detailed Measurement

    National Research Council Canada - National Science Library

    Esmail Mahmoodi; Ali Jafari; Alireza Keyhani


.... The AD model, a combination of the CFD technique and User Defined Function (UDF) codes, the so-called UDF/AD model, is used to simulate the loads and performance of the rotor in three different wind speed tests...

  11. Using simulation models to evaluate ape nest survey techniques.

    Directory of Open Access Journals (Sweden)

    Ryan H Boyko

Full Text Available BACKGROUND: Conservationists frequently use nest count surveys to estimate great ape population densities, yet the accuracy and precision of the resulting estimates are difficult to assess. METHODOLOGY/PRINCIPAL FINDINGS: We used mathematical simulations to model nest building behavior in an orangutan population to compare the quality of the population size estimates produced by two of the commonly used nest count methods, the 'marked recount method' and the 'matrix method.' We found that when observers missed even small proportions of nests in the first survey, the marked recount method produced large overestimates of the population size. Regardless of observer reliability, the matrix method produced substantial overestimates of the population size when surveying effort was low. With high observer reliability, both methods required surveying approximately 0.26% of the study area (0.26 km² out of 100 km² in this simulation) to achieve an accurate estimate of population size; at or above this sampling effort both methods produced estimates within 33% of the true population size 50% of the time. Both methods showed diminishing returns at survey efforts above 0.26% of the study area. The use of published nest decay estimates derived from other sites resulted in widely varying population size estimates that spanned nearly an entire order of magnitude. The marked recount method proved much better at detecting population declines, detecting 5% declines nearly 80% of the time even in the first year of decline. CONCLUSIONS/SIGNIFICANCE: These results highlight the fact that neither nest surveying method produces highly reliable population size estimates with any reasonable surveying effort, though either method could be used to obtain a gross population size estimate in an area.
Conservation managers should determine if the quality of these estimates are worth the money and effort required to produce them, and should generally limit surveying effort to
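
The marked-recount bias the simulations expose can be illustrated with a much cruder Monte Carlo sketch than the paper's model: nests missed (and so left unmarked) in the first survey are later counted as "new" nests, inflating the estimated nest production rate. All counts and probabilities below are assumed for illustration:

```python
import numpy as np

# Crude sketch of marked-recount overestimation.
rng = np.random.default_rng(7)
standing_nests = 200      # nests present at the first survey (assumed)
new_nests = 20            # nests actually built between surveys (assumed)
detect = 0.9              # per-nest detection (marking) probability (assumed)

trials = 2000
estimates = np.empty(trials)
for i in range(trials):
    missed = rng.binomial(standing_nests, 1.0 - detect)
    estimates[i] = new_nests + missed    # unmarked old nests counted as new

bias_factor = estimates.mean() / new_nests
```

Even a 10% miss rate roughly doubles the apparent nest production here, echoing the finding that small detection failures produce large overestimates.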

  12. Simulated Annealing Technique for Routing in a Rectangular Mesh Network

    Directory of Open Access Journals (Sweden)

    Noraziah Adzhar


    Full Text Available In the process of automatic design for printed circuit boards (PCBs), the phase following cell placement is routing. Routing is a notoriously difficult problem; even the simplest routing problem, which consists of a set of two-pin nets, is known to be NP-complete. In this research, our routing region is first tessellated into a uniform Nx×Ny array of square cells. The ultimate goal for a routing problem is to achieve complete automatic routing with minimal need for any manual intervention. Therefore, the shortest path for every connection needs to be established. While the classical Dijkstra's algorithm guarantees the shortest path for a single net, each routed net forms an obstacle for later paths. This complicates the routing of later nets, making their paths longer than optimal or sometimes impossible to complete. Today's sequential routing often applies heuristic methods to further refine the solution. Through this process, all nets are rerouted in different orders to improve the quality of routing. This motivates us to apply simulated annealing, one of the metaheuristic methods, to our routing model to produce better routing sequences.
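The annealing-over-net-orderings idea can be sketched with a generic simulated-annealing loop. This is an illustrative sketch, not the paper's implementation: `route_cost` is a placeholder for a sequential router (e.g. Dijkstra on the cell grid, with routed nets becoming obstacles), and the cooling schedule is arbitrary.

```python
import math
import random

def anneal_net_order(nets, route_cost, t0=10.0, cooling=0.95, steps=2000, seed=0):
    """Simulated annealing over the sequence in which nets are routed.
    `route_cost(order)` is assumed to route the nets one by one and return
    total wirelength (or a large penalty if a net cannot be completed)."""
    rng = random.Random(seed)
    order = list(nets)
    best = cur = route_cost(order)
    best_order, t = list(order), t0
    for _ in range(steps):
        i, j = rng.sample(range(len(order)), 2)      # propose: swap two nets
        order[i], order[j] = order[j], order[i]
        new = route_cost(order)
        # accept improvements always, worse orderings with Boltzmann probability
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
            if cur < best:
                best, best_order = cur, list(order)
        else:
            order[i], order[j] = order[j], order[i]  # reject: undo the swap
        t *= cooling                                  # cool the temperature
    return best_order, best
```

As a smoke test, a toy cost that simply rewards routing net 0 first is minimized by any ordering starting with 0.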

  13. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    Energy Technology Data Exchange (ETDEWEB)

    Durlofsky, Louis J.; Aziz, Khalid


    Research results for the second year of this project on the development of improved modeling techniques for non-conventional (e.g., horizontal, deviated or multilateral) wells were presented. The overall program entails the development of enhanced well modeling and general simulation capabilities. A general formulation for black-oil and compositional reservoir simulation was presented.

  14. Configuring Simulation Models Using CAD Techniques: A New Approach to Warehouse Design


    Brito, António Ernesto da Silva Carvalho


    The research reported in this thesis is related to the development and use of software tools for supporting warehouse design and management. Computer Aided Design and Simulation techniques are used to develop a software system that forms the basis of a Decision Support System for warehouse design. The current position of simulation software is reviewed. It is investigated how appropriate current simulation software is for warehouse modelling. Special attention is given to Vi...

  15. Parallel Reservoir Simulations with Sparse Grid Techniques and Applications to Wormhole Propagation

    KAUST Repository

    Wu, Yuanqing


    In this work, two topics of reservoir simulations are discussed. The first topic is the two-phase compositional flow simulation in a hydrocarbon reservoir. The major obstacle that impedes the applicability of the simulation code is the long run time of the simulation procedure, and thus speeding up the simulation code is necessary. Two means are demonstrated to address the problem: parallelism in physical space and the application of sparse grids in parameter space. The parallel code can gain satisfactory scalability, and the sparse grids can remove the bottleneck of flash calculations. Instead of carrying out the flash calculation in each time step of the simulation, a sparse grid approximation of all possible results of the flash calculation is generated before the simulation. Then the constructed surrogate model is evaluated to approximate the flash calculation results during the simulation. The second topic is the wormhole propagation simulation in a carbonate reservoir. In this work, different from the traditional simulation technique relying on the Darcy framework, we propose a new framework, the Darcy-Brinkman-Forchheimer (DBF) framework, to simulate wormhole propagation. Furthermore, to process the large quantity of cells in the simulation grid and shorten the long simulation time of the traditional serial code, standard domain-based parallelism is employed, using the Hypre multigrid library. In addition, a new technique called the "experimenting field approach" for setting coefficients in the model equations is introduced. In the 2D dissolution experiments, different configurations of wormholes and a series of properties simulated by both frameworks are compared. We conclude that the numerical results of the DBF framework are more wormhole-like and more stable than those of the Darcy framework, which is a demonstration of the advantages of the DBF framework. The scalability of the parallel code is also evaluated, and good scalability can be achieved. Finally, a mixed
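The surrogate idea, tabulate the expensive flash calculation once before the time loop and interpolate during the simulation, can be sketched for two input parameters. A dense 2-D grid with bilinear interpolation stands in for the sparse (Smolyak-type) grid used in the work, and `flash` and its inputs are hypothetical placeholders.

```python
import numpy as np

def build_surrogate(flash, p_range, z_range, n=17):
    """Tabulate a (hypothetical) flash calculation flash(p, z) on a coarse
    grid once, before the simulation; the returned surrogate replaces the
    flash call inside each time step."""
    ps = np.linspace(*p_range, n)
    zs = np.linspace(*z_range, n)
    table = np.array([[flash(p, z) for z in zs] for p in ps])

    def surrogate(p, z):
        # locate the cell containing (p, z) and interpolate bilinearly
        i = int(np.clip(np.searchsorted(ps, p) - 1, 0, n - 2))
        j = int(np.clip(np.searchsorted(zs, z) - 1, 0, n - 2))
        tp = (p - ps[i]) / (ps[i + 1] - ps[i])
        tz = (z - zs[j]) / (zs[j + 1] - zs[j])
        return ((1 - tp) * (1 - tz) * table[i, j]
                + tp * (1 - tz) * table[i + 1, j]
                + (1 - tp) * tz * table[i, j + 1]
                + tp * tz * table[i + 1, j + 1])

    return surrogate
```

For an affine `flash`, bilinear interpolation reproduces it exactly, which makes a convenient correctness check.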

  16. Comparative analysis of numerical simulation techniques for incoherent imaging of extended objects through atmospheric turbulence (United States)

    Lachinova, Svetlana L.; Vorontsov, Mikhail A.; Filimonov, Grigory A.; LeMaster, Daniel A.; Trippel, Matthew E.


    Computational efficiency and accuracy of wave-optics-based Monte-Carlo and brightness function numerical simulation techniques for incoherent imaging of extended objects through atmospheric turbulence are evaluated. Simulation results are compared with theoretical estimates based on known analytical solutions for the modulation transfer function of an imaging system and the long-exposure image of a Gaussian-shaped incoherent light source. It is shown that the accuracy of both techniques is comparable over a wide range of path lengths and atmospheric turbulence conditions, whereas the brightness function technique is advantageous in terms of computational speed.

  17. Simulation-driven design by knowledge-based response correction techniques

    CERN Document Server

    Koziel, Slawomir


    Focused on efficient simulation-driven multi-fidelity optimization techniques, this monograph on simulation-driven optimization covers simulations utilizing physics-based low-fidelity models, often based on coarse-discretization simulations or other types of simplified physics representations, such as analytical models. The methods presented in the book exploit as much as possible any knowledge about the system or device of interest embedded in the low-fidelity model with the purpose of reducing the computational overhead of the design process. Most of the techniques described in the book are of response correction type and can be split into parametric (usually based on analytical formulas) and non-parametric, i.e., not based on analytical formulas. The latter, while more complex in implementation, tend to be more efficient. The book presents a general formulation of response correction techniques as well as a number of specific methods, including those based on correcting the low-fidelity model response (out...

  18. A Grid-Free Approach for Plasma Simulations (Grid-Free Plasma Simulation Techniques) (United States)


    Titles of the papers are listed below; the papers will be sent to the program manager, Major David Byers, upon completion. Christlieb, A.J.; Olson, S.E.; Gridless... [Reference fragments: R. W. Hockney and J. W. Eastwood, Computer Simulation Using Particles, Bristol, U.K.: IOP Publishing, 1988; J. Fluid Mech., vol. 184, pp. 123-155, 1987.]

  19. A system identification technique based on the random decrement signatures. Part 1: Theory and simulation (United States)

    Bedewi, Nabih E.; Yang, Jackson C. S.


    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The mathematics of the technique is presented in addition to the results of computer simulations conducted to demonstrate the prediction of the response of the system and the random forcing function initially introduced to excite the system.
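The least-squares step can be sketched for a single degree of freedom. The paper fits full mass, damping, and stiffness matrices; this scalar version, which treats the random decrement signature as a free-decay response and regresses the acceleration on velocity and displacement, is only illustrative.

```python
import numpy as np

def fit_sdof_from_signature(x, dt):
    """Least-squares fit of x'' + (c/m) x' + (k/m) x = 0 to a random
    decrement signature x sampled every dt seconds."""
    v = np.gradient(x, dt)              # velocity by central differences
    a = np.gradient(v, dt)              # acceleration
    A = np.column_stack([v, x])
    # solve a = -A @ [c/m, k/m] in the least-squares sense
    coef, *_ = np.linalg.lstsq(A, -a, rcond=None)
    c_over_m, k_over_m = coef
    return c_over_m, k_over_m
```

Feeding it an analytic damped-oscillator decay recovers the damping and stiffness ratios up to finite-difference error.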

  20. Are burns photographs useful? (United States)

    Nelson, L; Boyle, M; Taggart, I; Watson, S


    Routine photography of all patients admitted to the West of Scotland Regional Burns Unit was introduced in 2003. To date, few burns units have evaluated the usefulness of the photographs taken. To assess the usefulness of photographs of patients admitted to the burns unit to various members of the multidisciplinary team. A questionnaire was completed by hospital staff involved in the management of burns patients over a 3-month period. A total of 43 questionnaires were completed. The majority of questionnaires were completed by nursing staff (55%), followed by medical staff (23%), physiotherapy (5%), anaesthetists (7%), theatre staff (5%), students (2%), and a dietician (2%). About 98% of respondents agreed that photographs were useful overall, particularly for teaching purposes. About 9% disagreed that photographs were useful for assessment, owing to the difficulty of assessing the depth of burn. About 72% agreed that the photographs were useful for patient management and improve patient care. About 88% agreed that all patients should have photographs available in future. Advantages of photographs include: assisting with moving and handling of patients; patient positioning in theatre; and reviewing wound healing and complications. They are useful for assessing the site, size and type of burn. Disadvantages include difficulty in assessing the depth of burn, technical factors, and unavailability out of hours. Photographs of burns patients are useful overall to all members of the multidisciplinary team.

  1. An overview of uncertainty quantification techniques with application to oceanic and oil-spill simulations

    KAUST Repository

    Iskandarani, Mohamed


    We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.
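For a single uniform uncertain input, the regression-based PC construction can be sketched as follows. A Legendre basis with least-squares coefficients stands in for the PC machinery; the model, sample counts, and polynomial degree are placeholders, and the real applications are multi-dimensional.

```python
import numpy as np

def pc_proxy(model, n_train=200, degree=6, seed=0):
    """Regression-based polynomial chaos sketch for one uniform uncertain
    input on [-1, 1]: fit Legendre coefficients to a modest number of model
    runs, then evaluate the cheap proxy on a large sample to estimate
    output statistics."""
    rng = np.random.default_rng(seed)
    xi = rng.uniform(-1.0, 1.0, n_train)
    y = np.array([model(x) for x in xi])
    V = np.polynomial.legendre.legvander(xi, degree)     # design matrix
    coef, *_ = np.linalg.lstsq(V, y, rcond=None)         # PC coefficients
    proxy = lambda x: np.polynomial.legendre.legval(x, coef)
    # explore the input space with many cheap proxy evaluations
    xs = rng.uniform(-1.0, 1.0, 50_000)
    ys = proxy(xs)
    return proxy, ys.mean(), ys.var()
```

For `model(x) = x**2` the exact statistics under a uniform input are mean 1/3 and variance 4/45, which the proxy-based sampling should reproduce closely.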

  2. Aerodynamic design of a space vehicle using the numerical simulation technique


    Yamamoto, Yukimitsu; Wada, Yasuhiro; Takanashi, Susumu; Ishiguro, Mitsuo; 山本 行光; 和田 安弘; 高梨 進; 石黒 満津夫


    Optimization of the aerodynamic configuration of a space vehicle 'HOPE' (H-2 Orbiting Plane) is conducted by using several numerical simulation codes in the transonic and hypersonic speed ranges. Design requirements are set on the longitudinal aerodynamic characteristics in the transonic speed range and the aerodynamic heating characteristics in the hypersonic speed range. This paper describes the procedure of the optimization of aerodynamic configurations by using the numerical simulation technique as an e...

  3. Comparison of phase noise simulation techniques on a BJT LC oscillator. (United States)

    Forbes, Leonard; Zhang, Chengwei; Zhang, Binglei; Chandra, Yudi


    The phase noise resulting from white and flicker noise in a bipolar junction transistor (BJT) LC oscillator is investigated. Large-signal transient time-domain SPICE simulations of phase noise resulting from the random-phase flicker and white noise in a 2 GHz BJT LC oscillator have been performed and demonstrated. The simulation results of this new technique are compared with Eldo RF and Spectre RF, which are based on linear circuit concepts, and with experimental results reported in the literature.

  4. Dynamic Simulations & Animations of the Classical Control Techniques with Linear Transformations


    Ahmet ALTINTAŞ; Güven, Mehmet


    Teaching and learning techniques using computer-based resources greatly improve the effectiveness and efficiency of the learning process. Currently, there are many simulation and animation packages in use, and some of them are developed for educational purposes. The dynamic simulations-animations (DSA) allow us to see the physical movement of the different pieces according to the modeled system. Education-purposed packages cannot be sufficiently flexible in different branches of sci...

  5. Analytical vs. Simulation Solution Techniques for Pulse Problems in Non-linear Stochastic Dynamics

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.

    Advantages and disadvantages of available analytical and simulation techniques for pulse problems in non-linear stochastic dynamics are discussed. First, random pulse problems, both those which do and do not lead to Markov theory, are presented. Next, the analytical and analytically-numerical tec...

  6. Agreement between radiographic and photographic trabecular patterns

    Energy Technology Data Exchange (ETDEWEB)

    Korstjens, C.M.; Geraets, W.G.M.; Stelt, P.F. van der [Dept. of Oral Radiology, Academic Centre for Dentistry, Amsterdam (Netherlands); Spruijt, R.J. [Div. of Psychosocial Research and Epidemiology, Netherlands Cancer Inst., Amsterdam (Netherlands); Mosekilde, L. [Dept. of Cell Biology, Univ. of Aarhus (Denmark)


    Purpose: It has been hypothesized that photographs can facilitate the interpretation of the radiographic characteristics of trabecular bone. The reliability of these photographic and radiographic approaches has been determined, as have various agreements between the two approaches and their correlations with biomechanical characteristics. Material and Methods: Fourteen vertebral bodies were obtained at autopsy from 6 women and 8 men aged 22-76 years. Photographs (n=28) and radiographs (n=28) were taken of midsagittal slices from the third lumbar vertebra. The radiographs and photographs were digitized and the geometric properties of the trabecular architecture were then determined with a digital image analysis technique. Information on the compressive strength and ash density of the vertebral body was also available. Results: The geometric properties of both radiographs and photographs could be measured with a high degree of reliability (Cronbach's α>0.85). Agreement between the radiographic and photographic approaches was mediocre as only the radiographic measurements showed insignificant correlations (p<0.05) with the biomechanical characteristics. We suggest that optical phenomena may result in the significant correlations between the photographs and the biomechanical characteristics. Conclusion: For digital image processing, radiography offers a superior description of the architecture of trabecular bone to that offered by photography. (orig.)

  7. Development of a Car Racing Simulator Game Using Artificial Intelligence Techniques

    Directory of Open Access Journals (Sweden)

    Marvin T. Chan


    Full Text Available This paper presents a car racing simulator game called Racer, in which the human player races a car against three game-controlled cars in a three-dimensional environment. The objective of the game is not to defeat the human player, but to provide the player with a challenging and enjoyable experience. To ensure that this objective can be accomplished, the game incorporates artificial intelligence (AI techniques, which enable the cars to be controlled in a manner that mimics natural driving. The paper provides a brief history of AI techniques in games, presents the use of AI techniques in contemporary video games, and discusses the AI techniques that were implemented in the development of Racer. A comparison of the AI techniques implemented in the Unity platform with traditional AI search techniques is also included in the discussion.

  8. Modeling and numerical techniques for high-speed digital simulation of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.


    Conventional computing methods are contrasted with newly developed high-speed and low-cost computing techniques for simulating normal and accidental transients in nuclear power plants. Six principles are formulated for cost-effective high-fidelity simulation with emphasis on modeling of transient two-phase flow coolant dynamics in nuclear reactors. Available computing architectures are characterized. It is shown that the combination of the newly developed modeling and computing principles with the use of existing special-purpose peripheral processors is capable of achieving low-cost and high-speed simulation with high-fidelity and outstanding user convenience, suitable for detailed reactor plant response analyses.

  9. Virtual X-ray imaging techniques in an immersive casting simulation environment (United States)

    Li, Ning; Kim, Sung-Hee; Suh, Ji-Hyun; Cho, Sang-Hyun; Choi, Jung-Gil; Kim, Myoung-Hee


    A computer code was developed to simulate radiographs of complex casting products in a CAVE™-like environment. The simulation is based on deterministic algorithms and ray-tracing techniques. The aim of this study is to examine CAD/CAE/CAM models at the design stage, to optimize the design and inspect predicted defective regions with fast speed, good accuracy and small numerical expense. The present work discusses the algorithms for the radiography simulation of the CAD/CAM model and proposes algorithmic solutions adapted from the ray-box intersection algorithm and the octree data structure specifically for radiographic simulation of the CAE model. The stereoscopic visualization of the full-size product in the immersive casting simulation environment, as well as the virtual X-ray images of castings, provides an effective tool for the design and evaluation of foundry processes by engineers and metallurgists.

  10. A Novel Interfacing Technique for Distributed Hybrid Simulations Combining EMT and Transient Stability Models

    Energy Technology Data Exchange (ETDEWEB)

    Shu, Dewu; Xie, Xiaorong; Jiang, Qirong; Huang, Qiuhua; Zhang, Chunpeng


    With the steady increase of power electronic devices and nonlinear dynamic loads in large-scale AC/DC systems, the traditional hybrid simulation method, which incorporates these components into a single EMT subsystem, causes great difficulty for network partitioning and significantly deteriorates simulation efficiency. To resolve these issues, a novel distributed hybrid simulation method is proposed in this paper. The key to realizing this method is a distinct interfacing technique, which includes: i) a new approach based on the two-level Schur complement to update the interfaces by taking full consideration of the couplings between different EMT subsystems; and ii) a combined interaction protocol to further improve the efficiency while guaranteeing the simulation accuracy. The advantages of the proposed method in terms of both efficiency and accuracy have been verified by using it for the simulation study of an AC/DC hybrid system including a two-terminal VSC-HVDC and nonlinear dynamic loads.

  11. Reflection on photographs. (United States)

    Brand, Gabrielle; McMurray, Anne


    Nursing students' exposure to clinical placements with older adults is instrumental in helping them adopt positive attitudes toward care of that population. This qualitative pilot study analyzed perceptions and expectations of a group of first-year students prior to a clinical placement with older adults. A photo-elicitation technique, involving viewing of realistic photographs of older adults being cared for, was used to help students clarify expectations. This was followed by thematic analysis of their perceptions and expectations. Analysis revealed five main themes: Dissecting What It Means to Be a Nurse, Revisioning Therapeutic Relationships in Terms of Dignity, Youthful Reflection on the Differences Between Young and Old, Feeling Challenged and Confronted, and Experiencing Sensitivity and Awkwardness Toward Older Adults' Nakedness. Engagement with images of older adults encouraged students to anticipate their clinical placement in an aged care setting in a more meaningful, reflective way than they may have done without prior exposure, suggesting a need for realistic pre-practice education. Copyright 2009, SLACK Incorporated.

  12. Generating Inviscid and Viscous Fluid Flow Simulations over a Surface Using a Quasi-simultaneous Technique (United States)

    Sturdza, Peter (Inventor); Martins-Rivas, Herve (Inventor); Suzuki, Yoshifumi (Inventor)


    A fluid-flow simulation over a computer-generated surface is generated using a quasi-simultaneous technique. The simulation includes a fluid-flow mesh of inviscid and boundary-layer fluid cells. An initial fluid property for an inviscid fluid cell is determined using an inviscid fluid simulation that does not simulate fluid viscous effects. An initial boundary-layer fluid property for a boundary-layer fluid cell is determined using the initial fluid property and a viscous fluid simulation that simulates fluid viscous effects. An updated boundary-layer fluid property is determined for the boundary-layer fluid cell using the initial fluid property, initial boundary-layer fluid property, and an interaction law. The interaction law approximates the inviscid fluid simulation using a matrix of aerodynamic influence coefficients computed using a two-dimensional surface panel technique and a fluid-property vector. An updated fluid property is determined for the inviscid fluid cell using the updated boundary-layer fluid property.

  13. Computer imaging software for profile photograph analysis. (United States)

    Tollefson, Travis T; Sykes, Jonathan M


    To describe a novel calibration technique for photographs of different sizes and to test a new method of chin evaluation in relation to established analysis measurements. A photograph analysis and medical record review of 14 patients who underwent combined rhinoplasty and chin correction at an academic center. Patients undergoing concurrent orthognathic surgery, rhytidectomy, or submental liposuction were excluded. Preoperative and postoperative digital photographs were analyzed using computer imaging software with a new method, the soft tissue porion to pogonion distance, and with established measurements, including the cervicomental angle, the mentocervical angle, and the facial convexity angle. The porion to pogonion distance consistently increased after the chin correction procedure (more in the osseous group). All photograph angle measurements changed toward the established normal range postoperatively. Surgery for facial disharmony requires artistic judgment and objective evaluation. Although 3-dimensional video analysis of the face seems promising, its clinical use is limited by cost. For surgeons who use computer imaging software, analysis of profile photographs is the most valuable tool. Even when preoperative and postoperative photographs are of different sizes, relative distance comparisons are possible with a new calibration technique using the constant facial landmarks, the porion and the pupil. The porion-pogonion distance is a simple reproducible measurement that can be used along with established soft tissue measurements as a guide for profile facial analysis.
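The calibration idea, normalizing by a distance between landmarks that stay constant across photographs, can be sketched minimally. The landmark names and coordinates below are illustrative, not taken from the study's software.

```python
import math

def calibrated_ratio(landmarks, a, b, ref=("porion", "pupil")):
    """Express the a-b distance in units of a reference distance assumed
    constant across photographs (here porion-pupil), so that photographs
    of different sizes can be compared.  `landmarks` maps landmark names
    to (x, y) pixel coordinates."""
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(landmarks[a], landmarks[b]) / dist(landmarks[ref[0]], landmarks[ref[1]])
```

Because the ratio is scale-invariant, the same face photographed at two different magnifications yields the same value, which is the point of the calibration.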

  14. Accelerating all-atom MD simulations of lipids using a modified virtual-sites technique

    DEFF Research Database (Denmark)

    Loubet, Bastien; Kopec, Wojciech; Khandelia, Himanshu


    We present two new implementations of the virtual sites technique which completely suppresses the degrees of freedom of the hydrogen atoms in a lipid bilayer allowing for an increased time step of 5 fs in all-atom simulations of the CHARMM36 force field. One of our approaches uses the derivation ...

  15. Simulation of Strong Ground Motion of the 2009 Bhutan Earthquake Using Modified Semi-Empirical Technique (United States)

    Sandeep; Joshi, A.; Lal, Sohan; Kumar, Parveen; Sah, S. K.; Vandana; Kamal


    On 21st September 2009, an earthquake of magnitude Mw 6.1 occurred in eastern Bhutan. This earthquake caused serious damage to residential areas and was widely felt in the Bhutan Himalaya and its adjoining area. We estimated the source model of this earthquake using a modified semi-empirical technique. Several locations of the nucleation point in the rupture plane were considered and finalized based on the minimum root-mean-square error of waveform comparison. In the present work, observed and simulated waveforms have been compared at all eight stations. Comparison of the horizontal components of actual and simulated records at these stations confirms the estimated parameters of the final rupture model and the efficacy of the modified semi-empirical technique (Joshi et al., Nat Hazards 64:1029-1054, 2012b) of strong ground motion simulation.

  16. Optimizing Availability of a Framework in Series Configuration Utilizing Markov Model and Monte Carlo Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui


    Full Text Available This research work is aimed at optimizing the availability of a framework comprising two units linked together in series configuration utilizing Markov Model and Monte Carlo (MC) Simulation techniques. In this article, effort has been made to develop a maintenance model that incorporates three distinct states for each unit, while taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, with as well as without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e., the Markov Model. Validation of the results is later carried out with the help of MC Simulation. In addition, MC Simulation based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov Model.
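The Markov-versus-Monte-Carlo cross-check can be sketched for the simplest related case: two independent two-state units in series with exponential failure and repair. This is a toy stand-in for the three-state model of the article, and the rates and the discrete-time MC scheme are illustrative.

```python
import random

def series_availability_markov(lam, mu):
    """Steady-state availability of two independent units in series:
    each unit is up with probability mu/(lam+mu), and the series system
    is up only when both units are up."""
    a = mu / (lam + mu)
    return a * a

def series_availability_mc(lam, mu, dt=0.02, steps=500_000, seed=0):
    """Monte Carlo cross-check: step two independent two-state Markov
    chains in discrete time (fail with prob lam*dt, repair with prob
    mu*dt) and measure the fraction of steps with both units up."""
    rng = random.Random(seed)
    up = [True, True]
    both = 0
    for _ in range(steps):
        for i in (0, 1):
            if up[i]:
                up[i] = rng.random() >= lam * dt   # survive this step?
            else:
                up[i] = rng.random() < mu * dt     # repaired this step?
        both += up[0] and up[1]
    return both / steps
```

With lam=0.1 and mu=0.9 the analytic availability is 0.9² = 0.81, and the simulation should agree to within sampling error.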

  17. Retrieval of Similar Objects in Simulation Data Using Machine Learning Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Cantu-Paz, E; Cheung, S-C; Kamath, C


    Comparing the output of a physics simulation with an experiment is often done by visually comparing the two outputs. In order to determine which simulation is a closer match to the experiment, more quantitative measures are needed. This paper describes our early experiences with this problem by considering the slightly simpler problem of finding objects in an image that are similar to a given query object. Focusing on a dataset from a fluid mixing problem, we report on our experiments using classification techniques from machine learning to retrieve the objects of interest in the simulation data. The early results reported in this paper suggest that machine learning techniques can retrieve more objects that are similar to the query than distance-based similarity methods.
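The contrast between distance-based retrieval and learning-based retrieval can be sketched minimally. A nearest-centroid rule stands in for the classifiers studied in the paper; the feature vectors and user-supplied labels below are synthetic.

```python
import numpy as np

def retrieve_by_distance(features, query_vec, k=5):
    """Distance-based retrieval: indices of the k objects whose feature
    vectors are closest to the query vector."""
    d = np.linalg.norm(features - query_vec, axis=1)
    return np.argsort(d)[:k]

def retrieve_by_classifier(features, labeled_idx, labels, k=5):
    """Learning-based retrieval sketch: fit a nearest-centroid classifier
    on a few objects the user marked similar (1) or dissimilar (0), then
    rank all objects by margin toward the 'similar' centroid."""
    feats = features[labeled_idx]
    centroid_pos = feats[labels == 1].mean(axis=0)
    centroid_neg = feats[labels == 0].mean(axis=0)
    score = (np.linalg.norm(features - centroid_neg, axis=1)
             - np.linalg.norm(features - centroid_pos, axis=1))
    return np.argsort(-score)[:k]
```

On two well-separated synthetic clusters, both approaches recover the cluster containing the query.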

  18. Advanced particle-in-cell simulation techniques for modeling the Lockheed Martin Compact Fusion Reactor (United States)

    Welch, Dale; Font, Gabriel; Mitchell, Robert; Rose, David


    We report on particle-in-cell developments in the study of the Compact Fusion Reactor. Millisecond, two- and three-dimensional simulations (cubic-meter volume) of confinement and neutral beam heating of the magnetic confinement device require accurate representation of the complex orbits, near-perfect energy conservation, and significant computational power. In order to determine the initial plasma fill and neutral beam heating, these simulations include ionization, elastic, and charge-exchange hydrogen reactions. To this end, we are pursuing fast electromagnetic kinetic modeling algorithms, including two implicit techniques and a hybrid quasi-neutral algorithm with kinetic ions. The kinetic modeling includes use of the Poisson-corrected direct implicit, magnetic implicit, as well as second-order cloud-in-cell techniques. The hybrid algorithm, which ignores electron inertial effects, is two orders of magnitude faster than the kinetic approaches but not as accurate with respect to confinement. The advantages and disadvantages of these techniques will be presented. Funded by Lockheed Martin.

  19. Wind Turbine Rotor Simulation via CFD Based Actuator Disc Technique Compared to Detailed Measurement

    Directory of Open Access Journals (Sweden)

    Esmail Mahmoodi


    Full Text Available In this paper, a generalized Actuator Disc (AD) is used to model the wind turbine rotor of the MEXICO experiment, a collaborative European wind turbine project. The AD model, a combination of the CFD technique and User Defined Functions codes (UDF), the so-called UDF/AD model, is used to simulate loads and performance of the rotor in three different wind speed tests. The distributed force on the blade and the thrust and power production of the rotor, as important design parameters of wind turbine rotors, are the focus of the modeling. A developed Blade Element Momentum (BEM) theory as a code-based numerical technique, as well as a full rotor simulation, both from the literature, are included in the results for comparison and discussion. The output of all techniques is compared to detailed measurements for validation, which leads us to the final conclusions.

  20. USGS Photographic Library (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey Denver Library maintains a collection of over 400,000 photographs taken during geologic studies of the United States and its territories...

  1. Validation of a novel technique for creating simulated radiographs using computed tomography datasets. (United States)

    Mendoza, Patricia; d'Anjou, Marc-André; Carmel, Eric N; Fournier, Eric; Mai, Wilfried; Alexander, Kate; Winter, Matthew D; Zwingenberger, Allison L; Thrall, Donald E; Theoret, Christine


    Understanding radiographic anatomy and the effects of varying patient and radiographic tube positioning on image quality can be a challenge for students. The purposes of this study were to develop and validate a novel technique for creating simulated radiographs using computed tomography (CT) datasets. A DICOM viewer (ORS Visual) plug-in was developed with the ability to move and deform cuboidal volumetric CT datasets, and to produce images simulating the effects of tube-patient-detector distance and angulation. Computed tomographic datasets were acquired from two dogs, one cat, and one horse. Simulated radiographs of different body parts (n = 9) were produced using different angles to mimic conventional projections, before actual digital radiographs were obtained using the same projections. These studies (n = 18) were then submitted to 10 board-certified radiologists who were asked to score visualization of anatomical landmarks, depiction of patient positioning, realism of distortion/magnification, and image quality. No significant differences between simulated and actual radiographs were found for anatomic structure visualization and patient positioning in the majority of body parts. For the assessment of radiographic realism, no significant differences were found between simulated and digital radiographs for canine pelvis, equine tarsus, and feline abdomen body parts. Overall, image quality and contrast resolution of simulated radiographs were considered satisfactory. Findings from the current study indicated that radiographs simulated using this new technique are comparable to actual digital radiographs. Further studies are needed to apply this technique in developing interactive tools for teaching radiographic anatomy and the effects of varying patient and tube positioning. © 2013 American College of Veterinary Radiology.

  2. Spectral element filtering techniques for large eddy simulation with dynamic estimation

    CERN Document Server

    Blackburn, H M


    Spectral element methods have previously been successfully applied to direct numerical simulation of turbulent flows with moderate geometrical complexity and low to moderate Reynolds numbers. A natural extension of application is to large eddy simulation of turbulent flows, although there has been little published work in this area. One of the obstacles to such application is the ability to deal successfully with turbulence modelling in the presence of solid walls in arbitrary locations. An appropriate tool with which to tackle the problem is dynamic estimation of turbulence model parameters, but while this has been successfully applied to simulation of turbulent wall-bounded flows, typically in the context of spectral and finite volume methods, there have been no published applications with spectral element methods. Here, we describe approaches based on element-level spectral filtering, couple these with the dynamic procedure, and apply the techniques to large eddy simulation of a prototype wall-bounded turb...

  3. Validation of a low dose simulation technique for computed tomography images.

    Directory of Open Access Journals (Sweden)

    Daniela Muenzel

    Full Text Available PURPOSE: Evaluation of a new software tool for generation of simulated low-dose computed tomography (CT) images from an original higher dose scan. MATERIALS AND METHODS: Original CT scan data (100 mAs, 80 mAs, 60 mAs, 40 mAs, 20 mAs, 10 mAs; 100 kV) of a swine were acquired (approved by the regional governmental commission for animal protection). Simulations of CT acquisition with a lower dose (simulated 10-80 mAs) were calculated using a low-dose simulation algorithm. The simulations were compared to the originals of the same dose level with regard to density values and image noise. Four radiologists assessed the realistic visual appearance of the simulated images. RESULTS: Image characteristics of simulated low-dose scans were similar to the originals. Mean overall discrepancy of image noise and CT values was -1.2% (range -9% to 3.2%) and -0.2% (range -8.2% to 3.2%), respectively (p>0.05). Confidence intervals of discrepancies ranged between 0.9-10.2 HU (noise) and 1.9-13.4 HU (CT values), without significant differences (p>0.05). Subjective observer evaluation of image appearance showed no visually detectable difference. CONCLUSION: Simulated low-dose images showed excellent agreement with the originals concerning image noise, CT density values, and subjective assessment of visual appearance. An authentic low-dose simulation opens up opportunities for staff education, protocol optimization, and the introduction of new techniques.
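The record does not give the algorithm itself; a common approach to this kind of simulation is to inject zero-mean noise into the higher-dose data so that the total noise matches the target mAs level, using the rule that quantum noise variance scales roughly as 1/mAs. A minimal sketch under that assumption (the function name and flat-list image representation are illustrative):

```python
import math
import random

def simulate_low_dose(values_hu, mas_original, mas_target,
                      noise_std_original, seed=0):
    """Degrade higher-dose CT values to a simulated lower dose by adding
    Gaussian noise. Since noise variance scales roughly as 1/mAs, the
    extra noise needed is sigma_orig * sqrt(mAs_orig / mAs_target - 1)."""
    if not mas_target < mas_original:
        raise ValueError("target dose must be below the original dose")
    sigma_extra = noise_std_original * math.sqrt(
        mas_original / mas_target - 1.0)
    rng = random.Random(seed)
    return [v + rng.gauss(0.0, sigma_extra) for v in values_hu]
```

Real implementations work on the projection (sinogram) data rather than the reconstructed image, which is one reason validation studies such as this one are needed.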

  4. Metabolic rate control during extravehicular activity simulations and measurement techniques during actual EVAS (United States)

    Horrigan, D. J.


    A description of the methods used to control and measure metabolic rate during ground simulations is given. Work levels attained at the Space Environment Simulation Laboratory are presented. The techniques and data acquired during ground simulations are described and compared with inflight procedures. Data from both the Skylab and Apollo Programs were utilized, and emphasis is given to the methodology, both in simulation and during flight. The basic techniques of work rate assessment are described. They include oxygen consumption, which was useful for averages over long time periods; heart rate correlations based on laboratory calibrations; and liquid cooling garment temperature changes. The relative accuracy of these methods, as well as the methods of real-time monitoring at the Mission Control Center, is discussed. The advantages and disadvantages of each of the metabolic measurement techniques are discussed. Particular emphasis is given to the problems of utilizing oxygen decrement for short time periods and heart rate at low work levels. A summary is given of the effectiveness of work rate control and measurement, and current plans for future EVA monitoring are discussed.

  5. Simulation of white light generation and near light bullets using a novel numerical technique (United States)

    Zia, Haider


    An accurate and efficient simulation has been devised, employing a new numerical technique to simulate the derivative generalised non-linear Schrödinger equation in all three spatial dimensions and time. The simulation models all pertinent effects, such as self-steepening and plasma, for the non-linear propagation of ultrafast optical radiation in bulk material. Simulation results are compared to published experimental spectral data of an example ytterbium aluminum garnet system at 3.1 μm radiation and agree to within a factor of 5. The simulation shows that there is a stability point near the end of the 2 mm crystal where a quasi-light bullet (spatio-temporal soliton) is present. Within this region, the pulse is collimated at a reduced diameter (factor of ∼2) and there exists a near temporal soliton at the spatial center. The temporal intensity within this stable region is compressed by a factor of ∼4 compared to the input. This study shows that the simulation highlights new physical phenomena, based on the interplay of various linear, non-linear and plasma effects, that go beyond the experiment and is thus integral to achieving accurate designs of white light generation systems for optical applications. An adaptive error reduction algorithm tailor-made for this simulation is also presented in the appendix.

  6. Experimental and data analysis techniques for deducing collision-induced forces from photographic histories of engine rotor fragment impact/interaction with a containment ring (United States)

    Yeghiayan, R. P.; Leech, J. W.; Witmer, E. A.


    An analysis method termed TEJ-JET is described whereby measured transient elastic and inelastic deformations of an engine-rotor fragment-impacted structural ring are analyzed to deduce the transient external forces experienced by that ring as a result of fragment impact and interaction with the ring. Although the theoretical feasibility of the TEJ-JET concept was established, its practical feasibility when utilizing experimental measurements of limited precision and accuracy remains to be established. The experimental equipment and the techniques (high-speed motion photography) employed to measure the transient deformations of fragment-impacted rings are described. Sources of error and data uncertainties are identified. Techniques employed to reduce data reading uncertainties and to correct the data for optical-distortion effects are discussed. These procedures, including spatial smoothing of the deformed ring shape by Fourier series and timewise smoothing by Gram polynomials, are applied illustratively to recent measurements involving the impact of a single T58 turbine rotor blade against an aluminum containment ring. Plausible predictions of the fragment-ring impact/interaction forces are obtained by one branch of this TEJ-JET method; however, a second branch of this method, which provides an independent estimate of these forces, remains to be evaluated.

  7. Improved simulation based HR-EBSD procedure using image gradient based DIC techniques. (United States)

    Alkorta, Jon; Marteleur, Matthieu; Jacques, Pascal J


    Conventional HR-EBSD is attracting much interest due to its ability to measure relative crystal misorientations and microstresses with great accuracy. However, this technique requires the use of simulated patterns in order to obtain absolute values of crystal orientation and stress and thus expand its use to intergranular analyses. Simulation-based approaches have shown many limitations due to poor correlation with the real patterns, especially when Bragg simulations are considered. This paper presents an improved algorithm based on gradient-based correlation techniques that makes simulation-based HR-EBSD possible. Based on this new algorithm, a new pattern center calibration procedure is proposed and validated. Also, a new hybrid procedure that combines simulation-based HR-EBSD with conventional HR-EBSD is presented, enabling an absolute determination of both orientations and stresses with improved accuracy. The hybrid HR-EBSD is used to analyze the martensitic transformation induced by plastic deformation in an as-quenched Ti-12wt.%Mo alloy. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Translating the simulation of procedural drilling techniques for interactive neurosurgical training. (United States)

    Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J


    Through previous efforts we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data help drive an interactive multisensory, i.e., visual (stereo), aural (stereo), and tactile, simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. Our objective was to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods to provide objective, quantitative and automated assessment for the residents. We conclude with a discussion of our experiences, reporting preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principle and define the steps to integrate and validate the system as an adjunct to the neurosurgical curriculum.

  9. Modeling and simulation techniques in extreme nonlinear optics of gaseous and condensed media. (United States)

    Kolesik, M; Moloney, J V


    Computer simulation techniques for extreme nonlinear optics are reviewed with emphasis on the high light-intensity regimes in which both bound and freed electronic states contribute to the medium response and thus affect the optical pulse dynamics. The first part concentrates on the optical pulse propagation modeling, and provides a classification of various approaches to optical-field evolution equations. Light-matter interaction models are reviewed in the second part, which concentrates on methods that can be integrated with time- and space-resolved simulations encompassing realistic experimental scenarios.

  10. Around the laboratories: Dubna: Physics results and progress on bubble chamber techniques; Stanford (SLAC): Operation of a very rapid cycling bubble chamber; Daresbury: Photographs of visitors to the Laboratory; Argonne: Charge exchange injection tests into the ZGS in preparation for a proposed Booster

    CERN Multimedia



  11. Virtual simulation experiment in the course Laser Principles and Techniques for undergraduates (United States)

    Li, Kun; Wu, Bo


    The course Laser Principles and Techniques for undergraduates is a multi-physics subject whose main contents are the basic principles of lasers, laser modulation techniques, Q-switching techniques, etc. In order to help students understand the complex theory and to integrate the theory with engineering practice, we developed a virtual simulation platform/software. This platform consists of three main modules (laser generation, laser propagation and laser controlling), which can be subdivided into eight secondary modules, including laser output characteristics, laser resonator, laser modulation, frequency conversion, etc. Each module has its input and output parameters and can be modified by the user. The theoretical models and the algorithms are introduced in this article. The output characteristics of the relaxation oscillation process are presented as an example of the simulation results.

  12. Optimisation of 12 MeV electron beam simulation using variance reduction technique (United States)

    Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul


    Monte Carlo (MC) simulation for electron beam radiotherapy consumes a long computation time. An algorithm called the variance reduction technique (VRT) was implemented in MC to speed up this duration. This work focused on optimisation of the VRT parameters, namely electron range rejection and particle history. The EGSnrc MC source code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model with non-VRT parameters. The validated MC model simulation was repeated by applying the VRT parameter (electron range rejection), controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10⁷ particle histories. The 5 MeV range rejection generated the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. Thus, 5 MeV electron range rejection was utilized in the particle history analysis, which ranged from 7.5 × 10⁷ to 20 × 10⁷. In this study, with a 5 MeV electron cut-off and 10 × 10⁷ particle histories, the simulation was four times faster than the non-VRT calculation, with 1% deviation. Proper understanding and use of VRT can significantly reduce MC electron beam calculation duration while preserving its accuracy.
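The logic of electron range rejection can be sketched as follows: an electron below the global cut-off energy whose residual range cannot carry it to any region of interest is terminated early, with its remaining energy deposited locally. This sketch uses the crude rule of thumb that an electron's range in water is roughly E/2 cm per MeV; the actual EGSnrc implementation uses tabulated restricted ranges, and the function name is an assumption:

```python
def should_reject(energy_mev, distance_to_roi_cm, global_cutoff_mev):
    """Range-rejection test sketch: terminate an electron whose energy
    is below the global cut-off and whose approximate residual range in
    water (~E/2 cm per MeV, a rough rule of thumb) cannot reach the
    nearest region of interest."""
    if energy_mev >= global_cutoff_mev:
        return False  # energetic electrons are always transported
    residual_range_cm = energy_mev / 2.0  # crude CSDA range estimate
    return residual_range_cm < distance_to_roi_cm
```

A higher cut-off rejects more electron histories, which is why the 5 MeV setting in the study gave the largest speed-up; the accompanying 1% deviation shows the accuracy cost stayed small.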

  13. Assessment of robotic patient simulators for training in manual physical therapy examination techniques. (United States)

    Ishikawa, Shun; Okamoto, Shogo; Isogai, Kaoru; Akiyama, Yasuhiro; Yanagihara, Naomi; Yamada, Yoji


    Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard.

  14. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    Energy Technology Data Exchange (ETDEWEB)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V. [RFNC-VNIIEF (Russian Federation)


    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extracting complexes belong to the energy intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place at pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of a failure's occurrence and development, to assess its consequences and to give recommendations to prevent it. Besides the investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostic results and in the further construction of mathematical prognostic simulations of an object's behavior in the period between two inspections. In solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, mechanics of continuous media, chemical macro-kinetics and optimization are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated on calculations of the simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites.
It is worth noting that in the long years of work there has been established a fruitful and effective

  15. An object localization optimization technique in medical images using plant growth simulation algorithm


    Bhattacharjee, Deblina; Paul, Anand; Kim, Jeong Hong; Kim, Mucheol


    The analysis of leukocyte images has drawn interest from fields of both medicine and computer vision for quite some time, where different techniques have been applied to automate the process of manual analysis and classification of such images. Manual analysis of blood samples to identify leukocytes is time-consuming and susceptible to error due to the different morphological features of the cells. In this article, the nature-inspired plant growth simulation algorithm has been applied to optim...

  16. Development of a Car Racing Simulator Game Using Artificial Intelligence Techniques


    Chan, Marvin T.; Chan, Christine W.; Gelowitz, Craig


    This paper presents a car racing simulator game called Racer, in which the human player races a car against three game-controlled cars in a three-dimensional environment. The objective of the game is not to defeat the human player, but to provide the player with a challenging and enjoyable experience. To ensure that this objective can be accomplished, the game incorporates artificial intelligence (AI) techniques, which enable the cars to be controlled in a manner that mimics natural driving. ...

  17. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities

    Directory of Open Access Journals (Sweden)

    Hayder Amer


    Full Text Available Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads’ length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO2 emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.
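The core of such an approach, simulated annealing over candidate routes with a weighted-sum cost built from the two attributes named above (travel time derived from average speed, plus road length), can be sketched as follows. The graph representation, weights, and cooling parameters are illustrative assumptions, not the paper's:

```python
import math
import random

def route_cost(path, lengths, speeds, w_time=0.5, w_dist=0.5):
    """Weighted-sum cost of a path: travel time (length / avg speed)
    plus distance, with illustrative weights."""
    time = sum(lengths[e] / speeds[e] for e in path)
    dist = sum(lengths[e] for e in path)
    return w_time * time + w_dist * dist

def anneal(paths, lengths, speeds, t0=10.0, alpha=0.95, steps=200, seed=1):
    """Pick a low-cost route from a candidate set by simulated annealing:
    accept worse candidates with probability exp(-delta/T) while the
    temperature T cools geometrically."""
    rng = random.Random(seed)
    current = rng.choice(paths)
    best = current
    t = t0
    for _ in range(steps):
        candidate = rng.choice(paths)
        delta = (route_cost(candidate, lengths, speeds)
                 - route_cost(current, lengths, speeds))
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = candidate
        if route_cost(current, lengths, speeds) < route_cost(best, lengths, speeds):
            best = current
        t *= alpha
    return best
```

The weighted-sum variant above is the simplest of the three compared formulations; the TOPSIS variant replaces `route_cost` with a ranking by similarity to an ideal solution.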

  18. Effect of Simulation Techniques and Lecture Method on Students' Academic Performance in Mafoni Day Secondary School Maiduguri, Borno State, Nigeria (United States)

    Bello, Sulaiman; Ibi, Mustapha Baba; Bukar, Ibrahim Bulama


    The study examined the effect of simulation technique and lecture method on students' academic performance in Mafoni Day Secondary School, Maiduguri. The study used both simulation technique and lecture methods of teaching at the basic level of education in the teaching/learning environment. The study aimed at determining the best predictor among…

  19. Evaluating a Bayesian approach to improve accuracy of individual photographic identification methods using ecological distribution data

    Directory of Open Access Journals (Sweden)

    Richard Stafford


    Full Text Available Photographic identification of individual organisms can be possible from natural body markings. Data from photo-ID can be used to estimate important ecological and conservation metrics such as population sizes, home ranges or territories. However, poor quality photographs or less well-studied individuals can result in a non-unique ID, potentially confounding several similar looking individuals. Here we present a Bayesian approach that uses known data about previous sightings of individuals at specific sites as priors to help assess the problems of obtaining a non-unique ID. Using a simulation of individuals with different confidence of correct ID, we evaluate the accuracy of Bayesian-modified (posterior) probabilities. However, in most cases, the accuracy of identification decreases. Although this technique is unsuccessful, it does demonstrate the importance of computer simulations in testing such hypotheses in ecology.
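The Bayes update at the heart of such an approach can be sketched as weighting each candidate's photo-match likelihood by a site-specific sighting prior and renormalizing. The names and data shapes below are assumptions for illustration, not the authors' code:

```python
def posterior_ids(match_scores, site_priors):
    """Combine photo-match likelihoods with site-based sighting priors.

    match_scores: {individual: likelihood of this photo given individual}
    site_priors:  {individual: prior probability of a sighting of this
                   individual at this site}
    Returns normalized posterior probabilities via Bayes' rule."""
    unnorm = {k: match_scores[k] * site_priors.get(k, 0.0)
              for k in match_scores}
    total = sum(unnorm.values())
    if total == 0:
        return {k: 0.0 for k in unnorm}
    return {k: v / total for k, v in unnorm.items()}
```

When two individuals match a photo equally well, the site prior dominates the posterior, which is exactly the mechanism whose reliability the simulation study puts to the test.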

  20. Approaching photographic ekphrasis


    Castro, Azucena


    My paper analyses the ekphrastic sequences in postmodern writer Tomás Eloy Martínez’s novel, Santa Evita (1995), in which the photographic descriptions of Eva Perón are a recurring theme. The photographic ekphrasis in this novel reflects the various meanings and messages photography carries, but it also manifests the appropriation of photographic tools and mechanisms by literary practice to create a deceitful sensation of reality in the reader-spectator. The fictive world is created by...

  1. A Moving Window Technique in Parallel Finite Element Time Domain Electromagnetic Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Lie-Quan; Candel, Arno; Ng, Cho; Ko, Kwok; /SLAC


    A moving window technique for the finite element time domain (FETD) method is developed to simulate the propagation of electromagnetic waves induced by the transit of a charged particle beam inside large and long structures. The window moving along with the beam in the computational domain adopts high-order finite-element basis functions through p refinement and/or a high-resolution mesh through h refinement so that a sufficient accuracy is attained with substantially reduced computational costs. Algorithms to transfer discretized fields from one mesh to another, which are the key to implementing a moving window in a finite-element unstructured mesh, are presented. Numerical experiments are carried out using the moving window technique to compute short-range wakefields in long accelerator structures. The results are compared with those obtained from the normal FETD method and the advantages of using the moving window technique are discussed.
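The mesh-to-mesh field transfer named above is the key ingredient of the moving window. In one dimension with nodal (linear) basis functions, it reduces to interpolating the old discretized solution at the new node positions; this is a deliberately simplified sketch of that idea, whereas the actual FETD transfer works on unstructured 3-D meshes with high-order bases:

```python
def transfer_field(old_x, old_f, new_x):
    """Transfer a discretized field between two 1-D meshes by linear
    interpolation of nodal values (sketch of the mesh-to-mesh step).
    old_x must be sorted ascending; values outside it are clamped."""
    out = []
    for x in new_x:
        if x <= old_x[0]:
            out.append(old_f[0])
            continue
        if x >= old_x[-1]:
            out.append(old_f[-1])
            continue
        # locate the old element containing x
        i = max(j for j in range(len(old_x) - 1) if old_x[j] <= x)
        t = (x - old_x[i]) / (old_x[i + 1] - old_x[i])
        out.append((1 - t) * old_f[i] + t * old_f[i + 1])
    return out
```

As the window advances with the beam, the field on the trailing part of the old mesh is discarded and the leading part of the new mesh is filled with the undisturbed (zero) field, so only the overlap region needs this transfer.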

  2. Extension of the Viscous Collision Limiting Direct Simulation Monte Carlo Technique to Multiple Species (United States)

    Liechty, Derek S.; Burt, Jonathan M.


    There are many flow fields that span a wide range of length scales, where regions of both rarefied and continuum flow exist and neither direct simulation Monte Carlo (DSMC) nor computational fluid dynamics (CFD) provides the appropriate solution everywhere. Recently, a new viscous collision limited (VCL) DSMC technique was proposed to incorporate effects of physical diffusion into collision limiter calculations to make the low Knudsen number regime, normally limited to CFD, more tractable for an all-particle technique. The original work was derived for a single-species gas. The current work extends the VCL-DSMC technique to gases with multiple species. Similar derivations were performed to equate numerical and physical transport coefficients, but a more rigorous treatment of determining the mixture viscosity is applied. In the original work, consideration was given to internal energy non-equilibrium, and this is also extended in the current work to chemical non-equilibrium.

  3. Flight Behaviors of a Complex Projectile Using a Coupled Computational Fluid Dynamics (CFD)-based Simulation Technique: Free Motion (United States)


    The paper “Flight Behaviors of a Complex Projectile Using a Coupled CFD-based Simulation Technique: Free Motion” describes an approach that involves the coupling of CFD and rigid body dynamics (RBD) codes for the simulation of projectile free flight motion in a time-accurate manner.

  4. The evolution of simulation techniques for dynamic bone tissue engineering in bioreactors. (United States)

    Vetsch, Jolanda Rita; Müller, Ralph; Hofmann, Sandra


    Bone tissue engineering aims to overcome the drawbacks of current bone regeneration techniques in orthopaedics. Bioreactors are widely used in the field of bone tissue engineering, as they help support efficient nutrition of cultured cells with the possible combination of applying mechanical stimuli. Beneficial influencing parameters of in vitro cultures are difficult to find and are mostly determined by trial and error, which is associated with significant time and money spent. Mathematical simulations can support the finding of optimal parameters. Simulations have evolved over the last 20 years from simple analytical models to complex and detailed computational models. They allow researchers to simulate the mechanical as well as the biological environment experienced by cells seeded on scaffolds in a bioreactor. Based on the simulation results, it is possible to give recommendations about specific parameters for bone bioreactor cultures, such as scaffold geometries, scaffold mechanical properties, the level of applied mechanical loading or nutrient concentrations. This article reviews the evolution in simulating various aspects of dynamic bone culture in bioreactors and reveals future research directions. Copyright © 2013 John Wiley & Sons, Ltd.


    Directory of Open Access Journals (Sweden)

    Ye. A. Kolesnikova


    Full Text Available The purpose of this study is to introduce a new educational method for gynecologists to master the technique of laparoscopic surgery in cases of ectopic pregnancy. This method involves using the computer simulation platform “Lap mentor” (Simbionix, USA). Thirty gynecologists, who had no experience of independently performing laparoscopic gynecological surgery, were randomized into 2 groups of 15 people. Laparoscopic technique in both groups was mastered by performing operations on different clinical variants of ectopic pregnancy on a computer simulator. But doctors from the second group, according to the proposed learning method, also performed additional exercises aimed at developing specific laparoscopic skills (work with the camera, control of one or two instruments, separation of tissue using scissors and endosurgical monopolar electrodes). Comparison of the groups at the final tests showed that gynecologists whose training included exercises to develop laparoscopy skills showed significantly greater success in the performance of control tasks. These doctors performed all surgical techniques faster and took less time to perform the operation than their counterparts in the comparison group. Along with this, the movements of gynecologists from the second group were more precise and accurate, accompanied by a smaller number of vascular and organ injuries than in the comparison group. Thus, application of the proposed method of mastering laparoscopic skills in gynecology, including the performance of special training exercises with virtual operations, can significantly improve the surgical technique of specialists and their professional competence. Skills obtained using this educational method are of higher quality compared with the experience gained by simply repeating the operation on a computer simulator.

  6. Cloud chamber photographs of the cosmic radiation

    CERN Document Server

    Rochester, George Dixon


    Cloud Chamber Photographs of the Cosmic Radiation focuses on cloud chamber and photographic emulsion wherein the tracks of individual subatomic particles of high energy are studied. The publication first offers information on the technical features of operation and electrons and cascade showers. Discussions focus on the relationship in time and space of counter-controlled tracks; techniques of internal control of the cloud chamber; cascade processes with artificially-produced electrons and photons; and nuclear interaction associated with an extensive shower. The manuscript then elaborates on

  7. Does Taking Photographs Help? (United States)

    Hand, Sarah


    Since many people tend to use photographs as memory anchors, this author decided she wanted to know whether the process of capturing and manipulating an image taken during a learning activity would act as a memory anchor for children's visual, auditory and kinaesthetic memories linked to their cognitive learning at the time. In plain English,…


    Berry, F.G.


    An improved photographic developer is presented, having very high energy development, fine grain characteristics and a long shelf life. These characteristics are obtained by the use of aminoacetic acid in the developer, the other constituents of which are: sodium sulfite, hydroquinone, sodium borate, boric acid, potassium bromide and 1-phenyl-3-pyrazolidone.

  9. Numerical simulation about orthogonal single frequency dithering technique used in tilt control of fiber laser array (United States)

    Zhang, Zhixin; Zhi, Dong; Ma, Yanxing; Wang, Xiaolin; Zhou, Pu; Si, Lei


    Beam combination of a fiber laser array is an effective technique for improving the brightness of fiber lasers. In order to realize high-efficiency CBC, challenges like phase distortion (mainly including piston and tilt phase aberrations) should be taken into consideration. In recent years, tilt phase aberration control has been achieved by adaptive fiber-optics collimators using the stochastic parallel gradient descent (SPGD) algorithm. However, the convergence rate of the tilt control system still cannot satisfy the needs of practical application. In order to increase the tilt control bandwidth, a new idea is put forward: applying the orthogonal single frequency dithering (OSFD) technique to tilt control; numerical simulation of this approach has been completed. A hexagonal laser array with 7 elements has been simulated, with each element having a pair of initial tilt angles in the horizontal and vertical directions. The initial tilt angles comply with a normal distribution. Under the same conditions, tilt phase control has been realized through SPGD and OSFD individually, and the convergence steps (defined as the iteration steps that improve the normalized PIB above 0.9) with appropriate parameters are respectively about 20 (SPGD) and 7 (OSFD). Furthermore, tilt phase control of large-number hexagonal arrays is simulated, with the following results: for 19/37 elements, the fewest convergence steps are about 80/160 (SPGD) and 19/55 (OSFD). Compared with the SPGD algorithm, it is obvious that OSFD has a higher convergence rate and greater potential for tilt control application in large-number coherent fiber laser arrays.
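For context, one iteration of the SPGD baseline the OSFD method is compared against can be sketched as follows: every control channel is dithered in parallel with a random ±δ, the performance metric (e.g. power in the bucket) is measured for both perturbation signs, and the controls step along the estimated gradient. Names, gains, and the two-sided measurement are illustrative assumptions:

```python
import random

def spgd_step(controls, metric, gain=0.5, dither=0.05, rng=None):
    """One stochastic parallel gradient descent (SPGD) iteration:
    perturb every control channel with a random +/- dither in parallel,
    measure the metric for both perturbation signs, and move each
    channel by gain * delta_J * perturbation (gradient ascent on J)."""
    rng = rng or random.Random(0)
    d = [dither if rng.random() < 0.5 else -dither for _ in controls]
    j_plus = metric([c + di for c, di in zip(controls, d)])
    j_minus = metric([c - di for c, di in zip(controls, d)])
    delta_j = j_plus - j_minus
    return [c + gain * delta_j * di for c, di in zip(controls, d)]
```

Because SPGD estimates the gradient from a single shared metric, its convergence slows as the element count grows, which is the motivation for the frequency-multiplexed dithering the abstract reports as faster.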

  10. Numerical characterization of landing gear aeroacoustics using advanced simulation and analysis techniques (United States)

    Redonnet, S.; Ben Khelil, S.; Bulté, J.; Cunha, G.


    With the objective of aircraft noise mitigation, we address the numerical characterization of the noise emitted by a simplified nose landing gear (NLG) through the use of advanced simulation and signal processing techniques. To this end, the NLG noise physics is first simulated through an advanced hybrid approach relying on Computational Fluid Dynamics (CFD) and Computational AeroAcoustics (CAA) calculations. Compared to more traditional hybrid methods (e.g. those relying on an Acoustic Analogy), and although it is used here with some approximations (e.g. in the design of the CFD-CAA interface), the present approach does not rest on restrictive assumptions (e.g. equivalent noise sources, a homogeneous propagation medium), which allows more realism to be incorporated into the prediction. In a second step, the outputs of these CFD-CAA hybrid calculations are processed with both traditional and advanced post-processing techniques, allowing further investigation of the NLG's noise source mechanisms. Among other things, this work highlights how advanced computational methodologies are now mature enough not only to simulate realistic airframe noise problems, but also to investigate their underlying physics.

  11. Accelerating All-Atom MD Simulations of Lipids Using a Modified Virtual-Sites Technique. (United States)

    Loubet, Bastien; Kopec, Wojciech; Khandelia, Himanshu


    We present two new implementations of the virtual-sites technique which completely suppress the degrees of freedom of the hydrogen atoms in a lipid bilayer, allowing for an increased time step of 5 fs in all-atom simulations with the CHARMM36 force field. One of our approaches uses the virtual-site derivation used in GROMACS, while the other uses a new definition of the virtual sites of the CH2 groups. Our method is tested on DPPC (no unsaturated chains), POPC (one unsaturated chain), and DOPC (two unsaturated chains) lipid bilayers. We calculate various physical properties of the membrane in our simulations with and without virtual sites and explain the differences and similarities observed. The best agreement is obtained for the original GROMACS virtual sites on the DOPC bilayer, where we get an area per lipid of 67.3 ± 0.3 Å(2) without virtual sites and 67.6 ± 0.3 Å(2) with virtual sites. In conclusion, the virtual-sites technique on lipid membranes is a powerful simulation tool, but it should be used with care. The procedure can be applied to other force fields and lipids in a straightforward manner.
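
    The core of any virtual-sites scheme is that the site position is a fixed geometric function of real atom positions, so it carries no degrees of freedom of its own. A minimal sketch of a GROMACS-style in-plane ("type 3") construction follows; the coordinates and coefficients are made up for illustration, not actual CHARMM36 parameters.

```python
import numpy as np

def virtual_site_3(ri, rj, rk, a, b):
    """GROMACS-style type-3 virtual site: a point in the plane of
    atoms i, j, k, expressed as ri + a*(rj - ri) + b*(rk - ri)."""
    return ri + a * (rj - ri) + b * (rk - ri)

# Illustrative heavy-atom positions (nm); a and b would be fit once so
# that the massless site reproduces the equilibrium hydrogen location.
ri = np.array([0.0, 0.0, 0.0])
rj = np.array([0.15, 0.0, 0.0])
rk = np.array([0.0, 0.15, 0.0])
site = virtual_site_3(ri, rj, rk, a=0.35, b=0.35)

# Forces acting on the site are redistributed to i, j, k with weights
# (1 - a - b), a, b, so total force is conserved.
weights = np.array([1 - 0.35 - 0.35, 0.35, 0.35])
```

    Because the fastest hydrogen vibrations are removed rather than integrated, the time step can be raised without destabilizing the dynamics.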

  12. Heat Transfer Study of Heat-Integrated Distillation Column (HIDiC) Using Simulation Techniques (United States)

    Pulido, Jeffrey León; Martínez, Edgar Leonardo; Wolf, Maria Regina; Filho, Rubens Maciel


    Separation processes are widely used in the petroleum refining and alcohol industries, and distillation columns consume a huge amount of energy in industrial processes. The concept of the Heat-Integrated Distillation Column (HIDiC) was therefore studied using simulation techniques in order to overcome this drawback. In this configuration the column is composed of two concentric sections, rectifying and stripping. Heat is transferred from the rectifying section (which operates at higher pressure and temperature) to the stripping section (which operates at lower pressure and temperature), using heat already present in the process and decreasing the energy required by the reboiler. The HIDiC column offers great potential to reduce energy consumption compared to conventional columns. However, the complexity of the internal configuration requires rigorous work to enable a better understanding of the column's operation; for this reason, simulation techniques implemented in computational software were used. The current work presents a heat transfer study of a concentric stage of a HIDiC column. The results obtained by Aspen Plus and CFD simulation showed that internal heat transfer in a concentric tray is a promising configuration for decreasing energy consumption in distillation processes.
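
    The stage-to-stage transfer analyzed here is governed by Q = U·A·ΔT across the concentric wall, and whatever duty crosses internally directly offsets reboiler load. A back-of-the-envelope sketch with illustrative numbers (not values from the paper):

```python
# Heat transferred across one concentric HIDiC stage, Q = U * A * dT.
U = 800.0             # overall heat-transfer coefficient, W/(m^2*K)
area = 2.5            # wall area of one concentric tray, m^2
t_rectifying = 370.0  # K, high-pressure inner section
t_stripping = 355.0   # K, low-pressure outer section

q_stage = U * area * (t_rectifying - t_stripping)  # W transferred internally
# This internally transferred duty is energy the reboiler no longer
# has to supply; summing q_stage over all trays gives the total saving.
```
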


    Directory of Open Access Journals (Sweden)

    Y. C. Lin


    The caisson is one of the important representations of the craft techniques and decorative aesthetics of Minnan (southern Fujian) temple architecture. The special component design and group building method present the architectural thinking and personal characteristics of the great carpenters of Minnan temple architecture. In the late Qing Dynasty, the appearance and style of the caissons of famous temples in Taiwan clearly presented the building techniques of the great carpenters. However, as the years went by, the caisson design and craft techniques were not fully passed down, a great loss of cultural assets. Accordingly, taking the caisson of Fulong temple, a work by a well-known great carpenter in Tainan, as an example, this study recovered the thinking principles of the original design and the design method at the initial period of construction through interview records and by redrawing the "Tng-Ko" (a traditional design, stakeout and construction tool). We obtained a 3D point cloud model of the caisson of Fulong temple using 3D laser scanning technology and established a 3D digital model of each component of the caisson. Based on the caisson assembly procedure obtained from the interview records, this study conducted a digital simulation of the caisson components to completely record and present the caisson design, construction and completion procedure. This model of preserving the craft techniques of the Minnan temple caisson using digital technology makes a specific contribution to the heritage of the craft techniques while providing an important reference for the digital preservation of human cultural assets.

  14. Impact of Standardized Communication Techniques on Errors during Simulated Neonatal Resuscitation. (United States)

    Yamada, Nicole K; Fuerch, Janene H; Halamek, Louis P


    Current patterns of communication in high-risk clinical situations, such as resuscitation, are imprecise and prone to error. We hypothesized that the use of standardized communication techniques would decrease the errors committed by resuscitation teams during neonatal resuscitation. In a prospective, single-blinded, matched-pairs design with block randomization, 13 subjects each performed as lead resuscitator in two simulated complex neonatal resuscitations, assisted by two nurses. In one scenario the nurses used nonstandard communication; in the other, they used standardized communication techniques. The performance of the subjects was scored to determine errors committed (defined relative to the Neonatal Resuscitation Program algorithm), time to initiation of positive pressure ventilation (PPV), and time to initiation of chest compressions (CC). In scenarios in which subjects were exposed to standardized communication techniques, there was a trend toward a decreased error rate and decreased times to initiation of PPV and CC. While not statistically significant, there was a 1.7-second improvement in time to initiation of PPV and a 7.9-second improvement in time to initiation of CC. Should these improvements in human performance be replicated in the care of real newborn infants, they could improve patient outcomes and enhance patient safety.

  15. Standardization of Guidelines for Patient Photograph Deidentification. (United States)

    Roberts, Erik A; Troiano, Chelsea; Spiegel, Jeffrey H


    This work was performed to advance patient care by protecting patient anonymity. The study analyzed the current practices used in patient facial photograph deidentification and sets forth standardized guidelines, congruent with medical ethics and the Health Insurance Portability and Accountability Act (HIPAA), for improving patient autonomy. The anonymization guidelines of 13 respected journals were reviewed for adequacy in accordance with the facial recognition literature. Simple statistics were used to compare the usage of the most common concealment techniques in 8 medical journals likely to publish the most facial photographs. Facial photo deidentification guidelines of the 13 journals were ascertained, and the number and percentage of patient photographs lacking adequate anonymization in the 8 journals were determined. Facial image anonymization guidelines varied across journals. When anonymization was attempted, 87% of the images were inadequately concealed. The most common technique was masking the eyes alone with a black box. Most of the journals evaluated lack specific instructions for properly deidentifying facial photographs. The guidelines introduced here stress that both the eyebrows and the eyes must be concealed to ensure patient privacy, and examples of proper and inadequate photo anonymization techniques are provided.

  16. Simulating GPS radio signal to synchronize network--a new technique for redundant timing. (United States)

    Shan, Qingxiao; Jun, Yang; Le Floch, Jean-Michel; Fan, Yaohui; Ivanov, Eugene N; Tobar, Michael E


    Currently, many distributed systems such as 3G mobile communications and power systems are time-synchronized with a Global Positioning System (GPS) signal. If there is a GPS failure, it is difficult to realize redundant timing, and thus time-synchronized devices may fail. In this work, we develop time transfer by simulating GPS signals, which requires no modification to the original GPS-synchronized devices. This is achieved with a simplified GPS simulator intended for synchronization purposes only. Navigation data are calculated based on a pre-assigned time at a fixed position. Pseudo-range data, which describe the changing distance between the space vehicle (SV) and the user, are calculated. Because real-time simulation requires heavy computation, we use self-developed software optimized on a PC to generate the data and save it onto memory disks while the simulator is operating. The radio signal generation mimics an SV at an initial position, and the frequency synthesis of the simulator is locked to the pre-assigned time. A filtering-group technique is used to simulate the signal transmission delay corresponding to the SV displacement. Each SV generates a digital baseband signal, to which a unique identifying code is added; the signal is then up-converted to produce the output radio signal at the center frequency of 1575.42 MHz (L1 band). A prototype based on a field-programmable gate array (FPGA) has been built, and experiments have been conducted to prove that time transfer can be realized. The prototype was applied to a CDMA network for a three-month experiment. Its precision has been verified and can meet the requirements of most telecommunication systems.
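
    The pseudo-range such a simulator must synthesize is simply the geometric SV-user distance plus the range equivalent of the simulated receiver clock offset. A minimal sketch with hypothetical coordinates (any real simulator would propagate the SV along its orbit and update this every epoch):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def pseudorange(sv_pos, user_pos, clock_bias_s):
    """Geometric SV-to-user range plus the range equivalent of the
    simulated receiver clock bias."""
    return np.linalg.norm(sv_pos - user_pos) + C * clock_bias_s

# Hypothetical ECEF positions (m): a GPS SV at roughly 20,200 km
# altitude and a user on the Earth's surface.
sv = np.array([15_600e3, 7_540e3, 20_140e3])
user = np.array([-2_694e3, -4_293e3, 3_857e3])
rho = pseudorange(sv, user, clock_bias_s=1e-6)  # 1 us bias adds ~300 m
```
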

  17. SMEX03 Site Photographs, Alabama (United States)

    National Aeronautics and Space Administration — The SMEX03 Site Photographs data set includes photographs of the regional study areas of Alabama, Georgia, and Oklahoma, USA as part of the 2003 Soil Moisture...

  18. Simulation technique for slurries interacting with moving parts and deformable solids with applications (United States)

    Mutabaruka, Patrick; Kamrin, Ken


    A numerical method for particle-laden fluids interacting with a deformable solid domain and mobile rigid parts is proposed and implemented in a full engineering system. The fluid domain is modeled with a lattice Boltzmann representation, the particles and rigid parts with a discrete element representation, and the deformable solid domain with a Lagrangian mesh. Since each of these methods is separately a mature tool, the main contribution of this work is to develop the coupling and model-reduction approaches needed to efficiently simulate coupled problems of this nature, as arise in various geological and engineering applications. The lattice Boltzmann method incorporates a large-eddy-simulation technique using the Smagorinsky turbulence model. The discrete element method incorporates spherical and polyhedral particles for stiff contact interactions. A neo-Hookean hyperelastic model is used for the deformable solid. We provide a detailed description of how to couple the three solvers within a unified algorithm. The technique we propose for rubber modeling/coupling exploits a simplification that avoids solving a finite-element problem at each time step. We also develop a technique to reduce the domain size of the full system by replacing certain zones with quasi-analytic solutions, which act as effective boundary conditions for the lattice Boltzmann method. The major ingredients of the routine are validated separately. To demonstrate the coupled method in full, we simulate slurry flows in two kinds of piston valve geometries. The dynamics of the valve and slurry are studied and reported over a large range of input parameters.
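
    The Smagorinsky closure used inside the lattice Boltzmann solver computes a local eddy viscosity from the resolved strain rate, nu_t = (Cs·Δ)²·|S| with |S| = sqrt(2 S_ij S_ij). A minimal sketch (the constant and the shear value are illustrative, not from the paper):

```python
import numpy as np

def smagorinsky_viscosity(grad_u, delta, cs=0.17):
    """Eddy viscosity nu_t = (cs*delta)^2 * |S|, where
    S = 0.5*(grad u + grad u^T) is the strain-rate tensor and
    |S| = sqrt(2 * S_ij * S_ij). cs ~ 0.17 is a commonly quoted value."""
    s = 0.5 * (grad_u + grad_u.T)
    s_mag = np.sqrt(2.0 * np.sum(s * s))
    return (cs * delta) ** 2 * s_mag

# Simple shear du/dy = 10 1/s on a 1 mm grid
grad_u = np.array([[0.0, 10.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
nu_t = smagorinsky_viscosity(grad_u, delta=1e-3)
```

    In an LBM implementation this nu_t is added to the molecular viscosity cell by cell, which amounts to a local adjustment of the relaxation time.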

  19. Lattice Boltzmann flow simulations with applications of reduced order modeling techniques

    KAUST Repository

    Brown, Donald


    With the recent interest in shale gas, an understanding of the flow mechanisms at the pore scale and beyond is necessary; this has attracted much interest from both industry and academia. One of the suggested algorithms to help understand flow in such reservoirs is the Lattice Boltzmann Method (LBM). The primary advantage of LBM is its ability to approximate complicated geometries with simple algorithmic modifications. In this work, we use LBM to simulate flow in a porous medium. More specifically, we use LBM to simulate a Brinkman-type flow. The Brinkman law allows us to integrate fast free-flow and slow-flow porous regions. However, due to the many scales involved and the complex heterogeneities of the rock microstructure, simulation times can be long, even with the speed advantage of an explicit time-stepping method. The problem is twofold: the computational grid must resolve all scales, and the calculation requires a steady-state solution, implying a large number of timesteps. To reduce the computational complexity and total simulation times, we use model reduction techniques to reduce the dimension of the system, describing the dynamics of the flow in a lower-dimensional subspace. In this work, we utilize the Proper Orthogonal Decomposition (POD) technique to compute the dominant modes of the flow and project the solution onto them (a lower-dimensional subspace), arriving at an approximation of the full system at reduced computational cost. We present a few proof-of-concept examples of the flow field and the corresponding reduced-model flow field.
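
    The POD step can be sketched as a thin SVD of the snapshot matrix: the leading left singular vectors are the dominant modes, and projecting onto them gives the reduced system. Here a synthetic rank-3 field stands in for the LBM flow snapshots (sizes and rank are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Snapshot matrix: each column is the flow field at one timestep.
# The synthetic field is built from 3 spatial modes, so the dynamics
# are exactly low-rank (as POD assumes the physics nearly is).
n_cells, n_snapshots, rank = 500, 60, 3
modes_true = rng.normal(size=(n_cells, rank))
coeffs = rng.normal(size=(rank, n_snapshots))
snapshots = modes_true @ coeffs

# POD via thin SVD: leading left singular vectors = dominant modes.
u, s, vt = np.linalg.svd(snapshots, full_matrices=False)
pod_modes = u[:, :rank]              # reduced basis
reduced = pod_modes.T @ snapshots    # project onto the subspace
reconstructed = pod_modes @ reduced  # lift back to full dimension

rel_err = (np.linalg.norm(snapshots - reconstructed)
           / np.linalg.norm(snapshots))
```

    For real flow data the singular values decay rather than truncate, and the retained rank is chosen from the energy captured by the leading modes.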

  20. A simulation benchmark to evaluate the performance of advanced control techniques in biological wastewater treatment plants

    Directory of Open Access Journals (Sweden)

    O.A.Z. Sotomayor


    Wastewater treatment plants (WWTP) are complex systems that incorporate a large number of biological, physicochemical and biochemical processes. They are large, nonlinear systems subject to great disturbances in incoming loads. The primary goal of a WWTP is to reduce pollutants, and the secondary goal is disturbance rejection, in order to obtain good effluent quality. Modeling and computer simulation are key tools in achieving these two goals, essential to describe, predict and control the complicated interactions of the processes. Numerous control techniques (algorithms) and control strategies (structures) have been suggested for regulating WWTPs; however, it is difficult to make a discerning performance evaluation due to the nonuniformity of the simulated plants used. The main objective of this paper is to present a benchmark of an entire biological wastewater treatment plant in order to evaluate, through simulation, different control techniques. The benchmark represents an activated sludge process used for removal of organic matter and nitrogen from domestic effluents. The development of the simulator is based on models widely accepted by the international community and is implemented on the Matlab/Simulink (The MathWorks, Inc.) platform. The benchmark considers plant layout and the effects of influent characteristics, and includes a test protocol for analyzing the open- and closed-loop responses of the plant. Example control applications in the benchmark are implemented with conventional PI controllers. The following common control strategies are tested: dissolved oxygen (DO) concentration-based control, respirometry-based control and nitrate concentration-based control.
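
    One of the conventional PI loops mentioned, DO-concentration control by manipulating the oxygen transfer coefficient KLa, can be sketched on a toy aeration-basin mass balance. All gains and rate values below are illustrative assumptions, not the benchmark's parameters.

```python
# PI control of dissolved oxygen (DO) via the transfer coefficient KLa.
def simulate_do_control(kp=50.0, ki=10.0, dt=0.001, t_end=5.0):
    do, do_set = 0.5, 2.0     # mg/L: initial DO and setpoint
    integral = 0.0
    do_sat, our = 8.0, 60.0   # saturation DO (mg/L); oxygen uptake, mg/(L*h)
    t = 0.0
    while t < t_end:          # time in hours, explicit Euler steps
        error = do_set - do
        integral += error * dt
        kla = max(0.0, kp * error + ki * integral)   # PI law, 1/h
        # DO mass balance: aeration transfer in, biological uptake out
        do += dt * (kla * (do_sat - do) - our)
        t += dt
    return do

final_do = simulate_do_control()
```

    The integral term is what lets the loop hold the setpoint against the constant uptake load; with only proportional action the DO would settle with a steady offset.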

  1. Wind Turbine Rotor Simulation via CFD Based Actuator Disc Technique Compared to Detailed Measurement


    Esmail Mahmoodi; Ali Jafari; Alireza Keyhani


    In this paper, a generalized Actuator Disc (AD) is used to model the wind turbine rotor of the MEXICO experiment, a collaborative European wind turbine project. The AD model, a combination of the CFD technique and User Defined Function (UDF) codes, the so-called UDF/AD model, is used to simulate the loads and performance of the rotor in three different wind speed tests. Distributed force on the blade, thrust and power production of the rotor as important design parameters of wind turbine rotors are ...

  2. Sputter behaviour of vanadium silicide studied by a computer simulation technique

    Energy Technology Data Exchange (ETDEWEB)

    Fritzsch, B.; Zehe, A.; Samoylov, V.N.


    A detailed study of the sputter yield of VSi2 has been carried out by a computer simulation technique. Krypton and neon ions in the energy range of 200-5000 eV are used, and the angle of incidence on the (0001) surface of VSi2 is taken to be 90°. The dynamical atom block considered in the computer program consists of 397 atoms situated in 5 equally spaced layers. Both amorphous and crystalline targets are discussed, which are expected to differ in yield as well as in preferential and directional sputtering. Theoretical results are in good accord with experimental findings.

  3. Why use case studies rather than simulation-gaming techniques or library research? (United States)

    Mcdonald, S. W.


    Methods that present a student with a more challenging, true-to-life situation of needing to conduct research in a problem-solving context, without thinking about organization or format until the research and thinking are complete, are investigated. Simulation-gaming techniques, which attempt to teach the initiative and creativity that library research does not, are used for this purpose. However, it is shown that case studies provide the greatest opportunities to engage students in problem-solving situations in which they develop their skills as researchers and writers.

  4. Geotagging Photographs in Student Fieldwork (United States)

    Welsh, Katharine E.; France, Derek; Whalley, W. Brian; Park, Julian R.


    This resource paper provides guidance for staff and students on the potential educational benefits, limitations and applications of geotagging photographs. It also offers practical advice for geotagging photographs in a range of fieldwork settings and reviews three free smartphone applications (apps) for geotagging photographs (Flickr, Evernote…

  5. Topological Visualisation techniques for the understanding of Lattice Quantum Chromodynamics (LQCD) simulations

    CERN Document Server

    Thomas, Dean P; Hands, Simon


    The use of topology in visualisation applications has become increasingly popular due to its ability to summarise data at a high level. Criticalities in scalar field data are used by visualisation methods such as the Reeb graph and contour trees to present topological structure in simple graph-based formats. These techniques can be used to segment the input field, recognising the boundaries between multiple objects and allowing whole contour meshes to be seeded as separate objects. In this paper we demonstrate the use of topology-based techniques applied to theoretical physics data generated from Quantum Chromodynamics simulations, whose structure complicates their use. We also discuss how the output of the algorithms involved in topological visualisation can be used by physicists to further their understanding of Quantum Chromodynamics.
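
    The criticalities these methods build on can be illustrated by scanning a scalar field for strict local extrema against the 8-neighbourhood, the points from which contour-tree-style segmentation is seeded. The field below is a synthetic two-lump example, not LQCD data:

```python
import numpy as np

def find_extrema(field):
    """Flag strict local maxima/minima of a 2D scalar field against
    the 8-neighbourhood of each interior point."""
    maxima, minima = [], []
    ny, nx = field.shape
    for y in range(1, ny - 1):
        for x in range(1, nx - 1):
            patch = field[y - 1:y + 2, x - 1:x + 2]
            centre = field[y, x]
            others = np.delete(patch.ravel(), 4)  # drop the centre pixel
            if centre > others.max():
                maxima.append((y, x))
            elif centre < others.min():
                minima.append((y, x))
    return maxima, minima

# Synthetic scalar field: one positive and one negative Gaussian lump
yy, xx = np.mgrid[0:32, 0:32]
field = (np.exp(-((xx - 8) ** 2 + (yy - 8) ** 2) / 10.0)
         - np.exp(-((xx - 22) ** 2 + (yy - 22) ** 2) / 10.0))
maxima, minima = find_extrema(field)
```

    A contour tree additionally tracks saddles, where level-set components merge or split as the isovalue sweeps the field; extrema are the leaves of that tree.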

  6. Verifying calculations forty years on : an overview of classical verification techniques for FEM simulations

    CERN Document Server

    Díez, Pedro


    This work provides an overview of a posteriori error assessment techniques for Finite Element (FE) based numerical models. These tools aim at estimating and controlling the discretization error in scientific computational models, forming the basis for the numerical verification of FE solutions. The text discusses the capabilities and limitations of classical methods for building error estimates which can be used to control the quality of numerical simulations and drive adaptive algorithms, with a focus on Computational Mechanics engineering applications. Fundamental principles of residual methods, smoothing (recovery) methods, and constitutive relation error (duality-based) methods are addressed throughout the manuscript. Attention is paid to recent advances and forthcoming research challenges on related topics. The book constitutes a useful guide for students, researchers, and engineers wishing to acquire insight into state-of-the-art techniques for numerical verification.

  7. Simulation of photoacoustic tomography (PAT) system in COMSOL and comparison of two popular reconstruction techniques (United States)

    Sowmiya, C.; Thittai, Arun K.


    Photoacoustic imaging is a combined molecular and functional imaging modality based on differential optical absorption of an incident laser pulse by endogenous tissue chromophores. Several numerical simulations and finite element models have been developed in the past to describe and study photoacoustic (PA) signal generation principles and the effect of variations in PA parameters. Most of this simulation work concentrates on analyzing extracted 1D PA signals, and each study mostly describes only a few of the building blocks of a Photoacoustic Tomography (PAT) imaging system. Papers describing simulation of the entire PAT system on one simulation platform, along with reconstruction, are rare. This study describes how a commercially available finite element software package (COMSOL(R)) can serve as a single platform for simulating PAT, coupling the electromagnetic, thermodynamic and acoustic pressure physics involved in the PA phenomenon. Further, an array of detector elements placed at the boundary of the FE model provides acoustic pressure data that can be exported to Matlab(R) to perform tomographic image reconstruction. The performance of the two most commonly used image reconstruction techniques, Filtered Backprojection (FBP) and Synthetic Aperture (SA) beamforming, is compared. Results show that the lateral resolution obtained using FBP vs. SA largely depends on the aperture parameters: FBP provided a slightly better lateral resolution for smaller apertures, while SA worked better for larger apertures. This interesting effect is currently being investigated further. Computationally, FBP was faster, but it produced artifacts along the spherical shell onto which the data are projected.
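
    Of the two reconstructions compared, SA beamforming is the simpler to sketch: each image pixel sums the channel samples at the one-way photoacoustic delay from pixel to element. The array geometry, sampling rate, and synthetic channel data below are hypothetical, not the study's setup:

```python
import numpy as np

def delay_and_sum(rf, element_x, pixel_x, pixel_z, c, fs):
    """Synthetic-aperture (delay-and-sum) value of one pixel from PA
    channel data; the delay is one-way (source to element)."""
    value = 0.0
    for ch, xe in enumerate(element_x):
        dist = np.hypot(pixel_x - xe, pixel_z)
        idx = int(round(dist / c * fs))
        if idx < rf.shape[1]:
            value += rf[ch, idx]
    return value

c, fs = 1540.0, 40e6                      # sound speed (m/s), sampling (Hz)
element_x = np.linspace(-0.01, 0.01, 32)  # 32-element linear array, m
src_x, src_z = 0.0, 0.02                  # point PA source 20 mm deep

# Synthesize channel data: a unit spike at each one-way arrival time
rf = np.zeros((32, 2048))
for ch, xe in enumerate(element_x):
    idx = int(round(np.hypot(src_x - xe, src_z) / c * fs))
    rf[ch, idx] = 1.0

on_target = delay_and_sum(rf, element_x, src_x, src_z, c, fs)
off_target = delay_and_sum(rf, element_x, src_x, src_z + 0.005, c, fs)
```

    The beamformed amplitude peaks at the true source location, since only there do all 32 delayed samples align coherently.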

  8. Impact of resolution and downscaling technique in simulating recent Atlantic tropical cyclone activity

    Energy Technology Data Exchange (ETDEWEB)

    Caron, Louis-Philippe; Winger, Katja [CRCMD Network, UQAM, Montreal, QC (Canada); Jones, Colin G. [Swedish Meteorological and Hydrological Institute, Rossby Centre, Norrkoping (Sweden)


    Using the global environmental multiscale (GEM) model, we investigate the impact of increasing model resolution from 2° to 0.3° on Atlantic tropical cyclone activity. There is a clear improvement in the realism of Atlantic storms with increased resolution, linked in part to a better representation of African easterly waves. The geographical distribution of a Genesis Potential Index, composed of large-scale fields known to affect cyclone formation, coincides closely in the model with areas of high cyclogenesis, and this distribution also improves with resolution. We then compare two techniques for achieving local high resolution over the tropical Atlantic: a limited-area model driven at the boundaries by the global 2° GEM simulation, and a global variable-resolution model (GVAR). The limited-area domain and the high-resolution part of the GVAR model coincide geographically, allowing a direct comparison between these two downscaling options. These integrations are further compared with a set of limited-area simulations employing the same domain and resolution but driven at the boundaries by reanalysis. The limited-area model driven by reanalysis produces the most realistic Atlantic tropical cyclone variability. The GVAR simulation is clearly more accurate than the limited-area version driven by GEM-Global. Degradation in the simulated interannual variability is partly linked to the models' failure to accurately reproduce the impact of atmospheric teleconnections from the equatorial Pacific and Sahel on Atlantic cyclogenesis. Through the use of a smaller limited-area grid driven by GEM-Global 2°, we show that an accurate representation of African easterly waves is crucial for simulating Atlantic tropical cyclone variability. (orig.)

  9. Photograph of the month (United States)


    Sigmoid (σ-type) structure in mylonite exposed along the wall of the Blue Nile Canyon in Bure, Ethiopian Plateau, Ethiopia. The Blue Nile River carved a 1.6 km deep and extensive canyon into the Ethiopian Plateau, exposing Neoproterozoic to Quaternary rocks. The lenticular-shaped sigmoid is a deformed feldspar aggregate with elongated wings at both ends. The wings of the clast step up in the direction of movement of the upper part of the shear zone, showing a dextral (right-lateral) sense of shear. This Neoproterozoic-Early Paleozoic shear zone may have formed by dynamic metamorphism during the East African Orogeny, which is related to the collision between East and West Gondwanaland. The horizontal scale bar in the photograph is 5 cm. Outcrop location: Bure region (10° 20.326' N, 037° 1.305' E, 1580 m altitude). Photograph © Nahid D. Gani, WKU, Bowling Green, USA.

  10. STS-80 Onboard Photograph (United States)


    This STS-80 onboard photograph shows the Orbiting Retrievable Far and Extreme Ultraviolet Spectrometer-Shuttle Pallet Satellite II (ORFEUS-SPAS II), photographed during approach by the Space Shuttle Orbiter Columbia for retrieval. Built by the German Space Agency, DARA, the ORFEUS-SPAS II, a free-flying satellite, was dedicated to astronomical observations at very short wavelengths to: investigate the nature of hot stellar atmospheres, investigate the cooling mechanisms of white dwarf stars, determine the nature of accretion disks around collapsed stars, investigate supernova remnants, and investigate the interstellar medium and potential star-forming regions. Some 422 observations of almost 150 astronomical objects were completed, including the Moon, nearby stars, distant Milky Way stars, stars in other galaxies, active galaxies, and quasar 3C273. The STS-80 mission was launched November 19, 1996.

  11. Computer aided photographic engineering (United States)

    Hixson, Jeffrey A.; Rieckhoff, Tom


    High speed photography is an excellent source of engineering data but only provides a two-dimensional representation of a three-dimensional event. Multiple cameras can be used to provide data for the third dimension but camera locations are not always available. A solution to this problem is to overlay three-dimensional CAD/CAM models of the hardware being tested onto a film or photographic image, allowing the engineer to measure surface distances, relative motions between components, and surface variations.

  12. Reconstruction of chalk pore networks from 2D backscatter electron micrographs using a simulated annealing technique

    Energy Technology Data Exchange (ETDEWEB)

    Talukdar, M.S.; Torsaeter, O. [Department of Petroleum Engineering and Applied Geophysics, Norwegian University of Science and Technology, Trondheim (Norway)


    We report the stochastic reconstruction of chalk pore networks from limited morphological information that may be readily extracted from 2D backscatter electron (BSE) images of the pore space. The reconstruction technique employs a simulated annealing (SA) algorithm, which can be constrained by an arbitrary number of morphological descriptors. Backscatter electron images of a high-porosity North Sea chalk sample are analyzed and the morphological descriptors of the pore space are determined. The descriptors considered are the void-phase two-point probability function and lineal path function, computed with or without the application of periodic boundary conditions (PBC). 2D and 3D samples have been reconstructed with different combinations of the descriptors, and the reconstructed pore networks have been analyzed quantitatively to evaluate the quality of the reconstructions. The results demonstrate that the simulated annealing technique may be used to reconstruct chalk pore networks with reasonable accuracy using the void-phase two-point probability function and/or the void-phase lineal path function. The two-point probability function produces slightly better reconstructions than the lineal path function, and imposing the lineal path function yields only a slight improvement over using the two-point probability function as the sole constraint. The application of periodic boundary conditions appears not to be critical when reasonably large samples are reconstructed.
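
    The annealing loop itself is compact: perturb the image with porosity-preserving pixel swaps and accept or reject each swap against the two-point-function mismatch. A small 2D toy version follows, with a striped "reference" image standing in for the BSE-derived statistics and an illustrative (short) cooling schedule:

```python
import numpy as np

rng = np.random.default_rng(2)

def s2_rows(im, max_r=10):
    """Void-phase two-point probability along rows: the probability
    that two pixels a distance r apart are both void (value 1)."""
    return np.array([(im * im).mean() if r == 0
                     else (im[:, :-r] * im[:, r:]).mean()
                     for r in range(max_r)])

# Striped "reference" microstructure whose row statistics we match
ref = np.zeros((32, 32))
ref[:, [c for c in range(32) if (c // 4) % 2 == 0]] = 1.0
target = s2_rows(ref)

def energy(im):
    return float(np.sum((s2_rows(im) - target) ** 2))

# Start from a porosity-preserving random shuffle, then anneal
img = ref.ravel().copy()
rng.shuffle(img)
img = img.reshape(ref.shape)

e = e0 = energy(img)
temp = 1e-4
for _ in range(4000):
    v = tuple(rng.choice(np.argwhere(img == 1)))  # a void pixel
    s = tuple(rng.choice(np.argwhere(img == 0)))  # a solid pixel
    img[v], img[s] = 0.0, 1.0                     # trial swap
    e_new = energy(img)
    if e_new < e or rng.random() < np.exp((e - e_new) / temp):
        e = e_new                                 # accept the swap
    else:
        img[v], img[s] = 1.0, 0.0                 # reject: undo it
    temp *= 0.999                                 # cooling schedule
```

    Swapping a void/solid pair (rather than flipping single pixels) keeps the porosity fixed throughout, so only the correlation structure is being optimized; additional descriptors such as the lineal path function enter simply as extra terms in the energy.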

  13. Steering charge kinetics in photocatalysis: intersection of materials syntheses, characterization techniques and theoretical simulations. (United States)

    Bai, Song; Jiang, Jun; Zhang, Qun; Xiong, Yujie


    Charge kinetics is highly critical in determining the quantum efficiency of solar-to-chemical conversion in photocatalysis, and this includes, but is not limited to, the separation of photoexcited electron-hole pairs, utilization of plasmonic hot carriers and delivery of photo-induced charges to reaction sites, as well as activation of reactants by energized charges. In this review, we highlight the recent progress on probing and steering charge kinetics toward designing highly efficient photocatalysts and elucidate the fundamentals behind the combinative use of controlled synthesis, characterization techniques (with a focus on spectroscopic characterizations) and theoretical simulations in photocatalysis studies. We first introduce the principles of various processes associated with charge kinetics that account for or may affect photocatalysis, from which a set of parameters that are critical to photocatalyst design can be summarized. We then outline the design rules for photocatalyst structures and their corresponding synthetic approaches. The implementation of characterization techniques and theoretical simulations in different steps of photocatalysis, together with the associated fundamentals and working mechanisms, are also presented. Finally, we discuss the challenges and opportunities for photocatalysis research at this unique intersection as well as the potential impact on other research fields.

  14. The simulation of Typhoon-induced coastal inundation in Busan, South Korea applying the downscaling technique (United States)

    Jang, Dongmin; Park, Junghyun; Yuk, Jin-Hee; Joh, MinSu


    Due to typhoons, the southern coastal cities of South Korea, including Busan, are highly vulnerable to surge, waves and the corresponding coastal inundation, and are affected every year. In 2016, South Korea suffered tremendous damage from typhoon 'Chaba', which developed east-northeast of Guam on Sep. 28 and had a maximum 10-minute sustained wind speed of about 50 m/s, a 1-minute sustained wind speed of 75 m/s and a minimum central pressure of 905 hPa. When 'Chaba', the strongest typhoon since 'Maemi' in 2003, hit South Korea on Oct. 5, it caused massive economic and casualty damage to Ulsan, Gyeongju and Busan. In particular, the damage from typhoon-induced coastal inundation was serious in Busan, where many high-rise buildings and residential areas are concentrated near the coast. The coastal inundation could be more strongly affected by wind-induced waves than by surge. In fact, due to 'Chaba', a surge height of about 1 m on average and a significant wave height of about 8 m were observed in the coastal sea near Busan on Oct. 5. Even though the typhoon-induced surge elevated the sea level, the typhoon-induced long-period waves, with wave periods of more than 15 s, could play a more important role in the inundation. The present work simulated the coastal inundation induced by 'Chaba' in Busan, South Korea, considering the effects of typhoon-induced surge and waves. For the 'Chaba' hindcast, the high-resolution Weather Research and Forecasting model (WRF) was applied using reanalysis data produced by NCEP (FNL 0.25 degree) for the boundary and initial conditions, and was validated against observations of wind speed, direction and pressure. The typhoon-induced coastal inundation was simulated by an unstructured grid model, the Finite Volume Community Ocean Model (FVCOM), which is a fully current-wave coupled model. To simulate the wave-induced inundation, a 1-way downscaling technique with multiple domains was applied. Firstly, a mother domain including the Korean peninsula was

  15. Simulation of Engine Internal Flows Using Digital Physics (Simulation des écoulements dans les moteurs avec la technique Digital Physics)

    Directory of Open Access Journals (Sweden)

    Halliday J.


    Full Text Available This paper presents simulations of engine intake port and cylinder flows performed using the PowerFLOW software. The numerical technique behind PowerFLOW, called Digital Physics, is based on statistical kinetic theory and is numerically stable, so divergence does not occur during calculations. Digital Physics uses large numbers of computational cells with a simple turbulence model, giving grid independence and high levels of accuracy. In addition, the technique is explicit in time, so a transient simulation is always obtained. The paper outlines the numerical technique and presents details of an engine port and cylinder simulation.

  16. On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software

    Energy Technology Data Exchange (ETDEWEB)

    Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic; Huang, Zhenyu; Wang, Lei; Wu, Di; Chen, Yousu


    Time-domain simulations are heavily used in today's planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies, in order to comply with industry standards. Because of increased modeling complexity, state-of-the-art commercial packages complete a dynamic simulation of a large-scale model several times slower than real time. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need to perform security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high-performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for the network solution, and parallelizing data outputs when interacting with the APIs of the commercial package TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.

  17. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques (United States)

    Lee, Hanbong


    Accurate taxi time prediction is required to enable efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently, NASA and American Airlines are jointly developing a decision-support tool called the Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improving the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated against actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast-time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.
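As a minimal illustration of the data-driven side of such a study, taxi-out time can be regressed on surface-traffic features with ordinary least squares. The features named below (taxi distance, departure queue length) are hypothetical stand-ins, not the study's actual predictors, and the solver is a bare-bones sketch.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved by Gaussian elimination with partial pivoting. Each row of X
    carries a leading 1.0 for the intercept, e.g.
    [1.0, taxi_distance_km, queue_length] -> taxi_time_minutes."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, n))) / A[r][r]
    return coef
```

In practice one would use a proper machine-learning library and richer features; the point is only that the "data-driven analytical method" reduces to fitting a predictor against recorded taxi times.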

  18. On the Importance of Wave Simulation Techniques for Forecasting Shoreline Change (United States)

    Anderson, D. L.; Alvarez Antolinez, J. A.; Mendez, F. J.; Ruggiero, P.


    Global climate change is projected to alter large-scale atmospheric circulation, storm tracks, and consequently the regional wave climates produced by these patterns. Since shorelines naturally evolve towards dynamic equilibrium with the local wave climate, any redistribution of wave energy has the potential to result in morphological changes equal to or greater than those induced by sea-level rise over the next several decades. Because nearly all state-of-the-art coastal modeling frameworks require a representation of the wave climate as input, the development of methodologies that create realistic wave climate scenarios is necessary to forecast possible shoreline change. Here we use a simple, one-line shoreline change model to assess the importance of wave simulation techniques in shoreline modeling. Our study site, the U.S. Pacific Northwest, exhibits significant seasonal to multi-decadal shoreline variability along relatively straight embayed beaches. One-line models, which calculate spatial gradients of alongshore sediment transport as a function of wave energy flux and angle, can represent this temporal variability if the wave input time series accurately represents the chronology and joint probabilities of heights, periods, and directions. Because dynamically downscaling waves from general circulation models is computationally expensive, we explore several statistical input-reduction techniques for constructing time series that capture realistic seasonal to multi-decadal variability and the chronology of storm events. Methods include continuous-time Markov chains, data mining techniques, fitting of non-stationary distribution functions, auto-regressive logistic models, and trivariate copula dependence structures formed by correlating observed wave records with coincident sea level pressures. The wave climate produced by each method emphasizes the chronological progression, the joint probabilities of the wave parameters, or both. We
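One of the input-reduction ideas named above, a Markov chain over discretized sea states, can be sketched in a few lines. This toy uses a first-order discrete-time chain (the record mentions continuous-time chains; the discrete version illustrates the same chronology-preserving principle), and the state classes are arbitrary assumptions.

```python
import random

def transition_matrix(states, n_states):
    """Estimate first-order Markov transition probabilities from an
    observed sequence of discrete sea-state classes."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        tot = sum(row)
        P.append([c / tot if tot else 1.0 / n_states for c in row])
    return P

def simulate_chain(P, start, length, seed=0):
    """Generate a synthetic state sequence reproducing the observed
    one-step transition statistics (and hence storm chronology)."""
    rng = random.Random(seed)
    s, out = start, [start]
    for _ in range(length - 1):
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[s]):
            acc += p
            if u < acc:
                break
        s = j
        out.append(s)
    return out
```

Each synthetic state would then be mapped back to representative wave heights, periods, and directions before being fed to the one-line shoreline model.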

  19. Simulation and prediction for energy dissipaters and stilling basins design using artificial intelligence technique

    Directory of Open Access Journals (Sweden)

    Mostafa Ahmed Moawad Abdeen


    Full Text Available Water with large velocities can cause considerable damage to channels whose beds are composed of natural earth materials. Several stilling basins and energy dissipating devices have been designed in conjunction with spillways and outlet works to avoid damage to canal structures. In addition, many experimental and numerical studies have been performed to investigate the accurate design of these stilling basins and energy dissipaters. The current study is aimed at introducing the artificial intelligence technique as a new modeling tool in the prediction of the accurate design of stilling basins. Specifically, artificial neural networks (ANNs) are utilized in the current study, in conjunction with experimental data, to predict the length of the hydraulic jumps occurring in spillways, so that the stilling basin dimensions can be designed for adequate energy dissipation. The current study showed, in a detailed fashion, the development process of different ANN models to accurately predict the hydraulic jump lengths acquired from different experimental studies. The results obtained from implementing these models showed that the ANN technique was very successful in simulating the hydraulic jump characteristics occurring in stilling basins. Therefore, it can be safely utilized in the design of these basins, as ANN involves minimal computational and financial effort compared with experimental work and traditional numerical techniques such as finite differences or finite elements.
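Independent of the ANN machinery, the quantity being predicted has classical approximations that such models are trained to reproduce: the Bélanger equation for the sequent-depth ratio and an empirical jump-length rule. A sketch follows; the coefficient 6.1 is the common rule of thumb (roughly valid for inflow Froude numbers between about 4.5 and 9), used here purely for illustration.

```python
import math

def sequent_depth_ratio(fr1):
    """Belanger equation: y2/y1 = 0.5 * (sqrt(1 + 8 Fr1^2) - 1),
    where Fr1 is the supercritical inflow Froude number."""
    return 0.5 * (math.sqrt(1.0 + 8.0 * fr1 ** 2) - 1.0)

def jump_length(y1, fr1, k=6.1):
    """Empirical rule of thumb: jump length L ~ k * y2, with y2 the
    sequent (downstream) depth and y1 the inflow depth."""
    return k * y1 * sequent_depth_ratio(fr1)
```

An ANN trained on laboratory data effectively learns corrections to relations of this kind across flow regimes where the simple rule breaks down.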

  20. A semi-simulated EEG/EOG dataset for the comparison of EOG artifact rejection techniques

    Directory of Open Access Journals (Sweden)

    Manousos A. Klados


    Full Text Available Artifact rejection techniques are used to recover the brain signals underlying artifactual electroencephalographic (EEG) segments. Although many different artifact rejection techniques have been proposed over the last few years ([1], [2], [3]), none has been established as a gold standard so far, because assessing their performance is difficult and subjective ([4], [5], [6]). This limitation stems mainly from the fact that the underlying artifact-free brain signal is unknown, so there is no objective way to measure how close the retrieved signal is to the real one. This article addresses the aforementioned problem by presenting a semi-simulated EEG dataset, in which artifact-free EEG signals are manually contaminated with ocular artifacts using a realistic head model. The significant feature of this dataset is that it contains the pre-contamination EEG signals, so the brain signals underlying the EOG artifacts are known, and the performance of any artifact rejection technique can therefore be objectively assessed.
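The evaluation logic such a dataset enables can be sketched end to end: contaminate a known clean signal with a scaled EOG trace, run a rejection method (here simple least-squares regression, standing in for the techniques under comparison), and score the output against the known ground truth. All signals and the propagation factor below are synthetic assumptions.

```python
import math

def contaminate(clean, eog, lam):
    """Semi-simulated recording: known clean EEG plus a scaled EOG artifact."""
    return [c + lam * e for c, e in zip(clean, eog)]

def regress_out(contaminated, eog):
    """Least-squares estimate of the propagation factor, then subtraction."""
    lam_hat = (sum(x * e for x, e in zip(contaminated, eog))
               / sum(e * e for e in eog))
    return [x - lam_hat * e for x, e in zip(contaminated, eog)]

def rmse(a, b):
    """Objective score, available only because the clean signal is known."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
```

With real semi-simulated data the same RMSE (or correlation) comparison ranks competing rejection algorithms objectively.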

  1. Validation of total skin electron irradiation (TSEI) technique dosimetry data by Monte Carlo simulation. (United States)

    Nevelsky, Alexander; Borzov, Egor; Daniel, Shahar; Bar-Deroma, Rachel


    Total skin electron irradiation (TSEI) is a complex technique which requires many nonstandard measurements and dosimetric procedures. The purpose of this work was to validate measured dosimetry data by Monte Carlo (MC) simulations using EGSnrc-based codes (BEAMnrc and DOSXYZnrc). Our MC simulations consisted of two major steps. In the first step, the incident electron beam parameters (energy spectrum, FWHM, mean angular spread) were adjusted to match the measured data (PDD and profile) at SSD = 100 cm for an open field. In the second step, these parameters were used to calculate dose distributions at the treatment distance of 400 cm. MC simulations of dose distributions from single and dual fields at the treatment distance were performed in a water phantom, and the dose distribution from the full treatment with six dual fields was simulated in a CT-based anthropomorphic phantom. The MC calculations were compared to the available set of measurements used in clinical practice. For one direct field, MC-calculated PDDs agreed within 3%/1 mm with the measurements, and lateral profiles agreed within 3% with the measured data. For the OF, the measured and calculated results were within 2% agreement. The optimal angle of 17° was confirmed for the dual-field setup. The MC-calculated multiplication factor (B12-factor), which relates the skin dose for the whole treatment to the dose from one calibration field, was 2.9 and 2.8 for the setups with and without a degrader, respectively. The measured B12-factor was 2.8 for both setups. The difference between calculated and measured values was within 3.5%. It was found that a degrader provides a more homogeneous dose distribution. The measured X-ray contamination for the full treatment was 0.4%, compared to the 0.5% X-ray contamination obtained with the MC calculation. Feasibility of MC simulation in an anthropomorphic phantom

  2. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration (United States)

    Hariri, Ali

    utility simulation software. On the other hand, EMT simulation tools provide high accuracy and visibility over a wide bandwidth of frequencies at the expense of larger processing and memory requirements, limited network size, and long simulation times. Therefore, there is a gap in simulation tools and techniques that can efficiently and effectively identify potential PV impacts. New planning simulation tools are needed in order to accommodate the simulation requirements of new technologies integrated into the electric grid. The dissertation at hand starts by identifying some of the potential impacts caused by high PV penetration. A phasor-based quasi-static time series (QSTS) analysis tool is developed in order to study the slow dynamics caused by variations in PV generation that lead to voltage fluctuations. Moreover, some EMT simulations are performed in order to study the impacts of PV systems on the electric network's harmonic levels. These studies provide insights into the type and duration of certain impacts, as well as the conditions that may lead to adverse phenomena. In addition, these studies give an idea of the type of simulation tool that is sufficient for each type of study. After identifying some of the potential impacts, certain planning tools and techniques are proposed. The potential PV impacts may cause certain utilities to refrain from integrating PV systems into their networks. However, each electric network has a certain limit beyond which the impacts become substantial and may adversely interfere with the system operation and the equipment along the feeder; this limit is referred to as the hosting limit (or hosting capacity). Therefore, it is important for utilities to identify the PV hosting limit of a specific electric network in order to safely and confidently integrate the maximum possible PV systems. In the following dissertation, two approaches have been proposed for identifying the hosting limit: 1. Analytical

  3. Colorimetric control of photographic prints: the problem of fluorescence (United States)

    Witt, Klaus


    The colorimetric control of photographic prints is an important issue of color reproduction quality. Concentration series of the three photographic dyes Yellow, Magenta and Cyan on photographic paper are very often measured without considering the optical quality of the substrate. However, this substrate, the photographic paper, creates problems for colorimetry because it emits luminescent light from optical brighteners. Many instruments used in colorimetry are not adapted to measure such output correctly. Here, an experimental investigation is presented which quantifies systematic shifts of the spectral curves and colorimetric values of photographic paper and dyes for various measuring parameters. Several spectrophotometers equipped with different irradiation sources, such as a tungsten halogen lamp, a xenon flash lamp, a D65 simulator, and two xenon lamps with adjustable UV filters, were referenced against the results of a two-monochromator spectrophotometer. For the photographic paper, the largest color difference extended to approximately 10 CIELAB units between irradiation by a tungsten halogen and a xenon lamp. These differences diminished with increasing concentration of the photographic dyes, but did not vanish at the highest dye concentration used. The correct colorimetric values for D65 irradiation were halfway between those for the two former lamps and close to the values obtained from xenon lamp irradiation with an adjustable UV filter. Therefore, such spectrophotometers may be used for the colorimetric control of photographic prints to attain an accuracy below 3 CIELAB units; otherwise, two measurements with a tungsten halogen and a xenon lamp should be averaged.
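The color differences quoted in this record (about 10 units between lamps, a 3-unit accuracy target) are CIE 1976 ΔE*ab distances, i.e. Euclidean distances in CIELAB space. A minimal sketch:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE 1976 color difference Delta E*ab: Euclidean distance
    between two (L*, a*, b*) triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))
```

For example, two measurements of the same patch differing by (0, 3, 4) in (L*, a*, b*) are 5 ΔE*ab units apart, well above the 3-unit accuracy threshold discussed above.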

  4. Comparison of sonochemiluminescence images using image analysis techniques and identification of acoustic pressure fields via simulation. (United States)

    Tiong, T Joyce; Chandesa, Tissa; Yap, Yeow Hong


    One common method to determine the existence of cavitational activity in power ultrasonics systems is to capture images of sonoluminescence (SL) or sonochemiluminescence (SCL) in a dark environment. Conventionally, the light emitted from SL or SCL was detected based on the number of photons. Though this method is effective, it cannot identify the sonochemical zones of an ultrasonic system. SL/SCL images, on the other hand, enable identification of 'active' sonochemical zones. However, these images often provide only qualitative data, as harvesting light intensity data from the images is tedious and requires high-resolution images. In this work, we propose a new image analysis technique using pseudo-coloured images to quantify the SCL zones based on the intensities of the SCL images, followed by comparison of the active SCL zones with COMSOL-simulated acoustic pressure zones.
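A pseudo-colouring step of the kind described amounts to quantizing pixel intensities into discrete zones and measuring each zone's area fraction. The toy below illustrates the principle only; the zone thresholds are hypothetical, not the paper's values.

```python
def pseudo_colour(image, bounds):
    """Map each pixel intensity to a discrete zone index: zone 0 below
    bounds[0], zone k for bounds[k-1] <= value < bounds[k], and so on
    (bounds must be ascending). Each index would be rendered as a colour."""
    def zone(v):
        k = 0
        for b in bounds:
            if v >= b:
                k += 1
        return k
    return [[zone(v) for v in row] for row in image]

def zone_fraction(zones, k):
    """Fraction of pixels in zone k, e.g. the 'active' sonochemical area."""
    flat = [z for row in zones for z in row]
    return flat.count(k) / len(flat)
```

The highest-index zone fractions can then be compared against regions of high simulated acoustic pressure to test whether bright SCL zones and pressure antinodes coincide.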

  5. Alternative Forecasting Techniques that Reduce the Bullwhip Effect in a Supply Chain: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Francisco Campuzano-Bolarín


    Full Text Available Research on the Bullwhip effect has given rise to many papers, aimed both at analysing its causes and at correcting it by means of various management strategies, because it is considered one of the critical problems in a supply chain. This study deals with one of its principal causes, demand forecasting. Using different simulated demand patterns, alternative forecasting methods are proposed that can reduce the Bullwhip effect in a supply chain in comparison to traditional forecasting techniques (moving average, simple exponential smoothing, and ARMA processes). Our main findings show that kernel regression is a good alternative for improving important features in the supply chain, such as the Bullwhip, NSAmp, and FillRate.
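The effect being reduced can be quantified in a simulation as the ratio of order variance to demand variance at one echelon. The sketch below uses an order-up-to policy driven by a moving-average forecast (one of the traditional techniques named above); the window and lead-time values are arbitrary illustrative choices.

```python
def bullwhip_ratio(demand, window=4, lead=2):
    """Simulate an order-up-to policy with a moving-average forecast and
    return Var(orders) / Var(demand); values > 1 indicate bullwhip."""
    orders, prev_target = [], None
    for t in range(window, len(demand)):
        forecast = sum(demand[t - window:t]) / window
        target = (lead + 1) * forecast        # order-up-to level
        if prev_target is not None:
            orders.append(max(0.0, demand[t] + target - prev_target))
        prev_target = target

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((v - m) ** 2 for v in xs) / len(xs)

    return var(orders) / var(demand[window:])
```

For i.i.d. demand this configuration amplifies variance well above 1, which is what a better forecasting method (such as the kernel regression proposed above) aims to pull back toward unity.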

  6. The Photograph as Network

    DEFF Research Database (Denmark)

    Wiegand, Frauke Katharina


    Inspired by actor-network theory (ANT), this article develops a theoretical framework to grasp the dynamic visual work of memory. It introduces three sensitizing concepts of actor-network methodology, namely entanglement, relationality and traceability, and operationalizes them in a methodological tool for reading the net-work of images. The case is two visitor snapshots at the photographic exhibition, 'The Story of Soweto', in the famous Regina Mundi Church in Soweto, South Africa. I demonstrate that, when slightly adjusted for research engaging with visual materials, Latourian concepts...

  7. MATLAB Simulation of UPQC for Power Quality Mitigation Using an Ant Colony Based Fuzzy Control Technique

    Directory of Open Access Journals (Sweden)

    N. Kumarasabapathy


    Full Text Available This paper proposes a fuzzy logic based new control scheme for the Unified Power Quality Conditioner (UPQC for minimizing the voltage sag and total harmonic distortion in the distribution system consequently to improve the power quality. UPQC is a recent power electronic module which guarantees better power quality mitigation as it has both series-active and shunt-active power filters (APFs. The fuzzy logic controller has recently attracted a great deal of attention and possesses conceptually the quality of the simplicity by tackling complex systems with vagueness and ambiguity. In this research, the fuzzy logic controller is utilized for the generation of reference signal controlling the UPQC. To enable this, a systematic approach for creating the fuzzy membership functions is carried out by using an ant colony optimization technique for optimal fuzzy logic control. An exhaustive simulation study using the MATLAB/Simulink is carried out to investigate and demonstrate the performance of the proposed fuzzy logic controller and the simulation results are compared with the PI controller in terms of its performance in improving the power quality by minimizing the voltage sag and total harmonic distortion.

  8. Kinetic simulation technique for plasma flow in strong external magnetic field (United States)

    Ebersohn, Frans H.; Sheehan, J. P.; Gallimore, Alec D.; Shebalin, John V.


    A technique for the kinetic simulation of plasma flow in strong external magnetic fields was developed which captures the compression and expansion of plasma bound to a magnetic flux tube as well as forces on magnetized particles within the flux tube. This quasi-one-dimensional (Q1D) method resolves a single spatial dimension while modeling two-dimensional effects. The implementation of this method in a Particle-In-Cell (PIC) code was verified with newly formulated test cases which include two-particle motion and particle dynamics in a magnetic mirror. Results from the Q1D method and fully two dimensional simulations were compared and error analyses performed verifying that the Q1D model reproduces the fully 2D results in the correct regimes. The Q1D method was found to be valid when the hybrid Larmor radius was less than 10% of the magnetic field scale length for magnetic field guided plasma expansions and less than 1% of the magnetic field scale length for a plasma in a converging-diverging magnetic field. The simple and general Q1D method can readily be incorporated in standard 1D PIC codes to capture multi-dimensional effects for plasma flow along magnetic fields in parameter spaces currently inaccessible by fully kinetic methods.
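The magnetic-mirror test case mentioned in this record rests on two conserved quantities, kinetic energy and the magnetic moment mu = v_perp^2 / (2B), which together give the reflection point in closed form; verification runs compare simulated particles against exactly this kind of relation. A sketch (non-relativistic, unit mass, purely illustrative):

```python
def mirror_field(v_par0, v_perp0, b0):
    """Field strength at the reflection point of a magnetized particle.
    Conservation of mu = v_perp^2/(2B) and of total speed v gives
    B_mirror = B0 * v^2 / v_perp0^2 (where v_par -> 0)."""
    v2 = v_par0 ** 2 + v_perp0 ** 2
    return b0 * v2 / v_perp0 ** 2

def is_trapped(v_par0, v_perp0, b0, b_max):
    """Trapped if the required mirror field exists inside the bottle;
    otherwise the particle lies in the loss cone and escapes."""
    return mirror_field(v_par0, v_perp0, b0) <= b_max
```

A quasi-1D PIC run of the mirror test case should reproduce these reflection points whenever the adiabatic (small Larmor radius) condition quoted above holds.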

  9. MATLAB Simulation of UPQC for Power Quality Mitigation Using an Ant Colony Based Fuzzy Control Technique. (United States)

    Kumarasabapathy, N; Manoharan, P S


    This paper proposes a fuzzy logic based new control scheme for the Unified Power Quality Conditioner (UPQC) for minimizing the voltage sag and total harmonic distortion in the distribution system consequently to improve the power quality. UPQC is a recent power electronic module which guarantees better power quality mitigation as it has both series-active and shunt-active power filters (APFs). The fuzzy logic controller has recently attracted a great deal of attention and possesses conceptually the quality of the simplicity by tackling complex systems with vagueness and ambiguity. In this research, the fuzzy logic controller is utilized for the generation of reference signal controlling the UPQC. To enable this, a systematic approach for creating the fuzzy membership functions is carried out by using an ant colony optimization technique for optimal fuzzy logic control. An exhaustive simulation study using the MATLAB/Simulink is carried out to investigate and demonstrate the performance of the proposed fuzzy logic controller and the simulation results are compared with the PI controller in terms of its performance in improving the power quality by minimizing the voltage sag and total harmonic distortion.

  10. Reservoir Simulator Runtime Enhancement Based on a Posteriori Error Estimation Techniques

    Directory of Open Access Journals (Sweden)

    Gratien Jean-Marc


    Full Text Available In this work, we show how the a posteriori error estimation techniques proposed in [Di Pietro et al. (2014) Computers & Mathematics with Applications 68, 2331-2347] can be efficiently employed to improve the performance of a compositional reservoir simulator dedicated to Enhanced Oil Recovery (EOR) processes. This a posteriori error estimate makes it possible to propose an adaptive mesh refinement algorithm, leading to a significant reduction in the number of mesh cells compared to a uniformly fine mesh, and to formulate criteria for stopping the iterative algebraic solver and the iterative linearization solver without any loss of precision. The emphasis of this paper is on the computational cost of the error estimators. We introduce an efficient computation using a practical simplified formula that can be easily implemented in a reservoir simulation code. Numerical results for a real-life reservoir engineering example in three dimensions show that we obtain a significant gain in CPU time without affecting the accuracy of the oil production forecast.
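The solver-stopping idea can be illustrated outside any reservoir code: iterate the algebraic solver only until its error estimate falls below a fraction gamma of the discretization error estimate, since further algebraic accuracy cannot improve the overall solution. The toy below uses Jacobi iteration with a simple residual as the algebraic estimate; both estimates are simplistic stand-ins for the paper's flux-reconstruction estimators.

```python
def jacobi_with_adaptive_stop(A, b, disc_est, gamma=0.1, max_it=500):
    """Solve A x = b by Jacobi iteration, stopping once the residual
    (algebraic error estimate) <= gamma * disc_est, the point beyond
    which iterating is wasted effort relative to discretization error."""
    n = len(b)
    x = [0.0] * n
    for it in range(1, max_it + 1):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
        res = max(abs(b[i] - sum(A[i][j] * x[j] for j in range(n)))
                  for i in range(n))
        if res <= gamma * disc_est:
            return x, it
    return x, max_it
```

The same criterion applied to the linearization (Newton) loop is what yields the CPU-time savings reported above without loss of forecast accuracy.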

  11. Distribution of indoor thoron in dwellings under normal and turbulent flow conditions using CFD simulation technique

    Directory of Open Access Journals (Sweden)

    Chauhan Rishi Pal


    Full Text Available Extensive work has been carried out on the measurement of radon and thoron levels in indoor environments over the last three decades. These studies are important from a radiation protection point of view, considering the contribution of radon, thoron and their decay products to the total inhalation dose. Numerous studies on radon measurement have well established the behaviour of its dispersion in dwellings. But the short half-life of thoron makes it difficult to understand its distribution in dwellings. The problem becomes more complicated when thoron dispersion is studied under different inlet air flow rates: different air flow patterns may produce different thoron levels at different points in test dwellings, causing uncertainty in the measurements. This work utilized the CFD simulation technique to study indoor thoron dispersion in test dwellings under normal and turbulent air flow. The simulation of the thoron distribution in a test room was performed for air velocities of 0.10, 0.25, 0.50, 1.0, 1.5, and 2.0 m/s. The results show that the thoron distribution becomes uniform for inlet velocities above 0.5 m/s, which is appropriate for measuring the indoor thoron concentration, while under normal conditions the measured thoron level varies depending upon the location of the dosimeter. Thoron diffusion and migration lengths also increase with the air flow rate.
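The short half-life driving this problem (about 55.6 s for 220Rn) limits how far thoron travels before decaying: in still air its e-folding diffusion length L = sqrt(D / lambda) is only a few centimetres, which is why the flow field dominates the indoor distribution. A sketch, with the diffusion coefficient a typical assumed value for air rather than a measured one:

```python
import math

HALF_LIFE_TN = 55.6                      # s, half-life of 220Rn (thoron)
LAMBDA_TN = math.log(2) / HALF_LIFE_TN   # decay constant, 1/s

def diffusion_length(d_coeff):
    """L = sqrt(D / lambda): e-folding distance of thoron in still air."""
    return math.sqrt(d_coeff / LAMBDA_TN)

def concentration(x, c0, d_coeff):
    """Steady 1-D diffusion-decay profile from a wall source:
    C(x) = C0 * exp(-x / L)."""
    return c0 * math.exp(-x / diffusion_length(d_coeff))
```

With an assumed D of order 1e-5 m^2/s this gives L of roughly 3 cm, consistent with the strong spatial gradients that make dosimeter placement so critical under normal (low-flow) conditions.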



    N. A. Degotinsky; V. R. Lutsiv


    Subject of Research. We studied a method of estimating the object distance on the basis of a single defocused photograph. The method is based on the analysis of image defocus at the contour points corresponding to the borders of photographed objects. It is supposed that the brightness drop in a non-defocused image of a border can be modeled with an ideal step function, the Heaviside function. Method. The contours corresponding to local maxima of the brightness gradient are detected in the initial ima...

  13. Simulation and Track Reconstruction Techniques for the J-PARC muon g-2 experiment

    Directory of Open Access Journals (Sweden)

    Tsilias Paschalis


    Full Text Available The proposed Muon g-2/EDM experiment at J-PARC is a promising and innovative effort in the field of precision physics. The sensitivity goal of 0.1 ppm will test the limits of our current understanding, and may probe for Beyond-the-Standard-Model observations. This paper investigates the computational techniques required by the experiment. The GEANT4 [1] framework was used to simulate the detector setup according to the experiment's Conceptual Design Report (CDR) [2]. This allowed us to observe the event hierarchy at different energies, generate signal hit data, and construct an event-selection algorithm. ROOT and GDML enabled us to use the geometry and parsed output data in a platform-independent way. Using techniques from machine learning and image feature extraction, such as Canny edge detection and the Hough transform, we were able to construct a generic representation of 'track families' for each event category. Finally, the modular GENFIT2 [3] framework was used to implement the Kalman filter [4], along with a Deterministic Annealing Filter (DAF) [5] and the Runge-Kutta stepper, to reconstruct tracks from digitized, smeared single-event data.

  14. Monte Carlo simulation in proton computed tomography: a study of image reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Inocente, Guilherme Franco; Stenico, Gabriela V.; Hormaza, Joel Mesa [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica


    Full text: Radiation therapy is one of the most widely used methods for cancer treatment. In this context, therapy with proton beams has emerged as an alternative to conventional radiotherapy. It is known that proton therapy offers more advantages to the treated patient when compared with more conventional methods: the dose distributed along the path, especially in healthy tissues neighboring the tumor, is smaller, and the accuracy of treatment is much better. To carry out the treatment, the patient undergoes planning based on images used for visualization and localization of the target volume. The main method for obtaining these images is X-ray computed tomography (XCT). For treatment with proton beams, this imaging technique can generate some uncertainties. The purpose of this project is to study the feasibility of reconstructing images generated from irradiation with proton beams, thereby reducing some inaccuracies, as the imaging would use the same type of radiation as the treatment, and also drastically reducing some localization errors, since planning can be done at the same place, and just before, the patient is treated. This study aims to obtain a relationship between the intrinsic properties of the interaction of photons and protons with matter. For this we use computational simulation based on the Monte Carlo method with the codes SRIM 2008 and MCNPX v.2.5.0 to reconstruct images using the technique employed in conventional computed tomography. (author)
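The photon side of the photon/proton comparison rests on exponential attenuation, which a Monte Carlo transport code reproduces by sampling free path lengths from an exponential distribution. A minimal illustration (not SRIM or MCNPX; the attenuation coefficient is an arbitrary assumed value):

```python
import math
import random

def mc_transmission(mu, thickness, n=200_000, seed=42):
    """Monte Carlo estimate of the fraction of photons crossing a slab
    without interacting. Free paths are sampled as -ln(1-u)/mu, the
    inverse-CDF of the exponential free-path distribution."""
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if -math.log(1.0 - rng.random()) / mu > thickness)
    return passed / n
```

The estimate converges to the Beer-Lambert law I/I0 = exp(-mu * x); proton transport differs fundamentally in that protons lose energy quasi-continuously and stop near a well-defined range (the Bragg peak), which is what proton CT exploits.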

  15. An object localization optimization technique in medical images using plant growth simulation algorithm. (United States)

    Bhattacharjee, Deblina; Paul, Anand; Kim, Jeong Hong; Kim, Mucheol


    The analysis of leukocyte images has drawn interest from the fields of both medicine and computer vision for quite some time, and different techniques have been applied to automate the manual analysis and classification of such images. Manual analysis of blood samples to identify leukocytes is time-consuming and susceptible to error due to the differing morphological features of the cells. In this article, the nature-inspired plant growth simulation algorithm is applied to optimize object localization in medical images of leukocytes. The paper presents a random bionic algorithm for the automated detection of white blood cells embedded in cluttered smear and stained images of blood samples, using a fitness function that measures the resemblance of a generated candidate solution to an actual leukocyte. The set of candidate solutions evolves via successive iterations as the proposed algorithm proceeds, guaranteeing their fit with the actual leukocytes outlined in the edge map of the image. The higher precision and sensitivity of the proposed scheme over existing methods is validated with experimental results on blood cell images. The proposed method reduces the feasible set of growth points in each iteration, thereby reducing the run time required for objective function evaluation, thus reaching the goal state in minimum time and within the desired constraints.
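    The growth-point idea can be sketched as follows: keep the fittest candidate points, sprout new candidates near them, and let the feasible set contract toward the object. Everything here is a hypothetical stand-in — the fitness function is a toy (inverse distance to a known target, in place of template matching against a leukocyte edge map), and `pgsa_localize` is a simplified sketch, not the authors' algorithm.

```python
import random

def fitness(point, target):
    # Hypothetical fitness: inverse L1 distance to the true object center,
    # standing in for resemblance to a leukocyte in the edge map.
    return 1.0 / (1.0 + abs(point[0] - target[0]) + abs(point[1] - target[1]))

def pgsa_localize(target, size=64, n_keep=10, iterations=40, seed=0):
    """Toy plant-growth-simulation search: prune weak growth points each
    iteration and sprout new candidates near the survivors."""
    rng = random.Random(seed)
    points = [(rng.randrange(size), rng.randrange(size)) for _ in range(2 * n_keep)]
    for _ in range(iterations):
        points.sort(key=lambda p: fitness(p, target), reverse=True)
        points = points[:n_keep]                      # shrink the feasible set
        points += [(min(size - 1, max(0, x + rng.randint(-2, 2))),
                    min(size - 1, max(0, y + rng.randint(-2, 2))))
                   for x, y in points]                # sprout near survivors
    return max(points, key=lambda p: fitness(p, target))

best = pgsa_localize(target=(40, 17))
```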

  16. Determination of true coincidence correction factors using Monte-Carlo simulation techniques

    Directory of Open Access Journals (Sweden)

    Chionis Dionysios A.


    Full Text Available The aim of this work is the numerical calculation of true coincidence correction factors by means of Monte Carlo simulation techniques. For this purpose, the Monte Carlo computer code PENELOPE was used, and the main program PENMAIN was modified in order to include the effect of the true coincidence phenomenon. The modified main program was used for the full-energy-peak efficiency determination of an XtRa Ge detector with 104% relative efficiency, and the results obtained for the 1173 keV and 1332 keV photons of 60Co were found to be consistent with the respective experimental ones. The true coincidence correction factors were then calculated as the ratio of the full-energy-peak efficiencies determined from the original main program PENMAIN and from the modified one. The developed technique was applied for 57Co, 88Y, and 134Cs and for two source-to-detector geometries. The results were compared with true coincidence correction factors calculated with the "TrueCoinc" program, and the relative bias was found to be less than 2%, 4%, and 8% for 57Co, 88Y, and 134Cs, respectively.
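    The correction factor described above reduces to a ratio of the two simulated efficiencies. A minimal sketch, with hypothetical efficiency values (not the paper's numbers):

```python
def coincidence_correction(eff_no_coinc, eff_with_coinc):
    """True-coincidence correction factor as the ratio of full-energy-peak
    efficiencies computed without and with the coincidence effect, mirroring
    the original-versus-modified PENMAIN comparison described above."""
    return eff_no_coinc / eff_with_coinc

# Hypothetical efficiencies for a close source-to-detector geometry, where
# summing-out losses lower the apparent full-energy-peak efficiency:
factor = coincidence_correction(eff_no_coinc=0.052, eff_with_coinc=0.047)
corrected_counts = 10000 * factor   # apply the factor to a measured peak area
```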

  17. A novel 3D modelling and simulation technique in thermotherapy predictive analysis on biological tissue (United States)

    Fanjul-Vélez, F.; Arce-Diego, J. L.; Romanov, Oleg G.; Tolstik, Alexei L.


    Optical techniques applied to biological tissue allow the development of new tools in medical praxis, for either tissue characterization or treatment. Examples of the latter are Photodynamic Therapy (PDT), Low Intensity Laser Treatment (LILT), and a promising technique called thermotherapy, which tries to control the temperature increase in a pathological tissue in order to reduce or even eliminate pathological effects. The application of thermotherapy requires a previous analysis in order to avoid collateral damage to the patient and to choose the appropriate optical source parameters. Among different implementations of opto-thermal models, the one we use consists of a three-dimensional Beer-Lambert law for the optical part and, for the thermal part, a bio-heat equation that models heat transfer (conduction, convection, radiation, blood perfusion, and vaporization), solved via a numerical spatial-temporal explicit finite-difference approach. The usual drawback of the numerical method of the thermal model is that convergence constraints make the spatial and temporal steps very small, with the natural consequence of slow processing. In this work, a new algorithmic implementation is used for the solution of the bio-heat equation, in such a way that the simulation time decreases considerably. Thermal damage based on the Arrhenius damage integral is also considered.
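    The explicit finite-difference scheme and its stability constraint can be illustrated in one dimension: a Pennes-type bio-heat equation (diffusion plus blood perfusion) driven by a Beer-Lambert laser source. All tissue and laser parameters below are illustrative placeholders, not values from the paper, and the 1-D model is only a sketch of the 3-D scheme described above.

```python
import math

# Illustrative tissue/laser parameters (placeholders, not the paper's values)
k, rho, c = 0.5, 1000.0, 3600.0       # W/m/K, kg/m^3, J/kg/K
w_b, c_b, T_a = 0.5, 3600.0, 37.0     # perfusion (kg/m^3/s), blood heat cap., arterial T
mu_a, I0 = 100.0, 5e4                 # absorption (1/m), surface irradiance (W/m^2)

nx, dx, dt, steps = 50, 1e-3, 0.05, 200
assert dt < rho * c * dx ** 2 / (2 * k)   # explicit-scheme stability constraint

T = [37.0] * nx                            # body temperature everywhere
for _ in range(steps):
    Tn = T[:]
    for i in range(1, nx - 1):             # boundaries held at 37 C
        diffusion = k * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
        perfusion = w_b * c_b * (T_a - T[i])
        source = mu_a * I0 * math.exp(-mu_a * i * dx)   # Beer-Lambert deposition
        Tn[i] = T[i] + dt * (diffusion + perfusion + source) / (rho * c)
    T = Tn
```

The stability constraint is exactly the "small steps, slow processing" drawback the abstract mentions: halving dx forces dt down by a factor of four.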

  18. A Simulation Technique for Three-Dimensional Mechanical Systems Using Universal Software Systems of Analysis

    Directory of Open Access Journals (Sweden)

    V. A. Trudonoshin


    Full Text Available The article proposes a technique to develop mathematical models (MM) of elements of three-dimensional (3D) mechanical systems for universal simulation software systems, which allow the MM of a system to be generated automatically from the MMs of its elements and their connections. The technique is based on the MM of a 3D body. Linear and angular velocities are used as the main (unknown) phase variables in the MM of the system, and linear and angular displacements as the additional ones, the latter defined by normalized quaternions, which have computational advantages over rotation angles. The paper considers the equations of dynamics and the formulas for transforming from the global coordinate system to the local one and vice versa. A spherical movable joint is presented as an example of an interaction element between bodies. The paper shows the MM equivalent circuits of a body and of a spherical joint. Such a representation, as an equivalent circuit, automatically enables us to obtain the topological equations of the system. Various options to build the equations of the joint, and advice for their practical use, are given.
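    The quaternion machinery behind the global/local coordinate transformations can be sketched minimally: a normalized quaternion represents the body orientation, and rotating a vector by it maps local coordinates to global ones. This is the standard formula, not the article's specific formulation.

```python
import math

def quat_normalize(q):
    """Renormalize an orientation quaternion (w, x, y, z); drift accumulated
    during numerical integration is why dynamics codes renormalize."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z), using
    v' = v + 2*w*(u x v) + 2*(u x (u x v)) with u = (x, y, z)."""
    w, x, y, z = q
    t = tuple(2.0 * c for c in cross((x, y, z), v))
    u = cross((x, y, z), t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

# A 90-degree rotation about the z axis maps the x axis onto the y axis:
q = quat_normalize((math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)))
vx = quat_rotate(q, (1.0, 0.0, 0.0))
```

Unlike rotation angles, this representation has no gimbal-lock singularity, which is the computational advantage the abstract alludes to.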

  19. A graphical simulator for teaching basic and advanced MR imaging techniques

    DEFF Research Database (Denmark)

    Hanson, Lars G


    Teaching of magnetic resonance (MR) imaging techniques typically involves considerable handwaving, literally, to explain concepts such as resonance, rotating frames, dephasing, refocusing, sequences, and imaging. A proper understanding of MR contrast and imaging techniques is crucial for radiologists, radiographers, and technical staff alike, but it is notoriously challenging to explain spin dynamics by using traditional teaching tools. The author developed a freely available graphical simulator based on the Bloch equations to aid in the teaching of topics ranging from precession and relaxation to advanced concepts such as stimulated echoes, spin tagging, and k-space methods. A graphical user interface provides the user with a three-dimensional view of spin isochromates that can be manipulated by selecting radiofrequency pulses and gradient events. Even complicated sequences can be visualized in an intuitive way. The cross-platform software is primarily designed for use in lectures, but is also useful for self-studies and student assignments. Movies available at ....
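    The Bloch equations that drive such a simulator are compact enough to sketch directly: in the rotating frame, the magnetization precesses at the off-resonance frequency while T2 decays the transverse part and T1 recovers the longitudinal part. This toy explicit-Euler integrator is only an illustration of the underlying physics, not the author's software; all parameter values are placeholders.

```python
import math

def bloch_step(M, dt, omega=0.0, T1=1.0, T2=0.1, M0=1.0):
    """One explicit Euler step of the Bloch equations in the rotating frame:
    precession at off-resonance omega (rad/s) plus T1/T2 relaxation."""
    Mx, My, Mz = M
    dMx = omega * My - Mx / T2
    dMy = -omega * Mx - My / T2
    dMz = (M0 - Mz) / T1
    return (Mx + dt * dMx, My + dt * dMy, Mz + dt * dMz)

# After an ideal 90-degree pulse the magnetization lies along x; the
# transverse part then precesses and decays with T2 while Mz recovers.
M = (1.0, 0.0, 0.0)
for _ in range(1000):            # simulate 0.1 s at dt = 0.1 ms
    M = bloch_step(M, dt=1e-4, omega=2 * math.pi * 10)
```

After one T2 period the transverse magnitude has fallen to roughly 1/e while Mz has recovered about 10% of the way to equilibrium, which is exactly the kind of behavior the simulator animates.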

  20. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    Energy Technology Data Exchange (ETDEWEB)

    Booth, Corwin H; Hu, Yung-Jin


    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance-matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting of EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ² statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community of the role of fundamental noise distributions in interpreting our final results.
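    The two quantities being balanced can be written down in a few lines: Stern's rule for the information content of a fit range, and a reduced chi-square whose degrees of freedom use that count rather than the raw number of grid points. A sketch of the bookkeeping only (the simpler form of Stern's rule, without his "+2", is used here); it is not the paper's analysis code.

```python
import math

def independent_points(dk, dr):
    """Stern's rule for EXAFS: N_idp ~ 2*dk*dr/pi, the information content
    of a fit over k-range dk (1/A) and r-range dr (A)."""
    return 2.0 * dk * dr / math.pi

def reduced_chi2(residuals, sigmas, n_idp, n_params):
    """Reduced chi-square with degrees of freedom nu = N_idp - N_params,
    rescaled from the number of evaluated grid points to N_idp."""
    chi2 = sum((r / s) ** 2 for r, s in zip(residuals, sigmas))
    nu = n_idp - n_params
    return (n_idp / len(residuals)) * chi2 / nu

n_idp = independent_points(dk=10.0, dr=3.0)   # ~19 independent points
```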

  1. Monte Carlo Simulation of Emission Tomography and other Medical Imaging Techniques (United States)

    Harrison, Robert L.


    As an introduction to Monte Carlo simulation of emission tomography, this paper reviews the history and principles of Monte Carlo simulation, then applies these principles to emission tomography using the public domain simulation package SimSET (a Simulation System for Emission Tomography) as an example. Finally, the paper discusses how the methods are modified for X-ray computed tomography and radiotherapy simulations.
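    The Monte Carlo principle the paper reviews — sample each photon's emission and transport stochastically, then count outcomes — can be shown with a deliberately tiny example. SimSET itself models full 3-D transport, scatter, and detector response; the sketch below only samples 2-D emission angles from an off-center point in a water-like disk and applies Beer-Lambert survival along the chord to the edge (mu close to the narrow-beam value for 511 keV photons in water). The geometry values are illustrative assumptions.

```python
import math
import random

def escape_fraction(n_photons, mu=0.096, radius=10.0, offset=4.0, seed=1):
    """Toy single-photon Monte Carlo: photons emitted isotropically (2-D)
    from a point `offset` cm from the center of a disk of water-equivalent
    material (mu in 1/cm) survive attenuation with probability e^(-mu*L)
    along the chord L to the boundary; count the escapees."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_photons):
        theta = rng.uniform(0.0, 2 * math.pi)
        # chord length from (offset, 0) to the disk boundary along theta
        path = -offset * math.cos(theta) + math.sqrt(
            radius ** 2 - (offset * math.sin(theta)) ** 2)
        if rng.random() < math.exp(-mu * path):
            detected += 1
    return detected / n_photons

frac = escape_fraction(200_000)
```

With these numbers roughly 40% of photons escape unattenuated, and the statistical uncertainty shrinks as 1/sqrt(N), the characteristic Monte Carlo convergence rate.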

  2. Personal photograph enhancement using internet photo collections. (United States)

    Zhang, Chenxi; Gao, Jizhou; Wang, Oliver; Georgel, Pierre; Yang, Ruigang; Davis, James; Frahm, Jan-Michael; Pollefeys, Marc


    Given the growth of Internet photo collections, we now have a visual index of all major cities and tourist sites in the world. However, it is still a difficult task to capture that perfect shot with your own camera when visiting these places, especially when your camera itself has limitations, such as a limited field of view. In this paper, we propose a framework to overcome the imperfections of personal photographs of tourist sites using the rich information provided by large-scale Internet photo collections. Our method deploys state-of-the-art techniques for constructing initial 3D models from photo collections. The same techniques are then used to register personal photographs to these models, allowing us to augment personal 2D images with 3D information. This strong available scene prior allows us to address a number of traditionally challenging image enhancement techniques and achieve high-quality results using simple and robust algorithms. Specifically, we demonstrate automatic foreground segmentation, mono-to-stereo conversion, field-of-view expansion, photometric enhancement, and additionally automatic annotation with geolocation and tags. Our method clearly demonstrates some possible benefits of employing the rich information contained in online photo databases to efficiently enhance and augment one's own personal photographs.

  3. Photographic Portraits: Narrative and Memory

    Directory of Open Access Journals (Sweden)

    Brian Roberts


    Full Text Available This article is a more general "companion" to the subsequent article, Brian ROBERTS (2011), "Interpreting Photographic Portraits: Autobiography, Time Perspectives and Two School Photographs". The article seeks to add to the growing awareness of the importance of visual materials and methods in qualitative social research and to give an introduction to the "photographic self image"—self-portraits and portraits. It focuses on time and memory, including the experiential associations (in consciousness and the senses) that the self engenders, thus linking the "visual" (photographic) and the "auto/biographical". The article attempts to "map" a field—the use of portraiture within social science—drawing on narrative and biographical research, on one side, and photographic portraiture, on the other. In supporting the use of photography in qualitative research, it points to the need for researchers to have a greater knowledge of photographic (and art) criticism and cognisance of photographic practices. The article does not intend to give a definitive account of photographic portraiture or prescribe in detail how it may be used within social science. It is an initial overview of the development of, and issues within, the area of photographic portraiture and an exploration of relevant methodological issues when images of individuals are employed within social science—so that "portraiture" is better understood and developed within biographical and narrative research. URN:

  4. Painter 12 for photographers creating painterly images step by step

    CERN Document Server

    Addison, Martin


    Transform your photographs into stunning works of art with this fully updated, authoritative guide to the all-new Painter 12. Whether you are new to Painter or a seasoned pro wanting to go further with your digital art, Painter 12 for Photographers will show you how to get the most out of Corel's powerful painting software. Starting with the basics and moving on to cover brushes, textures, cloning, toning, and other effects, Martin Addison will help you master the techniques needed to transform photographs into beautiful painterly images. Packed with vivid images to illustrate what can be achieved.

  5. Effect of composite insertion technique on cuspal deflection using an in vitro simulation model. (United States)

    Jafarpour, Saeed; El-Badrawy, Wafa; Jazi, Hamid Salimi; McComb, Dorothy


    The objective of this study was to investigate, by simulation, the effect of conventional composite resin insertion techniques on cuspal deflection using bonded typodont artificial teeth. The deflection produced by a new low-shrinkage composite was also determined. Sixty standardized MOD preparations on ivorine maxillary premolars were prepared: group A at 4 mm depth and group B at 6 mm depth. Each group was further subdivided according to composite insertion technique (n=6), as follows: 1) bulk insertion, 2) horizontal increments, 3) tangential increments, and 4) a modified tangential technique. Preparations were microetched, acid-cleaned, and bonded with adhesive resin to provide micromechanical attachment before restoration with a conventional composite (Spectrum TPH3, Dentsply). Two additional subgroups at 4 mm and 6 mm depth (n=6) were restored in bulk using low-shrinkage composite (Filtek LS, 3M/ESPE). All groups received the same total photo-polymerization time. Cuspal deflection was measured during the restorative procedure using two Linear Variable Differential Transformers attached to a data acquisition system. The average cuspal deflections for group A were 1) 40.17 ± 1.18 μm, 2) 25.80 ± 4.98 μm, 3) 28.27 ± 5.12 μm, and 4) 27.33 ± 2.42 μm. The deflections in group B were 1) 38.82 ± 3.64 μm, 2) 50.39 ± 9.17 μm, 3) 55.62 ± 8.16 μm, and 4) 49.61 ± 8.01 μm. Cuspal flexure for the low-shrinkage composite was 11.14 ± 1.67 μm (group A: 4 mm depth) and 16.53 ± 2.79 μm (group B: 6 mm depth). All insertion techniques using conventional composite caused cuspal deformation. In general, deeper preparations showed increased cuspal deflection, except in the case of bulk insertion, which was likely affected by decreased depth of cure. Cuspal movement using low-shrinkage composite was significantly reduced.

  6. The effect of experience, simulator-training and biometric feedback on manual ventilation technique. (United States)

    Lewis, Rebecca; Sherfield, Cerrie A; Fellows, Christopher R; Burrow, Rachel; Young, Iain; Dugdale, Alex


    To determine the frequency of provision and main providers (veterinary surgeons, nurses or trainees) of manual ventilation in UK veterinary practices. Furthermore, to determine the variation in peak inspiratory (inflation) pressure (PIP) applied to a lung model during manual ventilation by three different groups of operators (inexperienced, experienced and specialist), before and after training. Questionnaire survey, lung model simulator development and prospective testing. Postal questionnaires were sent to 100 randomly selected veterinary practices. The lung model simulator was manually ventilated in a staged process over 3 weeks, with and without real-time biometric feedback (PIP display), by three groups of volunteer operators: inexperienced, experienced and specialist. The questionnaires determined that veterinary nurses were responsible for providing the majority of manual ventilation in veterinary practices, mainly drawing on theoretical knowledge rather than any specific training. Thoracic surgery and apnoea were the main reasons for provision of manual ventilation. Specialists performed well when manually ventilating the lung model, regardless of feedback training. Both inexperienced and experienced operators showed significant improvement in technique when using the feedback training tool: variation in PIP decreased significantly until operators provided manual ventilation at PIPs within the defined optimum range. Preferences for different forms of feedback (graphical, numerical or scale display) revealed that the operators' choice was not always the method which gave least variation in PIP. This study highlighted a need for training in manual ventilation at an early stage in veterinary and veterinary nursing careers and demonstrated how feedback is important in the process of experiential learning. A manometer device which can provide immediate feedback during training, or indeed in a real clinical setting, should improve patient safety.

  7. Photograph Usage in History Education (United States)

    Akbaba, Bulent


    In this study, the effect of photograph use in history education on students' achievement was investigated. In the study, which used a pretest-posttest control group design, the experimental group's courses were framed around the analytical use of photographs, while the control group's courses were conducted…

  8. A French survey on breast radiotherapy techniques for simulation and treatment; Enquete sur les techniques de preparation et de traitement utilisees pour la radiotherapie du sein en France

    Energy Technology Data Exchange (ETDEWEB)

    Fournier-Bidoz, N.; Rosenwald, J.C. [Institut Curie, Dept. d' Oncologie-Radiotherapie, 75 - Paris (France); Romestaing, P. [Centre Hospitalier Lyon-Sud, Dept. d' Oncologie-Radiotherapie, 69 - Pierre-Benite (France)


    Breast radiotherapy is still evolving. The target volumes - whole breast and lymph nodes - are usually located by clinical palpation and the use of bony landmarks. However, computed tomography has allowed a better definition of the deep edge of the volumes and the calculation of 3D dose distributions. A survey of 194 centers was started in June 2005 in France. The questionnaire that was sent included questions about general techniques in breast radiotherapy. Preliminary results from 50 centers showed that patient anatomical data were in the vast majority acquired by a simulator-CT or a CT scanner (92%). In the 50 departments, beam placement is done either directly at the simulator (20 centers) or on the TPS (16 centers). Virtual simulation software is used in 8 centers. In about 20% (11) of the radiotherapy departments, 3D target volumes are contoured and the beams adapted to their shapes. (author)

  9. Digital image processing of flow visualization photographs (United States)

    Hesselink, L.; White, B. S.


    This paper is concerned with the propagation of laser light through a slab of a randomly varying medium. A theoretical analysis is presented which relates the spectrum of the recorded intensity field some distance downstream of the medium to the spectrum of the index-of-refraction field. For a homogeneous and isotropic random field, the 3-D spectrum of the medium is obtained from the 2-D spectrum of the photograph by dividing each component of the spectrum by the frequency raised to the fourth power. Free-space propagation outside the random medium is accounted for by a scaling factor. Experimental results are presented which support the theoretical analysis. The nonintrusive diagnostic technique presented here is applicable to photographs which contain partially developed caustic networks.
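    The stated inversion — divide each component of the photograph's 2-D spectrum by the spatial frequency to the fourth power — can be sketched numerically. The naive DFT, the test image, and the regularizing `eps` are all illustrative choices, and the free-space scaling factor is omitted.

```python
import cmath
import math

def dft2_power(img):
    """Naive 2-D DFT power spectrum (adequate for the tiny example below)."""
    n = len(img)
    return [[abs(sum(img[y][x] * cmath.exp(-2j * math.pi * (u * y + v * x) / n)
                     for y in range(n) for x in range(n))) ** 2
             for v in range(n)] for u in range(n)]

def index_spectrum(photo_power, eps=1e-12):
    """Sketch of the paper's relation: divide each 2-D spectral component by
    |k|^4 to recover the (unscaled) 3-D index-of-refraction spectrum."""
    n = len(photo_power)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            fu = min(u, n - u) / n      # signed-frequency magnitude
            fv = min(v, n - v) / n
            out[u][v] = photo_power[u][v] / ((fu * fu + fv * fv) ** 2 + eps)
    return out

img = [[math.sin(x + 2 * y) for x in range(8)] for y in range(8)]
S = index_spectrum(dft2_power(img))
```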

  10. Modeling and simulation of PEM fuel cell's flow channels using CFD techniques

    Energy Technology Data Exchange (ETDEWEB)

    Cunha, Edgar F.; Andrade, Alexandre B.; Robalinho, Eric; Bejarano, Martha L.M.; Linardi, Marcelo [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]; Cekinski, Efraim [Instituto de Pesquisas Tecnologicas (IPT-SP), Sao Paulo, SP (Brazil)]


    Fuel cells are among the most important devices for obtaining electrical energy from hydrogen. The Proton Exchange Membrane Fuel Cell (PEMFC) consists of two important parts: the Membrane Electrode Assembly (MEA), where the reactions occur, and the flow field plates. The plates have many functions in a fuel cell: distributing the reactant gases (hydrogen and air or oxygen), conducting electrical current, removing heat and water from the electrodes, and making the cell robust. The cost of the bipolar plates accounts for up to 45% of the total stack cost. Computational Fluid Dynamics (CFD) is a very useful tool for simulating the hydrogen and oxygen gas flow channels, reducing the cost of bipolar plate production, and optimizing mass transport. Two types of flow channels were studied. The first was a commercial plate by ELECTROCELL, and the other was entirely designed at the Programa de Celula a Combustivel (IPEN/CNEN-SP); the experimental data were compared with modelling results. Optimum values for each set of variables were obtained, and model verification was carried out in order to show the feasibility of this technique for improving fuel cell efficiency. (author)

  11. The Rhetoric of the Frame Revisioning Archival Photographs in "The Civil War." (United States)

    Lancioni, Judith


    Illustrates the ways in which mobile framing and reframing (techniques used on the archival photographs used in the documentary film "The Civil War") constitute a visual argument. Suggests that these techniques lead viewers to analyze the photographs from the vantage point of both current and past ideologies, and proves especially…

  12. What Makes a Photograph Memorable? (United States)

    Isola, Phillip; Jianxiong Xiao; Parikh, Devi; Torralba, Antonio; Oliva, Aude


    When glancing at a magazine, or browsing the Internet, we are continuously exposed to photographs. Despite this overflow of visual information, humans are extremely good at remembering thousands of pictures along with some of their visual details. But not all images are equal in memory. Some stick in our minds while others are quickly forgotten. In this paper, we focus on the problem of predicting how memorable an image will be. We show that memorability is an intrinsic and stable property of an image that is shared across different viewers, and remains stable across delays. We introduce a database for which we have measured the probability that each picture will be recognized after a single view. We analyze a collection of image features, labels, and attributes that contribute to making an image memorable, and we train a predictor based on global image descriptors. We find that predicting image memorability is a task that can be addressed with current computer vision techniques. While making memorable images is a challenging task in visualization, photography, and education, this work is a first attempt to quantify this useful property of images.

  13. The importance of being systematically surprisable : Comparative social simulation as experimental technique

    NARCIS (Netherlands)

    Achterkamp, M.C.; Imhof, P.


    We argue that computer simulation can serve as a functional equivalent of the experimental method in sociology, with respect to theory development. To this end we present accounts of experimentation and simulation by experimenting/simulating scientists and sociologists of science. From these…

  14. Application of a cycle jump technique for acceleration of fatigue crack growth simulation

    DEFF Research Database (Denmark)

    Moslemian, Ramin; Karlsson, A.M.; Berggreen, Christian


    A method for accelerated simulation of fatigue crack growth in a bimaterial interface is proposed. To simulate fatigue crack growth in a bimaterial interface a routine is developed in the commercial finite element code ANSYS and a method to accelerate the simulation is implemented. The proposed...

  15. Martin Parr in Mexico: Does Photographic Style Translate?

    Directory of Open Access Journals (Sweden)

    Timothy R. Gleason


    Full Text Available This study analyzes Martin Parr’s 2006 photobook, Mexico. Parr is a British documentary photographer best known for a direct photographic style that reflects upon “Englishness.” Mexico is his attempt to understand this foreign country via his camera. Mexico, as a research subject, is not a problem to solve but an opportunity to understand a photographer’s work. Parr’s Mexico photography (technique, photographic content, and interest in globalization, economics, and culture) is compared to his previous work to explain how Parr uses fashion and icons to represent a culture or class. This article argues Parr’s primary subjects, heads/hats, food, and Christs, are photographed without excessive aesthetic pretensions so that the thrust of Parr’s message about globalization can be more evident: Mexico maintains many of its traditions and icons while adopting American brands.

  16. Simulations of the heat exchange in thermoplastic injection molds manufactured by additive techniques (United States)

    Daldoul, Wafa; Toulorge, Thomas; Vincent, Michel


    The cost and quality of complex parts manufactured by thermoplastic injection is traditionally limited by design constraints on the cooling system of the mold. A possible solution is to create the mold by additive manufacturing, which makes it possible to freely design the cooling channels. Such molds normally contain hollow parts (alveoli) in order to decrease their cost. However, the complex geometry of the cooling channels and the alveoli makes it difficult to predict the performance of the cooling system. This work aims to compute the heat exchanges between the polymer, the mold and the cooling channels with complex geometries. An Immersed Volume approach is taken, where the different parts of the domain (i.e. the polymer, the cooling channels, the alveoli and the mold) are represented by level-sets and the thermo-mechanical properties of the materials vary smoothly at the interface between the parts. The energy and momentum equations are solved by a stabilized Finite Element method. In order to accurately resolve the large variations of material properties and the steep temperature gradients at interfaces, state-of-the-art anisotropic mesh refinement techniques are employed. The filling stage of the process is neglected. In a first step, only the heat equation is solved, so that the packing stage is also disregarded. In a second step, thermo-mechanical effects occurring in the polymer during the packing stage are taken into account, which results in the injection of an additional amount of polymer that significantly influences the temperature evolution. The method is validated on the simple geometry of a center-gated disk and compared with experimental measurements. The agreement is very good. Simulations are performed on an industrial case which illustrates the ability of the method to deal with complex geometries.
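    The level-set blending of material properties at an interface can be sketched in a few lines: a smoothed Heaviside of the signed distance interpolates a property between two materials over a band of half-width eps. The conductivity values below are illustrative placeholders for a polymer and a steel mold, not the paper's data.

```python
import math

def smoothed_property(phi, val_in, val_out, eps):
    """Immersed-volume sketch: blend a material property across the interface
    using a smoothed Heaviside of the signed distance phi, as is common in
    level-set methods (illustrative, not the paper's formulation)."""
    if phi < -eps:
        return val_in
    if phi > eps:
        return val_out
    h = 0.5 * (1 + phi / eps + math.sin(math.pi * phi / eps) / math.pi)
    return val_in + (val_out - val_in) * h

# Thermal conductivity varies smoothly from polymer (~0.2 W/m/K) to a
# steel mold (~30 W/m/K) across a 1 mm interface band:
k_mid = smoothed_property(0.0, 0.2, 30.0, eps=1e-3)
```

Because the property is continuous, the finite element solver never sees a jump discontinuity, which is what makes the steep interface gradients tractable once the mesh is refined anisotropically there.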

  17. A comparison between two negative pressure irrigation techniques in simulated immature tooth: an ex vivo study. (United States)

    Jamleh, Ahmed; Fukumoto, Yasue; Takatomo, Yoshioka; Kobayashi, Chihiro; Suda, Hideaki; Adorno, Carlos G


    This ex vivo study evaluated the irrigation efficacy of a new apical negative pressure (ANP) system in canals of simulated immature teeth, comparing it to the EndoVac (EV) system in terms of smear layer (SL) removal and irrigant extrusion. Three millimetres of the root end of 40 single-canalled lower incisors were resected, and the teeth were decoronated to standardize root canal length. After instrumentation, the specimens were embedded in warm normal saline agar coloured with 1% acid red and randomly divided into four groups: one control group and three experimental groups. Except in the control group, where distilled water was used as the irrigant with a positive pressure irrigation needle, the canals were irrigated with 6% NaOCl and 17% EDTA using the intracanal negative pressure needle (iNP) system, the EV system, or a 27G open-ended needle under positive pressure (PP). NaOCl extrusion was determined by observing a discolouration of the agar surrounding the root. The SL was evaluated by observing scanning electron microscope images based on a four-level scoring system. Two specimens with irrigant extrusion were observed in the iNP group, which was significantly different (logistic regression, p < 0.05). No significant differences (p > 0.05) were found among the experimental groups in terms of SL removal, but all were significantly different from the control group. Irrigation with the iNP could be a viable alternative to EV as an apical negative pressure irrigation technique, especially when treating immature teeth. The use of ANP for canal cleanliness is recommended when treating immature teeth, where periapical tissues should be preserved and stimulated. The iNP system might have the potential to avoid irrigant extrusion while cleaning the canal to the apical end.

  18. Motion-Based Piloted Simulation Evaluation of a Control Allocation Technique to Recover from Pilot Induced Oscillations (United States)

    Craun, Robert W.; Acosta, Diana M.; Beard, Steven D.; Leonard, Michael W.; Hardy, Gordon H.; Weinstein, Michael; Yildiz, Yildiray


    This paper describes the maturation of a control allocation technique designed to assist pilots in the recovery from pilot induced oscillations (PIOs). The Control Allocation technique to recover from Pilot Induced Oscillations (CAPIO) is designed to enable next generation high efficiency aircraft designs. Energy efficient next generation aircraft require feedback control strategies that will enable lowering the actuator rate limit requirements for optimal airframe design. One of the common issues flying with actuator rate limits is PIOs caused by the phase lag between the pilot inputs and control surface response. CAPIO utilizes real-time optimization for control allocation to eliminate phase lag in the system caused by control surface rate limiting. System impacts of the control allocator were assessed through a piloted simulation evaluation of a non-linear aircraft simulation in the NASA Ames Vertical Motion Simulator. Results indicate that CAPIO helps reduce oscillatory behavior, including the severity and duration of PIOs, introduced by control surface rate limiting.
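    The phase-lag mechanism that CAPIO targets is easy to demonstrate, though the sketch below is not CAPIO itself (which uses real-time optimization for control allocation): a rate-limited actuator tracking a sinusoidal pilot input lags and attenuates the command once the commanded slew exceeds the limit, and that lag is what drives PIOs. The rate limits, frequency, and first-order actuator model are illustrative assumptions.

```python
import math

def rate_limited_track(cmd, rate_limit, dt):
    """Illustrative actuator model: the surface follows the command, but its
    slew rate is clamped, which introduces phase lag and attenuation."""
    out, y = [], 0.0
    for c in cmd:
        step = max(-rate_limit * dt, min(rate_limit * dt, c - y))
        y += step
        out.append(y)
    return out

dt = 0.01
t = [i * dt for i in range(1000)]
cmd = [math.sin(2 * math.pi * 1.0 * ti) for ti in t]   # 1 Hz pilot input
slow = rate_limited_track(cmd, rate_limit=2.0, dt=dt)  # limit below peak slew
fast = rate_limited_track(cmd, rate_limit=50.0, dt=dt) # ample rate margin
```

The peak commanded slew of a unit 1 Hz sine is 2*pi per second, so the 2.0/s actuator saturates into a lagged triangle-like response while the 50/s actuator tracks almost exactly.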

  19. Who is that masked educator? Deconstructing the teaching and learning processes of an innovative humanistic simulation technique. (United States)

    McAllister, Margaret; Searl, Kerry Reid; Davis, Susan


    Simulation learning in nursing has long made use of mannequins, standardized actors and role play to allow students opportunity to practice technical body-care skills and interventions. Even though numerous strategies have been developed to mimic or amplify clinical situations, a common problem that is difficult to overcome in even the most well-executed simulation experiences, is that students may realize the setting is artificial and fail to fully engage, remember or apply the learning. Another problem is that students may learn technical competence but remain uncertain about communicating with the person. Since communication capabilities are imperative in human service work, simulation learning that only achieves technical competence in students is not fully effective for the needs of nursing education. Furthermore, while simulation learning is a burgeoning space for innovative practices, it has been criticized for the absence of a basis in theory. It is within this context that an innovative simulation learning experience named "Mask-Ed (KRS simulation)", has been deconstructed and the active learning components examined. Establishing a theoretical basis for creative teaching and learning practices provides an understanding of how, why and when simulation learning has been effective and it may help to distinguish aspects of the experience that could be improved. Three conceptual theoretical fields help explain the power of this simulation technique: Vygotskian sociocultural learning theory, applied theatre and embodiment. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. On the errors of spectral shallow-water limited-area model simulations using an extension technique

    Energy Technology Data Exchange (ETDEWEB)

    Simmel, M.; Harlander, U. [Leipzig Univ. (Germany). Inst. fuer Meteorologie (LIM)


    Although the spectral technique is frequently used for the horizontal discretization in global atmospheric models, it is not common in limited-area models (LAMs) because of the nonperiodic boundary conditions. We apply the Haugen-Machenhauer extension technique to a regional three-layer shallow-water model based on double Fourier series. The method extends the time-dependent boundary fields into a zone outside the integration area in such a way that periodic fields are obtained. The boundary fields necessary for the regional model simulations are calculated in advance by a global simulation. In contrast to other studies, we use exactly the same numerical model for the global and the regional simulations; the only difference between them is the model domain. Therefore, a relatively objective measure of the errors associated with the extension technique can be obtained. First, we compare an analytic, stationary, nonlinear and nonperiodic solution of the governing model equations with the spectral LAM solution. Second, we compare the time evolution of pressure and flow structures during a westerly flow across an asymmetric large-scale topography in the global and regional model domains. Both cases show good agreement between the regional and the global solutions. The rms errors amount to about 2 m for the layer heights and 0.2 m/s for the velocity components in the mountain flow case after a 48 h integration period. Finally, we repeat this simulation with models based on 2nd- and 4th-order finite differences, respectively, and compare the errors of the spectral model version with those of the grid-point versions. We demonstrate that the high accuracy of global spectral methods can also be realized in the regional model by using the Haugen-Machenhauer extension technique. (orig.) 21 refs.
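
    The core idea of the extension technique (make a nonperiodic boundary field periodic by appending a smooth blend zone, so that Fourier series apply) can be sketched in a few lines. The snippet below is a minimal one-dimensional illustration with an invented cosine blend, not the actual Haugen-Machenhauer scheme:

    ```python
    import numpy as np

    def periodic_extension(f, n_ext=32):
        """Append a cosine-blended zone that carries the field from its last
        value back to its first, making the result (nearly) periodic."""
        s = 0.5 * (1.0 - np.cos(np.pi * np.arange(1, n_ext + 1) / (n_ext + 1)))
        ext = (1.0 - s) * f[-1] + s * f[0]   # smooth ramp f[-1] -> f[0]
        return np.concatenate([f, ext])

    x = np.linspace(0.0, 1.0, 200)
    f = 2.0 + 0.5 * x                # nonperiodic: f(0) != f(1)
    g = periodic_extension(f)
    # The extended field wraps around smoothly: its end meets its start,
    # so a double Fourier representation no longer sees a boundary jump.
    print(abs(f[-1] - f[0]), abs(g[-1] - g[0]))
    ```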


    The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...

  2. Innovations in surgery simulation: a review of past, current and future techniques (United States)

    Burtt, Karen; Solorzano, Carlos A.; Carey, Joseph N.


    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon’s skill set, decrease hospital costs, and improve patient outcomes. PMID:28090509

  3. Photographic Film Image Enhancement (United States)


    A series of experiments were undertaken to assess the feasibility of defogging color film by the techniques of Optical Spatial Filtering. A coherent optical processor was built using red, blue, and green laser light input and specially designed Fouri...

  4. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.


    The distribution of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem; for instance, no analytical closed-form expression for the Log-normal sum distribution exists, and finding one remains an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive, especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution by twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
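
    The crude MC baseline, and the general shape of an IS estimator with likelihood-ratio reweighting, can be sketched as follows. Note that this toy uses a simple mean-shift change of measure rather than the paper's hazard-rate twisting, and the parameters (n, gamma, theta) are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, gamma, N = 5, 10.0, 200_000        # sum of 5 iid Lognormal(0,1); threshold

    # Crude Monte Carlo estimate of the CCDF P(S > gamma).
    z = rng.standard_normal((N, n))
    p_mc = np.mean(np.exp(z).sum(axis=1) > gamma)

    # Simple importance sampler: shift each underlying normal mean by theta
    # so {S > gamma} occurs more often, then reweight by the likelihood ratio
    # prod_i N(z_i; 0, 1) / N(z_i; theta, 1) = exp(-theta * sum(z) + n*theta^2/2).
    theta = 0.5
    z = rng.standard_normal((N, n)) + theta
    lr = np.exp(-theta * z.sum(axis=1) + n * theta**2 / 2)
    p_is = np.mean((np.exp(z).sum(axis=1) > gamma) * lr)

    print(p_mc, p_is)   # the two unbiased estimates should agree closely
    ```

    For genuinely rare events (much larger gamma) the crude estimator returns mostly zeros, which is where a well-chosen change of measure such as hazard-rate twisting pays off.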

  5. Forest Vegetation Simulator translocation techniques with the Bureau of Land Management's Forest Vegetation Information system database (United States)

    Timothy A. Bottomley


    The BLM uses a database, called the Forest Vegetation Information System (FORVIS), to store, retrieve, and analyze forest resource information on a majority of their forested lands. FORVIS also has the capability of easily transferring appropriate data electronically into Forest Vegetation Simulator (FVS) for simulation runs. Only minor additional data inputs or...

  6. Adobe Photoshop Elements 11 for photographers

    CERN Document Server

    Andrews, Philip


    To coincide with some of the biggest changes in Photoshop Elements for years, Philip Andrews completely revises his bestselling title to include all the new features of this release. See how the new interface works alongside new tools, techniques and workflows to make editing, enhancing and sharing your pictures easier than ever. And as always, he introduces the changed and improved features with colorful illustrations and the clear step-by-step instruction that has made his books the go-to titles for photographers the world over. In this edition Andrews highlights followi...

  7. Adobe Photoshop CS6 for photographers

    CERN Document Server

    Evening, Martin


    Renowned Photographer and Photoshop hall-of-famer, Martin Evening returns with his comprehensive guide to Photoshop. This acclaimed work covers everything from the core aspects of working in Photoshop to advanced techniques for refined workflows and professional results. Using concise advice, clear instruction and real world examples, this essential guide will give you the skills, regardless of your experience, to create professional quality results. A robust accompanying website features sample images, tutorial videos, bonus chapters and a plethora of extra resources. Quite simply, this is

  8. Adobe Photoshop CS5 for photographers

    CERN Document Server

    Evening, Martin


    With the new edition of this proven bestseller, Photoshop users can master the power of Photoshop CS5 with internationally renowned photographer and Photoshop hall-of-famer Martin Evening by their side.  In this acclaimed reference work, Martin covers everything from the core aspects of working in Photoshop to advanced techniques for professional results. Subjects covered include organizing a digital workflow, improving creativity, output, automating Photoshop, and using Camera RAW. The style of the book is extremely clear, with real examples, diagrams, illustrations, and step-by-step ex

  9. Biases and systematics in the observational derivation of galaxy properties: comparing different techniques on synthetic observations of simulated galaxies (United States)

    Guidi, Giovanni; Scannapieco, Cecilia; Walcher, C. Jakob


    We study the sources of biases and systematics in the derivation of galaxy properties from observational studies, focusing on stellar masses, star formation rates, gas and stellar metallicities, stellar ages, magnitudes and colours. We use hydrodynamical cosmological simulations of galaxy formation, for which the real quantities are known, and apply observational techniques to derive the observables. We also analyse biases that are relevant for a proper comparison between simulations and observations. For our study, we post-process the simulation outputs to calculate the galaxies' spectral energy distributions (SEDs) using stellar population synthesis models, and also generate the fully consistent far-UV-submillimetre wavelength SEDs with the radiative transfer code SUNRISE. We compared the direct results of the simulations with the observationally derived quantities obtained in various ways, and found that systematic differences appear in all studied galaxy properties, caused by: (1) purely observational biases; (2) the use of mass-weighted and luminosity-weighted quantities, with preferential sampling of more massive and luminous regions; (3) the different ways of constructing the template of models when a fit to the spectra is performed; and (4) variations due to different calibrations, most notably for gas metallicities and star formation rates. Our results show that large differences can appear depending on the technique used to derive galaxy properties. Understanding these differences is of primary importance both for simulators, to allow a better judgement of similarities and differences with observations, and for observers, to allow a proper interpretation of the data.

  10. 36 CFR 702.4 - Photographs. (United States)


    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Photographs. 702.4 Section... Photographs. (a) The policy set out herein applies to all individuals who are photographing Library of... Communications, or his/her designee, is authorized to grant or deny permission, in writing, to photograph the...

  11. Experiences in teaching of modeling and simulation with emphasize on equation-based and acausal modeling techniques. (United States)

    Kulhánek, Tomáš; Ježek, Filip; Mateják, Marek; Šilar, Jan; Kofránek, Jiří


    This work introduces experiences from teaching modeling and simulation to graduate students in the field of biomedical engineering. We emphasize the acausal, object-oriented modeling technique, and we have moved from teaching the block-oriented tool MATLAB Simulink to the acausal, object-oriented Modelica language, which can express the structure of the system rather than a process of computation. However, a block-oriented approach is also possible in Modelica, and students have a tendency to express the process of computation. Using exemplar acausal domains and approaches allows students to understand the modeled problems much more deeply. The causality of the computation is derived automatically by the simulation tool.
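
    The contrast between block-oriented and acausal modeling can be illustrated outside Modelica. In the sketch below (Python, with an invented Ohm's-law example), the causal version fixes the computation order by hand, while the acausal version merely states the equations and leaves a solver to derive how the unknowns are computed:

    ```python
    import numpy as np

    # Block-oriented (causal) style: the modeller fixes the computation order.
    V, R = 10.0, 5.0
    i_causal = V / R          # explicit assignment: i is computed from V and R

    # Acausal style: state the equations; the tool derives the causality.
    # Unknowns x = [V, i]; equations: V - 10 = 0 and V - i*R = 0.
    A = np.array([[1.0,  0.0],
                  [1.0, -R  ]])
    b = np.array([10.0, 0.0])
    V_sol, i_sol = np.linalg.solve(A, b)

    print(i_causal, i_sol)    # both yield the same current, 2.0 A
    ```

    In a real acausal tool the "solve" step is symbolic sorting and tearing of a much larger differential-algebraic system, but the division of labour is the same: the modeller writes relations, the tool decides the computation.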

  12. Using a prostate exam simulator to decipher palpation techniques that facilitate the detection of abnormalities near clinical limits. (United States)

    Wang, Ninghuan; Gerling, Gregory J; Krupski, Tracey L; Childress, Reba Moyer; Martin, Marcus L


    Prostate carcinoma (and other prostate irregularities and abnormalities) is detected in part via the digital rectal examination. Training clinicians to use particular palpation techniques may be one way to improve the rates of detection. In an experiment of 34 participants with clinical backgrounds, we used a custom-built simulator to determine whether certain finger palpation techniques improved one's ability to detect abnormalities smaller in size and dispersed as multiples over a volume. The intent was to test abnormality cases of clinical relevance near the limits of size perceptibility (ie, 5-mm diameter). The simulator can present abnormalities in various configurations and record finger movement. To characterize finger movement, four palpation techniques were quantitatively defined (global finger movement, local finger movement, average intentional finger pressure, and dominant intentional finger frequency) to represent the qualitative definitions of other researchers. Participants who used more thorough patterns of global finger movement (V and L) ensured that the entire prostate was searched and detected more abnormalities. A higher magnitude of finger pressure was associated with the detection of smaller abnormalities. The local finger movement of firm pressure with varying intensities was most indicative of success and was required to identify the smallest (5-mm diameter) abnormality. When participants used firm pressure with varying intensities, their dominant intentional finger frequency was about 6 Hz. The use of certain palpation techniques does enable the detection of smaller and more numerous abnormalities, and we seek to abstract these techniques into a systematic protocol for use in the clinic.

  13. Photographic observation of a natural fifth-order rainbow. (United States)

    Edens, Harald E


    A photograph has been obtained of a natural fifth-order (quinary) rainbow. The photograph was acquired on 8 August 2012 with a digital camera and a polarization filter to maximize contrast of the rainbows with the background. The quinary rainbow, together with its first supernumerary, appears in a contrast-enhanced version of the photograph as broad green and blue-violet color bands within Alexander's dark band between the primary and secondary rainbows. The red band of the quinary rainbow is obscured by the much brighter secondary rainbow. A comparison with a numerical simulation using the Debye series confirms that the color bands of the quinary rainbow appear at the expected location. The numerical simulation produces a good match with the photograph for a droplet radius of 0.46 mm. The green band of the quinary rainbow is even faintly discernible in the unprocessed photograph, suggesting that under exceptional viewing conditions the green band of the quinary rainbow may be observed visually with the aid of a polarization filter.

  14. Wind Energy System Time-domain (WEST) analyzers using hybrid simulation techniques (United States)

    Hoffman, J. A.


    Two stand-alone analyzers constructed for real time simulation of the complex dynamic characteristics of horizontal-axis wind energy systems are described. Mathematical models for an aeroelastic rotor, including nonlinear aerodynamic and elastic loads, are implemented with high speed digital and analog circuitry. Models for elastic supports, a power train, a control system, and a rotor gimbal system are also included. Limited correlation efforts show good comparisons between results produced by the analyzers and results produced by a large digital simulation. The digital simulation results correlate well with test data.

  15. Simulation and experimental tests of a real-time DPWM technique ...

    African Journals Online (AJOL)

    This technique is proposed to overcome the disadvantages of space vector pulse width modulation (SVPWM), mainly its high modulation frequency and increased inverter switching losses. This control strategy is a simple and easy technique that generates the same switching pattern as space vector modulation with less ...

  16. Simulation and experimental tests of a real-time DPWM technique ...

    African Journals Online (AJOL)

    ... a three-phase voltage source inverter (3P-VSI) controlled by the discontinuous pulse width modulation technique called DPWM-3, corresponding to four 30° saturations, are presented. This technique is proposed to remedy the ...

  17. A Monte Carlo simulation technique for low-altitude, wind-shear turbulence (United States)

    Bowles, Roland L.; Laituri, Tony R.; Trevino, George


    A case is made for including anisotropy in a Monte Carlo flight simulation scheme of low-altitude wind-shear turbulence by means of power spectral density. This study attempts to eliminate all flight simulation-induced deficiencies in the basic turbulence model. A full-scale low-altitude wind-shear turbulence simulation scheme is proposed with particular emphasis on low cost and practicality for near-ground flight. The power spectral density statistic is used to highlight the need for realistic estimates of energy transfer associated with low-altitude wind-shear turbulence. The simulation of a particular anisotropic turbulence model is shown to be a relatively simple extension from that of traditional isotropic (Dryden) turbulence.
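
    A common way to realize turbulence from a prescribed power spectral density is spectral synthesis: shape complex Gaussian noise by the PSD and transform to the time domain. The sketch below uses a Dryden-like longitudinal spectrum with invented parameters; it illustrates the PSD-driven approach in general, not the anisotropic model proposed in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, dt = 2**14, 0.05                  # samples, time step [s]
    sigma2, L, V = 1.0, 300.0, 50.0      # variance, length scale [m], airspeed [m/s]

    f = np.fft.rfftfreq(N, dt)           # one-sided frequency axis [Hz]
    w = 2.0 * np.pi * f
    # One-sided Dryden-like longitudinal PSD, expressed per Hz.
    S = 2.0 * np.pi * sigma2 * (2.0 * L / (np.pi * V)) / (1.0 + (L * w / V) ** 2)

    df = f[1] - f[0]
    # Complex Gaussian amplitudes whose expected power follows the PSD.
    amp = np.sqrt(S * df) * (rng.standard_normal(f.size)
                             + 1j * rng.standard_normal(f.size)) / 2.0
    amp[0] = 0.0                         # zero-mean turbulence
    u = np.fft.irfft(amp, n=N) * N       # time-domain turbulence realization

    print(u.var())                       # close to sigma2
    ```

    Anisotropy enters this framework by prescribing different spectra (and cross-spectra) for the three velocity components instead of one isotropic form.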

  18. A Novel Technique for Running the NASA Legacy Code LAPIN Synchronously With Simulations Developed Using Simulink (United States)

    Vrnak, Daniel R.; Stueber, Thomas J.; Le, Dzu K.


    This report presents a method for running a dynamic legacy inlet simulation in concert with another dynamic simulation that uses a graphical interface. The legacy code, NASA's LArge Perturbation INlet (LAPIN) model, was coded in the FORTRAN 77 (The Portland Group, Lake Oswego, OR) programming language to run in a command shell, similar to other applications that used the Microsoft Disk Operating System (MS-DOS) (Microsoft Corporation, Redmond, WA). Simulink (MathWorks, Natick, MA) is a dynamic simulation environment that runs on a modern graphical operating system. The product of this work has both simulations, LAPIN and Simulink, running synchronously on the same computer with periodic data exchanges. Implementing the method described in this paper avoided extensive changes to the legacy code and preserved its basic operating procedure. This paper presents a novel method that promotes inter-task data communication between the synchronously running processes.

  19. Ant colony method to control variance reduction techniques in the Monte Carlo simulation of clinical electron linear accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Pareja, S. [Servicio de Radiofisica Hospitalaria, Hospital Regional Universitario 'Carlos Haya', Avda. Carlos Haya, s/n, E-29010 Malaga (Spain)]; Vilches, M. [Servicio de Fisica y Proteccion Radiologica, Hospital Regional Universitario 'Virgen de las Nieves', Avda. de las Fuerzas Armadas, 2, E-18014 Granada (Spain); Lallena, A.M. [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, E-18071 Granada (Spain)


    The ant colony method is used to control the application of variance reduction techniques in the simulation of clinical electron linear accelerators used in cancer therapy. In particular, splitting and Russian roulette, two standard variance reduction methods, are considered. The approach can be applied to any accelerator in a straightforward way and, in addition, permits investigation of the 'hot' regions of the accelerator, information which is basic to developing a source model for this therapy tool.
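
    Splitting and Russian roulette are easy to state in code; the key property is that both preserve the expected particle weight (unbiasedness), which is what leaves a controller such as the ant colony method free to tune where and how aggressively they are applied. A toy sketch with invented parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def russian_roulette(weights, p_survive):
        """Kill each particle with probability 1 - p; survivors get weight / p."""
        alive = rng.random(weights.size) < p_survive
        return weights[alive] / p_survive

    def split(weights, n_split):
        """Replace each particle by n_split copies carrying weight / n_split."""
        return np.repeat(weights / n_split, n_split)

    w0 = np.ones(100_000)
    w_rr = russian_roulette(w0, 0.4)   # fewer particles, higher weight each
    w_sp = split(w0, 3)                # more particles, lower weight each

    # Both games preserve total weight in expectation (exactly, for splitting).
    print(w_rr.sum() / w0.sum(), w_sp.sum() / w0.sum())
    ```

    In practice roulette is applied in unimportant regions (to stop wasting time on low-weight histories) and splitting in the 'hot' regions that dominate the tally.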

  20. A comparison of landing maneuver piloting technique based on measurements made in an airline training simulator and in actual flight (United States)

    Heffley, R. K.; Schulman, T. M.


    An emphasis is placed on developing a mathematical model in order to identify useful metrics, quantify piloting technique, and define simulator fidelity. On the basis of DC-10 flight measurements recorded for 32 pilots, 13 flight-trained and the remainder simulator-trained, a revised model of the landing flare is hypothesized which accounts for the reduction of sink rate and the preference for touchdown point along the runway. The flare maneuver and touchdown point adjustment can be described by a pitch attitude command pilot guidance law consisting of altitude and vertical velocity feedbacks. In flight, pilots exhibit a significant vertical velocity feedback which is essential for well-controlled sink rate reduction at the desired level of response (bandwidth). In the simulator, however, the vertical velocity feedback appears ineffectual and leads to substantially inferior landing performance.

  1. Impact of Standardized Simulated Patients on First-Year Pharmacy Students' Knowledge Retention of Insulin Injection Technique and Counseling Skills. (United States)

    Bowers, Riley; Tunney, Robert; Kelly, Kim; Mills, Beth; Trotta, Katie; Wheeless, C Neil; Drew, Richard


    Objective. To compare pre- and post-intervention test scores assessing insulin injection technique and counseling skills among P1 students with (intervention) or without (control) simulated patients, and to compare counseling checklist and knowledge retention test scores between groups. Methods. This study utilized cluster randomization. In addition to traditional instruction, the intervention group counseled a simulated patient on the use of insulin using the teach-back method. Test score changes from baseline were analyzed via two-sample t-test. Results. The intervention group exhibited a significantly greater increase in knowledge test scores from baseline compared to the control group. Similar changes were seen in post-instruction counseling checklist scores and knowledge retention test scores from baseline. Conclusion. Simulated patient interactions, when added to traditional coursework within a P1 skills lab, improve student counseling aptitude and knowledge retention scores.

  2. Photographic assessment of tretinoin microsphere gel in moderate acne. (United States)

    Feldman, Steven; Nighland, Marge


    True objective measures for assessing responsiveness to acne treatments are lacking. Photographic documentation can therefore be a valuable adjunct to treatment assessment. Our aim was to photographically document the ability of tretinoin microsphere gel (TMG) 0.1% to improve facial acne. A standardized photographic technique was used to assess the efficacy and tolerability of TMG 0.1% applied once nightly for 12 weeks in an open-label trial involving 30 patients (Caucasian and Hispanic) ages 11 to 40 years with moderately severe facial acne. An assessment of frontal, left-sided, and right-sided color photographs at week 12 indicated that TMG 0.1% reduced total, noninflammatory, and inflammatory acne lesion counts by 47.8%, 53.5%, and 33.2%, respectively, and produced an "excellent/good" investigator's global evaluation (4-point scale) treatment response in 63.6% of patients. Tretinoin microsphere gel 0.1% was well tolerated, and more than 63% of subjects did not exhibit cutaneous adverse effects at week 12. The standardized photographic technique proved to be a useful tool for documenting the efficacy of TMG 0.1% in consistently and significantly improving moderately severe facial acne over a 12-week treatment period.

  3. A Novel Temporal Bone Simulation Model Using 3D Printing Techniques. (United States)

    Mowry, Sarah E; Jammal, Hachem; Myer, Charles; Solares, Clementino Arturo; Weinberger, Paul


    An inexpensive temporal bone model for use in a temporal bone dissection laboratory setting can be made using a commercially available, consumer-grade 3D printer. Several models for a simulated temporal bone have been described, but they rely on commercial-grade printers and materials. The goal of this project was to produce a plastic simulated temporal bone on an inexpensive 3D printer that recreates the visual and haptic experience associated with drilling a human temporal bone. Images from a high-resolution CT of a normal temporal bone were converted into stereolithography files via commercially available software, with image conversion and print settings adjusted to achieve optimal print quality. The temporal bone model was printed using acrylonitrile butadiene styrene (ABS) plastic filament on a MakerBot 2x 3D printer. Simulated temporal bones were drilled by seven expert temporal bone surgeons, who assessed the fidelity of the model as compared with a human cadaveric temporal bone. Using a four-point scale, they rated the simulated bones for haptic experience and recreation of the temporal bone anatomy. The created model was felt to be an accurate representation of a human temporal bone. All raters felt strongly that this would be a good training model for junior residents or for simulating difficult surgical anatomy. Material cost for each model was $1.92. A realistic, inexpensive, and easily reproducible temporal bone model can be created on a consumer-grade desktop 3D printer.

  4. A physically-based approach for lens flare simulation


    Keshmirian, Arash


    In this thesis, we present a physically-based method for the computer graphics simulation of lens flare phenomena in photographic lenses. The proposed method can be used to render lens flares from nearly all types of lenses regardless of optical construction. The method described in this thesis utilizes the photon mapping technique (Jensen, 2001) to simulate the flow of light within the lens, and captures the visual effects of internal reflections and scattering within (and between) the optic...

  5. The Life of Digital Photographs

    DEFF Research Database (Denmark)

    Larsen, Jonas

    The Life of Digital Photographs: The Case of Tourist Photography. PhD, Jonas Larsen, Lecturer in Social and Cultural Geography, ENSPAC, Roskilde University, Denmark. Inspired by ideas that things have lives, the mobilities paradigm's (Hannam et al. 2006) attentiveness to the spatiotemporal (im)mobilities of things, practice approaches to photography and multi-sited ethnography, this talk discusses and empirically tracks the life (the conception, birth, transformative years, ageing and death), travel, detours, makeovers and destinations of (analogue and digital) photographs in our present network societies. The talk 'tracks' photographs' spatialities and temporalities, their physical and digital materialities and (im)mobilities, and their 'placing' within and beyond 'networked households'. This is done in a historical perspective, seeing the new in the light of the old, so both breaks and coexistences between older...

  6. A novel 3D modelling and simulation technique in thermotherapy predictive analysis on biological tissue


    Fanjul Vélez, Félix; Arce Diego, José Luis; Romanov, Oleg G.; Tolstik, Alexei L.


    Optical techniques applied to biological tissue allow the development of new tools in medical praxis, either in tissue characterization or treatment. Examples of the latter are Photodynamic Therapy (PDT) or Low Intensity Laser Treatment (LILT), and also a promising technique called thermotherapy, that tries to control temperature increase in a pathological tissue in order to reduce or even eliminate pathological effects. The application of thermotherapy requires a previous analysis in order t...

  7. Preserving local writers, genealogy, photographs, newspapers, and related materials

    CERN Document Server

    Smallwood, Carol


    Preserving Local Writers, Genealogy, Photographs, Newspapers, and Related Materials draws on the practical knowledge of archivists, preservationists, librarians, and others who share the goal of making local history accessible to future generations. Anyone who plans to start a local history project or preserve important historical materials will find plenty of tips, techniques, sample documents, project ideas, and inspiration in its pages.

  8. A comparative analysis of preprocessing techniques of cardiac event series for the study of heart rhythm variability using simulated signals

    Directory of Open Access Journals (Sweden)

    Guimarães H.N.


    In the present study, using noise-free simulated signals, we performed a comparative examination of several preprocessing techniques that are used to transform the cardiac event series into a regularly sampled time series appropriate for spectral analysis of heart rhythm variability (HRV). First, a group of noise-free simulated point event series, representing time series of heartbeats, was generated by an integral pulse frequency modulation model. In order to evaluate the performance of the preprocessing methods, the differences between the spectra of the preprocessed simulated signals and the true spectrum (the spectrum of the model's input modulating signals) were surveyed by visual analysis and by contrasting merit indices. It is desired that estimated spectra match the true spectrum as closely as possible, showing a minimum of harmonic components and other artifacts. The merit indices proposed to quantify these mismatches were the leakage rate, defined as a measure of leakage components (located outside some narrow windows centered at the frequencies of the model's input modulating signals) with respect to the whole spectral components, and the numbers of leakage components with amplitudes greater than 1%, 5% and 10% of the total spectral components. Our data, obtained from a noise-free simulation, indicate that the utilization of heart rate values instead of heart period values in the derivation of signals representative of heart rhythm results in more accurate spectra. Furthermore, our data support the efficiency of the widely used preprocessing technique based on the convolution of inverse interval function values with a rectangular window, and suggest the preprocessing technique based on a cubic polynomial interpolation of inverse interval function values and succeeding spectral analysis as another efficient and fast method for the analysis of HRV signals.
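
    The preprocessing pipeline discussed above (event series → inverse interval function → regularly sampled series → spectrum) can be sketched with a toy IPFM-like beat generator; simple linear interpolation stands in for the paper's rectangular-window and cubic-polynomial methods, and all parameters are invented:

    ```python
    import numpy as np

    # Simulate beat times with the heart period modulated at 0.25 Hz.
    f_mod, t, beats = 0.25, 0.0, []
    while t < 300.0:
        t += 1.0 + 0.1 * np.sin(2.0 * np.pi * f_mod * t)   # RR interval [s]
        beats.append(t)
    beats = np.asarray(beats)

    # Inverse interval function: instantaneous heart rate at each beat ...
    hr = 1.0 / np.diff(beats)
    # ... resampled onto a regular 4 Hz grid so FFT-based analysis applies.
    fs = 4.0
    grid = np.arange(beats[1], beats[-1], 1.0 / fs)
    hr_even = np.interp(grid, beats[1:], hr)

    # Spectral analysis of the regular series recovers the modulation frequency.
    spec = np.abs(np.fft.rfft(hr_even - hr_even.mean()))
    freqs = np.fft.rfftfreq(hr_even.size, 1.0 / fs)
    print(freqs[spec.argmax()])        # near f_mod = 0.25 Hz
    ```

    Using heart rate (1/interval) rather than heart period here mirrors the paper's finding that rate-based derivations give more accurate spectra.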

  9. Hands-on Simulation versus Traditional Video-learning in Teaching Microsurgery Technique (United States)

    SAKAMOTO, Yusuke; OKAMOTO, Sho; SHIMIZU, Kenzo; ARAKI, Yoshio; HIRAKAWA, Akihiro; WAKABAYASHI, Toshihiko


    Bench model hands-on learning may be more effective than traditional didactic practice in some surgical fields. However, this has not been reported for microsurgery. Our study objective was to demonstrate the efficacy of bench model hands-on learning in acquiring microsuturing skills. The secondary objective was to evaluate the aptitude for microsurgery based on personality assessment. Eighty-six medical students comprising 62 men and 24 women were randomly assigned to either 20 min of hands-on learning with a bench model simulator or 20 min of video-learning using an instructional video. They then practiced microsuturing for 40 min. Each student then made three knots, and the time to complete the task was recorded. The final products were scored by two independent graders in a blind fashion. All participants then took a personality test, and their microsuture test scores and the time to complete the task were compared. The time to complete the task was significantly shorter in the simulator group than in the video-learning group. The final product scores tended to be higher with simulator-learning than with video-learning, but the difference was not significant. Students with high “extraversion” scores on the personality inventory took a shorter time to complete the suturing test. Simulator-learning was more effective for microsurgery training than video instruction, especially in understanding the procedure. There was a weak association between personality traits and microsurgery skill. PMID:28381653

  10. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique. (United States)

    Higaki, Toru; Tatsugami, Fuminari; Fujioka, Chikako; Sakane, Hiroaki; Nakamura, Yuko; Baba, Yasutaka; Iida, Makoto; Awai, Kazuo


    This article describes a quantitative evaluation of the visualization of small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1-6 mm made by a 3D printer were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.


  12. A Comparative Study of k-Nearest Neighbour Techniques in Crowd Simulation

    NARCIS (Netherlands)

    Vermeulen, J.; Hillebrand, A.; Geraerts, R.J.


    The k-nearest neighbour (kNN) problem appears in many different fields of computer science, such as computer animation and robotics. In crowd simulation, kNN queries are typically used by a collision-avoidance method to prevent unnecessary computations. Many different methods for finding these
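
The kind of kNN query a collision-avoidance method issues can be sketched as follows; this is a hypothetical brute-force illustration (the names are mine, not the authors'), which a real crowd simulator would replace with a k-d tree or uniform-grid index:

```python
import heapq

def k_nearest_neighbours(agents, query, k):
    """Brute-force kNN: return the k agent positions closest to `query`.

    agents: list of (x, y) tuples; query: (x, y); k: neighbour count.
    A spatial index would replace this linear scan in a real simulator,
    but the query semantics are the same.
    """
    qx, qy = query
    # heapq.nsmallest keeps only the k best candidates, avoiding a full sort
    return heapq.nsmallest(
        k, agents, key=lambda p: (p[0] - qx) ** 2 + (p[1] - qy) ** 2
    )

agents = [(0, 0), (3, 4), (1, 1), (10, 10), (-2, 0)]
print(k_nearest_neighbours(agents, (0, 0), 2))  # → [(0, 0), (1, 1)]
```

The returned neighbour set is what a collision-avoidance routine (e.g. a velocity-obstacle method) would then process for each agent.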

  13. A comparison of six metamodeling techniques applied to building performance simulations

    DEFF Research Database (Denmark)

    Østergård, Torben; Jensen, Rasmus Lund; Maagaard, Steffen Enersen


    Highlights: •Linear regression (OLS), support vector regression (SVR), regression splines (MARS). •Random forest (RF), Gaussian processes (GPR), neural network (NN). •Accuracy, time, interpretability, ease-of-use, model selection, and robustness. •13 problems modelled for 9 training set sizes spanning from 32 to 8192 simulations. •Methodology for comparison using exhaustive grid searches and sensitivity analysis.

  14. Refinement of homology-based protein structures by molecular dynamics simulation techniques

    NARCIS (Netherlands)

    Fan, H; Mark, AE

    The use of classical molecular dynamics simulations, performed in explicit water, for the refinement of structural models of proteins generated ab initio or based on homology has been investigated. The study involved a test set of 15 proteins that were previously used by Baker and coworkers to

  15. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique

    Directory of Open Access Journals (Sweden)

    Toru Higaki


    Full Text Available This article describes a quantitative evaluation of visualizing small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm made by a 3D printer were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.

  16. 360-degree videos: a new visualization technique for astrophysical simulations, applied to the Galactic Center (United States)

    Russell, Christopher


    360-degree videos are a new type of movie that renders over all 4π steradians. Video sharing sites such as YouTube now allow this unique content to be shared via virtual reality (VR) goggles, hand-held smartphones/tablets, and computers. Creating 360-degree videos from astrophysical simulations not only provides a new way to view these simulations due to their immersive nature, but also yields engaging content for outreach to the public. We present our 360-degree video of an astrophysical simulation of the Galactic center: a hydrodynamics calculation of the colliding and accreting winds of the 30 Wolf-Rayet stars orbiting within the central parsec. Viewing the movie, which renders column density, from the location of the supermassive black hole gives a unique and immersive perspective of the shocked wind material inspiraling and tidally stretching as it plummets toward the black hole. We also describe how to create such movies, discuss what type of content does and does not look appealing in 360-degree format, and briefly comment on what new science can be extracted from astrophysical simulations using 360-degree videos.
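
The core of rendering a 360-degree frame is mapping each view direction onto an equirectangular image. A minimal sketch follows; the axis convention (z up, longitude from atan2) is an assumption, as conventions vary between renderers:

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D view direction to (column, row) in an equirectangular frame.

    A 360-degree video stores the full sphere as a width x height image:
    longitude maps to the column, latitude to the row.
    """
    lon = math.atan2(y, x)                                # [-pi, pi]
    lat = math.asin(z / math.sqrt(x * x + y * y + z * z)) # [-pi/2, pi/2]
    col = (lon + math.pi) / (2 * math.pi) * (width - 1)
    row = (math.pi / 2 - lat) / math.pi * (height - 1)
    return col, row

# The forward direction lands at the centre of the frame:
print(direction_to_equirect(1.0, 0.0, 0.0, 1921, 961))  # → (960.0, 480.0)
```

Rendering the simulation's column density along the ray for each pixel of this grid produces one 360-degree frame.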

  17. Computer technique for simulating the combustion of cellulose and other fuels (United States)

    Andrew M. Stein; Brian W. Bauske


    A computer method has been developed for simulating the combustion of wood and other cellulosic fuels. The products of combustion are used as input for a convection model that simulates real fires. The method allows the chemical process to proceed to equilibrium and then examines the effects of mass addition and repartitioning on the fluid mechanics of the convection...

  18. Synchrotron and Simulation Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    Energy Technology Data Exchange (ETDEWEB)

    Chianelli, R.


    Development of synchrotron techniques for the determination of the structure of disordered, amorphous and surface materials has exploded over the past twenty years due to the increasing availability of high flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists through interaction with effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as "surface compounds." One class of surface compounds comprises materials like MoS{sub 2-x}C{sub x} that are widely used as petroleum catalysts to improve the environmental properties of transportation fuels. These compounds may be viewed as "sulfide supported carbides" in their catalytically active states. The second class of "surface compounds" is the "Maya Blue" pigments that are based on technology created by the ancient Maya. These compounds are organic/inorganic "surface complexes" consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described in this report.

  19. Discrete Particle Simulation Techniques for the Analysis of Colliding and Flowing Particulate Media (United States)

    Mukherjee, Debanjan

    Flowing particulate media are ubiquitous in a wide spectrum of applications that include transport systems, fluidized beds, manufacturing and materials processing technologies, energy conversion and propulsion technologies, sprays, jets, slurry flows, and biological flows. The discrete nature of the media, along with their underlying coupled multi-physical interactions can lead to a variety of interesting phenomena, many of which are unique to such media - for example, turbulent diffusion and preferential concentration in particle laden flows, and soliton like excitation patterns in a vibrated pile of granular material. This dissertation explores the utility of numerical simulations based on the discrete element method and collision driven particle dynamics methods for analyzing flowing particulate media. Such methods are well-suited to handle phenomena involving particulate, granular, and discontinuous materials, and often provide abilities to tackle complicated physical phenomena, for which pursuing continuum based approaches might be difficult or sometimes insufficient. A detailed discussion on hierarchically representing coupled, multi-physical phenomena through simple models for underlying physical interactions is presented. Appropriate physical models for mechanical contact, conductive and convective heat exchange, fluid-particle interactions, adhesive and near-field effects, and interaction with applied electromagnetic fields are presented. Algorithmic details on assembling the interaction models into a large-scale simulation framework have been elaborated with illustrations. The assembled frameworks were used to develop a computer simulation library (named 'Software Library for Discrete Element Simulations' (SLIDES) for the sake of reference and continued future development efforts) and aspects of the architecture and development of this library have also been addressed. This is an object-oriented discrete particle simulation library developed in Fortran

  20. Adobe Photoshop CS5 for Photographers The Ultimate Workshop

    CERN Document Server

    Evening, Martin


    If you already have a good knowledge of Adobe Photoshop and are looking to advance your skills, Adobe Photoshop CS5 for Photographers: The Ultimate Workshop is the book you've been waiting for.  Renowned photographers Martin Evening and Jeff Schewe impart their Photoshop tips and workflow, showing you how to use a vast array of rarely seen advanced Photoshop techniques.  Whether the subject is serious retouching work, weird and wonderful compositions, or planning a shoot before you've even picked up a camera, you can be sure that the advice is based on years of practical experience.

  1. Comparison of Apical Sealing Ability of Lateral Condensation Technique in Room and Body-Simulated Temperatures (An in vitro study). (United States)

    Sobhnamayan, F; Sahebi, S; Moazami, F; Borhanhaghighi, M


    Studies report that nearly 60% of endodontic failures are attributable to inadequate obturation of the root canal system. Thus, complete obturation of the root canal system and a proper apical seal are essential to the long-term success of root canal treatment. This study aimed to compare the apical seal achieved with the lateral condensation technique at room temperature and at simulated body temperature. In this experimental study, 70 extracted, single-rooted human premolar teeth were instrumented and divided into four groups. All canals were obturated by the lateral condensation technique except the teeth in the positive control group. Groups 1 and 2, each with 30 teeth, were obturated at room and simulated intracanal temperature, respectively. The other two groups were the positive and negative control groups, each with 5 teeth. All groups except the negative control were covered with two layers of nail polish. Linear dye penetration was then evaluated with a stereomicroscope. Data were analyzed with the Student's t-test, and the Kolmogorov-Smirnov goodness-of-fit test was used to verify the data distribution. Results showed that dye penetration in group 1 (obturation at room temperature) was 0.6 mm more than in group 2 (obturation at simulated body temperature), although the difference was not statistically significant (p > 0.05). Under the conditions of this in vitro study, apical sealing ability was better at simulated body temperature than at room temperature, although the difference was not statistically significant.

  2. Crash Risk Prediction Modeling Based on the Traffic Conflict Technique and a Microscopic Simulation for Freeway Interchange Merging Areas

    Directory of Open Access Journals (Sweden)

    Shen Li


    Full Text Available This paper evaluates the traffic safety of freeway interchange merging areas based on the traffic conflict technique. The hourly composite risk index (HCRI) was defined. Using unmanned aerial vehicle (UAV) photography and video processing techniques, the conflict type and severity were judged. Time to collision (TTC) was adopted as the traffic conflict evaluation index, and the TTC severity threshold was then determined. By quantifying the weight of each conflict according to the direct losses of freeway traffic accidents of different severities, the weighting used to calculate the HCRI can be obtained. The relevant parameters of the microscopic simulator VISSIM were calibrated against travel times from the field data. Variables were placed into orthogonal tables at different levels. On the basis of this table, the trajectory file for every traffic condition was simulated and then submitted to a surrogate safety assessment model (SSAM), which identifies the number of hourly traffic conflicts in the merging area, the statistic underlying the HCRI. Moreover, a multivariate linear regression model was presented and validated to study the relationship between the HCRI and the influencing variables. A comparison between the HCRI model and the hourly conflict ratio (HCR, without weighting) shows that the fit of the HCRI model was clearly better than that of the HCR. These results can serve as a reference for the design and implementation of operational plans.
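
The TTC surrogate measure used above is simply the distance gap divided by the closing speed; a minimal car-following sketch (the threshold value is illustrative, not the one calibrated in the paper):

```python
def time_to_collision(gap_m, v_follow_ms, v_lead_ms):
    """Car-following time-to-collision in seconds: gap / closing speed.

    Returns float('inf') when the follower is not closing in (no conflict).
    In conflict-technique studies, a conflict is flagged when TTC drops
    below a severity threshold.
    """
    closing = v_follow_ms - v_lead_ms
    return gap_m / closing if closing > 0 else float('inf')

# 20 m gap, follower at 25 m/s, leader at 20 m/s:
print(time_to_collision(20.0, 25.0, 20.0))                 # → 4.0
threshold_s = 3.0  # hypothetical TTC severity threshold
print(time_to_collision(20.0, 25.0, 20.0) < threshold_s)   # → False
```

An hourly conflict count is then obtained by flagging every vehicle pair whose minimum TTC falls below the threshold, and a weighted sum over severities gives an HCRI-style index.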

  3. Simulating TRSs by minimal TRSs : a simple, efficient, and correct compilation technique

    NARCIS (Netherlands)

    J.F.T. Kamperman; H.R. Walters (Pum)


    A simple, efficient, and correct compilation technique for left-linear Term Rewriting Systems (TRSs) is presented. TRSs are compiled into Minimal Term Rewriting Systems (MTRSs), a subclass of TRSs, presented in [KW95d]. In MTRSs, the rules have such a simple form that they can be seen as

  4. Simulations

    CERN Document Server

    Ngada, Narcisse


    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  5. A new preconditioning technique for implicitly coupled multidomain simulations with applications to hemodynamics (United States)

    Esmaily-Moghadam, Mahdi; Bazilevs, Yuri; Marsden, Alison L.


    In cardiovascular blood flow simulations a large portion of computational resources is dedicated to solve the linear system of equations. Boundary conditions in these applications are critical for obtaining accurate and physiologically realistic solutions, and pose numerical challenges due to the coupling between flow and pressure. Using an implicit time integration setting can lead to an ill-conditioned tangent matrix that causes deterioration in performance of traditional iterative linear equation solvers (LS). In this paper we present a novel and efficient preconditioner (PC) for this class of problems that exploits the strong coupling between the flow and pressure. We implement this PC in a LS algorithm designed for solving systems of equations governing incompressible flows. Excellent efficiency and stability properties of the proposed method are illustrated on a set of clinically relevant hemodynamics simulations.
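
The role a preconditioner plays in such an iterative solver can be illustrated with a textbook Jacobi (diagonal) preconditioned conjugate gradient; this generic sketch is not the flow-pressure preconditioner proposed in the paper, only an example of where the preconditioner solve enters the iteration:

```python
def pcg(A, b, tol=1e-10, max_iter=200):
    """Conjugate gradient with a Jacobi (diagonal) preconditioner.

    A: symmetric positive-definite matrix (list of rows); b: right-hand side.
    The preconditioner solve here is division by diag(A); a specialised
    preconditioner would replace exactly that step.
    """
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = b[:]                                   # residual b - A x, with x = 0
    z = [r[i] / A[i][i] for i in range(n)]     # apply M^{-1} = diag(A)^{-1}
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(pcg(A, b))  # ≈ [1/11, 7/11]
```

A good preconditioner clusters the eigenvalues of the preconditioned operator, which is what restores the solver's convergence rate on ill-conditioned tangent matrices.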


    Directory of Open Access Journals (Sweden)

    A. V. Bykov


    Full Text Available Methods for fabricating tissue-simulating phantoms and capillaries from PVC-plastisol and silicone, for use as test objects in optical coherence tomography (OCT) and for emulating skin and capillaries, are considered. Comparative characteristics of these materials and recommendations for their application are given. Examples of phantom visualization by optical coherence tomography are given. The possibility of using information from B-scans for refractive index evaluation is shown.
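
The refractive-index evaluation mentioned at the end rests on OCT measuring optical rather than physical path length; a one-line sketch with hypothetical numbers:

```python
def refractive_index(optical_thickness_um, physical_thickness_um):
    """Estimate a phantom layer's group refractive index from OCT data.

    OCT B-scans measure optical path length, so a layer of known physical
    thickness appears stretched by a factor equal to its refractive index.
    """
    return optical_thickness_um / physical_thickness_um

# Hypothetical values: a 500 um layer appearing 725 um thick in the B-scan
print(refractive_index(725.0, 500.0))  # → 1.45
```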

  7. A simulation benchmark to evaluate the performance of advanced control techniques in biological wastewater treatment plants


    Sotomayor, O.A.Z.; Park, S.W.; Garcia, C.


    Wastewater treatment plants (WWTP) are complex systems that incorporate a large number of biological, physicochemical and biochemical processes. They are large and nonlinear systems subject to great disturbances in incoming loads. The primary goal of a WWTP is to reduce pollutants and the second goal is disturbance rejection, in order to obtain good effluent quality. Modeling and computer simulations are key tools in the achievement of these two goals. They are essential to describe, predict ...

  8. Simulationsverfahren fuer Brown-Resnick-Prozesse (Simulation Techniques for Brown-Resnick Processes)


    Oesting, Marco


    Generalized Brown-Resnick processes form a flexible class of stationary max-stable processes based on Gaussian random fields. With regard to applications fast and accurate simulation of these processes is an important issue. In fact, Brown-Resnick processes that are generated by a dissipative flow do not allow for good finite approximations using the definition of the processes. On large intervals we get either huge approximation errors or very long operating times. Looking for solutions of t...

  9. Real-Time Simulation Technique of a Microgrid Model for DER Penetration


    Konstantina Mentesidi; Evangelos Rikos; Vasilis Kleftakis


    Comprehensive analysis of Distributed Energy Resources (DER) integration requires tools that provide computational power and flexibility. In this context, throughout this paper PHIL simulations are performed to emulate the energy management system of a real microgrid including a diesel synchronous machine and inverter-based sources. Moreover, conventional frequency and voltage droops were incorporated into the respective inverters. The results were verified at the real microgrid installation ...

  10. Simulation of stray grain formation in Ni-base single crystal turbine blades fabricated by HRS and LMC techniques

    Directory of Open Access Journals (Sweden)

    Ya-feng Li


    Full Text Available Simulation models of the thermal and macrostructural evolution during directional solidification of Ni-base single crystal (SX) turbine blades under high rate solidification (HRS) and liquid metal cooling (LMC) have been constructed using ProCAST software, coupled with a 3D Cellular Automaton Finite Element (CAFE) model. The models were used to investigate the tendencies of stray grain (SG) formation in the platform region of turbine blades fabricated by the HRS and LMC techniques. The results reveal that the LMC technique can prohibit SG formation by smoothing the concave isotherm and in turn alleviating the undercooling at the platform ends, allowing the dendrites to fill the undercooled zone before SG nucleation. The simulation results agreed well with the experimental results, indicating that these models could be used to analyze macrostructural evolution or to optimize process parameters to suppress SG formation. Using these models, the critical withdrawal rates for casting SX turbine blades without SG formation were determined to be around 75 μm·s-1 for HRS and 100 μm·s-1 for LMC, suggesting that LMC can be used as an efficient technique for fabricating SX turbine blades without any SG defect formation.

  11. The FADE mass-stat: A technique for inserting or deleting particles in molecular dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Borg, Matthew K., E-mail: [Department of Mechanical and Aerospace Engineering, University of Strathclyde, Glasgow G1 1XJ (United Kingdom); Lockerby, Duncan A., E-mail: [School of Engineering, University of Warwick, Coventry CV4 7AL (United Kingdom); Reese, Jason M., E-mail: [School of Engineering, University of Edinburgh, Edinburgh EH9 3JL (United Kingdom)


    The emergence of new applications of molecular dynamics (MD) simulation calls for the development of mass-statting procedures that insert or delete particles on-the-fly. In this paper we present a new mass-stat which we term FADE, because it gradually “fades-in” (inserts) or “fades-out” (deletes) molecules over a short relaxation period within an MD simulation. FADE applies a time-weighted relaxation to the intermolecular pair forces between the inserting/deleting molecule and any neighbouring molecules. The weighting function we propose in this paper is a piece-wise polynomial that can be described entirely by two parameters: the relaxation time scale and the order of the polynomial. FADE inherently conserves overall system momentum independent of the form of the weighting function. We demonstrate various simulations of insertions of atomic argon, polyatomic TIP4P water, polymer strands, and C{sub 60} Buckminsterfullerene molecules. We propose FADE parameters and a maximum density variation per insertion-instance that restricts spurious potential energy changes entering the system to within desired tolerances. We also demonstrate in this paper that FADE compares very well to an existing insertion algorithm called USHER, in terms of accuracy, insertion rate (in dense fluids), and computational efficiency. The USHER algorithm is applicable to monatomic and water molecules only, but we demonstrate that FADE can be generally applied to various forms and sizes of molecules, such as polymeric molecules of long aspect ratio, and spherical carbon fullerenes with hollow interiors.
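
One plausible form of such a two-parameter time-weighted relaxation can be sketched as below; this is an assumption matching the description (relaxation time plus polynomial order), not necessarily the exact weighting function used by FADE:

```python
def fade_weight(t, tau, order, inserting=True):
    """Time-weighted relaxation factor for a FADE-style mass-stat.

    A piece-wise polynomial described by two parameters: relaxation time
    `tau` and polynomial `order`. The intermolecular pair force on a
    fading molecule is scaled by this factor, ramping 0 -> 1 on insertion
    (or 1 -> 0 on deletion) over the relaxation period.
    """
    s = min(max(t / tau, 0.0), 1.0)  # progress clamped to [0, 1]
    w = s ** order                   # polynomial ramp
    return w if inserting else 1.0 - w

# Pair force on a molecule halfway through a quadratic fade-in:
full_force = 8.0
print(full_force * fade_weight(0.5, 1.0, 2))  # → 2.0
```

Because every pair force is scaled symmetrically on both molecules of a pair, Newton's third law, and hence overall momentum conservation, is preserved for any choice of weighting function, as the abstract notes.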

  12. Exploitation of Digital Surface Models Generated from WORLDVIEW-2 Data for SAR Simulation Techniques (United States)

    Ilehag, R.; Auer, S.; d'Angelo, P.


    GeoRaySAR, an automated SAR simulator developed at DLR, identifies buildings in high resolution SAR data by utilizing geometric knowledge extracted from digital surface models (DSMs). Hitherto, the simulator has utilized DSMs generated from LiDAR data from airborne sensors with pre-filtered vegetation. Discarding the need for pre-optimized model input, DSMs generated from high resolution optical data (acquired with WorldView-2) are used for the extraction of building-related SAR image parts in this work. An automatic preprocessing of the DSMs has been developed for separating buildings from elevated vegetation (trees, bushes) and reducing the noise level. Based on that, automated simulations are triggered considering the properties of real SAR images. Locations in three cities, Munich, London and Istanbul, were chosen as study areas to determine advantages and limitations related to WorldView-2 DSMs as input for GeoRaySAR. Beyond that, the impact of DSM quality on building extraction is evaluated, as well as the use of a building DSM, i.e. a DSM containing only buildings. The results indicate that building extents can be detected with DSMs from optical satellite data with varying success, dependent on the quality of the DSM as well as on the SAR imaging perspective.

  13. Real-Time Simulation Technique of a Microgrid Model for DER Penetration

    Directory of Open Access Journals (Sweden)

    Konstantina Mentesidi


    Full Text Available Comprehensive analysis of Distributed Energy Resources (DER) integration requires tools that provide computational power and flexibility. In this context, throughout this paper PHIL simulations are performed to emulate the energy management system of a real microgrid including a diesel synchronous machine and inverter-based sources. Moreover, conventional frequency and voltage droops were incorporated into the respective inverters. The results were verified at the real microgrid installation in the Centre for Renewable Energy Sources (CRES) premises. This research work is divided into two steps: (a) real-time simulations in RSCAD/RTDS and Power Hardware-in-the-Loop (PHIL) simulations, in which the diesel generator's active power droop control is evaluated, the battery inverter's droop curves are simulated, and the load sharing for parallel operation of the system's generation units is examined; and (b) microgrid experiments, during which various tests were executed on the diesel generator and the battery inverters in order to examine their dynamic operation within the LV islanded power system.
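
The conventional active power-frequency droop referred to above can be sketched in a few lines; the 50 Hz nominal frequency, 4% droop and power values are illustrative assumptions, not the CRES settings:

```python
def droop_frequency(p_out, p_rated, f_nom=50.0, droop=0.04):
    """Conventional P-f droop: frequency sags linearly with active power.

    `droop` is the per-unit frequency deviation at rated output; units
    with identical droop curves running in parallel naturally share load.
    """
    return f_nom - droop * f_nom * (p_out / p_rated)

# A 100 kW unit delivering 50 kW settles 2% of the droop band below nominal:
print(droop_frequency(50e3, 100e3))  # → 49.0
```

Voltage-reactive power (Q-V) droop takes the same linear form, with voltage magnitude in place of frequency.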


    Directory of Open Access Journals (Sweden)



    Full Text Available Monte Carlo simulation using the Simple Random Sampling (SRS) technique is well known for its ability to handle complex uncertainty problems. However, to produce reasonable results it requires a huge sample size, which makes it computationally expensive, time consuming, and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their results at different sample sizes with the IDEAL (conventional) values. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS for the small signal stability application reproduce the IDEAL values starting from a sample size of 100. This shows that about 100 samples generated using the LHS method are enough to produce reasonable results for practical purposes in small signal stability application. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, compared to the SRS technique, which signifies the robustness of LHS over SRS. A 100-sample LHS run produces the same result as the conventional method with a sample size of 50000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
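
The stratification that gives LHS its variance advantage over SRS can be sketched with a generic implementation (not the authors' code): each dimension is cut into as many equal strata as there are samples, and each stratum receives exactly one sample.

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin Hypercube Sampling on the unit hypercube.

    Exactly one sample falls in each of the n_samples strata of every
    dimension, giving better space coverage (and lower estimator
    variance) than simple random sampling at the same sample size.
    """
    rng = rng or random.Random(0)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # random pairing of strata across dimensions
        for i in range(n_samples):
            # draw a uniform point inside this sample's assigned stratum
            samples[i][d] = (strata[i] + rng.random()) / n_samples
    return samples

pts = latin_hypercube(100, 2)
# every one of the 100 strata per dimension holds exactly one sample:
print(sorted(int(p[0] * 100) for p in pts) == list(range(100)))  # → True
```

Mapping each unit-interval coordinate through the inverse CDF of the corresponding uncertain parameter then yields the input sets fed to the eigenvalue analysis.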

  15. Effect of Aerosol Devices and Administration Techniques on Drug Delivery in a Simulated Spontaneously Breathing Pediatric Tracheostomy Model. (United States)

    Alhamad, Bshayer R; Fink, James B; Harwood, Robert J; Sheard, Meryl M; Ari, Arzu


    This study was conducted to compare the efficiency of jet nebulizers, vibrating mesh nebulizers, and pressurized metered-dose inhalers (pMDI) during assisted and unassisted administration techniques using a simulated spontaneously breathing pediatric model with a tracheostomy tube (TT). An in vitro breathing model consisting of an uncuffed TT (4.5-mm inner diameter) was attached to a collecting filter (Respirgard) connected to a dual-chamber test lung and a ventilator (Hamilton Medical) to simulate the breathing parameters of a 2-y-old child (breathing frequency, 25 breaths/min; tidal volume, 150 mL; inspiratory time, 0.8 s; peak inspiratory flow, 20 L/min). Albuterol sulfate was administered using a jet nebulizer (MicroMist, 2.5 mg/3 mL), a vibrating mesh nebulizer (Aeroneb Solo, 2.5 mg/3 mL), and a pMDI (ProAir HFA, 432 μg). Each device was tested 5 times with an unassisted technique (direct administration of aerosols with simulated spontaneous breathing) and with an assisted technique (using a manual resuscitation bag in conjunction with an aerosol device and synchronized with inspiration). Drug collected on the filter was analyzed by spectrophotometry. With the unassisted technique, the pMDI had the highest inhaled mass percent (IM%, 47.15 ± 7.82%), followed by the vibrating mesh nebulizer (19.77 ± 2.99%) and the jet nebulizer (5.88 ± 0.77%, P = .002). IM was greater with the vibrating mesh nebulizer (0.49 ± 0.07 mg) than with the pMDI (0.20 ± 0.03 mg) and the jet nebulizer (0.15 ± 0.01 mg, P = .007). The trend of lower deposition with the assisted versus unassisted technique was not significant for the jet nebulizer (P = .46), vibrating mesh nebulizer (P = .19), and pMDI (P = .64). In this in vitro pediatric breathing model with a TT, the pMDI delivered the highest IM%, whereas the vibrating mesh nebulizer delivered the highest IM. The jet nebulizer was the least efficient device. Delivery efficiency was similar with unassisted and assisted administration techniques.

  16. Photographic evidence for the third-order rainbow. (United States)

    Grossmann, Michael; Schmidt, Elmar; Haussmann, Alexander


    The first likely photographic observation of the tertiary rainbow caused by sunlight in the open air is reported and analyzed. Whereas primary and secondary rainbows are rather common and easily seen phenomena in atmospheric optics, the tertiary rainbow appears in the sunward side of the sky and is thus largely masked by forward scattered light. Up to now, only a few visual reports and no reliable photographs of the tertiary rainbow are known. Evidence of a third-order rainbow has been obtained by using image processing techniques on a digital photograph that contains no obvious indication of such a rainbow. To rule out any misinterpretation of artifacts, we carefully calibrated the image in order to compare the observed bow's angular position and dispersion with those predicted by theory.

  17. ARIADNE, a Photographic LAr TPC at the CERN Neutrino Platform

    CERN Document Server

    Mavrokoridis, K; Nessi, M; Roberts, A; Smith, N A; Touramanis, C; CERN. Geneva. SPS and PS Experiments Committee; SPSC


    This letter of intent describes a novel and innovative two-phase LAr TPC with photographic capabilities as an attractive alternative readout method to the currently accepted segmented THGEMs which will require many thousands of charge readout channels for kton-scale two-phase TPCs. These colossal LAr TPCs will be used for the future long-baseline-neutrino-oscillation experiments. Optical readout also presents many other clear advantages over current readout techniques such as ease of scalability, upgrade, installation and maintenance, and cost effectiveness. This technology has already been demonstrated at the Liverpool LAr facility with the photographic capturing of cosmic muon tracks and single gammas using a 40-litre prototype. We have now secured ERC funding to develop this further with the ARIADNE programme. ARIADNE will be a 1-ton two-phase LAr TPC utilizing THGEM and EMCCD camera readouts in order to photograph interactions, allowing for track reconstruction and particle identification. We are request...

  18. Photographic image enhancement (United States)

    Hite, Gerald E.


    Deblurring capabilities would significantly improve the scientific return from Space Shuttle crew-acquired images of the Earth and the safety of Space Shuttle missions. Deblurring techniques were developed and demonstrated on two digitized images that were blurred in different ways. The first was blurred by a Gaussian blurring function analogous to that caused by atmospheric turbulence, while the second was blurred by improper focusing. It was demonstrated, in both cases, that the nature of the blurring (Gaussian and Airy) and the appropriate parameters could be obtained from the Fourier transforms of their images. The difficulties posed by the presence of noise necessitated special consideration. It was demonstrated that a modified Wiener frequency filter, judiciously constructed to avoid overemphasis of frequency regions dominated by noise, resulted in substantially improved images. Several important areas of future research were identified. Two areas of particular promise are the extraction of blurring information directly from the spatial images, and improved noise abatement from investigation of select spatial regions and the elimination of spike noise.
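
The modified Wiener frequency filter described above can be sketched in one dimension. This is a generic illustration, not the authors' filter: a naive stdlib DFT keeps it self-contained, and the damping constant `k` is a hypothetical value standing in for the noise-dependent term:

```python
import cmath

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform (stdlib-only, for clarity)."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def wiener_deblur(blurred, psf, k=1e-6):
    """Modified Wiener filter: X = H* Y / (|H|^2 + k).

    The constant k damps frequencies where noise would dominate, the
    compromise the abstract describes; k = 0 is naive inverse filtering.
    """
    Y, H = dft(blurred), dft(psf)
    X = [h.conjugate() * y / (abs(h) ** 2 + k) for y, h in zip(Y, H)]
    return [v.real for v in dft(X, inverse=True)]

# Blur an impulse with a circular 3-point moving average, then restore it:
signal = [0.0] * 8
signal[2] = 1.0
psf = [1/3, 1/3, 0, 0, 0, 0, 0, 1/3]
blurred = [sum(signal[(i - j) % 8] * psf[j] for j in range(8)) for i in range(8)]
restored = wiener_deblur(blurred, psf)
print(round(restored[2], 3))  # → 1.0
```

In two dimensions the same filter is applied to the image's 2D spectrum, with the point-spread function (Gaussian or Airy) estimated from the image's own Fourier transform as the abstract describes.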

  19. Real-time measurement of ice growth during simulated and natural icing conditions using ultrasonic pulse-echo techniques (United States)

    Hansman, R. J., Jr.; Kirby, M. S.


    Results of tests to measure ice accretion in real-time using ultrasonic pulse-echo techniques are presented. Tests conducted on a 10.2 cm diameter cylinder exposed to simulated icing conditions in the NASA Lewis Icing Research Tunnel and on an 11.4 cm diameter cylinder exposed to natural icing conditions in flight are described. An accuracy of ±0.5 mm is achieved for real-time ice thickness measurements. Ice accretion rate is determined by differentiating ice thickness with respect to time. Icing rates measured during simulated and natural icing conditions are compared and related to icing cloud parameters. The ultrasonic signal characteristics are used to detect the presence of surface water on the accreting ice shape and thus to distinguish between dry ice growth and wet growth. The surface roughness of the accreted ice is shown to be related to the width of the echo signal received from the ice surface.
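
The pulse-echo thickness measurement and the differentiation step can be sketched as follows; the sound speed is a representative assumed value, not the paper's calibration:

```python
def ice_thickness(echo_delay_s, c_ice=3800.0):
    """Pulse-echo thickness: the pulse crosses the ice layer twice.

    c_ice is the longitudinal sound speed in ice (m/s); 3800 m/s is an
    assumed representative value, the real calibration depends on ice type.
    """
    return c_ice * echo_delay_s / 2.0

def accretion_rate(thicknesses_mm, dt_s):
    """Icing rate (mm/s) by differencing successive thickness readings."""
    return [(b - a) / dt_s for a, b in zip(thicknesses_mm, thicknesses_mm[1:])]

# A 2-microsecond round-trip echo delay corresponds to 3.8 mm of ice:
print(round(ice_thickness(2.0e-6) * 1000, 2))  # → 3.8
# Thickness samples taken 10 s apart give the accretion rate:
print([round(r, 3) for r in accretion_rate([0.0, 0.5, 1.2], 10.0)])  # → [0.05, 0.07]
```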

  20. A fitting algorithm based on simulated annealing techniques for efficiency calibration of HPGe detectors using different mathematical functions

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, S. [Servicio de Radioisotopos, Centro de Investigacion, Tecnologia e Innovacion (CITIUS), Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail:; Garcia-Leon, M. [Departamento de Fisica Atomica, Molecular y Nuclear, Facultad de Fisica, Universidad de Sevilla, Aptd. 1065, 41080 Sevilla (Spain); Garcia-Tenorio, R. [Departamento de Fisica Aplicada II, E.T.S.A. Universidad de Sevilla, Avda, Reina Mercedes 2, 41012 Sevilla (Spain)


    In this work several mathematical functions are compared in order to perform the full-energy peak efficiency calibration of HPGe detectors, using a 126 cm{sup 3} HPGe coaxial detector and gamma-ray energies ranging from 36 to 1460 keV. Statistical tests and Monte Carlo simulations were used to study the performance of the fitting curve equations. Furthermore, fitting these complex functional forms to experimental data is a non-linear multi-parameter minimization problem. In gamma-ray spectrometry, non-linear least-squares fitting algorithms (such as the Levenberg-Marquardt method) usually provide fast convergence while minimizing {chi}{sub R}{sup 2}; however, they sometimes reach only local minima. In order to overcome that shortcoming, a hybrid algorithm based on simulated annealing (HSA) techniques is proposed. Additionally, a new function is suggested that models the efficiency curve of germanium detectors in gamma-ray spectrometry.
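    A minimal sketch of a simulated-annealing fit of an efficiency curve follows. The log-log polynomial used here is purely illustrative (it is not the new function the paper proposes), and the cooling schedule and step sizes are assumptions:

```python
import math, random

def chi2(params, data):
    """Sum of squared residuals of log-efficiency against a log-log
    polynomial ln(eff) = a + b*ln(E) + c*ln(E)^2."""
    a, b, c = params
    return sum((math.log(eff) - (a + b * math.log(E) + c * math.log(E) ** 2)) ** 2
               for E, eff in data)

def anneal(data, start=(0.0, 0.0, 0.0), t0=1.0, cooling=0.995, steps=4000, seed=1):
    """Simulated annealing over the three coefficients: random Gaussian
    moves, always accept improvements, accept worse moves with
    probability exp(-delta/T), and shrink both T and the step size."""
    rng = random.Random(seed)
    cur = best = start
    cur_e = best_e = chi2(cur, data)
    t = t0
    for _ in range(steps):
        cand = tuple(p + rng.gauss(0.0, 0.2 * t + 0.002) for p in cur)
        cand_e = chi2(cand, data)
        if cand_e < cur_e or rng.random() < math.exp((cur_e - cand_e) / t):
            cur, cur_e = cand, cand_e
            if cur_e < best_e:
                best, best_e = cur, cur_e
        t *= cooling
    return best, best_e

# Synthetic calibration points from known coefficients (36-1460 keV).
true_params = (-2.0, 0.8, -0.15)
data = [(E, math.exp(true_params[0] + true_params[1] * math.log(E)
                     + true_params[2] * math.log(E) ** 2))
        for E in (36, 60, 122, 344, 662, 1173, 1332, 1460)]
fit, err = anneal(data)
```

    Unlike a pure gradient-based fit, the stochastic acceptance rule lets the search escape local minima early on, which is the motivation the abstract gives for the hybrid approach.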

  1. St. Vincent National Wildlife Refuge Historical Photographs (United States)

    US Fish and Wildlife Service, Department of the Interior — This document contains a series of photographs from St. Vincent Island. Photograph dates range from 1909 to 1969. Subjects include structures, vegetation, equipment,...

  2. Modelling and Simulation of Digital Compensation Technique for dc-dc Converter by Pole Placement (United States)

    Shenbagalakshmi, R.; Sree Renga Raja, T.


    A thorough analysis of dc-dc converters is carried out in order to achieve system stability and to improve dynamic performance. Small-signal modelling based on the state space averaging technique is performed for the dc-dc converters. A digital state feedback gain matrix is derived by the pole placement technique in order to stabilize a completely controllable system. A prediction observer for the dc-dc converters is designed, and dynamic compensation (observer plus control law) is provided using the separation principle. The output is much improved, with zero output voltage ripple, zero peak overshoot, a much shorter settling time (in the range of milliseconds), and higher overall efficiency (>90 %).
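    For a system already in controllable canonical form, the pole-placement gains follow directly from matching characteristic-polynomial coefficients. The 2-state discrete-time example below is an illustrative sketch, not the converter model from the paper:

```python
def place_poles_ccf(a1, a2, p1, p2):
    """Pole placement for a discrete-time system in controllable canonical
    form x[k+1] = A x[k] + B u[k], with open-loop characteristic polynomial
    z^2 + a1*z + a2 and B = [0, 1]^T.  Matching coefficients of the desired
    polynomial (z - p1)(z - p2) yields the state-feedback gains directly."""
    d1, d2 = -(p1 + p2), p1 * p2          # desired z^2 + d1*z + d2
    k1, k2 = d2 - a2, d1 - a1             # feedback u[k] = -(k1*x1 + k2*x2)
    return k1, k2

def closed_loop_charpoly(a1, a2, k1, k2):
    """Coefficients (c1, c2) of det(zI - (A - B*K)) = z^2 + c1*z + c2."""
    return a1 + k2, a2 + k1

# Example: move open-loop poles of z^2 - 1.8 z + 0.9 (lightly damped)
# to well-damped poles at z = 0.2 and z = 0.3.
k1, k2 = place_poles_ccf(-1.8, 0.9, 0.2, 0.3)
```

    Verifying the closed-loop polynomial confirms the placed poles; for a full converter model the same idea is applied to the averaged small-signal state matrices.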

  3. Simulation of Acoustic Wave Propagation in Anisotropic Media Using Dynamic Programming Technique


    Botkin, Nikolai; Turova, Varvara


    It is known that the Hamiltonian of the eikonal equation for an anisotropic medium may be nonconvex, which excludes the application of Fermat’s minimum-time principle related to minimum-time control problems. The idea proposed in this paper consists in finding a conflict control problem (differential game) whose Hamiltonian coincides with the Hamiltonian of the eikonal equation. It turns out that this is always possible due to Krasovskii’s unification technique. Having...

  4. A new technique combining virtual simulation and methylene blue staining for the localization of small peripheral pulmonary lesions (United States)


    Background Quickly and accurately localizing small peripheral pulmonary lesions can avoid prolonged operative time and unplanned open thoracotomy. In this study, we aimed to introduce and evaluate a new technique combining virtual simulation and methylene blue staining for the localization of small peripheral pulmonary lesions. Methods Seventy-four patients with 80 peripheral pulmonary lesions were included; methylene blue dye was injected at the virtually identified point according to the surface point, angle and depth previously determined by the simulator. The wedge resection of the marked lesion was performed under video-assisted thoracoscopic surgery (VATS) and the specimens were sent for immediate pathologic examination. According to the pathology results, appropriate surgical procedures were decided and undertaken. Results The average lesion size was 10.4±3.5 mm (range: 4-17 mm) and the average distance to the pleural surface was 9.4±4.9 mm. Our preoperative localization procedure was successful in 75 of 80 (94%) lesions. Histological examination showed 28 benign lesions and 52 lung cancers. The shortest distance between the edges of the stain and lesion was 5.1±3.1 mm. Localization time was 17.4±2.3 min. All patients with malignant lesions subsequently underwent lobectomy and systematic lymph node dissection. No complications were observed in any participant. Conclusions The novel technique combining preoperative virtual simulation and methylene blue staining has a high success rate for localizing small peripheral pulmonary lesions, particularly tiny lesions that are difficult to visualise and palpate during VATS. PMID:24512571

  5. A Novel Idea for Optimizing Condition-Based Maintenance Using Genetic Algorithms and Continuous Event Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui


    Full Text Available Effective maintenance strategies are of utmost significance in systems engineering due to their direct linkage with the financial aspects and safety of plant operation. When the state of a system, for instance its level of deterioration, can be constantly observed, a condition-based maintenance (CBM) strategy may be adopted, wherein upkeep of the system is performed progressively on the basis of its monitored state. In this article, a multicomponent framework is considered that is continuously kept under observation. In order to decide an optimal deterioration stage for the said system, a Genetic Algorithm (GA) technique is utilized to determine when preventive maintenance should be carried out. The system is configured as a multiobjective problem aimed at optimizing two desired objectives, namely profitability and availability. For realism, a prognostic model portraying the evolution of the deteriorating system is employed, based on continuous event simulation techniques. In this regard, Monte Carlo (MC) simulation was chosen, as it can take into account a wide range of probable scenarios and thereby help reduce uncertainty. The inherent benefits offered by this simulation technique are fully utilized to represent various elements of a deteriorating system working under a stressed environment. The proposed synergic model (GA and MC) is considered more effective due to its "drop-by-drop" approach, which successfully drives the search process toward the best optimal solutions.
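    The GA-plus-Monte-Carlo synergy can be sketched as follows. All revenues, costs, deterioration rates, and GA settings below are illustrative assumptions; the article's prognostic model is considerably richer:

```python
import random

def mc_profit(threshold, rng, runs=100, horizon=40):
    """Monte Carlo estimate of average profit under a CBM policy: the
    component degrades by a random amount each step; crossing the
    preventive threshold triggers cheap maintenance, while reaching full
    deterioration (1.0) triggers an expensive failure repair."""
    total = 0.0
    for _ in range(runs):
        level, profit = 0.0, 0.0
        for _ in range(horizon):
            profit += 1.0                       # revenue while operating
            level += rng.uniform(0.0, 0.1)      # stochastic deterioration
            if level >= 1.0:
                profit -= 20.0                  # corrective (failure) cost
                level = 0.0
            elif level >= threshold:
                profit -= 4.0                   # preventive maintenance cost
                level = 0.0
        total += profit
    return total / runs

def ga_optimize(pop_size=20, gens=15, seed=3):
    """Tiny GA over the preventive threshold: elitist selection,
    midpoint crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.1, 0.99) for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda th: mc_profit(th, rng), reverse=True)
        elite = ranked[: pop_size // 2]
        children = [min(0.99, max(0.1, (rng.choice(elite) + rng.choice(elite)) / 2
                                  + rng.gauss(0.0, 0.03)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=lambda th: mc_profit(th, rng))

best_threshold = ga_optimize()
```

    With these (invented) costs, failure repairs are five times as expensive as preventive maintenance, so the GA is driven toward a late, but not reckless, intervention threshold.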

  6. Tsunami Simulators in Physical Modelling Laboratories - From Concept to Proven Technique (United States)

    Allsop, W.; Chandler, I.; Rossetto, T.; McGovern, D.; Petrone, C.; Robinson, D.


    Before 2004, there was little public awareness around Indian Ocean coasts of the potential size and effects of tsunami. Even in 2011, the scale and extent of devastation by the Japan East Coast Tsunami was unexpected. There were very few engineering tools to assess onshore impacts of tsunami, so no agreement on robust methods to predict forces on coastal defences, buildings or related infrastructure. Modelling generally used substantial simplifications of either solitary waves (far too short durations) or dam break (unrealistic and/or uncontrolled wave forms).This presentation will describe research from EPI-centre, HYDRALAB IV, URBANWAVES and CRUST projects over the last 10 years that have developed and refined pneumatic Tsunami Simulators for the hydraulic laboratory. These unique devices have been used to model generic elevated and N-wave tsunamis up to and over simple shorelines, and at example defences. They have reproduced full-duration tsunamis including the Mercator trace from 2004 at 1:50 scale. Engineering scale models subjected to those tsunamis have measured wave run-up on simple slopes, forces on idealised sea defences and pressures / forces on buildings. This presentation will describe how these pneumatic Tsunami Simulators work, demonstrate how they have generated tsunami waves longer than the facility within which they operate, and will highlight research results from the three generations of Tsunami Simulator. Of direct relevance to engineers and modellers will be measurements of wave run-up levels and comparison with theoretical predictions. Recent measurements of forces on individual buildings have been generalized by separate experiments on buildings (up to 4 rows) which show that the greatest forces can act on the landward (not seaward) buildings. Continuing research in the 70m long 4m wide Fast Flow Facility on tsunami defence structures have also measured forces on buildings in the lee of a failed defence wall.

  7. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations. (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R


    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed under the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations), but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be
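    One plausible reading of the threshold-based comparison is sketched below: a two-sample t statistic on the comparison error |S-D| versus the margin to the safety threshold |Threshold-S|. The shear-stress numbers are invented for illustration, and Welch's statistic is used as a stand-in for the study's exact test:

```python
import math, statistics

def welch_t(x, y):
    """Welch's two-sample t statistic (unequal variances)."""
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (statistics.mean(x) - statistics.mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

def threshold_validation_t(sim, exp, threshold):
    """Compare the comparison error |S - D| against the margin to the
    safety threshold |Threshold - S|.  A strongly negative t says the
    model-vs-experiment disagreement is small relative to how far the
    predictions sit from the safety threshold."""
    error = [abs(s - d) for s, d in zip(sim, exp)]
    margin = [abs(threshold - s) for s in sim]
    return welch_t(error, margin)

# Invented viscous shear-stress samples (Pa); hemolysis threshold well above.
sim = [120.0, 131.0, 126.0, 124.0, 129.0]
exp = [118.0, 133.0, 122.0, 127.0, 125.0]
t_stat = threshold_validation_t(sim, exp, threshold=600.0)
```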

  8. Pricing Survivor Forwards and Swaps in Incomplete Markets Using Simulation Techniques

    DEFF Research Database (Denmark)

    Boyer, M. Martin; Favaro, Amélie; Stentoft, Lars


    This article considers how to manage longevity risk using longevity derivatives products. We review the potential counterparties that naturally have exposure to this type of risk and we provide details on two very simple products, the survivor forward and the survivor swap, that can be used...... to trade this type of risk. We then discuss how such products can be priced using a simulation-based approach that has been shown to be successful in pricing financial derivatives. To illustrate the flexibility of the approach we price survivor forwards and swaps using the simple dynamics of the Lee...

  9. Bridging the scales in atmospheric composition simulations using a nudging technique (United States)

    D'Isidoro, Massimo; Maurizi, Alberto; Russo, Felicita; Tampieri, Francesco


    Studying the interaction between climate and anthropogenic activities, specifically those concentrated in megacities/hot spots, requires the description of processes over a very wide range of scales, from the local scale, where anthropogenic emissions are concentrated, to the global scale, where the impact of these sources is studied. Describing all the processes at all scales within the same numerical implementation is not feasible because of limited computer resources. Therefore, different phenomena are studied by means of different numerical models that cover different ranges of scales. The exchange of information from small to large scales is highly non-trivial, though of high interest. In fact, uncertainties in large-scale simulations are expected to receive a large contribution from the most polluted areas, where the highly inhomogeneous distribution of sources, combined with the intrinsic non-linearity of the processes involved, can generate non-negligible departures between coarse and fine scale simulations. In this work a new method is proposed and investigated in a case study (August 2009) using the BOLCHEM model. Monthly simulations at coarse (0.5° European domain, run A) and fine (0.1° Central Mediterranean domain, run B) horizontal resolution are performed, using the coarse resolution as boundary condition for the fine one. Then another coarse resolution run (run C) is performed, in which the high resolution fields remapped onto the coarse grid are used to nudge the concentrations over the Po Valley area. The nudging is applied to all gas and aerosol species of BOLCHEM. Averaged concentrations and variances over the Po Valley and other selected areas for O3 and PM are computed. It is observed that although the variance of run B is markedly larger than that of run A, the variance of run C is smaller, because the remapping procedure removes a large portion of variance from the run B fields. Mean concentrations show some differences depending on species: in general mean
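    The nudging itself is Newtonian relaxation of the coarse field toward the remapped fine field, applied only on the masked (e.g. Po Valley) cells. A sketch, with an assumed relaxation time scale and model time step:

```python
def nudge(coarse, fine_remapped, mask, tau=3600.0, dt=600.0):
    """Newtonian relaxation (nudging) of a coarse-grid concentration field
    toward a fine-grid field remapped onto the coarse grid, applied only
    where mask is True.  tau is the assumed relaxation time scale (s),
    dt the model time step (s)."""
    w = dt / tau
    return [c + w * (f - c) if m else c
            for c, f, m in zip(coarse, fine_remapped, mask)]

coarse = [40.0, 55.0, 60.0]          # e.g. O3 concentrations, ug/m3
fine   = [42.0, 49.0, 58.0]
mask   = [True, True, False]         # last cell lies outside the nudged area
nudged = nudge(coarse, fine, mask)
```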

  10. Development of numerical techniques for simulation of magnetogasdynamics and hypersonic chemistry (United States)

    Damevin, Henri-Marie

    Magnetogasdynamics, the science concerned with the mutual interaction between electromagnetic field and flow of electrically conducting gas, offers promising advances in flow control and propulsion of future hypersonic vehicles. Numerical simulations are essential for understanding phenomena, and for research and development. The current dissertation is devoted to the development and validation of numerical algorithms for the solution of multidimensional magnetogasdynamic equations and the simulation of hypersonic high-temperature effects. Governing equations are derived, based on classical magnetogasdynamic assumptions. Two sets of equations are considered, namely the full equations and equations in the low magnetic Reynolds number approximation. Equations are expressed in a suitable formulation for discretization by finite differences in a computational space. For the full equations, Gauss law for magnetism is enforced using Powell's methodology. The time integration method is a four-stage modified Runge-Kutta scheme, amended with a Total Variation Diminishing model in a postprocessing stage. The eigensystem, required for the Total Variation Diminishing scheme, is derived in generalized three-dimensional coordinate system. For the simulation of hypersonic high-temperature effects, two chemical models are utilized, namely a nonequilibrium model and an equilibrium model. A loosely coupled approach is implemented to communicate between the magnetogasdynamic equations and the chemical models. The nonequilibrium model is a one-temperature, five-species, seventeen-reaction model solved by an implicit flux-vector splitting scheme. The chemical equilibrium model computes thermodynamics properties using curve fit procedures. Selected results are provided, which explore the different features of the numerical algorithms. The shock-capturing properties are validated for shock-tube simulations using numerical solutions reported in the literature. The computations of
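    The four-stage Runge-Kutta time integration mentioned above can, for a linear right-hand side, be written as a simple low-storage update. The stage coefficients below are a common textbook choice and an assumption here; the dissertation's modified scheme and its TVD post-processing stage are not reproduced:

```python
import math

def rk4_low_storage(u0, rhs, dt, steps, alphas=(0.25, 1.0 / 3.0, 0.5, 1.0)):
    """Four-stage low-storage Runge-Kutta: each stage recomputes
    u_stage = u_n + a_i * dt * R(u_prev_stage).  For a linear autonomous
    right-hand side this matches the fourth-order Taylor expansion."""
    u = u0
    for _ in range(steps):
        stage = u
        for a in alphas:
            stage = u + a * dt * rhs(stage)
        u = stage
    return u

# Decay test problem du/dt = -u over t in [0, 1]; exact answer exp(-1).
u_end = rk4_low_storage(1.0, lambda u: -u, dt=0.01, steps=100)
```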

  11. Heavenly bodies the photographer's guide to astrophotography

    CERN Document Server

    Krages, Esq, Bert P


    Detailing the photographic equipment and astronomical instruments needed to capture celestial images, this guide shows how astrophotography can be accessible to all photographers. Included is a detailed introduction to basic astronomy with information on mapping the sky, locating celestial bodies, and planning an expedition to photograph astronomical phenomena. Photographers learn how to determine the color sensitivity of various films and achieve the best possible exposure, how to ensure a captivating composition, and how commercially processed prints can support their artistic vision. Whethe

  12. 3D Photographs in Cultural Heritage (United States)

    Schuhr, W.; Lee, J. D.; Kiel, St.


    This paper on providing "oo-information" (= objective object-information) on cultural monuments and sites, based on 3D photographs, is also a contribution of CIPA task group 3 to the 2013 CIPA Symposium in Strasbourg. To stimulate interest in 3D photography among scientists as well as amateurs, 3D masterpieces are presented. It is shown that, owing to their high documentary value ("near reality"), 3D photographs support, e.g., the recording, visualization, interpretation, preservation and restoration of architectural and archaeological objects. This includes samples for excavation documentation, 3D coordinate calculation, 3D photographs applied for virtual museum purposes and as educational tools, and for spatial structure enhancement, which holds in particular for inscriptions and rock art. This paper is also an invitation to participate in a systematic survey of existing international archives of 3D photographs. In this respect, first results on defining an optimum digitization rate for analog stereo views are also reported. It is more than overdue that, in addition to access to international archives of 3D photography, the available 3D photography data should appear in a global GIS (cloud) system, such as, e.g., Google Earth. This contribution also deals with exposing new 3D photographs to document monuments of importance for Cultural Heritage, including the use of 3D and single-lens cameras on a 10 m telescopic staff, for extremely low earth-based airborne 3D photography as well as for "underwater staff photography". In addition, the use of captive balloon and drone platforms for 3D photography in Cultural Heritage is reported. 
It should be emphasized that the still underestimated 3D effect on real objects even allows, e.g., the spatial perception of extremely small scratches as well as of nuances in color differences.

  13. Simulation model of harmonics reduction technique using shunt active filter by cascade multilevel inverter method (United States)

    Andreh, Angga Muhamad; Subiyanto, Sunardiyo, Said


    With the growth of non-linear loads in industrial applications and distribution systems, harmonic compensation becomes important. Harmonic pollution is an urgent problem in improving power quality. The main contributions of the study are the modeling approach used to design a shunt active filter and the application of the cascade multilevel inverter topology to improve the power quality of electrical energy. In this study, the shunt active filter is aimed at eliminating the dominant harmonic components by injecting currents opposite to the harmonic components of the system. The active filter was designed in a shunt configuration with the cascaded multilevel inverter method, controlled by a PID controller and SPWM. With this shunt active filter, the harmonic current can be reduced so that the current waveform of the source is approximately sinusoidal. Design and simulation were conducted using Power Simulator (PSIM) software. The shunt active filter performance experiment was conducted on the IEEE four-bus test system. Installing the shunt active filter on the system (IEEE four-bus) reduced the current THD from 28.68% to 3.09%. With this result, the active filter can be applied as an effective method to reduce harmonics.
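    THD figures like those reported can be computed from harmonic current magnitudes with the standard definition: the RMS of all harmonics above the fundamental, divided by the fundamental RMS. The harmonic spectra below are invented for illustration (they are not the paper's measured spectra):

```python
import math

def thd_percent(harmonic_rms):
    """Total harmonic distortion of a current waveform, given RMS
    magnitudes [I1, I2, I3, ...] with the fundamental first."""
    fundamental, harmonics = harmonic_rms[0], harmonic_rms[1:]
    return 100.0 * math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Before filtering: strong 5th/7th harmonics, typical of a rectifier load.
thd_before = thd_percent([10.0, 0.0, 0.0, 0.0, 2.4, 0.0, 1.4])
# After the shunt active filter injects compensating currents,
# the residual harmonics are much smaller.
thd_after = thd_percent([10.0, 0.0, 0.0, 0.0, 0.25, 0.0, 0.15])
```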

  14. The Synergy Between Total Scattering and Advanced Simulation Techniques: Quantifying Geopolymer Gel Evolution

    Energy Technology Data Exchange (ETDEWEB)

    White, Claire [Los Alamos National Laboratory; Bloomer, Breaunnah E. [Los Alamos National Laboratory; Provis, John L. [The University of Melbourne; Henson, Neil J. [Los Alamos National Laboratory; Page, Katharine L. [Los Alamos National Laboratory


    With the ever increasing demands for technologically advanced structural materials, together with emerging environmental consciousness due to climate change, geopolymer cement is fast becoming a viable alternative to traditional cements due to proven mechanical engineering characteristics and the reduction in CO2 emitted (approximately 80% less CO2 emitted compared to ordinary Portland cement). Nevertheless, much remains unknown regarding the kinetics of the molecular changes responsible for nanostructural evolution during the geopolymerization process. Here, in-situ total scattering measurements in the form of X-ray pair distribution function (PDF) analysis are used to quantify the extent of reaction of metakaolin/slag alkali-activated geopolymer binders, including the effects of various activators (alkali hydroxide/silicate) on the kinetics of the geopolymerization reaction. Restricting quantification of the kinetics to the initial ten hours of reaction does not enable elucidation of the true extent of the reaction, but using X-ray PDF data obtained after 128 days of reaction enables more accurate determination of the initial extent of reaction. The synergies between the in-situ X-ray PDF data and simulations conducted by multiscale density functional theory-based coarse-grained Monte Carlo analysis are outlined, particularly with regard to the potential for the X-ray data to provide a time scale for kinetic analysis of the extent of reaction obtained from the multiscale simulation methodology.

  15. Application of System Dynamics technique to simulate the fate of persistent organic pollutants in soils. (United States)

    Chaves, R; López, D; Macías, F; Casares, J; Monterroso, C


    Persistent organic pollutants (POPs) are among the most dangerous pollutants released into the environment by human activities. Due to their resistance to degradation (chemical, biological or photolytic), it is critical to assess the fate and environmental hazards of the exchange of POPs between different environmental media. System Dynamics enables the representation of complex systems and the analysis of their dynamic behavior. It provides a highly visual representation of the structure of the system and of the existing relationships between the various parameters and variables, facilitating the understanding of the behavior of the system. In the present study the fate of γ-hexachlorocyclohexane (lindane) in a contaminated soil was modeled using the Vensim® simulation software. Results show a gradual decrease in the lindane content in the soil over a simulation period of 10 years. The most important route affecting the concentrations of the contaminant was biochemical degradation, followed by infiltration and hydrodynamic dispersion. The model appeared to be highly sensitive to the half-life of the pollutant, whose value depends on environmental conditions and directly affects the biochemical degradation. Copyright © 2012 Elsevier Ltd. All rights reserved.
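    The dominant route, first-order biochemical degradation, reduces to a single stock with an outflow k·C, where k = ln 2 / half-life. A minimal Euler sketch of that soil compartment (infiltration and dispersion flows omitted; the half-life value is an assumption, not the study's calibrated parameter):

```python
import math

def simulate_lindane(c0, half_life_days, years=10, dt_days=1.0):
    """Stock-and-flow sketch of the soil compartment: first-order
    biochemical degradation with rate k = ln2 / half-life, integrated
    with a simple explicit Euler scheme."""
    k = math.log(2) / half_life_days
    c, series = c0, [c0]
    for _ in range(int(years * 365 / dt_days)):
        c -= k * c * dt_days          # degradation outflow
        series.append(c)
    return series

# Assumed: 100 units initial content, 400-day half-life, 10-year horizon.
series = simulate_lindane(c0=100.0, half_life_days=400.0)
```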

  16. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Collis, Jon [Colorado School of Mines, Golden, CO (United States)


    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
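    The simplest of the four methods, output ratio calibration, scales model output by the measured-to-simulated consumption ratio. A sketch with invented monthly kWh values:

```python
def output_ratio_calibrate(model_monthly_kwh, utility_monthly_kwh):
    """Scale simulated consumption by a single multiplier so that the
    model's annual total matches the utility bills exactly."""
    ratio = sum(utility_monthly_kwh) / sum(model_monthly_kwh)
    return ratio, [m * ratio for m in model_monthly_kwh]

# Invented monthly kWh: the model under-predicts by 10% across the board.
ratio, calibrated = output_ratio_calibrate([900.0, 1000.0, 1100.0],
                                           [990.0, 1100.0, 1210.0])
```

    The single-ratio approach is cheap and fully automated, but unlike the optimization-based methods it cannot redistribute error between end uses, which is one reason the study compares predicted retrofit savings rather than calibrated totals alone.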

  18. Design optimization of gating and feeding system through simulation technique for sand casting of wear plate

    Directory of Open Access Journals (Sweden)

    Sachin L. Nimbulkar


    Full Text Available Casting is a manufacturing process used to make complex metal shapes. During mass production, many defects, such as gas porosity, pin holes, blow holes, shrinkage and incomplete filling, may occur in sand casting. Porosity is one of the defects most frequently encountered in ductile iron casting. Porosity adds cost through scrap loss and limits the use of cast parts in critical high-strength applications. The amount of porosity is closely related to the parameters of the sand casting process, and the gating/riser system design plays a very important role in improving casting quality. Many researchers report that 90% of casting defects arise solely from improper design of the gating and feeding system. The main objectives were to study the existing design of the gating and feeding system, to optimize it using Auto-CAST X1 casting simulation software, to prepare the sand mold and cast the part, to compare the simulated and experimental results, to reduce the rejection rate, and to enable the company to resume production.

  19. Investigation of dynamic stress recovery in elastic gear simulations using different reduction techniques (United States)

    Schurr, Dennis; Holzwarth, Philip; Eberhard, Peter


    Stresses in gear contact simulations, performed using elastic multibody systems, are recovered. A single gear pair is used for the stress investigations, and an impact is chosen as the simulation case, representing an extremely dynamic situation. The gears are modeled as a reduced elastic multibody system, allowing fast computation of the dynamic problem. Depending on the projection matrix used for model order reduction, stresses can sometimes not be recovered accurately throughout the whole gear. Thus, the main focus of this paper lies on the selection of the functions which make up the projection matrix and, therefore, determine the elastic deformations and the quality of the recovered stresses. However, the chosen set of modes does not only affect stress calculation; it also strongly affects the computation of the dynamics of the gear system and, thus, the computational effort, and may lead to serious drawbacks. This issue is discussed, too. Beyond that, several different mode sets are analyzed in an attempt to minimize the computational effort of the elastic multibody system for the given problem while still being able to recover accurate stress values in distinct geometric areas. The stress values are compared with a finite element reference computation. The novel contribution of this paper is the determination of a minimal set of modes, including ones assigned to the nodes of the gear contact surface, which accurately recover stresses while minimizing the numerical drawbacks.
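    The dependence of stress recovery on the chosen mode set can be seen in a toy Galerkin reduction: with a mode that spans the true solution, displacements and element stresses are recovered exactly; with a poor mode, the stress is lost. This 2-DOF spring-chain sketch is illustrative only, not the paper's elastic multibody formulation:

```python
def reduced_response(K, f, mode):
    """Galerkin model order reduction with a single mode V: project
    K u = f onto span(V), solve the scalar reduced system, and expand
    back to full coordinates (the step on which stress recovery relies)."""
    n = len(mode)
    KV = [sum(K[i][j] * mode[j] for j in range(n)) for i in range(n)]
    Kr = sum(mode[i] * KV[i] for i in range(n))   # V^T K V
    fr = sum(mode[i] * f[i] for i in range(n))    # V^T f
    q = fr / Kr                                   # reduced coordinate
    return [v * q for v in mode]

def spring_force(u):
    """'Stress' in the second spring of the chain: proportional to the
    relative displacement of its two nodes (unit stiffness)."""
    return u[1] - u[0]

# Two-spring chain (node 0 fixed), unit stiffnesses, tip load.
K = [[2.0, -1.0], [-1.0, 1.0]]
f = [0.0, 1.0]
u_good = reduced_response(K, f, [1.0, 2.0])   # mode spans the exact solution
u_bad = reduced_response(K, f, [1.0, 1.0])    # rigid-like mode: stress is lost
```

    The second mode reproduces the overall motion reasonably but yields zero relative displacement, hence zero recovered stress, which is exactly the failure mode the paper's mode-selection study guards against.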

  20. Comparison of Mechanical and Indirect Ultrasonic Placement Technique on Mineral Trioxide Aggregate Retrofill Density in Simulated Root-end Surgery. (United States)

    Friedl, Christopher C; Williamson, Anne E; Dawson, Deborah V; Gomez, Manuel R; Liu, Wei


    The objective of this study was to evaluate the density of mineral trioxide aggregate (MTA) root-end filling placed by either manual condensation or manual condensation with indirect ultrasonic activation under simulated root-end surgery conditions in vitro. Extracted human molar teeth were obtained and sectioned to provide single-rooted samples (n = 50). Roots were instrumented to a size of 40 with a .04 taper and obturated with a warm vertical technique. The coronal end of each root was embedded in resin. A root-end resection and root-end preparation were completed on each root. Samples were randomly assigned to receive root-end fillings with ProRoot MTA (Dentsply, Tulsa, OK) by 1 of 2 techniques: manual condensation alone (group M, n = 25) or manual condensation with indirect ultrasonic activation (group U, n = 25). MTA was placed incrementally to the level of the root end using the assigned technique. Samples were weighed immediately before and after filling placement. MTA was removed from all samples so as not to change the root-end preparation, rinsed, and dried. Each sample then underwent MTA placement by the opposite technique, and weight was again measured immediately before and after MTA placement. MTA filling weights for each technique were analyzed statistically using a technique for repeated measures analysis, and the analysis accounted for any carryover or order effects. After adjustment for carryover effects, it was found that regardless of the order of placement, the mean fill weight of MTA produced by the indirect ultrasonic method was on average 4.42 mg heavier than that produced by manual condensation alone, a statistically significant result. Placement of MTA root-end fillings with indirect ultrasonic activation was thus shown to produce a filling significantly denser than MTA placed by manual condensation alone. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  1. 22 CFR 51.26 - Photographs. (United States)


    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Photographs. 51.26 Section 51.26 Foreign Relations DEPARTMENT OF STATE NATIONALITY AND PASSPORTS PASSPORTS Application § 51.26 Photographs. The applicant must submit with his or her application photographs as prescribed by the Department that are a...

  2. 31 CFR 91.10 - Photographs. (United States)


    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Photographs. 91.10 Section 91.10 Money and Finance: Treasury Regulations Relating to Money and Finance REGULATIONS GOVERNING CONDUCT IN OR ON THE BUREAU OF THE MINT BUILDINGS AND GROUNDS § 91.10 Photographs. The taking of photographs on...

  3. Using Photographs as Illustrations in Human Geography (United States)

    Rose, Gillian


    Photographs have always played a major role in geographical studies. Ever since the invention of photography in the 1830s, it has been assumed that photographs are perfectly suited to help answer that eminently geographical question, "what is this place like?". Photographs can convey a great deal of information about the appearance of a place far…

  4. The Potential Stopping Power of Student Photographs. (United States)

    Konkle, Bruce E.


    Suggests that capturing photographs that have "stopping power" should not be an impossible task, but a reality for student photographers. Lists 19 recent publications and web sites on photography and photojournalism. Discusses ways for scholastic photographers to take pictures with stopping power. (RS)

  5. Multilevel techniques lead to accurate numerical upscaling and scalable robust solvers for reservoir simulation

    DEFF Research Database (Denmark)

    Christensen, Max la Cour; Villa, Umberto; Vassilevski, Panayot


    ...approach is well suited for the solution of large problems coming from finite element discretizations of systems of partial differential equations. The AMGe technique from [10, 9] allows for the construction of operator-dependent coarse (upscaled) models and guarantees approximation properties of the coarse...... be used both as an upscaling tool and as a robust and scalable solver. The methods employed in the present paper have provable O(N) scaling and are particularly well suited for modern multicore architectures, because the construction of the coarse spaces by solving many small local problems offers a high...

  6. The Technique of Special-Effects Cinematography. (United States)

    Fielding, Raymond

    The author describes the many techniques used to produce cinematic effects that would be too costly, too difficult, too time-consuming, too dangerous, or simply impossible to achieve with conventional photographic techniques. He points out that these techniques are available not only for 35 millimeter work but also to the 16 mm. photographer who…


    Directory of Open Access Journals (Sweden)

    J.K. Visser


    Full Text Available

    ENGLISH ABSTRACT: The maintenance of some sophisticated and complex equipment – for example, diagnostic equipment used in hospitals – is frequently outsourced to the original equipment supplier or another maintenance service provider. Companies that own such assets can either enter into a service agreement, or they can use service providers on a call-out basis. A fundamental problem for the provider of maintenance services is to determine the optimum number of maintenance technicians or artisans. An investigation was done for a service company that provides planned and breakdown maintenance services to hospitals and clinics that use specialised medical imaging equipment. Data on the duration of maintenance tasks and traveling costs were used to obtain an input distribution for a simulation model. The model that was developed can now be used to determine the number of technicians that would optimise the service company’s profit. This paper discusses the features of the simulation model, as well as some results obtained with the basic simulation model.

    AFRIKAANSE OPSOMMING (translated): The maintenance of some sophisticated and complex equipment, such as diagnostic equipment used in hospitals, is often outsourced to the original manufacturers of the equipment. Companies that own such equipment can then enter into a service agreement with the service provider, or make use of a call-out system. A fundamental problem for the service provider is to determine the optimum number of technicians. An investigation was done for a typical service provider that performs planned as well as unplanned maintenance for hospitals and clinics that operate sophisticated optical equipment. Data on task durations as well as travel times were used to obtain input distributions for the development of a simulation model. The model that was developed can be used to determine the number of technicians that…
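The staffing question in the record above can be sketched as a small Monte Carlo model. Every number below (arrival counts, the lognormal job-duration fit, tariffs and wages) is an illustrative assumption, not data from the study:

```python
import random

def simulate_profit(n_techs, n_days=250, jobs_per_day=12,
                    revenue_per_job=400.0, wage_per_day=300.0,
                    hours_per_day=8.0, seed=1):
    """Crude Monte Carlo of a maintenance service provider's average
    daily profit as a function of crew size (all parameters invented)."""
    rng = random.Random(seed)
    profit = 0.0
    for _ in range(n_days):
        capacity = n_techs * hours_per_day   # total technician-hours today
        used, done = 0.0, 0
        for _ in range(jobs_per_day):
            dur = rng.lognormvariate(0.5, 0.4)   # hours, incl. travel
            if used + dur <= capacity:           # greedy: serve if it fits
                used += dur
                done += 1
        profit += done * revenue_per_job - n_techs * wage_per_day
    return profit / n_days

# Sweep the staffing level to find the most profitable crew size.
best = max(range(1, 8), key=simulate_profit)
```

Too few technicians forfeit revenue, too many inflate the wage bill; the sweep locates the trade-off, which is the decision the paper's simulation model supports.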

  8. Simulation and development of nanoscale deposition techniques using kinetic Monte Carlo (United States)

    Clark, Corey

    Modeling of deposition processes has become extremely important due to the small scale of devices and features, as well as the reduction in time to market required by industry. Current modeling procedures have focused on individual aspects of growth and have made assumptions that simplify the problem to the point that the models' usefulness is limited. The kinetic Monte Carlo (KMC) model developed in this work combines rate transitions commonly found in KMC simulation with energy density equations that help explain the transitions and the formation of islands in alloy deposition. Specifically, the model developed in this study analyzes both the surface energy and the strain energy of the film, which are incorporated to show the dependence of strain relaxation on surface energy during island formation. The developed model also incorporates the anisotropy of crystalline structures to accommodate changes in growth rate and morphology based upon crystal orientation. This leads to a more versatile model that can accommodate multiple material sets and allows for quick simulation results in the development of new devices. Work was also done on increasing the randomness of site selection while minimizing errors due to standard uniform number generators. The developed KMC model incorporates a pseudo-random number generator for site selection, which reduces the amount of cluster processing that can occur with standard random number generators. A focus was also placed on the ability to describe flux distributions that are not commonly found in semiconductor device manufacturing. This was done to allow for the expansion of the model into nonplanar environments that might be found in industries such as MEMS/NEMS. This extension also allows for the evaluation of nonplanar deposition processes such as via deposition. The expansion done in this model allows for a wider variety of applications within the semiconductor field. Accounting for…
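The core KMC loop the abstract refers to, that is, picking one event with probability proportional to its rate and advancing time by an exponential waiting time, can be sketched on a 1D lattice. The barrier model and every parameter value here are placeholders, not those of the study:

```python
import math, random

def kmc_deposition(size=32, n_events=2000, rate_dep=1.0,
                   nu=1e3, E_b=0.3, kT=0.05, seed=7):
    """Rejection-free (BKL-style) kinetic Monte Carlo on a 1D lattice.

    Two event classes: deposition (rate_dep per site) and downhill
    hopping of a surface atom over an Arrhenius barrier that grows
    with the number of lateral neighbours (illustrative model)."""
    rng = random.Random(seed)
    h = [0] * size                       # column heights
    t = 0.0
    for _ in range(n_events):
        events = [(rate_dep, 'dep', i) for i in range(size)]
        for i in range(size):
            lo = min((i - 1) % size, (i + 1) % size, key=lambda k: h[k])
            if h[i] > 0 and h[lo] < h[i]:        # a downhill hop exists
                n_bonds = sum(h[(i + d) % size] >= h[i] for d in (-1, 1))
                r = nu * math.exp(-E_b * (1 + n_bonds) / kT)
                events.append((r, 'hop', i))
        total = sum(e[0] for e in events)
        x = rng.random() * total                 # pick event ∝ its rate
        for r, kind, i in events:
            x -= r
            if x <= 0:
                break
        if kind == 'dep':
            h[i] += 1
        else:                                    # hop to the lower neighbour
            j = min((i - 1) % size, (i + 1) % size, key=lambda k: h[k])
            h[i] -= 1
            h[j] += 1
        t += -math.log(rng.random()) / total     # exponential waiting time
    return h, t

heights, t_final = kmc_deposition()
```

A production model would add strain-energy terms and crystalline anisotropy to the rate table, as the abstract describes; the selection and time-advance machinery stays the same.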

  9. Novel reaction control techniques for redundant space manipulators: Theory and simulated microgravity tests (United States)

    Cocuzza, Silvio; Pretto, Isacco; Debei, Stefano


    This paper presents two novel redundancy resolution schemes aimed at locally minimizing the reaction torque transferred to the spacecraft during manipulator manoeuvres. The subject is of particular interest in space robotics because reduced reactions result in reduced energy consumption and longer operating life of the attitude control system. The first presented solution is based on a weighted Jacobian pseudoinverse and is derived by using Lagrangian multipliers. The weight matrix is defined by means of the inertia matrix which appears in the spacecraft reaction torque dynamics. The second one is based on a least squares formulation of the minimization problem. In this formulation the linearity of the forward kinematics and of the reaction torque dynamics equations with respect to the joint accelerations is used. A closed-form solution is derived for both the presented methods, and their equivalence is proven analytically. Moreover, the proposed solutions, which are suitable for real-time implementation, are extended in order to take into account the physical limits of the manipulator joints directly inside the solution algorithms. A software simulator has been developed in order to simulate the performance of the presented solutions for the selected test cases. The proposed solutions have then been experimentally tested using a 3D free-flying robot previously tested in an ESA parabolic flight campaign. In the test campaign the 3D robot has been converted in a 2D robot thanks to its modularity in order to perform planar tests, in which the microgravity environment can be simulated without time constraints. Air-bearings are used to sustain the links weight, and a dynamometer is used to measure the reaction torque. 
The experimental validation of the presented inverse kinematics solutions, with an insight into the effect of joint flexibility on their performance, has been carried out, and the experimental results confirmed the good performance of the proposed methods.
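The analytic equivalence claimed above, that the weighted Jacobian pseudoinverse (Lagrange-multiplier) solution coincides with the least-squares formulation, can be checked numerically in a few lines. The Jacobian and weight matrix here are random stand-ins, not a model of the actual manipulator:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 7                        # task-space dim < joint-space dim (redundant)
J = rng.standard_normal((m, n))    # illustrative Jacobian
xdot = rng.standard_normal(m)      # commanded end-effector velocity
W = np.diag(rng.uniform(0.5, 2.0, n))   # stand-in for the reaction-torque weight

# Method 1: weighted pseudoinverse, the minimizer of qdot^T W qdot
# subject to J qdot = xdot (from the Lagrangian stationarity conditions).
Winv = np.linalg.inv(W)
qdot1 = Winv @ J.T @ np.linalg.solve(J @ Winv @ J.T, xdot)

# Method 2: minimum-norm least squares on whitened variables y = W^{1/2} qdot.
Wh = np.sqrt(W)                    # W is diagonal, so sqrt is elementwise
y, *_ = np.linalg.lstsq(J @ np.linalg.inv(Wh), xdot, rcond=None)
qdot2 = np.linalg.inv(Wh) @ y

assert np.allclose(J @ qdot1, xdot)   # task constraint satisfied
assert np.allclose(qdot1, qdot2)      # the two formulations agree
```

`np.linalg.lstsq` returns the minimum-norm solution of the underdetermined system, which after unwhitening is exactly the weighted-pseudoinverse solution, mirroring the paper's analytic proof.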

  10. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal


    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  11. Two-dimensional action spectroscopy of excitonic systems: Explicit simulation using a phase-modulation technique (United States)

    Damtie, Fikeraddis A.; Wacker, Andreas; Pullerits, Tõnu; Karki, Khadga J.


    Two-dimensional (2D) spectroscopy has been intensively used to study electronic and vibronic coherences in biological systems and semiconductors. This technique studies coherent as well as incoherent signals that arise from the nonlinear interaction of a sequence of laser pulses. In this paper we present a direct evaluation of the 2D signal based on elementary quantum kinetics in order to compare with the common approximate diagrammatic approaches. Here we consider incoherent action signals such as fluorescence or photocurrent as the observable, which is easily accessible in a measurement. These observables are calculated by solving the time evolution of the density matrix in the Lindblad form, which can take into account all possible decoherence processes. The phase modulation technique is used to separate the relevant nonlinear signals from the other possible interaction pathways. The approach can be used to calculate 2D spectra of any quantum system. For our model system we find a good agreement for the quantum beating between the coupled states.

  12. A dynamic mesh refinement technique for Lattice Boltzmann simulations on octree-like grids

    KAUST Repository

    Neumann, Philipp


    In this contribution, we present our new adaptive Lattice Boltzmann implementation within the Peano framework, with special focus on nanoscale particle transport problems. With the continuum hypothesis not holding anymore on these small scales, new physical effects - such as Brownian fluctuations - need to be incorporated. We explain the overall layout of the application, including memory layout and access, and shortly review the adaptive algorithm. The scheme is validated by different benchmark computations in two and three dimensions. An extension to dynamically changing grids and a spatially adaptive approach to fluctuating hydrodynamics, allowing for the thermalisation of the fluid in particular regions of interest, is proposed. Both dynamic adaptivity and adaptive fluctuating hydrodynamics are validated separately in simulations of particle transport problems. The application of this scheme to an oscillating particle in a nanopore illustrates the importance of Brownian fluctuations in such setups. © 2012 Springer-Verlag.

  13. Robust and adaptive techniques for numerical simulation of nonlinear partial differential equations of fractional order (United States)

    Owolabi, Kolade M.


    In this paper, some nonlinear space-fractional order reaction-diffusion equations (SFORDE) on a finite but large spatial domain x ∈ [0, L], x = x(x, y, z), and t ∈ [0, T] are considered. Also in this work, the standard reaction-diffusion system with boundary conditions is generalized by replacing the second-order spatial derivatives with Riemann-Liouville space-fractional derivatives of order α, for 0 < α ≤ 2, including the super-diffusive range (1 < α < 2). Computer simulations of SFORDE give enough evidence that pattern formation in a fractional medium at certain parameter values is practically the same as in the standard reaction-diffusion case. With application to models in biology and physics, different spatiotemporal dynamics are observed and displayed.
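A minimal sketch of a space-fractional reaction-diffusion step, here on a periodic 1D domain using the Fourier symbol |k|^α of the fractional Laplacian with a logistic reaction. This is an illustration of the idea, not the paper's Riemann-Liouville formulation or its integrator:

```python
import numpy as np

def frac_rd_step(u, dt, alpha=1.5, D=0.1, L=2 * np.pi):
    """One semi-implicit step of u_t = -D(-Δ)^{α/2} u + u(1 - u):
    explicit logistic reaction, implicit fractional diffusion applied
    spectrally via the multiplier |k|^alpha (sketch only)."""
    N = u.size
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)     # angular wavenumbers
    u_hat = np.fft.fft(u + dt * u * (1 - u))       # reaction, then transform
    u_hat /= 1 + dt * D * np.abs(k) ** alpha       # damp each mode
    return np.fft.ifft(u_hat).real

N = 128
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
u = 0.5 + 0.4 * np.cos(x)          # smooth initial profile in (0, 1)
for _ in range(200):
    u = frac_rd_step(u, dt=0.01)
```

With 1 < α < 2 the diffusion term damps high wavenumbers more weakly than the classical Laplacian (|k|^α vs. |k|^2), which is the super-diffusive behaviour the abstract discusses.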

  14. Coordinate space translation technique for simulation of electronic process in the ion-atom collision. (United States)

    Wang, Feng; Hong, Xuhai; Wang, Jian; Kim, Kwang S


    Recently we developed a theoretical model of ion-atom collisions, which was made on the basis of a time-dependent density functional theory description of the electron dynamics and a classical treatment of the heavy particle motion. Taking advantage of the real-space grid method, we introduce a "coordinate space translation" technique that allows one to focus on a certain region of interest, such as the region around the projectile or the target. Benchmark calculations are given for collisions between protons and oxygen over a wide range of impact energies. To extract the probability of charge transfer, the formulation of Lüdde and Dreizler [J. Phys. B 16, 3973 (1983)] has been generalized to an ensemble-averaging application in the particular case of O(³P). Charge transfer total cross sections are calculated, showing fairly good agreement between experimental data and the present theoretical results.

  15. Developing close combat behaviors for simulated soldiers using genetic programming techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Pryor, Richard J.; Schaller, Mark J.


    Genetic programming is a powerful methodology for automatically producing solutions to problems in a variety of domains. It has been used successfully to develop behaviors for RoboCup soccer players and simple combat agents. We will attempt to use genetic programming to solve a problem in the domain of strategic combat, keeping in mind the end goal of developing sophisticated behaviors for compound defense and infiltration. The simplified problem at hand is that of two armed agents in a small room, containing obstacles, fighting against each other for survival. The base case and three changes are considered: a memory of positions using stacks, context-dependent genetic programming, and strongly typed genetic programming. Our work demonstrates slight improvements from the first two techniques, and no significant improvement from the last.

  16. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery

    Directory of Open Access Journals (Sweden)

    Hideyuki Suenaga


    Conclusion: The implementation of computer-assisted preoperative simulation for the positioning and fixation of plates in a 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery yielded good results.

  17. The effect of user experience and inflation technique on endotracheal tube cuff pressure using a feline airway simulator. (United States)

    White, Donna M; Redondo, José I; Mair, Alastair R; Martinez-Taboada, Fernando


    The aim of this study was to evaluate the effect of user experience and inflation technique on endotracheal tube cuff pressure using a feline airway simulator, in a prospective, experimental clinical study. Participants included veterinary students at the beginning (group S1) and end (group S2) of their 2-week anaesthesia rotation and veterinary anaesthetists (group A). The feline airway simulator was designed to simulate an average-size feline trachea, intubated with a 4.5 mm low-pressure, high-volume cuffed endotracheal tube connected to a Bain breathing system with an oxygen flow of 2 L minute-1. Participants inflated the endotracheal tube cuff by pilot balloon palpation and by instilling the minimum occlusive volume (MOV) required for loss of airway leaks during manual ventilation. Intracuff pressures were measured by manometers obscured from participants; the ideal range was 20-30 cm H2O. Student t, Fisher exact, and Chi-squared tests were used where appropriate to analyse the data. Experience had no effect on this skill and, as such, a cuff manometer is recommended. Copyright © 2017 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. Published by Elsevier Ltd. All rights reserved.

  18. New beam-tracking simulation code using bulk-to-point calculation technique for space charge fields (United States)

    Mizuno, A.


    A new two-dimensional beam-tracking simulation code for electron injectors using a bulk-to-point calculation technique for space charge fields was developed. The calculated space charge fields are produced not by a point charge but by a hollow cylinder that has a volume. Each tracked electron is a point charge. This bulk-to-point calculation technique for space charge fields is based on that used in the multiple beam envelope equations, which were developed by the author. The multiple beam envelope equations are a set of differential equations for investigating the beam dynamics of electron injectors and can be used to calculate bunched beam dynamics with high accuracy. However, there is one limitation. The bunched beam is assumed to be an ensemble of several segmentation pieces in both the transverse and longitudinal directions. In this bunch model, each longitudinal segmentation slice in a bunch must not warp; consequently, the accuracy of the calculated emittance is reduced in the case of a highly charged beam for calculations of a typical rf gun injector system. This limitation is related to the calculation model of longitudinal space charge fields. In the newly developed beam-tracking simulation code, the space charge field calculation scheme is upgraded and the limitation has been overcome. Therefore, the applicable range is extended while maintaining the high accuracy of emittance calculations. Simultaneously, the calculation time is markedly shortened because the emittance dependence on the segmentation number is extremely weak. In this paper, several examples of beam dynamics that cannot be calculated accurately using the multiple beam envelope equations are demonstrated using the new beam-tracking simulation code. The accuracy of the calculated emittance is also discussed.

  19. Electron Irradiation of Conjunctival Lymphoma-Monte Carlo Simulation of the Minute Dose Distribution and Technique Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo, E-mail: [NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Essen (Germany); Zaragoza, Francisco J.; Sempau, Josep [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Barcelona (Spain); Wittig, Andrea [Department of Radiation Oncology, University Hospital Giessen and Marburg, Philipps-University Marburg, Marburg (Germany); Sauerwein, Wolfgang [NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Essen (Germany)


    Purpose: External beam radiotherapy is the only conservative curative approach for Stage I non-Hodgkin lymphomas of the conjunctiva. The target volume is geometrically complex because it includes the eyeball and lid conjunctiva. Furthermore, the target volume is adjacent to radiosensitive structures, including the lens, lacrimal glands, cornea, retina, and papilla. The radiotherapy planning and optimization requires accurate calculation of the dose in these anatomical structures that are much smaller than the structures traditionally considered in radiotherapy. Neither conventional treatment planning systems nor dosimetric measurements can reliably determine the dose distribution in these small irradiated volumes. Methods and Materials: The Monte Carlo simulations of a Varian Clinac 2100 C/D and human eye were performed using the PENELOPE and PENEASYLINAC codes. Dose distributions and dose volume histograms were calculated for the bulbar conjunctiva, cornea, lens, retina, papilla, lacrimal gland, and anterior and posterior hemispheres. Results: The simulated results allow choosing the most adequate treatment setup configuration, which is an electron beam energy of 6 MeV with additional bolus and collimation by a cerrobend block with a central cylindrical hole of 3.0 cm diameter and central cylindrical rod of 1.0 cm diameter. Conclusions: Monte Carlo simulation is a useful method to calculate the minute dose distribution in ocular tissue and to optimize the electron irradiation technique in highly critical structures. Using a voxelized eye phantom based on patient computed tomography images, the dose distribution can be estimated with a standard statistical uncertainty of less than 2.4% in 3 min using a computing cluster with 30 cores, which makes this planning technique clinically relevant.

  20. Objective facial photograph analysis using imaging software. (United States)

    Pham, Annette M; Tollefson, Travis T


    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique is never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future. Copyright 2010 Elsevier Inc. All rights reserved.

  1. Efficient Time Propagation Technique for MAS NMR Simulation: Application to Quadrupolar Nuclei. (United States)

    Charpentier; Fermon; Virlet


    The quantum mechanical Floquet theory is investigated in order to derive an efficient way of performing numerical calculations of the dynamics of nuclear spin systems in MAS NMR experiments. Here, we take advantage of time domain integration of the quantum evolution over one period as proposed by Eden et al. (1). But a full investigation of the propagator U(t, t0), and especially its dependence with respect to t and t0 within a formalized approach, leads to further simplifications and to a substantial reduction in computation time when performing powder averaging for any complex sequence. Such an approximation is suitable for quadrupolar nuclei (I > 1/2) and can be applied to the simulation of the RIACT (rotational induced adiabatic coherence transfer) phenomenon that occurs under special experimental conditions in spin locking experiments (2-4). The present method is also compared to the usual infinite dimensional Floquet space approach (5, 6), which is shown to be rather inefficient. As far as we know, it has never been reported for quadrupolar nuclei with I ≥ 3/2 in spin locking experiments. The method can also be easily extended to other areas of spectroscopy. Copyright 1998 Academic Press.
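The computational saving the abstract describes, building the one-period propagator U(T, 0) once and reusing it stroboscopically, can be illustrated with a toy periodically modulated two-level Hamiltonian (a stand-in for the MAS-modulated quadrupolar Hamiltonian, not the actual spin system):

```python
import numpy as np

def U_period(H_of_t, n_steps=200, T=1.0):
    """Propagator over one period by piecewise-constant integration,
    U(T, 0) = Π_k exp(-i H(t_k) Δt), exponentiating each Hermitian
    sample by diagonalization (sketch)."""
    dim = H_of_t(0.0).shape[0]
    U = np.eye(dim, dtype=complex)
    dt = T / n_steps
    for k in range(n_steps):
        H = H_of_t((k + 0.5) * dt)           # midpoint sampling
        w, V = np.linalg.eigh(H)             # H(t) is Hermitian
        U = (V * np.exp(-1j * w * dt)) @ V.conj().T @ U
    return U

# Toy rotor-modulated two-level Hamiltonian with period T = 1.
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
H = lambda t: 2 * np.pi * (np.cos(2 * np.pi * t) * sz + 0.3 * sx)

U = U_period(H)
# Stroboscopic evolution over N rotor periods is just U**N: the
# expensive time integration is done once per period, not per point.
```

For powder averaging, each crystallite orientation needs its own U, but every rotor period thereafter is a single matrix multiplication, which is the source of the speed-up the paper formalizes.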

  2. Development of an interpretive simulation tool for the proton radiography technique (United States)

    Levy, M. C.; Ryutov, D. D.; Wilks, S. C.; Ross, J. S.; Huntington, C. M.; Fiuza, F.; Martinez, D. A.; Kugland, N. L.; Baring, M. G.; Park, H.-S.


    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that propagates realistic laser-driven point-like proton sources through three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high-resolution proton radiograph. The tool's numerical approach captures all relevant physics effects, including those related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field "primitives" is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high-resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ~10^8 particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ~10 mm^3. Insights derived from this application show that the tool can support understanding of HED plasmas.
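The basic synthesis step, deflecting source protons through a field "primitive" and binning them on a detector plane, can be sketched in 1D with an impulse-approximation kick. The Gaussian filament, geometry, and all numbers are illustrative, not taken from the cited experiments:

```python
import numpy as np

def radiograph(n_protons=20000, E_MeV=14.7, B0=10.0, a=1e-4,
               L_drift=0.1, bins=64, seed=3):
    """Toy 1D proton radiograph of a Gaussian magnetic filament.

    Each proton crossing B_y(x) = B0*exp(-x^2/2a^2) over a path
    length ~a receives a small transverse kick (impulse
    approximation), then drifts ballistically to the detector,
    where pile-up appears as fluence modulation."""
    rng = np.random.default_rng(seed)
    m_p, q = 1.67e-27, 1.6e-19
    v = np.sqrt(2 * E_MeV * 1.6e-13 / m_p)          # non-relativistic speed
    x = rng.normal(0.0, 5 * a, n_protons)            # entry positions
    theta = q * B0 * np.exp(-x**2 / (2 * a**2)) * a / (m_p * v)  # kick angle
    x_det = x + theta * L_drift                      # straight drift
    hist, _ = np.histogram(x_det, bins=bins, range=(-1e-3, 1e-3))
    return hist

profile = radiograph()
```

The full tool integrates trajectories through arbitrary 3D fields rather than using an impulse kick, which is what lets it capture caustic formation faithfully.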

  3. A data-based technique for monitoring of wound rotor induction machines: A simulation study

    Directory of Open Access Journals (Sweden)

    Fouzi Harrou


    Full Text Available Detecting faults in induction machines is crucial for the safe operation of these machines. The aim of this paper is to present a statistical fault detection methodology for the detection of faults in three-phase wound rotor induction machines (WRIM. The proposed fault detection approach is based on the use of principal components analysis (PCA. However, conventional PCA-based detection indices, such as the T2 and the Q statistics, are not well suited to detect small faults because these indices only use information from the most recent available samples. Detection of small faults is one of the most crucial and challenging tasks in the area of fault detection and diagnosis. In this paper, a new statistical system monitoring strategy is proposed for detecting changes resulting from small shifts in several variables associated with WRIM. The proposed approach combines PCA modeling with the exponentially weighted moving average (EWMA control scheme. In the proposed approach, the EWMA control scheme is applied on the ignored principal components to detect the presence of faults. The performance of the proposed method is compared with those of the traditional PCA-based fault detection indices. The simulation results clearly show the effectiveness of the proposed method over the conventional ones, especially in the presence of faults with small magnitudes.

  4. A data-based technique for monitoring of wound rotor induction machines: A simulation study

    KAUST Repository

    Harrou, Fouzi


    Detecting faults in induction machines is crucial for the safe operation of these machines. The aim of this paper is to present a statistical fault detection methodology for the detection of faults in three-phase wound rotor induction machines (WRIM). The proposed fault detection approach is based on the use of principal components analysis (PCA). However, conventional PCA-based detection indices, such as the T2 and the Q statistics, are not well suited to detect small faults because these indices only use information from the most recent available samples. Detection of small faults is one of the most crucial and challenging tasks in the area of fault detection and diagnosis. In this paper, a new statistical system monitoring strategy is proposed for detecting changes resulting from small shifts in several variables associated with WRIM. The proposed approach combines PCA modeling with the exponentially weighted moving average (EWMA) control scheme. In the proposed approach, the EWMA control scheme is applied on the ignored principal components to detect the presence of faults. The performance of the proposed method is compared with those of the traditional PCA-based fault detection indices. The simulation results clearly show the effectiveness of the proposed method over the conventional ones, especially in the presence of faults with small magnitudes.
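The mechanism described above, fitting PCA on fault-free data and running an EWMA filter on the ignored (residual) subspace, can be sketched with synthetic data. The six correlated "sensor" channels are a toy stand-in for WRIM signals, not real machine measurements:

```python
import numpy as np

rng = np.random.default_rng(42)

# Fault-free training data: 6 channels driven by 2 latent factors.
n, p, k = 500, 6, 2
mix = rng.standard_normal((k, p))
X = rng.standard_normal((n, k)) @ mix + 0.1 * rng.standard_normal((n, p))
mu, sd = X.mean(0), X.std(0)

_, _, Vt = np.linalg.svd((X - mu) / sd, full_matrices=False)
P = Vt[:k].T                                  # retained loadings
resid = lambda Z: Z - Z @ P @ P.T             # ignored-subspace part

def ewma_chart(stream, lam=0.2):
    """EWMA filter on the residual magnitude of each new sample;
    a persistent rise in z flags a fault."""
    e = np.linalg.norm(resid((stream - mu) / sd), axis=1)
    z, out = 0.0, []
    for ei in e:
        z = lam * ei + (1 - lam) * z
        out.append(z)
    return np.array(out)

# New data with a small additive shift on two channels after sample 100.
test = rng.standard_normal((200, k)) @ mix + 0.1 * rng.standard_normal((200, p))
test[100:, [2, 3]] += 0.6
z = ewma_chart(test)
```

Because the EWMA accumulates evidence across samples instead of judging each sample alone (as T2 and Q do), the small persistent shift lifts the chart well above its fault-free level.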

  5. Evaluation and simulation of event building techniques for a detector at the LHC

    CERN Document Server

    Spiwoks, R


    The main objectives of future experiments at the Large Hadron Collider are the search for the Higgs boson (or bosons), the verification of the Standard Model and the search beyond the Standard Model in a new energy range up to a few TeV. These experiments will have to cope with unprecedented high data rates and will need event building systems which can offer a bandwidth of 1 to 100GB/s and which can assemble events from 100 to 1000 readout memories at rates of 1 to 100kHz. This work investigates the feasibility of parallel event building systems using commercially available high speed interconnects and switches. Studies are performed by building a small-scale prototype and by modelling this prototype and realistic architectures with discrete-event simulations. The prototype is based on the HiPPI standard and uses commercially available VME-HiPPI interfaces and a HiPPI switch together with modular and scalable software. The setup operates successfully as a parallel event building system of limited size in...

  6. Shear viscosity of hard chain fluids through molecular dynamics simulation techniques

    Directory of Open Access Journals (Sweden)

    Ratanapisit, J.


    Full Text Available In this paper, we present results for the viscosity of hard chain fluids. This study began with equilibrium molecular dynamics simulations of pure hard-sphere molecules and was then extended to hard chain fluids. The hard chain model is one in which each molecule is represented as a chain of freely jointed hard spheres that interact on a site-site basis. The major use of the results from this study lies in the future development of a transport perturbation theory in which the hard chain serves as the reference. Our results agree with previous studies to within the combined uncertainties. Comparisons have also been made to a modified Enskog theory. The results show that the Enskog theory fails to predict the high-density viscosity, and that it fails more rapidly with density as the chain length increases. We attribute this to a failure of the molecular chaos assumption used in the Enskog theory. Further comparisons are made to real fluids using the SAFT-MET and TRAPP approaches. As expected, the hard sphere model is not appropriate for estimating properties of real fluids. However, it provides a good starting point to serve as the reference basis for studying chain molecule systems.

  7. Optimal design of minimum mean-square error noise reduction algorithms using the simulated annealing technique. (United States)

    Bai, Mingsian R; Hsieh, Ping-Ju; Hur, Kur-Nan


    The performance of the minimum mean-square error noise reduction (MMSE-NR) algorithm in conjunction with time-recursive averaging (TRA) for noise estimation is found to be very sensitive to the choice of two recursion parameters. To address this problem in a more systematic manner, this paper proposes an optimization method to efficiently search for the optimal parameters of the MMSE-TRA-NR algorithms. The objective function is based on a regression model, whereas the optimization process is carried out with the simulated annealing algorithm, which is well suited to problems with many local optima. Another NR algorithm proposed in the paper employs linear prediction coding as a preprocessor for extracting the correlated portion of human speech. Objective and subjective tests were undertaken to compare the optimized MMSE-TRA-NR algorithm with several conventional NR algorithms. The results of the subjective tests were processed using analysis of variance to assess statistical significance. A post hoc test, Tukey's Honestly Significant Difference, was conducted to further assess the pairwise differences between the NR algorithms.
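
The simulated-annealing search described above follows a standard pattern. The sketch below is a generic illustration with an invented two-parameter objective, not the paper's regression model; it shows the Metropolis acceptance step that lets the search climb out of local optima.

```python
import math
import random

def objective(params):
    """Stand-in objective: a smooth bowl plus ripples that create many
    local optima (both the function and its constants are illustrative)."""
    a, b = params
    return (a - 0.3) ** 2 + (b - 0.7) ** 2 + 0.05 * math.cos(25 * a) * math.cos(25 * b)

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, steps=5000, seed=1):
    rng = random.Random(seed)
    x, fx, t = list(x0), f(x0), t0
    best, fbest = list(x), fx
    for _ in range(steps):
        cand = [xi + rng.gauss(0.0, 0.1) for xi in x]   # random perturbation
        fc = f(cand)
        # Metropolis rule: always take improvements, sometimes accept
        # uphill moves so the search can escape local optima.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling    # geometric cooling schedule
    return best, fbest

best, fbest = simulated_annealing(objective, [0.0, 0.0])
print(best, fbest)
```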

  8. Optimizing human reliability: Mock-up and simulation techniques in waste management

    Energy Technology Data Exchange (ETDEWEB)

    Caccamise, D.J.; Somers, C.S.; Sebok, A.L.


    With the new mission at Rocky Flats to decontaminate and decommission a 40-year-old nuclear weapons production facility come many interesting new challenges for human factors engineering. Because the goal at Rocky Flats is to transform the environment, the workforce that undertakes this mission will find itself in a state of constant change, responding to ever-changing task demands in a constantly evolving workplace. In order to achieve the flexibility necessary under these circumstances and still maintain control of the human reliability issues that exist in a hazardous, radioactive work environment, Rocky Flats developed an Engineering Mock-up and Simulation Lab to plan, design, test, and train personnel for new tasks involving hazardous materials. This presentation describes how this laboratory is used to develop equipment, tools, work processes, and procedures to address human reliability concerns in the operational environment. We discuss a particular instance in which a glovebag, large enough to house two individuals, was developed at this laboratory to protect workers as they cleaned fissile material from the building ventilation duct systems.

  10. Development of an interpretive simulation tool for the proton radiography technique

    Energy Technology Data Exchange (ETDEWEB)

    Levy, M. C., E-mail: [Clarendon Laboratory, University of Oxford, Parks Road, Oxford OX1 3PU (United Kingdom); Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Ryutov, D. D.; Wilks, S. C.; Ross, J. S.; Huntington, C. M.; Fiuza, F.; Martinez, D. A.; Park, H.-S. [Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Kugland, N. L. [Lam Research Corporation, 4400 Cushing Parkway, Fremont, California 94538 (United States); Baring, M. G. [Department of Physics and Astronomy, Rice University, Houston, Texas 77005 (United States)


    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that propagates realistic laser-driven, point-like proton sources through three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The tool's numerical approach captures all relevant physics effects, including those related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field “primitives” is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection with specific underlying electromagnetic field elements. We show an example application of the tool in connection with experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10^8 particles generated from a realistic laser-driven point-like proton source and imaging fields that cover volumes of ∼10 mm^3. Insights derived from this application show that the tool can support understanding of HED plasmas.
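
At the core of such a tool is a particle pusher that advances protons through the imported fields. The sketch below is a minimal illustration, not the paper's code: it traces one proton through a uniform magnetic slab with the standard Boris rotation (field strength, slab size, and proton speed are invented values) and checks the exit deflection against the small-angle estimate θ ≈ qBL/(mv).

```python
import math

Q, M = 1.602e-19, 1.673e-27   # proton charge [C] and mass [kg]
B = 0.1                       # magnetic field along z [T] (illustrative)
L = 0.01                      # slab thickness along x [m] (illustrative)
V = 1.0e7                     # proton speed [m/s], non-relativistic

def trace(dt=1.0e-12):
    """Advance one proton across the slab with the Boris pusher."""
    x = 0.0
    vx, vy = V, 0.0
    t = (Q * B / M) * dt / 2.0        # half-step rotation parameter
    s = 2.0 * t / (1.0 + t * t)
    while x < L:
        vpx = vx + vy * t             # v' = v + v x t   (B along +z)
        vpy = vy - vx * t
        vx, vy = vx + vpy * s, vy - vpx * s   # v+ = v + v' x s
        x += vx * dt
    return abs(math.atan2(vy, vx))    # exit deflection angle [rad]

theta_num = trace()
theta_est = Q * B * L / (M * V)       # small-angle analytic estimate
print(theta_num, theta_est)
```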

  11. Cervical Spine Motion During Airway Management Using Two Manual In-line Immobilization Techniques: A Human Simulator Model Study. (United States)

    De Jesus, Clarines Rosa; García Peña, Barbara M; Lozano, Juan Manuel; Maniaci, Vincenzo


    The aim of this study was to evaluate cervical spine motion during 2 manual in-line immobilization techniques using a human simulator model. Medical students, pediatric and family practice residents, and pediatric emergency medicine fellows were recruited to maintain manual in-line cervical immobilization from above the head of the bed and from across the chest of a human simulator while orotracheal intubation was performed. Participants were then instructed on appropriate holding techniques after the initial session, and orotracheal intubation was repeated. A tilt sensor measured time to intubation and cervical extension and rotation angles. Seventy-one subjects participated in a total of 284 successful orotracheal intubations. Without training, no difference in cervical spine movement or time to intubation was observed between the 2 in-line manual immobilization techniques. After training, however, statistically significant differences between assistants above the head and across the chest were observed: extension 2.1° (95% confidence interval [CI], 1.15 to 3.00; P < 0.0001), rotation 0.7° (95% CI, 0.26 to 1.19; P = 0.003), and intubation time -1.9 seconds (95% CI, -3.45 to -0.13; P = 0.035). Cervical spine movement did not change when maintaining cervical spine immobilization from above the head versus across the chest before training. There was a statistically significant change in extension and rotation when assistants were above the head and in time to intubation when assistants were across the chest after training. The clinical significance of these results is unclear.

  12. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques (United States)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca


    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL: Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
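
The weighting idea behind such variance reduction can be shown with a toy example (probabilities invented for the demo, not from the paper): instead of waiting for a rare fluorescent emission to occur by chance, every history is forced to emit and carries the emission probability as a statistical weight, leaving the estimator unbiased while shrinking its variance.

```python
import random

P_FLUOR = 1.0e-3   # probability an interaction produces fluorescence (invented)
P_DETECT = 0.05    # probability an emitted photon reaches the detector (invented)

def analog(n, rng):
    """Plain Monte Carlo: score only when both rare events happen by chance."""
    hits = 0
    for _ in range(n):
        if rng.random() < P_FLUOR and rng.random() < P_DETECT:
            hits += 1
    return hits / n

def forced(n, rng):
    """Variance reduction: force the fluorescent emission in every history
    and carry its probability P_FLUOR as a statistical weight instead."""
    total = 0.0
    for _ in range(n):
        if rng.random() < P_DETECT:
            total += P_FLUOR
    return total / n

rng = random.Random(42)
exact = P_FLUOR * P_DETECT            # true detected-fluorescence yield
analog_est = analog(200_000, rng)
forced_est = forced(200_000, rng)
print(analog_est, forced_est, exact)
```

Both estimators have the same expectation, but the forced estimator's per-history variance is smaller by roughly a factor of P_FLUOR, which is the source of the orders-of-magnitude speedups quoted above.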

  13. Estimating Fractional Shrub Cover Using Simulated EnMAP Data: A Comparison of Three Machine Learning Regression Techniques

    Directory of Open Access Journals (Sweden)

    Marcel Schwieder


    Full Text Available Anthropogenic interventions in natural and semi-natural ecosystems often lead to substantial changes in their functioning and may ultimately threaten ecosystem service provision. It is, therefore, necessary to monitor these changes in order to understand their impacts and to support management decisions that help ensure sustainability. Remote sensing has proven to be a valuable tool for these purposes, and hyperspectral sensors in particular are expected to provide valuable data for the quantitative characterization of land change processes. In this study, simulated EnMAP data were used for mapping shrub cover fractions along a gradient of shrub encroachment, in a study region in southern Portugal. We compared three machine learning regression techniques: Support Vector Regression (SVR), Random Forest Regression (RF), and Partial Least Squares Regression (PLSR). Additionally, we compared the influence of training sample size on prediction performance. All techniques showed reasonably good results when trained with large samples, while SVR always outperformed the other algorithms. The best model was applied to produce a fractional shrub cover map for the whole study area. The predicted patterns revealed a gradient of shrub cover between regions affected by special agricultural management schemes for nature protection and areas without land use incentives. Our results highlight the value of EnMAP data in combination with machine learning regression techniques for monitoring gradual land change processes.

  14. Estimating childhood mortality trends from routine data: a simulation using the preceding birth technique in Bangladesh. (United States)

    Bairagi, R; Shuaib, M; Hill, A G


    The Preceding Birth Technique (PBT) has been proposed as a method for ascertaining the prevailing level of under-2 mortality in countries without full vital registration. It is a monitoring tool rather than a replacement for the established approaches to measuring childhood mortality levels and differentials that demographers have developed over the last 30 years. The principal obstacle to wider adoption of the PBT is the low proportion of women who give birth in maternity clinics and hospitals. A larger proportion of mothers, however, visit clinics and hospitals for antenatal care and to vaccinate their newborns. We used data from the Matlab surveillance system to test the accuracy of mortality estimates derived using the PBT with data obtained from mothers at antenatal visits and at the vaccination of their youngest children. The study shows that, because of the long birth intervals in Bangladesh, the PBT estimates under-3 rather than under-2 mortality. Nonetheless, when the data are used to simulate the collection of this information at antenatal or postnatal visits, they provide an accurate description of under-3 mortality trends and differentials for the two periods examined, before 1984 and before 1989.

  15. Structural determination of Co/TiO2 nanocomposite: XRD technique and simulation analysis

    Directory of Open Access Journals (Sweden)

    Mostaghni F.


    Full Text Available Synthesis and combined theoretical and experimental studies of Co/TiO2 anatase are reported. Co/TiO2 was prepared by the sol-gel method. The distribution of cations among the tetrahedral and octahedral sites was estimated by analyzing the powder X-ray diffraction patterns with the Rietveld refinement technique, and the results revealed a tetragonal structure. Band structure and density of states calculations were performed using first-principles methods, and the structural and electronic properties of Co/TiO2 were calculated in the generalized gradient approximation (GGA). A comparison with pure TiO2 anatase allowed us to clarify the effect of cobalt doping on the electronic structure and the band gap: the band gap of Co/TiO2 was narrowed by a broadening of the valence band resulting from the overlap among Co 3d, Ti 3d, and O 2p states, which makes the material respond better to visible and solar light.

  16. New iterative load balancing scheme with multi-grid level relaxation technique toward a large scale geodynamical granular simulation (United States)

    Furuichi, M.; Nishiura, D.


    The complex dynamics of granular systems is an essential part of natural processes such as crystal-rich magma flow, accretionary prism formation or tsunami sedimentation. Numerical modeling with the Discrete Element Method (DEM) is an effective approach for understanding granular dynamics, especially when contact between particles induces strongly non-linear rheology (e.g., DEM-CFD simulation of a magma reservoir [Bergantz, Nature Geo., 2015; Furuichi and Nishiura, G-cubed, 2014]). In moving Lagrangian particle methods like DEM, a large number of particles is required to obtain an accurate solution, so efficient parallelization of the code is important for handling huge particle systems on HPC platforms. However, since particles move around during the simulation, the workload between the different MPI processes becomes imbalanced when static sub-domains are used. To overcome this limitation, we present a new dynamic load balancing algorithm applicable to particle simulation methods such as DEM and Smoothed Particle Hydrodynamics (SPH) [Furuichi and Nishiura, submitted to Comput. Phys. Comm.]. Our method utilizes a flexible orthogonal domain decomposition in which the domain is divided into columns, each of which independently defines rectangular sub-domains by rows. We regard the difference in execution time between neighbouring logical processes as the residual of a nonlinear problem of the domain change. Load balancing is attained by minimizing this residual within the framework of an iterative non-linear solver combined with a multi-grid level technique for local relaxation. Scalability tests attest that the algorithm demonstrates close-to-optimal strong and weak scalability on the K computer and the Earth Simulator. This result holds for even as well as uneven particle distributions, including different types of particles and heterogeneous computer architectures. We performed a DEM simulation with over 2 billion particles to demonstrate the proposed scheme. The

  17. A novel technique for particle tracking in cold 2-dimensional fluidized beds - simulating fuel dispersion

    Energy Technology Data Exchange (ETDEWEB)

    David Pallares; Filip Johnsson [Chalmers University of Technology, Goeteborg (Sweden). Department of Energy and Environment, Energy Conversion


    This paper presents a novel technique for particle tracking in 2-dimensional fluidized beds operated under ambient conditions. The method is applied to study the mixing mechanisms of fuel particles in fluidized beds and is based on tracking a phosphorescent tracer particle by means of video recording with subsequent digital image analysis. From this, concentration, velocity and dispersion fields of the tracer particle can be obtained with high accuracy. Although the method is restricted to 2-dimensional beds, it can be applied under flow conditions qualitatively resembling a fluidized-bed combustor. Thus, the experiments cover ranges of bed heights, gas velocities and fuel-to-bed material density and size ratios typical for fluidized-bed combustors. Also, several fluidization regimes (bubbling, turbulent, circulating and pneumatic) are included in the runs. A pattern found in all runs is that the mixing of the tracer (fuel) solids is structured in horizontally aligned vortices induced by the bubble flow. The main bubble paths always give a low concentration of tracer solids, with the tracer moving upwards, while the downflow of tracer particles in the dense bottom bed is found to take place in zones with low bubble density and at the sidewalls. The amount of bed material (bed height) has a strong influence on the bottom bed dynamics (development and coalescence of bubbles) and, consequently, on the solids mixing process. Local dispersion coefficients reach maximum values around the locations of bubble eruptions, while, in the presence of a dense bottom bed, an increase in fluidization velocity or amount of bed material enhances dispersion. Dispersion is found to be larger in the vertical than in the horizontal direction, confirming the critical character of lateral fuel dispersion in fluidized-bed combustors of large cross section.

  18. A novel numerical technique for the high-precision simulation of flow processes related to artificial recharge (United States)

    Stevens, David; Orsini, Paolo; Power, Henry; Morvan, Herve; Bensabat, Jacob


    This paper presents a novel numerical technique for large-scale groundwater flow simulations, in the frame of artificial recharge planning. The implementation is demonstrated using two test-sites from the EU funded GABARDINE project (FP6): The Sindos test site, near Thessaloniki, Greece, examines the infiltration of water towards the water table, through several unsaturated soil layers. The test site at Campina de Faro, Portugal, investigates phreatic surface movement around a large-diameter well. For both test cases a numerical simulation is constructed, and the local subsurface flow regime is investigated. Numerical methods for solving PDEs using interpolation with radial basis functions (RBFs) will typically provide high accuracy solutions, achieve excellent convergence rates, and offer great flexibility with regards to the enforcement of arbitrary boundary conditions. However, RBF methods have traditionally been limited to the solution of small academic problems, due to issues of computational cost and numerical conditioning. Recent developments in locally supported RBF methods have led to techniques which can be scaled to the largest problem sizes, while maintaining many of the flexibilities of traditional RBF methods. As a contribution to the GABARDINE project, two such numerical techniques have been developed; the meshless LHI method and the control-volume based CV-RBF method. These numerical techniques are capable of modelling flow and transport in heterogeneous porous media, and are of order-N computational complexity, allowing problems to be solved on large and irregular datasets. For both numerical techniques, the RBF Hermitian collocation method is utilised to perform interpolation at the local level, allowing the simultaneous imposition of pressure and mass-flux matching conditions at soil-layer interfaces. The non-overlapping stencil configuration then allows the accurate capture of non-smooth solution profiles across layer interfaces, to a high
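
The RBF interpolation step that both methods build on can be sketched compactly. The snippet below is interpolation only, under assumed settings (multiquadric basis, shape parameter c = 0.5, 1-D nodes); the paper's LHI and CV-RBF methods additionally collocate the PDE and the flux-matching conditions at layer interfaces.

```python
import math

def multiquadric(r, c=0.5):
    """Multiquadric radial basis function (shape parameter c is assumed)."""
    return math.sqrt(r * r + c * c)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fct = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= fct * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf_interpolant(nodes, values):
    """Solve the collocation system and return the interpolant as a function."""
    A = [[multiquadric(abs(xi - xj)) for xj in nodes] for xi in nodes]
    coef = solve(A, values)
    return lambda x: sum(a * multiquadric(abs(x - xj)) for a, xj in zip(coef, nodes))

nodes = [i / 8.0 for i in range(9)]                       # 9 nodes on [0, 1]
f = rbf_interpolant(nodes, [math.sin(math.pi * x) for x in nodes])
print(f(0.5), math.sin(math.pi * 0.5))                    # exact at a node
print(f(0.3125), math.sin(math.pi * 0.3125))              # accurate off-node
```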

  19. Damage Atlas for Photographic materials

    Directory of Open Access Journals (Sweden)

    Kristel Van Camp


    Full Text Available The conservation of photographic documents may call for preventive or curative interventions, a choice guided by the object's state of preservation; better knowledge of the types of damage is therefore crucial. The damage atlas presented here is a first attempt at such a characterisation in the field of photography. It classifies the types of damage according to their specific characteristics and degree of severity, illustrates and describes each with a precise terminology, and proposes alongside each the intervention that seems most appropriate. The atlas, with its notes on terminology and damage grades, is intended for everybody who works with photographic material, whether in conservation or in the artistic field, in museums as well as in archives.

  20. Comparison of Baseline Wander Removal Techniques considering the Preservation of ST Changes in the Ischemic ECG: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Gustavo Lenis


    Full Text Available The most important ECG marker for the diagnosis of ischemia or infarction is a change in the ST segment. Baseline wander is a typical artifact that corrupts the recorded ECG and can hinder the correct diagnosis of such diseases. For the purpose of finding the best suited filter for the removal of baseline wander, the ground truth about the ST change prior to the corrupting artifact and the subsequent filtering process is needed. In order to create the desired reference, we used a large simulation study that allowed us to represent the ischemic heart at a multiscale level, from the cardiac myocyte to the surface ECG. We also created a realistic model of baseline wander to evaluate five filtering techniques commonly used in the literature. In the simulation study, we included a total of 5.5 million signals coming from 765 electrophysiological setups. We found that the best performing method was the wavelet-based baseline cancellation. However, for medical applications, the Butterworth high-pass filter is the better choice because it is computationally cheap and almost as accurate. Even though all methods modify the ST segment to some extent, all proved better than leaving the baseline wander unfiltered.
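
The high-pass principle behind the recommended Butterworth filter can be illustrated with an even simpler first-order filter (a sketch of the idea, not the study's exact filter; cutoff and sampling rate are typical ECG values, not the paper's settings): slow baseline wander is attenuated while higher-frequency ECG content passes.

```python
import math

FS = 500.0   # sampling rate [Hz], typical for ECG
FC = 0.67    # high-pass cutoff [Hz], the common AHA guideline value

def highpass(x, fs=FS, fc=FC):
    """First-order IIR high-pass: y[n] = a*(y[n-1] + x[n] - x[n-1])."""
    rc = 1.0 / (2.0 * math.pi * fc)
    a = rc / (rc + 1.0 / fs)
    y = [0.0] * len(x)
    for n in range(1, len(x)):
        y[n] = a * (y[n - 1] + x[n] - x[n - 1])
    return y

def amplitude(x, f, fs=FS):
    """Single-bin DFT amplitude estimate at frequency f."""
    n = len(x)
    re = sum(x[k] * math.cos(2 * math.pi * f * k / fs) for k in range(n))
    im = sum(x[k] * math.sin(2 * math.pi * f * k / fs) for k in range(n))
    return 2.0 * math.hypot(re, im) / n

t = [k / FS for k in range(5000)]                          # 10 s of signal
wander = [1.0 * math.sin(2 * math.pi * 0.2 * tk) for tk in t]   # slow drift
ecgish = [0.5 * math.sin(2 * math.pi * 10.0 * tk) for tk in t]  # QRS-band tone
x = [w + e for w, e in zip(wander, ecgish)]
y = highpass(x)
print(amplitude(y, 0.2), amplitude(y, 10.0))   # wander reduced, ECG band kept
```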

  1. Monte Carlo Simulation of Alloy Design Techniques: Fracture and Welding Studied Using the BFS Method for Alloys (United States)

    Bozzolo, Guillermo H.; Good, Brian; Noebe, Ronald D.; Honecy, Frank; Abel, Phillip


    Large-scale simulations of dynamic processes at the atomic level have developed into one of the main areas of work in computational materials science. Until recently, severe computational restrictions, as well as the lack of accurate methods for calculating the energetics, resulted in slower growth in the area than that required by current alloy design programs. The Computational Materials Group at the NASA Lewis Research Center is devoted to the development of powerful, accurate, economical tools to aid in alloy design. These include the BFS (Bozzolo, Ferrante, and Smith) method for alloys (ref. 1) and the development of dedicated software for large-scale simulations based on Monte Carlo-Metropolis numerical techniques, as well as state-of-the-art visualization methods. Our previous effort linking theoretical and computational modeling resulted in the successful prediction of the microstructure of a five-element intermetallic alloy, in excellent agreement with experimental results (refs. 2 and 3). This effort also produced a complete description of the role of alloying additions in intermetallic binary, ternary, and higher order alloys (ref. 4).
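
The Monte Carlo-Metropolis machinery referred to above can be illustrated with a toy binary alloy on a square lattice (the pair energies below are an invented stand-in for the BFS energetics): atoms are swapped at fixed composition, and swaps are accepted with the Metropolis rule so the configuration relaxes toward low energy.

```python
import math
import random

N = 16                # lattice is N x N with periodic boundaries
E_UNLIKE = 1.0        # toy energy per unlike-neighbour bond (not BFS values)

def site_energy(lat, i, j):
    """Bond energy of site (i, j) with its four neighbours."""
    e = 0.0
    for ni, nj in (((i + 1) % N, j), ((i - 1) % N, j),
                   (i, (j + 1) % N), (i, (j - 1) % N)):
        if lat[ni][nj] != lat[i][j]:
            e += E_UNLIKE
    return e

def total_energy(lat):
    """Count each unlike bond once (right and down neighbours only)."""
    e = 0.0
    for i in range(N):
        for j in range(N):
            for ni, nj in (((i + 1) % N, j), (i, (j + 1) % N)):
                if lat[i][j] != lat[ni][nj]:
                    e += E_UNLIKE
    return e

def metropolis(lat, kT=0.2, sweeps=200, seed=3):
    """Swap-based Metropolis MC: composition is conserved, energy relaxes."""
    rng = random.Random(seed)
    for _ in range(sweeps * N * N):
        i1, j1 = rng.randrange(N), rng.randrange(N)
        i2, j2 = rng.randrange(N), rng.randrange(N)
        if lat[i1][j1] == lat[i2][j2]:
            continue
        before = site_energy(lat, i1, j1) + site_energy(lat, i2, j2)
        lat[i1][j1], lat[i2][j2] = lat[i2][j2], lat[i1][j1]
        de = site_energy(lat, i1, j1) + site_energy(lat, i2, j2) - before
        if de > 0 and rng.random() >= math.exp(-de / kT):
            lat[i1][j1], lat[i2][j2] = lat[i2][j2], lat[i1][j1]  # reject

rng0 = random.Random(0)
lat = [[rng0.choice("AB") for _ in range(N)] for _ in range(N)]
e0 = total_energy(lat)
metropolis(lat)
e1 = total_energy(lat)
print(e0, e1)   # energy drops as like atoms cluster
```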

  2. CFD simulation of near-field pollutant dispersion in the urban environment: A review of current modeling techniques (United States)

    Tominaga, Yoshihide; Stathopoulos, Ted


    Near-field pollutant dispersion in the urban environment involves the interaction of a plume and the flow field perturbed by building obstacles. In the past two decades, micro-scale Computational Fluid Dynamics (CFD) simulation of pollutant dispersion around buildings and in urban areas has been widely used, sometimes in lieu of wind tunnel testing. This paper reviews current modeling techniques in CFD simulation of near-field pollutant dispersion in urban environments and discusses the findings to give insight into future applications. Key features of near-field pollutant dispersion around buildings identified in previous studies, i.e., the three-dimensionality of the mean flow, the unsteadiness of the large-scale flow structure, and the anisotropy of the turbulent scalar fluxes, are discussed. This review highlights that it is important to choose appropriate numerical models and boundary conditions by understanding their inherent strengths and limitations. Furthermore, the importance of model evaluation is emphasized. Because pollutant concentrations around buildings can vary by orders of magnitude in time and space, model evaluation should be performed carefully, while paying attention to uncertainty. Although CFD has significant potential, it is important to understand the underlying theory and limitations of a model in order to appropriately investigate the dispersion phenomena in question.

  3. Molecular investigation on the interaction of spermine with proteinase K by multispectroscopic techniques and molecular simulation studies. (United States)

    Hosseini-Koupaei, Mansoore; Shareghi, Behzad; Saboury, Ali Akbar; Davar, Fateme


    The alterations in structure, function, and stability of proteinase K in the presence of spermine were investigated using spectroscopic methods and simulation techniques. The stability and enzyme activity of the proteinase K-spermine complex were significantly enhanced compared with those of the pure enzyme. The increases in Vmax and in the catalytic efficiency of proteinase K in the presence of spermine confirmed that the polyamine can induce hyperactivation of the enzyme. UV-vis spectroscopy, intrinsic fluorescence, and circular dichroism methods demonstrated that the binding of spermine changed the microenvironment and structure of proteinase K. The fluorescence studies showed that spermine quenched the intrinsic fluorescence of proteinase K through a static mechanism. Analysis of the thermodynamic parameters suggested that hydrogen bonds and van der Waals forces play a key role in complex stability, in agreement with the modeling studies. The CD spectra revealed a secondary structure alteration of proteinase K, with an increase in α-helix and a decrease in β-sheet content upon spermine conjugation. The molecular simulation results proposed that spermine interacts with proteinase K spontaneously at a single binding site, in agreement with the spectroscopic results. This agreement between experimental and theoretical results may make the combined approach worthwhile for protein-ligand complex studies. Copyright © 2016 Elsevier B.V. All rights reserved.
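
A static-quenching conclusion of this kind is conventionally supported by a Stern-Volmer analysis, F0/F = 1 + Ksv[Q]. The sketch below fits Ksv from synthetic intensity data by a least-squares line through the origin (all values invented, not the paper's measurements).

```python
def stern_volmer_ksv(concs, f0, intensities):
    """Least-squares slope of (F0/F - 1) versus quencher concentration,
    with the intercept fixed at 1 (line through the origin)."""
    ys = [f0 / f - 1.0 for f in intensities]
    return sum(c * y for c, y in zip(concs, ys)) / sum(c * c for c in concs)

F0 = 1000.0                           # unquenched intensity (illustrative)
KSV_TRUE = 120.0                      # Stern-Volmer constant [1/M], invented
concs = [0.001, 0.002, 0.004, 0.008]  # quencher concentrations [M]
intensities = [F0 / (1.0 + KSV_TRUE * c) for c in concs]

ksv_est = stern_volmer_ksv(concs, F0, intensities)
print(ksv_est)   # recovers ~120 from the synthetic data
```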

  4. Near-infrared fluorescence (NIRF) imaging in breast-conserving surgery: assessing intraoperative techniques in tissue-simulating breast phantoms. (United States)

    Pleijhuis, R G; Langhout, G C; Helfrich, W; Themelis, G; Sarantopoulos, A; Crane, L M A; Harlaar, N J; de Jong, J S; Ntziachristos, V; van Dam, G M


    Breast-conserving surgery (BCS) results in tumour-positive surgical margins in up to 40% of the patients. Therefore, new imaging techniques are needed that support the surgeon with real-time feedback on tumour location and margin status. In this study, the potential of near-infrared fluorescence (NIRF) imaging in BCS for pre- and intraoperative tumour localization, margin status assessment and detection of residual disease was assessed in tissue-simulating breast phantoms. Breast-shaped phantoms were produced with optical properties that closely match those of normal breast tissue. Fluorescent tumour-like inclusions containing indocyanine green (ICG) were positioned at predefined locations in the phantoms to allow for simulation of (i) preoperative tumour localization, (ii) real-time NIRF-guided tumour resection, and (iii) intraoperative margin assessment. Optical imaging was performed using a custom-made clinical prototype NIRF intraoperative camera. Tumour-like inclusions in breast phantoms could be detected up to a depth of 21 mm using a NIRF intraoperative camera system. Real-time NIRF-guided resection of tumour-like inclusions proved feasible. Moreover, intraoperative NIRF imaging reliably detected residual disease in case of inadequate resection. We evaluated the potential of NIRF imaging applications for BCS. The clinical setting was simulated by exploiting tissue-like breast phantoms with fluorescent tumour-like agarose inclusions. From this evaluation, we conclude that intraoperative NIRF imaging is feasible and may improve BCS by providing the surgeon with imaging information on tumour location, margin status, and presence of residual disease in real-time. Clinical studies are needed to further validate these results. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Time-stepping techniques to enable the simulation of bursting behavior in a physiologically realistic computational islet. (United States)

    Khuvis, Samuel; Gobbert, Matthias K; Peercy, Bradford E


    Physiologically realistic simulations of computational islets of beta cells require the long-time solution of several thousands of coupled ordinary differential equations (ODEs), resulting from the combination of several ODEs in each cell and realistic numbers of several hundreds of cells in an islet. For a reliable and accurate solution of complex nonlinear models up to the desired final times on the scale of several bursting periods, an ODE solver designed for stiff problems eventually becomes a necessity, since other solvers may not be able to handle the problem or are exceedingly inefficient. But stiff solvers are potentially significantly harder to use, since their algorithms require at least an approximation of the Jacobian matrix. For sophisticated models with systems of several complex ODEs in each cell, it is practically unworkable to differentiate these intricate nonlinear systems analytically and to manually program the resulting Jacobian matrix in computer code. This paper demonstrates that automatic differentiation can be used to obtain code for the Jacobian directly from code for the ODE system, which allows a full accounting for the sophisticated model equations. This technique is also feasible in the source-code languages Fortran and C, and the conclusions apply to a wide range of systems of coupled, nonlinear reaction equations. Moreover, when we combine an appropriately supplied Jacobian with slightly modified memory management in the ODE solver, simulations on the realistic scale of one thousand cells in the islet become possible that are several orders of magnitude faster than the original solver in the software Matlab, a language that is particularly user friendly for programming complicated model equations. We use the efficient simulator to analyze electrical bursting and show non-monotonic average burst period between fast and slow cells for increasing coupling strengths. We also find that interestingly, the arrangement of the connected fast
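
The automatic-differentiation idea is easy to demonstrate with forward-mode dual numbers: the code for the ODE right-hand side directly yields its Jacobian, with no hand derivation. The sketch below is an illustration, not the authors' toolchain, and uses the stiff Van der Pol oscillator as a stand-in for the beta-cell model.

```python
class Dual:
    """Number carrying a value and a derivative (forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __rsub__(self, o):
        return Dual(o) - self          # o is a plain number here
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def jacobian(f, y):
    """Jacobian of f at y: one dual-number sweep per column."""
    n = len(y)
    J = [[0.0] * n for _ in range(n)]
    for col in range(n):
        duals = [Dual(y[i], 1.0 if i == col else 0.0) for i in range(n)]
        for row, fi in enumerate(f(duals)):
            J[row][col] = fi.dot
    return J

MU = 1000.0   # stiffness parameter

def vdp(y):
    # Van der Pol: y0' = y1,  y1' = mu*(1 - y0^2)*y1 - y0
    return [y[1], MU * (1 - y[0] * y[0]) * y[1] - y[0]]

J = jacobian(vdp, [2.0, 0.5])
print(J)   # matches the analytic [[0, 1], [-2*MU*y0*y1 - 1, MU*(1 - y0**2)]]
```

A stiff integrator can then consume this Jacobian; for example, SciPy's `solve_ivp` with `method="BDF"` accepts it through the `jac` argument.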

  6. Development of Quality Assessment Techniques for Large Eddy Simulation of Propulsion and Power Systems in Complex Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Lacaze, Guilhem [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oefelein, Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    Large-eddy simulation (LES) is quickly becoming a method of choice for studying complex thermo-physics in a wide range of propulsion and power systems. It provides a means to study coupled turbulent combustion and flow processes in parameter spaces that are unattainable using direct numerical simulation (DNS), with a degree of fidelity that can be far more accurate than conventional engineering methods such as the Reynolds-averaged Navier-Stokes (RANS) approximation. However, the development of predictive LES is complicated by the complex interdependence of different types of errors arising from numerical methods, algorithms, models, and boundary conditions. At the same time, control of accuracy has become a critical aspect in the development of predictive LES for design. The objective of this project is to create a framework of metrics aimed at quantifying the quality and accuracy of state-of-the-art LES in a manner that addresses the myriad of competing interdependencies. In a typical simulation cycle, only 20% of the computational time is actually usable; the rest is spent in case preparation, assessment, and validation because of the lack of guidelines. This work increases confidence in the accuracy of a given solution while minimizing the time spent obtaining it. The approach facilitates control of the tradeoffs between cost, accuracy, and uncertainties as a function of fidelity and methods employed. The analysis is coupled with advanced uncertainty quantification techniques employed to estimate confidence in model predictions and to calibrate model parameters. This work has had positive consequences for the accuracy of the results delivered by LES and will soon have a broad impact on research supported both by the DOE and elsewhere.

  7. Photographic Study Of A Dead-Pressed Explosive (United States)

    Swallowe, G. M.; Field, J. E.


    High-speed photography, in conjunction with electron microscopy and a pressure-measuring technique, has been used to investigate the differences between dead-pressed and non-dead-pressed samples of the primary explosive mercury fulminate (Hg Ful). Photographs of reaction propagation were taken in transmitted light using a specially adapted drop-weight machine with transparent anvils. The results of these experiments suggested a mechanism for dead-pressing in Hg Ful based on the microscopic internal structure of the compacted explosive.

  8. A Photographer From Ankara: Osman Darcan

    Directory of Open Access Journals (Sweden)

    Gülseren Mungan Yavuztürk


    Full Text Available This work introduces Osman Darcan, an important name in the history of Ankara photography studios. Darcan followed in the footsteps of famous Austrian photographer Othmar Pferschy, whom he met in Istanbul, to go on to create his own valuable work. On leaving the Public Press Authority Photo Film Center, where he worked as a newsreel photographer and film operator, in 1943 he began taking photographs at the Tatbikat Theater at the Ankara State Conservatoire, where he continued as the photographer for the State Theater until the end of his life. At the same time, this master photographer took the pictures of a select coterie of Ankara’s leading individuals and well-known performers at a studio he opened on Anafartalar Caddesi. In both these roles, his photographs evoke admiration thanks to Darcan’s professional abilities and level of artistry.

  9. Quality of clinical photographs taken by orthodontists, professional photographers, and orthodontic auxiliaries. (United States)

    Sandler, Jonathan; Dwyer, Joe; Kokich, Vincent; McKeown, Fiona; Murray, Alison; McLaughlin, Richard; O'Brien, Catherine; O'Malley, Paul


    A survey of the members of the Angle Society of Europe showed that 60% of orthodontists took their own clinical photographs, 35% assigned the task to an auxiliary, and 5% hired professional clinical photographers. It is always useful to ensure that orthodontists' time is used to maximum effect. Clinical photography could be delegated to auxiliary staff. In this study, we assessed the quality of photographs taken by orthodontists to see whether those taken by orthodontic auxiliaries and clinical photographers are of comparable quality. Fifty sets of orthodontic photographs were collected from each of 3 types of photographers: orthodontists, orthodontic auxiliaries, and professional clinical photographers. Four assessors scored each set for quality and detailed errors. The results were compared to determine whether there were differences between the quality of the photographs taken by the different groups. Most of the photos taken by the 3 groups of photographers were judged to be good or acceptable. The results for extraoral photographs showed no statistically significant differences between the 3 groups for good (P = 0.398) and acceptable (P = 0.398) images. The results for intraoral photographs did not differ significantly for acceptable and unacceptable photographs, but orthodontists produced significantly more good-quality intraoral photographs (P = 0.046).

  10. Clinical validation of robot simulation of toothbrushing - comparative plaque removal efficacy


    Lang, Tomas; Staufer, Sebastian; Jennes, Barbara; Gaengler, Peter


    Background Clinical validation of laboratory toothbrushing tests has important advantages. The aim was therefore to demonstrate the correlation of the tooth-cleaning efficiency of a new robot brushing-simulation technique with clinical plaque removal. Methods Clinical programme: 27 subjects received dental cleaning prior to a 3-day plaque-regrowth interval. Plaque was stained, photographically documented and scored using a planimetrical index. Subjects brushed teeth 33–47 with three techniques (hori...

  11. "Photographers Are the Devil": An Essay in the Historiography of Photographing Schools (United States)

    Hardcastle, John


    Today, the use of photographs in publications and exhibitions is commonplace, but this was not always so. This article shows how photographs of certain schools that have had lasting impact on design stand in ambiguous relationships to the buildings themselves. Photographs function as part of the design process; they record details of construction…

  12. "Planaltina in the Hole of Aluminum": production and consumption of pinhole photographs

    Directory of Open Access Journals (Sweden)

    Juliana Soares Mendes


    Full Text Available This article analyzes a photographic exhibition consisting of 15 images created with the pinhole technique, which stimulates critical thinking about photojournalism practice and consumption. The exhibition, on the internet ( and at the Artistic and Historic Museum of Planaltina (Brazilian Federal District), took place in May 2009. Participants were asked to interpret the photographs and rewrite temporary captions. The 1,860 proposed captions indicate the public's interest in participating in, discussing, and interpreting the pictures.

  13. Analysis of angular reading distortions of photographic images. (United States)

    Codarin, Gabriela F; Felicio, Lilian R; Coelho, Daniel M; Oliveira, Anamaria S


    Although photogrammetry is a widespread technique in the health field, distortions in the angular readings of the images are common despite methodological efforts. The aim was to measure the error of angular measurements in photographic images of different digital resolutions, using an object with predetermined angles. We used a rubber ball 52 cm in circumference, previously marked with angles of 10°, 30°, 60° and 90°. The photographic records were made with the focal axis of the camera perpendicular to and three meters away from the object, without optical zoom, at resolutions of 3, 5 and 10 megapixels (Mp). All photographic records were stored, and a previously trained examiner analyzed the angular values of each photo using the computer program ImageJ. The measurements were performed twice, with a fifteen-day interval between them. Subsequently, we calculated accuracy, relative error, error in degrees, precision, and the intraclass correlation coefficient (ICC). For the 10° angle, the average accuracy of the measurements was higher for the 3 Mp records than for the 5 and 10 Mp resolutions. The ICC was considered excellent for all resolutions. Among the analyzed angles, the 90° photographs were the most accurate and precise and had the lowest relative error and error in degrees, regardless of image resolution. The records taken at 3 Mp provided high accuracy and precision and lower error values, suggesting this is an appropriate resolution for imaging angles of 10° and 30°.
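    The angular reading itself reduces to elementary vector geometry. The Python sketch below (with hypothetical pixel coordinates) shows the computation behind an ImageJ-style three-landmark angle measurement and the relative-error metric used in the study.

```python
import math

def angle_deg(p0, vertex, p1):
    """Angle (degrees) at `vertex` between the rays to landmarks p0 and p1,
    computed from pixel coordinates, as an ImageJ angle tool would."""
    ax, ay = p0[0] - vertex[0], p0[1] - vertex[1]
    bx, by = p1[0] - vertex[0], p1[1] - vertex[1]
    cos_angle = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # clamp against rounding before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

def relative_error_pct(measured, nominal):
    """Relative error (%) of a measured angle against the marked nominal angle."""
    return abs(measured - nominal) / nominal * 100.0

# hypothetical landmark pixels: one vertical and one horizontal ray
right = angle_deg((250, 120), (250, 320), (450, 320))
```

Because the computation only sees pixel coordinates, any perspective or lens distortion enters through the landmark positions, which is why camera placement and resolution matter in the study.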

  14. Degraded Imagery/Art Technique for the Handicapped. (United States)

    Agard, Richard

    Developed for handicapped artists, Degraded Imagery is a technique whereby images can be extracted and refined from a photograph or a collage of photographs. The advantage of this process is that it requires a lower degree of fine motor skills to produce a quality image from a photograph than it does to create a quality image on a blank piece of…

  15. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J


    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among the simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  16. Schlieren Photographic Studies Of Dynamic Stall (United States)

    Carr, L.; Chandrasekhara, M.


    Report describes stroboscopic Schlieren photographic observations of flows around airfoil oscillating sinusoidally about fixed angle of attack. Conducted in wind tunnel on (NACA) 0012 airfoil oscillating with amplitude of 10 degrees about mean angle of attack of 10 degrees. Photographs taken along span of airfoil. Mach numbers and frequencies chosen to encompass conditions on retreating blades of helicopter rotors in forward flight.

  17. [Anisotropy in depth perception of photograph]. (United States)

    Watanabe, Toshio


    How can we reproduce real physical depth from a photograph? How does depth perception in the photograph differ from depth perception in the direct observation? In Experiment 1, objects in an open space were photographed and presented on a screen. Subjects were asked to judge the distances from a fixed point to the objects and the angles from the median line. The distances and the angles in the photograph were perceived shorter and larger than in physical space, respectively. Furthermore, depth perception in the photograph had an anisotropic property. In Experiment 2, the same objects as in Experiment 1 were observed directly by the subjects. The distances and the angles in the direct observation were perceived longer and smaller at longer distance than in the photograph, respectively. It was concluded that depth perception in the photograph did not reproduce depth either in physical space or in visual space, but it was closer to depth in visual space than in physical space. Furthermore, photographic space had an anisotropic property as visual space did.

  18. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dempsey, J. Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Antoun, Bonnie R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.

  19. Study of the operating parameters of a helicon plasma discharge source using PIC-MCC simulation technique (United States)

    Jaafarian, Rokhsare; Ganjovi, Alireza; Etaati, Gholamreza


    In this work, a Particle-in-Cell Monte Carlo Collision simulation technique is used to study the operating parameters of a typical helicon plasma source. These parameters mainly include the gas pressure, the externally applied static magnetic field, the length and radius of the helicon antenna, and the frequency and voltage amplitude of the RF power applied to the helicon antenna. It is shown that, while the strong radial gradient of the formed plasma density near the plasma surface is substantially proportional to the energy absorption from the existing Trivelpiece-Gould (TG) modes, the high electron temperature observed in the helicon source at lower static magnetic fields is significant evidence of energy absorption from the helicon modes. Furthermore, it is found that, at higher gas pressures, both the plasma electron density and temperature are reduced. In addition, at higher static magnetic fields, owing to the enhanced energy absorption by the plasma charged species, the plasma electron density increases linearly. At larger spatial dimensions of the antenna, both the plasma electron density and temperature are again reduced. Additionally, while the TG modes appear for applied antenna frequencies of 13.56 MHz and 27.12 MHz, the existence of helicon modes is demonstrated for an applied frequency of 18.12 MHz. Finally, increasing the applied voltage amplitude on the antenna makes the generation of mono-energetic electrons more probable.
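    The collision half of a PIC-MCC cycle can be sketched compactly. The Python fragment below applies the standard collision test, P = 1 - exp(-n σ(v) v Δt), to a population of electrons after the particle push and scatters the colliding ones isotropically; the density, cross-section, and time step are illustrative placeholders, not the helicon-source parameters of the study.

```python
import math
import random

N_GAS = 3.3e20          # neutral density, m^-3 (illustrative)
DT = 1.0e-9             # MCC time step, s (illustrative)

def sigma(speed):
    """Total electron-neutral cross-section (m^2); a constant stands in for
    the energy-dependent tabulated data a real PIC-MCC code would use."""
    return 1.0e-19

def collision_probability(speed):
    """Probability that an electron collides during one time step."""
    return 1.0 - math.exp(-N_GAS * sigma(speed) * speed * DT)

def mcc_step(velocities, rng):
    """Test every electron for a collision; colliding electrons are scattered
    isotropically (elastic collision, speed preserved for simplicity)."""
    out = []
    for vx, vy, vz in velocities:
        speed = math.sqrt(vx * vx + vy * vy + vz * vz)
        if rng.random() < collision_probability(speed):
            cos_t = 2.0 * rng.random() - 1.0       # isotropic direction
            sin_t = math.sqrt(1.0 - cos_t * cos_t)
            phi = 2.0 * math.pi * rng.random()
            out.append((speed * sin_t * math.cos(phi),
                        speed * sin_t * math.sin(phi),
                        speed * cos_t))
        else:
            out.append((vx, vy, vz))
    return out

rng = random.Random(1)
electrons = [(1.0e6, 0.0, 0.0)] * 100000
after = mcc_step(electrons, rng)
n_scattered = sum(1 for a, b in zip(electrons, after) if a != b)
```

A full code would loop this with a field solve and particle push, and would draw the collision type (elastic, excitation, ionization) from partial cross-sections rather than scattering elastically every time.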

  20. Hybrid approach combining multiple characterization techniques and simulations for microstructural analysis of proton exchange membrane fuel cell electrodes

    Energy Technology Data Exchange (ETDEWEB)

    Cetinbas, Firat C.; Ahluwalia, Rajesh K.; Kariuki, Nancy; De Andrade, Vincent; Fongalland, Dash; Smith, Linda; Sharman, Jonathan; Ferreira, Paulo; Rasouli, Somaye; Myers, Deborah J.


    The cost and performance of proton exchange membrane fuel cells strongly depend on the cathode electrode due to usage of expensive platinum (Pt) group metal catalyst and sluggish reaction kinetics. Development of low Pt content high performance cathodes requires comprehensive understanding of the electrode microstructure. In this study, a new approach is presented to characterize the detailed cathode electrode microstructure from nm to μm length scales by combining information from different experimental techniques. In this context, nano-scale X-ray computed tomography (nano-CT) is performed to extract the secondary pore space of the electrode. Transmission electron microscopy (TEM) is employed to determine primary C particle and Pt particle size distributions. X-ray scattering, with its ability to provide size distributions of orders of magnitude more particles than TEM, is used to confirm the TEM-determined size distributions. The number of primary pores that cannot be resolved by nano-CT is approximated using mercury intrusion porosimetry. An algorithm is developed to incorporate all these experimental data in one geometric representation. Upon validation of pore size distribution against gas adsorption and mercury intrusion porosimetry data, reconstructed ionomer size distribution is reported. In addition, transport related characteristics and effective properties are computed by performing simulations on the hybrid microstructure.

  1. Dose-response effects of dietary pequi oil on fermentation characteristics and microbial population using a rumen simulation technique (Rusitec). (United States)

    Duarte, Andrea Camacho; Durmic, Zoey; Vercoe, Philip E; Chaves, Alexandre V


    The effect of increasing the concentration of commercial pequi (Caryocar brasiliense) oil on fermentation characteristics and the abundance of methanogens and fibrolytic bacteria was evaluated using the rumen simulation technique (Rusitec). In vitro incubation was performed over 15 days using a basal diet consisting of ryegrass, maize silage and concentrate in equal proportions. Treatments consisted of a control diet (no pequi oil inclusion, 0 g/kg DM), pequi dose 1 (45 g/kg DM), and pequi dose 2 (91 g/kg DM). After a 7-day adaptation period, samples for fermentation parameters (total gas, methane, and VFA production) were taken on a daily basis. Quantitative real-time PCR (q-PCR) was used to evaluate the abundance of the main rumen cellulolytic bacteria, as well as the abundance of methanogens. Supplementation with pequi oil did not reduce overall methane production (P = 0.97), however a tendency (P = 0.06) to decrease the proportion of methane in overall microbial gas was observed. Increasing addition of pequi oil was associated with a linear decrease (P oil, but numbers of those belonging to Methanomassiliicoccaceae decreased in liquid-associated microbes (LAM) samples (P oil. In conclusion, pequi oil was ineffective in mitigating methane emissions and had some adverse effects on digestibility and selected fibrolytic bacteria. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Effectiveness of a role-play simulation program involving the SBAR technique: A quasi-experimental study. (United States)

    Yu, Mi; Kang, Kyung Ja


    Accurate, skilled communication in handover is a high priority in maintaining patient safety. Nursing students have few chances to practice nurse-to-doctor handover in clinical training, and some have little knowledge of what constitutes effective handover or lack confidence in conveying information. This study aimed to develop a role-play simulation program involving the Situation, Background, Assessment, Recommendation (SBAR) technique for nurse-to-doctor handover; implement the program; and analyze its effects on SBAR communication, communication clarity, handover confidence, and education satisfaction in nursing students. A non-equivalent control-group pretest-posttest quasi-experimental design was used with a convenience sample of 62 senior nursing students from two Korean universities. The differences in SBAR communication, communication clarity, handover confidence, and education satisfaction between the control and intervention groups were measured before and after program participation. The intervention group showed higher SBAR communication scores (t=-3.05, p=0.003); communication clarity scores in doctor notification scenarios (t=-5.50, pcommunicative competence in nursing students. Copyright © 2017. Published by Elsevier Ltd.

  3. A hybrid simulation technique for electrothermal studies of two-dimensional GaN-on-SiC high electron mobility transistors (United States)

    Hao, Qing; Zhao, Hongbo; Xiao, Yue


    In this work, a hybrid simulation technique is introduced for the electrothermal study of a two-dimensional GaN-on-SiC high electron mobility transistor. Detailed electron and phonon transport is considered by coupled electron and phonon Monte Carlo simulations in the transistor region. For regions away from the transistor, the conventional Fourier's law is used for thermal analysis to minimize the computational load. This hybrid simulation strategy can incorporate the physical phenomena over multiple length scales, including phonon generation by hot electrons in the conduction channel, frequency-dependent phonon transport in the transistor region, and heat transfer across the whole macroscale device.
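    The coupling idea, a detailed solver in a small hot region handing a heat flux to a Fourier-law description of the rest of the device, can be caricatured in one dimension. In the Python sketch below (all values illustrative), a steady 1D conduction region with volumetric heating stands in for the Monte Carlo transistor region, and the far field is a simple Fourier thermal resistance; the two descriptions are reconciled by iterating on the interface temperature.

```python
# Illustrative 1D two-region electrothermal caricature (not the paper's model).
L_A, L_B = 1e-6, 100e-6         # region lengths, m
K_A, K_B = 130.0, 370.0         # thermal conductivities, W/(m K)
Q = 1e15                        # volumetric heat source in region A, W/m^3
T_SINK = 300.0                  # fixed far-boundary temperature, K
N = 50                          # grid points in the detailed region

def solve_region_a(t_interface):
    """Steady conduction in region A, adiabatic at x=0, fixed T at interface:
    T(x) = t_interface + Q*(L_A^2 - x^2)/(2*K_A).  Returns temperatures and
    the flux handed to region B (all generated heat leaves the interface)."""
    dx = L_A / N
    temps = [t_interface + Q * (L_A**2 - (i * dx)**2) / (2 * K_A)
             for i in range(N + 1)]
    flux = Q * L_A
    return temps, flux

def coupled_solve(tol=1e-8):
    """Fixed-point iteration on the interface temperature: region B acts as a
    Fourier resistance R = L_B / K_B carrying the interface flux."""
    t_int = T_SINK
    for _ in range(100):
        _, flux = solve_region_a(t_int)
        t_new = T_SINK + flux * L_B / K_B
        if abs(t_new - t_int) < tol:
            break
        t_int = t_new
    temps, _ = solve_region_a(t_int)
    return t_int, temps[0]      # interface temperature and peak temperature

t_int, t_peak = coupled_solve()
```

In the actual hybrid scheme the detailed region is resolved by coupled electron and phonon Monte Carlo rather than an analytic conduction profile, but the interface hand-off of temperature and flux plays the same role.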

  4. Numerical simulation of flow induced by a pitched blade turbine. Comparison of the sliding mesh technique and an averaged source term method

    Energy Technology Data Exchange (ETDEWEB)

    Majander, E.O.J.; Manninen, M.T. [VTT Energy, Espoo (Finland)


    The flow induced by a pitched blade turbine was simulated using the sliding mesh technique. The detailed geometry of the turbine was modelled in a computational mesh rotating with the turbine, and the geometry of the reactor, including baffles, was modelled in a stationary co-ordinate system. Effects of grid density were investigated. Turbulence was modelled using the standard k-ε model. Results were compared to experimental observations; velocity components were found to be in good agreement with the measured values throughout the tank. Averaged source terms were calculated from the sliding mesh simulations in order to investigate the reliability of the source term approach, and the flow field in the tank was then simulated on a simple grid using these source terms. Agreement with the results of the sliding mesh simulations was good. The commercial CFD code FLUENT was used in all simulations. (author)

  5. Simulation

    CERN Document Server

    Ross, Sheldon


    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist

  6. Automated variance reduction technique for three-dimensional Monte Carlo coupled electron-photon-positron simulation using deterministic importance functions (United States)

    Dionne, Benoit

    Three-dimensional Monte Carlo coupled electron-photon-positron transport calculations are often performed to determine characteristics such as energy or charge deposition in a wide range of systems exposed to radiation fields, such as electronic circuitry in a space environment, tissues exposed to radiotherapy linear-accelerator beams, or radiation detectors. Modeling these systems constitutes a challenging problem for the available computational methods and resources because they can involve: (i) very large attenuation, (ii) large numbers of secondary particles due to the electron-photon-positron cascade, and (iii) large and highly forward-peaked scattering. This work presents a new automated variance reduction technique, referred to as ADEIS (Angular adjoint-Driven Electron-photon-positron Importance Sampling), that takes advantage of the capability of deterministic methods to rapidly provide approximate information about the complete phase-space in order to automatically evaluate variance reduction parameters. More specifically, this work focuses on the use of discrete ordinates importance functions to evaluate angular transport and collision biasing parameters and to use them through a modified implementation of the weight-window technique. The application of this new method to complex Monte Carlo simulations has resulted in speedups as high as five orders of magnitude. Due to numerical difficulties in obtaining physical importance functions devoid of numerical artifacts, a limited form of smoothing was implemented to complement a scheme for automatic selection of discretization parameters. This scheme improves the robustness, efficiency, and statistical reliability of the methodology by optimizing the accuracy of the importance functions with respect to the additional computational cost of generating and using these functions. It was shown that it is essential to bias different species of particles with their specific importance functions.
In the case of electrons and
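    The weight-window game that the importance functions drive can be sketched in a few lines of Python. The splitting and roulette rules below follow the standard scheme (the window bounds and the choice of survival weight at the window midpoint are illustrative, not ADEIS specifics) and preserve the expected particle weight, which is what keeps the biased simulation unbiased.

```python
import math
import random

def weight_window(w, w_low, w_up, rng):
    """One weight-window event on a particle of statistical weight w.
    Heavy particles are split; light ones play Russian roulette.  In both
    cases the expected total weight is preserved."""
    if w > w_up:                        # splitting
        n = math.ceil(w / w_up)
        return [w / n] * n
    if w < w_low:                       # Russian roulette
        w_survive = 0.5 * (w_low + w_up)
        if rng.random() < w / w_survive:
            return [w_survive]          # survivor carries the survival weight
        return []                       # particle terminated
    return [w]                          # inside the window: no action

rng = random.Random(7)
split = weight_window(3.5, 0.1, 1.0, rng)       # heavy particle: split
# roulette preserves weight only in expectation; average many trials
mean_w = sum(sum(weight_window(0.01, 0.1, 1.0, rng))
             for _ in range(200000)) / 200000
```

In a real code the window bounds vary over phase-space cells, derived (as in ADEIS) from deterministic importance functions, so that particles headed toward the tally are multiplied and unimportant ones are cheaply killed.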

  7. Spacelab Life Science-1 Mission Onboard Photograph (United States)


    Spacelab Life Science-1 (SLS-1) was the first Spacelab mission dedicated solely to life sciences. The main purpose of the SLS-1 mission was to study the mechanisms, magnitudes, and time courses of certain physiological changes that occur during space flight, to investigate the consequences of the body's adaptation to microgravity and readjustment to Earth's gravity, and to bring the benefits back home to Earth. The mission was designed to explore the responses of the heart, lungs, blood vessels, kidneys, and hormone-secreting glands to microgravity and related body fluid shifts; examine the causes of space motion sickness; and study changes in the muscles, bones, and cells. This photograph shows astronaut Rhea Seddon conducting an inflight study of the Cardiovascular Deconditioning experiment by breathing into the cardiovascular rebreathing unit. This experiment focused on the deconditioning of the heart and lungs and changes in cardiopulmonary function that occur upon return to Earth. By using noninvasive techniques of prolonged expiration and rebreathing, investigators can determine the amount of blood pumped out of the heart (cardiac output), the ease with which blood flows through all the vessels (total peripheral resistance), oxygen used and carbon dioxide released by the body, and lung function and volume changes. SLS-1 was launched aboard the Space Shuttle Orbiter Columbia (STS-40) on June 5, 1991.

  8. Down syndrome detection from facial photographs using machine learning techniques (United States)

    Zhao, Qian; Rosenbaum, Kenneth; Sze, Raymond; Zand, Dina; Summar, Marshall; Linguraru, Marius George


    Down syndrome is the most commonly occurring chromosomal condition; one in every 691 babies in United States is born with it. Patients with Down syndrome have an increased risk for heart defects, respiratory and hearing problems and the early detection of the syndrome is fundamental for managing the disease. Clinically, facial appearance is an important indicator in diagnosing Down syndrome and it paves the way for computer-aided diagnosis based on facial image analysis. In this study, we propose a novel method to detect Down syndrome using photography for computer-assisted image-based facial dysmorphology. Geometric features based on facial anatomical landmarks, local texture features based on the Contourlet transform and local binary pattern are investigated to represent facial characteristics. Then a support vector machine classifier is used to discriminate normal and abnormal cases; accuracy, precision and recall are used to evaluate the method. The comparison among the geometric, local texture and combined features was performed using the leave-one-out validation. Our method achieved 97.92% accuracy with high precision and recall for the combined features; the detection results were higher than using only geometric or texture features. The promising results indicate that our method has the potential for automated assessment for Down syndrome from simple, noninvasive imaging data.
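    A stripped-down version of the evaluation pipeline (leave-one-out validation over concatenated "geometric" and "texture" feature vectors, scored by accuracy, precision, and recall) can be sketched as follows. The data are synthetic Gaussian stand-ins and a nearest-centroid rule replaces the paper's support vector machine, so this illustrates only the protocol, not the reported results.

```python
import random
import statistics

random.seed(0)

def make_case(is_positive):
    """Synthetic stand-in features: a few 'geometric' distances and a few
    'texture' responses per face (purely illustrative numbers)."""
    mu = 1.0 if is_positive else 0.0
    geom = [random.gauss(mu, 1.0) for _ in range(4)]
    tex = [random.gauss(0.8 * mu, 1.0) for _ in range(6)]
    return geom + tex, is_positive

data = [make_case(i % 2 == 0) for i in range(40)]

def nearest_centroid_loo(data):
    """Leave-one-out validation of a nearest-centroid classifier, reporting
    the accuracy / precision / recall metrics used in the study."""
    tp = tn = fp = fn = 0
    for k, (x, label) in enumerate(data):
        train = [d for i, d in enumerate(data) if i != k]
        cents = {}
        for cls in (True, False):
            rows = [f for f, lab in train if lab is cls]
            cents[cls] = [statistics.fmean(col) for col in zip(*rows)]
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, cents[c]))
        pred = dist(True) < dist(False)
        if pred and label: tp += 1
        elif pred and not label: fp += 1
        elif not pred and label: fn += 1
        else: tn += 1
    acc = (tp + tn) / len(data)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return acc, prec, rec

acc, prec, rec = nearest_centroid_loo(data)
```

Swapping the nearest-centroid rule for an SVM (e.g. scikit-learn's `SVC`) and the synthetic vectors for real landmark and Contourlet/LBP features recovers the structure of the published experiment.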

  9. Simultaneous determination of free calcium, magnesium, sodium and potassium ion concentrations in simulated milk ultrafiltrate and reconstituted skim milk using the Donnan Membrane Technique

    NARCIS (Netherlands)

    Gao, R.; Temminghoff, E.J.M.; Leeuwen, van H.P.; Valenberg, van H.J.F.; Eisner, M.D.; Boekel, van M.A.J.S.


    This study focused on determination of free Ca2+, Mg2+, Na+ and K+ concentrations in a series of CaCl2 solutions, simulated milk ultrafiltrate and reconstituted skim milk using a recently developed Donnan Membrane Technique (DMT). A calcium ion selective electrode was used to compare the DMT

  10. The effect of starch, inulin, and degradable protein on ruminal fermentation and microbial growth in rumen simulation technique

    Directory of Open Access Journals (Sweden)

    Xiang H. Zhao


    Full Text Available A rumen simulation technique apparatus with eight 800 mL fermentation vessels was used to investigate the effects of rumen degradable protein (RDP) level and non-fibre carbohydrate (NFC) type on ruminal fermentation, microbial growth, and populations of ruminal cellulolytic bacteria. Treatments consisted of two NFC types (starch and inulin) supplemented with 0 g/d (low RDP) or 1.56 g/d (high RDP) sodium caseinate. No significant differences existed among dietary treatments in the apparent disappearance of dietary nutrients except for dietary N, which increased with increased dietary RDP (P<0.001). Compared with starch, inulin treatments reduced the molar proportion of acetate (P<0.001), the acetate:propionate ratio (P<0.001), and methane production (P=0.006), but increased the butyrate proportion (P<0.001). Increased dietary RDP led to increases in the production of total volatile fatty acids (P=0.014) and methane (P=0.050), various measures of N (P≤0.046), and 16S rDNA copy numbers of Ruminococcus flavefaciens (P≤0.010). Non-fibre carbohydrate source did not affect daily microbial N flow regardless of dietary RDP, but ammonia N production was lower for inulin than for starch treatments under high-RDP conditions (P<0.001). Compared with starch treatments, inulin depressed the copy numbers of Fibrobacter succinogenes in the solid fraction (P=0.023) and R. flavefaciens in the liquid (P=0.017) and solid fractions (P=0.007), but it increased the carboxymethylcellulase activity in the solid fraction (P=0.045). The current results suggest that starch and inulin differ in ruminal volatile fatty acid fermentation but have similar effects on ruminal digestion and microbial synthesis in vitro, although inulin suppressed the growth of some ruminal cellulolytic bacteria.

  11. STS-65 Mission Onboard Photograph (United States)


    In this photograph, astronaut Carl Walz performs the Performance Assessment Workstation (PAWS) experiment on the flight deck of the Space Shuttle Orbiter Columbia during the STS-65 mission. Present-day astronauts are subject to a variety of stresses during spaceflight, including microgravity, physical isolation, confinement, lack of privacy, fatigue, and changing work/rest cycles. The purpose of this experiment is to determine the effects of microgravity upon thinking skills critical to the success of operational tasks in space. The principal objective is to distinguish between the effects of microgravity on specific information-processing skills affecting performance and those of fatigue caused by long work periods. To measure these skills, the investigators use a set of computerized performance tests called the Performance Assessment Workstation, which is based on current theoretical models of human performance. The tests were selected by analyzing tasks related to space missions and their hypothesized sensitivity to microgravity. Multiple subjective measures of cumulative fatigue and changing mood states are also included for interpreting performance data.

  12. The Photoshop CS4 Companion for Photographers

    CERN Document Server

    Story, Derrick


    "Derrick shows that Photoshop can be friendly as well as powerful. In part, he does that by focusing photographers on the essential steps of an efficient workflow. With this guide in hand, you'll quickly learn how to leverage Photoshop CS4's features to organize and improve your pictures."-- John Nack, Principal Product Manager, Adobe Photoshop & BridgeMany photographers -- even the pros -- feel overwhelmed by all the editing options Photoshop provides. The Photoshop CS4 Companion for Photographers pares it down to only the tools you'll need most often, and shows you how to use those tools as

  13. Photograph of moon after transearth insertion (United States)


    This photograph of the moon was taken after transearth insertion when the Apollo 10 spacecraft was high above the lunar equator near 27 degrees east longitude. North is about 20 degrees left of the top of the photograph. Apollo Landing Site 3 is on the lighted side of the terminator in a dark area just north of the equator. Apollo Landing Site 2 is near the lower left margin of the Sea of Tranquility (Mare Tranquillitatis), which is the large, dark area near the center of the photograph.

  14. The technique for Simulation of Transient Combustion Processes in the Rocket Engine Operating with Gaseous Fuel “Hydrogen and Oxygen” (United States)

    Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.


    The article describes a method for simulation of transient combustion processes in a rocket engine operating on gaseous propellant: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. The reaction mechanisms were taken from several sources and verified. The method for converting ozone properties from the Shomate equation to the NASA-polynomial format is described in detail. A way of obtaining quick CFD results with intermediate combustion components using an EDM model was found. Difficulties with the Finite Rate Chemistry combustion model, associated with a large scatter of reference data, were identified and described. The way to generate the Flamelet library with CFX-RIF is described. The reaction mechanisms verified at steady state were also tested for transient simulation. The Flamelet combustion model was recognized as adequate for the transient mode, with the integral parameters varying around the values obtained in the stationary simulation. A cyclic irregularity of the temperature field, caused by precession of the vortex core, was detected in the chamber with the proposed simulation technique. Investigation of unsteady processes in rocket engines, including ignition, is proposed as the area of application of the described simulation technique.

  15. Performance Accuracy of Hand-on-needle versus Hand-on-syringe Technique for Ultrasound-guided Regional Anesthesia Simulation for Emergency Medicine Residents

    Directory of Open Access Journals (Sweden)

    Brian Johnson


    Full Text Available Introduction: Ultrasound-guided nerve blocks (UGNB) are increasingly used in emergency care. The hand-on-syringe (HS) needle technique is ideally suited to the emergency department setting because it allows a single operator to perform the block without assistance. The HS technique is assumed to provide less exact needle control than the alternative two-operator hand-on-needle (HN) technique; however, this assumption has never been directly tested. The primary objective of this study was to compare accuracy of needle targeting under ultrasound guidance by emergency medicine (EM) residents using HN and HS techniques on a standardized gelatinous simulation model. Methods: This prospective, randomized study evaluated task performance. We compared needle targeting accuracy using the HN and HS techniques. Each participant performed a set of structured needling maneuvers (both simple and difficult) on a standardized partial-task simulator. We evaluated time to task completion, needle visualization during advancement, and accuracy of needle tip at targeting. Resident technique preference was assessed using a post-task survey. Results: We evaluated 60 tasks performed by 10 EM residents. There was no significant difference in time to complete the simple model (HN vs. HS, 18 seconds vs. 18 seconds, p=0.93), time to complete the difficult model (HN vs. HS, 56 seconds vs. 50 seconds, p=0.63), needle visualization, or needle tip targeting accuracy. Most residents (60%) preferred the HS technique. Conclusion: For EM residents learning UGNBs, the HN technique was not associated with superior needle control. Our results suggest that the single-operator HS technique provides equivalent needle control when compared to the two-operator HN technique. [West J Emerg Med. 2014;15(6):641–646.]

  16. Comparing humans to automation in rating photographic aesthetics (United States)

    Kakarala, Ramakrishna; Agrawal, Abhishek; Morales, Sandino


    Computer vision researchers have recently developed automated methods for rating the aesthetic appeal of a photograph. Machine learning techniques, applied to large databases of photos, mimic with reasonably good accuracy the mean ratings of online viewers. However, owing to the many factors underlying aesthetics, it is likely that such techniques for rating photos do not generalize well beyond the data on which they are trained. This paper reviews recent attempts to compare human ratings, obtained in a controlled setting, to ratings provided by machine learning techniques. We review methods to obtain meaningful ratings both from selected groups of judges and also from crowd sourcing. We find that state-of-the-art techniques for automatic aesthetic evaluation are only weakly correlated with human ratings. This shows the importance of obtaining data used for training automated systems under carefully controlled conditions.
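    The weak agreement between automated and human ratings can be quantified with the Pearson correlation coefficient. A minimal sketch in Python follows; the eight rating pairs are invented for illustration, not data from the paper.

```python
import math

def pearson_r(xs, ys):
    # Pearson correlation coefficient between two equal-length rating lists.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical mean ratings for eight photos: human judges vs. an automated model.
human   = [4.2, 3.8, 2.5, 4.9, 1.7, 3.1, 2.9, 4.0]
machine = [3.9, 3.0, 3.2, 4.1, 2.8, 3.3, 2.6, 3.5]
r = pearson_r(human, machine)
```

    A value near 1 would indicate the automated ratings track human judgements closely; the weak correlations reported in the paper correspond to values much closer to 0.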

  17. arXiv Application of the Waveform Relaxation Technique to the Co-Simulation of Power Converter Controller and Electrical Circuit Models

    CERN Document Server

    Maciejewski, Michał; Schöps, Sebastian; Auchmann, Bernhard; Bortot, Lorenzo; Prioli, Marco; Verweij, Arjan P.

    In this paper we present the co-simulation of a PID class power converter controller and an electrical circuit by means of the waveform relaxation technique. The simulation of the controller model is characterized by a fixed-time stepping scheme reflecting its digital implementation, whereas a circuit simulation usually employs an adaptive time stepping scheme in order to account for a wide range of time constants within the circuit model. In order to maintain the characteristic of both models as well as to facilitate model replacement, we treat them separately by means of input/output relations and propose an application of a waveform relaxation algorithm. Furthermore, the maximum and minimum number of iterations of the proposed algorithm are mathematically analyzed. The concept of controller/circuit coupling is illustrated by an example of the co-simulation of a PI power converter controller and a model of the main dipole circuit of the Large Hadron Collider.
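    The waveform relaxation scheme described here can be sketched on a toy system: two scalar ODEs standing in for the controller and circuit models, each integrated over the whole time window by its own fixed-step solver, with the coupling waveforms exchanged in Gauss-Seidel ordering until they stop changing. The subsystem equations, step sizes and tolerances below are illustrative assumptions, not the LHC dipole-circuit model.

```python
def euler_solve(f, x0, u, dt):
    # Integrate x' = f(x, u_k) over the whole window with forward Euler,
    # where u is the other subsystem's sampled coupling waveform.
    xs = [x0]
    for k in range(len(u) - 1):
        xs.append(xs[-1] + dt * f(xs[-1], u[k]))
    return xs

def waveform_relaxation(dt=0.01, steps=200, sweeps=30, tol=1e-10):
    # Toy co-simulation: subsystem A is x' = -x + y, subsystem B is
    # y' = -2y + x. Each sweep re-solves both over the full window;
    # B sees A's freshly updated waveform (Gauss-Seidel ordering).
    y = [0.0] * (steps + 1)              # initial guess for B's waveform
    for _ in range(sweeps):
        x = euler_solve(lambda xv, u: -xv + u, 1.0, y, dt)
        y_new = euler_solve(lambda yv, u: -2.0 * yv + u, 0.0, x, dt)
        err = max(abs(a - b) for a, b in zip(y, y_new))
        y = y_new
        if err < tol:                    # waveforms stopped changing
            break
    return x, y
```

    In the real co-simulation each subsystem keeps its own time-stepping scheme (fixed-step for the digital controller, adaptive for the circuit); only the exchanged input/output waveforms couple them, which is what makes model replacement easy.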

  18. Review on different experimental techniques developed for recording force-deformation behaviour of soft tissues; with a view to surgery simulation applications. (United States)

    Afshari, Elnaz; Rostami, Mostafa; Farahmand, Farzam


    Different experimental techniques developed to obtain data on the force-deformation behaviour of soft tissues play an important role in realistically simulating surgery processes as well as in medical diagnosis and minimally invasive procedures. Indeed, an adequate quantitative description of soft-tissue mechanical behaviour requires high-quality experimental data to be obtained and analysed. In this review article we first survey the motivations and basic technical issues in surgery simulation. Then we concentrate on the different experimental techniques developed for recording the force-deformation (stress-strain) behaviour of soft tissues, focusing on in-vivo experimental setups. We thoroughly review the available techniques by classifying them into four groups: elastography, indentation, aspiration and grasping techniques. The evolution, advantages and limitations of each technique are presented in a historical review. At the end, a discussion is given with the aim of summarising the points made and anticipating the future of techniques for extracting force-deformation data.

  19. Adopt a Pixel Photographs: 2013-Present (United States)

    U.S. Geological Survey, Department of the Interior — The photographs in the Adopt a Pixel collection were provided by volunteers with a digital camera, a Global Positioning System (GPS), and a compass or a smart mobile...

  20. Mixed Numerical-Experimental Technique for Identification of Elastic Material Parameters Using Digital Image Correlation : Simulation Approach

    National Research Council Canada - National Science Library

    CHANTARAT, Rittipol; KUNTHONG, Prapot


    .... Then, a new formulation of digital image correlation (DIC) based on optical flow and finite element methods is developed to estimate heterogeneous displacement fields from simulated speckle images...

  1. Investigation of the role of aesthetics in differentiating between photographs taken by amateur and professional photographers (United States)

    Xue, Shao-Fu; Lin, Qian; Tretter, Daniel R.; Lee, Seungyon; Pizlo, Zygmunt; Allebach, Jan


    Automatically quantifying the aesthetic appeal of images is an interesting problem in computer science and image processing. In this paper, we incorporate aesthetic properties and convert them into computable image features for classifying photographs taken by amateur and professional photographers. In particular, color histograms, spatial edge distribution, and repetition identification are used as features. Results of experiments on professional and amateur photograph data sets confirm the discriminative power of these features.

  2. Cultural Diffusion and Trends in Facebook Photographs


    You, Quanzeng; García-García, Darío; Paluri, Mahohar; Luo, Jiebo; Joo, Jungseock


    Online social media is a social vehicle in which people share various moments of their lives with their friends, such as playing sports, cooking dinner or just taking a selfie for fun, via visual means, that is, photographs. Our study takes a closer look at the popular visual concepts illustrating various cultural lifestyles from aggregated, de-identified photographs. We perform analysis both at macroscopic and microscopic levels, to gain novel insights about global and local visual trends as...

  3. Power hardware-in-the-loop simulation (PHILS) of photovoltaic power generation using real-time simulation techniques and power interfaces (United States)

    Jung, Jee-Hoon


    Power hardware-in-the-loop simulation (PHILS) has been introduced for rapid prototyping and accurate testing of power electronics applications under various load and interface conditions. Real-time simulation, with advancements in computing technologies, can effectively support PHILS by improving the computation speed of complex target systems converted to electrical and mathematical models. In this paper, optimized model constructions for a single-crystalline photovoltaic (PV) panel are built up for PHILS with a real-time simulator, with a view to improving dynamic model accuracy and boosting computation speed. Dynamic model accuracy is one of the significant performance factors of PHILS, which must reproduce the dynamic behavior of the simulation model during elaborate emulations of the power hardware. In addition, several considerations for the PHILS system, such as system configuration and communication, are provided to electrically emulate the PV panel with power hardware interfaces. The effectiveness of the proposed PHILS, developed on Opal RT's RT-Lab real-time engineering simulator based on Matlab/Simulink, is experimentally verified using a prototype PHILS system.
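    A real-time PV emulation of this kind rests on an electrical model of the panel. As a minimal sketch, assuming an ideal single-diode cell with series and shunt resistance neglected and purely illustrative parameter values (not from the paper), the terminal current and open-circuit voltage can be computed as:

```python
import math

# Ideal single-diode PV cell model; parameter values are illustrative.
IPH  = 3.0           # photo-generated current [A] (scales with irradiance)
I0   = 1e-9          # diode saturation current [A]
N_VT = 0.026 * 1.3   # ideality factor times thermal voltage [V]

def pv_current(v):
    # Terminal current at terminal voltage v: photo current minus diode current.
    return IPH - I0 * (math.exp(v / N_VT) - 1.0)

def open_circuit_voltage(lo=0.0, hi=1.5, iters=60):
    # Bisection for the voltage where the terminal current crosses zero;
    # pv_current is monotonically decreasing in v, so bisection is safe.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if pv_current(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    In a PHILS setup a model like this is evaluated every simulator time step, and a power interface drives the computed operating point onto the hardware under test.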

  4. 100% Photoshop Create stunning illustrations without using any photographs

    CERN Document Server

    Caplin, Steve


    Just when you think you've learned all that you could ever know about working in Photoshop, digital artist and photomontage king Steve Caplin comes along with yet another masterful method for creating incredible works of art in Photoshop. This time, he'll show you how to create complete images, from start to finish, entirely within the software program. No source material, photographs, or existing files from other software packages are needed, saving you valuable time and resources. The techniques you'll learn in this ground-breaking new book will help you combine your artistic vision and skil

  5. PaintShop Photo Pro X3 For Photographers

    CERN Document Server

    McMahon, Ken


    If you are a digital photographer who's new to PaintShop Photo Pro or digital imaging in general, or have recently upgraded to the all-new version X3, this is the book for you! Packed with full color images to provide inspiration and easy to follow, step-by-step projects, you'll learn the ins and outs of this fantastic program in no time so you can start correcting and editing your images to create stunning works of art. Whether you want to learn or refresh yourself on the basics, such as effective cropping or simple color correction, or move on to more sophisticated techniques like creating s

  6. 8 CFR 333.1 - Description of required photographs. (United States)


    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Description of required photographs. 333.1 Section 333.1 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY NATIONALITY REGULATIONS PHOTOGRAPHS § 333.1 Description of required photographs. (a) Every applicant required to furnish photographs of...

  7. 8 CFR 1236.5 - Fingerprints and photographs. (United States)


    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Fingerprints and photographs. 1236.5... ORDERED REMOVED Detention of Aliens Prior to Order of Removal § 1236.5 Fingerprints and photographs. Every... photographed. Such fingerprints and photographs shall be made available to Federal, State, and local law...

  8. Monte Carlo Simulation of the Time-Of-Flight Technique for the Measurement of Neutron Cross-section in the Pohang Neutron Facility

    Energy Technology Data Exchange (ETDEWEB)

    An, So Hyun; Lee, Young Ouk; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Young Seok [National Fusion Research Institute, Daejeon (Korea, Republic of)


    It is essential that neutron cross sections are measured precisely for many areas of research and technology. In Korea, these experiments have been performed at the Pohang Neutron Facility (PNF), a pulsed neutron facility based on a 100 MeV electron linear accelerator. At PNF, the neutron energy spectra have been measured for different water levels inside the moderator and compared with the results of MCNPX calculations. The optimum size of the water moderator has been determined on the basis of these results. In this study, Monte Carlo simulations of the TOF technique were performed and energy spectra of neutrons were calculated to predict the measurements.
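    At the neutron energies resolved this way, the time-of-flight relation is classical: speed is flight path over flight time, and kinetic energy follows from E = mv²/2. A sketch with an assumed 10 m flight path (illustrative, not a PNF specification):

```python
# Neutron kinetic energy from time of flight, classical approximation.
M_N = 1.674927e-27   # neutron mass [kg]
EV  = 1.602177e-19   # joules per electron-volt

def tof_energy_ev(path_m, time_s):
    # v = L / t, E = m v^2 / 2, converted to electron-volts.
    v = path_m / time_s
    return 0.5 * M_N * v * v / EV
```

    A thermal neutron (about 0.025 eV, roughly 2200 m/s) takes about 4.6 ms over 10 m; shorter flight times map to higher energies, so the timing resolution of the detector sets the energy resolution of the measurement.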

  9. Computer simulations of the ROUSE model: an analytic simulation technique and a comparison between the error variance-covariance and bootstrap methods for estimating parameter confidence. (United States)

    Huber, David E


    This article provides important mathematical descriptions and computer algorithms in relation to the responding optimally with unknown sources of evidence (ROUSE) model of Huber, Shiffrin, Lyle, and Ruys (2001), which has been applied to short-term priming phenomena. In the first section, techniques for obtaining parameter confidence intervals and parameter correlations are described, which are generally applicable to any mathematical model. In the second section, a technique for producing analytic ROUSE predictions is described. Huber et al. (2001) averaged many stochastic trials to obtain stable behavior. By appropriately weighting all possible combinations of feature states, an alternative analytic version is developed, yielding asymptotic model behavior with fewer computations. The third section ties together these separate techniques, obtaining parameter confidence and correlations for the analytic version of the ROUSE model. In doing so, previously unreported behaviors of the model are revealed. In particular, complications due to local minima are discussed, in terms of both variance-covariance analyses and bootstrap sampling analyses.
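    The analytic technique described, replacing the average of many stochastic trials with a probability-weighted sum over all possible combinations of feature states, can be sketched generically. The outcome function and probabilities below are illustrative stand-ins, not the ROUSE model itself.

```python
import itertools
import random

def exact_expectation(probs, outcome):
    # Enumerate every combination of binary feature states, weight each
    # combination by its probability, and sum: exact answer in 2**n terms
    # instead of an average over noisy stochastic trials.
    total = 0.0
    for states in itertools.product((0, 1), repeat=len(probs)):
        w = 1.0
        for s, p in zip(states, probs):
            w *= p if s else (1.0 - p)
        total += w * outcome(states)
    return total

def monte_carlo_expectation(probs, outcome, trials, seed=1):
    # The stochastic alternative: sample feature states and average.
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        states = tuple(1 if rng.random() < p else 0 for p in probs)
        acc += outcome(states)
    return acc / trials
```

    For small numbers of features the exact enumeration is both cheaper and noise-free, which is what makes the analytic version attractive for fitting parameter confidence intervals.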

  10. Framing the Photographer: Discourse and Performance in Portrait Photography


    Dean, Alison Vivian


    Framing the Photographer: Discourse and Performance in Portrait Photography reconsiders photographic criticism, theory, and history in terms of the photographic event. I argue that discursive frames—whether formed through art history, juridical language, technological format, or otherwise—inform and interact with the formal composition of photographs, the channels through which photography circulates, and the attitudes and performances of the photographers themselves. Rather than representing...

  11. Rapid paediatric fluid resuscitation: a randomised controlled trial comparing the efficiency of two provider-endorsed manual paediatric fluid resuscitation techniques in a simulated setting. (United States)

    Cole, Evan T; Harvey, Greg; Urbanski, Sara; Foster, Gary; Thabane, Lehana; Parker, Melissa J


    Manual techniques of intravascular fluid administration are commonly used during paediatric resuscitation, although it is unclear which technique is most efficient in the hands of typical healthcare providers. We compared the rate of fluid administration achieved with the disconnect-reconnect and push-pull manual syringe techniques for paediatric fluid resuscitation in a simulated setting. This study utilised a randomised crossover trial design and enrolled 16 consenting healthcare provider participants from a Canadian paediatric tertiary care centre. The study was conducted in a non-clinical setting using a model simulating a 15 kg child in decompensated shock. Participants administered 900 mL (60 mL/kg) of normal saline to the simulated patient using each of the two techniques under study. The primary outcome was the rate of fluid administration, as determined by two blinded independent video reviewers. We also collected participant demographic data and evaluated other secondary outcomes including total volume administered, number of catheter dislodgements, number of technical errors, and subjective and objective measures of provider fatigue. All 16 participants completed the trial. The mean (SD) rate of fluid administration (mL/s) was greater for the disconnect-reconnect technique at 1.77 (0.145) than it was for the push-pull technique at 1.62 (0.226), with a mean difference of 0.15 (95% CI 0.055 to 0.251; p=0.005). There was no difference in mean volume administered (p=0.778) or participant self-reported fatigue (p=0.736) between techniques. No catheter dislodgement events occurred. The disconnect-reconnect technique allowed for the fastest rate of fluid administration, suggesting that use of this technique may be preferable in situations requiring rapid resuscitation. These findings may help to inform future iterations of paediatric resuscitation guidelines. This trial was registered at [NCT01774214] prior to enrolling the first


    Directory of Open Access Journals (Sweden)

    Marinete Martins Azevedo


    Full Text Available This study was aimed at assessing the possibility of using digital photographs to evaluate the success of the reclamation of a gravel-mined degraded area. The indicator used to measure success was the degree of vegetation ground cover. Photographs were taken with a Canon camera (model Power Shot A 100). On 11 October 2003, on a sunny day (without clouds), a total of 24 photographs were taken of the centre of the 24 experimental plots. All 24 photographs were processed with ENVI 3.5 software following the same procedures applied to process satellite scenes. In each scene analysed, two classes, vegetation cover and bare soil, were identified with the maximum likelihood algorithm. Results showed that digital photographs can be used in the quantification of vegetation ground cover and that the technique employed in this study can be applied to evaluate methodologies for the reclamation of mined areas in which the establishment of vegetation is expected. The technique tested in this study can be employed by government agencies in charge of land reclamation plans because it is efficient in determining vegetation ground cover, is easy to perform and is not expensive.
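    The study classified pixels with ENVI's maximum-likelihood algorithm; as a simple stand-in for that step, the sketch below estimates vegetation ground cover from RGB pixels with an excess-green threshold. The threshold value is an assumption for illustration, not a parameter from the study.

```python
def vegetation_cover(pixels, threshold=20):
    # pixels: list of (r, g, b) tuples. A pixel counts as vegetation when
    # its excess-green index 2g - r - b exceeds the threshold; the cover
    # fraction is vegetation pixels over total pixels.
    veg = sum(1 for r, g, b in pixels if 2 * g - r - b > threshold)
    return veg / len(pixels)
```

    Applied to each plot photograph, the returned fraction plays the role of the vegetation-cover indicator used to judge reclamation success.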

  13. Nobels: Nobel laureates photographed by Peter Badge

    CERN Document Server


    A unique photographic record of all living Nobel laureates. In this handsome coffee-table book, photographer Peter Badge captures the likeness of every living Nobel laureate in a lasting black-and-white image -- more than 300 striking portraits in all. Brief biographical sketches accompanying the large-scale photographs pay homage to each laureate's singular contribution to science, literature or world peace. Bringing readers face-to-face with Nelson Mandela, Jimmy Carter, the Dalai Lama, James Watson, Gabriel García Márquez, Toni Morrison, Rita Levi-Montalcini, Linda Buck, and Paul Samuelson among many others, NOBELS offers an intimate and compelling look at well-known honorees as well as lesser-known recipients. A fascinating word/image tableau.

  14. Lamp of adjustable spectrum for photographic usage (United States)

    Mazikowski, Adam; Feldzensztajn, Mateusz


    Photography is a unique, rapidly growing interdisciplinary field encompassing aspects of science, art and technology. Expectations of photographers are steadily increasing with the development of technology. One of the areas playing a crucial role in photography is lighting. Consequently, several types of light sources for photographic use have been developed. The ongoing research in this field concentrates on lamps with tunable CCT (Correlated Color Temperature). In this paper, we present a lamp whose emission spectrum can be tailored without affecting the output luminous flux. Intended for photographic use, the lamp is based on an integrating sphere and a selection of LEDs. As the LED drivers, DC-DC converters controlled by a Raspberry Pi were applied. The design process, including the selection of LED wavelengths, is presented. Output characteristics of the lamp were measured using a setup containing a spectrometer. The results of these experiments show good agreement with the spectrum set on the microcomputer.

  15. Characterization of the interaction of glycyrrhizin and glycyrrhetinic acid with bovine serum albumin by spectrophotometric-gradient flow injection titration technique and molecular modeling simulations. (United States)

    Manouchehri, Firouzeh; Izadmanesh, Yahya; Ghasemi, Jahan B


    In this research, the interactions of glycyrrhizin (GL) and glycyrrhetinic acid (GA) with bovine serum albumin (BSA) have been investigated by the novel method of spectrophotometric-gradient flow injection titration. The hard-modeling multivariate approach to binding was used for calculation of binding constants and estimation of concentration-spectral profiles of equilibrium species. The stoichiometric ratio of binding was estimated using eigenvalue analysis. Results showed that GL and GA bind BSA with overall binding constants of K(GL-BSA) = 3.85 (±0.09)×10⁴ L mol⁻¹ and K(GA-BSA) = 3.08 (±0.08)×10⁴ L mol⁻¹. Ligand-BSA complexes were further analyzed by combined docking and molecular dynamics (MD) simulations. Docking simulations were performed to obtain a first guess of the binding structure of the GL/GA-BSA complex, and subsequently analyzed by 20 ns MD simulations in order to evaluate the interactions of GL/GA with BSA in detail. Results of MD simulations indicated that the GL-BSA complex forms mainly on the basis of hydrogen bonds, while the GA-BSA complex forms on the basis of hydrophobic interactions. Also, water molecules can bridge between the ligand and protein by hydrogen bonds, which are stable during the entire simulation and play an important role in stabilization of the GL/GA-BSA complexes. Copyright © 2017 Elsevier B.V. All rights reserved.
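    For a simple 1:1 association with overall constant K, the fraction of protein bound at free ligand concentration [L] follows the standard isotherm f = K[L]/(1 + K[L]). A sketch using the reported K for the GL-BSA complex and a hypothetical free-ligand concentration (the concentration is an assumption, not from the study):

```python
def fraction_bound(k_assoc, ligand_free_molar):
    # 1:1 binding isotherm: f = K[L] / (1 + K[L]).
    kl = k_assoc * ligand_free_molar
    return kl / (1.0 + kl)

# Reported K for GL-BSA, with a hypothetical 100 uM free ligand concentration:
f_gl = fraction_bound(3.85e4, 1e-4)
```

    With a binding constant of this magnitude, roughly four-fifths of the protein is complexed at 100 µM free ligand, which illustrates why a spectral titration over this concentration range resolves the constant well.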

  16. Skylab-4 Mission Onboard Photograph - Astronaut Carr Testing Astronaut Maneuvering Equipment. (United States)


    This Skylab-4 onboard photograph depicts Astronaut Gerald Carr testing Astronaut Maneuvering Equipment (M509) by flying it around under weightless conditions in the Orbital Workshop. The M509 experiment was an operational study to evaluate and conduct an in-orbit verification of the utility of various maneuvering techniques to assist astronauts in performing tasks that were representative of future extravehicular activity requirements.

  17. Overview of Workshop on Evaluation of Simulation Techniques for Radiation Damage in the Bulk of Fusion First Wall Materials

    DEFF Research Database (Denmark)

    Leffers, Torben; Singh, Bachu Narain; Green, W.V.


    The main points and the main conclusions of a workshop held June 27–30 1983 at Interlaken, Switzerland, are reported. There was general agreement among the participants that ideal simulation, providing unambiguous information about the behaviour of the first wall material, is at present out...

  18. Development of Curriculum of Learning through Photograph (United States)

    Suzuki, Keiko; Aoki, Naokazu; Kobayashi, Hiroyuki

    A curriculum of integrated learning using the power of photography in junior high school was constructed and tested in the class "Seminar for Photographic Expression" of the integrated learning at a junior high school. The center of the curriculum is viewing photographs and self-expression using photography. Comparison of questionnaire results from before and after the class suggests that the curriculum brings about an increase in self-esteem, empathy, and motivation for learning. This educational effect fosters the ability to live self-sufficient lives. On the basis of these results, curricula that can be conducted by anyone at any junior high school were proposed.

  19. Le métier de photographe


    Debeauvais, Rémi; Vauclare, Claude


    Estimated at around 25,000, professional photographers have been facing, for some fifteen years, major changes linked to the spread of new digital technologies, which have redefined the practices of an entire profession. The transformation of the conditions of creation, production and distribution of photography has translated into a deregulation of the photography market and a weakening of the legal framework that previously guaranteed the income of photographers...

  20. Design and Simulation of Control Technique for Permanent Magnet Synchronous Motor Using Space Vector Pulse Width Modulation (United States)

    Khan, Mansoor; Yong, Wang; Mustafa, Ehtasham


    After the rapid advancement in the field of power electronics devices and drives over the last few decades, different kinds of Pulse Width Modulation (PWM) techniques have been brought to the market, with applications ranging from industrial appliances to military equipment, including home appliances. A very common application for PWM is the three-phase voltage source inverter, used to convert DC to AC, for example to supply power to a house in case of electricity failure, usually named an Uninterruptible Power Supply. In this paper the Space Vector Pulse Width Modulation (SVPWM) technique is discussed and analysed under the control technique named Field Oriented Control. The working and implementation of this technique have been studied by implementing it on a three-phase bridge inverter. The technique is used to control a Permanent Magnet Synchronous Motor. The drive system is successfully implemented in MATLAB/Simulink using the mathematical equations and algorithms to achieve satisfactory results. A PI-type controller is used to tune the parameters of the motor, i.e. torque and current.
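    The core SVPWM step amounts to locating the sector of the reference voltage vector and computing dwell times for the two adjacent active vectors and the zero vector within one switching period. A sketch of the textbook formulation (the voltage, angle and period values used to exercise it are illustrative):

```python
import math

def svpwm_times(v_ref, theta, v_dc, t_s):
    # Returns (sector 1..6, t1, t2, t0): dwell times of the two adjacent
    # active vectors and the zero vector for one switching period t_s.
    # theta is the reference-vector angle in radians.
    sector = int(theta // (math.pi / 3)) % 6 + 1
    th = theta % (math.pi / 3)            # angle measured inside the sector
    m = math.sqrt(3.0) * v_ref / v_dc     # modulation index (<= 1 in linear range)
    t1 = m * t_s * math.sin(math.pi / 3 - th)
    t2 = m * t_s * math.sin(th)
    t0 = t_s - t1 - t2                    # remaining time on the zero vector
    return sector, t1, t2, t0
```

    In a field-oriented drive, theta and v_ref come from the inverse Park transform of the PI current controllers' output, and the dwell times are turned into gate signals for the three-phase bridge.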

  1. PREFACE: Workshop Photograph and Program (United States)


    Workshop photograph Workshop Program
    Sunday 28 March 2010
    19:00-21:00 Reception at Okura Frontier Hotel Tsukuba (Buffet style dinner with drink)
    Monday 29 March 2010
    Introduction (Chair: André Rubbia (ETH Zurich))
    09:00 Welcome address (05') Atsuto Suzuki (KEK)
    09:05 Message from CERN on neutrino physics (10') Sergio Bertolucci (CERN)
    09:15 Message from FNAL on neutrino physics (10') Young Kee Kim (FNAL)
    09:25 Message from KEK on neutrino physics (10') Koichiro Nishikawa (KEK)
    09:35 Introductory remark on GLA2010 (10') Takuya Hasegawa (KEK)
    Special session (Chair: Koichiro Nishikawa (KEK))
    09:45 The ICARUS Liquid Argon TPC (45') Carlo Rubbia (CERN)
    10:30-11:00 Coffee break
    Main goals of Giant Liquid Argon Charge Imaging Experiments I (Chair: Takashi Kobayashi (KEK))
    11:00 Results from massive underground detectors (non accelerator) (30') Takaaki Kajita (ICRR, U. of Tokyo)
    11:30 Present long baseline neutrino experiments (30') Chang Kee Jung (SUNY Stony Brook)
    12:00-12:10 Workshop picture
    12:10-14:00 Lunch break
    Main goals of Giant Liquid Argon Charge Imaging Experiments II (Chair: Takashi Kobayashi (KEK))
    14:00 Physics goals of the next generation massive underground experiments (30') David Wark (Imperial College London)
    14:30 Near detectors for long baseline neutrino experiments (20') Tsuyoshi Nakaya (Kyoto U.)
    Lessons on Liquid Argon Charge Imaging technology from ongoing developments (Chair: Chang Kee Jung (SUNY Stony Brook))
    14:50 WARP (30') Claudio Montanari (U. of Pavia)
    15:20 ArDM (30') Alberto Marchionni (ETH Zurich)
    15:50 From ArgoNeuT to MicroBooNE (30') Bonnie Fleming (Yale U.)
    16:20 250L (30') Takasumi Maruyama (KEK)
    16:50 The DEAP/CLEAN project (20') Mark Boulay (Queen's U.)
    17:10-17:40 Coffee break
    Lessons from Xe based Liquids Imaging detectors (Chair: Flavio Cavanna (U. of L'Aquilla))
    17:30 MEG (20') Satoshi Mihara (KEK)
    17:50 The XENON project (20') Elena Aprile (Columbia U.)
    18:10 XMASS (20') Hiroyuki Sekiya (ICRR, U. of Tokyo)
    Studies on physics performance (Chair

  2. An agent-based simulation combined with group decision-making technique for improving the performance of an emergency department (United States)

    Yousefi, M.; Ferreira, R.P.M.


    This study presents agent-based simulation modeling of an emergency department. In a traditional approach, a supervisor (or a manager) allocates the resources (receptionist, nurses, doctors, etc.) to different sections based on personal experience or by using decision-support tools. In this study, each staff agent took part in the process of allocating resources based on observations in their respective sections, which gave the system the advantage of utilizing all the available human resources during the workday by re-allocating them to different sections. In this simulation, unlike previous studies, all staff agents took part in the decision-making process to re-allocate the resources in the emergency department. The simulation modeled the behavior of patients, receptionists, triage nurses, emergency room nurses and doctors. Patients were able to decide whether to stay in the system or leave the department at any stage of treatment. In order to evaluate the performance of this approach, 6 different scenarios were introduced. In each scenario, various key performance indicators were investigated before and after applying the group decision-making. The outputs of each simulation were the number of deaths, the number of patients who left the emergency department without being attended, length of stay, waiting time and the total number of patients discharged from the emergency department. Applying the self-organizing approach in the simulation showed average decreases of 12.7 and 14.4% in total waiting time and in the number of patients who left without being seen, respectively. The results showed an average increase of 11.5% in the total number of patients discharged from the emergency department. PMID:28380196

  3. Application perspectives of CFD simulation techniques in nuclear power plants; Perspectivas de aplicacion de tecnicas de modelado CFD en plantas nucleoelectricas

    Energy Technology Data Exchange (ETDEWEB)

    Galindo G, I. F., E-mail: [Instituto de Investigaciones Electricas, Reforma No. 113, Col. Palmira, 62490 Cuernavaca, Morelos (Mexico)


    Scenario simulation in nuclear power plants is usually carried out with system codes based on lumped-parameter networks. However, situations exist in some components where the flow is predominantly 3-D, as in natural circulation, mixing and stratification phenomena. Computational fluid dynamics (CFD) simulation techniques have the potential to simulate these flows numerically. The use of CFD simulation spans many branches of engineering and continues to grow; however, its application to problems related to nuclear power plant safety is less developed, although it is accelerating quickly, and CFD is expected to play a more prominent role in future analyses. A main obstacle to general acceptance of CFD is that the simulations require very complete validation studies, which are sometimes not available. This article presents a general overview of the state of application of CFD methods in nuclear power plants and the problems associated with their routine application and acceptance, including the viewpoint of the regulatory authorities. Application examples in which CFD offers real benefits are reviewed, and two illustrative case studies of the application of CFD techniques are presented: first, a water vessel with a heat source in its interior, similar to the spent fuel pool of a nuclear power plant; and second, the boron dilution of a water volume entering a nuclear reactor. We conclude that CFD technology represents a very important opportunity to improve the understanding of phenomena with a strong 3-D component and to contribute to the reduction of uncertainty. (Author)

  4. The use of proper orthogonal decomposition (POD) meshless RBF-FD technique to simulate the shallow water equations (United States)

    Dehghan, Mehdi; Abbaszadeh, Mostafa


    The main aim of this paper is to develop a fast and efficient local meshless method for solving shallow water equations in one- and two-dimensional cases. These equations belong to the class of advection equations, whose solutions can develop shocks; special numerical methods, such as discontinuous Galerkin and finite volume methods, are therefore usually employed. Here, we construct a fast meshless method based on the proper orthogonal decomposition (POD) approach. To this end, we consider shallow water models and obtain a suitable time-discrete scheme based on the predictor-corrector technique. By applying the proper orthogonal decomposition technique, a new set of basis functions is built for the solution space, whose dimension is smaller than that of the original problem; employing the new basis therefore reduces the CPU time. Several examples are studied to show the efficiency of the present numerical technique.
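The core POD step, building a reduced basis from solution snapshots, can be sketched as follows. The snapshot data here is synthetic (a travelling Gaussian bump standing in for solver output); in the paper the snapshots would come from the time-discrete RBF-FD shallow-water scheme:

```python
import numpy as np

x = np.linspace(0, 1, 200)
# synthetic snapshots: a travelling Gaussian bump at successive "time levels"
snapshots = np.column_stack(
    [np.exp(-100.0 * (x - c) ** 2) for c in np.linspace(0.2, 0.8, 40)])

# POD basis = left singular vectors of the snapshot matrix
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1   # modes keeping 99.99% energy
basis = U[:, :r]                               # reduced basis, r << 200

# project a snapshot onto the reduced space and reconstruct it
u = snapshots[:, 7]
u_r = basis @ (basis.T @ u)
err = np.linalg.norm(u - u_r) / np.linalg.norm(u)
print(r, err)
```

In the reduced-order model, the time-discrete equations are then solved for the r coefficients of `basis` instead of the full 200 nodal values.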

  5. Applications of simulation technique on debris-flow hazard zone delineation: a case study in Hualien County, Taiwan

    Directory of Open Access Journals (Sweden)

    S. M. Hsu


    Full Text Available Debris flows pose severe hazards to communities in mountainous areas, often resulting in the loss of life and property. Helping debris-flow-prone communities delineate potential hazard zones provides local authorities with useful information for developing emergency plans and disaster management policies. In 2003, the Soil and Water Conservation Bureau of Taiwan proposed an empirical model to delineate hazard zones for all creeks with debris-flow potential (1420 in total) and utilized the model to help establish a hazard prevention system. However, the model does not fully consider the hydrologic and physiographic conditions of a given creek in simulation. The objective of this study is to propose new approaches that can improve hazard zone delineation accuracy and simulate hazard zones in response to different rainfall intensities. In this study, FLO-2D, a two-dimensional commercial model that is physically based and takes into account the momentum and energy conservation of the flow, was used to simulate debris-flow inundated areas.

    Sensitivity analysis with the model was conducted to determine the main parameters affecting debris-flow simulation. Results indicate that the roughness coefficient, yield stress and volumetric sediment concentration dominate the computed results. To improve the accuracy of the model, the study compared the performance of the rainfall-runoff model of FLO-2D with that of the HSPF (Hydrological Simulation Program - Fortran) model, and proper values of the significant parameters were then evaluated through calibration. Results reveal that the HSPF model performs better than the FLO-2D model at peak flow and during the flow recession period, and that the volumetric sediment concentration and yield stress can be estimated from the channel slope. The validation of the model for simulating debris-flow hazard zones has been confirmed by a comparison of field evidence from historical debris

  6. On the reverse. Some notes on photographic images from the Warburg Institute Photographic Collection

    Directory of Open Access Journals (Sweden)

    Katia Mazzucco


    Full Text Available How can the visual and textual data about an image – the image of a work of art – on the recto and verso of a picture be interpreted? An analogue art-documentary photograph is a palimpsest to be considered layer by layer. The examples discussed in this article, which refer both to Aby Warburg himself and to the first nucleus of the Warburg Institute Photographic Collection, help to outline elements of the debate around the photographic reproduction of the work of art as well as the position of photography in relation to the perception of the work of art.

  7. The effect of differences between rainfall measurement techniques on groundwater and discharge simulations in a lowland catchment

    NARCIS (Netherlands)

    Brauer, Claudia C.; Overeem, Aart; Leijnse, Hidde; Uijlenhoet, Remko


    Several rainfall measurement techniques are available for hydrological applications, each with its own spatial and temporal resolution and errors. When using these rainfall datasets as input for hydrological models, their errors and uncertainties propagate through the hydrological system. The aim

  8. A wavelet based numerical simulation technique for the two-phase flow using the phase field method

    CERN Document Server

    Alam, Jahrul M


    In multiphase flow phenomena, bubbles and droplets are advected, deformed, break up into smaller ones, and coalesce with each other. A primary challenge for classical computational fluid dynamics (CFD) methods in such flows is to effectively describe the transition zone between phases, across which physical properties vary steeply but continuously. Based on the van der Waals theory, the Allen-Cahn phase field method describes the coexistence of two fluids with a free-energy functional of mass density or molar concentration, without imposing topological constraints on the interface as a phase boundary. In this article, a CFD simulation methodology is described that solves the Allen-Cahn-Navier-Stokes equations using a wavelet collocation method. Second-order temporal accuracy is verified by simulating a moving sharp interface. The average terminal velocity of a gas bubble rising in a liquid computed by the present method agrees with that obtained in a laboratory experiment. The calculation of the ...

  9. A Monte Carlo simulation for the radiation imaging technique based on the Hemispherical Rotational Modulation Collimator (H-RMC) (United States)

    Le Bao, V.; Kim, G.


    The Rotational Modulation Collimator (RMC) is a simple, versatile and low-cost tool for radiation imaging, making it a reasonable choice for locating and tracking nuclear materials and radiation sources. In this paper, Monte Carlo simulation-based design studies for an alternative RMC with an extended field of view are presented. Modulation patterns for 5 different hemispherical RMC (H-RMC) designs were simulated for various source locations, and the fundamental characteristics of the rotational modulation patterns were investigated. The obtained patterns showed variations depending on the source location for most of the H-RMC designs, showing promise for the future development of an omni-directional radiation imager based on a non-position-sensitive radiation detector.

  10. Analysis of the sEMG/force relationship using HD-sEMG technique and data fusion: A simulation study. (United States)

    Al Harrach, Mariam; Carriou, Vincent; Boudaoud, Sofiane; Laforet, Jeremy; Marin, Frederic


    The relationship between the surface electromyogram (sEMG) signal and the force of an individual muscle is still ambiguous owing to the complexity of experimental evaluation. However, understanding this relationship should be useful for the assessment of the neuromuscular system in healthy and pathological contexts. In this study, we present a global investigation of the factors governing the shape of this relationship. Accordingly, we conducted a focused sensitivity analysis of the sEMG/force relationship with respect to neural, functional and physiological parameter variations. For this purpose, we used a fast-generation cylindrical model for the simulation of an 8×8 High Density-sEMG (HD-sEMG) grid and a twitch-based force model for the muscle force generation. The HD-sEMG signals as well as the corresponding force signals were simulated in isometric non-fatiguing conditions and were based on the Biceps Brachii (BB) muscle properties. A total of 10 isometric constant contractions of 5 s were simulated for each configuration of parameters. The Root Mean Squared (RMS) value was computed in order to quantify the sEMG amplitude. Then, an image segmentation method was used for data fusion of the 8×8 RMS maps. In addition, a comparative study between recent modeling propositions and the model proposed in this study is presented. The evaluation was made by computing the Normalized Root Mean Squared Error (NRMSE) of their fits to the simulated relationship functions. Our results indicate that the relationship between the RMS (mV) and muscle force (N) can be modeled using a 3rd-degree polynomial equation. Moreover, it appears that the obtained coefficients are subject-specific and depend on physiological, anatomical and neural parameters. Copyright © 2017 Elsevier Ltd. All rights reserved.
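The reported fitting procedure, a 3rd-degree polynomial between RMS amplitude and force evaluated with a normalized RMSE, can be sketched with synthetic data. The curve coefficients and noise level below are made up for illustration and are not the paper's values:

```python
import numpy as np

force = np.linspace(0.0, 300.0, 50)                    # muscle force (N)
true_rms = 1e-8*force**3 - 2e-6*force**2 + 4e-3*force  # RMS (mV), made-up curve
rng = np.random.default_rng(2)
rms = true_rms + rng.normal(0.0, 0.01, force.size)     # add measurement noise

coef = np.polyfit(force, rms, 3)    # 3rd-degree polynomial fit, as in the paper
fit = np.polyval(coef, force)

# Normalized Root Mean Squared Error of the fit
nrmse = np.sqrt(np.mean((rms - fit) ** 2)) / (rms.max() - rms.min())
print(coef, nrmse)
```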

  11. A theoretical study using the multiphase numerical simulation technique for effective use of H2 as blast furnaces fuel


    Jose Adilson de Castro; Cyro Takano; Jun-ichiro Yagi


    We present a numerical simulation procedure for analyzing the injection of hydrogen, oxygen and carbon dioxide gases mixed with pulverized coals through the tuyeres of blast furnaces. Effective use of H2-rich gas is highly attractive in the steelmaking blast furnace, given the possibility of increasing productivity and decreasing the specific emissions of carbon dioxide, making the process less carbon-intensive. However, mixed gas and coal injection is a complex techn...

  12. Comparison of irrigant penetration up to working length and into simulated lateral canals using various irrigating techniques. (United States)

    Spoorthy, E; Velmurugan, N; Ballal, S; Nandini, S


    To evaluate the effect of an apical negative pressure system, a passive ultrasonic irrigation system and a combination of both on the penetration of the irrigating contrast solution (ICS) up to working length and into simulated lateral canals. The root canals of 64 single-rooted teeth were instrumented using the ProTaper rotary system. In each sample, three simulated lateral canals were created at 2, 4 and 6 mm levels from the root apex using a 06-size C+ file (Dentsply Maillefer, Ballaigues, Switzerland). Samples were randomly assigned to 4 experimental groups (n = 16): group I - conventional needle irrigation, group II - passive ultrasonic irrigation (PUI), group III - apical negative pressure (ANP) irrigation system and group IV - combination of passive ultrasonic irrigation and the apical negative pressure irrigation system. To examine irrigating solution penetration, Indian ink was mixed with 5.25% NaOCl and delivered into the root canals. Samples were then assessed by direct observation of images taken with a Canon EOS Rebel T3. The depth of penetration of ICS up to the working length and into the simulated lateral canals was analysed using chi-squared tests. The combination (ANP and PUI) and ANP groups had significantly deeper ICS penetration up to the working length (P … Endodontic Journal. Published by John Wiley & Sons Ltd.

  13. Monte Carlo Simulation of Characteristic Secondary Fluorescence in Electron Probe Microanalysis of Homogeneous Samples Using the Splitting Technique. (United States)

    Petaccia, Mauricio; Segui, Silvina; Castellano, Gustavo


    Electron probe microanalysis (EPMA) is based on the comparison of characteristic intensities induced by monoenergetic electrons. When the electron beam ionizes inner atomic shells and these ionizations cause the emission of characteristic X-rays, secondary fluorescence can occur, originating from ionizations induced by the X-ray photons produced by the primary electron interactions. As detectors are unable to distinguish the origin of these characteristic X-rays, Monte Carlo simulation of radiation transport becomes a decisive tool in the study of this fluorescence enhancement. In this work, characteristic secondary fluorescence enhancement in EPMA has been studied using the splitting routines offered by PENELOPE 2008 as a variance reduction alternative. This approach is controlled by a single parameter, NSPLIT, which represents the desired number of X-ray photon replicas. The dependence of the uncertainties associated with secondary intensities on NSPLIT was studied as a function of the accelerating voltage and the sample composition in a simple binary alloy in which this effect becomes relevant. The efficiencies achieved for the simulated secondary intensities improve remarkably as the NSPLIT parameter increases; although in most cases an NSPLIT value of 100 is sufficient, some less likely enhancements may require stronger splitting to increase the efficiency associated with the simulation of secondary intensities.
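The variance-reduction idea behind splitting can be illustrated with a toy Monte Carlo: each history emits NSPLIT photon replicas carrying statistical weight 1/NSPLIT, so the mean score is unchanged while its variance drops. The detection probability below is arbitrary; this is a sketch of the principle, not the PENELOPE routine itself:

```python
import random
import statistics

def scores(n_histories, nsplit, p_detect=0.01, seed=0):
    """Per-history detected-fluorescence score using nsplit photon replicas,
    each carrying statistical weight 1/nsplit."""
    rng = random.Random(seed)
    w = 1.0 / nsplit
    return [sum(w for _ in range(nsplit) if rng.random() < p_detect)
            for _ in range(n_histories)]

plain = scores(20000, nsplit=1)      # analog simulation
split = scores(20000, nsplit=100)    # splitting: same mean, lower variance
print(statistics.mean(plain), statistics.stdev(plain))
print(statistics.mean(split), statistics.stdev(split))
```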

  14. Cryptography Would Reveal Alterations In Photographs (United States)

    Friedman, Gary L.


    A public-key decryption method is proposed to guarantee the authenticity of photographic images represented in the form of digital files. In the method, a digital camera generates original image data in a standard public format and also produces a coded signature with which the standard-format image data can be verified. The scheme also helps protect against other forms of lying, such as attaching false captions.
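The scheme is essentially a digital signature over the standard-format image data. A toy sketch with textbook RSA follows; the parameters are deliberately tiny and for illustration only (a real camera would hold a hardware-protected key of 2048+ bits and use a standard signature scheme):

```python
import hashlib

# Toy RSA key pair: n = 61 * 53 = 3233, e = 17, d = 2753 (e*d ≡ 1 mod 3120)
N, E, D = 3233, 17, 2753

def image_digest(data: bytes) -> int:
    # hash the standard-format image data, reduced mod N for the toy key
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % N

def sign(data: bytes) -> int:
    """Camera side: signature produced with the private exponent D."""
    return pow(image_digest(data), D, N)

def verify(data: bytes, sig: int) -> bool:
    """Anyone can check the signature with the public key (N, E)."""
    return pow(sig, E, N) == image_digest(data)

photo = b"standard-format image bytes"
sig = sign(photo)
print(verify(photo, sig))             # authentic image
print(verify(photo + b"\x00", sig))   # altered image
```

Any alteration of the file changes its digest, so the signature no longer decrypts to the matching value.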

  15. STS-97 Onboard Photograph - International Space Station (United States)


    This image of the International Space Station in orbit was taken from the Space Shuttle Endeavour prior to docking. Most of the Station's components are clearly visible in this photograph. They are the Node 1 or Unity Module docked with the Functional Cargo Block or Zarya (top) that is linked to the Zvezda Service Module. The Soyuz spacecraft is at the bottom.

  16. The Social Effects of War Photographs

    Directory of Open Access Journals (Sweden)

    Gökhan DEMİREL


    Full Text Available In today’s world, knowledge is increasingly conveyed via visual representation. The messages sent through various sources, such as newspapers, television and the internet, lead people to form opinions about various topics. In this context, photography is one of the most powerful sources of information; its visual power and its capacity for nonverbal communication make it a perfect tool for propaganda. These days, photographs showing war themes are used more often than in the past. War photographs can be said to serve as a tool for showing the world the realities of war, even to those who turn their backs on massacres; after all, a dead body creates a shocking effect in the viewer. In this study, war photography is examined through the example of the photograph of Aylan Kurdi, which became an “icon” of the migration caused by the Syrian civil war. From a critical point of view, the study considers how the photograph is assessed and understood in light of the environment and conditions on the date it was taken, and of the prevailing values, beliefs and world events of that time.

  17. The Frontal View of the Nose: Lighting Effects and Photographic Bias. (United States)

    Strub, Benedikt; Mende, Konrad; Meuli-Simmen, Claudia; Bessler, Stephan


    Most aesthetic rhinosurgeons rely on proper photographic documentation of the nose using several different views. The frontal view is probably the most important, but it is also the most demanding. In the frontal view, delicate, 3-dimensional (3D) anatomic structures require special photographic skills. Lighting is crucial for detail rendition and 3D reproduction of the nose, and is a source of apparent photographic bias. We compared the quality of reproduction and the photographic bias of different symmetric and asymmetric lighting setups in common clinical practice described in the literature. The photographs were compared for anatomic reproduction, shadowing, 3-dimensionality, and apparent changes of nasal shape (bias). Symmetric lighting did not satisfy the demands of the rhinosurgeons because of marginal 3-dimensionality, reduced detail rendition, or photographic bias. Strongly asymmetric lighting altered the apparent nasal shape depending on the side of illumination, but led to very good 3-dimensionality. Slightly asymmetric lighting demonstrated the best results for detail rendition and 3-dimensionality. Classic symmetric quarter light is a practicable lighting technique with limitations in the rendition of detail and 3-dimensionality. Slightly asymmetric lighting offered a perfect compromise, with substantially improved detail rendition and 3-dimensionality. Strongly asymmetric lighting may lead to photographic bias depending on the side of illumination. Frontal documentation of the nose with asymmetric lighting should, therefore, always be performed in duplicate, with asymmetric lighting from the right side and from the left side, to prevent misleading interpretations. © 2015 The American Society for Aesthetic Plastic Surgery, Inc.

  18. Lunar orbiter photographic atlas of the near side of the Moon

    CERN Document Server

    Byrne, Charles


    In 1967, Lunar Orbiter Mission 4 sent back to Earth a superb series of photographs of the surface of the Moon. Using 21st-century computer techniques, Charles Byrne - previously System Engineer of the Apollo Program for Lunar Orbiter Photography - has removed the scanning artifacts and transmission imperfections to produce a most comprehensive and beautifully detailed set of images of the lunar surface. To help practical astronomers, all the photographs are systematically related to an Earth-based view. The book has been organized to make it easy for astronomers to use, enabling ground-based images and views to be compared with the Orbiter photographs. Every astronomer - amateur and professional - who is interested in the Moon will want this book in his library!

  19. An interactive program for digitization of seabed photographs

    Digital Repository Service at National Institute of Oceanography (India)

    Ramprasad, T.; Sharma, R.

    A program for digitization of seabed photographs to compute the coverage and abundance of polymetallic nodules was developed. Since the objects in the seabed photograph are partially covered by a thin sediment layer, the automatic scanning devices may...

  20. Digital Lunar Orbiter Photographic Atlas of the Moon (United States)

    National Aeronautics and Space Administration — The Lunar Orbiter Photographic Atlas of the Moon is considered the definitive reference manual to the global photographic coverage of the Moon. The images contained...

  1. An unusual method of forensic human identification: use of selfie photographs. (United States)

    Miranda, Geraldo Elias; Freitas, Sílvia Guzella de; Maia, Luiza Valéria de Abreu; Melani, Rodolfo Francisco Haltenhoff


    As with other methods of identification, in forensic odontology antemortem data are compared with postmortem findings. In the absence of dental documentation, photographs of the smile play an important role in this comparison. As yet, there are no reports of the use of selfie photographs for identification purposes. Owing to advancements in technology, electronic devices, and social networks, this type of photograph has become increasingly common. This paper describes a case in which selfie photographs were used to identify a carbonized body, using the smile line and image superimposition. This low-cost, rapid, and easy-to-analyze technique provides highly reliable results. Nevertheless, there are disadvantages, such as the limited number of teeth visible in a photograph, low image quality, the possibility of morphological changes in the teeth after the antemortem image was taken, and the difficulty of making comparisons depending on the orientation of the photo. In forensic odontology, new methods of identification must be sought to accompany technological evolution, particularly when no traditional methods of comparison, such as clinical record charts or radiographs, are available. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Summary of a technique for heap leach simulation on uranium ores. Open file report (final), 15 August 1977-28 August 1980

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, H.D.; Seidel, D.C.; Nichols, I.L.


    A technique was developed for simulation of heap leaching uranium ores in 2-ft-diam columns constructed such that ores as coarse as minus 4-in could be treated in bed depths from 8 to 18 ft with test condition variations. The technique consisted of the separate steps of ore preparation, column loading, charge wetting, leaching, washing, and residue recovery. For the ores studied, sulfuric acid was found to be generally superior to alkaline lixiviants. The rate of uranium extraction by acid leaching proved to be primarily a function of percolation rate, alkalinity of the ores, and oxidant requirements of the ores. Column leaching was closely comparable to field-scale heap leaching in terms of uranium extraction rate and recovery results. Acid and oxidant requirements were lower for the field-scale operation. All column tests, as well as the field-scale heap, achieved in excess of 96-pct uranium extraction on a Bear Creek, WY, ore.

  3. Trimethylamine-N-oxide switches from stabilizing nature: A mechanistic outlook through experimental techniques and molecular dynamics simulation (United States)

    Rani, Anjeeta; Jayaraj, Abhilash; Jayaram, B.; Pannuru, Venkatesu


    Since the discovery of intracellular osmolytes in adaptation biology, osmolytes have been found to play a central role in cellular homeostasis and the stress response. A number of models using these molecules are now poised to address a wide range of problems in biology. Here, a combination of biophysical measurements and molecular dynamics (MD) simulations is used to examine the effect of trimethylamine-N-oxide (TMAO) on stem bromelain (BM) structure, stability and function. From the analysis of our results, we found that TMAO destabilizes the BM hydrophobic pockets and active site as a result of concerted polar and non-polar interactions, which is strongly evidenced by an MD simulation carried out for 250 ns. This destabilization is enthalpically favourable at higher concentrations of TMAO but entropically unfavourable. To the best of our knowledge, these results constitute the first detailed, unambiguous proof of the destabilizing effect of the widely studied osmolyte TMAO on the interactions governing the stability of BM, and they suggest a plausible mechanism of protein unfolding by TMAO.

  4. Numerical Simulation of a Grinding Process Model for the Spatial Work-pieces: Development of Modeling Techniques

    Directory of Open Access Journals (Sweden)

    S. A. Voronov


    Full Text Available The article presents a literature review on the simulation of grinding processes, considering statistical, energy-based, and imitation approaches to the simulation of grinding forces. The main stages of interaction between abrasive grains and the machined surface are shown. The article describes the main approaches to modeling the geometry of the new surfaces formed during grinding, and reviews approaches to the numerical modeling of chip formation and pile-up effects. Advantages and disadvantages of modeling grain-to-surface interaction by the finite element method and the molecular dynamics method are considered. The article points out that the system dynamics and its effect on the finished surface must be taken into consideration. Based on the literature review, a structure for a complex imitation model of grinding process dynamics for flexible work-pieces with spatial surface geometry is proposed. The proposed model of spatial grinding includes a model of work-piece dynamics, a model of grinding wheel dynamics, and a phenomenological model of grinding forces based on a 3D geometry modeling algorithm. The model yields the following results for the spatial grinding process: vibrations of the machined part and grinding wheel, machined surface geometry, static deflection of the surface, and grinding forces under various cutting conditions.

  5. Analyzing Forest Inventory Data from Geo-Located Photographs (United States)

    Toivanen, Timo; Tergujeff, Renne; Andersson, Kaj; Molinier, Matthieu; Häme, Tuomas


    Forests are widely monitored using a variety of remote sensing data and techniques. Remote sensing offers benefits compared to traditional in-situ forest inventories made by experts; one of the main benefits is that the number of ground reference plots can be significantly reduced, lowering the cost and time requirement compared to full forest inventories. The availability of ground reference data has been a bottleneck in remote sensing analysis over wide forested areas, as the acquisition of these data is an expensive and slow process. In this paper we present a tool for estimating forest inventory data from geo-located photographs. The tool can be used to estimate in-situ forest inventory data including estimated biomass, tree species, tree height and diameter. The collected in-situ forest measurements can be utilized as ground reference material for spaceborne or airborne remote sensing data analysis. The GPS-based location information attached to the measured forest data makes it easy to introduce the measurements as in-situ reference data. The central projection geometry of digital photographs allows the use of the relascope principle [1] to measure the basal area of stems per unit area, a variable very closely associated with tree biomass; the relascope is applied all over the world for forest inventory. Experiments with independent ground reference data have shown that in-situ data analysed from photographs can be utilised as reference data for satellite image analysis. The concept was validated by comparing mobile measurements with 54 independent ground reference plots from the Hyytiälä forest research station in Finland [2]. Citizen scientists could provide the manpower for analysing photographs from forests on a global level and support researchers working on tasks related to forests. This low-cost solution can also increase the coverage of forest management plans, particularly in regions where possibilities to invest on


    Directory of Open Access Journals (Sweden)

    W. Schuhr


    Full Text Available This paper on providing "oo-information" (= objective object-information) on cultural monuments and sites, based on 3D photographs, is also a contribution of CIPA task group 3 to the 2013 CIPA Symposium in Strasbourg. To stimulate interest in 3D photography among scientists as well as amateurs, 3D masterpieces are presented. It is shown by example that, owing to their high documentary value ("near reality"), 3D photographs support, e.g., the recording, visualization, interpretation, preservation and restoration of architectural and archaeological objects. This includes samples of excavation documentation and 3D coordinate calculation, as well as 3D photographs applied for virtual museum purposes, as educational tools, and for spatial structure enhancement, which in particular holds for inscriptions and rock art. This paper is also an invitation to participate in a systematic survey of existing international archives of 3D photographs; in this respect, first results towards defining an optimum digitization rate for analog stereo views are reported. It is more than overdue that, in addition to access to international archives of 3D photography, the available 3D photography data should appear in a global GIS (cloud) system, such as, e.g., Google Earth. This contribution also deals with exposing new 3D photographs to document monuments of importance for Cultural Heritage, including the use of 3D and single-lens cameras on a 10 m telescopic staff for extremely low earth-based airborne 3D photography, as well as for "underwater staff photography". In addition, the use of captive balloon and drone platforms for 3D photography in Cultural Heritage is reported. It should be emphasized that the still underestimated 3D effect on real objects even allows, e.g., the spatial perception of extremely small scratches as well as of nuances in

  7. Images of the Great Depression: A Photographic Essay. (United States)

    Stevens, Robert L.; Fogel, Jared A.


    Provides background information on the Farm Security Administration (FSA) and the photographic section of the FSA. Identifies six photographers and features three photographers (Walker Evans, Dorothea Lange, and Ben Shahn) who were recruited to document farm conditions. Discusses using FSA photos in the classroom and provides lesson plans to help…

  8. 7 CFR 97.9 - Drawings and photographs. (United States)


    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Drawings and photographs. 97.9 Section 97.9... PLANT VARIETY AND PROTECTION The Application § 97.9 Drawings and photographs. (a) Drawings or photographs submitted with an application shall disclose the distinctive characteristics of the variety. (b...

  9. 7 CFR 500.9 - Photographs for news or advertising. (United States)


    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Photographs for news or advertising. 500.9 Section 500..., DEPARTMENT OF AGRICULTURE NATIONAL ARBORETUM Conduct on U.S. National Arboreturm Property § 500.9 Photographs for news or advertising. Photographs for news purposes may be taken at the USNA without prior...

  10. 8 CFR 236.5 - Fingerprints and photographs. (United States)


    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Fingerprints and photographs. 236.5 Section... to Order of Removal § 236.5 Fingerprints and photographs. Every alien 14 years of age or older... by service of a notice to appear shall be fingerprinted and photographed. Such fingerprints and...

  11. A high-speed photographic system for flow visualization in a steam turbine (United States)

    Barna, G. J.


    A photographic system was designed to visualize the moisture flow in a steam turbine. Good performance of the system was verified using dry turbine mockups in which an aerosol spray simulated, in a rough way, the moisture flow in the turbine. Borescopes and fiber-optic light tubes were selected as the general instrumentation approach. High speed motion-picture photographs of the liquid flow over the stator blade surfaces were taken using stroboscopic lighting. Good visualization of the liquid flow was obtained. Still photographs of drops in flight were made using short duration flash sources. Drops with diameters as small as 30 micrometers (0.0012 in.) could be resolved. In addition, motion pictures of a spray of water simulating the spray off the rotor blades and shrouds were taken at normal framing rates. Specially constructed light tubes containing small tungsten-halogen lamps were used. Sixteen millimeter photography was used in all cases. Two potential problems resulting from the two-phase turbine flow (attenuation and scattering of light by the fog present and liquid accumulation on the borescope mirrors) were taken into account in the photographic system design but not evaluated experimentally.

  12. Multirate Particle-in-Cell Time Integration Techniques of Vlasov-Maxwell Equations for Collisionless Kinetic Plasma Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Guangye [Los Alamos National Laboratory; Chacon, Luis [Los Alamos National Laboratory; Knoll, Dana Alan [Los Alamos National Laboratory; Barnes, Daniel C [Coronado Consulting


    A multi-rate PIC formulation was developed that employs large timesteps for the slow field evolution and small (adaptive) timesteps for particle orbit integrations. The implementation is based on a JFNK solver with nonlinear elimination and moment preconditioning. The approach is free of numerical instabilities (ωpeΔt ≫ 1 and Δx ≫ λD) and requires far fewer degrees of freedom than explicit PIC for comparable accuracy in challenging problems. Significant gains over conventional explicit PIC may be possible for large-scale simulations. The paper is organized as follows: Vlasov-Maxwell particle-in-cell (PIC) methods for plasmas; explicit, semi-implicit, and implicit time integrations; the implicit PIC formulation (Jacobian-free Newton-Krylov (JFNK) with nonlinear elimination allows different treatments of disparate scales and discrete conservation properties (energy, charge, canonical momentum, etc.)); some numerical examples; and a summary.

  13. [Marginal accuracy of pressed ceramic full veneers with different preparation techniques before and after simulated jaw movement]. (United States)

    Stappert, Christian F J; Derks, Jan; Gerds, Thomas; Strub, Jörg R


    This in-vitro study investigated the marginal accuracy of press-ceramic full veneers with different preparation designs. Measurements of the marginal gap were performed after adhesive cementation as well as after mouth-motion fatigue of the veneers. 64 extracted human maxillary central incisors were divided into four groups of 16 specimens each and prepared as follows: group SZ = enamel (S) restricted 0.5 mm deep preparation with a high palatal shoulder (contact point on tooth structure [Z]); group SK = enamel (S) restricted 0.5 mm deep preparation with a low palatal shoulder (contact point on ceramic [K]); group DZ = dentine (D) included 1 mm deep preparation with a high palatal shoulder (contact point on tooth structure); group DK = dentine included 1 mm deep preparation with a low palatal shoulder (contact point on ceramic). IPS e.max Press ceramic was used to fabricate the full veneers. Restorations were adhesively luted with a dual-polymerizing resin cement, Variolink II* (*Ivoclar-Vivadent AG). Specimens underwent fatigue in a masticatory simulator (1.2 million cycles; 49 N), including thermal cycling (5,500 cycles; 5 °C/55 °C). Computer-aided measurements of the marginal accuracy using stereo light microscopy (200x) resulted in mean values of 56 µm to 64 µm after adhesive luting. Masticatory simulation did not cause significant changes in mean marginal accuracy. Likewise, neither the adhesive cementation of the veneers to mainly enamel or dentine nor the position of load application demonstrated a significant influence on the marginal fit over the investigation period. Based on the measurement methods used, mouth-motion fatigue of the restorations did not produce a significant change in marginal fit.

  14. Use of ensemble prediction technique to estimate the inherent uncertainty in the simulated chlorophyll-a concentration in coastal ecosystems* (United States)

    Meszaros, Lorinc; El Serafy, Ghada


    Phytoplankton blooms in coastal ecosystems such as the Wadden Sea may cause mortality of mussels and other benthic organisms. Furthermore, algal primary production is the base of the food web and therefore greatly influences fisheries and aquaculture. Consequently, accurate phytoplankton concentration prediction offers ecosystem and economic benefits. Numerical ecosystem models are powerful tools for computing water quality variables, including phytoplankton concentration. Nevertheless, their accuracy ultimately depends on the uncertainty stemming from the external forcings, which propagates through, and is compounded by, the non-linear ecological processes incorporated in the model. The Wadden Sea is a shallow, dynamically varying ecosystem with high turbidity, and therefore the uncertainty in the Suspended Particulate Matter (SPM) concentration field greatly influences the prediction of water quality variables. Considering the high level of uncertainty in the modelling process, it is advised that an uncertainty estimate be provided together with the single-valued deterministic model output. Through the use of an ensemble prediction system in the Dutch coastal waters, the uncertainty in the modelled chlorophyll-a concentration has been estimated. The input ensemble is generated from perturbed model process parameters and external forcings through Latin hypercube sampling with dependence (LHSD). The simulation is carried out using the Delft3D Generic Ecological Model (GEM) with the advanced algal speciation module BLOOM, which is well validated for primary production simulation in the southern North Sea. The output ensemble is post-processed to obtain the uncertainty estimate, and the results are validated against in-situ measurements and Remote Sensing (RS) data. The spatial uncertainty of the chlorophyll-a concentration was derived using the produced ensemble spread maps. *This work has received funding from the European Union's Horizon
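
    The ensemble workflow described above (Latin hypercube sampling of perturbed inputs, model runs, spread statistics) can be sketched as follows. Note the study used LHS with dependence (LHSD) driving Delft3D-GEM/BLOOM; this toy uses plain Latin hypercube sampling from SciPy and an invented one-line stand-in for the ecological model:

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Toy stand-in for the ecological model: chlorophyll-a as a nonlinear
    # function of two perturbed inputs (growth-rate parameter, SPM forcing).
    # The real Delft3D-GEM/BLOOM model is far more complex.
    def toy_chlorophyll(growth_rate, spm):
        light = 1.0 / (1.0 + spm)          # turbidity attenuates light
        return 20.0 * growth_rate * light  # mg m^-3 (illustrative)

    n_members = 100

    # Latin hypercube sample over the 2-D input space
    sampler = qmc.LatinHypercube(d=2, seed=42)
    unit = sampler.random(n=n_members)

    # Scale to assumed plausible ranges: growth rate in [0.5, 1.5],
    # SPM forcing in [0.1, 2.0] (dimensionless here)
    params = qmc.scale(unit, l_bounds=[0.5, 0.1], u_bounds=[1.5, 2.0])

    # Run the "model" once per ensemble member
    ensemble = np.array([toy_chlorophyll(g, s) for g, s in params])

    # Post-process the output ensemble into an uncertainty estimate
    mean = ensemble.mean()
    spread = ensemble.std(ddof=1)
    p5, p95 = np.percentile(ensemble, [5, 95])
    print(f"mean={mean:.2f}, spread={spread:.2f}, 90% band=[{p5:.2f}, {p95:.2f}]")
    ```

    The ensemble spread (standard deviation or percentile band) is what would be mapped spatially in the study to obtain the chlorophyll-a uncertainty maps.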

  15. A theoretical study using the multiphase numerical simulation technique for effective use of H2 as blast furnaces fuel

    Directory of Open Access Journals (Sweden)

    Jose Adilson de Castro


    Full Text Available We present a numerical simulation procedure for analyzing injections of hydrogen, oxygen and carbon dioxide gases mixed with pulverized coal through the tuyeres of blast furnaces. Effective use of H2-rich gas in the steelmaking blast furnace is highly attractive, considering the possibility of increasing productivity and decreasing the specific emissions of carbon dioxide, making the process less carbon-intensive. However, mixed gas and coal injection is a complex technology, since significant changes in the inner temperature and gas flow patterns are expected, in addition to their effects on the chemical reactions and heat exchanges. Focusing on the evaluation of the inner furnace status under such complex operation, a comprehensive mathematical model has been developed using the multi-interaction multiple-phase theory. The model treats the blast furnace as a multiphase reactor comprising lump solids (sinter, small coke, pellets, granular coke and iron ores), gas, liquids (metal and slag) and pulverized coal phases. The governing conservation equations are formulated for momentum, mass, chemical species and energy and simultaneously discretized using the finite volume method. We verified the model against a reference operational condition using pulverized coal injection of 215 kg per ton of hot metal (kg thm−1). Combined injections of varying concentrations of gaseous fuels with H2, O2 and CO2 were then simulated with 220 kg thm−1 and 250 kg thm−1 coal injection. Theoretical analysis showed that stable operating conditions could be achieved with a productivity increase of 60%. Finally, we demonstrated that the net carbon utilization per ton of hot metal decreased by 12%.

  16. Quantification of marine macro-debris abundance around Vancouver Island, Canada, based on archived aerial photographs processed by projective transformation. (United States)

    Kataoka, Tomoya; Murray, Cathryn Clarke; Isobe, Atsuhiko


    The abundance of marine macro-debris was quantified with high spatial resolution by applying an image processing technique to archived shoreline aerial photographs taken over Vancouver Island, Canada. The photographs taken from an airplane at oblique angles were processed by projective transformation for georeferencing, where five reference points were defined by comparing aerial photographs with satellite images of Google Earth. Thereafter, pixels of marine debris were extracted based on their color differences from the background beaches. The debris abundance can be evaluated by the ratio of an area covered by marine debris to that of the beach (percent cover). The horizontal distribution of percent cover of marine debris was successfully computed from 167 aerial photographs and was significantly related to offshore Ekman flows and winds (leeway drift and Stokes drift). Therefore, the estimated percent cover is useful information to determine priority sites for mitigating adverse impacts across broad areas. Copyright © 2017 Elsevier Ltd. All rights reserved.
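
    The georeferencing step described above can be sketched as a standard projective (homography) fit from reference-point correspondences, followed by a percent-cover ratio. A minimal sketch using a direct linear transform (DLT); the five pixel/map correspondences and the debris/beach masks below are invented for illustration, not values from the study:

    ```python
    import numpy as np

    def estimate_homography(src, dst):
        """Direct Linear Transform: fit a 3x3 projective transform mapping
        src image points to dst map coordinates (>= 4 correspondences)."""
        rows = []
        for (x, y), (u, v) in zip(src, dst):
            rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        A = np.asarray(rows, dtype=float)
        _, _, vt = np.linalg.svd(A)
        return vt[-1].reshape(3, 3)     # null-space vector, defined up to scale

    def apply_homography(H, pts):
        """Map 2-D points through H with perspective division."""
        pts = np.asarray(pts, dtype=float)
        homog = np.hstack([pts, np.ones((len(pts), 1))])
        mapped = homog @ H.T
        return mapped[:, :2] / mapped[:, 2:3]

    # Five hypothetical reference points: oblique-photo pixels -> map metres
    pixel_pts = [(120, 850), (1900, 870), (300, 300), (1700, 340), (1000, 600)]
    map_pts = [(0.0, 0.0), (180.0, 5.0), (10.0, 95.0), (170.0, 90.0), (88.0, 50.0)]
    H = estimate_homography(pixel_pts, map_pts)   # least-squares fit over 5 points

    # Percent cover: debris pixels as a fraction of beach pixels (toy masks)
    beach_mask = np.zeros((100, 100), dtype=bool); beach_mask[20:80, :] = True
    debris_mask = np.zeros_like(beach_mask); debris_mask[50:54, 10:60] = True
    percent_cover = 100.0 * (debris_mask & beach_mask).sum() / beach_mask.sum()
    print(f"percent cover = {percent_cover:.2f}%")
    ```

    In practice the debris mask would come from the color-difference extraction step, and the homography would map each photo before areas are measured.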

  17. Decision Simulation Technique (DST) as a scanning tool for exploring and explicating sustainability issues in transport decision making

    DEFF Research Database (Denmark)

    Jeppesen, Sara Lise


    The first part of the paper presents the DST together with the principal steps that have to be followed when applying it to a concrete case. In the second part the potential of the DST is demonstrated by its use within an ongoing study: the DST is applied to a new rail investment study on a section with four alternatives, being part of a proposed new high-speed rail line in Southern Sweden. The third part of the paper is concerned with a principal discussion of the incorporation of sustainability in transport planning. It is argued that 'explicating' techniques such as the DST, compared to more traditional ways of doing this (here denominated implicit consideration of sustainability), can be useful for many different planning problems, of which the treated rail case is just one example. Finally, the paper offers some conclusions and a perspective on the future use and development of the DST.

  18. Large-timestep techniques for particle-in-cell simulation of systems with applied fields that vary rapidly in space

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, A.; Grote, D.P.


    Under conditions which arise commonly in space-charge-dominated beam applications, the applied focusing, bending, and accelerating fields vary rapidly with axial position, while the self-fields (which are, on average, comparable in strength to the applied fields) vary smoothly. In such cases it is desirable to employ timesteps which advance the particles over distances greater than the characteristic scales over which the applied fields vary. Several related concepts are potentially applicable: sub-cycling of the particle advance relative to the field solution, a higher-order time-advance algorithm, force-averaging by integration along approximate orbits, and orbit-averaging. We report on our investigations into the utility of such techniques for systems typical of those encountered in accelerator studies for heavy-ion beam-driven inertial fusion.
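
    A minimal sketch of the sub-cycling idea mentioned above: the particle push is sub-stepped so the rapidly varying applied field is resolved, while the smooth self-field (and, in a full PIC code, the field solve) lives on the large timestep. The field shapes and all parameters are invented toy stand-ins, not the authors' lattice model:

    ```python
    import numpy as np

    def focusing_gradient(z, period=0.1):
        """Applied focusing strength varying rapidly with axial position z
        (hypothetical lattice; not the authors' field model)."""
        return 50.0 * np.cos(2.0 * np.pi * z / period)

    def self_force(x):
        """Smooth space-charge-like force (illustrative)."""
        return 2.0 * x

    def advance(x, v, z, vz, dt, n_sub):
        """Advance one large 'field' timestep dt, sub-cycling the particle
        push n_sub times so the applied-field variation is resolved."""
        h = dt / n_sub
        for _ in range(n_sub):
            a = -focusing_gradient(z) * x + self_force(x)
            v = v + a * h          # symplectic-Euler substep
            x = x + v * h
            z = z + vz * h
        return x, v, z

    x, v = 1e-3, 0.0               # transverse position and velocity
    z, vz = 0.0, 1.0               # axial position and velocity
    dt = 0.02                      # large step; substeps resolve the lattice period
    for _ in range(100):
        x, v, z = advance(x, v, z, vz, dt, n_sub=20)
    print(x, v, z)
    ```

    Each substep here covers 1/100 of the applied-field period, while the outer step is 20 times larger; the authors' schemes (higher-order advances, force- and orbit-averaging) are refinements of this same separation of scales.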

  19. The Anderson impurity model out-of-equilibrium: Assessing the accuracy of simulation techniques with an exact current-occupation relation (United States)

    Agarwalla, Bijay Kumar; Segal, Dvira


    We study the interacting, symmetrically coupled single impurity Anderson model. By employing the nonequilibrium Green's function formalism, we reach an exact relationship between the steady-state charge current flowing through the impurity (dot) and its occupation. We argue that the steady-state current-occupation relation can be used to assess the consistency of simulation techniques and identify spurious transport phenomena. We test this relation in two different model variants: First, we study the Anderson-Holstein model in the strong electron-vibration coupling limit using the polaronic quantum master equation method. We find that the current-occupation relation is violated numerically in standard calculations, with simulations bringing up incorrect transport effects. Using a numerical procedure, we resolve the problem efficiently. Second, we simulate the Anderson model with electron-electron interaction on the dot using a deterministic numerically exact time-evolution scheme. Here, we observe that the current-occupation relation is satisfied in the steady-state limit—even before results converge to the exact limit.

  20. The Impact of Surface Boundary Forcing on Simulation of the 1988 Summer Drought Over the US Midwest Using Factor Separation Technique (United States)

    Stein, Uri; Fox-Rabinovitz, Michael


    The factor separation (FS) technique has been utilized to evaluate quantitatively the impact of surface boundary forcings on simulation of the 1988 summer drought over the Midwestern United States. The four surface boundary forcings used are: (1) Sea Surface Temperature (SST), (2) soil moisture, (3) snow cover, and (4) sea ice. The Goddard Earth Observing System (GEOS) General Circulation Model (GCM) is used to simulate the 1988 U.S. drought. A series of sixteen simulations is performed with climatological and real 1988 surface boundary conditions. The major single and mutual synergistic factors/impacts are analyzed. The results show that SST and soil moisture are the major single pro-drought factors. The coupled synergistic effect of SST and soil moisture is the major anti-drought factor. The triple synergistic impact of SST, soil moisture, and snow cover is the strongest pro-drought impact and is, therefore, the main contributor to the generation of the drought. The impact of the snow cover and sea ice anomalies for June 1988 on the drought is significant only when combined with the SST and soil moisture anomalies.
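
    The FS decomposition requires one simulation per subset of switched-on factors (2^4 = 16 runs for four factors, consistent with the sixteen simulations above). A minimal sketch of the Stein-Alpert inclusion-exclusion bookkeeping, shown for a two-factor case with purely illustrative drought-index numbers:

    ```python
    from itertools import combinations

    def factor_separation(results):
        """Stein-Alpert factor separation: decompose simulation outcomes into
        pure single-factor contributions and mutual synergies.
        `results` maps each subset of switched-on factors (a frozenset) to the
        simulated outcome and must contain all 2^n subsets."""
        contrib = {}
        for subset in results:
            total = 0.0
            for k in range(len(subset) + 1):
                for sub in combinations(sorted(subset), k):
                    # inclusion-exclusion sign over sub-subsets
                    total += (-1) ** (len(subset) - k) * results[frozenset(sub)]
            contrib[subset] = total
        return contrib

    # Hypothetical drought-index values from the 2^2 = 4 runs needed for two
    # factors (SST and soil-moisture anomalies); numbers are illustrative only.
    runs = {
        frozenset(): 1.0,                  # all-climatological control run
        frozenset({"SST"}): 1.8,           # SST anomaly only
        frozenset({"soil"}): 1.6,          # soil-moisture anomaly only
        frozenset({"SST", "soil"}): 2.1,   # both anomalies
    }
    c = factor_separation(runs)
    print(c[frozenset({"SST"})])           # pure SST impact: 1.8 - 1.0
    print(c[frozenset({"SST", "soil"})])   # synergy: 2.1 - 1.8 - 1.6 + 1.0
    ```

    By construction the contributions sum back to the full-anomaly run, so every part of the simulated drought signal is attributed to exactly one single factor or synergy.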

  1. The Cambridge photographic atlas of galaxies

    CERN Document Server

    König, Michael


    Galaxies - the Milky Way's siblings - offer a surprising variety of forms and colours. Displaying symmetrical spiral arms, glowing red nebulae or diffuse halos, even the image of a galaxy can reveal much about its construction. All galaxies consist of gas, dust and stars, but the effects of gravity, dark matter and the interaction of star formation and stellar explosions all influence their appearances. This volume showcases more than 250 of the most beautiful galaxies within an amateur's reach and uses them to explain current astrophysical research. It features fantastic photographs, unique insights into our knowledge, tips on astrophotography and essential facts and figures based on the latest science. From the Andromeda Galaxy to galaxy clusters and gravitational lenses, the nature of galaxies is revealed through these stunning amateur photographs. This well illustrated reference atlas deserves a place on the bookshelves of astronomical imagers, observers and armchair enthusiasts.

  2. Photographic wound documentation after open fracture. (United States)

    Solan, M C; Calder, J D; Gibbons, C E; Ricketts, D M


    More than 3000 open fractures occur in the UK each year. They require early assessment and meticulous treatment in order to avoid devastating complications. The British Orthopaedic Association and British Association of Plastic Surgeons Working Party recommend that an instant photograph be taken of an open wound prior to the application of a dressing. The dressing can then remain undisturbed until the definitive surgical debridement is performed in theatre. Such practice reduces nosocomial infection. Fifty-one accident and emergency departments were surveyed by means of a telephone questionnaire. Forty-one percent were unable to photograph an open fracture wound. A further 20% had no access to a camera outside of office hours. The cheap, simple and effective recommendations of the Working Party are not being followed.

  3. Establishment of an open database of realistic simulated data for evaluation of partial volume correction techniques in brain PET/MR

    Energy Technology Data Exchange (ETDEWEB)

    Mota, Ana [Instituto de Biofísica e Engenharia Biomédica, FC-UL, Lisboa (Portugal); Institute of Nuclear Medicine, UCL, London (United Kingdom); Cuplov, Vesna [Instituto de Biofísica e Engenharia Biomédica, FC-UL, Lisboa (Portugal); Schott, Jonathan; Hutton, Brian; Thielemans, Kris [Institute of Nuclear Medicine, UCL, London (United Kingdom); Drobnjak, Ivana [Centre of Medical Image Computing, UCL, London (United Kingdom); Dickson, John [Institute of Nuclear Medicine, UCL, London (United Kingdom); Bert, Julien [INSERM UMR1101, LaTIM, CHRU de Brest, Brest (France); Burgos, Ninon; Cardoso, Jorge; Modat, Marc; Ourselin, Sebastien [Centre of Medical Image Computing, UCL, London (United Kingdom); Erlandsson, Kjell [Institute of Nuclear Medicine, UCL, London (United Kingdom)


    The Partial Volume (PV) effect in Positron Emission Tomography (PET) imaging (small objects only partially occupying the sensitive volume of the imaging instrument, resulting in blurred images) leads to a loss in quantification accuracy. Simultaneous acquisition of PET and Magnetic Resonance Imaging (MRI) produces concurrent metabolic and anatomical information, and the latter has proved to be very helpful for the correction of PV effects. Currently, there are several techniques used for PV correction; they can be applied directly during the reconstruction process or as a post-processing step after image reconstruction. In order to evaluate the efficacy of the different PV correction techniques in brain PET, we are constructing a database of simulated data. Here we present the framework and steps involved in constructing this database. Static 18F-FDG epilepsy and 18F-Florbetapir amyloid dementia PET/MR were selected because of their very different characteristics. The methodology followed was based on four main steps: image pre-processing, Ground Truth (GT) generation, MRI and PET data simulation, and reconstruction. All steps used Open Source software and can therefore be repeated at any centre. The framework as well as the database will be freely accessible. Tools used included GIF, FSL, POSSUM, GATE and STIR. The final data obtained after simulation, comprising raw or reconstructed PET data together with corresponding MRI datasets, were close to the original patient data. Moreover, there is the advantage that the data can be compared with the GT. We indicate several parameters that can be improved and optimized.

  4. Photographic Poster at the Art Education


    Tomanová, Jana


    ANNOTATION This thesis examines the discourse of photography and the poster, with the aim of joining these two phenomena into a single summary theory and creating their general characteristics. The work searches for intersections between photography and graphic design/poster and applies them in the theory of poster creation. In the didactic part, the theoretical principles are introduced into the teaching of art classes at primary schools. The same approach is applied in the artwork itself. Keywords: Photograph...

  5. The Greatest Photographers of the Twentieth Century


    David Galenson


    A survey of textbooks reveals that scholars consider Alfred Stieglitz to have been the greatest photographer of the twentieth century, followed in order by Walker Evans, Cindy Sherman, Man Ray, and Eugène Atget. Stieglitz, Evans, and Atget were experimental artists, who were committed to realism, whereas Man Ray and Sherman were conceptual innovators, who constructed images to express ideas. During much of the twentieth century, photography was dominated by the experimental approach and aesth...

  6. The effect of various backfilling techniques on the fracture resistance of simulated immature teeth performed apical plug with Biodentine. (United States)

    Topçuoğlu, Hüseyin Sinan; Kesim, Bertan; Düzgün, Salih; Tuncay, Öznur; Demirbuga, Sezer; Topçuoğlu, Gamze


    To evaluate the fracture resistance of simulated immature teeth that had been backfilled using different materials after using Biodentine as the apical plug material. Seventy-five single-rooted teeth were divided into five groups (n = 15). The 15 teeth in group 1 served as a negative control group and received no treatment. The remaining 60 teeth were instrumented to a #6 Peeso reamer to obtain a standard internal diameter of 1.5 mm. The apical 4 mm of the 60 teeth was filled with Biodentine. The backfilling was then performed in each group as follows: group 2--no backfilling (positive control), group 3--gutta-percha, group 4--fiber post, and group 5--Biodentine. Specimens were then subjected to fracture testing. The force required to fracture each specimen was recorded, and the data were statistically analyzed. The mean fracture values of groups 1 and 4 were significantly higher than those of groups 2, 3, and 5 (P < 0.05). Backfilling with a fiber post over the Biodentine apical plug thus provided the highest fracture resistance among the experimental groups. © 2014 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Given time: biology, nature and photographic vision. (United States)

    Garlick, Steve


    The invention of photography in the early 19th century changed the way that we see the world, and has played an important role in the development of western science. Notably, photographic vision is implicated in the definition of a new temporal relation to the natural world at the same time as modern biological science emerges as a disciplinary formation. It is this coincidence in birth that is central to this study. I suggest that by examining the relationship of early photography to nature, we can gain some insight into the technological and epistemological underpinnings of biological vision. To this end, this article is primarily concerned with the role of photographic technology in the genealogy of biological vision. I argue that photography has always been ambiguously located between art and science, between nature and culture, and between life and death. Hence, while it may be a technological expression of the scientific desire to know and to control nature, photographic vision has continually disrupted and frustrated the ambitions of biological technoscience. The technovision of early biological science illustrates that the elusive temporality of nature has always been central to the production of knowledge of life.

  8. Tracking Protests Using Geotagged Flickr Photographs.

    Directory of Open Access Journals (Sweden)

    Merve Alanyali

    Full Text Available Recent years have witnessed waves of protests sweeping across countries and continents, in some cases resulting in political and governmental change. Much media attention has been focused on the increasing usage of social media to coordinate and provide instantly available reports on these protests. Here, we investigate whether it is possible to identify protest outbreaks through quantitative analysis of activity on the photo sharing site Flickr. We analyse 25 million photos uploaded to Flickr in 2013 across 244 countries and regions, and determine for each week in each country and region what proportion of the photographs are tagged with the word "protest" in 34 different languages. We find that higher proportions of "protest"-tagged photographs in a given country and region in a given week correspond to greater numbers of reports of protests in that country and region and week in the newspaper The Guardian. Our findings underline the potential value of photographs uploaded to the Internet as a source of global, cheap and rapidly available measurements of human behaviour in the real world.
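
    The core measurement above (the weekly proportion of protest-tagged photographs per country) reduces to simple counting over photo records. The records and the two-language tag list below are invented stand-ins for the Flickr data (the study matched "protest" in 34 languages):

    ```python
    from collections import Counter

    # Hypothetical stream of (country, iso_week, tags) photo records
    photos = [
        ("BR", 24, {"protest", "street"}),
        ("BR", 24, {"carnival"}),
        ("BR", 24, {"protesto"}),          # the same word in another language
        ("TR", 23, {"protest", "park"}),
        ("TR", 23, {"istanbul"}),
    ]

    # Assumed two-language subset of the study's 34-language match list
    PROTEST_TAGS = {"protest", "protesto"}

    totals = Counter()
    hits = Counter()
    for country, week, tags in photos:
        totals[(country, week)] += 1
        if tags & PROTEST_TAGS:            # any protest tag present?
            hits[(country, week)] += 1

    # Proportion of protest-tagged photos per (country, week)
    proportion = {key: hits[key] / totals[key] for key in totals}
    print(proportion[("BR", 24)])          # 2 of 3 BR photos protest-tagged
    ```

    These per-country, per-week proportions are then what the study correlates with counts of protest reports in The Guardian.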


    Directory of Open Access Journals (Sweden)

    Patricia Peruzzo


    Full Text Available The main objective of this paper is to analyze changes in society's ways of seeing, using pictorial images produced by different engraving techniques, and to understand how the visual schemes present in this society changed after the advent of photography in the second quarter of the nineteenth century. It is well known that, with the spread of photography as art, people's relationship with images changed, not only in artistic and aesthetic terms but also in social, cultural and political ones. This work traces a panorama of the trajectory of photography and makes explicit its relationship to art and education, based on the theories of Boris Kossoy and Rosalind Krauss, seeking to draw a historical-cultural panorama of photography, and also on François Soulages with respect to aesthetic research on photography. It uses the propositions of Cerqueira and Barros to think about the relationship between photography and education.

  10. The solution conformation of sialyl-alpha(2→6)-lactose studied by modern NMR techniques and Monte Carlo simulations. (United States)

    Poppe, L; Stuike-Prill, R; Meyer, B; van Halbeek, H


    We present a comprehensive strategy for detailed characterization of the solution conformations of oligosaccharides by NMR spectroscopy and force-field calculations. Our experimental strategy generates a number of interglycosidic spatial constraints that is sufficiently large to allow us to determine glycosidic linkage conformations with a precision heretofore unachievable. In addition to the commonly used [1H,1H] NOE contacts between aliphatic protons, our constraints are: (a) homonuclear NOEs of hydroxyl protons in H2O to other protons in the oligosaccharide, (b) heteronuclear [1H,13C] NOEs, (c) isotope effects of O1H/O2H hydroxyl groups on 13C chemical shifts, and (d) long-range heteronuclear scalar couplings across glycosidic bonds. We have used this approach to study the trisaccharide sialyl-alpha(2→6)-lactose in aqueous solution. The experimentally determined geometrical constraints were compared to results obtained from force-field calculations based on Metropolis Monte Carlo simulations. The molecule was found to exist in 2 families of conformers. The preferred conformations of the alpha(2→6)-linkage of the trisaccharide are best described by an equilibrium of 2 conformers with phi angles at -60 degrees or 180 degrees and of the 3 staggered rotamers of the omega angle with a predominant gt conformer. Three intramolecular hydrogen bonds, involving the hydroxyl protons on C8 and C7 of the sialic acid residue and on C3 of the reducing-end glucose residue, contribute significantly to the conformational stability of the trisaccharide in aqueous solution.

  11. Skewness and kurtosis of height distribution of thin films simulated by larger curvature model with noise reduction techniques (United States)

    Disrattakit, P.; Chanphana, R.; Chatraphorn, P.


    Time-varying skewness (S) and kurtosis (Q) of the height distribution of the (2 + 1)-dimensional larger curvature (LC) model with and without noise reduction techniques (NRTs) are investigated in both the transient and steady state regimes. In this work, the effects of the multiple hit NRT (m > 1 NRT) and the long surface diffusion length NRT (ℓ > 1 NRT) on the surface morphologies and the characteristics of S and Q are studied. In the early growth time, plots of S and Q versus time for the m > 1 morphologies show pronounced oscillations, indicating layer by layer growth. Our results show that S = 0 and Q < 0 at every complete layer. The results are confirmed by the same plots of the results from the Das Sarma-Tamborenea (DT) model. The ℓ > 1 LC model, on the other hand, shows no evidence of the layer by layer growth mode, due to the rapidly damped oscillation of S and Q. In the steady state, the m > 1 and ℓ > 1 NRTs only weakly affect the values of S and Q and the mounded morphologies of the film. This leads to evidence of universality of S and Q in the steady state of the LC models with various m and ℓ. The finite size effect on the values of S and Q is found to be very weak in the LC model. By extrapolating to L → ∞, we obtain SL→∞ ≈ 0.05 and QL→∞ ≈ -0.62, which are in agreement with the NRT results.
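
    The observables S and Q above are the third and fourth standardized central moments of the height distribution. A minimal sketch of how they are computed from a height array, using a Gaussian random surface as a stand-in for an actual LC-model film (for which Q would be negative in the steady state):

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis

    def height_moments(h):
        """Skewness S and excess kurtosis Q of the height distribution,
        computed from central moments as in kinetic-roughening studies."""
        d = h - h.mean()
        w2 = np.mean(d**2)               # interface width squared
        S = np.mean(d**3) / w2**1.5
        Q = np.mean(d**4) / w2**2 - 3.0  # excess kurtosis: 0 for a Gaussian
        return S, Q

    # Toy stand-in for a simulated (2+1)-dimensional film surface; a real
    # LC-model surface would come from the growth simulation itself.
    rng = np.random.default_rng(1)
    heights = rng.normal(loc=10.0, scale=2.0, size=(128, 128))

    S, Q = height_moments(heights.ravel())
    print(S, Q)   # both near 0 for Gaussian-distributed heights
    ```

    Tracking S(t) and Q(t) over deposition time is then what reveals the layer-by-layer oscillations and the steady-state values quoted in the abstract.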

  12. Radar image enhancement and simulation as an aid to interpretation and training (United States)

    Frost, V. S.; Stiles, J. A.; Holtzman, J. C.; Dellwig, L. F.; Held, D. N.


    Greatly increased activity in the field of radar image applications in the coming years demands that techniques of radar image analysis, enhancement, and simulation be developed now. Since the statistical nature of radar imagery differs from that of photographic imagery, one finds that the required digital image processing algorithms (e.g., for improved viewing and feature extraction) differ from those currently existing. This paper addresses these problems and discusses work at the Remote Sensing Laboratory in image simulation and processing, especially for systems comparable to the formerly operational SEASAT synthetic aperture radar.

  13. Realistic training scenario simulations and simulation techniques (United States)

    Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.


    In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.

  14. Polar and non-polar organic binder characterization in Pompeian wall paintings: comparison to a simulated painting mimicking an "a secco" technique. (United States)

    Corso, Gaetano; Gelzo, Monica; Sanges, Carmen; Chambery, Angela; Di Maro, Antimo; Severino, Valeria; Dello Russo, Antonio; Piccioli, Ciro; Arcari, Paolo


    The use of Fourier transform infrared spectromicroscopy and mass spectrometry (MS) allowed us to characterize the composition of polar and non-polar binders present in sporadic wall paint fragments taken from Pompeii's archaeological excavation. The analyses of the polar and non-polar binder components extracted from paint powder layer showed the presence of amino acids, sugars, and fatty acids but the absence of proteinaceous material. These results are consistent with a water tempera painting mixture composed of pigments, flours, gums, and oils and are in agreement with those obtained from a simulated wall paint sample made for mimicking an ancient "a secco" technique. Notably, for the first time, we report the capability to discriminate by tandem MS the presence of free amino acids in the paint layer.

  15. Eye Redness Image Processing Techniques (United States)

    Adnan, M. R. H. Mohd; Zain, Azlan Mohd; Haron, Habibollah; Alwee, Razana; Zulfaezal Che Azemin, Mohd; Osman Ibrahim, Ashraf


    The use of photographs for the assessment of ocular conditions has been suggested as a way to further standardize clinical procedures, but the selection of the photographs to serve as scale reference images has been subjective. Numerous methods have been proposed to assign eye redness scores by computational methods, and image analysis techniques have been investigated over the last 20 years in an attempt to forgo subjective grading scales. Image segmentation is one of the most important and challenging problems in image processing. This paper briefly outlines image processing in general and the implementation of image segmentation for eye redness assessment.
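    A computational redness score of the kind surveyed above can be sketched as a simple threshold-based segmentation: count the pixels whose red channel dominates the other channels. This is only an illustrative sketch of the idea, not any of the paper's reviewed methods; the threshold value and the toy image are invented.

```python
import numpy as np

def redness_score(rgb, threshold=0.15):
    """Score eye redness as the fraction of pixels whose red channel
    exceeds the mean of the green and blue channels by more than
    `threshold`. rgb: float array (H, W, 3) with values in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    relative_red = r - 0.5 * (g + b)     # how much red dominates
    mask = relative_red > threshold      # segmented "red" pixels
    return float(mask.mean())

# Toy example: a half-white, half-reddish image.
img = np.ones((10, 10, 3))
img[:, 5:] = [0.9, 0.4, 0.4]             # reddish right half
print(round(redness_score(img), 2))      # -> 0.5
```

    Real grading systems would first segment the conjunctival region and calibrate the threshold against clinician-assigned scores; here the segmentation is global for brevity.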

  16. Preferences for photographic art among hospitalized patients with cancer. (United States)

    Hanson, Hazel; Schroeter, Kathryn; Hanson, Andrew; Asmus, Kathryn; Grossman, Azure


    To determine the preferences of patients with cancer for viewing photographic art in an inpatient hospital setting and to evaluate the impact of viewing photographic art. Quantitative, exploratory, single-group, post-test descriptive design incorporating qualitative survey questions. An academic medical center in the midwestern United States. 80 men (n = 44) and women (n = 36) aged 19-85 years (X = 49) and hospitalized for cancer treatment. Participants viewed photographs via computers and then completed a five-instrument electronic survey. Fatigue, quality of life, performance status, perceptions of distraction and restoration, and content categories of photographs. Ninety-six percent of participants enjoyed looking at the study photographs. The photographs they preferred most often were lake sunset (76%), rocky river (66%), and autumn waterfall (66%). The most rejected photographs were amusement park (54%), farmer's market vegetable table (51%), and kayakers (49%). The qualitative categories selected were landscape (28%), animals (15%), people (14%), entertainment (10%), imagery (10%), water (7%), spiritual (7%), flowers (6%), and landmark (3%). Some discrepancy between the quantitative and qualitative sections may be related to participants considering water to be a landscape. The hypothesis that patients' preferences for a category of photographic art are affected by the psychophysical and psychological qualities of the photographs, as well as the patients' moods and characteristics, was supported. Nurses can play an active role in helping patients deal with the challenges of long hospital stays and life-threatening diagnoses through distraction and restoration interventions such as viewing photographic images of nature. Nurses can use photographic imagery to provide a restorative intervention during the hospital experience. Photographic art can be used as a distraction from the hospital stay and the uncertainty of a cancer diagnosis. Having patients view

  17. Investigation into the interaction of losartan with human serum albumin and glycated human serum albumin by spectroscopic and molecular dynamics simulation techniques: A comparison study. (United States)

    Moeinpour, Farid; Mohseni-Shahri, Fatemeh S; Malaekeh-Nikouei, Bizhan; Nassirli, Hooriyeh


    The interaction between losartan and human serum albumin (HSA), as well as its glycated form (gHSA), was studied by multiple spectroscopic techniques and molecular dynamics simulation under physiological conditions. The binding information, including the binding constants, effective quenching constants, and number of binding sites, showed that the binding affinity of losartan for HSA was higher than for gHSA. Three-dimensional fluorescence spectra demonstrated that the binding of losartan to HSA and gHSA alters the protein conformation. The distances between the Trp residue and the binding sites of the drug were evaluated on the basis of the Förster theory, indicating that non-radiative energy transfer from HSA and gHSA to losartan occurs with high probability. The protein secondary and tertiary structure changes in HSA and gHSA were compared by molecular dynamics simulation to clarify the experimental results. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
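    The Förster-theory distance estimate used in studies like this follows a standard relation between transfer efficiency and donor-acceptor distance. The sketch below is a generic illustration of that relation; the Förster radius value is an invented placeholder, not a quantity from this paper.

```python
def fret_distance(efficiency, r0):
    """Donor-acceptor distance r from FRET efficiency E and Forster
    radius R0, using E = 1 / (1 + (r/R0)**6), i.e.
    r = R0 * (1/E - 1)**(1/6)."""
    return r0 * (1.0 / efficiency - 1.0) ** (1.0 / 6.0)

def fret_efficiency(r, r0):
    """Inverse relation: transfer efficiency at distance r."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# At 50% transfer efficiency the distance equals the Forster radius.
r0 = 2.5  # nm, illustrative value only
print(fret_distance(0.5, r0))  # -> 2.5
```
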

  18. Comprehensive Peptide Ion Structure Studies Using Ion Mobility Techniques: Part 1. An Advanced Protocol for Molecular Dynamics Simulations and Collision Cross-Section Calculation (United States)

    Ghassabi Kondalaji, Samaneh; Khakinejad, Mahdiar; Tafreshian, Amirmahdi; J. Valentine, Stephen


    Collision cross-section (CCS) measurements with a linear drift tube have been utilized to study the gas-phase conformers of a model peptide (acetyl-PAAAAKAAAAKAAAAKAAAAK). Extensive molecular dynamics (MD) simulations have been conducted to derive an advanced protocol for the generation of a comprehensive pool of in-silico structures; both higher-energy and more thermodynamically stable structures are included to provide an unbiased sampling of conformational space. MD simulations at 300 K are applied to the in-silico structures to more accurately describe the gas-phase transport properties of the ion conformers, including their dynamics. Different methods used previously for trajectory method (TM) CCS calculation employing the Mobcal software [1] are evaluated, and a new method for accurate CCS calculation is proposed based on clustering and data mining techniques. CCS values are calculated for all in-silico structures, and those with matching CCS values are chosen as candidate structures. With this approach, more than 300 candidate structures with significant structural variation are produced. Although no final gas-phase structure is proposed here, in a second installment of this work gas-phase hydrogen-deuterium exchange data will be utilized as a second criterion to select among these structures and to propose relative populations for these ion conformers. Here, the need for greater conformer diversity and for accurate CCS calculation is demonstrated, and the advanced methods are discussed.
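    The candidate-selection step described above (keeping in-silico structures whose computed CCS matches the measurement) can be sketched as a tolerance filter. This is a minimal illustration of the matching idea only; the structure names, CCS values, and 2% tolerance are assumptions, not the authors' protocol.

```python
def select_candidates(structures, measured_ccs, tolerance=0.02):
    """Keep in-silico structures whose computed CCS matches the
    measured value within a relative tolerance (2% by default).

    structures: list of (structure_id, computed_ccs) pairs."""
    return [sid for sid, ccs in structures
            if abs(ccs - measured_ccs) / measured_ccs <= tolerance]

# Hypothetical pool of conformers with computed CCS values (A^2).
pool = [("helix_1", 391.0), ("globule_7", 352.4), ("partial_3", 388.5)]
print(select_candidates(pool, measured_ccs=390.0))  # -> ['helix_1', 'partial_3']
```
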

  19. Simulation techniques of medium or small sized coal mining; Sistema de simulacion de labores de minas subterraneas de carbon de tamano mediano o pequeno

    Energy Technology Data Exchange (ETDEWEB)



    Usually an underground mining company designs new production systems in order to reach higher productivity levels, but there is always some uncertainty about reaching the planned figures. This research project, called Project SimCAR, applies to medium or small sized coal mining companies and aims to become a powerful tool to help technical staff answer questions about how future systems will behave. The project's paradigm is to use computer simulation techniques to diminish the uncertainty implicit in coal mining activities, which in turn yields the best possible scenarios for real economic investments in mine planning. The final programs we have built help technical staff to: 1. Study existing systems in depth; the precision of the resulting model depends exclusively on the correctness of the input data and a thorough understanding of the system's logic processes. 2. Apply changes to the input system variables of the simulated models, so that technical staff can learn how the system will react under various conditions. 3. Introduce new strategies into the model construction in order to achieve full optimization from both productive and economic viewpoints. (Author)
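    The kind of stochastic production simulation such a tool performs can be sketched in a few lines: sample haulage cycle times and accumulate output until the shift's time budget is exhausted. All parameters below (cycle time distribution, shift length, payload) are invented for illustration and are not SimCAR's model.

```python
import random

def simulate_shift(n_cycles, mean_cycle_min, sd_cycle_min, tons_per_cycle,
                   shift_min=480, seed=None):
    """Estimate tons hauled in one shift by sampling haulage cycle times
    from a normal distribution (truncated at 1 minute); the shift ends
    when the time budget is exhausted."""
    rng = random.Random(seed)
    elapsed, tons = 0.0, 0.0
    for _ in range(n_cycles):
        cycle = max(1.0, rng.gauss(mean_cycle_min, sd_cycle_min))
        if elapsed + cycle > shift_min:
            break
        elapsed += cycle
        tons += tons_per_cycle
    return tons

# With zero variability the result is deterministic: 480 / 30 = 16 cycles.
print(simulate_shift(n_cycles=100, mean_cycle_min=30, sd_cycle_min=0,
                     tons_per_cycle=12, seed=1))  # -> 192.0
```

    Running many seeded replications with nonzero variance gives the distribution of daily output, which is exactly the uncertainty estimate the abstract describes.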

  20. Comprehensive Peptide Ion Structure Studies Using Ion Mobility Techniques: Part 1. An Advanced Protocol for Molecular Dynamics Simulations and Collision Cross-Section Calculation. (United States)

    Ghassabi Kondalaji, Samaneh; Khakinejad, Mahdiar; Tafreshian, Amirmahdi; J Valentine, Stephen


    Collision cross-section (CCS) measurements with a linear drift tube have been utilized to study the gas-phase conformers of a model peptide (acetyl-PAAAAKAAAAKAAAAKAAAAK). Extensive molecular dynamics (MD) simulations have been conducted to derive an advanced protocol for the generation of a comprehensive pool of in-silico structures; both higher-energy and more thermodynamically stable structures are included to provide an unbiased sampling of conformational space. MD simulations at 300 K are applied to the in-silico structures to more accurately describe the gas-phase transport properties of the ion conformers, including their dynamics. Different methods used previously for trajectory method (TM) CCS calculation employing the Mobcal software [1] are evaluated, and a new method for accurate CCS calculation is proposed based on clustering and data mining techniques. CCS values are calculated for all in-silico structures, and those with matching CCS values are chosen as candidate structures. With this approach, more than 300 candidate structures with significant structural variation are produced. Although no final gas-phase structure is proposed here, in a second installment of this work gas-phase hydrogen-deuterium exchange data will be utilized as a second criterion to select among these structures and to propose relative populations for these ion conformers. Here, the need for greater conformer diversity and for accurate CCS calculation is demonstrated, and the advanced methods are discussed.

  1. Use of stimulus-response techniques to simulate diagnostics in the human esophagus; Uso de tecnicas estimulo-respuesta para simular diagnosticos en esofago humano

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, I.; Gonzalez, Y.; Valdes, L.; Alfonso, J.A.; Estevez, E. [Facultad de Quimica Farmacia, Universidad Central de Las Villas (Cuba)


    This work presents a simulation study of the gamma-graphic studies of the human esophagus carried out in the Department of Nuclear Medicine of the 'Celestino Hernandez Robau' Hospital of Santa Clara. Tubular reactors were used, and stimulus-response techniques with a radioactive tracer of metastable technetium-99 were applied at an activity of 1 mCi and at several flow rates. The residence time distribution curves were obtained; they follow an equation of the type Y = A + B exp(-exp(-(x-C)/D) - (x-C)/D + 1). Optimization studies were also carried out on the tracer dose given to patients, from 1 mCi (the activity used in clinical studies) down to 0.5 mCi, and the influence on the obtained residence time distributions was analyzed. It was confirmed that the dose can be lowered while preserving a clear signal. A simulation was also performed of the attenuation of the radiation that occurs in patients due to the tissues interposed between the analyzed organ and the detection equipment, using paraffin to simulate tissue. The intensity of the radiation was found to be almost independent of the thickness for the assayed doses. Finally, a mathematical model was found that reproduces the diagnostic curves obtained in these studies, with the model coefficients correlated with the most important physical parameters of the system, giving the model practical value, since the error between its predictions and the experimental values does not exceed 5%. (Author)
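    The fitted residence-time distribution equation, as reconstructed above, is an extreme-value peak function: baseline A, amplitude B, peak position C, and width D. The sketch below evaluates that model; the parameter values are illustrative, not the study's fitted coefficients.

```python
import math

def rtd_model(x, a, b, c, d):
    """Residence-time distribution curve of the form
    Y = A + B * exp(-exp(-(x - C)/D) - (x - C)/D + 1),
    an extreme-value peak with baseline A, amplitude B,
    peak position C, and width D."""
    z = (x - c) / d
    return a + b * math.exp(-math.exp(-z) - z + 1)

# The curve peaks at x = C with height A + B.
a, b, c, d = 0.1, 2.0, 12.0, 3.0
print(round(rtd_model(c, a, b, c, d), 6))                       # -> 2.1
print(rtd_model(c - d, a, b, c, d) < rtd_model(c, a, b, c, d))  # -> True
```

    In practice the four parameters would be fitted to the measured tracer response with a nonlinear least-squares routine.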

  2. Dust Plate, Retina, Photograph: Imaging on Experimental Surfaces in Early Nineteenth-Century Physics. (United States)

    Ramalingam, Chitra


    This article explores the entangled histories of three imaging techniques in early nineteenth-century British physical science, techniques in which a dynamic event (such as a sound vibration or an electric spark) was made to leave behind a fixed trace on a sensitive surface. Three categories of "sensitive surface" are examined in turn: first, a metal plate covered in fine dust; second, the retina of the human eye; and finally, a surface covered with a light-sensitive chemical emulsion (a photographic plate). For physicists Michael Faraday and Charles Wheatstone, and photographic pioneer William Henry Fox Talbot, transient phenomena could be studied through careful observation and manipulation of the patterns wrought on these different surfaces, and through an understanding of how the imaging process unfolded through time. This exposes the often-ignored materiality and temporality of epistemic practices around nineteenth-century scientific images said to be "drawn by nature."

  3. Apollo Lunar Sample Photograph Digitization Project Update (United States)

    Todd, N. S.; Lofgren, G. E.


    This is an update on the progress of a 4-year data restoration project, funded by the LASER program and undertaken by the Astromaterials Acquisition and Curation Office at JSC, to digitize photographs of the Apollo lunar rock samples and create high-resolution digital images [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos, including the Lunar Sample Catalog and Photo Database [2] and the Apollo sample data files for GoogleMoon [3].

  4. Digital image forensics for photographic copying (United States)

    Yin, Jing; Fang, Yanmei


    Image display technology has greatly developed over the past few decades, making it possible to recapture high-quality images from a display medium such as a liquid crystal display (LCD) screen or a printed paper. Recaptured images are not regarded as a separate image class in current digital image forensics research, yet their content may have been tampered with. In this paper, two sets of features, based on noise and on the traces of double JPEG compression, are proposed to identify these recaptured images. Experimental results showed that our proposed features perform well for detecting photographic copying.
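    Noise-based features of the general kind described can be sketched as statistics of a high-pass residual: recapturing through a screen or print tends to change an image's fine-grained noise. This is an illustrative sketch only, not the authors' feature set.

```python
import numpy as np

def noise_features(gray):
    """Simple noise-residual features for a grayscale image in [0, 1]:
    apply a 3x3 Laplacian-style high-pass filter (via explicit shifts,
    no SciPy needed) and summarize the residual."""
    res = (4 * gray[1:-1, 1:-1]
           - gray[:-2, 1:-1] - gray[2:, 1:-1]
           - gray[1:-1, :-2] - gray[1:-1, 2:])
    return {"mean_abs": float(np.abs(res).mean()),
            "std": float(res.std())}

# A perfectly flat image has zero high-frequency residual.
flat = np.full((8, 8), 0.5)
print(noise_features(flat))  # -> {'mean_abs': 0.0, 'std': 0.0}
```

    A classifier would be trained on such statistics (plus, per the paper, double-JPEG-compression traces) computed from known original and recaptured images.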

  5. iPad for digital photographers

    CERN Document Server

    Story, Derrick


    Discover innovative ways to strengthen your photography business with your iPad Many photographers are turning to the flexible, easy-to-use tools of the iPad and relying on them to wear a variety of hats in their photography business. Whether portable portfolio, editing tool, payment-tracking system, or appointment calendar, the iPad melds together the best attributes of the cell phone and a laptop computer and this unique book highlights them all. With this helpful resource, you'll learn how to get the most out of your iPad to not only improve your business but also enhance your p

  6. Using super-resolution technique to elucidate the effects of imaging resolution on transport properties resulting from pore-scale modelling simulations (United States)

    Karsanina, Marina; Gerke, Kirill; Khirevich, Siarhei; Sizonenko, Timofey; Korost, Dmitry


    Permeability is one of the fundamental properties of porous media and is required for large-scale Darcian fluid flow and mass transport models. Whilst permeability can be directly measured at a range of scales, there are increasing opportunities to evaluate permeability from pore-scale simulations. It is well known that the single-phase flow properties of digital rocks depend on the resolution of the 3D pore image. Such studies are usually performed by coarsening X-ray microtomography scans. Recently we proposed a novel approach to fuse multi-scale porous media images using stochastic reconstruction techniques based on directional correlation functions. Here we apply a slightly modified version of this approach to create 3D pore images of different spatial resolution, i.e. a stochastic super-resolution method. Contrary to coarsening techniques, this approach preserves porosity values and allows fine-scale data from imaging techniques such as SEM or FIB-SEM to be incorporated. We compute the absolute permeability of the same porous media samples at different resolutions, using lattice-Boltzmann and finite difference methods to model Stokes flow, in order to elucidate the effects of image resolution on the resulting permeability values, and we compare the stochastic super-resolution technique against the conventional coarsening image processing technique. References: 1) Karsanina, M.V., Gerke, K.M., Skvortsova, E.B. and Mallants, D. (2015). Universal spatial correlation functions for describing and reconstructing soil microstructure. PLoS ONE 10(5), e0126515. 2) Gerke, K.M., & Karsanina, M.V. (2015). Improving stochastic reconstructions by weighting correlation functions in an objective function. EPL (Europhysics Letters), 111(5), 56002. 3) Gerke, K.M., Karsanina, M.V., Vasilyev, R.V., & Mallants, D. (2014). Improving pattern reconstruction using directional correlation functions. EPL (Europhysics Letters), 106(6), 66002. 4) Gerke, K.M., Karsanina, M.V., Mallants, D., 2015. Universal
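    The directional correlation functions underpinning such stochastic reconstructions can be illustrated with the two-point probability function S2(r): the probability that two pixels separated by lag r both belong to the pore phase. The sketch below computes S2 along one axis of a 2D binary image; it is a minimal illustration, not the authors' reconstruction code.

```python
import numpy as np

def s2_along_x(image, max_lag):
    """Two-point probability function S2(r) of the pore phase (value 1)
    along the x axis: probability that two pixels separated by r are
    both pore. Note S2(0) equals the porosity."""
    img = np.asarray(image, dtype=float)
    return [float((img[:, :img.shape[1] - r] * img[:, r:]).mean())
            for r in range(max_lag + 1)]

# Alternating stripes of pore (1) and solid (0): porosity 0.5.
stripes = np.tile([1, 0], (4, 4))   # shape (4, 8)
s2 = s2_along_x(stripes, 2)
print([round(v, 2) for v in s2])    # -> [0.5, 0.0, 0.5]
```

    Computing S2 in several directions gives the directional correlation functions used as targets in the reconstruction's objective function.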

  7. Online Study of Melanoma Identification: The Roles of ABC Information and Photographic Examples in Lesion Discrimination


    Cornell, Ella


    Public education campaigns designed to increase awareness about malignant melanoma, the most fatal form of skin cancer, currently use written criteria (ABCD) to describe its common features, but an increasing body of evidence has suggested that the public would benefit more from the use of photographic examples of lesions. This study explored possible public education techniques to optimize laypeople’s recognition of melanoma through an online melanoma identification task. We were particularl...

  8. Analysis of Sasang constitutional types using facial features with compensation for photographic distance

    Directory of Open Access Journals (Sweden)

    Jun-Hyeong Do


    Conclusion: It is noted that the significant facial features represent common characteristics of each SC type in the sense that we collected extensive opinions from many Sasang constitutional medicine doctors with various points of view. Additionally, a compensation method for the photographic distance is needed to find the significant facial features. We expect these findings and the related compensation technique to contribute to establishing a scientific basis for the precise diagnosis of SC types in clinical practice.



    TEMEL, Cenk


    The purpose of this study is to determine the general characteristics of physical education classes in the early republican period in Turkey by examining photographs taken at schools in various cities at that time. The document analysis technique, one of the qualitative research methods, has been adopted in the study. The data of the study were obtained from 37 photographs of physical education classes, preserved in the Ministry of National Education, General Administration of Inn...

  10. Piloted Aircraft Environment Simulation Techniques (United States)


    model of the KC-135 tanker aircraft. ...approach trajectory, he will control speed with pitch attitude, and sink rate with power. More rapid changes can be achieved by reversing the control...thrust reversers (or parachutes) may be of most concern, coupled with runway conditions (water or ice) and crosswinds. Failures and asymmetries will...

  11. Use of human patient simulation and validation of the Team Situation Awareness Global Assessment Technique (TSAGAT): a multidisciplinary team assessment tool in trauma education. (United States)

    Crozier, Michael S; Ting, Heather Y; Boone, Darrell C; O'Regan, Noel B; Bandrauk, Nathalie; Furey, Andrew; Squires, Cynthia; Hapgood, Joanne; Hogan, Michael P


    Situation awareness (SA) is a vital construct for decision making in intense, dynamic environments such as trauma resuscitation. Human patient simulation (HPS) allows for a safe environment where individuals can develop these skills. Trauma resuscitation is performed by multidisciplinary teams that are traditionally difficult to assess globally. Our objective was to create and validate a novel tool to measure SA in multidisciplinary trauma teams using HPS: the Team Situation Awareness Global Assessment Technique (TSAGAT). Memorial University Simulation Centre. Using HPS, 4 trauma teams completed 2 separate trauma scenarios. Student, junior resident, senior resident, and attending staff teams each had 3 members (trauma team leader, nurse, and airway manager). Individual SAGATs were developed by experts in each respective field and contained shared and complementary knowledge questions. Teams were assessed with SAGAT in real time and with traditional checklists using video review. TSAGAT was calculated as the sum of the individual SAGAT scores and was compared with the traditional checklist scores. Shared, complementary, and TSAGAT scores improved with increasing team experience. Differences between teams for TSAGAT and complementary knowledge were statistically significant, and shared knowledge differences between teams also reached statistical significance (p < 0.05). TSAGAT scores correlated strongly with traditional checklist scores (Pearson correlation r = 0.996). Interrater reliability for the checklist tool was high (Pearson correlation r = 0.937). TSAGAT is the first valid and reliable assessment tool incorporating SA and HPS for multidisciplinary team performance in trauma resuscitation. TSAGAT could complement or improve on current assessment methods and curricula in trauma and critical care, and it provides a template for team assessment in other areas of surgical education. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
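    The strong agreement reported between TSAGAT and checklist scores rests on the Pearson product-moment correlation, which can be computed as below. The team scores here are invented for illustration; only the formula matches the statistic used in the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient:
    covariance of x and y divided by the product of their
    standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical TSAGAT vs. checklist scores for four teams.
tsagat = [42, 55, 68, 81]
checklist = [20, 26, 33, 40]
print(round(pearson_r(tsagat, checklist), 3))  # -> 0.999
```
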

  12. The function of photographs in the relations between a teacher and children : A photograph is a symbolic artifact


    滝川, 弘人


    An old photograph is a trigger for a person's reminiscences. The aim of this study is to develop viewpoints for a real life history. I compare a story with a movie scenario based on that story; the function of photographs runs through both the story and the movie. I identify intentions in order to get at the real meaning of the functions of photographs, and I obtain an index for investigating an old teacher who tells his own life story. An old photograph is a symbolic artifact for a retired teacher's life.

  13. Non-mydriatic digital macular photography: how good is the second eye photograph? (United States)

    Aung, Khin Zaw; Robman, Luba; Chong, Elaine W T; English, Dallas R; Giles, Graham G; Guymer, Robyn H


    In an elderly Australian population, to evaluate the quality of fundus photographs taken non-mydriatically in both eyes, and to compare the quality of those taken second with those taken first. From 2258 participants (4516 images) aged 70 years and older who participated in the Melbourne Collaborative Cohort Study (MCCS), digital non-stereoscopic 45 degree retinal photographs were taken with a Canon CR6-45NM Non-mydriatic Retinal Camera and evaluated. The quality of the macular images was assessed as good, fair, or poor, and McNemar's test was used to analyze variation in quality. Images of gradable quality were obtained from 95.8% of participants' eyes, with 93.9% of participants having gradable photos of both eyes. The gradable rate for the eye photographed first (right) was significantly higher than that for the eye photographed second (left): 89.7% vs. 85.6%, respectively (difference of 4.12%, confidence interval [CI] of 2.68-5.54%). The proportion of poor-quality photographs from the second eye was slightly greater than from the first eye (4.5% and 3.8%, respectively), but the difference in proportion was not statistically significant (difference of 3.6%, CI of 0.17-1.5%, p = 0.384). In the setting of a large elderly cohort study, non-dilated 45 degree digital retinal imaging is an excellent method for fundus examination. It is fast, easy to use, non-invasive, and a reliable AMD (age-related macular degeneration)-detecting technique, with only a minor loss of information from the second eye.
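    The paired first-eye/second-eye quality comparison uses McNemar's test, which depends only on the discordant pairs. The sketch below computes the chi-square statistic with continuity correction; the discordant-pair counts are invented, not taken from the study.

```python
def mcnemar_statistic(b, c):
    """McNemar chi-square statistic with continuity correction for
    paired binary outcomes; b and c are the discordant-pair counts
    (e.g. first eye gradable / second eye not, and vice versa)."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# Hypothetical discordant counts: 120 pairs where only the first eye
# was gradable, 45 where only the second was.
stat = mcnemar_statistic(120, 45)
print(round(stat, 2))  # -> 33.19 (compare to the chi-square_1 critical value 3.84)
```
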

  14. Low-altitude aerial color digital photographic survey of the San Andreas Fault (United States)

    Lynch, David K.; Hudnut, Kenneth W.; Dearborn, David S.P.


    southeast leg and 300 m AGL on the northwest leg. Spatial resolution (pixel size or ground sample distance) is a few centimeters. The time and geographic coordinates of the aircraft were automatically written into the exchangeable image file format (EXIF) data within each JPEG photograph. A few hours after acquisition and validation, the photographs were uploaded to a publicly accessible Web page. The goal was to obtain quick-turnaround, low-cost, high-resolution, overlapping, and contiguous imagery for use in planning field operations, and to provide imagery for a wide variety of land use and educational studies. This work was carried out in support of ongoing geological research on the San Andreas fault, but the technique is widely applicable beyond geology.

  15. Silver recovery from spent photographic solutions

    Energy Technology Data Exchange (ETDEWEB)

    Kunda, W.; Etsell, T.H.


    A process is disclosed for recovering silver sulfide from a silver-containing spent photographic fixer solution. The process is particularly suited to treating the fixer solution in a manner that enables recycling of the solution, and it avoids the inefficiencies and high costs of alternative processes such as electrolysis. The process comprises introducing hydrosulfide ion into the silver-containing spent fixer solution to precipitate silver sulfide; the resulting precipitate is isolated from the fixer solution in order to remove silver from the solution. The process is especially suited to treating spent solutions that contain thiosulfate, in particular sodium or ammonium thiosulfate. The preferred hydrosulfide is either sodium or ammonium hydrosulfide for precipitating silver in the form of silver sulfide. The quantity of hydrosulfide used is greater than 0.5 mole per mole of silver in the fixer solution. Experiments are described to illustrate the process of the invention. 3 figs., 5 tabs.

  16. Predictive modeling, simulation, and optimization of laser processing techniques: UV nanosecond-pulsed laser micromachining of polymers and selective laser melting of powder metals (United States)

    Criales Escobar, Luis Ernesto

    One of the most rapidly evolving areas of research is the utilization of lasers for micro-manufacturing and additive manufacturing purposes. The use of the laser beam as a manufacturing tool arises from the need for flexible and rapid manufacturing at low-to-mid cost. Laser micro-machining provides an advantage over mechanical micro-machining due to faster production times for large batch sizes and the avoidance of the high costs associated with dedicated tooling. Laser-based additive manufacturing enables the processing of powder metals for direct and rapid fabrication of products. Laser processing can therefore be viewed as a fast, flexible, and cost-effective approach compared to traditional manufacturing processes. Two types of laser processing techniques are studied: laser ablation of polymers for micro-channel fabrication and selective laser melting of metal powders. Initially, a feasibility study for laser-based micro-channel fabrication of poly(dimethylsiloxane) (PDMS) via experimentation is presented. In particular, the effectiveness of utilizing a nanosecond-pulsed laser as the energy source for laser ablation is studied. The results are analyzed statistically and a relationship between process parameters and micro-channel dimensions is established. Additionally, a process model is introduced for predicting channel depth, and model outputs are compared against experimental results. The second part of this research focuses on a physics-based FEM approach for predicting the temperature profile and melt pool geometry in selective laser melting (SLM) of metal powders. Temperature profiles are calculated for a moving laser heat source to understand the temperature rise due to heating during SLM. Based on the predicted temperature distributions, the melt pool geometry, i.e. the locations at which melting of the powder material occurs, is determined. Simulation results are compared against data obtained from experimental Inconel 625 test coupons fabricated at the National
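    The moving-heat-source temperature field that such FEM models resolve is classically approximated by the Rosenthal point-source solution, which captures the characteristic hot trailing melt pool. The sketch below evaluates that analytical solution; the material parameters are rough illustrative magnitudes, not the dissertation's FEM inputs.

```python
import math

def rosenthal_temperature(xi, y, z, power, speed, k, alpha, t0=293.0):
    """Steady-state Rosenthal solution for a moving point heat source:
    T = T0 + Q / (2*pi*k*R) * exp(-v*(R + xi) / (2*alpha)),
    where xi is the distance ahead of the source along the travel
    direction, R the distance to the source, k the thermal
    conductivity, and alpha the thermal diffusivity (SI units)."""
    r = math.sqrt(xi ** 2 + y ** 2 + z ** 2)
    return t0 + power / (2 * math.pi * k * r) * math.exp(
        -speed * (r + xi) / (2 * alpha))

# Illustrative parameters only (roughly nickel-alloy magnitude).
p, v, k, alpha = 100.0, 0.5, 10.0, 3e-6
ahead = rosenthal_temperature(1e-4, 0, 0, p, v, k, alpha)
behind = rosenthal_temperature(-1e-4, 0, 0, p, v, k, alpha)
print(behind > ahead)  # -> True: the trailing melt pool is hotter
```

    Comparing where this temperature field exceeds the alloy's melting point gives a first estimate of melt pool size, the same quantity the FEM model predicts with powder-bed physics included.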

  17. Photographs and Classroom Response Systems in Middle School Astronomy Classes (United States)

    Lee, Hyunju; Feldman, Allan


    In spite of being readily available, photographs have played a minor and passive role in science classes. In our study, we present an active way of using photographs in classroom discussions with the use of a classroom response system (CRS) in middle school astronomy classes to teach the concepts of day-night and seasonal change. In this new…

  18. Using Photographs and Diagrams to Test Young Children's Mass Thinking (United States)

    Cheeseman, Jill; McDonough, Andrea


    This paper reports the results of a pencil-and-paper test developed to assess young children's understanding of mass measurement. The innovative element of the test was its use of photographs. We found many children of the 295 6-8 year-old children tested could "read" the photographs and diagrams and recognise the images as…

  19. 44 CFR 15.12 - Photographs and other depictions. (United States)


    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Photographs and other depictions. 15.12 Section 15.12 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY... other depictions at the NETC. (1) Photographs may be taken inside classroom or office areas of the NETC...

  20. Photographic assessment of burn size and depth: reliability and validity

    NARCIS (Netherlands)

    Hop, M.; Moues, C.; Bogomolova, K.; Nieuwenhuis, M.; Oen, I.; Middelkoop, E.; Breederveld, R.; de Baar, M.


    Objective: The aim of this study was to examine the reliability and validity of using photographs of burns to assess both burn size and depth. Method: Fifty randomly selected photographs taken on day 0-1 post burn were assessed by seven burn experts and eight referring physicians. Inter-rater