WorldWideScience

Sample records for automatic mesh generation

  1. Procedure for the automatic mesh generation of innovative gear teeth

    Directory of Open Access Journals (Sweden)

    Radicella Andrea Chiaramonte

    2016-01-01

    Full Text Available After having described gear wheels with teeth having the two sides constituted by different involutes and their importance in engineering applications, we stress the need for an efficient procedure for the automatic mesh generation of innovative gear teeth. First, we describe the procedure for the subdivision of the tooth profile in the various possible cases, then we show the method for creating the subdivision mesh, defined by two series of curves called meridians and parallels. Finally, we describe how the above procedure for automatic mesh generation is able to solve specific cases that may arise when dealing with teeth having the two sides constituted by different involutes.
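
    The construction of the meridian/parallel subdivision is not given in the abstract above. As a loose, hypothetical illustration of a structured mesh defined by two families of parameter curves, the Python sketch below builds a grid by transfinite (Coons) interpolation between four invented boundary curves; the function name coons_patch and the boundary curves are assumptions for the example, not the authors' procedure.

    ```python
    import numpy as np

    def coons_patch(bottom, top, left, right, n_u, n_v):
        """Structured grid from four boundary curves via transfinite interpolation.

        Each argument curve maps a parameter in [0, 1] to an (x, y) point. The grid
        lines of constant u and constant v play the role of the two curve families
        ("meridians" and "parallels") mentioned in the abstract.
        """
        c00, c10 = np.asarray(bottom(0.0)), np.asarray(bottom(1.0))
        c01, c11 = np.asarray(top(0.0)), np.asarray(top(1.0))
        grid = np.zeros((n_u, n_v, 2))
        for i, u in enumerate(np.linspace(0.0, 1.0, n_u)):
            for j, v in enumerate(np.linspace(0.0, 1.0, n_v)):
                lin = ((1 - v) * np.asarray(bottom(u)) + v * np.asarray(top(u))
                       + (1 - u) * np.asarray(left(v)) + u * np.asarray(right(v)))
                bil = ((1 - u) * (1 - v) * c00 + u * (1 - v) * c10
                       + (1 - u) * v * c01 + u * v * c11)
                grid[i, j] = lin - bil
        return grid

    # Invented boundary curves; the top edge bulges to mimic a curved profile.
    bottom = lambda t: (t, 0.0)
    top = lambda t: (t, 1.0 + 0.1 * np.sin(np.pi * t))
    left = lambda s: (0.0, s)
    right = lambda s: (1.0, s)

    mesh = coons_patch(bottom, top, left, right, n_u=15, n_v=10)
    print(mesh.shape)  # (15, 10, 2): node coordinates of the structured mesh
    ```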

  2. Composite structured mesh generation with automatic domain decomposition in complex geometries

    Science.gov (United States)

    This paper presents a novel automatic domain decomposition method to generate quality composite structured meshes in complex domains with arbitrary shapes, in which quality structured mesh generation still remains a challenge. The proposed decomposition algorithm is based on the analysis of an initi...

  3. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluid Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like...

  4. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    Zhang, Yaoxin; Jia, Yafei

    2018-01-01

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluid Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have a hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsulas or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain, in convex polygon shape, can be extracted at each level in an advancing scheme. In this paper, several examples are used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, as well as the implementation of the method.

  5. Applications of automatic mesh generation and adaptive methods in computational medicine

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, J.A.; Macleod, R.S. [Univ. of Utah, Salt Lake City, UT (United States)]; Johnson, C.R.; Eason, J.C. [Duke Univ., Durham, NC (United States)]

    1995-12-31

    Important problems in Computational Medicine exist that can benefit from the implementation of adaptive mesh refinement techniques. Biological systems are so inherently complex that only efficient models running on state-of-the-art hardware can begin to simulate reality. To tackle the complex geometries associated with medical applications, we present a general-purpose mesh generation scheme based upon the Delaunay tessellation algorithm and an iterative point generator. In addition, automatic, two- and three-dimensional adaptive mesh refinement methods are presented that are derived from local and global estimates of the finite element error. Mesh generation and adaptive refinement techniques are utilized to obtain accurate approximations of bioelectric fields within anatomically correct models of the heart and human thorax. Specifically, we explore the simulation of cardiac defibrillation and the general forward and inverse problems in electrocardiography (ECG). Comparisons between uniform and adaptive refinement techniques are made to highlight the computational efficiency and accuracy of adaptive methods in the solution of field problems in computational medicine.
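
    The abstract does not reproduce the scheme itself. As a rough, hypothetical sketch of the general idea (a Delaunay tessellation driven by an iterative point generator), the Python below repeatedly inserts points at the centroids of oversized triangles and re-tessellates until an area tolerance is met; the refinement criterion and the helper refine_delaunay are invented for the example and are not the authors' algorithm.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay  # assumes SciPy is available

    def refine_delaunay(points, max_area, max_iters=50):
        """Iteratively add points at the centroids of oversized triangles, then re-tessellate."""
        pts = np.array(points, dtype=float)
        for _ in range(max_iters):
            tri = Delaunay(pts)
            a, b, c = (pts[tri.simplices[:, k]] for k in range(3))
            areas = 0.5 * np.abs((b[:, 0] - a[:, 0]) * (c[:, 1] - a[:, 1])
                                 - (c[:, 0] - a[:, 0]) * (b[:, 1] - a[:, 1]))
            big = areas > max_area
            if not big.any():
                return pts, tri
            pts = np.vstack([pts, (a[big] + b[big] + c[big]) / 3.0])
        return pts, Delaunay(pts)

    # Coarse initial point set on the unit square; refine until every triangle is small.
    corners = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
    nodes, tess = refine_delaunay(corners, max_area=0.01)
    print(len(nodes), "nodes,", len(tess.simplices), "triangles")
    ```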

  6. Automatic mesh generation for structural analysis of pressure vessels using fuzzy knowledge processing

    International Nuclear Information System (INIS)

    Kado, Kenichiro; Sato, Takuya; Yoshimura, Shinobu; Yagawa, Genki.

    1994-01-01

    This paper describes an automatic mesh generation system for 2D axisymmetric and 3D shell structures based on fuzzy knowledge processing. In this system, an analysis model, i.e. a geometric model, is first defined using a conventional method for 2D structures and a commercial CAD system, Auto-CAD, for 3D shell structures. Nodes are then generated based on the fuzzy knowledge processing technique, carefully controlling the node density distribution over the whole analysis domain. Triangular elements are generated using the Delaunay triangulation technique and are then converted to quadrilateral elements. The fundamental performance of the system is demonstrated through its application to typical components of a pressure vessel. (author)

  7. Evaluation of user-guided semi-automatic decomposition tool for hexahedral mesh generation

    Directory of Open Access Journals (Sweden)

    Jean Hsiang-Chun Lu

    2017-10-01

    Full Text Available Volumetric decomposition is essential for all-hexahedral mesh generation. Because fully automatic decomposition methods that can generate high-quality hexahedral meshes for arbitrary volumes have yet to be realized, manual decomposition is still frequently required. Manual decomposition is a laborious process and requires a high level of user expertise, so a user-guided semi-automatic tool that reduces the human effort and lowers the expertise requirement is necessary. To date, only a few such approaches have been proposed, and a lack of user evaluation makes it difficult to improve upon them. Based on our previous work, we present a user evaluation of a user-guided semi-automatic tool that provides visual guidance to assist users in determining decomposition solutions, accepts sketch-based inputs to create decomposition surfaces, and simplifies the decomposition commands. This user evaluation investigated (1) the usability of the visual guidance, (2) the types of visual guidance essential for decomposition, (3) the effectiveness of the sketch-based decomposition, and (4) the performance differences between beginner and experienced users using the sketch-based decomposition. The results and user feedback indicate that the tool enables users who have limited prior experience or familiarity with computer-aided engineering software to perform volumetric decomposition more efficiently. The visual guidance increases the success rate of the user's decomposition solutions by 28%, and the sketch-based decomposition reduces the user's time spent creating decomposition surfaces and setting up decomposition commands by 46%.

  8. An expert system for automatic mesh generation for Sn particle transport simulation in parallel environment

    International Nuclear Information System (INIS)

    Patchimpattapong, Apisit; Haghighat, Alireza; Shedlock, D.

    2003-01-01

    An expert system for generating an effective mesh distribution for SN particle transport simulation has been developed. This expert system consists of two main parts: 1) an algorithm for generating an effective mesh distribution in a serial environment, and 2) an algorithm for inferring an effective domain decomposition strategy for parallel computing. For the first part, the algorithm prepares an effective mesh distribution considering the problem physics and the spatial differencing scheme. For the second part, the algorithm determines a parallel-performance-index (PPI), defined as the ratio of the granularity to the degree-of-coupling. The parallel-performance-index indicates the expected performance of an algorithm for a given computing environment and set of resources: a large index indicates a high-granularity algorithm with relatively low coupling among processors. This expert system has been successfully tested within the PENTRAN (Parallel Environment Neutral-Particle Transport) code system for simulating real-life shielding problems. (authors)
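
    As a worked illustration of the parallel-performance-index described above, the snippet below evaluates PPI = granularity / degree-of-coupling for two hypothetical domain decompositions; the way granularity and coupling are quantified here (cells owned per processor versus interface cells exchanged) is an assumption, since the abstract does not define them numerically.

    ```python
    def parallel_performance_index(local_cells, interface_cells):
        """PPI = granularity / degree-of-coupling (larger means less communication per unit work)."""
        return local_cells / interface_cells

    # Hypothetical decompositions of a 100 x 100 x 100-cell problem over 8 processors.
    # Granularity ~ cells owned by a processor; coupling ~ cells on shared interfaces.
    candidates = {
        "1x1x8 slabs":  parallel_performance_index(100 * 100 * 13, 2 * 100 * 100),
        "2x2x2 blocks": parallel_performance_index(50 * 50 * 50, 3 * 50 * 50),
    }
    for name, ppi in candidates.items():
        print(f"{name}: PPI = {ppi:.1f}")
    # A larger PPI indicates a high-granularity decomposition with relatively low
    # coupling among processors, which the expert system would favour.
    ```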

  9. Spherical geodesic mesh generation

    Energy Technology Data Exchange (ETDEWEB)

    Fung, Jimmy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Kenamond, Mark Andrew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Burton, Donald E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Shashkov, Mikhail Jurievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2015-02-27

    In ALE simulations with moving meshes, mesh topology has a direct influence on feature representation and code robustness. In three-dimensional simulations, modeling spherical volumes and features is particularly challenging for a hydrodynamics code. Calculations on traditional spherical meshes (such as spin meshes) often lead to errors and symmetry breaking. Although the underlying differencing scheme may be modified to rectify this, the differencing scheme may not be accessible. This work documents the use of spherical geodesic meshes to mitigate solution-mesh coupling. These meshes are generated notionally by connecting geodesic surface meshes to produce triangular-prismatic volume meshes. This mesh topology is fundamentally different from traditional mesh topologies and displays superior qualities such as topological symmetry. This work describes the geodesic mesh topology as well as motivating demonstrations with the FLAG hydrocode.
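
    The abstract does not spell out the construction of the geodesic surface mesh. As a hedged sketch of one standard way to generate such a mesh, the Python below subdivides an icosahedron and projects the new vertices onto the unit sphere; connecting successive surface meshes radially into the triangular-prismatic volume mesh mentioned above is omitted for brevity, and the function name icosphere is invented.

    ```python
    import numpy as np

    def icosphere(subdivisions=2):
        """Geodesic surface mesh: subdivide an icosahedron and project onto the unit sphere."""
        t = (1.0 + 5 ** 0.5) / 2.0
        verts = [(-1, t, 0), (1, t, 0), (-1, -t, 0), (1, -t, 0),
                 (0, -1, t), (0, 1, t), (0, -1, -t), (0, 1, -t),
                 (t, 0, -1), (t, 0, 1), (-t, 0, -1), (-t, 0, 1)]
        verts = [tuple(np.array(v) / np.linalg.norm(v)) for v in verts]
        faces = [(0, 11, 5), (0, 5, 1), (0, 1, 7), (0, 7, 10), (0, 10, 11),
                 (1, 5, 9), (5, 11, 4), (11, 10, 2), (10, 7, 6), (7, 1, 8),
                 (3, 9, 4), (3, 4, 2), (3, 2, 6), (3, 6, 8), (3, 8, 9),
                 (4, 9, 5), (2, 4, 11), (6, 2, 10), (8, 6, 7), (9, 8, 1)]
        cache = {}

        def midpoint(i, j):
            key = (min(i, j), max(i, j))
            if key not in cache:
                m = (np.array(verts[i]) + np.array(verts[j])) / 2.0
                verts.append(tuple(m / np.linalg.norm(m)))   # project onto the sphere
                cache[key] = len(verts) - 1
            return cache[key]

        for _ in range(subdivisions):
            new_faces = []
            for a, b, c in faces:
                ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
                new_faces += [(a, ab, ca), (b, bc, ab), (c, ca, bc), (ab, bc, ca)]
            faces = new_faces
        return np.array(verts), np.array(faces)

    v, f = icosphere(3)
    print(len(v), "vertices,", len(f), "triangles")  # 642 vertices, 1280 triangles
    ```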

  10. Automatic mesh generation for finite element calculations in the case of thermal loads

    International Nuclear Information System (INIS)

    Cords, H.; Zimmermann, R.

    1975-01-01

    The presentation describes a method to generate finite element nodal point networks on the basis of isothermals and flux lines. Such a mesh provides a relatively fine partitioning in regions where pronounced temperature variations exist. In the case of purely thermal loads, a mesh of this kind is advantageous since the refinement is provided at exactly those locations where high stress levels are expected. In the present contribution the method was employed to analyze the structural behavior of a nuclear fuel element under operating conditions. The graphite block fuel elements for high temperature reactors are of prismatic shape with a large number of parallel bores in the axial direction. Some of these bores are open at both ends and cooling is effected by helium flowing through them. Blind holes contain the fuel as compacts or cartridges. The basic temperature distribution in a horizontal section of the block was obtained by the boundary point least squares method, which yields analytical expressions for both temperature and thermal flux; the corresponding computer code was presented at an earlier SMiRT conference. The method is particularly useful for regular arrays of heat sources and sinks as encountered in heat exchanger problems. The generated mesh matches the requirements of a subsequent structural analysis with finite elements, provided there are no loads other than thermal ones.

  11. Automatic geometric modeling, mesh generation and FE analysis for pipelines with idealized defects and arbitrary location

    Energy Technology Data Exchange (ETDEWEB)

    Motta, R.S.; Afonso, S.M.B.; Willmersdorf, R.B.; Lyra, P.R.M. [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil)]; Cabral, H.L.D. [TRANSPETRO, Rio de Janeiro, RJ (Brazil)]; Andrade, E.Q. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)]

    2009-07-01

    Although the Finite Element Method (FEM) has proved to be a powerful tool to predict the failure pressure of corroded pipes, the generation of good computational models of pipes with corrosion defects can take several days, which makes the computational simulation procedure difficult to apply in practice. The main purpose of this work is to develop a set of computational tools to automatically produce models of pipes with defects, ready to be analyzed with commercial FEM programs, starting from a few parameters that locate and provide the main dimensions of the defect or a series of defects. These defects can be internal or external and can assume general spatial locations along the pipe, and idealized rectangular and elliptic geometries can be generated. The tools are based on the MSC.PATRAN pre- and post-processing programs and were written in PCL (Patran Command Language). The program for the automatic generation of models (PIPEFLAW) has a simplified and customized graphical interface, so that an engineer with basic notions of computational simulation with the FEM can rapidly generate models that result in precise and reliable simulations. Some examples of models of pipes with defects generated by the PIPEFLAW system are shown, and the results of numerical analyses, done with the tools presented in this work, are compared with empirical results. (author)

  12. Automatic Scheme Selection for Toolkit Hex Meshing

    Energy Technology Data Exchange (ETDEWEB)

    TAUTGES,TIMOTHY J.; WHITE,DAVID R.

    1999-09-27

    Current hexahedral mesh generation techniques rely on a set of meshing tools, which when combined with geometry decomposition leads to an adequate mesh generation process. Of these tools, sweeping tends to be the workhorse algorithm, accounting for at least 50% of most meshing applications. Constraints which must be met for a volume to be sweepable are derived, and it is proven that these constraints are necessary but not sufficient conditions for sweepability. This paper also describes a new algorithm for detecting extruded or sweepable geometries. This algorithm, based on these constraints, uses topological and local geometric information, and is more robust than feature recognition-based algorithms. A method for computing sweep dependencies in volume assemblies is also given. The auto sweep detect and sweep grouping algorithms have been used to reduce interactive user time required to generate all-hexahedral meshes by filtering out non-sweepable volumes needing further decomposition and by allowing concurrent meshing of independent sweep groups. Parts of the auto sweep detect algorithm have also been used to identify independent sweep paths, for use in volume-based interval assignment.

  13. GENERATION OF IRREGULAR HEXAGONAL MESHES

    Directory of Open Access Journals (Sweden)

    Vlasov Aleksandr Nikolaevich

    2012-07-01

    Decomposition is performed in a constructive way and, as an option, it involves a meshless representation. Further, this mapping method is used to generate the calculation mesh. In this paper, the authors analyze different cases of mapping onto simply connected and bi-connected canonical domains, and present both forward and backward mapping techniques. The potential application of these techniques to the generation of nonuniform meshes within the framework of the asymptotic homogenization theory, in order to assess and project effective characteristics of heterogeneous materials (composites), is also presented.

  14. The generation of hexahedral meshes for assembly geometries: A survey

    Energy Technology Data Exchange (ETDEWEB)

    TAUTGES,TIMOTHY J.

    2000-02-14

    The finite element method is being used today to model component assemblies in a wide variety of application areas, including structural mechanics, fluid simulations, and others. Generating hexahedral meshes for these assemblies usually requires the use of geometry decomposition, with different meshing algorithms applied to different regions. While the primary motivation for this approach remains the lack of an automatic, reliable all-hexahedral meshing algorithm, requirements in mesh quality and mesh configuration for typical analyses are also factors. For these reasons, this approach is also sometimes required when producing other types of unstructured meshes. This paper will review progress to date in automating many parts of the hex meshing process, which has halved the time to produce all-hex meshes for large assemblies. Particular issues which have been exposed due to this progress will also be discussed, along with their applicability to the general unstructured meshing problem.

  15. MeSH Now: automatic MeSH indexing at PubMed scale via learning to rank.

    Science.gov (United States)

    Mao, Yuqing; Lu, Zhiyong

    2017-04-17

    MeSH indexing is the task of assigning relevant MeSH terms based on a manual reading of scholarly publications by human indexers. The task is highly important for improving literature retrieval and many other scientific investigations in biomedical research. Unfortunately, given its manual nature, the process of MeSH indexing is both time-consuming (new articles are often not indexed until two or three months after publication) and costly (approximately ten dollars per article). In response, automatic indexing by computers has been previously proposed and attempted but remains challenging. In order to advance the state of the art in automatic MeSH indexing, a community-wide shared task called BioASQ was recently organized. We propose MeSH Now, an integrated approach that first uses multiple strategies to generate a combined list of candidate MeSH terms for a target article. Through a novel learning-to-rank framework, MeSH Now then ranks the list of candidate terms based on their relevance to the target article. Finally, MeSH Now selects the highest-ranked MeSH terms via a post-processing module. We assessed MeSH Now on two separate benchmarking datasets using traditional precision, recall and F1-score metrics. In both evaluations, MeSH Now consistently achieved an F-score above 0.60, ranging from 0.610 to 0.612. Furthermore, additional experiments show that MeSH Now can be optimized by parallel computing in order to process MEDLINE documents on a large scale. We conclude that MeSH Now is a robust approach with state-of-the-art performance for automatic MeSH indexing and that MeSH Now is capable of processing PubMed-scale document collections within a reasonable time frame. http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/MeSHNow/
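
    The learning-to-rank model itself is not reproduced here. As a hypothetical sketch of the overall flow described above (merge candidate MeSH terms produced by several strategies, score them, and keep the top-ranked terms in a post-processing step), the snippet below uses a fixed linear combination of per-strategy scores as a stand-in for the trained ranker; the terms, scores and weights are invented.

    ```python
    # Hypothetical candidate MeSH terms for one article, each scored by two strategies
    # (e.g. a k-nearest-neighbour recommender and a direct text matcher).
    candidates = {
        "Humans":           {"knn": 0.95, "text_match": 0.20},
        "Neoplasms":        {"knn": 0.70, "text_match": 0.85},
        "Algorithms":       {"knn": 0.40, "text_match": 0.60},
        "Machine Learning": {"knn": 0.15, "text_match": 0.55},
    }

    # Stand-in for the learned ranker: a fixed linear combination of the strategy scores.
    weights = {"knn": 0.6, "text_match": 0.4}
    ranked = sorted(candidates.items(),
                    key=lambda item: sum(weights[s] * v for s, v in item[1].items()),
                    reverse=True)

    top_k = 2  # post-processing: keep only the highest-ranked terms
    print([term for term, scores in ranked[:top_k]])
    ```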

  16. Advanced numerical methods in mesh generation and mesh adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Lipnikov, Konstantine [Los Alamos National Laboratory]; Danilov, A [MOSCOW, RUSSIA]; Vassilevski, Y [MOSCOW, RUSSIA]; Agonzal, A [UNIV OF LYON]

    2010-01-01

    Numerical solution of partial differential equations requires appropriate meshes, efficient solvers and robust and reliable error estimates. Generation of high-quality meshes for complex engineering models is a non-trivial task, and the task is made more difficult when the mesh has to be adapted to a problem solution. This article is focused on a synergistic approach to mesh generation and mesh adaptation, where the best properties of various mesh generation methods are combined to efficiently build simplicial meshes. First, the advancing front technique (AFT) is combined with the incremental Delaunay triangulation (DT) to build an initial mesh. Second, the metric-based mesh adaptation (MBA) method is employed to improve the quality of the generated mesh and/or to adapt it to a problem solution. We demonstrate with numerical experiments that the combination of all three methods is required for robust meshing of complex engineering models. The key to successful mesh generation is the high quality of the triangles in the initial front. We use a black-box technique to improve surface meshes exported from an inaccessible CAD system. The initial surface mesh is refined into a shape-regular triangulation which approximates the boundary with the same accuracy as the CAD mesh. The DT method adds robustness to the AFT. The resulting mesh is topologically correct but may contain a few slivers. The MBA uses seven local operations to modify the mesh topology and improves the mesh quality significantly. The MBA method is also used to adapt the mesh to a problem solution in order to minimize the computational resources required for solving the problem. The MBA has a solid theoretical background. In the first two experiments, we consider the convection-diffusion and elasticity problems, and demonstrate the optimal reduction rate of the discretization error on a sequence of adaptive, strongly anisotropic meshes. The key element of the MBA method is the construction of a tensor metric from hierarchical edge

  17. Curved mesh generation and mesh refinement using Lagrangian solid mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Persson, P.-O.; Peraire, J.

    2008-12-31

    We propose a method for generating well-shaped curved unstructured meshes using a nonlinear elasticity analogy. The geometry of the domain to be meshed is represented as an elastic solid. The undeformed geometry is the initial mesh of linear triangular or tetrahedral elements. The external loading results from prescribing a boundary displacement to be that of the curved geometry, and the final configuration is determined by solving for the equilibrium configuration. The deformations are represented using piecewise polynomials within each element of the original mesh. When the mesh is sufficiently fine to resolve the solid deformation, this method guarantees non-intersecting elements even for highly distorted or anisotropic initial meshes. We describe the method and the solution procedures, and we show a number of examples of two- and three-dimensional simplex meshes with curved boundaries. We also demonstrate how to use the technique for local refinement of non-curved meshes in the presence of curved boundaries.
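
    The paper solves a nonlinear elastic equilibrium problem; the sketch below is only a much-simplified linear stand-in for the same idea (prescribe the boundary displacement onto the curved geometry, then relax the interior), using Jacobi-style Laplacian smoothing on a small structured grid instead of an elasticity solve. All names and the test grid are invented for the example.

    ```python
    import numpy as np

    def relax_to_curved_boundary(nodes, neighbors, boundary, target, n_sweeps=200):
        """Prescribe displaced positions on the boundary, then relax the interior nodes.

        A linear stand-in (Jacobi-style Laplacian smoothing) for the nonlinear
        elastic equilibrium solve used in the paper.
        """
        pts = nodes.copy()
        pts[boundary] = target                        # prescribed boundary displacement
        bset = set(boundary)
        interior = [i for i in range(len(pts)) if i not in bset]
        for _ in range(n_sweeps):
            new = pts.copy()
            for i in interior:
                new[i] = pts[neighbors[i]].mean(axis=0)
            pts = new
        return pts

    # Small structured grid on the unit square (nx x ny nodes), node index = i * ny + j.
    nx, ny = 9, 5
    xs, ys = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny), indexing="ij")
    nodes = np.column_stack([xs.ravel(), ys.ravel()])
    idx = lambda i, j: i * ny + j
    neighbors = [[idx(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                  if 0 <= i + di < nx and 0 <= j + dj < ny]
                 for i in range(nx) for j in range(ny)]
    boundary = [idx(i, j) for i in range(nx) for j in range(ny)
                if i in (0, nx - 1) or j in (0, ny - 1)]
    # Curve the top edge (y = 1) into an arc; every other boundary node keeps its position.
    target = np.array([nodes[k] if nodes[k, 1] < 1.0
                       else (nodes[k, 0], 1.0 + 0.2 * np.sin(np.pi * nodes[k, 0]))
                       for k in boundary])
    relaxed = relax_to_curved_boundary(nodes, neighbors, boundary, target)
    print(relaxed[idx(nx // 2, ny // 2)])  # the interior follows the curved boundary
    ```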

  18. Documentation for MeshKit - Reactor Geometry (&mesh) Generator

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Rajeev [Argonne National Lab. (ANL), Argonne, IL (United States)]; Mahadevan, Vijay [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2015-09-30

    This report gives documentation for using MeshKit's Reactor Geometry (and mesh) Generator (RGG) GUI and also briefly documents other algorithms and tools available in MeshKit. RGG is a program designed to aid in the modeling and meshing of complex/large hexagonal and rectilinear reactor cores. RGG uses Argonne's SIGMA interfaces, Qt and VTK to produce an intuitive user interface. By integrating a 3D view of the reactor with the meshing tools and combining them into one user interface, RGG streamlines the task of preparing a simulation mesh and enables real-time feedback that reduces accidental scripting mistakes that could waste hours of meshing. RGG interfaces with MeshKit tools to consolidate the meshing process, meaning that going from model to mesh is as easy as a button click. This report is designed to explain the RGG v2.0 interface and provide users with the knowledge and skills to pilot RGG successfully. Brief documentation of the MeshKit source code, tools and other available algorithms is also presented for developers who wish to extend MeshKit and add new algorithms to it. RGG tools work in serial and parallel and have been used to model complex reactor core models consisting of conical pins, load pads, several thousands of axially varying material properties of instrumentation pins, and other interstitial meshes.

  19. Automatic generation of 3D fine mesh geometries for the analysis of the venus-3 shielding benchmark experiment with the Tort code

    International Nuclear Information System (INIS)

    Pescarini, M.; Orsi, R.; Martinelli, T.

    2003-01-01

    In many practical radiation transport applications today, the cost of solving refined, large-size and complex multi-dimensional problems lies not so much in the computing itself as in the cumbersome effort required by an expert to prepare a detailed geometrical model and to verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is particularly relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general purpose radiation transport codes, which require a very large number of meshes and high performance computers. The need has clearly emerged for tools that make the task easier for the physicist or engineer by reducing the time required, by facilitating the verification of correctness through effective graphical display and, finally, by helping the interpretation of the results obtained. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and now released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system for the DORT (2D) and TORT (3D) SN codes, permits flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system also demonstrated its validity in core criticality analyses, as for example the Lewis MOX fuel benchmark, permitting to easily

  20. SHARP/PRONGHORN Interoperability: Mesh Generation

    Energy Technology Data Exchange (ETDEWEB)

    Avery Bingham; Javier Ortensi

    2012-09-01

    Progress toward collaboration between the SHARP and MOOSE computational frameworks has been demonstrated through sharing of mesh generation and ensuring mesh compatibility of both tools with MeshKit. MeshKit was used to build a three-dimensional, full-core very high temperature reactor (VHTR) reactor geometry with 120-degree symmetry, which was used to solve a neutron diffusion critical eigenvalue problem in PRONGHORN. PRONGHORN is an application of MOOSE that is capable of solving coupled neutron diffusion, heat conduction, and homogenized flow problems. The results were compared to a solution found on a 120-degree, reflected, three-dimensional VHTR mesh geometry generated by PRONGHORN. The ability to exchange compatible mesh geometries between the two codes is instrumental for future collaboration and interoperability. The results were found to be in good agreement between the two meshes, thus demonstrating the compatibility of the SHARP and MOOSE frameworks. This outcome makes future collaboration possible.

  1. Study on boundary search method for DFM mesh generation

    Directory of Open Access Journals (Sweden)

    Li Ri

    2012-08-01

    Full Text Available The boundary mesh of the casting model was determined by direct calculation on the triangular facets extracted from the STL file of the 3D model. The inner and outer grids of the model were then identified by an algorithm we named the Inner Seed Grid Method. Finally, a program to automatically generate a 3D FDM mesh was compiled. In the paper, a method named the Triangle Contraction Search Method (TCSM) was put forward to ensure that no boundary grids are lost, and an algorithm to search for inner seed grids to identify the inner/outer grids of the casting model was also presented. Our algorithm is simple, clear and easy to program. Three casting mesh generation examples testify to the validity of the program.
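
    The Inner Seed Grid Method is not detailed in the abstract. As a hedged 2D illustration of the generic idea (mark boundary cells first, then flood-fill from an interior seed so that everything reached is inner and everything else is outer), the Python below classifies the cells of a toy grid; the function classify_cells and the ring-shaped example are assumptions, not the authors' implementation.

    ```python
    from collections import deque
    import numpy as np

    def classify_cells(boundary_mask, seed):
        """Label cells as boundary (2), inner (1) or outer (0) by flood-filling from a seed."""
        label = np.where(boundary_mask, 2, 0).astype(int)
        queue = deque([seed])
        label[seed] = 1
        while queue:
            i, j = queue.popleft()
            for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if 0 <= ni < label.shape[0] and 0 <= nj < label.shape[1] and label[ni, nj] == 0:
                    label[ni, nj] = 1
                    queue.append((ni, nj))
        return label

    # Toy example: a square ring of boundary cells on a 10 x 10 grid, seeded at its centre.
    mask = np.zeros((10, 10), dtype=bool)
    mask[2, 2:8] = mask[7, 2:8] = mask[2:8, 2] = mask[2:8, 7] = True
    labels = classify_cells(mask, seed=(5, 5))
    print((labels == 1).sum(), "inner cells,", (labels == 0).sum(), "outer cells")
    ```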

  2. Mesh Generation and Adaption for High Reynolds Number RANS Computations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal offers to provide NASA with an automatic mesh generator for the simulation of aerodynamic flows using Reynolds-Averaged Navier-Stokes (RANS) models....

  3. Mesh Generation and Adaption for High Reynolds Number RANS Computations, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal offers to provide NASA with an automatic mesh generator for the simulation of aerodynamic flows using Reynolds-Averaged Navier-Stokes (RANS) models....

  4. Mesh Generation and Adaption for High Reynolds Number RANS Computations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal offers to provide NASA with an automatic mesh generator for the simulation of aerodynamic flows using Reynolds-Averaged Navier-Stokes (RANS) models....

  5. Automatic off-body overset adaptive Cartesian mesh method based on an octree approach

    International Nuclear Information System (INIS)

    Péron, Stéphanie; Benoit, Christophe

    2013-01-01

    This paper describes a method for generating adaptive structured Cartesian grids within a near-body/off-body mesh partitioning framework for flow simulation around complex geometries. The off-body Cartesian mesh generation derives from an octree structure, assuming each octree leaf node defines a structured Cartesian block. This enables one to take into account the large-scale discrepancies in resolution between the different bodies involved in the simulation, with minimum memory requirements. Two different conversions from the octree to Cartesian grids are proposed: the first generates Adaptive Mesh Refinement (AMR) type grid systems, and the second generates abutting or minimally overlapping Cartesian grid sets. We also introduce an algorithm to control the number of points at each adaptation, which automatically determines relevant values of the refinement indicator driving the grid refinement and coarsening. An application to a wing-tip vortex computation assesses the capability of the method to accurately capture the flow features.
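
    As a much-reduced, hypothetical 2D analogue of the off-body approach described above (a tree whose leaf nodes each define a structured Cartesian block), the sketch below refines a quadtree towards a circular near-body region and reports each leaf as a block origin and size; the refinement test and class names are invented for the example.

    ```python
    import numpy as np

    class Leaf:
        """One leaf of the tree; in the off-body approach each leaf becomes a Cartesian block."""
        def __init__(self, x0, y0, size, level):
            self.x0, self.y0, self.size, self.level = x0, y0, size, level

    def build_quadtree(x0, y0, size, needs_refinement, level=0, max_level=6):
        """Recursively split cells that satisfy the refinement test; return the leaves."""
        if level < max_level and needs_refinement(x0, y0, size):
            half = size / 2.0
            leaves = []
            for dx in (0.0, half):
                for dy in (0.0, half):
                    leaves += build_quadtree(x0 + dx, y0 + dy, half,
                                             needs_refinement, level + 1, max_level)
            return leaves
        return [Leaf(x0, y0, size, level)]

    def near_body(x0, y0, size, radius=0.3):
        """Refine wherever the cell is close to a circular 'near-body' region at the origin."""
        cx, cy = x0 + size / 2.0, y0 + size / 2.0
        return np.hypot(cx, cy) < radius + size      # conservative overlap test

    leaves = build_quadtree(-1.0, -1.0, 2.0, near_body)
    print(len(leaves), "leaf blocks")
    for leaf in leaves[:3]:
        # Each leaf would carry, e.g., an N x N structured Cartesian block (N chosen by the user).
        print(f"block at ({leaf.x0:.3f}, {leaf.y0:.3f}), size {leaf.size:.3f}, level {leaf.level}")
    ```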

  6. An Adaptive Mesh Algorithm: Mesh Structure and Generation

    Energy Technology Data Exchange (ETDEWEB)

    Scannapieco, Anthony J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-21

    The purpose of Adaptive Mesh Refinement (AMR) is to minimize spatial errors over the computational space, not to minimize the number of computational elements. An additional result of the technique is that it may reduce the number of computational elements needed to retain a given level of spatial accuracy. Adaptive mesh refinement is a computational technique used to dynamically select, over a region of space, a set of computational elements designed to minimize spatial error in the computational model of a physical process. The fundamental idea is to increase the mesh resolution in regions where the physical variables are represented by a broad spectrum of modes in k-space, hence increasing the effective global spectral coverage of those physical variables. In addition, the selection of the spatially distributed elements is done dynamically by cyclically adjusting the mesh to follow the spectral evolution of the system. Over the years three types of AMR schemes have evolved: block, patch and locally refined AMR. In block and patch AMR, logical blocks of various grid sizes are overlaid to span the physical space of interest, whereas in locally refined AMR no logical blocks are employed but locally nested mesh levels are used to span the physical space. The distinction between block and patch AMR is that in block AMR the original blocks refine and coarsen entirely in time, whereas in patch AMR the patches change location and zone size with time. The type of AMR described herein is a locally refined AMR. In the algorithm described, at any point in physical space only one zone exists, at whatever level of mesh is appropriate for that physical location. The dynamic creation of a locally refined computational mesh is made practical by a judicious selection of mesh rules. With these rules the mesh is evolved via a mesh potential designed to concentrate the finest mesh in regions where the physics is modally dense, and coarsen zones in regions where the physics is modally

  7. A Fully Automated Mesh Generation Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase I project proposes to develop a fully automated mesh generation tool which contains two parts: surface mesh generation from the imported Computer...

  8. LOOM-P: a finite element mesh generation program with on-line graphic display

    International Nuclear Information System (INIS)

    Ise, Takeharu; Yamazaki, Toshio.

    1977-06-01

    A description of the two-dimensional mesh generation program, LOOM-P, is given in detail. The program is newly developed to produce a mesh network for a reactor core geometry with the help of a built-in automatic mesh generation routine, under the control of a refresh-type graphic display. It is therefore similar to the edit program of the self-organizing mesh generator, QMESH-RENUM. Additional techniques are incorporated to improve the pattern of mesh elements by means of an on-line conversational mode. The resulting mesh network is output as input data to the three-dimensional neutron diffusion theory code, FEM-BABEL. (auth.)

  9. Generating quality tetrahedral meshes from binary volumes

    DEFF Research Database (Denmark)

    Hansen, Mads Fogtmann; Bærentzen, Jakob Andreas; Larsen, Rasmus

    2010-01-01

    use these measures to generate high quality meshes from signed distance maps. This paper also describes an approach for computing (smooth) signed distance maps from binary volumes, as volumetric data in many cases originate from segmentation of objects from imaging techniques such as CT, MRI, etc...... generation algorithm on four examples (torus, Stanford dragon, brain mask, and pig back) and report the dihedral angle, aspect ratio and radius-edge ratio. Even though the algorithm incorporates none of the mentioned quality measures in the compression stage, it receives a good score for all these measures...

  10. CUBIT mesh generation environment. Volume 1: Users manual

    Energy Technology Data Exchange (ETDEWEB)

    Blacker, T.D.; Bohnhoff, W.J.; Edwards, T.L. [and others]

    1994-05-01

    The CUBIT mesh generation environment is a two- and three-dimensional finite element mesh generation tool which is being developed to pursue the goal of robust and unattended mesh generation--effectively automating the generation of quadrilateral and hexahedral elements. It is a solid-modeler based preprocessor that meshes volume and surface solid models for finite element analysis. A combination of techniques including paving, mapping, sweeping, and various other algorithms being developed are available for discretizing the geometry into a finite element mesh. CUBIT also features boundary layer meshing specifically designed for fluid flow problems. Boundary conditions can be applied to the mesh through the geometry and appropriate files for analysis generated. CUBIT is specifically designed to reduce the time required to create all-quadrilateral and all-hexahedral meshes. This manual is designed to serve as a reference and guide to creating finite element models in the CUBIT environment.

  11. Automatic Matching of Multi-scale Road Networks under the Constraints of Smaller Scale Road Meshes

    Directory of Open Access Journals (Sweden)

    PEI Hongxing

    2017-06-01

    Full Text Available A new method is proposed to achieve automatic matching of multi-scale roads under the constraints of the smaller-scale data. First, meshes are extracted from the road data at the two different scales. Second, several basic meshes in the larger-scale road network are merged into a composite mesh, which is matched with one mesh from the smaller-scale road network, so that meshes with many-to-one and one-to-one matching relationships are matched. Third, meshes from the two scales with many-to-many matching relationships are matched. Finally, roads are classified into two categories under the constraints of the meshes, namely mesh border roads and mesh internal roads, and matching is then performed within each category according to the matching relationships between the meshes at the two scales. The results show that roads from the different scales are matched more precisely.

  12. Path Planning Based on Ply Orientation Information for Automatic Fiber Placement on Mesh Surface

    Science.gov (United States)

    Pei, Jiazhi; Wang, Xiaoping; Pei, Jingyu; Yang, Yang

    2018-03-01

    This article introduces an investigation of path planning with ply orientation information for automatic fiber placement (AFP) on open-contoured mesh surfaces. The new method makes use of the ply orientation information generated by the loading characteristics on the surface, divides the surface into several zones according to the ply orientation information, and then designs different fiber paths in different zones. This article also gives a new idea for up-layer design in order to make up for defects between parts and improve the product's strength.

  13. Methods and evaluations of MRI content-adaptive finite element mesh generation for bioelectromagnetic problems

    International Nuclear Information System (INIS)

    Lee, W H; Kim, T-S; Cho, M H; Ahn, Y B; Lee, S Y

    2006-01-01

    In studying bioelectromagnetic problems, finite element analysis (FEA) offers several advantages over conventional methods such as the boundary element method. It allows truly volumetric analysis and incorporation of material properties such as anisotropic conductivity. For FEA, mesh generation is the first critical requirement and there exist many different approaches. However, conventional approaches offered by commercial packages and various algorithms do not generate content-adaptive meshes (cMeshes), resulting in numerous nodes and elements in modelling the conducting domain, and thereby increasing the computational load and demand. In this work, we present efficient content-adaptive mesh generation schemes for complex biological volumes of MR images. The presented methodology is fully automatic and generates FE meshes that are adaptive to the geometrical contents of MR images, allowing optimal representation of the conducting domain for FEA. We have also evaluated the effect of cMeshes on FEA in three dimensions by comparing the forward solutions from various cMesh head models to the solutions from a reference FE head model built from fine and equidistant FEs. The results show that there is a significant gain in computation time with only a minor loss in numerical accuracy. We believe that cMeshes should be useful in the FEA of bioelectromagnetic problems.

  14. Image-Based Geometric Modeling and Mesh Generation

    CERN Document Server

    2013-01-01

    As a new interdisciplinary research area, “image-based geometric modeling and mesh generation” integrates image processing, geometric modeling and mesh generation with finite element method (FEM) to solve problems in computational biomedicine, materials sciences and engineering. It is well known that FEM is currently well-developed and efficient, but mesh generation for complex geometries (e.g., the human body) still takes about 80% of the total analysis time and is the major obstacle to reduce the total computation time. It is mainly because none of the traditional approaches is sufficient to effectively construct finite element meshes for arbitrarily complicated domains, and generally a great deal of manual interaction is involved in mesh generation. This contributed volume, the first for such an interdisciplinary topic, collects the latest research by experts in this area. These papers cover a broad range of topics, including medical imaging, image alignment and segmentation, image-to-mesh conversion,...

  15. Improved mesh generator for the POISSON Group Codes

    International Nuclear Information System (INIS)

    Gupta, R.C.

    1987-01-01

    This paper describes the improved mesh generator of the POISSON Group Codes. These improvements enable one to have full control over the way the mesh is generated and, in particular, the way the mesh density is distributed throughout the model. A higher mesh density in certain regions, coupled with a successively lower mesh density in others, keeps the accuracy of the field computation high and the requirements on computer time and memory low. The mesh is generated with the help of the codes AUTOMESH and LATTICE, both of which have gone through a major upgrade. Modifications have also been made in the POISSON part of these codes. We present an example of a superconducting dipole magnet to explain how to use this code. The results of field computations are found to be reliable within a few parts in a hundred thousand, even in such complex geometries.

  16. Generation of hybrid meshes for the simulation of petroleum reservoirs; Generation de maillages hybrides pour la simulation de reservoirs petroliers

    Energy Technology Data Exchange (ETDEWEB)

    Balaven-Clermidy, S.

    2001-12-01

    Oil reservoir simulations study multiphase flows in porous media. These flows are described and evaluated through numerical schemes on a discretization of the reservoir domain. In this thesis, we are interested in this spatial discretization, and a new kind of hybrid mesh is proposed in which the radial nature of flows in the vicinity of wells is directly taken into account in the geometry. Our modular approach describes wells and their drainage areas through radial circular meshes. These well meshes are inserted into a structured reservoir mesh (a Corner Point Geometry mesh) made up of hexahedral cells. Finally, in order to generate a globally conforming mesh, proper connections are realized between the different kinds of meshes through unstructured transition meshes. To compute these transition meshes, which must be acceptable in terms of finite volume methods, an automatic method based on power diagrams has been developed. Our approach can deal with a homogeneous anisotropic medium and allows the user to insert vertical or horizontal wells as well as secondary faults in the reservoir mesh. Our work has been implemented, tested and validated in 2D and 2.5D. It can also be extended to 3D when the geometrical constraints are simplicial ones: points, segments and triangles. (author)

  17. Large-scale subject-specific cerebral arterial tree modeling using automated parametric mesh generation for blood flow simulation.

    Science.gov (United States)

    Ghaffari, Mahsa; Tangen, Kevin; Alaraj, Ali; Du, Xinjian; Charbel, Fady T; Linninger, Andreas A

    2017-12-01

    In this paper, we present a novel technique for automatic parametric mesh generation of subject-specific cerebral arterial trees. This technique generates high-quality and anatomically accurate computational meshes for fast blood flow simulations, extending the scope of 3D vascular modeling to a large portion of the cerebral arterial tree. For this purpose, a parametric meshing procedure was developed to automatically decompose the vascular skeleton, extract geometric features and generate hexahedral meshes using a body-fitted coordinate system that optimally follows the vascular network topology. To validate the anatomical accuracy of the reconstructed vasculature, we performed statistical analysis to quantify the alignment between the parametric meshes and the raw vascular images using the receiver operating characteristic curve. Geometric accuracy evaluation showed agreement between the constructed mesh and the raw MRA data sets, with an area-under-the-curve value of 0.87. Parametric meshing yielded, on average, 36.6% and 21.7% improvements in orthogonal and equiangular skew quality over unstructured tetrahedral meshes. The parametric meshing and processing pipeline constitutes an automated technique to reconstruct and simulate blood flow throughout a large portion of the cerebral arterial tree down to the level of pial vessels. This study is the first step towards fast large-scale subject-specific hemodynamic analysis for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
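
    As a small, hedged illustration of the validation step described above (quantifying the alignment between the reconstructed mesh and the raw vascular image with a receiver operating characteristic curve), the snippet below computes the area under the curve from a binary mesh rasterization and a grey-level volume; both arrays are synthetic stand-ins, not the paper's data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score  # assumes scikit-learn is available

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: a binary rasterization of the parametric mesh (the "vessel"
    # voxels) and a noisy grey-level volume playing the role of the raw MRA data.
    mesh_voxels = np.zeros((32, 32, 32), dtype=int)
    mesh_voxels[14:18, 14:18, :] = 1                     # a straight "vessel" column
    intensity = 0.8 * mesh_voxels + rng.normal(0.2, 0.15, mesh_voxels.shape)

    # Treat the image intensity as a score for "this voxel is vessel" and the mesh
    # rasterization as the label; agreement is then the area under the ROC curve.
    auc = roc_auc_score(mesh_voxels.ravel(), intensity.ravel())
    print(f"area under the ROC curve: {auc:.2f}")
    ```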

  18. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  19. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.

  20. Medical Image Processing for Fully Integrated Subject Specific Whole Brain Mesh Generation

    Directory of Open Access Journals (Sweden)

    Chih-Yang Hsu

    2015-05-01

    Full Text Available Currently, anatomically consistent segmentation of vascular trees acquired with magnetic resonance imaging requires the use of multiple image processing steps, which, in turn, depend on manual intervention. In effect, segmentation of vascular trees from medical images is time-consuming and error-prone due to the tortuous geometry and weak signal in small blood vessels. To overcome errors and accelerate the image processing time, we introduce an automatic image processing pipeline for constructing subject-specific computational meshes for the entire cerebral vasculature, including segmentation of ancillary structures: the grey and white matter, cerebrospinal fluid space, skull, and scalp. To demonstrate the validity of the new pipeline, we segmented the entire intracranial compartment, with special attention to the angioarchitecture, from magnetic resonance images acquired for two healthy volunteers. The raw images were processed through our pipeline for automatic segmentation and mesh generation. Due to partial volume effect and finite resolution, the computational meshes intersect with each other at their respective interfaces. To eliminate anatomically inconsistent overlap, we utilized morphological operations to separate the structures with physiologically sound gap spaces. The resulting meshes exhibit anatomically correct spatial extent and relative positions without intersections. For validation, we computed critical biometrics of the angioarchitecture, the cortical surfaces, the ventricular system, and the cerebrospinal fluid (CSF) spaces and compared them against literature values. Volumes and surface areas of the computational mesh were found to be in physiological ranges. In conclusion, we present an automatic image processing pipeline to automate the segmentation of the main intracranial compartments, including subject-specific vascular trees. These computational meshes can be used in 3D immersive visualization for diagnosis, surgery planning with haptics

  1. Mesh Generation via Local Bisection Refinement of Triangulated Grids

    Science.gov (United States)

    2015-06-01

    DSTO–TR–3095: This report provides a comprehensive implementation of an unstructured mesh generation method... their behaviour is critically linked to Maubach's method and the data structures N and T. The top-level mesh refinement algorithm is also presented

  2. Automatic mesh refinement and local multigrid methods for contact problems: application to the Pellet-Cladding mechanical Interaction

    International Nuclear Information System (INIS)

    Liu, Hao

    2016-01-01

    This Ph.D. work takes place within the framework of studies on Pellet-Cladding mechanical Interaction (PCI), which occurs in the fuel rods of pressurized water reactors. This manuscript focuses on automatic mesh refinement to simulate this phenomenon more accurately while maintaining acceptable computational time and memory space for industrial calculations. An automatic mesh refinement strategy based on the combination of the Local Defect Correction multigrid method (LDC) with the Zienkiewicz and Zhu a posteriori error estimator is proposed. The estimated error is used to detect the zones to be refined, where the local sub-grids of the LDC method are generated. Several stopping criteria are studied to end the refinement process when the solution is accurate enough or when further refinement no longer improves the global solution accuracy. Numerical results for elastic 2D test cases with pressure discontinuity show the efficiency of the proposed strategy. The automatic mesh refinement in the case of unilateral contact problems is then considered. The strategy previously introduced can be easily adapted to multi-body refinement by estimating the solution error on each body separately. Post-processing is often necessary to ensure the conformity of the refined areas with regard to the contact boundaries. A variety of numerical experiments with elastic contact (with or without friction, with or without an initial gap) confirms the efficiency and adaptability of the proposed strategy. (author) [fr]

  3. r-Adaptive mesh generation for shell finite element analysis

    International Nuclear Information System (INIS)

    Cho, Maenghyo; Jun, Seongki

    2004-01-01

    An r-adaptive method, or moving grid technique, relocates a grid so that it becomes concentrated in the desired region. This concentration improves the accuracy and efficiency of finite element solutions. We apply the r-adaptive method to the computational mesh of shell surfaces, which is initially regular and uniform. The r-adaptive method, given by Liao and Anderson [Appl. Anal. 44 (1992) 285], aggregates the grid in regions with a relatively high weight function without any grid-tangling. The stress error estimator is calculated on the initial uniform mesh to serve as a weight function. However, since the r-adaptive method moves the grid, shell surface geometry errors such as curvature error and mesh distortion error will increase. Therefore, to represent the exact geometry of a shell surface and to prevent surface geometric errors, we use Naghdi's shell theory and express the shell surface by a B-spline patch. In addition, by using a nine-node element, which is relatively less sensitive to mesh distortion, we try to diminish the mesh distortion error introduced by the application of an r-adaptive method. In the numerical examples, it is shown that the values of the error estimator for a cylinder, hemisphere, and torus over the whole domain can be reduced effectively by using the mesh generated by the r-adaptive method. The reductions of the estimated relative errors are also demonstrated in the numerical examples. In particular, a new functional is proposed to construct an adjusted mesh configuration by considering a mesh distortion measure as well as the stress error function. The proposed weight function provides a reliable mesh adaptation method once a parameter value in the weight function is properly chosen.
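
    As a one-dimensional, hypothetical illustration of the r-adaptive idea (relocating a fixed number of nodes so that they concentrate where a weight function is large, without tangling), the sketch below equidistributes nodes with respect to a weight built from a stand-in error indicator; it is not the Liao-Anderson scheme used in the paper.

    ```python
    import numpy as np

    def equidistribute(x_nodes, weight, n_quad=2000):
        """Relocate nodes so the integral of the weight is equal between neighbouring nodes."""
        s = np.linspace(x_nodes[0], x_nodes[-1], n_quad)
        w = weight(s)
        cum = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(s))])
        cum /= cum[-1]                                   # normalised "mass" coordinate in [0, 1]
        targets = np.linspace(0.0, 1.0, len(x_nodes))
        return np.interp(targets, cum, s)                # monotone mapping, so no node tangling

    # Stand-in error indicator: large near x = 0.7, so nodes should cluster there.
    weight = lambda x: 1.0 + 50.0 * np.exp(-((x - 0.7) / 0.05) ** 2)
    x_uniform = np.linspace(0.0, 1.0, 21)
    x_adapted = equidistribute(x_uniform, weight)
    print(np.round(np.diff(x_adapted), 3))               # spacing shrinks near x = 0.7
    ```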

  4. Development of unstructured mesh generator on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Shimada, Akio; Murakami, Hiroyuki; Higashida, Akihiro; Wakatsuki, Shigeto

    2000-01-01

    A general-purpose unstructured mesh generator, 'GRID3D/UNST', has been developed on parallel computers. The high-speed operation and large-scale memory capacity of parallel computers enable the system to generate a large-scale mesh at high speed. In fact, the system generates a large-scale mesh composed of 2,400,000 nodes and 14,000,000 elements in about 1.5 hours on a HITACHI SR2201 with 64 PEs (Processing Elements), after a 2.5-hour pre-process on a SUN workstation. The system is built on standard FORTRAN, C and Motif, and therefore has high portability. The system enables us to solve large-scale problems that were previously impossible to solve, and to break new ground in the fields of science and engineering. (author)

  5. Automatic 4D reconstruction of patient-specific cardiac mesh with 1-to-1 vertex correspondence from segmented contours lines.

    Directory of Open Access Journals (Sweden)

    Chi Wan Lim

    Full Text Available We propose an automatic algorithm for the reconstruction of patient-specific cardiac mesh models with 1-to-1 vertex correspondence. In this framework, a series of 3D meshes depicting the endocardial surface of the heart at each time step is constructed, based on a set of border-delineated magnetic resonance imaging (MRI) data covering the whole cardiac cycle. The key contribution of this work is a novel reconstruction technique to generate a 4D (i.e., spatial-temporal) model of the heart with 1-to-1 vertex mapping throughout the time frames. The reconstructed 3D model from the first time step is used as a base template model and then deformed to fit the segmented contours from the subsequent time steps. A method to determine a tree-based connectivity relationship is proposed to ensure robust mapping during mesh deformation. The novel feature is the ability to handle intra- and inter-frame 2D topology changes of the contours, which manifest as a series of merging and splitting of contours when the images are viewed either in a spatial or temporal sequence. Our algorithm has been tested on five acquisitions of cardiac MRI and can successfully reconstruct the full 4D heart model in around 30 minutes per subject. The generated 4D heart model conforms very well with the input segmented contours and the mesh element shape is of reasonably good quality. The work is important in the support of downstream computational simulation activities.

  6. Generating Signed Distance Fields From Triangle Meshes

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Aanæs, Henrik

    A method for generating a discrete, signed 3D distance field is proposed. Distance fields are used in a number of contexts. In particular the popular level set method is usually initialized by a distance field. The main focus of our work is on simplifying the computation of the sign when generating....... This leads to a method for generating signed distance fields which is a simple and straightforward extension of the method for generating unsigned distance fields. We prove that our choice of pseudo normal leads to a correct technique for computing the sign....

  7. Solution adaptive meshes with a hyperbolic grid generator

    Science.gov (United States)

    Klopfer, G. H.

    An alternative numerical procedure to generate solution-adaptive grids for use in CFD simulations is developed analytically and demonstrated. The approach is based on the hyperbolic generation scheme of Steger and Chausee (1980), with terms added to achieve line clustering while fulfilling orthogonality and smoothness requirements. The formulation of the method is outlined, and adapted meshes for a shock-shock interaction at freestream Mach number 8.03 and Reynolds number 387,500 are presented graphically. The hyperbolic approach is shown to be significantly faster than comparable elliptic-grid methods and capable of producing an arbitrarily high degree of clustering on structured meshes.

  8. Aranha: a 2D mesh generator for triangular finite elements

    International Nuclear Information System (INIS)

    Fancello, E.A.; Salgado, A.C.; Feijoo, R.A.

    1990-01-01

    A method for generating unstructured meshes of linear and quadratic triangular finite elements is described in this paper. Some topics on the C language data structure used in the development of the program Aranha are also presented. Its applicability to adaptive remeshing is shown, and finally several examples are included to illustrate the performance of the method on irregular, connected planar domains. (author)

  9. Loft: An Automated Mesh Generator for Stiffened Shell Aerospace Vehicles

    Science.gov (United States)

    Eldred, Lloyd B.

    2011-01-01

    Loft is an automated mesh generation code that is designed for aerospace vehicle structures. From user input, Loft generates meshes for wings, noses, tanks, fuselage sections, thrust structures, and so on. As a mesh is generated, each element is assigned properties to mark the part of the vehicle with which it is associated. This property assignment is an extremely powerful feature that enables detailed analysis tasks, such as load application and structural sizing. This report is presented in two parts. The first part is an overview of the code and its applications. The modeling approach that was used to create the finite element meshes is described. Several applications of the code are demonstrated, including a Next Generation Launch Technology (NGLT) wing-sizing study, a lunar lander stage study, a launch vehicle shroud shape study, and a two-stage-to-orbit (TSTO) orbiter. Part two of the report is the program user manual. The manual includes in-depth tutorials and a complete command reference.

  10. Free Tools and Strategies for the Generation of 3D Finite Element Meshes: Modeling of the Cardiac Structures

    Directory of Open Access Journals (Sweden)

    E. Pavarino

    2013-01-01

    The Finite Element Method (FEM) is a well-known technique, being extensively applied in different areas. Studies using the FEM are targeted at improving cardiac ablation procedures. For such simulations, the finite element meshes should consider the size and histological features of the target structures. However, some methods or tools used to generate meshes of human body structures are still limited, due to nondetailed models, nontrivial preprocessing, or, mainly, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context.

  11. MESH2D Grid generator design and use

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-31

    Mesh2d is a Fortran90 program originally designed to generate two-dimensional structured grids of the form [x(i),y(i,j)], where [x,y] are grid coordinates identified by indices (i,j). Because the x-coordinates depend only on index i, the x-grid lines are strictly vertical, whereas the y-grid lines can undulate. Mesh2d also assigns an integer material type to each grid cell, mtyp(i,j), in a user-specified manner. The complete grid is specified through three separate input files defining the x(i), y(i,j), and mtyp(i,j) variations. Since the original development effort, Mesh2d has been extended to more general two-dimensional structured grids of the form [x(i,j),y(i,j)].
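
    A toy sketch of the kind of grid Mesh2d produces, on a rectangular index space: x depends on i only (strictly vertical x-grid lines), y(i,j) may undulate with i, and an integer material type mtyp(i,j) is assigned per cell. The undulation function and the material rule below are invented for illustration.

      import numpy as np

      nx, ny = 21, 11                          # number of grid lines in i and j
      x = np.linspace(0.0, 10.0, nx)           # x(i): vertical x-grid lines

      y = np.empty((nx, ny))                   # y(i,j): undulating y-grid lines
      for i in range(nx):
          y_bottom = 0.5 * np.sin(0.6 * x[i])  # made-up undulating lower boundary
          y[i, :] = np.linspace(y_bottom, 5.0, ny)

      mtyp = np.empty((nx - 1, ny - 1), dtype=int)   # material type per cell
      for i in range(nx - 1):
          for j in range(ny - 1):
              yc = 0.25 * (y[i, j] + y[i + 1, j] + y[i, j + 1] + y[i + 1, j + 1])
              mtyp[i, j] = 1 if yc < 2.5 else 2      # e.g. material 1 below, 2 above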

  12. Generation of high order geometry representations in Octree meshes

    Directory of Open Access Journals (Sweden)

    Harald G. Klimach

    2015-11-01

    We propose a robust method to convert triangulated surface data into polynomial volume data. Such polynomial representations are required for high-order partial differential equation solvers, as low-order surface representations would diminish the accuracy of their solution. Our proposed method deploys a first-order spatial bisection algorithm to robustly find an approximation of given geometries. The resulting voxelization is then used to generate Legendre polynomials of arbitrary degree. By embedding the locally defined polynomials in cubical elements of a coarser mesh, this method can reliably approximate even complex structures, like porous media. It is thereby possible to provide appropriate material definitions for high-order discontinuous Galerkin schemes. We describe the method to construct the polynomials and how it fits into the overall mesh generation. Our discussion includes numerical properties of the method, and we show some results from applying it to various geometries. We have implemented the described method in our mesh generator Seeder, which is publicly available under a permissive open-source license.
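
    The last step described above, fitting a polynomial to voxelized material data inside a coarse element, can be sketched in one dimension with numpy's Legendre tools; the element extent, degree, and indicator function are illustrative assumptions, and the actual method operates on 3D octree voxelizations.

      import numpy as np
      from numpy.polynomial import legendre as leg

      # voxelized indicator of the geometry inside one coarse element on [-1, 1]
      nvox = 64
      centers = np.linspace(-1.0, 1.0, nvox, endpoint=False) + 1.0 / nvox
      indicator = ((centers > -0.3) & (centers < 0.55)).astype(float)  # made-up layout

      # least-squares fit of a degree-7 Legendre expansion to the voxel data
      coeffs = leg.legfit(centers, indicator, deg=7)

      # the polynomial material representation can now be evaluated anywhere
      xi = np.linspace(-1.0, 1.0, 200)
      approx = leg.legval(xi, coeffs)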

  13. 3D active shape models of human brain structures: application to patient-specific mesh generation

    Science.gov (United States)

    Ravikumar, Nishant; Castro-Mateos, Isaac; Pozo, Jose M.; Frangi, Alejandro F.; Taylor, Zeike A.

    2015-03-01

    The use of biomechanics-based numerical simulations has attracted growing interest in recent years for computer-aided diagnosis and treatment planning. With this in mind, a method for automatic mesh generation of brain structures of interest, using statistical models of shape (SSM) and appearance (SAM), for personalised computational modelling is presented. SSMs are constructed as point distribution models (PDMs) while SAMs are trained using intensity profiles sampled from a training set of T1-weighted magnetic resonance images. The brain structures of interest are the cortical surface (cerebrum, cerebellum & brainstem), the lateral ventricles and the falx cerebri membrane. Two methods for establishing correspondences across the training set of shapes are investigated and compared (based on SSM quality): the Coherent Point Drift (CPD) point-set registration method and a B-spline mesh-to-mesh registration method. The MNI-305 (Montreal Neurological Institute) average brain atlas is used to generate the template mesh, which is deformed and registered to each training case to establish correspondence over the training set of shapes. T1-weighted MR images of 18 healthy patients form the training set used to generate the SSM and SAM. Both model training and model fitting are performed over multiple brain structures simultaneously. Compactness and generalisation errors of the BSpline-SSM and CPD-SSM are evaluated and used to quantitatively compare the SSMs. Leave-one-out cross-validation is used to evaluate SSM quality in terms of these measures. The mesh-based SSM is found to generalise better and to be more compact than the CPD-based SSM. The quality of the best-fit model instance from the trained SSMs on test cases is evaluated using the Hausdorff distance (HD) and mean absolute surface distance (MASD) metrics.

  14. A Novel Coarsening Method for Scalable and Efficient Mesh Generation

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, A; Hysom, D; Gunney, B

    2010-12-02

    ... matrix-vector multiplication can be performed locally on each processor, and hence to minimize communication. Furthermore, a good graph partitioning scheme ensures that an equal amount of computation is performed on each processor. Graph partitioning is a well-known NP-complete problem, and thus the most commonly used graph partitioning algorithms employ some form of heuristics. These algorithms vary in terms of their complexity, partition generation time, and the quality of partitions, and they tend to trade off these factors. A significant challenge we are currently facing at the Lawrence Livermore National Laboratory is how to partition very large meshes on massive-size distributed memory machines like IBM BlueGene/P, where scalability becomes a big issue. For example, we have found that ParMetis, a very popular graph partitioning tool, can only scale to 16K processors. An ideal graph partitioning method in such an environment should be fast and scale to very large meshes, while producing high-quality partitions. This is an extremely challenging task, as to scale to that level the partitioning algorithm should be simple and able to produce partitions that minimize inter-processor communication and balance the load imposed on the processors. Our goals in this work are two-fold: (1) to develop a new scalable graph partitioning method with good load balancing and communication reduction capability; (2) to study the performance of the proposed partitioning method on very large parallel machines using actual data sets and compare the performance to that of existing methods. The proposed method achieves the desired scalability by reducing the mesh size. For this, it coarsens an input mesh into a smaller mesh by coalescing the vertices and edges of the original mesh into a set of mega-vertices and mega-edges. A new coarsening method called the brick algorithm is developed in this research. In the brick algorithm, the zones in a given mesh are first grouped into fixed size ...

  15. Unconstrained paving and plastering method for generating finite element meshes

    Science.gov (United States)

    Staten, Matthew L.; Owen, Steven J.; Blacker, Teddy D.; Kerr, Robert

    2010-03-02

    Computer software for and a method of generating a conformal all quadrilateral or hexahedral mesh comprising selecting an object with unmeshed boundaries and performing the following while unmeshed voids are larger than twice a desired element size and unrecognizable as either a midpoint subdividable or pave-and-sweepable polyhedra: selecting a front to advance; based on sizes of fronts and angles with adjacent fronts, determining which adjacent fronts should be advanced with the selected front; advancing the fronts; detecting proximities with other nearby fronts; resolving any found proximities; forming quadrilaterals or unconstrained columns of hexahedra where two layers cross; and establishing hexahedral elements where three layers cross.

  16. Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation

    Directory of Open Access Journals (Sweden)

    McInnes Bridget T

    2011-06-01

    Background: Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that can be used to automatically develop a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE. We demonstrate the use of this method by developing such a data set, called MSH WSD. Methods: In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH heading to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in the MEDLINE citation is automatically assigned the UMLS CUI linked to the MeSH heading; each instance has thus been assigned a UMLS Concept Unique Identifier (CUI). We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. Results: The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 which are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to the results previously obtained by these algorithms on the pre-existing data set, NLM WSD. We show that the knowledge-based methods achieve different results but keep their relative performance, except for the Journal Descriptor Indexing (JDI) method, whose performance is below the other methods. Conclusions: The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations ...

  17. Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation.

    Science.gov (United States)

    Jimeno-Yepes, Antonio J; McInnes, Bridget T; Aronson, Alan R

    2011-06-02

    Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that can be used to automatically develop a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE. We demonstrate the use of this method by developing such a data set, called MSH WSD. In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH heading to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in the MEDLINE citation is automatically assigned the UMLS CUI linked to the MeSH heading. Each instance has been assigned a UMLS Concept Unique Identifier (CUI). We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 which are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to that of the results previously obtained by these algorithms on the pre-existing data set, NLM WSD. We show that the knowledge-based methods achieve different results but keep their relative performance, except for the Journal Descriptor Indexing (JDI) method, whose performance is below the other methods. The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations and covers the largest ...
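
    The extraction rule described in both records above can be sketched over a hypothetical in-memory citation list: a citation becomes an instance for an ambiguous term only when exactly one of the term's candidate MeSH headings is assigned to it, and the instance is labelled with the CUI linked to that heading. The field names and data structures are assumptions, not the actual NLM tooling.

      from collections import defaultdict

      # ambiguous_terms: term -> {mesh_heading: cui}
      # citations: list of {"text": str, "mesh": set of assigned MeSH headings}
      def build_wsd_instances(ambiguous_terms, citations, max_per_sense=100):
          instances = defaultdict(list)               # (term, cui) -> citation texts
          for term, senses in ambiguous_terms.items():
              for cit in citations:
                  if term.lower() not in cit["text"].lower():
                      continue                        # term does not occur in the text
                  assigned = [h for h in senses if h in cit["mesh"]]
                  if len(assigned) != 1:
                      continue                        # zero or several senses: skip
                  cui = senses[assigned[0]]
                  if len(instances[(term, cui)]) < max_per_sense:
                      instances[(term, cui)].append(cit["text"])
          return instances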

  18. A GENERATIVE CAD MODEL OF A WORM GEAR MESHING

    Directory of Open Access Journals (Sweden)

    Angelika WRONKOWICZ

    2014-03-01

    This article introduces the term generative CAD model, its origins, and thus the need for creating such a type of model. A process of generative model creation, as well as specific forms of knowledge recording applied in the implementation phase in various CAD systems, are briefly discussed. The example of a worm gear meshing realized in the CATIA software encapsulates the methodology of generative model construction. Sources and types of design and construction knowledge required for development of the aforementioned model, as well as the UML language as a method of formal knowledge recording, are presented. The concept of model creation, i.e. the assumptions and the structure as well as the logic of the model operation, is described. The paper also addresses selected elements of the project that present the manner in which the model was constructed.

  19. Unstructured Mesh Movement and Viscous Mesh Generation for CFD-Based Design Optimization Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovations proposed are twofold: 1) a robust unstructured mesh movement method able to handle isotropic (Euler), anisotropic (viscous), mixed element (hybrid)...

  20. Unstructured Mesh Movement and Viscous Mesh Generation for CFD-Based Design Optimization, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovations proposed are twofold: 1) a robust unstructured mesh movement method able to handle isotropic (Euler), anisotropic (viscous), mixed element (hybrid)...

  1. The automatic electromagnetic field generating system

    Science.gov (United States)

    Audone, B.; Gerbi, G.

    1982-07-01

    The technical study and the design approaches adopted for the definition of the automatic electromagnetic field generating system (AEFGS) dedicated to EMC susceptibility testing are presented. The AEFGS covers the frequency range 10 kHz to 40 GHz and operates successfully in the two EMC shielded chambers at ESTEC. The performance of the generator/amplifier subsystems, antenna selection, field amplitude and susceptibility feedback, and monitoring systems is described. System control modes which guarantee full AEFGS operability under different test conditions are discussed. Advantages of automating susceptibility testing include increased measurement accuracy and reduced testing cost.

  2. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi

  3. Efficient computation of clipped Voronoi diagram for mesh generation

    KAUST Repository

    Yan, Dongming

    2013-04-01

    The Voronoi diagram is a fundamental geometric structure widely used in various fields, especially in computer graphics and geometry computing. For a set of points in a compact domain (i.e. a bounded and closed 2D region or a 3D volume), some Voronoi cells of their Voronoi diagram are infinite or partially outside of the domain, but in practice only the parts of the cells inside the domain are needed, as when computing the centroidal Voronoi tessellation. Such a Voronoi diagram confined to a compact domain is called a clipped Voronoi diagram. We present an efficient algorithm to compute the clipped Voronoi diagram for a set of sites with respect to a compact 2D region or a 3D volume. We also apply the proposed method to optimal mesh generation based on the centroidal Voronoi tessellation. Crown Copyright © 2011 Published by Elsevier Ltd. All rights reserved.
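
    The definition used above can be illustrated by a brute-force 2D sketch for a convex clipping region: the clipped Voronoi cell of a site is the domain polygon successively clipped against the perpendicular-bisector half-plane toward every other site (Sutherland-Hodgman clipping). The paper's algorithm is far more efficient; this only spells out the definition.

      import numpy as np

      def clip_halfplane(poly, a, b):
          # Keep the part of convex polygon `poly` satisfying a.x <= b.
          out, n = [], len(poly)
          for i in range(n):
              p, q = poly[i], poly[(i + 1) % n]
              pin, qin = np.dot(a, p) <= b, np.dot(a, q) <= b
              if pin:
                  out.append(p)
              if pin != qin:                          # edge crosses the boundary line
                  t = (b - np.dot(a, p)) / np.dot(a, q - p)
                  out.append(p + t * (q - p))
          return np.array(out)

      def clipped_voronoi_cell(sites, i, domain):
          # Voronoi cell of sites[i] intersected with a convex `domain` polygon.
          cell, si = np.asarray(domain, dtype=float), sites[i]
          for j, sj in enumerate(sites):
              if j == i or len(cell) == 0:
                  continue
              a = sj - si                             # bisector half-plane: a.x <= b
              b = 0.5 * (np.dot(sj, sj) - np.dot(si, si))
              cell = clip_halfplane(cell, a, b)
          return cell

      sites = np.array([[0.2, 0.3], [0.7, 0.8], [0.6, 0.2]])
      square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
      cell0 = clipped_voronoi_cell(sites, 0, square)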

  4. Advances in Parallelization for Large Scale Oct-Tree Mesh Generation

    Science.gov (United States)

    O'Connell, Matthew; Karman, Steve L.

    2015-01-01

    Despite great advancements in the parallelization of numerical simulation codes over the last 20 years, it is still common to perform grid generation in serial. Generating large scale grids in serial often requires using special "grid generation" compute machines that can have more than ten times the memory of average machines. While some parallel mesh generation techniques have been proposed, generating very large meshes for LES or aeroacoustic simulations is still a challenging problem. An automated method for the parallel generation of very large scale off-body hierarchical meshes is presented here. This work enables large scale parallel generation of off-body meshes by using a novel combination of parallel grid generation techniques and a hybrid "top down" and "bottom up" oct-tree method. Meshes are generated using hardware commonly found in parallel compute clusters. The capability to generate very large meshes is demonstrated by the generation of off-body meshes surrounding complex aerospace geometries. Results are shown including a one billion cell mesh generated around a Predator Unmanned Aerial Vehicle geometry, which was generated on 64 processors in under 45 minutes.

  5. Semi-automatic construction of the Chinese-English MeSH using Web-based term translation method.

    Science.gov (United States)

    Lu, Wen-Hsiang; Lin, Shih-Jui; Chan, Yi-Che; Chen, Kuan-Hsi

    2005-01-01

    Due to the language barrier, non-English users are unable to retrieve the most up-to-date medical information from authoritative U.S. medical websites such as PubMed and MedlinePlus. A few cross-language medical information retrieval (CLMIR) systems have been utilizing MeSH (Medical Subject Headings) with a multilingual thesaurus to bridge the gap. Unfortunately, MeSH has not yet been translated into Traditional Chinese. We propose a semi-automatic approach to constructing a Chinese-English MeSH based on Web-based term translation. The system provides knowledge engineers with candidate terms mined from anchor texts and search-result pages. The results are encouraging: currently, more than 19,000 Chinese-English MeSH entries have been compiled. This thesaurus will be used for Chinese-English CLMIR in the future.

  6. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Jones, J.P.

    1993-01-01

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing, LAN-connected, common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system.

  7. h-Adaptive Mesh Generation using Electric Field Intensity Value as a Criterion (in Japanese)

    OpenAIRE

    Toyonaga, Kiyomi; Cingoski, Vlatko; Kaneda, Kazufumi; Yamashita, Hideo

    1994-01-01

    Fine mesh divisions are essential to obtain accurate solutions in two-dimensional electric field analysis, and generating suitably fine mesh divisions requires technical knowledge. In electric field problems, analysts are usually interested in the electric field intensity and its distribution. In order to obtain the electric field intensity with high accuracy, we have developed an adaptive mesh generator that uses the electric field intensity value as a criterion.
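
    A small sketch of the kind of refinement criterion described above: elements whose field intensity exceeds a multiple of the mesh average are flagged for subdivision in the next adaptive pass. The threshold factor and data layout are assumptions, not the authors' exact rule.

      import numpy as np

      def mark_for_refinement(element_field_intensity, factor=1.5):
          # Flag elements whose |E| exceeds `factor` times the mesh-wide mean.
          e = np.asarray(element_field_intensity, dtype=float)
          return e > factor * e.mean()

      # usage: flags = mark_for_refinement(abs_E); elements where flags is True
      # would be subdivided before the field is re-solved on the refined mesh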

  8. Adaptive mesh generation for image registration and segmentation

    DEFF Research Database (Denmark)

    Fogtmann, Mads; Larsen, Rasmus

    2013-01-01

    ... measure. The method was tested on a T1-weighted MR volume of an adult brain and showed a 66% reduction in the number of mesh vertices compared to a red-subdivision strategy. The deformation capability of the mesh was tested by registration to five additional T1-weighted MR volumes.

  9. Grain Boundary Conformed Volumetric Mesh Generation from a Three-Dimensional Voxellated Polycrystalline Microstructure

    Science.gov (United States)

    Lee, Myeong-Jin; Jeon, Young-Ju; Son, Ga-Eun; Sung, Sihwa; Kim, Ju-Young; Han, Heung Nam; Cho, Soo Gyeong; Jung, Sang-Hyun; Lee, Sukbin

    2018-03-01

    We present a new comprehensive scheme for generating grain-boundary-conformed, volumetric mesh elements from a three-dimensional voxellated polycrystalline microstructure. From the voxellated image of a polycrystalline microstructure obtained from the Monte Carlo Potts model in the context of isotropic normal grain growth simulation, its grain boundary network is approximated as a curvature-maintained conformal triangular surface mesh using a set of in-house codes. In order to improve the surface mesh quality and to adjust mesh resolution, various re-meshing techniques in a commercial software package are applied to the approximated grain boundary mesh. It is found that the aspect ratio, the minimum angle and the Jacobian value of the re-meshed triangular surface mesh are successfully improved. Using such an enhanced surface mesh, conformal volumetric tetrahedral elements of the polycrystalline microstructure are created, again using commercial software. The resultant mesh seamlessly retains the short- and long-range curvature of grain boundaries and junctions as well as the realistic morphology of the grains inside the polycrystal. It is noted that the proposed scheme is the first to successfully generate three-dimensional mesh elements for polycrystals with high enough quality to be used for microstructure-based finite element analysis, while the realistic characteristics of grain boundaries and grains are maintained from the corresponding voxellated microstructure image.

  10. Challenges in Second-Generation Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Pescapé Antonio

    2008-01-01

    Wireless mesh networks have the potential to provide ubiquitous high-speed Internet access at low cost. The good news is that initial deployments of WiFi meshes show the feasibility of providing ubiquitous Internet connectivity. However, their performance is far below the necessary and achievable limit. Moreover, user subscription in the existing meshes is dismal even though the technical challenges to get connectivity are low. This paper provides an overview of the current status of mesh network deployment and highlights the technical, economical, and social challenges that need to be addressed in the coming years. As a proof-of-principle study, we discuss the above-mentioned challenges with reference to three real networks: (i) MagNets, an operator-driven, planned, two-tier mesh network; (ii) the Berlin Freifunk network, a pure community-driven single-tier network; and (iii) the Weimar Freifunk network, also community-driven but two-tier.

  11. Challenges in Second-Generation Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Thomas Huehn

    2008-10-01

    Wireless mesh networks have the potential to provide ubiquitous high-speed Internet access at low cost. The good news is that initial deployments of WiFi meshes show the feasibility of providing ubiquitous Internet connectivity. However, their performance is far below the necessary and achievable limit. Moreover, user subscription in the existing meshes is dismal even though the technical challenges to get connectivity are low. This paper provides an overview of the current status of mesh network deployment and highlights the technical, economical, and social challenges that need to be addressed in the coming years. As a proof-of-principle study, we discuss the above-mentioned challenges with reference to three real networks: (i) MagNets, an operator-driven, planned, two-tier mesh network; (ii) the Berlin Freifunk network, a pure community-driven single-tier network; and (iii) the Weimar Freifunk network, also community-driven but two-tier.

  12. Hex-dominant mesh generation using 3D constrained triangulation

    Energy Technology Data Exchange (ETDEWEB)

    OWEN,STEVEN J.

    2000-05-30

    A method for decomposing a volume with a prescribed quadrilateral surface mesh into a hexahedral-dominated mesh is proposed. With this method, known as Hex-Morphing (H-Morph), an initial tetrahedral mesh is provided. Tetrahedra are transformed and combined, starting from the boundary and working towards the interior of the volume. The quadrilateral faces of the hexahedra are treated as internal surfaces, which can be recovered using constrained triangulation techniques. Implementation details of the edge and face recovery process are included. Examples and performance of the H-Morph algorithm are also presented.

  13. Automatic Testcase Generation for Flight Software

    Science.gov (United States)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) a blackbox approach that views the system as a black box and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specification (up to prespecified limits); these inputs are then used to exercise the system; 2) a whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to ...
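
    The blackbox idea can be illustrated, independently of the JPF-based tooling, by exhaustively enumerating all sentences of a tiny command grammar up to a length bound and feeding each one to the system under test. The grammar below is invented for the example.

      # tiny context-free grammar: each nonterminal maps to a list of alternatives,
      # each alternative being a sequence of terminals and nonterminals
      GRAMMAR = {
          "script": [["cmd"], ["cmd", "script"]],
          "cmd":    [["set", "name", "value"], ["get", "name"]],
          "name":   [["heater"], ["valve"]],
          "value":  [["on"], ["off"]],
      }

      def expand(symbols, max_tokens):
          # Yield every fully terminal expansion of `symbols` up to max_tokens tokens.
          if len(symbols) > max_tokens:
              return
          for k, sym in enumerate(symbols):
              if sym in GRAMMAR:                     # expand the first nonterminal
                  for alt in GRAMMAR[sym]:
                      yield from expand(symbols[:k] + alt + symbols[k + 1:], max_tokens)
                  return
          yield " ".join(symbols)                    # all symbols are terminals

      test_scripts = sorted(set(expand(["script"], max_tokens=6)))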

  14. Unstructured Mesh Movement and Viscous Mesh Generation for CFD-Based Design Optimization, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovations proposed by ResearchSouth are: 1) a robust method to automatically insert high quality anisotropic prismatic (viscous boundary layer) cells into any...

  15. A System for Automatically Generating Scheduling Heuristics

    Science.gov (United States)

    Morris, Robert

    1996-01-01

    The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for applying this method solves the problem of scheduling telescope observations and is called the Associate Principal Astronomer. The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints, expressed as an objective function established by an astronomer-user.
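
    A toy greedy scheduler in the spirit described above (not the APA system itself): at each step the highest-priority observation whose hard constraints, here just a time window and no overlap, can still be met is committed. The request fields and the ordering rule are assumptions.

      def greedy_schedule(requests, horizon):
          # requests: list of dicts with 'id', 'duration', 'window' (start, end),
          # and 'priority'; returns a list of (id, start_time) assignments.
          schedule, busy_until = [], 0.0
          # heuristic ordering: high priority first, shorter observations first
          for req in sorted(requests, key=lambda r: (-r["priority"], r["duration"])):
              start = max(busy_until, req["window"][0])
              end = start + req["duration"]
              if end <= req["window"][1] and end <= horizon:   # hard constraints hold
                  schedule.append((req["id"], start))
                  busy_until = end
          return schedule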

  16. Parallel paving: An algorithm for generating distributed, adaptive, all-quadrilateral meshes on parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Lober, R.R.; Tautges, T.J.; Vaughan, C.T.

    1997-03-01

    Paving is an automated mesh generation algorithm which produces all-quadrilateral elements. It can additionally generate these elements in varying sizes such that the resulting mesh adapts to a function distribution, such as an error function. While powerful, conventional paving is a very serial algorithm in its operation. Parallel paving is the extension of serial paving into parallel environments to perform the same meshing functions as conventional paving, only on distributed, discretized models. This extension allows large, adaptive, parallel finite element simulations to take advantage of paving's meshing capabilities for h-remap remeshing. A significantly modified version of the CUBIT mesh generation code has been developed to host the parallel paving algorithm and demonstrate its capabilities on both two-dimensional and three-dimensional surface geometries and compare the resulting parallel produced meshes to conventionally paved meshes for mesh quality and algorithm performance. Sandia's "tiling" dynamic load balancing code has also been extended to work with the paving algorithm to retain parallel efficiency as subdomains undergo iterative mesh refinement.

  17. Automatic Generation of Validated Specific Epitope Sets

    Directory of Open Access Journals (Sweden)

    Sebastian Carrasco Pro

    2015-01-01

    Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and to assist in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to elucidate a method to select validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to allow users to generate customized epitope sets.

  18. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA) is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and resulting in significantly reduced computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.

  19. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    International Nuclear Information System (INIS)

    Dahdouh, S; Serrurier, A; De la Plata, J-P; Anquez, J; Angelini, E D; Bloch, I; Varsier, N; Wiart, J

    2014-01-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer. (paper)
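
    The age interpolation mentioned above can be sketched minimally, assuming two segmented fetal meshes that share the same connectivity (same vertex count and ordering): a mesh at an intermediate gestational age is a convex combination of the two sets of vertex positions.

      import numpy as np

      def interpolate_mesh(verts_a, age_a, verts_b, age_b, age):
          # Linearly blend two same-connectivity meshes to an intermediate age.
          t = float(np.clip((age - age_a) / (age_b - age_a), 0.0, 1.0))
          return (1.0 - t) * np.asarray(verts_a) + t * np.asarray(verts_b)

      # usage (hypothetical data): verts_30w = interpolate_mesh(v26w, 26.0, v32w, 32.0, 30.0)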

  20. INGEN: a general-purpose mesh generator for finite element codes

    International Nuclear Information System (INIS)

    Cook, W.A.

    1979-05-01

    INGEN is a general-purpose mesh generator for two- and three-dimensional finite element codes. The basic parts of the code are surface and three-dimensional region generators that use linear-blending interpolation formulas. These generators are based on an i, j, k index scheme that is used to number nodal points, construct elements, and develop displacement and traction boundary conditions. The code can generate truss elements (2 nodal points); plane stress, plane strain, and axisymmetric two-dimensional continuum elements (4 to 8 nodal points); plate elements (4 to 8 nodal points); and three-dimensional continuum elements (8 to 21 nodal points). The traction loads generated are consistent with the elements generated. The expansion-contraction option is of special interest: it makes it possible to change an existing mesh so that some regions are refined and others are made coarser than the original mesh. 9 figures

  1. Generating anatomically accurate finite element meshes for electrical impedance tomography of the human head

    Science.gov (United States)

    Yang, Bin; Xu, Canhua; Dai, Meng; Fu, Feng; Dong, Xiuzhen

    2013-07-01

    For electrical impedance tomography (EIT) of the brain, the use of anatomically accurate and patient-specific finite element (FE) meshes has been shown to confer significant improvements in the quality of image reconstruction. However, given the lack of a rapid method to obtain an accurate anatomic geometry of the head, the generation of patient-specific meshes is time-consuming. In this paper, a modified fuzzy c-means algorithm based on the non-local means method is applied to segment the different layers of the head from head CT images. This algorithm performs well, in particular recognizing the ventricles accurately and dealing robustly with noise. The FE mesh established according to the segmentation results is validated in computational simulation. The approach thus provides a rapid, practicable method for generating patient-specific FE meshes of the human head suitable for brain EIT.

  2. New software developments for quality mesh generation and optimization from biomedical imaging data.

    Science.gov (United States)

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2014-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  3. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers of ever-doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  4. From medical images to flow computations without user-generated meshes.

    Science.gov (United States)

    Dillard, Seth I; Mousel, John A; Shrestha, Liza; Raghavan, Madhavan L; Vigmostad, Sarah C

    2014-10-01

    Biomedical flow computations in patient-specific geometries require integrating image acquisition and processing with fluid flow solvers. Typically, image-based modeling processes involve several steps, such as image segmentation, surface mesh generation, volumetric flow mesh generation, and finally, computational simulation. These steps are performed separately, often using separate pieces of software, and each step requires considerable expertise and investment of time on the part of the user. In this paper, an alternative framework is presented in which the entire image-based modeling process is performed on a Cartesian domain where the image is embedded within the domain as an implicit surface. Thus, the framework circumvents the need for generating surface meshes to fit complex geometries and subsequent creation of body-fitted flow meshes. Cartesian mesh pruning, local mesh refinement, and massive parallelization provide computational efficiency; the image-to-computation techniques adopted are chosen to be suitable for distributed memory architectures. The complete framework is demonstrated with flow calculations computed in two 3D image reconstructions of geometrically dissimilar intracranial aneurysms. The flow calculations are performed on multiprocessor computer architectures and are compared against calculations performed with a standard multistep route. Copyright © 2014 John Wiley & Sons, Ltd.

  5. Automatic generation control of interconnected power system with ...

    African Journals Online (AJOL)

    In this paper, automatic generation control (AGC) of a two-area interconnected power system having diverse sources of power generation is studied. The two-area power system comprises power generation from hydro, thermal and gas sources in area-1 and power generation from hydro and thermal sources in area-2. All the ...

  6. System for Automatic Generation of Examination Papers in Discrete Mathematics

    Science.gov (United States)

    Fridenfalk, Mikael

    2013-01-01

    A system was developed for automatic generation of problems and solutions for examinations in a university distance course in discrete mathematics and tested in a pilot experiment involving 200 students. Considering the success of such systems in the past, particularly including automatic assessment, it should not take long before such systems are…

  7. ZONE: a finite element mesh generator. [In FORTRAN IV for CDC 7600

    Energy Technology Data Exchange (ETDEWEB)

    Burger, M. J.

    1976-05-01

    The ZONE computer program is a finite-element mesh generator which produces the nodes and element description of any two-dimensional geometry. The geometry is subdivided into a mesh of quadrilateral and triangular zones arranged sequentially in an ordered march through the geometry. The order of march can be chosen so that the minimum bandwidth is obtained. The node points are defined in terms of the x and y coordinates in a global rectangular coordinate system. The zones generated are quadrilaterals or triangles defined by four node points in a counterclockwise sequence. Node points defining the outside boundary are generated to describe pressure boundary conditions. The mesh that is generated can be used as input to any two-dimensional as well as any axisymmetrical structure program. The output from ZONE is essentially the input file to NAOS, HONDO, and other axisymmetric finite element programs. 14 figures. (RWR)

  8. Mobility Models for Next Generation Wireless Networks Ad Hoc, Vehicular and Mesh Networks

    CERN Document Server

    Santi, Paolo

    2012-01-01

    Mobility Models for Next Generation Wireless Networks: Ad Hoc, Vehicular and Mesh Networks provides the reader with an overview of mobility modelling, encompassing both theoretical and practical aspects related to the challenging mobility modelling task. It also: provides up-to-date coverage of mobility models for next generation wireless networks; offers an in-depth discussion of the most representative mobility models for major next generation wireless network application scenarios, including WLAN/mesh networks, vehicular networks, wireless sensor networks, and ...

  9. Next Generation Model 8800 Automatic TLD Reader

    International Nuclear Information System (INIS)

    Velbeck, K.J.; Streetz, K.L.; Rotunda, J.E.

    1999-01-01

    BICRON NE has developed an advanced version of the Model 8800 Automatic TLD Reader. Improvements in the reader include a Windows NT(TM)-based operating system and a Pentium microprocessor for the host controller, a servo-controlled transport, a VGA display, mouse control, and modular assembly. This high-capacity reader will automatically read fourteen hundred TLD cards in one loading. Up to four elements in a card can be heated without mechanical contact, using hot nitrogen gas. Improvements in performance include an increased throughput rate and more precise card positioning. Operation is simplified through easy-to-read Windows-type screens. Glow curves are displayed graphically along with light intensity, temperature, and channel scaling. Maintenance and diagnostic aids are included for easier troubleshooting. A click of a mouse will command actions that are displayed in easy-to-understand English words. Available options include an internal 90Sr irradiator, automatic TLD calibration, and two different extremity monitoring modes. Results from testing include reproducibility, reader stability, linearity, detection threshold, residue, primary power supply voltage and frequency, transient voltage, drop testing, and light leakage. (author)

  10. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs. The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  11. A flexible content-adaptive mesh-generation strategy for image representation.

    Science.gov (United States)

    Adams, Michael D

    2011-09-01

    Based on the greedy-point removal (GPR) scheme of Demaret and Iske, a simple yet highly effective framework for constructing triangle-mesh representations of images, called GPRFS, is proposed. By using this framework and ideas from the error diffusion (ED) scheme (for mesh-generation) of Yang et al., a highly effective mesh-generation method, called GPRFS-ED, is derived and presented. Since the ED scheme plays a crucial role in our work, factors affecting the performance of this scheme are also studied in detail. Through experimental results, our GPRFS-ED method is shown to be capable of generating meshes of quality comparable to, and in many cases better than, the state-of-the-art GPR scheme, while requiring substantially less computation and memory. Furthermore, with our GPRFS-ED method, one can easily trade off between mesh quality and computational/memory complexity. A reduced-complexity version of the GPRFS-ED method (called GPRFS-MED) is also introduced to further demonstrate the computational/memory-complexity scalability of our GPRFS-ED method.
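
    A rough sketch of error-diffusion point selection in the spirit of the ED scheme referenced above: a nonnegative density (here the magnitude of an image Laplacian) is scaled to sum to the point budget, and as the image is scanned a mesh vertex is placed wherever the accumulated density reaches one, with the residual carried forward. The raster scan order and the carry-forward diffusion are simplified assumptions, not the published scheme.

      import numpy as np

      def select_mesh_points(image, n_points):
          # Pick roughly n_points sample locations with density ~ |Laplacian|.
          img = np.asarray(image, dtype=float)
          lap = np.abs(np.gradient(np.gradient(img, axis=0), axis=0)
                       + np.gradient(np.gradient(img, axis=1), axis=1))
          density = lap + 1e-6                         # avoid an all-zero density
          density *= n_points / density.sum()          # budget: n_points in total
          points, carry = [], 0.0
          for r in range(img.shape[0]):                # simple raster scan
              for c in range(img.shape[1]):
                  carry += density[r, c]
                  if carry >= 1.0:                     # place a mesh vertex here
                      points.append((r, c))
                      carry -= 1.0                     # diffuse the residual forward
          return np.array(points)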

  12. Automatic Tamil lyric generation based on ontological interpretation ...

    Indian Academy of Sciences (India)

    This system proposes an N-gram based approach to automatic Tamil lyric generation, by the ontological semantic interpretation of the input scene. The approach is based on identifying the semantics conveyed in the scenario, thereby making the system understand the situation and generate lyrics accordingly. The heart of ...

  13. Automatic Tamil lyric generation based on ontological interpretation ...

    Indian Academy of Sciences (India)

    This system proposes an N-gram based approach to automatic Tamil lyric generation, by the ontological semantic interpretation of the input scene. The approach is based on identifying the semantics conveyed in the scenario, thereby making the system understand the situation and generate lyrics accordingly.

  14. Automatic generation of matter-of-opinion video documentaries

    NARCIS (Netherlands)

    S. Bocconi; F.-M. Nack (Frank); L. Hardman (Lynda)

    2008-01-01

    In this paper we describe a model for automatically generating video documentaries. This allows viewers to specify the subject and the point of view of the documentary to be generated. The domain is matter-of-opinion documentaries based on interviews. The model combines rhetorical ...

  15. 6th International Meshing Roundtable '97

    Energy Technology Data Exchange (ETDEWEB)

    White, D.

    1997-09-01

    The goal of the 6th International Meshing Roundtable is to bring together researchers and developers from industry, academia, and government labs in a stimulating, open environment for the exchange of technical information related to the meshing process. In the past, the Roundtable has enjoyed significant participation from each of these groups, from a wide variety of countries. The Roundtable will consist of technical presentations from contributed papers and abstracts, two invited speakers, and two invited panels of experts discussing topics related to the development and use of automatic mesh generation tools. In addition, this year we will feature a "Bring Your Best Mesh" competition and poster session to encourage discussion and participation from a wide variety of mesh generation tool users. The schedule and evening social events are designed to provide numerous opportunities for informal dialog. A proceedings will be published by Sandia National Laboratories and distributed at the Roundtable. In addition, papers of exceptionally high quality will be submitted to a special issue of the International Journal of Computational Geometry and Applications. Papers and one-page abstracts were sought that present original results on the meshing process. Potential topics include, but are not limited to: unstructured triangular and tetrahedral mesh generation; unstructured quadrilateral and hexahedral mesh generation; automated blocking and structured mesh generation; mixed element meshing; surface mesh generation; geometry decomposition and clean-up techniques; geometry modification techniques related to meshing; adaptive mesh refinement and mesh quality control; mesh visualization; special purpose meshing algorithms for particular applications; theoretical or novel ideas with practical potential; and technical presentations from industrial researchers.

  16. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) A user describes requirements of target embedded network systems by logical property-based constraints using SENS. (2) Given SENS specifications, test cases are automatically generated using a SAT-based solver. Filtering mechanisms to select efficient test cases are also available in our tool. (3) In addition, given a testing goal by the user, test sequences are automatically extracted from exhaustive test cases. We implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in the design and test generation of real embedded air-conditioning network systems.

  17. Parallel octree-based hexahedral mesh generation for eulerian to lagrangian conversion.

    Energy Technology Data Exchange (ETDEWEB)

    Staten, Matthew L.; Owen, Steven James

    2010-09-01

    Computational simulation must often be performed on domains where materials are represented as scalar quantities or volume fractions at cell centers of an octree-based grid. Common examples include bio-medical, geotechnical or shock physics calculations where interface boundaries are represented only as discrete statistical approximations. In this work, we introduce new methods for generating Lagrangian computational meshes from Eulerian-based data. We focus specifically on shock physics problems that are relevant to ASC codes such as CTH and Alegra. New procedures for generating all-hexahedral finite element meshes from volume fraction data are introduced. A new primal-contouring approach is introduced for defining a geometric domain. New methods for refinement, node smoothing, resolving non-manifold conditions and defining geometry are also introduced as well as an extension of the algorithm to handle tetrahedral meshes. We also describe new scalable MPI-based implementations of these procedures. We describe a new software module, Sculptor, which has been developed for use as an embedded component of CTH. We also describe its interface and its use within the mesh generation code, CUBIT. Several examples are shown to illustrate the capabilities of Sculptor.

  18. Automatic Generation of Network Protocol Gateways

    DEFF Research Database (Denmark)

    Bromberg, Yérom-David; Réveillère, Laurent; Lawall, Julia

    2009-01-01

    The emergence of networked devices in the home has made it possible to develop applications that control a variety of household functions. However, current devices communicate via a multitude of incompatible protocols, and thus gateways are needed to translate between them. Gateway construction, however, requires an intimate knowledge of the relevant protocols and a substantial understanding of low-level network programming, which can be a challenge for many application programmers. This paper presents a generative approach to gateway construction, z2z, based on a domain-specific language for describing protocol behaviors, message structures, and the gateway logic. Z2z includes a compiler that checks essential correctness properties and produces efficient code. We have used z2z to develop a number of gateways, including SIP to RTSP, SLP to UPnP, and SMTP to SMTP via HTTP, involving a range ...

  19. Towards Automatic Personalized Content Generation for Platform Games

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2010-01-01

    In this paper, we show that personalized levels can be automatically generated for platform games. We build on previous work, where models were derived that predicted player experience based on features of level design and on playing styles. These models are constructed using preference learning,...

  20. Automatic Generation of Map-Based Interface for VRML Contents

    Science.gov (United States)

    Araya, Shinji; Suzaki, Kenichi; Miyake, Yoshihiro

    The paper proposes a Web page that can automatically generate a map-based interface for any VRML contents on the Web. This new approach reduces map development costs and provides a common interface to the users. 3D contents reconstruction is distributed among the client computers to guarantee Web service efficiency.

  1. Design dependencies within the automatic generation of hypermedia presentations

    NARCIS (Netherlands)

    O. Rosell Martinez

    2002-01-01

    Many dependencies appear between the different stages of the creation of a hypermedia presentation. These dependencies have to be taken into account while designing a system for their automatic generation. In this work we study two of them and propose some techniques to treat them.

  2. A quick scan on possibilities for automatic metadata generation

    NARCIS (Netherlands)

    Benneker, Frank

    2006-01-01

    The Quick Scan is a report on research into useable solutions for automatic generation of metadata or parts of metadata. The aim of this study is to explore possibilities for facilitating the process of attaching metadata to learning objects. This document is aimed at developers of digital learning

  3. Deformable meshes for medical image segmentation accurate automatic segmentation of anatomical structures

    CERN Document Server

    Kainmueller, Dagmar

    2014-01-01

    Segmentation of anatomical structures in medical image data is an essential task in clinical practice. Dagmar Kainmueller introduces methods for accurate fully automatic segmentation of anatomical structures in 3D medical image data. The author's core methodological contribution is a novel deformation model that overcomes limitations of state-of-the-art Deformable Surface approaches, hence allowing for accurate segmentation of tip- and ridge-shaped features of anatomical structures. As for practical contributions, she proposes application-specific segmentation pipelines for a range of anatom...

  4. Automatic Definition Extraction and Crossword Generation From Spanish News Text

    Directory of Open Access Journals (Sweden)

    Jennifer Esteche

    2017-08-01

    Full Text Available This paper describes the design and implementation of a system that takes Spanish texts and generates crosswords (board and definitions) in a fully automatic way using definitions extracted from those texts. Our solution divides the problem into two parts: a definition extraction module that applies pattern matching implemented in Python, and a crossword generation module that uses a greedy strategy implemented in Prolog. The system achieves 73% precision and builds crosswords similar to those built by humans.
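
    A minimal sketch of the pattern-matching idea behind the definition-extraction module; the regular expression below is a hypothetical English "X is a Y" pattern, not the Spanish patterns used by the authors.

```python
import re

# Hypothetical "X is a/an/the Y" copular pattern; the original system uses Spanish patterns.
DEF_PATTERN = re.compile(r"\b([A-Z][a-z]+) (?:is|are) (?:a|an|the) ([a-z][a-z ]+?)[.,]")

def extract_definitions(text):
    """Return (term, definition) pairs found by a simple copular pattern."""
    return [(m.group(1), m.group(2)) for m in DEF_PATTERN.finditer(text)]

text = "Montevideo is the capital of Uruguay. Tango is a musical genre, popular in the region."
print(extract_definitions(text))
# [('Montevideo', 'capital of Uruguay'), ('Tango', 'musical genre')]
```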

  5. Algebraic mesh generation for large scale viscous-compressible aerodynamic simulation

    International Nuclear Information System (INIS)

    Smith, R.E.

    1984-01-01

    Viscous-compressible aerodynamic simulation is the numerical solution of the compressible Navier-Stokes equations and associated boundary conditions. Boundary-fitted coordinate systems are well suited for the application of finite difference techniques to the Navier-Stokes equations. An algebraic approach to boundary-fitted coordinate systems is one where an explicit functional relation describes a mesh on which a solution is obtained. This approach has the advantage of rapid, precise mesh control. The basic mathematical structure of three algebraic mesh generation techniques is described. They are transfinite interpolation, the multi-surface method, and the two-boundary technique. The Navier-Stokes equations are transformed to a computational coordinate system where boundary-fitted coordinates can be applied. Large-scale computation implies that there is a large number of mesh points in the coordinate system. Computation of viscous compressible flow using boundary-fitted coordinate systems and the application of this computational philosophy on a vector computer are presented.
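
    Of the three algebraic techniques named, transfinite interpolation is the easiest to show compactly. Below is a minimal 2D Coons-patch sketch; the boundary curves and mesh dimensions are illustrative.

```python
import numpy as np

def transfinite_interpolation(bottom, top, left, right, nu, nv):
    """Blend four boundary curves into an interior mesh (2D Coons patch).
    Each boundary argument is a function of a parameter in [0, 1]
    returning an (x, y) point; the corner points must be consistent."""
    u = np.linspace(0.0, 1.0, nu)
    v = np.linspace(0.0, 1.0, nv)
    mesh = np.zeros((nu, nv, 2))
    for i, ui in enumerate(u):
        for j, vj in enumerate(v):
            # linear blend of opposite boundaries, minus the doubly counted corners
            linear = ((1 - vj) * np.asarray(bottom(ui)) + vj * np.asarray(top(ui))
                      + (1 - ui) * np.asarray(left(vj)) + ui * np.asarray(right(vj)))
            corners = ((1 - ui) * (1 - vj) * np.asarray(bottom(0.0))
                       + ui * (1 - vj) * np.asarray(bottom(1.0))
                       + (1 - ui) * vj * np.asarray(top(0.0))
                       + ui * vj * np.asarray(top(1.0)))
            mesh[i, j] = linear - corners
    return mesh

# example: mesh between a flat lower wall and a gently curved upper wall
mesh = transfinite_interpolation(
    bottom=lambda u: (u, 0.0),
    top=lambda u: (u, 1.0 + 0.2 * np.sin(np.pi * u)),
    left=lambda v: (0.0, v),
    right=lambda v: (1.0, v),
    nu=21, nv=11)
print(mesh.shape)  # (21, 11, 2)
```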

  6. Algorithms for Zonal Methods and Development of Three Dimensional Mesh Generation Procedures.

    Science.gov (United States)

    1984-02-01

    [Abstract not recoverable: the record contains only OCR fragments of the report's figures and front matter, including the captions "Figure 2. Example of Mesh Embedding in Two Dimensions" and "Figure 3. Example of ...", notation entries (T: superscript indicating transpose of a matrix; x: independent variable), and abstract fragments on zonal schemes and an algorithm for generating computational grids for a Navier-Stokes solver on a well-stretched grid.]

  7. An automated tetrahedral mesh generator for computer simulation in Odontology based on the Delaunay's algorithm

    Directory of Open Access Journals (Sweden)

    Mauro Massayoshi Sakamoto

    2008-01-01

    Full Text Available In this work, a software package based on Delaunay's algorithm is described. The main feature of this package is its capability to apply discretization to geometric domains of teeth, taking into account their complex inner structures and materials with different hardness. Usually, the mesh generators reported in the literature treat molars and other teeth by using simplified geometric models, or even consider the teeth as homogeneous structures.
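
    The package itself is not reproduced here, but the core Delaunay step can be sketched with SciPy, assuming the tooth geometry has already been sampled into a point cloud; the random points below stand in for that sampling.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical point cloud sampled from a tooth-like domain; in the real
# workflow the points would come from the segmented enamel/dentin/pulp geometry.
rng = np.random.default_rng(0)
points = rng.random((200, 3))

tets = Delaunay(points)            # Delaunay tetrahedralization of the cloud
print(tets.simplices.shape)        # (n_tets, 4): node indices of each tetrahedron

# material assignment would follow, e.g. by testing each tetrahedron centroid
# against the segmented regions (enamel, dentin, pulp) of the tooth model
centroids = points[tets.simplices].mean(axis=1)
```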

  8. An Automatic Networking and Routing Algorithm for Mesh Network in PLC System

    Science.gov (United States)

    Liu, Xiaosheng; Liu, Hao; Liu, Jiasheng; Xu, Dianguo

    2017-05-01

    Power line communication (PLC) is considered to be one of the best communication technologies in smart grid. However, the topology of low voltage distribution network is complex, meanwhile power line channel has characteristics of time varying and attenuation, which lead to the unreliability of power line communication. In this paper, an automatic networking and routing algorithm is introduced which can be adapted to the "blind state" topology. The results of simulation and test show that the scheme is feasible, the routing overhead is small, and the load balance performance is good, which can achieve the establishment and maintenance of network quickly and effectively. The scheme is of great significance to improve the reliability of PLC.

  9. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    International Nuclear Information System (INIS)

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  10. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard, M.A.; Sommer, S.C. [Lawrence Livermore National Lab., CA (United States)

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  11. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting a robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description and finally updating robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  12. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    Science.gov (United States)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.

  13. Optimum siting and sizing of a large distributed generator in a mesh connected system

    Energy Technology Data Exchange (ETDEWEB)

    Elnashar, Mohab M.; El Shatshat, Ramadan; Salama, Magdy M.A. [Department of Electrical and Computer Engineering, University of Waterloo, Waterloo, Ontario (Canada)

    2010-06-15

    This paper proposes a new approach to optimally determine the appropriate size and location of the distributed generator (DG) in a large mesh-connected system. This paper presents a visual optimization approach in which the planner plays an important role in determining the optimal siting and sizing of the DG through the choice of the appropriate weight factors of the parameters included in the optimization technique according to the system deficiencies. Losses, voltage profile and short circuit level are used in the algorithm to determine the optimum sizes and locations of the DG. The short circuit level parameter is introduced to represent the protective device requirements in the selection of the size and location of the DG. The proposed technique has been tested on the IEEE 24-bus mesh-connected test system. The obtained results showed clearly that the optimal size and location can be simply determined through the proposed approach. (author)

  14. Automatic Performance Model Generation for Java Enterprise Edition (EE) Applications

    OpenAIRE

    Brunnert, Andreas;Vögele, Christian;Krcmar, Helmut

    2015-01-01

    The effort required to create performance models for enterprise applications is often out of proportion compared to their benefits. This work aims to reduce this effort by introducing an approach to automatically generate component-based performance models for running Java EE applications. The approach is applicable for all Java EE server products as it relies on standardized component types and interfaces to gather the required data for modeling an application. The feasibility of the approac...

  15. Developing an Automatic Generation Tool for Cryptographic Pairing Functions

    OpenAIRE

    Dominguez Perez, Luis Julian

    2011-01-01

    Pairing-Based Cryptography is receiving steadily more attention from industry, mainly because of the increasing interest in Identity-Based protocols. Although there are plenty of applications, efficiently implementing the pairing functions is often difficult as it requires more knowledge than previous cryptographic primitives. The author presents a tool for automatically generating optimized code for the pairing functions which can be used in the construction of such cryptograp...

  16. MESHJET. A mesh generation package for finite element MHD equilibrium codes at JET

    International Nuclear Information System (INIS)

    Springmann, E.; Taroni, A.

    1984-01-01

    MESHJET is a fairly general package and can be used to generate meshes for any finite element code in two space dimensions. These finite element codes are widely used at JET. The first code is for the identification of the plasma boundary and internal flux surfaces from measurements of external fluxes and fields under the assumption that the plasma toroidal density can be represented within a given class of functions. The second code computes plasma equilibrium configurations taking into account a two-dimensional model of the transformer iron core in JET. (author)

  17. Automatic generation of pictorial transcripts of video programs

    Science.gov (United States)

    Shahraray, Behzad; Gibbon, David C.

    1995-03-01

    An automatic authoring system for the generation of pictorial transcripts of video programs which are accompanied by closed caption information is presented. A number of key frames, each of which represents the visual information in a segment of the video (i.e., a scene), are selected automatically by performing a content-based sampling of the video program. The textual information is recovered from the closed caption signal and is initially segmented based on its implied temporal relationship with the video segments. The text segmentation boundaries are then adjusted, based on lexical analysis and/or caption control information, to account for synchronization errors due to possible delays in the detection of scene boundaries or the transmission of the caption information. The closed caption text is further refined through linguistic processing for conversion to lower-case with correct capitalization. The key frames and the related text generate a compact multimedia presentation of the contents of the video program which lends itself to efficient storage and transmission. This compact representation can be viewed on a computer screen, or used to generate the input to a commercial text processing package to generate a printed version of the program.
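
    The content-based sampling of key frames can be sketched as a simple frame-difference detector; this is a generic stand-in for the authors' method, and the histogram comparison and threshold are illustrative.

```python
import numpy as np

def select_key_frames(frames, threshold=0.25):
    """Pick one key frame per detected scene: start a new scene whenever the
    normalized histogram difference to the previous frame exceeds `threshold`.
    `frames` is an iterable of 2D grayscale arrays with values in [0, 255]."""
    key_frames, prev_hist = [], None
    for idx, frame in enumerate(frames):
        hist, _ = np.histogram(frame, bins=64, range=(0, 255))
        hist = hist / hist.sum()
        if prev_hist is None or 0.5 * np.abs(hist - prev_hist).sum() > threshold:
            key_frames.append(idx)          # scene boundary: keep this frame
        prev_hist = hist
    return key_frames

# toy example: three "scenes" of constant brightness
frames = [np.full((48, 64), v) for v in (10, 10, 200, 200, 90, 90)]
print(select_key_frames(frames))  # [0, 2, 4]
```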

  18. Automatic generation of Fortran programs for algebraic simulation models

    International Nuclear Information System (INIS)

    Schopf, W.; Rexer, G.; Ruehle, R.

    1978-04-01

    This report documents a generator program by which econometric simulation models formulated in an application-orientated language can be transformed automatically into a Fortran program. Thus the model designer is able to build up, test and modify models without the need for a Fortran programmer. The development of a computer model is therefore simplified and shortened appreciably; in chapters 1-3 of this report all rules are presented for the application of the generator to the model design. Algebraic models including exogenous and endogenous time series variables and lead and lag functions can be generated. In addition to these language elements, Fortran sequences can be applied to the formulation of models in the case of complex model interrelations. The generated model is automatically a module of the program system RSYST III and is therefore able to exchange input and output data with the central data bank of the system; in connection with the method library modules, it can be used to handle planning problems. (orig.) [de

  19. Automatic generation of executable communication specifications from parallel applications

    Energy Technology Data Exchange (ETDEWEB)

    Pakin, Scott [Los Alamos National Laboratory; Wu, Xing [NCSU; Mueller, Frank [NCSU

    2011-01-19

    Portable parallel benchmarks are widely used and highly effective for (a) the evaluation, analysis and procurement of high-performance computing (HPC) systems and (b) quantifying the potential benefits of porting applications for new hardware platforms. Yet, past techniques to synthetically parameterize hand-coded HPC benchmarks prove insufficient for today's rapidly-evolving scientific codes, particularly when subject to multi-scale science modeling or when utilizing domain-specific libraries. To address these problems, this work contributes novel methods to automatically generate highly portable and customizable communication benchmarks from HPC applications. We utilize ScalaTrace, a lossless, yet scalable, parallel application tracing framework to collect selected aspects of the run-time behavior of HPC applications, including communication operations and execution time, while abstracting away the details of the computation proper. We subsequently generate benchmarks with identical run-time behavior from the collected traces. A unique feature of our approach is that we generate benchmarks in CONCEPTUAL, a domain-specific language that enables the expression of sophisticated communication patterns using a rich and easily understandable grammar yet compiles to ordinary C + MPI. Experimental results demonstrate that the generated benchmarks are able to preserve the run-time behavior - including both the communication pattern and the execution time - of the original applications. Such automated benchmark generation is particularly valuable for proprietary, export-controlled, or classified application codes: when supplied to a third party, our auto-generated benchmarks ensure performance fidelity but without the risks associated with releasing the original code. This ability to automatically generate performance-accurate benchmarks from parallel applications is novel and without any precedent, to our knowledge.

  20. MODULEWRITER: a program for automatic generation of database interfaces.

    Science.gov (United States)

    Zheng, Christina L; Fana, Fariba; Udupi, Poornaprajna V; Gribskov, Michael

    2003-05-01

    MODULEWRITER is a PERL object relational mapping (ORM) tool that automatically generates database specific application programming interfaces (APIs) for SQL databases. The APIs consist of a package of modules providing access to each table row and column. Methods for retrieving, updating and saving entries are provided, as well as other generally useful methods (such as retrieval of the highest numbered entry in a table). MODULEWRITER provides for the inclusion of user-written code, which can be preserved across multiple runs of the MODULEWRITER program.
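
    The idea of generating per-table accessor code can be sketched as follows; this is a generic illustration in Python rather than MODULEWRITER's actual PERL output, and the table and column names are hypothetical.

```python
TEMPLATE = '''\
class {cls}:
    """Auto-generated accessor for table `{table}`."""
    def __init__(self, conn):
        self.conn = conn

    def get(self, row_id):
        cur = self.conn.execute("SELECT {cols} FROM {table} WHERE id = ?", (row_id,))
        return cur.fetchone()

    def max_id(self):
        cur = self.conn.execute("SELECT MAX(id) FROM {table}")
        return cur.fetchone()[0]
'''

def generate_accessor(table, columns):
    """Emit source code for a simple row-access class for one table."""
    cls = "".join(part.capitalize() for part in table.split("_"))
    return TEMPLATE.format(cls=cls, table=table, cols=", ".join(columns))

print(generate_accessor("gene_annotation", ["id", "symbol", "description"]))
```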

  1. Automatic generation of gene finders for eukaryotic species

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Krogh, A.

    2006-01-01

    Background: The number of sequenced eukaryotic genomes is rapidly increasing. This means that over time it will be hard to keep supplying customised gene finders for each genome. This calls for procedures to automatically generate species-specific gene finders and to re-train them as the quantity ... length distributions. The performance of each individual gene predictor on each individual genome is comparable to the best of the manually optimised species-specific gene finders. It is shown that species-specific gene finders are superior to gene finders trained on other species.

  2. Coupling LaGrit unstructured mesh generation and model setup with TOUGH2 flow and transport: A case study

    Science.gov (United States)

    Sentís, Manuel Lorenzo; Gable, Carl W.

    2017-11-01

    There are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refine, de-refine, smooth), assigning material properties, initial conditions and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools will provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control volume discretization that uses a two point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 (Pruess et al., 1999) to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module, Lagrit2Tough2, was developed, which is presented here and will be included in a future release of LaGriT. In this paper an alternative method to generate a Voronoi mesh for TOUGH2 with LaGriT is presented; thanks to the modular and command-based structure of LaGriT, this method is well suited to generating a mesh for complex models.
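
    The Voronoi step that the LaGriT/Lagrit2Tough2 workflow automates can be illustrated in miniature with SciPy; this is only a generic 2D tessellation of arbitrary points, not the LaGriT procedure.

```python
import numpy as np
from scipy.spatial import Voronoi

# hypothetical cell-center points; LaGriT would place these according to the model geometry
rng = np.random.default_rng(1)
centers = rng.random((50, 2))

vor = Voronoi(centers)
# each input point owns one Voronoi region (the control volume around it)
region = vor.regions[vor.point_region[0]]
polygon = vor.vertices[region] if -1 not in region else None  # -1 marks an unbounded region
print(len(vor.ridge_points), "cell-to-cell connections (TPFA-style interfaces)")
```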

  3. Generation of reservoir models on flexible meshes; Generation de modeles de reservoir sur maillage flexible

    Energy Technology Data Exchange (ETDEWEB)

    Ricard, L.

    2005-12-15

    The high-level geostatistical descriptions of the subsurface are often far too detailed for use in routine flow simulators. To make flow simulations tractable, the number of grid blocks has to be reduced: an approximation, still relevant to the flow description, is necessary. In this work, we place the emphasis on the scaling procedure from the fine-scale model to the multi-scale reservoir model. Two main problems appear: near wells, faults and channels, the volume of the flexible cells may be smaller than that of the fine ones, so we need to solve a down-scaling problem; far from these regions, the volumes of the cells are bigger than the fine ones, so we need to solve an up-scaling problem. In this work, research has been done on each of these three areas: down-scaling, up-scaling and fluid flow simulation. For each of these subjects, a review, some new improvements and a comparative study are proposed. The proposed down-scaling method is built to be compatible with existing data integration methods. The comparative study shows that empirical methods are not accurate enough to solve the problem. Concerning the up-scaling step, the proposed approach is based on an existing method: the perturbed boundary conditions. An extension to unstructured meshes is developed for the inter-cell permeability tensor. The comparative study shows that numerical methods are not always as accurate as expected and that the empirical model can be sufficient in a lot of cases. A new approach to single-phase fluid flow simulation is developed. This approach can handle full tensorial permeability fields with source or sink terms. (author)

  4. Semi-Automatic Construction of the Chinese-English MeSH Using Web-Based Term Translation Method

    OpenAIRE

    Lu, Wen-Hsiang; Lin, Shih-Jui; Chan, Yi-Che; Chen, Kuan-Hsi

    2005-01-01

    Due to the language barrier, non-English users are unable to retrieve the most updated medical information from the U.S. authoritative medical websites, such as PubMed and MedlinePlus. A few cross-language medical information retrieval (CLMIR) systems have been utilizing MeSH (Medical Subject Heading) with a multilingual thesaurus to bridge the gap. Unfortunately, MeSH has not yet been translated into traditional Chinese.

  5. LINGUISTIC DATABASE FOR AUTOMATIC GENERATION SYSTEM OF ENGLISH ADVERTISING TEXTS

    Directory of Open Access Journals (Sweden)

    N. A. Metlitskaya

    2017-01-01

    Full Text Available The article deals with the linguistic database for the system of automatic generation of English advertising texts on cosmetics and perfumery. The database for such a system includes two main blocks: an automatic dictionary (that contains semantic and morphological information for each word) and semantic-syntactical formulas of the texts in a special formal language, SEMSINT. The database is built on the result of the analysis of 30 English advertising texts on cosmetics and perfumery. First, each word was given a unique code. For example, N stands for nouns, A – for adjectives, V – for verbs, etc. Then all the lexicon of the analyzed texts was distributed into different semantic categories. According to this semantic classification, each word was given a special semantic code. For example, the record N01 that is attributed to the word «lip» in the dictionary means that this word refers to nouns of the semantic category «part of a human’s body». The second block of the database includes the semantic-syntactical formulas of the analyzed advertising texts written in a special formal language SEMSINT. The author gives a brief description of this language, presenting its essence and structure. Also, an example of one formalized advertising text in SEMSINT is provided.

  6. AUTO-LAY: automatic layout generation for procedure flow diagrams

    International Nuclear Information System (INIS)

    Forzano, P.; Castagna, P.

    1995-01-01

    Nuclear Power Plant procedures can be seen from essentially two viewpoints: the process and the information management. From the first point of view, it is important to supply the knowledge apt to solve problems connected with the control of the process; from the second one, the focus of attention is on the knowledge representation, its structure, elicitation and maintenance, and formal quality assurance. These two aspects of procedure representation can be considered and solved separately. In particular, methodological, formal and management issues require long and tedious activities, which in most cases constitute a great barrier to procedure development and upgrade. To solve these problems, Ansaldo is developing DIAM, a wide integrated tool for procedure management to support procedure writing, updating, usage and documentation. One of the most challenging features of DIAM is AUTO-LAY, a CASE sub-tool that, in a completely automatic way, structures parts of or complete flow diagrams. This is a feature that is partially present in some other CASE products, which, anyway, do not allow complex graph handling and isomorphism between video and paper representation. AUTO-LAY has the unique prerogative to draw graphs of any complexity, to section them into pages, and to automatically compose a document. This has been recognized in the literature as the most important second-generation CASE improvement. (author). 5 refs., 9 figs

  7. Hybrid mesh generation for the new generation of oil reservoir simulators: 3D extension; Generation de maillage hybride pour les simulateurs de reservoir petrolier de nouvelle generation: extension 3D

    Energy Technology Data Exchange (ETDEWEB)

    Flandrin, N.

    2005-09-15

    During the exploitation of an oil reservoir, it is important to predict the recovery of hydrocarbons and to optimize its production. A better comprehension of the physical phenomena requires simulating 3D multiphase flows in increasingly complex geological structures. In this thesis, we are interested in the spatial discretization and we propose to extend to 3D the 2D hybrid model proposed by IFP in 1998, which allows the radial characteristics of the flows to be taken into account directly in the geometry. In these hybrid meshes, the wells and their drainage areas are described by structured radial circular meshes and the reservoirs are represented by structured meshes that can be a non-uniform Cartesian grid or a Corner Point Geometry grid. In order to generate a global conforming mesh, unstructured transition meshes based on power diagrams and satisfying finite volume properties are used to connect the structured meshes together. Two methods have been implemented to generate these transition meshes: the first one is based on a Delaunay triangulation, the other one uses a frontal approach. Finally, some criteria are introduced to measure the quality of the transition meshes and optimization procedures are proposed to increase this quality under finite volume properties constraints. (author)

  8. Automatic generation of warehouse mediators using an ontology engine

    Energy Technology Data Exchange (ETDEWEB)

    Critchlow, T., LLNL

    1998-04-01

    Data warehouses created for dynamic scientific environments, such as genetics, face significant challenges to their long-term feasibility. One of the most significant of these is the high frequency of schema evolution resulting from both technological advances and scientific insight. Failure to quickly incorporate these modifications will quickly render the warehouse obsolete, yet each evolution requires significant effort to ensure the changes are correctly propagated. DataFoundry utilizes a mediated warehouse architecture with an ontology infrastructure to reduce the maintenance requirements of a warehouse. Among other things, the ontology is used as an information source for automatically generating mediators, the methods that transfer data between the data sources and the warehouse. The identification, definition and representation of the metadata required to perform this task is a primary contribution of this work.

  9. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    Science.gov (United States)

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; West, Richard H.

    2016-06-01

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
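
    The rate-based enlargement loop at the heart of RMG can be paraphrased in a few lines; this is a schematic sketch, not rmgpy code, and the species names, fluxes and tolerance are illustrative.

```python
def enlarge_model(core, edge_fluxes, characteristic_rate, tol=0.1):
    """Move edge species into the core model whenever their formation flux
    exceeds a fraction `tol` of the characteristic rate (rate-based algorithm).
    `edge_fluxes` maps candidate species -> formation flux at current conditions."""
    added = [s for s, flux in edge_fluxes.items()
             if flux > tol * characteristic_rate and s not in core]
    core.extend(added)
    return added

core = ["CH4", "O2", "OH"]
edge_fluxes = {"CH3": 4.2e-3, "HO2": 9.0e-6, "CH2O": 1.1e-3}
print(enlarge_model(core, edge_fluxes, characteristic_rate=1.0e-2))  # ['CH3', 'CH2O']
```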

  10. An enhanced geometry-independent mesh weight window generator for MCNP

    International Nuclear Information System (INIS)

    Evans, T.M.; Hendricks, J.S.

    1997-01-01

    A new, enhanced, weight window generator suite has been developed for MCNP™. The new generator correctly estimates importances either in a user-specified, geometry-independent orthogonal grid or in MCNP geometric cells. The geometry-independent option alleviates the need to subdivide the MCNP cell geometry for variance reduction purposes. In addition, the new suite corrects several pathologies in the existing MCNP weight window generator. To verify the correctness of the new implementation, comparisons are performed with the analytical solution for the cell importance. Using the new generator, differences between Monte Carlo generated and analytical importances are less than 0.1%. Also, assumptions implicit in the original MCNP generator are shown to be poor in problems with high scattering media. The new generator is fully compatible with MCNP's AVATAR™ automatic variance reduction method. The new generator, together with AVATAR, gives MCNP an enhanced suite of variance reduction methods. The flexibility and efficacy of this suite is demonstrated in a neutron porosity tool well-logging problem.

  11. Automatic speech recognition for report generation in computed tomography

    International Nuclear Information System (INIS)

    Teichgraeber, U.K.M.; Ehrenstein, T.; Lemke, M.; Liebig, T.; Stobbe, H.; Hosten, N.; Keske, U.; Felix, R.

    1999-01-01

    Purpose: A study was performed to compare the performance of automatic speech recognition (ASR) with conventional transcription. Materials and Methods: 100 CT reports were generated by using ASR and 100 CT reports were dictated and written by medical transcriptionists. The time for dictation and correction of errors by the radiologist was assessed and the type of mistakes was analysed. The text recognition rate was calculated in both groups and the average time between completion of the imaging study by the technologist and generation of the written report was assessed. A commercially available speech recognition technology (ASKA Software, IBM Via Voice) running on a personal computer was used. Results: The time for the dictation using digital voice recognition was 9.4±2.3 min compared to 4.5±3.6 min with an ordinary Dictaphone. The text recognition rate was 97% with digital voice recognition and 99% with medical transcriptionists. The average time from imaging completion to written report finalisation was reduced from 47.3 hours with medical transcriptionists to 12.7 hours with ASR. The analysis of misspellings demonstrated (ASR vs. medical transcriptionists): 3 vs. 4 for syntax errors, 0 vs. 37 orthographic mistakes, 16 vs. 22 mistakes in substance and 47 vs. erroneously applied terms. Conclusions: The use of digital voice recognition as a replacement for medical transcription is recommendable when an immediate availability of written reports is necessary. (orig.) [de

  12. An eFTD-VP framework for efficiently generating patient-specific anatomically detailed facial soft tissue FE mesh for craniomaxillofacial surgery simulation.

    Science.gov (United States)

    Zhang, Xiaoyan; Kim, Daeseung; Shen, Shunyao; Yuan, Peng; Liu, Siting; Tang, Zhen; Zhang, Guangming; Zhou, Xiaobo; Gateno, Jaime; Liebschner, Michael A K; Xia, James J

    2018-04-01

    Accurate surgical planning and prediction of craniomaxillofacial surgery outcome requires simulation of soft tissue changes following osteotomy. This can only be achieved by using an anatomically detailed facial soft tissue model. The current state-of-the-art of model generation is not appropriate to clinical applications due to the time-intensive nature of manual segmentation and volumetric mesh generation. The conventional patient-specific finite element (FE) mesh generation methods are to deform a template FE mesh to match the shape of a patient based on registration. However, these methods commonly produce element distortion. Additionally, the mesh density for patients depends on that of the template model. It could not be adjusted to conduct mesh density sensitivity analysis. In this study, we propose a new framework of patient-specific facial soft tissue FE mesh generation. The goal of the developed method is to efficiently generate a high-quality patient-specific hexahedral FE mesh with adjustable mesh density while preserving the accuracy in anatomical structure correspondence. Our FE mesh is generated by eFace template deformation followed by volumetric parametrization. First, the patient-specific anatomically detailed facial soft tissue model (including skin, mucosa, and muscles) is generated by deforming an eFace template model. The adaptation of the eFace template model is achieved by using a hybrid landmark-based morphing and dense surface fitting approach followed by a thin-plate spline interpolation. Then, high-quality hexahedral mesh is constructed by using volumetric parameterization. The user can control the resolution of hexahedron mesh to best reflect clinicians' need. Our approach was validated using 30 patient models and 4 visible human datasets. The generated patient-specific FE mesh showed high surface matching accuracy, element quality, and internal structure matching accuracy. They can be directly and effectively used for clinical

  13. An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations

    Science.gov (United States)

    Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario

    2016-01-01

    Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identification of the graphlets that a vertex in an explored graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more different graphlets emerge and the time needed to find each graphlet also scales up. If it is not needed to find each instance of each graphlet, but knowing the number of graphlets touching each node of the graph suffices, the problem is less hard. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets needed, smaller graphlets are searched, as well as the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations only exist to count graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first allows the needed equations to be generated in an automatic way. This eliminates the tedious work needed to do so manually each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering which is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
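
    The central trick, counting smaller structures plus common neighbors instead of enumerating every graphlet, is easiest to see for the smallest case: the number of triangles a vertex touches follows from common-neighbor tests alone. A toy sketch, not the authors' equation generator:

```python
from itertools import combinations

def triangles_per_vertex(adj):
    """adj: dict mapping vertex -> set of neighbors (undirected graph).
    A vertex v is in one triangle per pair of its neighbors that are
    themselves connected, i.e. per common-neighbor relation among them."""
    counts = {}
    for v, nbrs in adj.items():
        counts[v] = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return counts

adj = {1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2, 4}, 4: {1, 3}}
print(triangles_per_vertex(adj))  # {1: 2, 2: 1, 3: 2, 4: 1}
```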

  14. [Development of a Software for Automatically Generated Contours in Eclipse TPS].

    Science.gov (United States)

    Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin

    2015-03-01

    The automatic generation of planning targets and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop software for automatically generating contours in Eclipse TPS. This software is named Contour Auto Margin (CAM), and is composed of operational functions for contours, script-generated visualization and script file operations. Results: Ten cases of different cancers were selected separately; in Eclipse TPS 11.0, scripts generated by the software could not only automatically generate contours but also do contour post-processing. For different cancers, there was no difference between automatically generated contours and manually created contours. CAM is a user-friendly and powerful software tool that can automatically generate contours quickly in Eclipse TPS 11.0. With the help of CAM, it greatly saves plan preparation time and improves the working efficiency of radiation therapy physicists.

  15. Application of a droplet evaporation model to aerodynamic size measurement of drug aerosols generated by a vibrating mesh nebulizer.

    Science.gov (United States)

    Rao, Nagaraja; Kadrichu, Nani; Ament, Brian

    2010-10-01

    Droplet evaporation has been known to bias cascade impactor measurement of aerosols generated by jet nebulizers. Previous work suggests that vibrating mesh nebulizers behave differently from jet nebulizers. Unlike jet nebulizers, vibrating mesh nebulizers do not rely on compressed air to generate droplets. However, entrained air is still required to transport the generated droplets through the cascade impactor during measurement. The mixing of the droplet and entrained air streams, and heat and mass transfer occurring downstream determine the final aerosol size distribution actually measured by the cascade impactor. This study is aimed at quantifying the effect of these factors on droplet size measurements for the case of vibrating mesh nebulizers. A simple droplet evaporation model has been applied to investigate aerodynamic size measurement of drug aerosol droplets produced by a proprietary vibrating mesh nebulizer. The droplet size measurement system used in this study is the Next Generation Impactor (NGI) cascade impactor. Comparison of modeling results with experiment indicates that droplet evaporation remains a significant effect when sizing aerosol generated by a vibrating mesh nebulizer. Results from the droplet evaporation model show that the mass median aerodynamic diameter (MMAD) measured by the NGI is strongly influenced not only by the initial droplet size, but also by factors such as the temperature and humidity of entrained air, the nebulizer output rate, and the entrained air flow rate. The modeling and experimental results indicate that the influence of these variables on size measurements may be reduced significantly by refrigerating the impactor down to 5°C prior to measurement. The same data also support the conclusion that for the case of nebulized drug solutions, laser diffraction spectrometry provides a meaningful droplet sizing approach that is simpler and less susceptible to such droplet evaporation artifacts.

  16. Automatic run-time provenance capture for scientific dataset generation

    Science.gov (United States)

    Frew, J.; Slaughter, P.

    2008-12-01

    Provenance---the directed graph of a dataset's processing history---is difficult to capture effectively. Human-generated provenance, as narrative metadata, is labor-intensive and thus often incorrect, incomplete, or simply not recorded. Workflow systems capture some provenance implicitly in workflow specifications, but these systems are not ubiquitous or standardized, and a workflow specification may not capture all of the factors involved in a dataset's production. System audit trails capture potentially all processing activities, but not the relationships between them. We describe a system that transparently (i.e., without any modification to science codes) and automatically (i.e. without any human intervention) captures the low-level interactions (files read/written, parameters accessed, etc.) between scientific processes, and then synthesizes these relationships into a provenance graph. This system---the Earth System Science Server (ES3)---is sufficiently general that it can accommodate any combination of stand-alone programs, interpreted codes (e.g. IDL), and command-language scripts. Provenance in ES3 can be published in well-defined XML formats (including formats suitable for graphical visualization), and queried to determine the ancestors or descendants of any specific data file or process invocation. We demonstrate how ES3 can be used to capture the provenance of a large operational ocean color dataset.
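
    The synthesis of file/process interactions into a provenance graph can be sketched as a small ancestor query over (process, inputs, outputs) records; this is a generic illustration, and ES3's actual capture mechanism and schema are not shown.

```python
def build_provenance(events):
    """events: list of (process, inputs, outputs).  Returns a mapping from
    each artifact (file or process) to the artifacts it was derived from."""
    parents = {}
    for process, inputs, outputs in events:
        parents.setdefault(process, set()).update(inputs)
        for out in outputs:
            parents.setdefault(out, set()).add(process)
    return parents

def ancestors(parents, artifact):
    """All transitive ancestors of a file or process in the provenance graph."""
    seen, stack = set(), list(parents.get(artifact, ()))
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(parents.get(node, ()))
    return seen

events = [("calibrate", ["raw.hdf"], ["cal.hdf"]),
          ("composite", ["cal.hdf", "mask.png"], ["chlorophyll.tif"])]
parents = build_provenance(events)
print(ancestors(parents, "chlorophyll.tif"))
# {'composite', 'cal.hdf', 'mask.png', 'calibrate', 'raw.hdf'}
```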

  17. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master's thesis was performed at CERN, more specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms that are based on the CODESYS development tool into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to give a solution that is shared by both PLC platforms and was based on the PLCOpen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and by testing the behavior of the library code.

  18. Automatic summary generating technology of vegetable traceability for information sharing

    Science.gov (United States)

    Zhenxuan, Zhang; Minjing, Peng

    2017-06-01

    In order to solve the problems of excessive data entries and the consequent high costs of data collection in vegetable traceability for farmers in traceability applications, the automatic summary generating technology of vegetable traceability for information sharing was proposed. The proposed technology is an effective way for farmers to share real-time vegetable planting information in social networking platforms to enhance their brands and obtain more customers. In this research, the influencing factors in the vegetable traceability for customers were analyzed to establish the sub-indicators and target indicators and propose a computing model based on the collected parameter values of the planted vegetables and standard legal systems on food safety. The proposed standard parameter model involves five steps: accessing the database, establishing target indicators, establishing sub-indicators, establishing the standard reference model and computing scores of indicators. On the basis of establishing and optimizing the standards of food safety and traceability system, this proposed technology could be accepted by more and more farmers and customers.

  19. Automatic generation of digital anthropomorphic phantoms from simulated MRI acquisitions

    Science.gov (United States)

    Lindsay, C.; Gennert, M. A.; Könik, A.; Dasari, P. K.; King, M. A.

    2013-03-01

    In SPECT imaging, motion from patient respiration and body motion can introduce image artifacts that may reduce the diagnostic quality of the images. Simulation studies using numerical phantoms with precisely known motion can help to develop and evaluate motion correction algorithms. Previous methods for evaluating motion correction algorithms used either manual or semi-automated segmentation of MRI studies to produce patient models in the form of XCAT Phantoms, from which one calculates the transformation and deformation between MRI study and patient model. Both manual and semi-automated methods of XCAT Phantom generation require expertise in human anatomy, with the semiautomated method requiring up to 30 minutes and the manual method requiring up to eight hours. Although faster than manual segmentation, the semi-automated method still requires a significant amount of time, is not replicable, and is subject to errors due to the difficulty of aligning and deforming anatomical shapes in 3D. We propose a new method for matching patient models to MRI that extends the previous semi-automated method by eliminating the manual non-rigid transformation. Our method requires no user supervision and therefore does not require expert knowledge of human anatomy to align the NURBs to anatomical structures in the MR image. Our contribution is employing the SIMRI MRI simulator to convert the XCAT NURBs to a voxel-based representation that is amenable to automatic non-rigid registration. Then registration is used to transform and deform the NURBs to match the anatomy in the MR image. We show that our automated method generates XCAT Phantoms more robustly and significantly faster than the previous semi-automated method.

  20. Automatic generation of investigator bibliographies for institutional research networking systems.

    Science.gov (United States)

    Johnson, Stephen B; Bales, Michael E; Dine, Daniel; Bakken, Suzanne; Albert, Paul J; Weng, Chunhua

    2014-10-01

    Publications are a key data source for investigator profiles and research networking systems. We developed ReCiter, an algorithm that automatically extracts bibliographies from PubMed using institutional information about the target investigators. ReCiter executes a broad query against PubMed, groups the results into clusters that appear to constitute distinct author identities and selects the cluster that best matches the target investigator. Using information about investigators from one of our institutions, we compared ReCiter results to queries based on author name and institution and to citations extracted manually from the Scopus database. Five judges created a gold standard using citations of a random sample of 200 investigators. About half of the 10,471 potential investigators had no matching citations in PubMed, and about 45% had fewer than 70 citations. Interrater agreement (Fleiss' kappa) for the gold standard was 0.81. Scopus achieved the best recall (sensitivity) of 0.81, while name-based queries had 0.78 and ReCiter had 0.69. ReCiter attained the best precision (positive predictive value) of 0.93 while Scopus had 0.85 and name-based queries had 0.31. ReCiter accesses the most current citation data, uses limited computational resources and minimizes manual entry by investigators. Generation of bibliographies using name-based queries will not yield high accuracy. Proprietary databases can perform well but require manual effort. Automated generation with higher recall is possible but requires additional knowledge about investigators. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Power generation using carbon mesh cathodes with different diffusion layers in microbial fuel cells

    KAUST Repository

    Luo, Yong

    2011-11-01

    An inexpensive carbon material, carbon mesh, was examined to replace the more expensive carbon cloth usually used to make cathodes in air-cathode microbial fuel cells (MFCs). Three different diffusion layers were tested using carbon mesh: poly(dimethylsiloxane) (PDMS), polytetrafluoroethylene (PTFE), and Goretex cloth. Carbon mesh with a mixture of PDMS and carbon black as a diffusion layer produced a maximum power density of 1355 ± 62 mW m-2 (normalized to the projected cathode area), which was similar to that obtained with a carbon cloth cathode (1390 ± 72 mW m-2). Carbon mesh with a PTFE diffusion layer produced only a slightly lower (6.6%) maximum power density (1303 ± 48 mW m-2). The Coulombic efficiencies were a function of current density, with the highest value for the carbon mesh and PDMS (79%) larger than that for carbon cloth (63%). The cost of the carbon mesh cathode with PDMS/Carbon or PTFE (excluding catalyst and binder costs) is only 2.5% of the cost of the carbon cloth cathode. These results show that low cost carbon materials such as carbon mesh can be used as the cathode in an MFC without reducing the performance compared to more expensive carbon cloth. © 2011 Elsevier B.V.

  2. Automatable Annotations – Image Processing and Machine Learning for Script in 3D and 2D with GigaMesh

    OpenAIRE

    Bogacz, Bartosz; Mara, Hubert

    2017-01-01

    Libraries, archives and museums hold vast numbers of objects with script in 3D such as inscriptions, coins, and seals, which provide valuable insights into the history of humanity. Cuneiform tablets in particular provide access to information on more than three millennia BC. Since these clay tablets require an extensive examination for transcription, we developed the modular GigaMesh software framework to provide high-contrast visualization of tablets captured with 3D acquisition techniques. T...

  3. Semi-Automatic Story Generation for a Geographic Server

    Directory of Open Access Journals (Sweden)

    Rizwan Mehmood

    2017-06-01

    Full Text Available Most existing servers providing geographic data tend to offer various numeric data. We started to work on a new type of geographic server, motivated by four major issues: (i) How to handle figures when different databases present different values; (ii) How to build up sizeable collections of pictures with detailed descriptions; (iii) How to update rapidly changing information, such as personnel holding important functions, and (iv) how to describe countries not just by using trivial facts, but stories typical of the country involved. We have discussed and partially resolved issues (i) and (ii) in previous papers; we have decided to deal with (iii), regional updates, by tying in an international consortium whose members would either help themselves or find individuals to do so. It is issue (iv), how to generate non-trivial stories typical of a country, that we decided to tackle both manually (the consortium has by now generated around 200 stories), and by developing techniques for semi-automatic story generation, which is the topic of this paper. The basic idea was first to define sets of reasonably reliable servers that may differ from region to region, to extract “interesting facts” from the servers, and combine them in a raw version of a report that would require some manual cleaning-up (hence: semi-automatic). It may sound difficult to extract “interesting facts” from Web pages, but it is quite possible to define heuristics to do so, never exceeding the few lines allowed for quotation purposes. One very simple rule we adopted was this: ‘Look for sentences with superlatives!’ If a sentence contains words like “biggest”, “highest”, “most impressive” etc. it is likely to contain an interesting fact. With a little imagination, we have been able to establish a set of such rules. We will show that the stories can be completely different. For some countries, historical facts may dominate; for others, the beauty of landscapes; for

  4. User evaluation of a communication system that automatically generates captions to improve telephone communication

    NARCIS (Netherlands)

    Zekveld, A.A.; Kramer, S.E.; Kessens, J.M.; Vlaming, M.S.M.G.; Houtgast, T.

    2009-01-01

    This study examined the subjective benefit obtained from automatically generated captions during telephone-speech comprehension in the presence of babble noise. Short stories were presented by telephone either with or without captions that were generated offline by an automatic speech recognition

  5. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology for maintaining the real-time balance between power generation and load, and for ensuring the quality of power supply. Power grids require each power generation unit to have a satisfactory AGC performance, as specified in two detailed rules. The two rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly used method to calculate these indices is based on particular data samples from AGC responses and will lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between performance indices and load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.

  6. Generating Inviscid and Viscous Fluid-Flow Simulations over an Aircraft Surface Using a Fluid-Flow Mesh

    Science.gov (United States)

    Rodriguez, David L. (Inventor); Sturdza, Peter (Inventor)

    2013-01-01

    Fluid-flow simulation over a computer-generated aircraft surface is generated using inviscid and viscous simulations. A fluid-flow mesh of fluid cells is obtained. At least one inviscid fluid property for the fluid cells is determined using an inviscid fluid simulation that does not simulate fluid viscous effects. A set of intersecting fluid cells that intersects the aircraft surface are identified. One surface mesh polygon of the surface mesh is identified for each intersecting fluid cell. A boundary-layer prediction point for each identified surface mesh polygon is determined. At least one boundary-layer fluid property for each boundary-layer prediction point is determined using the at least one inviscid fluid property of the corresponding intersecting fluid cell and a boundary-layer simulation that simulates fluid viscous effects. At least one updated fluid property for at least one fluid cell is determined using the at least one boundary-layer fluid property and the inviscid fluid simulation.

  7. An Efficient Mesh Generation Method for Fractured Network System Based on Dynamic Grid Deformation

    Directory of Open Access Journals (Sweden)

    Shuli Sun

    2013-01-01

    Full Text Available Meshing quality of the discrete model influences the accuracy, convergence, and efficiency of the solution for fractured network systems in geological problems. However, modeling and meshing of such a fractured network system are usually tedious and difficult due to geometric complexity of the computational domain induced by existence and extension of fractures. The traditional meshing method to deal with fractures usually involves boundary recovery operation based on topological transformation, which relies on many complicated techniques and skills. This paper presents an alternative and efficient approach for meshing fractured network systems. The method firstly presets points on fractures and then performs Delaunay triangulation to obtain a preliminary mesh by a point-by-point centroid insertion algorithm. Then the fractures are exactly recovered by local correction with a revised dynamic grid deformation approach. A smoothing algorithm is finally applied to improve the quality of the mesh. The proposed approach is efficient, easy to implement, and applicable to the cases of initial existing fractures and extension of fractures. The method is successfully applied to modeling of two- and three-dimensional discrete fractured network (DFN) systems in geological problems to demonstrate its effectiveness and high efficiency.
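
A minimal sketch of the first step described in this abstract (presetting points along a fracture and running a Delaunay triangulation over them together with the domain points), using scipy; the point-by-point centroid insertion, fracture recovery and smoothing steps are not reproduced here:

```python
import numpy as np
from scipy.spatial import Delaunay

def preset_fracture_points(p0, p1, spacing):
    """Place points along a fracture segment p0-p1 at roughly the given spacing."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    n = max(int(np.linalg.norm(p1 - p0) / spacing), 1)
    t = np.linspace(0.0, 1.0, n + 1)[:, None]
    return p0 + t * (p1 - p0)

# Domain corner points plus points preset on one fracture.
domain = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
fracture = preset_fracture_points([0.2, 0.2], [0.8, 0.7], spacing=0.1)
points = np.vstack([domain, fracture])

tri = Delaunay(points)          # preliminary triangulation
print(tri.simplices.shape)      # (number of triangles, 3)
```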

  8. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented...... work allows to minimize cycle time as well as setup cost, which are essential factors in automatic bin-picking. It therefore leads to a wider applicability of bin-picking in industry....

  9. Triangle geometry processing for surface modeling and cartesian grid generation

    Science.gov (United States)

    Aftosmis, Michael J [San Mateo, CA; Melton, John E [Hollister, CA; Berger, Marsha J [New York, NY

    2002-09-03

    Cartesian mesh generation is accomplished for component based geometries, by intersecting components subject to mesh generation to extract wetted surfaces with a geometry engine using adaptive precision arithmetic in a system which automatically breaks ties with respect to geometric degeneracies. During volume mesh generation, intersected surface triangulations are received to enable mesh generation with cell division of an initially coarse grid. The hexahedral cells are resolved, preserving the ability to directionally divide cells which are locally well aligned.

  10. How to model wireless mesh networks topology

    International Nuclear Information System (INIS)

    Sanni, M L; Hashim, A A; Anwar, F; Ali, S; Ahmed, G S M

    2013-01-01

    The specification of a network connectivity model or topology is the beginning of design and analysis in Computer Network research. A Wireless Mesh Network is an autonomic network that is dynamically self-organised and self-configured, with the mesh nodes establishing automatic connectivity with the adjacent nodes in the relay network of wireless backbone routers. Research in Wireless Mesh Networks ranges from node deployment to internetworking issues with sensor, Internet and cellular networks. This research requires modelling of relationships and interactions among nodes including technical characteristics of the links while satisfying the architectural requirements of the physical network. However, the existing topology generators model geographic topologies which constitute different architectures, and thus may not be suitable in Wireless Mesh Networks scenarios. The existing methods of topology generation are explored and analysed, and parameters for their characterisation are identified. Furthermore, an algorithm for the design of Wireless Mesh Networks topology based on a square grid model is proposed in this paper. The performance of the generated topology is also evaluated. This research is particularly important in the generation of a close-to-real topology for ensuring relevance of design to the intended network and validity of results obtained in Wireless Mesh Networks research.
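
As a toy illustration of the square-grid model mentioned above, the following sketch builds an n x n grid of mesh routers with links between horizontally and vertically adjacent nodes; it is an assumption-level example, not the paper's generator:

```python
def square_grid_topology(n):
    """Return nodes and links of an n x n square-grid mesh topology."""
    nodes = [(i, j) for i in range(n) for j in range(n)]
    links = []
    for i, j in nodes:
        if i + 1 < n:
            links.append(((i, j), (i + 1, j)))   # vertical neighbour
        if j + 1 < n:
            links.append(((i, j), (i, j + 1)))   # horizontal neighbour
    return nodes, links

nodes, links = square_grid_topology(4)
print(len(nodes), len(links))   # 16 nodes, 24 links
```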

  11. A strategy for automatically generating programs in the lucid programming language

    Science.gov (United States)

    Johnson, Sally C.

    1987-01-01

    A strategy for automatically generating and verifying simple computer programs is described. The programs are specified by a precondition and a postcondition in predicate calculus. The programs generated are in the Lucid programming language, a high-level, data-flow language known for its attractive mathematical properties and ease of program verification. The Lucid programming language is described, and the automatic program generation strategy is described and applied to several example problems.

  12. Extraction: a system for automatic eddy current diagnosis of steam generator tubes in nuclear power plants

    International Nuclear Information System (INIS)

    Georgel, B.; Zorgati, R.

    1994-01-01

    Improving the speed and quality of Eddy Current non-destructive testing of steam generator tubes calls for automating all processes that contribute to diagnosis. This paper describes how we use signal processing, pattern recognition and artificial intelligence to build a software package that is able to automatically provide an efficient diagnosis. (authors). 2 figs., 5 refs

  13. Automatic Texture and Orthophoto Generation from Registered Panoramic Views

    DEFF Research Database (Denmark)

    Krispel, Ulrich; Evers, Henrik Leander; Tamke, Martin

    2015-01-01

    Recent trends in 3D scanning are aimed at the fusion of range data and color information from images. The combination of these two outputs allows novel semantic information to be extracted. The workflow presented in this paper allows the detection of objects, such as light switches, that are hard to identify from range data only. In order to detect these elements, we developed a method that utilizes range data and color information from high-resolution panoramic images of indoor scenes, taken at the scanner's position. A proxy geometry is derived from the point clouds; orthographic views of the scene are automatically identified from the geometry and an image per view is created via projection. We combine methods of computer vision to train a classifier to detect the objects of interest from these orthographic views. Furthermore, these views can be used for automatic texturing of the proxy geometry.

  14. Computer program for automatic generation of BWR control rod patterns

    International Nuclear Information System (INIS)

    Taner, M.S.; Levine, S.H.; Hsia, M.Y.

    1990-01-01

    A computer program named OCTOPUS has been developed to automatically determine a control rod pattern that approximates some desired target power distribution as closely as possible without violating any thermal safety or reactor criticality constraints. The program OCTOPUS performs a semi-optimization task based on the method of approximation programming (MAP) to develop control rod patterns. The SIMULATE-E code is used to determine the nucleonic characteristics of the reactor core state

  15. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  16. Cuypers : a semi-automatic hypermedia generation system

    NARCIS (Netherlands)

    J.R. van Ossenbruggen (Jacco); F.J. Cornelissen; J.P.T.M. Geurts (Joost); L. Rutledge (Lloyd); L. Hardman (Lynda)

    2000-01-01

    The report describes the architecture of Cuypers, a system supporting second and third generation Web-based multimedia. First generation Web-content encodes information in handwritten (HTML) Web pages. Second generation Web content generates HTML pages on demand, e.g. by filling in

  17. Isotropic 2D quadrangle meshing with size and orientation control

    KAUST Repository

    Pellenard, Bertrand

    2011-12-01

    We propose an approach for automatically generating isotropic 2D quadrangle meshes from arbitrary domains with a fine control over sizing and orientation of the elements. At the heart of our algorithm is an optimization procedure that, from a coarse initial tiling of the 2D domain, enforces each of the desirable mesh quality criteria (size, shape, orientation, degree, regularity) one at a time, in an order designed not to undo previous enhancements. Our experiments demonstrate how well our resulting quadrangle meshes conform to a wide range of input sizing and orientation fields.

  18. Automatic test pattern generation for iterative logic arrays | Boateng ...

    African Journals Online (AJOL)

    test are first formulated. Next, the repetition property of the test patterns is exploited to develop a method for generating C-tests for ILAs under the cell fault model. Based on the results of test generation, the method identifies points of insertion of ...

  19. Validating EHR documents: automatic schematron generation using archetypes.

    Science.gov (United States)

    Pfeiffer, Klaus; Duftschmid, Georg; Rinner, Christoph

    2014-01-01

    The goal of this study was to examine whether Schematron schemas can be generated from archetypes. The openEHR Java reference API was used to transform an archetype into an object model, which was then extended with context elements. The model was processed and the constraints were transformed into corresponding Schematron assertions. A prototype of the generator for the reference model HL7 v3 CDA R2 was developed and successfully tested. Preconditions for its reusability with other reference models were set. Our results indicate that an automated generation of Schematron schemas is possible with some limitations.

  20. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the competent technical standards (e.g. IEC 880) it is necessary to verify each step in the development process of safety critical software. This holds also for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used which is developed independently from the development of the code generator. For this purpose ISTec has developed the tool RETRANS which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  1. Unidirectional high fiber content composites: Automatic 3D FE model generation and damage simulation

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2009-01-01

    A new method and a software code for the automatic generation of 3D micromechanical FE models of unidirectional long-fiber-reinforced composite (LFRC) with high fiber volume fraction with random fiber arrangement are presented. The fiber arrangement in the cross-section is generated through random...

  2. Knowledge Base for Automatic Generation of Online IMS LD Compliant Course Structures

    Science.gov (United States)

    Pacurar, Ecaterina Giacomini; Trigano, Philippe; Alupoaie, Sorin

    2006-01-01

    Our article presents a pedagogical scenarios-based web application that allows the automatic generation and development of pedagogical websites. These pedagogical scenarios are represented in the IMS Learning Design standard. Our application is a web portal helping teachers to dynamically generate web course structures, to edit pedagogical content…

  3. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems have become more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3 dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic 3D building model generation from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified that automatic 3D

  4. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    Full Text Available LiDAR systems have become more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3 dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic 3D building model generation from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified

  5. Automatic Tamil lyric generation based on ontological interpretation ...

    Indian Academy of Sciences (India)

    Once the appropriate tri-grams are selected, the root words from these tri-grams are sent to the morphological generator, to form words in their packed form. These words are then assembled to form the final lyrics. Parameters of poetry like rhyme, alliteration, simile, vocative words, etc., are also taken care of by the system.

  6. Designing a story database for use in automatic story generation

    NARCIS (Netherlands)

    Oinonen, Katri; Theune, Mariët; Nijholt, Anton; Uijlings, Jasper; Harper, Richard; Rauterberg, Matthias; Combetto, Marco

    In this paper we propose a model for the representation of stories in a story database. The use of such a database will enable computational story generation systems to learn from previous stories and associated user feedback, in order to create believable stories with dramatic plots that invoke an

  7. Power generation using an activated carbon and metal mesh cathode in a microbial fuel cell

    KAUST Repository

    Zhang, Fang

    2009-11-01

    An inexpensive activated carbon (AC) air cathode was developed as an alternative to a platinum-catalyzed electrode for oxygen reduction in a microbial fuel cell (MFC). AC was cold-pressed with a polytetrafluoroethylene (PTFE) binder to form the cathode around a Ni mesh current collector. This cathode construction avoided the need for carbon cloth or a metal catalyst, and produced a cathode with high activity for oxygen reduction at typical MFC current densities. Tests with the AC cathode produced a maximum power density of 1220 mW/m2 (normalized to cathode projected surface area; 36 W/m3 based on liquid volume) compared to 1060 mW/m2 obtained by Pt catalyzed carbon cloth cathode. The Coulombic efficiency ranged from 15% to 55%. These findings show that AC is a cost-effective material for achieving useful rates of oxygen reduction in air cathode MFCs. © 2009 Elsevier B.V. All rights reserved.

  8. Training IBM Watson using Automatically Generated Question-Answer Pairs

    OpenAIRE

    Lee, Jangho; Kim, Gyuwan; Yoo, Jaeyoon; Jung, Changwoo; Kim, Minseok; Yoon, Sungroh

    2016-01-01

    IBM Watson is a cognitive computing system capable of question answering in natural languages. It is believed that IBM Watson can understand large corpora and answer relevant questions more effectively than any other question-answering system currently available. To unleash the full power of Watson, however, we need to train its instance with a large number of well-prepared question-answer pairs. Obviously, manually generating such pairs in a large quantity is prohibitively time consuming and...

  9. Automatic capture of attention by conceptually generated working memory templates.

    Science.gov (United States)

    Sun, Sol Z; Shen, Jenny; Shaw, Mark; Cant, Jonathan S; Ferber, Susanne

    2015-08-01

    Many theories of attention propose that the contents of working memory (WM) can act as an attentional template, which biases processing in favor of perceptually similar inputs. While support has been found for this claim, it is unclear how attentional templates are generated when searching real-world environments. We hypothesized that in naturalistic settings, attentional templates are commonly generated from conceptual knowledge, an idea consistent with sensorimotor models of knowledge representation. Participants performed a visual search task in the delay period of a WM task, where the item in memory was either a colored disk or a word associated with a color concept (e.g., "Rose," associated with red). During search, we manipulated whether a singleton distractor in the array matched the contents of WM. Overall, we found that search times were impaired in the presence of a memory-matching distractor. Furthermore, the degree of impairment did not differ based on the contents of WM. Put differently, regardless of whether participants were maintaining a perceptually colored disk identical to the singleton distractor, or whether they were simply maintaining a word associated with the color of the distractor, the magnitude of attentional capture was the same. Our results suggest that attentional templates can be generated from conceptual knowledge, in the physical absence of the visual feature.

  10. A Hybrid Intelligent Search Algorithm for Automatic Test Data Generation

    Directory of Open Access Journals (Sweden)

    Ying Xing

    2015-01-01

    Full Text Available The increasing complexity of large-scale real-world programs necessitates the automation of software testing. As a basic problem in software testing, the automation of path-wise test data generation is especially important, which is in essence a constraint optimization problem solved by search strategies. Therefore, the constraint processing efficiency of the selected search algorithm is a key factor. Aiming at the increase of search efficiency, a hybrid intelligent algorithm is proposed to efficiently search the solution space of potential test data by making full use of both global and local search methods. Branch and bound is adopted for global search, which gives definite results with relatively less cost. In the search procedure for each variable, hill climbing is adopted for local search, which is enhanced with the initial values selected heuristically based on the monotonicity analysis of branching conditions. They are highly integrated by an efficient ordering method and the backtracking operation. In order to facilitate the search methods, the solution space is represented as state space. Experimental results show that the proposed method outperformed some other methods used in test data generation. The heuristic initial value selection strategy improves the search efficiency greatly and makes the search basically backtrack-free. The results also demonstrate that the proposed method is applicable in engineering.
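
An abstracted sketch of the local-search step described above: hill climbing on a single integer test-data variable from a heuristically chosen initial value. The branch-and-bound outer loop, monotonicity analysis, variable ordering and backtracking are omitted, and the fitness function is a toy stand-in:

```python
def hill_climb(fitness, start, lower, upper, max_steps=1000):
    """Greedy hill climbing over an integer variable in [lower, upper].

    `fitness` should grow as the candidate gets closer to covering the
    target branch; `start` is the heuristically chosen initial value.
    """
    current = start
    for _ in range(max_steps):
        neighbours = [v for v in (current - 1, current + 1) if lower <= v <= upper]
        best = max(neighbours, key=fitness, default=current)
        if fitness(best) <= fitness(current):
            return current          # local optimum (ideally the solution)
        current = best
    return current

# Toy branch condition x * x == 100, searched from a heuristic start of 5.
solution = hill_climb(lambda x: -abs(x * x - 100), start=5, lower=0, upper=100)
print(solution)   # 10
```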

  11. Automatic generation of computer programs servicing TFTR console displays

    International Nuclear Information System (INIS)

    Eisenberg, H.

    1983-01-01

    A number of alternatives were considered in providing programs to support the several hundred displays required for control and monitoring of TFTR equipment. Since similar functions were performed, an automated method of creating programs was suggested. The complexity of a single program servicing as many as thirty consoles militated against that approach. Similarly, creation of a syntactic language, while elegant, was deemed to be too time consuming, and had the disadvantage of requiring a working knowledge of the language on a programming level. It was elected to pursue a method of generating an individual program to service a particular display. A feasibility study was conducted and the Control and Monitor Display Generator system (CMDG) was developed. A Control and Monitor Display Service Program (CMDS) provides a means of performing monitor and control functions for devices associated with TFTR subsystems, as well as other user functions, via TFTR Control Consoles. This paper discusses the specific capabilities provided by CMDS in a usage context, as well as the mechanics of implementation.

  12. Automatic Generation of Proof Tactics for Finite-Valued Logics

    Directory of Open Access Journals (Sweden)

    João Marcos

    2010-03-01

    Full Text Available A number of flexible tactic-based logical frameworks are nowadays available that can implement a wide range of mathematical theories using a common higher-order metalanguage. Used as proof assistants, one of the advantages of such powerful systems resides in their responsiveness to extensibility of their reasoning capabilities, being designed over rule-based programming languages that allow the user to build her own 'programs to construct proofs' - the so-called proof tactics. The present contribution discusses the implementation of an algorithm that generates sound and complete tableau systems for a very inclusive class of sufficiently expressive finite-valued propositional logics, and then illustrates some of the challenges and difficulties related to the algorithmic formation of automated theorem proving tactics for such logics. The procedure on whose implementation we will report is based on a generalized notion of analyticity of proof systems that is intended to guarantee termination of the corresponding automated tactics on what concerns theoremhood in our targeted logics.

  13. Automatic generation of indoor navigable space using a point cloud and its scanner trajectory

    NARCIS (Netherlands)

    Staats, B. R.; Diakite, A.A.; Voûte, R.; Zlatanova, S.

    2017-01-01

    Automatic generation of indoor navigable models is mostly based on 2D floor plans. However, in many cases the floor plans are out of date. Buildings are not always built according to their blue prints, interiors might change after a few years because of modified walls and doors, and furniture may

  14. Automatic generation of medium-detailed 3D models of buildings based on CAD data

    NARCIS (Netherlands)

    Dominguez-Martin, B.; Van Oosterom, P.; Feito-Higueruela, F.R.; Garcia-Fernandez, A.L.; Ogayar-Anguita, C.J.

    2015-01-01

    We present the preliminary results of a work in progress which aims to obtain a software system able to automatically generate a set of diverse 3D building models with a medium level of detail, that is, more detailed than a mere parallelepiped, but not as detailed as a complete geometric

  15. Validation study of automatically generated codes in colonoscopy using the endoscopic report system Endobase

    NARCIS (Netherlands)

    Groenen, Marcel J. M.; van Buuren, Henk R.; van Berge Henegouwen, Gerard P.; Fockens, Paul; van der Lei, Johan; Stuifbergen, Wouter N. H. M.; van der Schaar, Peter J.; Kuipers, Ernst J.; Ouwendijk, Rob J. Th

    2010-01-01

    OBJECTIVE: Gastrointestinal endoscopy databases are important for surveillance, epidemiology, quality control and research. Good quality of automatically generated databases is of key importance to enable justified conclusions to be drawn from the data. The aim of this study is to validate the

  16. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  17. A GA-fuzzy automatic generation controller for interconnected power system

    CSIR Research Space (South Africa)

    Boesack, CD

    2011-10-01

    Full Text Available This paper presents a GA-Fuzzy Automatic Generation Controller for large interconnected power systems. The design of Fuzzy Logic Controllers by means of expert knowledge has typically been the traditional design norm; however, this may not yield...

  18. Cross-cultural assessment of automatically generated multimodal referring expressions in a virtual world

    NARCIS (Netherlands)

    van der Sluis, Ielka; Luz, Saturnino; Breitfuss, Werner; Ishizuka, Mitsuru; Prendinger, Helmut

    This paper presents an assessment of automatically generated multimodal referring expressions as produced by embodied conversational agents in a virtual world. The algorithm used for this purpose employs general principles of human motor control and cooperativity in dialogues that can be

  19. Accuracy assessment of building point clouds automatically generated from iphone images

    Science.gov (United States)

    Sirmacek, B.; Lindenbergh, R.

    2014-06-01

    Low-cost sensor generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as an input. We register such automatically generated point cloud on a TLS point cloud of the same object to discuss accuracy, advantages and limitations of the iPhone generated point clouds. For the chosen example showcase, we have classified 1.23% of the iPhone point cloud points as outliers, and calculated the mean of the point to point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. Mean (μ) and standard deviation (σ) of roughness histograms are calculated as (μ1 = 0.44 m., σ1 = 0.071 m.) and (μ2 = 0.025 m., σ2 = 0.037 m.) for the iPhone and TLS point clouds respectively. Our experimental results indicate possible usage of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancing, quick and real-time change detection purposes. However, further insights should be obtained first on the circumstances that are needed to guarantee a successful point cloud generation from smartphone images.
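
A minimal recipe for the point-to-point comparison reported above, assuming the two clouds are already registered in one coordinate frame; the KD-tree nearest-neighbour query and the synthetic clouds are illustrative, not the authors' pipeline:

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distances(source, reference):
    """Nearest-neighbour distance from each source point to the reference cloud."""
    tree = cKDTree(reference)
    distances, _ = tree.query(source, k=1)
    return distances

# Toy random points standing in for the iPhone and TLS clouds.
rng = np.random.default_rng(0)
iphone_cloud = rng.random((1000, 3))
tls_cloud = rng.random((5000, 3))

d = cloud_to_cloud_distances(iphone_cloud, tls_cloud)
print(f"mean point-to-point distance: {d.mean():.3f}, "
      f"points beyond 3 sigma: {(d > d.mean() + 3 * d.std()).mean() * 100:.2f}%")
```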

  20. Automatically-generated rectal dose constraints in intensity-modulated radiation therapy for prostate cancer

    Science.gov (United States)

    Hwang, Taejin; Kim, Yong Nam; Kim, Soo Kon; Kang, Sei-Kwon; Cheong, Kwang-Ho; Park, Soah; Yoon, Jai-Woong; Han, Taejin; Kim, Haeyoung; Lee, Meyeon; Kim, Kyoung-Joo; Bae, Hoonsik; Suh, Tae-Suk

    2015-06-01

    The dose constraint during prostate intensity-modulated radiation therapy (IMRT) optimization should be patient-specific for better rectum sparing. The aims of this study are to suggest a novel method for automatically generating a patient-specific dose constraint by using an experience-based dose volume histogram (DVH) of the rectum and to evaluate the potential of such a dose constraint qualitatively. The normal tissue complication probabilities (NTCPs) of the rectum with respect to V%ratio in our study were divided into three groups, where V%ratio was defined as the percent ratio of the rectal volume overlapping the planning target volume (PTV) to the rectal volume: (1) the rectal NTCPs in the previous study (clinical data), (2) those statistically generated by using the standard normal distribution (calculated data), and (3) those generated by combining the calculated data and the clinical data (mixed data). In the calculated data, a random number whose mean value was on the fitted curve described in the clinical data and whose standard deviation was 1% was generated by using the 'randn' function in the MATLAB program and was used. For each group, we validated whether the probability density function (PDF) of the rectal NTCP could be automatically generated with the density estimation method by using a Gaussian kernel. The results revealed that the rectal NTCP probability increased in proportion to V%ratio, that the predictive rectal NTCP was patient-specific, and that the starting point of IMRT optimization for the given patient might be different. The PDF of the rectal NTCP was obtained automatically for each group except that the smoothness of the probability distribution increased with increasing number of data and with increasing window width. We showed that during the prostate IMRT optimization, the patient-specific dose constraints could be automatically generated and that our method could reduce the IMRT optimization time as well as maintain the
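
A hedged sketch of the "calculated data" and density-estimation steps described above: NTCP samples drawn around an assumed fitted mean with a 1% standard deviation, followed by a Gaussian-kernel estimate of the PDF. The fitted curve below is a stand-in, not the clinical fit:

```python
import numpy as np
from scipy.stats import gaussian_kde

def fitted_ntcp(v_ratio):
    """Stand-in for the fitted NTCP-vs-V%ratio curve (illustrative only)."""
    return 2.0 + 0.5 * v_ratio            # NTCP in %, rising with overlap

rng = np.random.default_rng(42)
v_ratio = 10.0                             # percent overlap of rectum with PTV
# "Calculated data": random NTCP values centred on the fitted curve, sigma = 1%.
ntcp_samples = fitted_ntcp(v_ratio) + rng.normal(0.0, 1.0, size=200)

# Gaussian-kernel density estimate of the NTCP probability density function.
pdf = gaussian_kde(ntcp_samples)
grid = np.linspace(ntcp_samples.min(), ntcp_samples.max(), 5)
print(np.round(pdf(grid), 3))
```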

  1. Combined carbon mesh and small graphite fiber brush anodes to enhance and stabilize power generation in microbial fuel cells treating domestic wastewater

    Science.gov (United States)

    Wu, Shijia; He, Weihua; Yang, Wulin; Ye, Yaoli; Huang, Xia; Logan, Bruce E.

    2017-07-01

    Microbial fuel cells (MFCs) need to have a compact architecture, but power generation using low strength domestic wastewater is unstable for closely-spaced electrode designs using thin anodes (flat mesh or small diameter graphite fiber brushes) due to oxygen crossover from the cathode. A composite anode configuration was developed to improve performance, by joining the mesh and brushes together, with the mesh used to block oxygen crossover to the brushes, and the brushes used to stabilize mesh potentials. In small, fed-batch MFCs (28 mL), the composite anode produced 20% higher power densities than MFCs using only brushes, and 150% power densities compared to carbon mesh anodes. In continuous flow tests at short hydraulic retention times (HRTs, 2 or 4 h) using larger MFCs (100 mL), composite anodes had stable performance, while brush anode MFCs exhibited power overshoot in polarization tests. Both configurations exhibited power overshoot at a longer HRT of 8 h due to lower effluent CODs. The use of composite anodes reduced biomass growth on the cathode (1.9 ± 0.2 mg) compared to only brushes (3.1 ± 0.3 mg), and increased coulombic efficiencies, demonstrating that they successfully reduced oxygen contamination of the anode and the bio-fouling of cathode.

  2. Design and installation of a strategically placed algae mesh barrier at OPG Pickering Nuclear Generating Station

    International Nuclear Information System (INIS)

    Marttila, D.; Patrick, P.; Gregoris, C.

    2009-01-01

    Ontario Power Generation's Pickering Nuclear has experienced a number of events in which attached algae have become entrained in the water intake costing approximately $30M over the 1995-2005 period as a result of deratings, Unit shutdowns and other operational issues. In 2005-2006 OPG and Kinectrics worked collaboratively on evaluating different potential solutions to reduce the impact of algae on the station. One of the solutions developed by Kinectrics included a strategically placed barrier net designed to regulate algae flow into the station intake. In 2006, Kinectrics designed and installed the system, the first of its kind at a Nuclear Power Plant in Canada. The system was operational by May 2007. OPG completed an effectiveness study in 2007 and concluded the barrier system had a beneficial effect on reducing algae impact on the station. (author)

  3. Analytical reconstruction scheme for the coarse-mesh solution generated by the spectral nodal method for neutral particle discrete ordinates transport model in slab geometry

    International Nuclear Information System (INIS)

    Barros, Ricardo C.; Filho, Hermes Alves; Platt, Gustavo M.; Oliveira, Francisco Bruno S.; Militao, Damiano S.

    2010-01-01

    Coarse-mesh numerical methods are very efficient in the sense that they generate accurate results in short computational time, as the number of floating point operations generally decreases, as a result of the reduced number of mesh points. On the other hand, they generate numerical solutions that do not give detailed information on the problem solution profile, as the grid points can be located considerably away from each other. In this paper we describe two steps for the analytical reconstruction of the coarse-mesh solution generated by the spectral nodal method for the neutral particle discrete ordinates (SN) transport model in slab geometry. The first step of the algorithm is based on the analytical reconstruction of the coarse-mesh solution within each discretization cell of the grid set up on the spatial domain. The second step is based on the angular reconstruction of the discrete ordinates solution between two contiguous ordinates of the angular quadrature set used in the SN model. Numerical results are given so we can illustrate the accuracy of the two reconstruction techniques, as described in this paper.

  4. High-speed particle tracking in nuclear emulsion by last-generation automatic microscopes

    International Nuclear Information System (INIS)

    Armenise, N.; De Serio, M.; Ieva, M.; Muciaccia, M.T.; Pastore, A.; Simone, S.; Damet, J.; Kreslo, I.; Savvinov, N.; Waelchli, T.; Consiglio, L.; Cozzi, M.; Di Ferdinando, D.; Esposito, L.S.; Giacomelli, G.; Giorgini, M.; Mandrioli, G.; Patrizii, L.; Sioli, M.; Sirri, G.; Arrabito, L.; Laktineh, I.; Royole-Degieux, P.; Buontempo, S.; D'Ambrosio, N.; De Lellis, G.; De Rosa, G.; Di Capua, F.; Coppola, D.; Formisano, F.; Marotta, A.; Migliozzi, P.; Pistillo, C.; Scotto Lavina, L.; Sorrentino, G.; Strolin, P.; Tioukov, V.; Juget, F.; Hauger, M.; Rosa, G.; Barbuto, E.; Bozza, C.; Grella, G.; Romano, G.; Sirignano, C.

    2005-01-01

    The technique of nuclear emulsions for high-energy physics experiments is being revived, thanks to the remarkable progress in measurement automation achieved in the past years. The present paper describes the features and performances of the European Scanning System, a last-generation automatic microscope working at a scanning speed of 20 cm2/h. The system has been developed in the framework of the OPERA experiment, designed to unambiguously detect νμ → ντ oscillations in nuclear emulsions.

  5. Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3

    Science.gov (United States)

    2015-12-01

    more protocols (especially at different layers of the OSI model), implementing an inference engine to extract inter- and intra-packet dependencies, and ... (ARL-TR-7543, DEC 2015, US Army Research Laboratory, Survivability/Lethality Analysis Directorate; Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3, by Jaime C Acosta and Felipe Jovel, Felipe Sotelo and Caesar ...)

  6. Generation of an artificial skin construct containing a non-degradable fiber mesh: a potential transcutaneous interface

    Energy Technology Data Exchange (ETDEWEB)

    Cahn, Frederick [Biomedical Strategies Inc., San Diego, CA (United States); Kyriakides, Themis R [Vascular Biology and Therapeutics, Yale University, New Haven, CT 06536-9812 (United States)], E-mail: themis.kyriakides@yale.edu

    2008-09-01

    Generation of a stable interface between soft tissues and biomaterials could improve the function of transcutaneous prostheses, primarily by minimizing chronic infections. We hypothesized that inclusion of non-biodegradable biomaterials in an artificial skin substrate would improve integration of the neodermis. In the present study, we compared the biocompatibility of an experimental substrate, consisting of collagen and glycosylaminoglycans, with commercially available artificial skin of similar composition. By utilizing a mouse excisional wound model, we found that the source of collagen (bovine tendon versus hide), extent of injury and wound contraction were critical determinants of inflammation and neodermis formation. Reducing the extent of injury to underlying muscle reduced inflammation and improved remodeling; the improved conditions allowed the detection of a pro-inflammatory effect of hide-derived collagen. To eliminate the complication of wound contraction, subsequent grafts were performed in guinea pigs and showed that inclusion of carbon fibers or non-degradable sutures resulted in increased foreign body response (FBR) and altered remodeling. On the other hand, inclusion of a polyester multi-stranded mesh induced a mild FBR and allowed normal neodermis formation. Taken together, our observations suggest that non-degradable biomaterials can be embedded in an artificial skin construct without compromising its ability to induce neodermis formation.

  7. Accuracy assessment of building point clouds automatically generated from iphone images

    Directory of Open Access Journals (Sweden)

    B. Sirmacek

    2014-06-01

    Full Text Available Low-cost sensor generated 3D models can be useful for quick 3D urban model updating, yet the quality of the models is questionable. In this article, we evaluate the reliability of an automatic point cloud generation method using multi-view iPhone images or an iPhone video file as an input. We register such automatically generated point cloud on a TLS point cloud of the same object to discuss accuracy, advantages and limitations of the iPhone generated point clouds. For the chosen example showcase, we have classified 1.23% of the iPhone point cloud points as outliers, and calculated the mean of the point to point distances to the TLS point cloud as 0.11 m. Since a TLS point cloud might also include measurement errors and noise, we computed local noise values for the point clouds from both sources. Mean (μ) and standard deviation (σ) of roughness histograms are calculated as (μ1 = 0.44 m., σ1 = 0.071 m.) and (μ2 = 0.025 m., σ2 = 0.037 m.) for the iPhone and TLS point clouds respectively. Our experimental results indicate possible usage of the proposed automatic 3D model generation framework for 3D urban map updating, fusion and detail enhancing, quick and real-time change detection purposes. However, further insights should be obtained first on the circumstances that are needed to guarantee a successful point cloud generation from smartphone images.

  8. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
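
A simplified reading of the procedure described in this abstract, assuming whitespace-tokenised documents and a given keyword set, with "adjacency" taken to mean occurring immediately next to a keyword; the threshold and the example corpus are illustrative only:

```python
from collections import Counter

def stop_word_list(documents, keywords, ratio_threshold=1.0, max_size=100):
    """Terms whose keyword-adjacency frequency is high relative to their frequency."""
    keywords = set(keywords)
    term_freq, adj_freq = Counter(), Counter()
    for doc in documents:
        tokens = doc.lower().split()
        for i, tok in enumerate(tokens):
            term_freq[tok] += 1
            neighbours = tokens[max(i - 1, 0):i] + tokens[i + 1:i + 2]
            if any(n in keywords for n in neighbours):
                adj_freq[tok] += 1
    # Keep terms whose adjacency/frequency ratio is at least the threshold,
    # then truncate the list to the requested size.
    stops = [t for t in term_freq
             if t not in keywords and adj_freq[t] / term_freq[t] >= ratio_threshold]
    stops.sort(key=lambda t: term_freq[t], reverse=True)
    return stops[:max_size]

docs = ["the mesh generator refines the mesh", "a mesh improves the simulation"]
print(stop_word_list(docs, keywords={"mesh"}, ratio_threshold=0.5))
```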

  9. Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support

    Science.gov (United States)

    Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar

    This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic block level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are application C processes and their mapping to processors in the platform. A processor data model, including pipelined datapath, memory hierarchy and branch delay model is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using a MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real-time and showed less than 10% timing error compared to board measurements.

  10. Automatic Generation System of Multiple-Choice Cloze Questions and its Evaluation

    Directory of Open Access Journals (Sweden)

    Takuya Goto

    2010-09-01

    Full Text Available Since English expressions vary according to genre, it is important for students to study questions that are generated from sentences of the target genre. Although various questions are prepared, they are still not enough to cover the various genres which students want to learn. On the other hand, when producing English questions, sufficient grammatical knowledge and vocabulary are needed, so it is difficult for non-experts to prepare English questions by themselves. In this paper, we propose an automatic generation system of multiple-choice cloze questions from English texts. Empirical knowledge is necessary to produce appropriate questions, so machine learning is introduced to acquire knowledge from existing questions. To generate the questions from texts automatically, the system (1) extracts appropriate sentences for questions from texts based on Preference Learning, (2) estimates a blank part based on Conditional Random Field, and (3) generates distracters based on statistical patterns of existing questions. Experimental results show our method is workable for selecting appropriate sentences and blank parts. Moreover, our method is suitable for generating usable distracters, especially for sentences that do not contain proper nouns.

  11. A Method of Generating Indoor Map Spatial Data Automatically from Architectural Plans

    Directory of Open Access Journals (Sweden)

    SUN Weixin

    2016-06-01

    Full Text Available Taking architectural plans as the data source, we propose a method which can automatically generate indoor map spatial data. Firstly, referring to the spatial data demands of indoor maps, we analyzed the basic characteristics of architectural plans and introduced the concepts of wall segment, adjoining node and adjoining wall segment, based on which the basic flow of automatic indoor map spatial data generation was established. Then, according to the adjoining relation between wall lines at their intersection with a column, we constructed a repair method for wall connectivity in relation to the column. Utilizing the method of gradual expansibility and graphic reasoning to judge the local feature type of wall symbols on both sides of a door or window, and updating the enclosing rectangle of the door or window, we developed a repair method for wall connectivity in relation to the door or window and a method to transform doors or windows into indoor map point features. Finally, on the basis of the geometric relation between adjoining wall segment median lines, a wall center-line extraction algorithm is presented. Taking one exhibition hall's architectural plan as an example, we performed an experiment; the results show that the proposed methods cope well with various complex situations and realize automatic indoor map spatial data extraction effectively.
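
As a toy illustration of the final step mentioned above, the sketch below takes the point-wise average of a wall's two parallel edge segments to obtain its centre line; the real algorithm's handling of intersections, columns and openings is not shown:

```python
import numpy as np

def wall_centerline(edge_a, edge_b):
    """Centre line of a wall given its two parallel edge segments.

    Each edge is ((x0, y0), (x1, y1)); the edges are assumed to run in the
    same direction, so the centre line is simply the point-wise average.
    """
    a = np.asarray(edge_a, dtype=float)
    b = np.asarray(edge_b, dtype=float)
    return tuple(map(tuple, (a + b) / 2.0))

# A 0.24 m thick wall drawn as two parallel lines in plan view.
print(wall_centerline(((0.0, 0.0), (5.0, 0.0)),
                      ((0.0, 0.24), (5.0, 0.24))))
```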

  12. SUPERIMPOSED MESH PLOTTING IN MCNP

    Energy Technology Data Exchange (ETDEWEB)

    J. HENDRICKS

    2001-02-01

    The capability to plot superimposed meshes has been added to MCNP™. MCNP4C featured a superimposed mesh weight window generator which enabled users to set up geometries without having to subdivide geometric cells for variance reduction. The variance reduction was performed with weight windows on a rectangular or cylindrical mesh superimposed over the physical geometry. Experience with the new capability was favorable but also indicated that a number of enhancements would be very beneficial, particularly a means of visualizing the mesh and its values. The mathematics for plotting the mesh and its values is described here along with a description of other upgrades.

  13. Development of ANJOYMC Program for Automatic Generation of Monte Carlo Cross Section Libraries

    International Nuclear Information System (INIS)

    Kim, Kang Seog; Lee, Chung Chan

    2007-03-01

    The NJOY code developed at Los Alamos National Laboratory is used to generate cross section libraries in ACE format for Monte Carlo codes such as MCNP and McCARD by processing evaluated nuclear data in ENDF/B format. It takes a long time to prepare all the NJOY input files for hundreds of nuclides at various temperatures, and there can be errors in the input files. In order to solve these problems, the ANJOYMC program has been developed. By using a simple user input deck, this program not only generates all the NJOY input files automatically, but also generates a batch file to perform all the NJOY calculations. The ANJOYMC program is written in Fortran90 and can be executed under the WINDOWS and LINUX operating systems on a personal computer. Cross section libraries in ACE format can be generated in a short time and without error by using a simple user input deck.
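
A hedged sketch of the batch-generation idea described above, assuming NJOY is driven by redirecting an input deck on standard input; the nuclide list, temperatures, file names and deck contents are placeholders, not ANJOYMC's actual format:

```python
from pathlib import Path

NUCLIDES = ["U235", "U238", "O16"]          # illustrative subset
TEMPERATURES = [293.6, 600.0, 900.0]        # Kelvin

def generate_batch(workdir="njoy_runs"):
    """Write one placeholder input deck per nuclide/temperature plus a batch script."""
    out = Path(workdir)
    out.mkdir(exist_ok=True)
    commands = []
    for nuc in NUCLIDES:
        for temp in TEMPERATURES:
            deck = out / f"{nuc}_{temp:.0f}K.inp"
            # Placeholder content only -- a real deck holds the NJOY module cards.
            deck.write_text(f"-- NJOY input for {nuc} at {temp} K --\n")
            commands.append(f"njoy < {deck.name}")
    (out / "run_all.sh").write_text("\n".join(commands) + "\n")

generate_batch()
```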

  14. Evaluating the Potential of Imaging Rover for Automatic Point Cloud Generation

    Science.gov (United States)

    Cera, V.; Campi, M.

    2017-02-01

    The paper presents a phase of on-going interdisciplinary research concerning the medieval site of Casertavecchia (Italy). The project aims to develop a multi-technique approach for semantic-enriched 3D modeling starting from the automatic acquisition of several types of data. In particular, the paper reports the results of the first stage, concerning the Cathedral square of the medieval village. The work is focused on evaluating the potential of an imaging rover for automatic point cloud generation. Each survey technique has its own advantages and disadvantages, so the ideal approach is an integrated methodology in order to maximize the performance of each instrument. The experimentation was conducted on the Cathedral square of the ancient site of Casertavecchia, in Campania, Italy.

  15. On the application of bezier surfaces for GA-Fuzzy controller design for use in automatic generation control

    CSIR Research Space (South Africa)

    Boesack, CD

    2012-03-01

    Full Text Available Automatic Generation Control (AGC) of large interconnected power systems is typically performed by a PI or PID type control law. Recently intelligent control techniques such as GA-Fuzzy controllers have been widely applied within the power...

  16. LHC-GCS a model-driven approach for automatic PLC and SCADA code generation

    CERN Document Server

    Thomas, Geraldine; Barillère, Renaud; Cabaret, Sebastien; Kulman, Nikolay; Pons, Xavier; Rochez, Jacques

    2005-01-01

    The LHC experiments’ Gas Control System (LHC GCS) project [1] aims to provide the four LHC experiments (ALICE, ATLAS, CMS and LHCb) with control for their 23 gas systems. To ease the production and maintenance of 23 control systems, a model-driven approach has been adopted to generate automatically the code for the Programmable Logic Controllers (PLCs) and for the Supervision Control And Data Acquisition (SCADA) systems. The first milestones of the project have been achieved. The LHC GCS framework [4] and the generation tools have been produced. A first control application has actually been generated and is in production, and a second is in preparation. This paper describes the principle and the architecture of the model-driven solution. It will in particular detail how the model-driven solution fits with the LHC GCS framework and with the UNICOS [5] data-driven tools.

  17. Development of automatic intercomparison system for generation of time scale ensembling several atomic clocks

    Directory of Open Access Journals (Sweden)

    Thorat P.P.

    2015-01-01

    Full Text Available The National Physical Laboratory India (NPLI) has five commercial cesium atomic clocks. Till recently one of these clocks had been used to maintain the coordinated universal time (UTC) of NPLI. To utilize all these clocks in an ensemble manner to generate a smoother time scale, it has been essential to inter-compare them very precisely. This has been achieved with an automatic measurement system with well-conceived software. Though a few laboratories have developed such automatic measurement systems themselves based on their respective requirements, these are not reported. So, keeping in mind the specific requirements of time scale generation, a new system has been developed by NPLI. The design has taken into account the associated infrastructure that exists and would be used. The performance of the new system has also been studied and found to be quite satisfactory for the purpose. The system is being utilized for the generation of the time scale of NPLI.

  18. Application of GA optimization for automatic generation control design in an interconnected power system

    Energy Technology Data Exchange (ETDEWEB)

    Golpira, H., E-mail: hemin.golpira@uok.ac.i [Department of Electrical and Computer Engineering, University of Kurdistan, Sanandaj, PO Box 416, Kurdistan (Iran, Islamic Republic of); Bevrani, H. [Department of Electrical and Computer Engineering, University of Kurdistan, Sanandaj, PO Box 416, Kurdistan (Iran, Islamic Republic of); Golpira, H. [Department of Industrial Engineering, Islamic Azad University, Sanandaj Branch, PO Box 618, Kurdistan (Iran, Islamic Republic of)

    2011-05-15

    Highlights: → A realistic model for automatic generation control (AGC) design is proposed. → The model considers GRC, speed governor dead band, filters and time delay. → The model provides an accurate model for the digital simulations. -- Abstract: This paper addresses a realistic model for automatic generation control (AGC) design in an interconnected power system. The proposed scheme considers generation rate constraint (GRC), dead band, and time delay imposed on the power system by governor-turbine, filters, thermodynamic process, and communication channels. Simplicity of structure and acceptable response of the well-known integral controller make it attractive for the power system AGC design problem. The Genetic algorithm (GA) is used to compute the decentralized control parameters to achieve an optimum operating point. A 3-control area power system is considered as a test system, and the closed-loop performance is examined in the presence of various constraints scenarios. It is shown that neglecting the above physical constraints simultaneously or in part leads to impractical and invalid results and may affect the system security, reliability and integrity. Taking into account the advantages of GA besides considering a more complete dynamic model provides a flexible and more realistic AGC system in comparison with existing conventional schemes.

  19. Application of GA optimization for automatic generation control design in an interconnected power system

    International Nuclear Information System (INIS)

    Golpira, H.; Bevrani, H.; Golpira, H.

    2011-01-01

    Highlights: → A realistic model for automatic generation control (AGC) design is proposed. → The model considers GRC, speed governor dead band, filters and time delay. → The model provides an accurate model for the digital simulations. -- Abstract: This paper addresses a realistic model for automatic generation control (AGC) design in an interconnected power system. The proposed scheme considers generation rate constraint (GRC), dead band, and time delay imposed on the power system by governor-turbine, filters, thermodynamic process, and communication channels. Simplicity of structure and acceptable response of the well-known integral controller make it attractive for the power system AGC design problem. The Genetic algorithm (GA) is used to compute the decentralized control parameters to achieve an optimum operating point. A 3-control area power system is considered as a test system, and the closed-loop performance is examined in the presence of various constraints scenarios. It is shown that neglecting the above physical constraints simultaneously or in part leads to impractical and invalid results and may affect the system security, reliability and integrity. Taking into account the advantages of GA besides considering a more complete dynamic model provides a flexible and more realistic AGC system in comparison with existing conventional schemes.

  20. Lightning Protection Performance Assessment of Transmission Line Based on ATP model Automatic Generation

    Directory of Open Access Journals (Sweden)

    Luo Hanwu

    2016-01-01

    Full Text Available This paper presents a novel method to solve for the initial lightning breakdown current by combining the ATP and MATLAB simulation software effectively, with the aim of evaluating the lightning protection performance of transmission lines. Firstly, the executable ATP simulation model is generated automatically from the required information, such as power source parameters, tower parameters, overhead line parameters, grounding resistance and lightning current parameters, through an interface program coded in MATLAB. Then, data are extracted from the LIS files obtained by executing the ATP simulation model, and the occurrence of transmission line breakdown can be determined from the relevant data in the LIS file. The lightning current amplitude is reduced when breakdown occurs, and increased otherwise. Thus the initial lightning breakdown current of a transmission line with given parameters can be determined accurately by continuously changing the lightning current amplitude, which is realized by a loop computing algorithm coded in MATLAB. The method proposed in this paper generates the ATP simulation program automatically, and facilitates the lightning protection performance assessment of transmission lines.
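
    The amplitude-adjustment loop in the abstract (reduce the lightning current when breakdown occurs, increase it otherwise) amounts to a bracketing search. The sketch below is in Python rather than the paper's MATLAB, and run_atp_and_check_breakdown is a hypothetical placeholder for writing the ATP deck, executing it and parsing the LIS output; a threshold surrogate keeps it runnable.

```python
def run_atp_and_check_breakdown(amplitude_kA, params):
    """Hypothetical placeholder for the ATP/MATLAB interface: generate the ATP
    deck for this amplitude, run it, parse the LIS file and return True if an
    insulation flashover (breakdown) is reported. A simple threshold surrogate
    stands in for the real simulation here."""
    return amplitude_kA >= params.get("critical_kA", 85.0)

def initial_breakdown_current(params, low_kA=1.0, high_kA=300.0, tol_kA=0.5):
    """Bisection over the lightning current amplitude: lower it when breakdown
    occurs, raise it otherwise, until the smallest breakdown-causing amplitude
    is bracketed within the tolerance."""
    while high_kA - low_kA > tol_kA:
        mid = 0.5 * (low_kA + high_kA)
        if run_atp_and_check_breakdown(mid, params):
            high_kA = mid          # breakdown: reduce the amplitude
        else:
            low_kA = mid           # no breakdown: increase the amplitude
    return high_kA

print(initial_breakdown_current({"critical_kA": 85.0}), "kA")
```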

  1. Automatic cloud-free image generation from high-resolution multitemporal imagery

    Science.gov (United States)

    Han, Youkyung; Bovolo, Francesca; Lee, Won Hee

    2017-04-01

    The aim of this paper is to document the automatic reconstruction of clouds and their cast shadows for the generation of spontaneous cloud-free images from high-resolution multitemporal images. To apply the proposed technique, a cloud-free reference image, which has the same position as a target image acquired at a different time, is required. First, the cloud region in the target image is detected based on integration of thick and peripheral cloud candidate regions. Next, the detected cloud region is restored using the pixel values of the target image by considering their location relative to the reference images. Finally, the pixel values of the restored image are separately normalized to the values of the reference image to generate a natural-looking cloud-free image. Multitemporal KOMPSAT-2 high-resolution images are used to construct study sites for evaluation of the proposed method in diverse cloud-cover cases. The experimental results show that the proposed method can automatically generate cloud-free images from high-resolution multitemporal images with reasonable qualitative and quantitative performance.

  2. On Optimal Bilinear Quadrilateral Meshes

    Energy Technology Data Exchange (ETDEWEB)

    D'Azevedo, E

    2000-03-17

    The novelty of this work is in presenting interesting error properties of two types of asymptotically "optimal" quadrilateral meshes for bilinear approximation. The first type of mesh has an error-equidistributing property, where the maximum interpolation error is asymptotically the same over all elements. The second type has a faster-than-expected "super-convergence" property for certain saddle-shaped data functions. The "superconvergent" mesh may be an order of magnitude more accurate than the error-equidistributing mesh. Both types of mesh are generated by a coordinate transformation of a regular mesh of squares. The coordinate transformation is derived by interpreting the Hessian matrix of a data function as a metric tensor. The insights in this work may have application in mesh design near corner or point singularities.
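
    The central idea, interpreting the Hessian of the data function as a metric tensor, can be illustrated in a few lines. The absolute-eigenvalue construction below is a common way of turning an indefinite Hessian into a positive-definite mesh metric; it is an assumption for illustration, not necessarily the exact transformation of the report.

```python
import numpy as np

def hessian_metric(f_xx, f_xy, f_yy):
    """Build a symmetric positive-definite metric tensor from the Hessian of a
    data function by taking absolute values of its eigenvalues (a standard
    anisotropic-metric construction, assumed here for illustration)."""
    H = np.array([[f_xx, f_xy], [f_xy, f_yy]], dtype=float)
    evals, evecs = np.linalg.eigh(H)
    evals = np.maximum(np.abs(evals), 1e-12)        # keep the metric definite
    return evecs @ np.diag(evals) @ evecs.T

# Saddle-shaped data function f(x, y) = x**2 - y**2/4 has Hessian diag(2, -0.5);
# the larger metric value in x asks for smaller elements along x than along y.
print(hessian_metric(2.0, 0.0, -0.5))
```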

  3. Ontorat: automatic generation of new ontology terms, annotations, and axioms based on ontology design patterns.

    Science.gov (United States)

    Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun

    2015-01-01

    It is time-consuming to build an ontology with many terms and axioms. Thus it is desirable to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to solve a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application was developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on a specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting format text file for reuse. The input data file is generated based on a template file created by a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms with both logical axioms and rich annotation axioms in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples is provided on the Ontorat website and can be reused to facilitate ontology development. With ever-increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating a large number of ontology terms by following

  4. Automatic Generation of Mashups for Personalized Commerce in Digital TV by Semantic Reasoning

    Science.gov (United States)

    Blanco-Fernández, Yolanda; López-Nores, Martín; Pazos-Arias, José J.; Martín-Vicente, Manuela I.

    The evolution of information technologies is consolidating recommender systems as essential tools in e-commerce. To date, these systems have focused on discovering the items that best match the preferences, interests and needs of individual users, to end up listing those items by decreasing relevance in some menus. In this paper, we propose extending the current scope of recommender systems to better support trading activities, by automatically generating interactive applications that provide the users with personalized commercial functionalities related to the selected items. We explore this idea in the context of Digital TV advertising, with a system that brings together semantic reasoning techniques and new architectural solutions for web services and mashups.

  5. Design and construction of a graphical interface for automatic generation of simulation code GEANT4

    International Nuclear Information System (INIS)

    Driss, Mozher; Bouzaine Ismail

    2007-01-01

    This work is set in the context of an engineering studies final project; it was accomplished at the centre of nuclear sciences and technologies in Sidi Thabet. The project is about conceiving and developing a system based on a graphical user interface which allows automatic code generation for simulation under the GEANT4 engine. This system aims to facilitate the use of GEANT4 by scientists who are not necessarily expert in this engine and to be usable in different areas: research, industry and education. The implementation of this project uses the Root library and several programming languages such as XML and XSL. (Author). 5 refs

  6. Spreadsheet Activities with Conditional Progression and Automatically Generated Feedback and Grades

    Directory of Open Access Journals (Sweden)

    Thomas C Juster

    2013-02-01

    Full Text Available Spreadsheet activities following the Spreadsheets Across the Curriculum (SSAC) model have been modified using VBA programming to automatically generate feedback, calculate grades, and ensure that students complete them in a linear fashion. Feedback is based not only on the value of cells, but also on the formulas used to compute the values. These changes greatly ease the burden of grading on instructors, and help students more quickly master tasks and concepts by providing immediate and directed feedback to their answers. Students performed significantly better on the new spreadsheet activities compared to traditional SSAC versions, with 87% achieving perfect scores of 100%.

  7. Notes on the Mesh Handler and Mesh Data Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Yong; Park, Chan Eok [Korea Power Engineering Company, Daejeon (Korea, Republic of)

    2009-10-15

    At the outset of the development of the thermal-hydraulic code (THC), efforts have been made to utilize the recent technology of computational fluid dynamics. Among many of them, the unstructured mesh approach was adopted to alleviate the restriction of the grid handling system. As a natural consequence, a mesh handler (MH) has been developed to manipulate the complex mesh data from the mesh generator. The mesh generator, Gambit, was chosen at the beginning of the development of the code, but a new mesh generator, Pointwise, was introduced to obtain more flexible mesh generation capability. An open source code, Paraview, was chosen as a post processor, which can handle unstructured as well as structured mesh data. The overall data processing system for THC is shown in Figure-1. There are various file formats for saving mesh data on permanent storage media; a couple of dozen file formats are found even in the above-mentioned programs. A competent mesh handler should have the capability to import or export mesh data in as many formats as possible. But, in reality, there are two aspects that make it difficult to achieve this competence. The first aspect to consider is the time and effort needed to program the interface code. The second aspect, which is even more difficult, is the fact that many mesh data file formats are proprietary information. In this paper, some experience from the development of the format conversion programs will be presented. The file formats involved are the Gambit neutral format, the Ansys-CFX grid file format, the VTK legacy file format, the Nastran format and CGNS.

  8. Notes on the Mesh Handler and Mesh Data Conversion

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Park, Chan Eok

    2009-01-01

    At the outset of the development of the thermal-hydraulic code (THC), efforts have been made to utilize the recent technology of computational fluid dynamics. Among many of them, the unstructured mesh approach was adopted to alleviate the restriction of the grid handling system. As a natural consequence, a mesh handler (MH) has been developed to manipulate the complex mesh data from the mesh generator. The mesh generator, Gambit, was chosen at the beginning of the development of the code, but a new mesh generator, Pointwise, was introduced to obtain more flexible mesh generation capability. An open source code, Paraview, was chosen as a post processor, which can handle unstructured as well as structured mesh data. The overall data processing system for THC is shown in Figure-1. There are various file formats for saving mesh data on permanent storage media; a couple of dozen file formats are found even in the above-mentioned programs. A competent mesh handler should have the capability to import or export mesh data in as many formats as possible. But, in reality, there are two aspects that make it difficult to achieve this competence. The first aspect to consider is the time and effort needed to program the interface code. The second aspect, which is even more difficult, is the fact that many mesh data file formats are proprietary information. In this paper, some experience from the development of the format conversion programs will be presented. The file formats involved are the Gambit neutral format, the Ansys-CFX grid file format, the VTK legacy file format, the Nastran format and CGNS.
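
    As a concrete illustration of the conversion work described in the two notes above, the sketch below writes a tetrahedral mesh in the ASCII VTK legacy format, one of the formats listed. It covers only the minimal connectivity layout; field data, other cell types and the proprietary formats are out of scope.

```python
def write_vtk_legacy(path, points, tets):
    """Minimal ASCII VTK legacy writer for an unstructured tetrahedral mesh:
    points is a sequence of (x, y, z) tuples, tets a sequence of 4-tuples of
    0-based point indices."""
    with open(path, "w") as f:
        f.write("# vtk DataFile Version 3.0\n")
        f.write("converted mesh\nASCII\nDATASET UNSTRUCTURED_GRID\n")
        f.write(f"POINTS {len(points)} float\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")
        f.write(f"CELLS {len(tets)} {5 * len(tets)}\n")   # 5 ints per tet record
        for tet in tets:
            f.write("4 " + " ".join(str(i) for i in tet) + "\n")
        f.write(f"CELL_TYPES {len(tets)}\n")
        for _ in tets:
            f.write("10\n")                               # 10 = VTK_TETRA

# Example: a single tetrahedron
write_vtk_legacy("tet.vtk",
                 [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)],
                 [(0, 1, 2, 3)])
```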

  9. Automatic Generation Control Study in Two Area Reheat Thermal Power System

    Science.gov (United States)

    Pritam, Anita; Sahu, Sibakanta; Rout, Sushil Dev; Ganthia, Sibani; Prasad Ganthia, Bibhu

    2017-08-01

    Due to industrial pollution, our living environment is being destroyed. An electric grid system has many vital items of equipment such as generators, motors, transformers and loads. There is always an imbalance between the sending-end and receiving-end systems, which makes the system unstable. These errors and faults should therefore be corrected as soon as possible, otherwise they create further faults and system errors and reduce the efficiency of the whole power system. The main problem arising from such faults is frequency deviation, which causes instability in the power system and may cause permanent damage to the system. The mechanism studied in this paper therefore makes the system stable and balanced by regulating the frequency at both the sending-end and receiving-end power systems using automatic generation control with various controllers, taking a two-area reheat thermal power system into account.

  10. Development and Testing of Automatically Generated ACS Flight Software for the MAP Spacecraft

    Science.gov (United States)

    ODonnell, James R., Jr.; McComas, David C.; Andrews, Stephen F.

    1998-01-01

    By integrating the attitude determination and control system (ACS) analysis and design, flight software development, and flight software testing processes, it is possible to improve the overall spacecraft development cycle, as well as allow for more thorough software testing. One of the ways to achieve this integration is to use code-generation tools to automatically generate components of the ACS flight software directly from a high-fidelity (HiFi) simulation. In the development of the Microwave Anisotropy Probe (MAP) spacecraft, currently underway at the NASA Goddard Space Flight Center, approximately 1/3 of the ACS flight software was automatically generated. In this paper, we will examine each phase of the ACS subsystem and flight software design life cycle: analysis, design, and testing. In the analysis phase, we scoped how much software we would automatically generate and created the initial interface. The design phase included parallel development of the HiFi simulation and the hand-coded flight software components. Everything came together in the test phase, in which the flight software was tested, using results from the HiFi simulation as one of the bases of comparison for testing. Because parts of the spacecraft HiFi simulation were converted into flight software, more care needed to be put into its development and configuration control to support both the HiFi simulation and flight software. The components of the HiFi simulation from which code was generated needed to be designed based on the fact that they would become flight software. This process involved such considerations as protecting against mathematical exceptions, using acceptable module and parameter naming conventions, and using an input/output interface compatible with the rest of the flight software. Maintaining good configuration control was an issue for the HiFi simulation and the flight software, and a way to track the two systems was devised. Finally, an integrated test approach was

  11. Automatic generation control with thyristor controlled series compensator including superconducting magnetic energy storage units

    Directory of Open Access Journals (Sweden)

    Saroj Padhan

    2014-09-01

    Full Text Available In the present work, an attempt has been made to understand the dynamic performance of automatic generation control (AGC) of a multi-area, multi-unit thermal–thermal power system with the consideration of a reheat turbine, generation rate constraint (GRC) and time delay. Initially, the gains of the fuzzy PID controller are optimized using the Differential Evolution (DE) algorithm. The superiority of DE is demonstrated by comparing the results with the Genetic Algorithm (GA). After that, the performance of a Thyristor Controlled Series Compensator (TCSC) has been investigated. Further, a TCSC is placed in the tie-line and Superconducting Magnetic Energy Storage (SMES) units are considered in both areas. Finally, sensitivity analysis is performed by varying the system parameters and operating load conditions from their nominal values. It is observed that the optimum gains of the proposed controller need not be reset even if the system is subjected to wide variations in loading condition and system parameters.

  12. A reusable automatically generated software system for the control of the Large Millimeter Telescope

    Science.gov (United States)

    Souccar, Kamal; Wallace, Gary; Malin, Daniella

    2002-12-01

    A telescope system is composed of a set of real-world objects that are mapped onto software objects whose properties are described in XML configuration files. These XML files are processed to automatically generate user interfaces, underlying communication mechanisms, and extendible source code. Developers need not write user interfaces or communication methods but can focus on the production of scientific results. Any modifications or additions of objects can be easily achieved by editing or generating corresponding XML files and compiling them into the system. This framework can be utilized to implement servo controllers, device drivers, observing algorithms and instrument controllers; and is applicable to any problem domain that requires a user-based interaction with the inputs and outputs of a particular resource or program. This includes telescope systems, instruments, data reduction methods, and database interfaces. The system is implemented using Java, C++, and CORBA.

  13. Automatic deodorizing system for waste water from radioisotope facilities using an ozone generator

    International Nuclear Information System (INIS)

    Kawamura, Hiroko; Hirata, Yasuki

    2002-01-01

    We applied an ozone generator to sterilize and deodorize the waste water from radioisotope facilities. A small tank connected to the generator is placed outside the previously founded drainage facility, so as not to oxidize the other apparatus. The waste water is drained 1 m³ at a time from the tank of the drainage facility, treated with ozone and discharged to the sewer. All steps proceed automatically once the draining work is started remotely from the office. The waste water was examined after ozone treatment for 0 (original), 0.5, 1.0, 1.5 and 2.0 h. Regarding the original waste water, the sum of coliform groups varied with every examination repeated, probably depending on the colibacilli used in experiments; hydrogen sulfide, biochemical oxygen demand and the offensive odor increased with increasing coliform groups. The ozone treatment remarkably decreased hydrogen sulfide and the offensive odor, and decreased coliform groups when the original water had rich coliforms. (author)

  14. Optimal gravitational search algorithm for automatic generation control of interconnected power systems

    Directory of Open Access Journals (Sweden)

    Rabindra Kumar Sahu

    2014-09-01

    Full Text Available An attempt is made for the effective application of the Gravitational Search Algorithm (GSA) to optimize PI/PIDF controller parameters in automatic generation control (AGC) of interconnected power systems. Initially, comparison of several conventional objective functions reveals that ITAE yields better system performance. Then, the parameters of the GSA technique are properly tuned and the GSA control parameters are proposed. The superiority of the proposed approach is demonstrated by comparing the results with some recently published techniques such as Differential Evolution (DE), Bacteria Foraging Optimization Algorithm (BFOA) and Genetic Algorithm (GA). Additionally, sensitivity analysis is carried out to demonstrate the robustness of the optimized controller parameters to wide variations in operating loading condition and in the time constants of the speed governor, turbine and tie-line power. Finally, the proposed approach is extended to a more realistic power system model by considering physical constraints such as a reheat turbine, Generation Rate Constraint (GRC) and Governor Dead Band nonlinearity.

  15. Program Code Generator for Cardiac Electrophysiology Simulation with Automatic PDE Boundary Condition Handling.

    Directory of Open Access Journals (Sweden)

    Florencio Rusty Punzalan

    Full Text Available Clinical and experimental studies involving human hearts can have certain limitations. Methods such as computer simulations can be an important alternative or supplemental tool. Physiological simulation at the tissue or organ level typically involves the handling of partial differential equations (PDEs). Boundary conditions and distributed parameters, such as those used in pharmacokinetics simulation, add to the complexity of the PDE solution. These factors can tailor PDE solutions and their corresponding program code to specific problems. Boundary condition and parameter changes in the customized code are usually error-prone and time-consuming. We propose a general approach for handling PDEs and boundary conditions in computational models using a replacement scheme for discretization. This study is an extension of a program generator that we introduced in a previous publication. The program generator can generate code for multi-cell simulations of cardiac electrophysiology. Improvements to the system allow it to handle simultaneous equations in the biological function model as well as implicit PDE numerical schemes. The replacement scheme involves substituting all partial differential terms with numerical solution equations. Once the model and boundary equations are discretized with the numerical solution scheme, instances of the equations are generated to undergo dependency analysis. The result of the dependency analysis is then used to generate the program code. The resulting program code is in the Java or C programming language. To validate the automatic handling of boundary conditions in the program code generator, we generated simulation code using the FHN, Luo-Rudy 1, and Hund-Rudy cell models and ran cell-to-cell coupling and action potential propagation simulations. One of the simulations is based on a published experiment and simulation results are compared with the experimental data. We conclude that the proposed program code
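
    The replacement idea, substituting every partial-derivative term by its numerical-solution counterpart with the boundary conditions folded into the stencil, can be sketched by hand for a 1D cable-type equation. The reaction term and parameter values below are assumptions for illustration; this is not the generator's output.

```python
import numpy as np

def step_cable(v, i_ion, d_coeff, dx, dt):
    """One explicit step of dv/dt = D * d2v/dx2 - i_ion(v). The second spatial
    derivative is replaced by a central difference, and the zero-flux (Neumann)
    boundary condition is handled by mirrored ghost nodes."""
    v_ext = np.concatenate(([v[1]], v, [v[-2]]))          # ghost nodes: dv/dx = 0
    d2v = (v_ext[2:] - 2.0 * v + v_ext[:-2]) / dx ** 2
    return v + dt * (d_coeff * d2v - i_ion(v))

# Assumed cubic (FitzHugh-Nagumo-like) ionic current, for illustration only
i_ion = lambda v: v * (v - 0.1) * (v - 1.0)

v = np.zeros(200)
v[:10] = 1.0                                              # stimulate one end
for _ in range(2000):
    v = step_cable(v, i_ion, d_coeff=1e-3, dx=0.01, dt=0.01)
print("activated nodes:", int((v > 0.5).sum()))
```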

  16. Automatic verification of SSD and generation of respiratory signal with lasers in radiotherapy: a preliminary study.

    Science.gov (United States)

    Prabhakar, Ramachandran

    2012-01-01

    Source to surface distance (SSD) plays a very important role in external beam radiotherapy treatment verification. In this study, a simple technique has been developed to verify the SSD automatically with lasers. The study also suggests a methodology for determining the respiratory signal with lasers. Two lasers, red and green, are mounted on the collimator head of a Clinac 2300 C/D linac along with a camera to determine the SSD. Software (SSDLas) was developed to estimate the SSD automatically from the images captured by a 12-megapixel camera. To determine the SSD to a patient surface, the external body contour of the central axis transverse computed tomography (CT) cut is imported into the software. Another important aspect in radiotherapy is the generation of the respiratory signal. The changes in the lasers' separation as the patient breathes are converted to produce a respiratory signal. Multiple frames of laser images were acquired from the camera mounted on the collimator head and each frame was analyzed with SSDLas to generate the respiratory signal. The SSD observed with the ODI on the machine and the SSD measured by the SSDLas software were found to agree within the tolerance limit. The methodology described for generating the respiratory signals will be useful for the treatment of mobile tumors such as lung, liver, breast, pancreas etc. The technique described for determining the SSD and generating respiratory signals using lasers is cost effective and simple to implement. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. H-Morph: An indirect approach to advancing front hex meshing

    Energy Technology Data Exchange (ETDEWEB)

    OWEN,STEVEN J.; SAIGAL,SUNIL

    2000-05-30

    H-Morph is a new automatic algorithm for the generation of a hexahedral-dominant finite element mesh for arbitrary volumes. The H-Morph method starts with an initial tetrahedral mesh and systematically transforms and combines tetrahedra into hexahedra. It uses an advancing front technique where the initial front consists of a set of prescribed quadrilateral surface facets. Fronts are individually processed by recovering each of the six quadrilateral faces of a hexahedron from the tetrahedral mesh. Recovery techniques similar to those used in boundary-constrained Delaunay mesh generation are used. Tetrahedra internal to the six hexahedral faces are then removed and a hexahedron is formed. At any time during the H-Morph procedure a valid mixed hexahedral-tetrahedral mesh is in existence within the volume. The procedure continues until no tetrahedra remain within the volume, or until the remaining tetrahedra cannot be transformed or combined into valid hexahedral elements. Any remaining tetrahedra are typically towards the interior of the volume, generally a less critical region for analysis. Transition from tetrahedra to hexahedra in the final mesh is accomplished through pyramid-shaped elements. Advantages of the proposed method include its ability to conform to an existing quadrilateral surface mesh, its ability to mesh without the need to decompose or recognize special classes of geometry, and its characteristic well-aligned layers of elements parallel to the boundary. Example test cases are presented on a variety of models.

  18. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
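
    The combination of constrained random data generation and automatic evaluation can be illustrated for the first exercise (linear calibration). The code below is a hedged sketch in Python rather than the Matlab used by Goodle GMS, and the constraint ranges and tolerance are assumptions.

```python
import numpy as np

def generate_calibration_data(student_id, n=8):
    """Personalised linear-calibration data set (concentration vs. signal),
    generated randomly under simple constraints (assumed ranges)."""
    rng = np.random.default_rng(student_id)
    slope = rng.uniform(50.0, 200.0)                 # sensitivity
    intercept = rng.uniform(0.0, 10.0)               # blank signal
    conc = np.linspace(0.5, 10.0, n)
    signal = slope * conc + intercept + rng.normal(0.0, 2.0, n)
    return conc, signal, slope

def grade_slope(conc, signal, answer_slope, rel_tol=0.05):
    """Automatic evaluation: refit the student's data and accept the answer if
    it lies within a relative tolerance of the reference least-squares slope."""
    ref_slope, _ = np.polyfit(conc, signal, 1)
    return abs(answer_slope - ref_slope) / abs(ref_slope) <= rel_tol

conc, signal, true_slope = generate_calibration_data(student_id=42)
print(grade_slope(conc, signal, answer_slope=true_slope))   # expected: True
```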

  19. Computational performance of Free Mesh Method applied to continuum mechanics problems

    Science.gov (United States)

    YAGAWA, Genki

    2011-01-01

    The free mesh method (FMM) is a kind of meshless method intended for particle-like finite element analysis of problems that are difficult to handle using global mesh generation, or a node-based finite element method that employs a local mesh generation technique and a node-by-node algorithm. The aim of the present paper is to review some unique numerical solutions in fluid and solid mechanics obtained by employing FMM as well as the Enriched Free Mesh Method (EFMM), a new version of FMM, including compressible flow and the sounding mechanism in air-reed instruments as applications to fluid mechanics, and automatic remeshing for slow crack growth, the dynamic behavior of solids, as well as large-scale eigenfrequency analysis of an engine block as applications to solid mechanics. PMID:21558753

  20. Automatic generation of 3D motifs for classification of protein binding sites

    Directory of Open Access Journals (Sweden)

    Herzyk Pawel

    2007-08-01

    Full Text Available Background: Since many of the new protein structures delivered by high-throughput processes do not have any known function, there is a need for structure-based prediction of protein function. Protein 3D structures can be clustered according to their fold or secondary structures to produce classes of some functional significance. A recent alternative has been to detect specific 3D motifs which are often associated to active sites. Unfortunately, there are very few known 3D motifs, which are usually the result of a manual process, compared to the number of sequential motifs already known. In this paper, we report a method to automatically generate 3D motifs of protein structure binding sites based on consensus atom positions and evaluate it on a set of adenine based ligands. Results: Our new approach was validated by generating automatically 3D patterns for the main adenine based ligands, i.e. AMP, ADP and ATP. Out of the 18 detected patterns, only one, the ADP4 pattern, is not associated with well defined structural patterns. Moreover, most of the patterns could be classified as binding site 3D motifs. Literature research revealed that the ADP4 pattern actually corresponds to structural features which show complex evolutionary links between ligases and transferases. Therefore, all of the generated patterns prove to be meaningful. Each pattern was used to query all PDB proteins which bind either purine based or guanine based ligands, in order to evaluate the classification and annotation properties of the pattern. Overall, our 3D patterns matched 31% of proteins with adenine based ligands and 95.5% of them were classified correctly. Conclusion: A new metric has been introduced allowing the classification of proteins according to the similarity of atomic environment of binding sites, and a methodology has been developed to automatically produce 3D patterns from that classification. A study of proteins binding adenine based ligands showed that

  1. Perfusion CT in acute stroke: effectiveness of automatically-generated colour maps.

    Science.gov (United States)

    Ukmar, Maja; Degrassi, Ferruccio; Pozzi Mucelli, Roberta Antea; Neri, Francesca; Mucelli, Fabio Pozzi; Cova, Maria Assunta

    2017-04-01

    To evaluate the accuracy of perfusion CT (pCT) in the definition of the infarcted core and the penumbra, comparing the data obtained from the evaluation of parametric maps [cerebral blood volume (CBV), cerebral blood flow (CBF) and mean transit time (MTT)] with software-generated colour maps. A retrospective analysis was performed to identify patients with suspected acute ischaemic stroke who had undergone unenhanced CT and pCT within 4.5 h from the onset of symptoms. A qualitative evaluation of the CBV, CBF and MTT maps was performed, followed by an analysis of the colour maps automatically generated by the software. 26 patients were identified, but a direct CT follow-up was performed only on 19 patients after 24-48 h. In the qualitative analysis, 14 patients showed perfusion abnormalities. Specifically, 29 perfusion deficit areas were detected, of which 15 areas suggested the penumbra and the remaining 14 areas suggested the infarct. As for the automatically software-generated maps, 12 patients showed perfusion abnormalities. 25 perfusion deficit areas were identified, 15 of which suggested the penumbra and the other 10 the infarct. McNemar's test showed no statistically significant difference between the two methods of evaluation in highlighting infarcted areas proved later at CT follow-up. We demonstrated that pCT provides good diagnostic accuracy in the identification of acute ischaemic lesions. The limits of identification of the lesions mainly lie at the pons level and in the basal ganglia area. Qualitative analysis has proven to be more efficient in the identification of perfusion lesions in comparison with software-generated maps. However, software-generated maps have proven to be very useful in the emergency setting. Advances in knowledge: The use of CT perfusion is requested in increasingly more patients in order to optimize treatment, thanks also to the technological evolution of CT, which now allows a whole

  2. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R(2)=0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
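
    The landmark-driven morphing step can be illustrated with a least-squares affine fit between corresponding landmark sets. This is a simplified stand-in for the method above (which also maps mesh nodes onto the bone boundary); the landmark coordinates are hypothetical.

```python
import numpy as np

def landmark_affine(source_pts, target_pts):
    """Least-squares affine transform mapping source landmarks onto target
    landmarks; a simplified stand-in for the landmark-based morphing step."""
    src = np.asarray(source_pts, float)
    tgt = np.asarray(target_pts, float)
    A = np.hstack([src, np.ones((src.shape[0], 1))])         # rows: [x y z 1]
    coeffs, *_ = np.linalg.lstsq(A, tgt, rcond=None)         # (4, 3) matrix
    return coeffs

def apply_affine(coeffs, nodes):
    """Morph all nodes of the source mesh with the fitted transform."""
    nodes = np.asarray(nodes, float)
    return np.hstack([nodes, np.ones((nodes.shape[0], 1))]) @ coeffs

# Hypothetical CT landmarks (e.g. iliac crests, pubic symphysis, sacral points)
src_lm = [[0, 0, 0], [100, 0, 0], [0, 80, 0], [0, 0, 60], [50, 40, 30]]
tgt_lm = [[2, 1, 0], [108, 2, 1], [1, 86, 2], [0, 1, 65], [55, 44, 33]]
T = landmark_affine(src_lm, tgt_lm)
print(apply_affine(T, src_lm).round(1))                      # close to tgt_lm
```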

  3. OConGraX - Automatically Generating Data-Flow Test Cases for Fault-Tolerant Systems

    Science.gov (United States)

    Nunes, Paulo R. F.; Hanazumi, Simone; de Melo, Ana C. V.

    As systems become more complex to develop and manage, software design faults increase, making fault-tolerant systems highly necessary. To ensure their quality, the normal and exceptional behaviors must be tested and/or verified. Software testing is still a difficult and costly software development task, and a reasonable amount of effort has been employed to develop techniques for testing programs' normal behaviors. For the exceptional behavior, however, there is a lack of techniques and tools to test it effectively. To help in testing and analyzing fault-tolerant systems, we present in this paper a tool that provides automatic generation of data-flow test cases for objects and exception-handling mechanisms of Java programs, as well as data/control-flow graphs for program analysis.

  4. Automatic modulation format recognition for the next generation optical communication networks using artificial neural networks

    Science.gov (United States)

    Guesmi, Latifa; Hraghi, Abir; Menif, Mourad

    2015-03-01

    A new technique for Automatic Modulation Format Recognition (AMFR) in next generation optical communication networks is presented. This technique uses an Artificial Neural Network (ANN) in conjunction with the features of Linear Optical Sampling (LOS) of the detected signal at high bit rates, using direct detection or coherent detection. The use of the LOS method for this purpose is mainly driven by the increase in bit rates, which enables the measurement of eye diagrams. The efficiency of this technique is demonstrated under different transmission impairments such as chromatic dispersion (CD) in the range of -500 to 500 ps/nm, differential group delay (DGD) in the range of 0-15 ps and optical signal-to-noise ratio (OSNR) in the range of 10-30 dB. The results of numerical simulation for various modulation formats demonstrate successful recognition at known bit rates with a high estimation accuracy, which exceeds 99.8%.

  5. HELAC-Onia: an automatic matrix element generator for heavy quarkonium physics

    CERN Document Server

    Shao, Hua-Sheng

    2013-01-01

    By virtue of the Dyson-Schwinger equations, we upgrade the published code HELAC to be capable of calculating heavy quarkonium helicity amplitudes in the framework of NRQCD factorization; we dub the new code HELAC-Onia. We rewrote the original HELAC to make the new program able to calculate helicity amplitudes of multi P-wave quarkonium state production at hadron colliders and electron-positron colliders by including new P-wave off-shell currents. Therefore, besides its high efficiency in the computation of multi-leg processes within the Standard Model, HELAC-Onia is also sufficiently numerically stable in dealing with P-wave quarkonia (e.g. $h_{c,b},\chi_{c,b}$) and P-wave color-octet intermediate states. To the best of our knowledge, it is the first general-purpose automatic quarkonium matrix element generator based on recursion relations on the market.

  6. AUTOMATIC GENERATION OF BUILDING MODELS WITH LEVELS OF DETAIL 1-3

    Directory of Open Access Journals (Sweden)

    W. Nguatem

    2016-06-01

    Full Text Available We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start by orienting unsorted image sets employing the approach of Mayer et al. (2012), we compute depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and we fuse these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.

  7. Automatic Motion Generation for Robotic Milling Optimizing Stiffness with Sample-Based Planning

    Directory of Open Access Journals (Sweden)

    Julian Ricardo Diaz Posada

    2017-01-01

    Full Text Available Optimal and intuitive robotic machining is still a challenge. One of the main reasons for this is the lack of robot stiffness, which is also dependent on the robot positioning in the Cartesian space. To make up for this deficiency, and with the aim of increasing robot machining accuracy, this contribution describes a solution approach for optimizing the stiffness over a desired milling path using the free degree of freedom of the machining process. The optimal motion is computed based on the semantic and mathematical interpretation of the manufacturing process, modeled through its components: product, process and resource; and by automatically configuring a sample-based motion problem and the transition-based rapidly-exploring random tree algorithm for computing an optimal motion. The approach is simulated in CAM software for a machining path, revealing its functionality and outlining future potential for optimal motion generation for robotic machining processes.

  8. Parameter optimization of differential evolution algorithm for automatic playlist generation problem

    Science.gov (United States)

    Alamag, Kaye Melina Natividad B.; Addawe, Joel M.

    2017-11-01

    With the digitalization of music, the number of music collections has increased greatly, and there is a need to create lists of music that filter a collection according to user preferences, giving rise to the Automatic Playlist Generation Problem (APGP). Previous attempts to solve this problem include the use of search and optimization algorithms. If a music database is very large, the algorithm used must be able to search the lists thoroughly, taking into account the quality of the playlist given a set of user constraints. In this paper we apply an evolutionary meta-heuristic optimization algorithm, Differential Evolution (DE), using different combinations of parameter values, and select the best-performing set when used to solve four standard test functions. The performance of the proposed algorithm is then compared with a normal Genetic Algorithm (GA) and a hybrid GA with Tabu Search. Numerical simulations are carried out to show better results from the Differential Evolution approach with the optimized parameter values.
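
    For reference, the DE/rand/1/bin scheme whose F and CR parameters the study tunes can be sketched as follows; the sphere function stands in for the standard test functions mentioned above.

```python
import numpy as np

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Plain DE/rand/1/bin with greedy selection."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = lo.size
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fitness = np.array([cost(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)          # mutation
            cross = rng.random(dim) < CR                       # binomial crossover
            cross[rng.integers(dim)] = True                    # keep >= 1 gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = cost(trial)
            if f_trial <= fitness[i]:                          # greedy selection
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], float(fitness[best])

# Standard test function (sphere), as in the parameter-selection experiments
best_x, best_f = differential_evolution(lambda x: float(np.sum(x ** 2)),
                                        bounds=[(-5, 5)] * 4)
print(best_x, best_f)
```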

  9. Finite element meshing approached as a global minimization process

    Energy Technology Data Exchange (ETDEWEB)

    WITKOWSKI,WALTER R.; JUNG,JOSEPH; DOHRMANN,CLARK R.; LEUNG,VITUS J.

    2000-03-01

    The ability to generate a suitable finite element mesh in an automatic fashion is becoming the key to being able to automate the entire engineering analysis process. However, placing an all-hexahedron mesh in a general three-dimensional body continues to be an elusive goal. The approach investigated in this research is fundamentally different from any other known to the authors. A physical analogy viewpoint is used to formulate the actual meshing problem, which constructs a global mathematical description of the problem. The analogy used was that of minimizing the electrical potential of a system of charged particles within a charged domain. The particles in the presented analogy represent duals to mesh elements (i.e., quads or hexes). Particle movement is governed by a mathematical functional which accounts for inter-particle repulsive, attractive and alignment forces. This functional is minimized to find the optimal location and orientation of each particle. After the particles are connected, a mesh can easily be resolved. The mathematical description for this problem is as easy to formulate in three dimensions as it is in two or one. The meshing algorithm was developed within CoMeT. It can solve the two-dimensional meshing problem for convex and concave geometries in a purely automated fashion. Investigation of the robustness of the technique has shown a success rate of approximately 99% for the two-dimensional geometries tested. Run times to mesh a 100-element complex geometry were typically in the 10 minute range. Efficiency of the technique is still an issue that needs to be addressed. Performance is an issue that is critical for most engineers generating meshes; it was not for this project. The primary focus of this work was to investigate and evaluate a meshing algorithm/philosophy, with efficiency issues being secondary. The algorithm was also extended to mesh three-dimensional geometries. Unfortunately, only simple geometries were tested.

  10. An automatic way of finding robust elimination trees for a multi-frontal sparse solver for radical 2D hierarchical meshes

    KAUST Repository

    AbouEisha, Hassan M.

    2014-01-01

    In this paper we present a dynamic programming algorithm for finding optimal elimination trees for the multi-frontal direct solver algorithm executed over two dimensional meshes with point singularities. The elimination tree found by the optimization algorithm results in a linear computational cost of sequential direct solver. Based on the optimal elimination tree found by the optimization algorithm we construct heuristic sequential multi-frontal direct solver algorithm resulting in a linear computational cost as well as heuristic parallel multi-frontal direct solver algorithm resulting in a logarithmic computational cost. The resulting parallel algorithm is implemented on NVIDIA CUDA GPU architecture based on our graph-grammar approach. © 2014 Springer-Verlag.

  11. Automatic Test Pattern Generator for Fuzzing Based on Finite State Machine

    Directory of Open Access Journals (Sweden)

    Ming-Hung Wang

    2017-01-01

    Full Text Available With the rapid development of the Internet, several emerging technologies are adopted to construct fancy, interactive, and user-friendly websites. Among these technologies, HTML5 is a popular one and is widely used in establishing modern sites. However, the security issues in the new web technologies are also raised and are worthy of investigation. For vulnerability investigation, many previous studies used fuzzing and focused on generation-based approaches to produce test cases for fuzzing; however, these methods require a significant amount of knowledge and mental efforts to develop test patterns for generating test cases. To decrease the entry barrier of conducting fuzzing, in this study, we propose a test pattern generation algorithm based on the concept of finite state machines. We apply graph analysis techniques to extract paths from finite state machines and use these paths to construct test patterns automatically. According to the proposal, fuzzing can be completed through inputting a regular expression corresponding to the test target. To evaluate the performance of our proposal, we conduct an experiment in identifying vulnerabilities of the input attributes in HTML5. According to the results, our approach is not only efficient but also effective for identifying weak validators in HTML5.
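
    The path-extraction step can be illustrated on a toy finite state machine: enumerate bounded paths from the start state to an accepting state and emit the concatenated input symbols as test patterns. The FSM below is a hypothetical stand-in; the paper derives its machines from regular expressions describing the targeted HTML5 input attributes.

```python
from collections import defaultdict

def fsm_paths(transitions, start, accept, max_len=6):
    """Enumerate input sequences (test patterns) along paths of a finite state
    machine, up to a length bound, via depth-first traversal."""
    graph = defaultdict(list)
    for src, symbol, dst in transitions:
        graph[src].append((symbol, dst))
    patterns, stack = [], [(start, [])]
    while stack:
        state, prefix = stack.pop()
        if state in accept and prefix:
            patterns.append("".join(prefix))
        if len(prefix) < max_len:
            for symbol, nxt in graph[state]:
                stack.append((nxt, prefix + [symbol]))
    return patterns

# Hypothetical FSM for a tiny attribute grammar: value = digit+ ('px' | '%')
transitions = [(0, "1", 1), (1, "1", 1), (1, "px", 2), (1, "%", 2)]
print(fsm_paths(transitions, start=0, accept={2}))
```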

  12. Deep Learning-Based Data Forgery Detection in Automatic Generation Control

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Fengli [Univ. of Arkansas, Fayetteville, AR (United States); Li, Qinghua [Univ. of Arkansas, Fayetteville, AR (United States)

    2017-10-09

    Automatic Generation Control (AGC) is a key control system in the power grid. It is used to calculate the Area Control Error (ACE) based on frequency and tie-line power flow between balancing areas, and then adjust power generation to maintain the power system frequency in an acceptable range. However, attackers might inject malicious frequency or tie-line power flow measurements to mislead AGC into performing false generation corrections, which will harm power grid operation. Such attacks are hard to detect since they do not violate physical power system models. In this work, we propose algorithms based on Neural Networks and the Fourier Transform to detect data forgery attacks in AGC. Different from the few previous works that rely on accurate load prediction to detect data forgery, our solution only uses the ACE data already available in existing AGC systems. In particular, our solution learns the normal patterns of the ACE time series and detects abnormal patterns caused by artificial attacks. Evaluations on a real ACE dataset show that our methods have high detection accuracy.
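
    For orientation, a common textbook form of the area control error and a very small residual-based anomaly flag are sketched below. The bias value, window length and threshold are assumptions; the paper itself uses neural-network and Fourier-transform based detectors rather than this simple rule.

```python
import numpy as np

def area_control_error(dp_tie_mw, df_hz, bias_mw_per_hz):
    """Common textbook form of the area control error used by AGC:
    ACE = dP_tie + B * df  (sign conventions for the bias B vary)."""
    return dp_tie_mw + bias_mw_per_hz * df_hz

def flag_anomalies(ace_series, window=32, k=4.0):
    """Tiny residual-based stand-in for the detection idea: learn the normal
    spread of ACE over a sliding window and flag samples that deviate strongly."""
    ace = np.asarray(ace_series, float)
    flags = np.zeros(ace.size, dtype=bool)
    for i in range(window, ace.size):
        ref = ace[i - window:i]
        flags[i] = abs(ace[i] - ref.mean()) > k * (ref.std() + 1e-9)
    return flags

rng = np.random.default_rng(0)
ace = area_control_error(rng.normal(0, 5, 500), rng.normal(0, 0.01, 500),
                         bias_mw_per_hz=-150.0)
ace[400] += 80.0                                   # injected (forged) spike
print("flagged samples:", np.where(flag_anomalies(ace))[0])
```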

  13. Heterogeneous meshing and biomechanical modeling of human spine.

    Science.gov (United States)

    Teo, J C M; Chui, C K; Wang, Z L; Ong, S H; Yan, C H; Wang, S C; Wong, H K; Teoh, S H

    2007-03-01

    We aim to develop a patient-specific biomechanical model of human spine for the purpose of surgical training and planning. In this paper, we describe the development of a finite-element model of the spine from the VHD Male Data. The finite-element spine model comprises volumetric elements suitable for deformation and other finite-element analysis using ABAQUS. The mesh generation solution accepts segmented radiological slices as input, and outputs three-dimensional (3D) volumetric finite element meshes that are ABAQUS compliant. The proposed mesh generation method first uses a grid plane to divide the contours of the anatomical boundaries and its inclusions into discrete meshes. A grid frame is then built to connect the grid planes between any two adjacent planes using a novel scheme. The meshes produced consist of brick elements in the interior of the contours and with tetrahedral and wedge elements at the boundaries. The nodal points are classified according to their materials and hence, elements can be assigned different properties. The resultant spine model comprises a detailed model of the 7 cervical vertebrae, 12 thoracic vertebrae, 5 lumbar vertebrae, and S1. Each of the vertebrae and intervertebral disc has between 1200 and 6000 elements, and approximately 1200 elements, respectively. The accuracy of the resultant VHD finite element spine model was good based on visual comparison of volume-rendered images of the original CT data, and has been used in a computational analysis involving needle insertion and static deformation. We also compared the mesh generated using our method against two automatically generated models; one consists of purely tetrahedral elements and the other hexahedral elements.

  14. Higher-order meshing of implicit geometries, Part II: Approximations on manifolds

    Science.gov (United States)

    Fries, T. P.; Schöllhammer, D.

    2017-11-01

    A new concept for the higher-order accurate approximation of partial differential equations on manifolds is proposed where a surface mesh composed by higher-order elements is automatically generated based on level-set data. Thereby, it enables a completely automatic workflow from the geometric description to the numerical analysis without any user-intervention. A master level-set function defines the shape of the manifold through its zero-isosurface which is then restricted to a finite domain by additional level-set functions. It is ensured that the surface elements are sufficiently continuous and shape regular which is achieved by manipulating the background mesh. The numerical results show that optimal convergence rates are obtained with a moderate increase in the condition number compared to handcrafted surface meshes.

  15. Surface meshing with curvature convergence

    KAUST Repository

    Li, Huibin

    2014-06-01

    Surface meshing plays a fundamental role in graphics and visualization. Many geometric processing tasks involve solving geometric PDEs on meshes. The numerical stability, convergence rates and approximation errors are largely determined by the mesh qualities. In practice, Delaunay refinement algorithms offer satisfactory solutions to high quality mesh generations. The theoretical proofs for volume based and surface based Delaunay refinement algorithms have been established, but those for conformal parameterization based ones remain wide open. This work focuses on the curvature measure convergence for the conformal parameterization based Delaunay refinement algorithms. Given a metric surface, the proposed approach triangulates its conformal uniformization domain by the planar Delaunay refinement algorithms, and produces a high quality mesh. We give explicit estimates for the Hausdorff distance, the normal deviation, and the differences in curvature measures between the surface and the mesh. In contrast to the conventional results based on volumetric Delaunay refinement, our stronger estimates are independent of the mesh structure and directly guarantee the convergence of curvature measures. Meanwhile, our result on Gaussian curvature measure is intrinsic to the Riemannian metric and independent of the embedding. In practice, our meshing algorithm is much easier to implement and much more efficient. The experimental results verified our theoretical results and demonstrated the efficiency of the meshing algorithm. © 2014 IEEE.

  16. AUTOMATIC GENERATION OF INDOOR NAVIGABLE SPACE USING A POINT CLOUD AND ITS SCANNER TRAJECTORY

    Directory of Open Access Journals (Sweden)

    B. R. Staats

    2017-09-01

    Full Text Available Automatic generation of indoor navigable models is mostly based on 2D floor plans. However, in many cases the floor plans are out of date. Buildings are not always built according to their blueprints, interiors might change after a few years because of modified walls and doors, and furniture may be repositioned to the user's preferences. Therefore, new approaches for the quick recording of indoor environments should be investigated. This paper concentrates on laser scanning with a Mobile Laser Scanner (MLS) device. The MLS device stores a point cloud and its trajectory. If the MLS device is operated by a human, the trajectory contains information which can be used to distinguish different surfaces. In this paper a method is presented for the identification of walkable surfaces based on the analysis of the point cloud and the trajectory of the MLS scanner. This method consists of several steps. First, the point cloud is voxelized. Second, the trajectory is analysed and projected to acquire seed voxels. Third, these seed voxels are grown into floor regions by a region-growing process. By identifying dynamic objects, doors and furniture, these floor regions can be modified so that each region represents a specific navigable space inside a building as a free navigable voxel space. By combining the point cloud and its corresponding trajectory, the walkable space can be identified for any type of building, even if the interior is scanned during business hours.
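
    The voxelization and region-growing steps can be sketched as follows; the seed voxel here is a hypothetical projection of the scanner trajectory onto the floor, and the door/furniture handling described in the paper is omitted.

```python
import numpy as np
from collections import deque

def voxelize(points, voxel_size):
    """Map 3D points to integer voxel indices (sketch of the first step)."""
    return set(map(tuple, np.floor(np.asarray(points) / voxel_size).astype(int)))

def grow_floor_region(occupied, seeds):
    """Grow walkable-floor regions from trajectory-derived seed voxels through
    6-connected occupied neighbours."""
    region, queue = set(), deque(s for s in seeds if s in occupied)
    while queue:
        v = queue.popleft()
        if v in region:
            continue
        region.add(v)
        x, y, z = v
        for n in ((x+1, y, z), (x-1, y, z), (x, y+1, z),
                  (x, y-1, z), (x, y, z+1), (x, y, z-1)):
            if n in occupied and n not in region:
                queue.append(n)
    return region

# Hypothetical floor patch plus one seed projected down from the trajectory
floor = [(x * 0.1, y * 0.1, 0.0) for x in range(50) for y in range(50)]
occupied = voxelize(floor, voxel_size=0.2)
print(len(grow_floor_region(occupied, seeds=[(10, 10, 0)])))
```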

  17. Performance Evaluation of Antlion Optimizer Based Regulator in Automatic Generation Control of Interconnected Power System

    Directory of Open Access Journals (Sweden)

    Esha Gupta

    2016-01-01

    Full Text Available This paper presents an application of the recently introduced Antlion Optimizer (ALO) to find the parameters of the primary governor loop of thermal generators for successful Automatic Generation Control (AGC) of a two-area interconnected power system. Two standard objective functions, Integral Square Error (ISE) and Integral Time Absolute Error (ITAE), have been employed to carry out this parameter estimation process. The problem is transformed into an optimization problem to obtain the integral gains, speed regulation and frequency sensitivity coefficient for both areas. The regulator performance obtained from ALO is compared with that of Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and Gravitational Search Algorithm (GSA) based regulators. Different types of perturbations and load changes are incorporated to establish the efficacy of the obtained design. It is observed that ALO outperforms all three optimization methods for this real problem. The optimization performance of ALO is compared with the other algorithms on the basis of the standard deviations in the values of the parameters and objective functions.
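
    The two standard objective functions named above have simple definitions: ISE = ∫ e(t)² dt and ITAE = ∫ t·|e(t)| dt. A small sketch evaluating both on a toy frequency-deviation trace (the trace itself is an assumption for illustration):

```python
import numpy as np

def _trapezoid(y, t):
    """Trapezoidal-rule quadrature over the sampled signal y(t)."""
    y, t = np.asarray(y, float), np.asarray(t, float)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def ise(t, error):
    """Integral of squared error, ISE = integral of e(t)^2 dt."""
    return _trapezoid(np.asarray(error) ** 2, t)

def itae(t, error):
    """Integral of time-weighted absolute error, ITAE = integral of t*|e(t)| dt."""
    return _trapezoid(np.asarray(t) * np.abs(error), t)

# Toy frequency-deviation trace: a decaying oscillation after a load step
t = np.linspace(0.0, 20.0, 2001)
delta_f = 0.02 * np.exp(-0.3 * t) * np.cos(2.0 * t)
print("ISE :", ise(t, delta_f))
print("ITAE:", itae(t, delta_f))
```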

  18. Automatic Generation of Optimized and Synthesizable Hardware Implementation from High-Level Dataflow Programs

    Directory of Open Access Journals (Sweden)

    Khaled Jerbi

    2012-01-01

    Full Text Available In this paper, we introduce the Reconfigurable Video Coding (RVC) standard, based on the idea that video processing algorithms can be defined as a library of components that can be updated and standardized separately. The MPEG RVC framework aims at providing a unified high-level specification of current MPEG coding technologies using a dataflow language called the Cal Actor Language (CAL). CAL is associated with a set of tools to design dataflow applications and to generate hardware and software implementations. Before this work, the existing CAL hardware compilers did not support high-level features of CAL. After presenting the main notions of the RVC standard, this paper introduces an automatic transformation process that analyses the non-compliant features and makes the required changes in the intermediate representation of the compiler while keeping the same behavior. Finally, the implementation results of the transformation on video and still image decoders are summarized. We show that the obtained results can largely satisfy the real-time constraints for an embedded design on FPGA, as we obtain a throughput of 73 FPS for the MPEG-4 decoder and 34 FPS for the coding and decoding process of the LAR coder using video of CIF image size. This work resolves the main limitation of hardware generation from CAL designs.

  19. Solution to automatic generation control problem using firefly algorithm optimized I(λ)D(µ) controller.

    Science.gov (United States)

    Debbarma, Sanjoy; Saikia, Lalit Chandra; Sinha, Nidul

    2014-03-01

    The present work focuses on automatic generation control (AGC) of a three unequal-area thermal system considering reheat turbines and appropriate generation rate constraints (GRC). A fractional order (FO) controller named the I(λ)D(µ) controller, based on the CRONE approximation, is proposed for the first time as an appropriate technique to solve the multi-area AGC problem in power systems. A recently developed metaheuristic algorithm known as the firefly algorithm (FA) is used for the simultaneous optimization of the gains and other parameters such as the order of the integrator (λ) and differentiator (μ) of the I(λ)D(µ) controller and the governor speed regulation parameters (R). The dynamic responses corresponding to the optimized I(λ)D(µ) controller gains, λ, μ, and R are compared with those of classical integer order (IO) controllers such as I, PI and PID controllers. Simulation results show that the proposed I(λ)D(µ) controller provides improved dynamic responses and outperforms the IO based classical controllers. Further, sensitivity analysis confirms the robustness of the optimized I(λ)D(µ) controller to wide changes in system loading conditions and in the size and position of the step load perturbation (SLP). The proposed controller is also found to perform well compared to IO based controllers when the SLP takes place simultaneously in any two areas or in all areas. Robustness of the proposed I(λ)D(µ) controller is also tested against system parameter variations. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
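
    The paper realizes the fractional operators through a CRONE approximation; purely as an illustration of what an I(λ)D(µ) term computes, the sketch below uses the discrete Grünwald-Letnikov approximation instead, which is another common numerical realization. The step size, signal names and the use of a negative order for the integral part are my assumptions.

```python
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative of the
    sampled signal f with step h.  The weights w_k = (-1)^k * C(alpha, k) are
    built with the standard recursion w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    n = len(f)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    d = np.empty(n)
    for i in range(n):
        # sum_k w_k * f[i - k], divided by h^alpha
        d[i] = np.dot(w[: i + 1], f[i::-1]) / h ** alpha
    return d

# A fractional I^lambda D^mu control term can be assembled by applying the
# operator with alpha = mu to the error signal (differentiation) and with
# alpha = -lambda for the fractional integral part.
```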

  20. Automatic generation and verification of railway interlocking control tables using FSM and NuSMV

    Directory of Open Access Journals (Sweden)

    Mohammad B. YAZDI

    2009-01-01

    Full Text Available Due to their important role in providing safe conditions for train movements, railway interlocking systems are considered safety-critical systems. The reliability, safety and integrity of these systems rely on the reliability and integrity of all stages in their lifecycle, including design, verification, manufacture, test, operation and maintenance. In this paper, the automatic generation and verification of interlocking control tables, one of the most important stages in the interlocking design process, is the focus of the safety-critical research group in the School of Railway Engineering (SRE). Three subsystems are introduced: a graphical signalling layout planner, a control table generator and a control table verifier. Using the NuSMV model checker, the control table verifier analyses the contents of the control table against the safe train movement conditions and checks for any conflicting settings in the table. This includes settings for conflicting routes, signals and points, and also settings for route isolation and for single and multiple overlap situations. The last two settings, route isolation and multiple overlap situations, are new outcomes of this work compared with recently published work on the subject.

  1. Integration of Variable Speed Pumped Hydro Storage in Automatic Generation Control Systems

    Science.gov (United States)

    Fulgêncio, N.; Moreira, C.; Silva, B.

    2017-04-01

    Pumped storage power (PSP) plants are expected to be an important player in modern electrical power systems when dealing with increasing shares of new renewable energies (NRE) such as solar or wind power. The massive penetration of NRE and consequent replacement of conventional synchronous units will significantly affect the controllability of the system. In order to evaluate the capability of variable speed PSP plants participation in the frequency restoration reserve (FRR) provision, taking into account the expected performance in terms of improved ramp response capability, a comparison with conventional hydro units is presented. In order to address this issue, a three area test network was considered, as well as the corresponding automatic generation control (AGC) systems, being responsible for re-dispatching the generation units to re-establish power interchange between areas as well as the system nominal frequency. The main issue under analysis in this paper is related to the benefits of the fast response of variable speed PSP with respect to its capability of providing fast power balancing in a control area.

  2. Wolf pack hunting strategy for automatic generation control of an islanding smart distribution network

    International Nuclear Information System (INIS)

    Xi, Lei; Zhang, Zeyu; Yang, Bo; Huang, Linni; Yu, Tao

    2016-01-01

    Highlights: • A mixed homogeneous and heterogeneous multi-agent based wolf pack hunting (WPH) method is proposed. • WPH can effectively handle the ever-increasing penetration of renewable energy in the smart grid. • AGC power dispatch, coordinated control, and electric power autonomy of an ISDN are achieved. - Abstract: As conventional centralized automatic generation control (AGC) is inadequate to handle the ever-increasing penetration of renewable energy and the plug-and-play requirement of the smart grid, this paper proposes a mixed homogeneous and heterogeneous multi-agent based wolf pack hunting (WPH) strategy to achieve fast AGC power dispatch, optimal coordinated control, and electric power autonomy of an islanding smart distribution network (ISDN). A virtual consensus variable is employed to deal with the topology variation resulting from the excess of power limits and to achieve plug-and-play of AGC units. An integrated objective of frequency deviation and short-term economic dispatch is then developed, such that all units can maintain optimal operation in the presence of load disturbances. Four case studies are undertaken on an ISDN with various distributed generations and microgrids. Simulation results demonstrate that WPH has greater robustness and faster dynamic optimization than conventional approaches, which can increase the utilization rate of renewable energy and effectively resolve the coordination and electric power autonomy of the ISDN.

  3. Chemical name extraction based on automatic training data generation and rich feature set.

    Science.gov (United States)

    Yan, Su; Spangler, W Scott; Chen, Ying

    2013-01-01

    The automation of extracting chemical names from text has significant value to biomedical and life science research. A major barrier in this task is the difficulty of obtaining a sizable, good-quality data set to train a reliable entity extraction model. Another difficulty is the selection of informative features of chemical names, since comprehensive domain knowledge of chemistry nomenclature is required. Leveraging random text generation techniques, we explore the idea of automatically creating training sets for the task of chemical name extraction. Assuming the availability of an incomplete list of chemical names, called a dictionary, we are able to generate well-controlled, random, yet realistic chemical-like training documents. We statistically analyze the construction of chemical names based on the incomplete dictionary, and propose a series of new features, without relying on any domain knowledge. Compared to state-of-the-art models learned from manually labeled data and domain knowledge, our solution shows better or comparable results in annotating real-world data with less human effort. Moreover, we report an interesting observation about the language of chemical names: both the structural and semantic components of chemical names follow a Zipfian distribution, which resembles many natural languages.
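
    A hedged sketch of the training-data idea described above: embed names drawn from the incomplete dictionary into filler text so that token-level labels are known by construction. The template, filler vocabulary, label scheme and mixing ratio are all illustrative assumptions, not the authors' generator.

```python
import random

def generate_training_doc(chem_dict, filler_vocab, n_tokens=200, chem_ratio=0.1, rng=None):
    """Create one synthetic training document as (tokens, BIO labels).
    Chemical names come from the (incomplete) dictionary, everything else
    from a plain-text filler vocabulary; proportions are illustrative."""
    rng = rng or random.Random()
    tokens, labels = [], []
    while len(tokens) < n_tokens:
        if rng.random() < chem_ratio:
            name_tokens = rng.choice(chem_dict).split()
            tokens.extend(name_tokens)
            labels.extend(["B-CHEM"] + ["I-CHEM"] * (len(name_tokens) - 1))
        else:
            tokens.append(rng.choice(filler_vocab))
            labels.append("O")
    return tokens, labels

# Toy usage: three documents built from a tiny dictionary and filler vocabulary.
docs = [generate_training_doc(["acetic acid", "sodium chloride"],
                              ["the", "sample", "was", "mixed", "with"],
                              rng=random.Random(i))
        for i in range(3)]
```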

  4. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences produced by a finite automaton. Although they are not random, they may appear random. They are complicated in the sense of not being ultimately periodic, and they may look complicated in the sense that it may not be easy to name the rule by which the sequence is generated; nevertheless, such a rule always exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
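
    A classic concrete example is the Thue-Morse sequence, whose n-th term is the parity of the number of 1 bits in the binary expansion of n; the sketch below produces it both directly and with the underlying two-state automaton reading base-2 digits.

```python
def thue_morse(n):
    """n-th Thue-Morse term: parity of the number of 1 bits of n."""
    return bin(n).count("1") % 2

def thue_morse_automaton(n):
    """The same sequence produced by a two-state automaton reading the
    base-2 digits of n: digit 0 keeps the state, digit 1 flips it."""
    state = 0
    transition = {0: {0: 0, 1: 1},
                  1: {0: 1, 1: 0}}
    for digit in map(int, bin(n)[2:]):
        state = transition[state][digit]
    return state  # the output map is the identity here

print([thue_morse(i) for i in range(16)])
# [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```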

  5. Automatic treatment planning facilitates fast generation of high-quality treatment plans for esophageal cancer.

    Science.gov (United States)

    Hansen, Christian Rønn; Nielsen, Morten; Bertelsen, Anders Smedegaard; Hazell, Irene; Holtved, Eva; Zukauskaite, Ruta; Bjerregaard, Jon Kroll; Brink, Carsten; Bernchou, Uffe

    2017-11-01

    The quality of radiotherapy planning has improved substantially in the last decade with the introduction of intensity modulated radiotherapy. The purpose of this study was to analyze the plan quality and efficacy of automatically (AU) generated VMAT plans for inoperable esophageal cancer patients. Thirty-two consecutive inoperable patients with esophageal cancer originally treated with manually (MA) generated volumetric modulated arc therapy (VMAT) plans were retrospectively replanned using an auto-planning engine. All plans were optimized with one full 6 MV VMAT arc giving 60 Gy to the primary target and 50 Gy to the elective target. The planning techniques were blinded before clinical evaluation by three specialized oncologists. To supplement the clinical evaluation, the optimization time for the AU plan was recorded along with DVH parameters for all plans. Upon clinical evaluation, the AU plan was preferred for 31/32 patients, and for one patient there was no difference between the plans. In terms of DVH parameters, similar target coverage was obtained with the two planning methods. The mean dose to the spinal cord increased by 1.8 Gy using AU (p = .002), whereas the mean lung dose decreased by 1.9 Gy. The AU plans were more modulated, as seen by the 12% increase in mean MUs (p = .001). The median optimization time for AU plans was 117 min. The AU plans were in general preferred and showed a lower mean dose to the lungs. The automation of the planning process generated esophageal cancer treatment plans quickly and with high quality.

  6. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  7. ScholarLens: extracting competences from research publications for the automatic generation of semantic user profiles

    Directory of Open Access Journals (Sweden)

    Bahar Sateli

    2017-07-01

    Full Text Available Motivation Scientists increasingly rely on intelligent information systems to help them in their daily tasks, in particular for managing research objects, like publications or datasets. The relatively young research field of Semantic Publishing has been addressing the question of how scientific applications can be improved through semantically rich representations of research objects, in order to facilitate their discovery and re-use. To complement the efforts in this area, we propose an automatic workflow to construct semantic user profiles of scholars, so that scholarly applications, like digital libraries or data repositories, can better understand their users' interests, tasks, and competences by incorporating these user profiles in their design. To make the user profiles sharable across applications, we propose to build them based on standard semantic web technologies, in particular the Resource Description Framework (RDF) for representing user profiles and Linked Open Data (LOD) sources for representing competence topics. To avoid the cold start problem, we suggest to automatically populate these profiles by analyzing the publications (co-)authored by users, which we hypothesize reflect their research competences. Results We developed a novel approach, ScholarLens, which can automatically generate semantic user profiles for authors of scholarly literature. For modeling the competences of scholarly users and groups, we surveyed a number of existing linked open data vocabularies. In accordance with the LOD best practices, we propose an RDF Schema (RDFS) based model for competence records that reuses existing vocabularies where appropriate. To automate the creation of semantic user profiles, we developed a complete, automated workflow that can generate semantic user profiles by analyzing full-text research articles through various natural language processing (NLP) techniques. In our method, we start by processing a set of research articles for a

  8. User Manual for the PROTEUS Mesh Tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Micheal A. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, Emily R [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-19

    PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real-world problems, experience has shown that using a commercial mesh generator to create rather simple input geometries is overly complex and slow. As a consequence, significant effort has been put into creating multiple codes that assist in mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFMESH input allows for simple assembly mesh generation, while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured grid geometry options. NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given an input mesh format acceptable for PROTEUS, we have constructed several tools which allow further mesh and geometry construction (e.g., mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and the output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and the MT_RadialLattice.x codes. The former allows the conversion between most mesh types handled by PROTEUS while the latter allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input specific for a given mesh tool (such as .axial

  9. Automatic bearing fault diagnosis of permanent magnet synchronous generators in wind turbines subjected to noise interference

    Science.gov (United States)

    Guo, Jun; Lu, Siliang; Zhai, Chao; He, Qingbo

    2018-02-01

    An automatic bearing fault diagnosis method is proposed for permanent magnet synchronous generators (PMSGs), which are widely installed in wind turbines and subject to low rotating speeds, speed fluctuations, and electrical device noise interference. The mechanical rotating angle curve is first extracted from the phase current of a PMSG by sequentially applying a series of algorithms. The synchronously sampled vibration signal of the faulty bearing is then resampled in the angular domain according to the obtained rotating phase information. Considering that the resampled vibration signal is still overwhelmed by heavy background noise, an adaptive stochastic resonance filter is applied to the resampled signal to enhance the fault indicator and facilitate bearing fault identification. Two types of faulty bearings with different fault sizes in a PMSG test rig are subjected to experiments to test the effectiveness of the proposed method. The proposed method is fully automated and thus shows potential for convenient, highly efficient and in situ bearing fault diagnosis for wind turbines operating in harsh environments.
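
    A minimal sketch of the angular-domain resampling step described above, assuming the unwrapped mechanical angle has already been extracted from the phase current: the vibration signal is interpolated at equally spaced shaft angles (order tracking). The function name, the samples-per-revolution value and the use of linear interpolation are illustrative choices.

```python
import numpy as np

def angular_resample(vibration, t, rotation_angle, samples_per_rev=256):
    """Resample a time-domain vibration signal at equal shaft-angle increments.

    vibration      : vibration samples, synchronous with the time vector t
    rotation_angle : unwrapped mechanical angle (rad) at the same instants,
                     e.g. extracted from the PMSG phase current
    """
    total_revs = (rotation_angle[-1] - rotation_angle[0]) / (2 * np.pi)
    n_out = int(total_revs * samples_per_rev)
    # Equally spaced target angles over the measured range
    target_angles = np.linspace(rotation_angle[0], rotation_angle[-1], n_out)
    # Times at which those angles are reached (angle is monotonically increasing)
    target_times = np.interp(target_angles, rotation_angle, t)
    # Vibration values at those times -> signal sampled uniformly in angle
    return np.interp(target_times, t, vibration)
```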

  10. Multi-stage fuzzy PID power system automatic generation controller in deregulated environments

    International Nuclear Information System (INIS)

    Shayeghi, H.; Shayanfar, H.A.; Jalili, A.

    2006-01-01

    In this paper, a multi-stage fuzzy proportional integral derivative (PID) type controller is proposed to solve the automatic generation control (AGC) problem in a power system that operates under deregulation based on the bilateral policy scheme. In each control area, the effects of the possible contracts are treated as a set of new input signals in a modified traditional dynamical model. The multi-stage controller uses a fuzzy switch to blend a proportional derivative (PD) fuzzy logic controller with an integral fuzzy logic input. The proposed controller operates on fuzzy values, passing the consequence of a prior stage on to the next stage as fact. The salient advantage of this strategy is its high insensitivity to large load changes and disturbances in the presence of plant parameter variations and system nonlinearities. This newly developed strategy leads to a flexible controller with a simple structure that is easy to implement, and therefore it can be useful for real-world power systems. The proposed method is tested on a three-area power system with different contracted scenarios under various operating conditions. The results of the proposed controller are compared with those of the classical fuzzy PID type controller and the classical PID controller through several performance indices to illustrate its robust performance.

  11. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  12. Automatic generation of bioinformatics tools for predicting protein–ligand binding sites

    Science.gov (United States)

    Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-01-01

    Motivation: Predictive tools that model protein–ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein–ligand binding predictive tools would be useful. Results: We developed a system for automatically generating protein–ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5–1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. Availability and implementation: The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545824

  13. Parametric Quadrilateral Meshes for the Design and Optimization of Superconducting Magnets

    CERN Document Server

    Aleksa, Martin; Völlinger, Christine

    2002-01-01

    The program package ROXIE has been developed at CERN for the design and optimization of accelerator magnets. The necessity of extremely uniform fields in the superconducting accelerator magnets for LHC requires very accurate methods of field computation. For this purpose the coupled boundary-element / finite-element technique (BEM-FEM) is used. Quadrilateral higher order finite-element meshes are generated for the discretization of the iron domain (yoke) and stainless steel collars. A new mesh generator using geometrically optimized domain decomposition which was developed at the University of Stuttgart, Germany has been implemented into the ROXIE program providing fully automatic and user friendly mesh generation. The structure of the magnet cross-section can be modeled using parametric objects such as holes of different forms, elliptic, parabolic or hyperbolic arcs, notches, slots, .... For sensitivity analysis and parametric studies, point based morphing algorithms are applied to guarantee smooth adaptatio...

  14. 3D-2D Deformable Image Registration Using Feature-Based Nonuniform Meshes.

    Science.gov (United States)

    Zhong, Zichun; Guo, Xiaohu; Cai, Yiqi; Yang, Yin; Wang, Jing; Jia, Xun; Mao, Weihua

    2016-01-01

    By using prior information of planning CT images and feature-based nonuniform meshes, this paper demonstrates that volumetric images can be efficiently registered with a very small portion of 2D projection images of a Cone-Beam Computed Tomography (CBCT) scan. After a density field is computed based on the extracted feature edges from planning CT images, nonuniform tetrahedral meshes will be automatically generated to better characterize the image features according to the density field; that is, finer meshes are generated for features. The displacement vector fields (DVFs) are specified at the mesh vertices to drive the deformation of original CT images. Digitally reconstructed radiographs (DRRs) of the deformed anatomy are generated and compared with corresponding 2D projections. DVFs are optimized to minimize the objective function including differences between DRRs and projections and the regularity. To further accelerate the above 3D-2D registration, a procedure to obtain good initial deformations by deforming the volume surface to match 2D body boundary on projections has been developed. This complete method is evaluated quantitatively by using several digital phantoms and data from head and neck cancer patients. The feature-based nonuniform meshing method leads to better results than either uniform orthogonal grid or uniform tetrahedral meshes.

  15. 3D-2D Deformable Image Registration Using Feature-Based Nonuniform Meshes

    Directory of Open Access Journals (Sweden)

    Zichun Zhong

    2016-01-01

    Full Text Available By using prior information of planning CT images and feature-based nonuniform meshes, this paper demonstrates that volumetric images can be efficiently registered with a very small portion of 2D projection images of a Cone-Beam Computed Tomography (CBCT) scan. After a density field is computed based on the extracted feature edges from planning CT images, nonuniform tetrahedral meshes will be automatically generated to better characterize the image features according to the density field; that is, finer meshes are generated for features. The displacement vector fields (DVFs) are specified at the mesh vertices to drive the deformation of original CT images. Digitally reconstructed radiographs (DRRs) of the deformed anatomy are generated and compared with corresponding 2D projections. DVFs are optimized to minimize the objective function including differences between DRRs and projections and the regularity. To further accelerate the above 3D-2D registration, a procedure to obtain good initial deformations by deforming the volume surface to match 2D body boundary on projections has been developed. This complete method is evaluated quantitatively by using several digital phantoms and data from head and neck cancer patients. The feature-based nonuniform meshing method leads to better results than either uniform orthogonal grid or uniform tetrahedral meshes.

  16. AutoWIG: automatic generation of python bindings for C++ libraries

    Directory of Open Access Journals (Sweden)

    Pierre Fernique

    2018-04-01

    Full Text Available Most Python and R scientific packages incorporate compiled scientific libraries to speed up the code and reuse legacy libraries. While several semi-automatic solutions exist to wrap these compiled libraries, the process of wrapping a large library is cumbersome and time consuming. In this paper, we introduce AutoWIG, a Python package that automatically wraps compiled libraries into high-level languages using LLVM/Clang technologies and the Mako templating engine. Our approach is automatic, extensible, and applies to complex C++ libraries composed of thousands of classes or incorporating modern meta-programming constructs.

  17. Experience in connecting the power generating units of thermal power plants to automatic secondary frequency regulation within the united power system of Russia

    International Nuclear Information System (INIS)

    Zhukov, A. V.; Komarov, A. N.; Safronov, A. N.; Barsukov, I. V.

    2009-01-01

    The principles of central control of the power generating units of thermal power plants by automatic secondary frequency and active power overcurrent regulation systems, and the algorithms for interactions between automatic power control systems for the power production units in thermal power plants and centralized systems for automatic frequency and power regulation, are discussed. The order of switching the power generating units of thermal power plants over to control by a centralized system for automatic frequency and power regulation and by the Central Coordinating System for automatic frequency and power regulation is presented. The results of full-scale system tests of the control of power generating units of the Kirishskaya, Stavropol, and Perm GRES (State Regional Electric Power Plants) by the Central Coordinating System for automatic frequency and power regulation at the United Power System of Russia on September 23-25, 2008, are reported.

  18. Automatic Generation and Validation of an ITER Neutronics Model from CAD Data

    International Nuclear Information System (INIS)

    Tsige-Tamirat, H.; Fischer, U.; Serikov, A.; Stickel, S.

    2006-01-01

    Quality assurance rules request the consistency of the geometry model used in neutronics Monte Carlo calculations and the underlying engineering CAD model. This can be ensured by automatically converting the CAD geometry data into the representation used by Monte Carlo codes such as MCNP. Suitable conversion algorithms have been previously developed at FZK and were implemented into an interface program. This paper describes the application of the interface program to a CAD model of a 40 degree ITER torus sector for the generation of a neutronics geometry model for MCNP. A CAD model provided by ITER consisting of all significant components was analyzed, pre-processed, and converted into MCNP geometry representation. The analysis and pre-processing steps include checking the adequacy of the CAD model for neutronics calculations in terms of geometric representation and complexity, and making corresponding corrections. This step is followed by the conversion of the CAD model into MCNP geometry, including error detection and correction as well as the completion of the model by voids. The conversion process does not introduce any approximations, so the resulting MCNP geometry is fully equivalent to the original CAD geometry. However, there is a moderate increase in complexity measured in terms of the number of cells and surfaces. The validity of the converted geometry model was shown by comparing the results of stochastic MCNP volume calculations with the volumes provided by the CAD kernel of the interface programme. Furthermore, successful MCNP test calculations have been performed to verify the converted ITER model in application calculations. (author)

  19. Embedded Platform for Automatic Testing and Optimizing of FPGA Based Cryptographic True Random Number Generators

    Directory of Open Access Journals (Sweden)

    M. Varchola

    2009-12-01

    Full Text Available This paper deals with an evaluation platform for cryptographic True Random Number Generators (TRNGs) based on the hardware implementation of statistical tests for FPGAs. It was developed in order to provide an automatic tool that helps to speed up the TRNG design process and can provide new insights into TRNG behavior, as will be shown on a particular example in the paper. It enables testing of the statistical properties of various TRNG designs under various working conditions on the fly. Moreover, the tests are suitable to be embedded into cryptographic hardware products in order to recognize TRNG output of weak quality and thus increase robustness and reliability. The tests are fully compatible with the FIPS 140 standard and are implemented in the VHDL language as an IP core for vendor-independent FPGAs. A recent flash-based Actel Fusion FPGA was chosen for preliminary experiments. The Actel version of the tests possesses an interface to Actel's CoreMP7 softcore processor, which is fully compatible with the industry-standard ARM7TDMI. Moreover, an identical test suite was implemented on the Xilinx Virtex 2 and 5 in order to compare the performance of the proposed solution with that of an already published one based on the same FPGAs. Clock frequencies 25% and 65% greater, respectively, were achieved while consuming almost equal resources of the Xilinx FPGAs. On top of that, the proposed FIPS 140 architecture is capable of processing one random bit per clock cycle, which results in a throughput of 311.5 Mbps for the Virtex 5 FPGA.
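
    As a software illustration of the embedded statistical checks, the sketch below implements the monobit test on a 20,000-bit block; the acceptance interval shown is the FIPS 140-2 one as I recall it, so verify against the standard before relying on it. A real deployment would run the full suite in hardware on raw TRNG bits, as the paper does.

```python
import random

def fips_monobit_test(bits):
    """Monobit test on a 20,000-bit block from the FIPS 140 test suite.
    The acceptance interval below is the FIPS 140-2 bound as recalled here
    (9725 < number of ones < 10275); check the standard before relying on it."""
    assert len(bits) == 20000
    ones = sum(bits)
    return 9725 < ones < 10275

# Toy usage with a software PRNG standing in for raw TRNG output:
block = [random.getrandbits(1) for _ in range(20000)]
print("monobit pass:", fips_monobit_test(block))
```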

  20. Field Robotics in Sports: Automatic Generation of guidance Lines for Automatic Grass Cutting, Striping and Pitch Marking of Football Playing Fields

    Directory of Open Access Journals (Sweden)

    Ole Green

    2011-03-01

    Full Text Available Progress is constantly being made and new applications are constantly coming out in the area of field robotics. In this paper, a promising application of field robotics to football playing fields is introduced. An algorithmic approach is presented for generating the way points required to guide a GPS-based field robot through a football playing field to automatically carry out periodical tasks such as cutting the grass, pitch and line marking illustrations and lawn striping. The manual operation of these tasks requires very skilful personnel able to work for long hours with very high concentration for the football yard to be compatible with the standards of the Federation Internationale de Football Association (FIFA). On the other hand, a GPS-based guided vehicle or robot with three implements (grass mower, lawn striping roller and track marking illustrator) is capable of working 24 h a day, in most weather and in harsh soil conditions, without loss of quality. The proposed approach for the automatic operation of football playing fields requires no or very limited human intervention and therefore saves numerous working hours and frees a worker to focus on other tasks. An economic feasibility study showed that the proposed method is economically superior to the current manual practices.

  1. Field Robotics in Sports: Automatic Generation of Guidance Lines for Automatic Grass Cutting, Striping and Pitch Marking of Football Playing Fields

    Directory of Open Access Journals (Sweden)

    Ibrahim A. Hameed

    2011-03-01

    Full Text Available Progress is constantly being made and new applications are constantly coming out in the area of field robotics. In this paper, a promising application of field robotics to football playing fields is introduced. An algorithmic approach is presented for generating the way points required to guide a GPS-based field robot through a football playing field to automatically carry out periodical tasks such as cutting the grass, pitch and line marking illustrations and lawn striping. The manual operation of these tasks requires very skilful personnel able to work for long hours with very high concentration for the football yard to be compatible with the standards of the Federation Internationale de Football Association (FIFA). On the other hand, a GPS-based guided vehicle or robot with three implements (grass mower, lawn striping roller and track marking illustrator) is capable of working 24 h a day, in most weather and in harsh soil conditions, without loss of quality. The proposed approach for the automatic operation of football playing fields requires no or very limited human intervention and therefore saves numerous working hours and frees a worker to focus on other tasks. An economic feasibility study showed that the proposed method is economically superior to the current manual practices.

  2. DEVELOPMENT OF THE MODEL OF AN AUTOMATIC GENERATION OF TOTAL AMOUNTS OF COMMISSIONS IN INTERNATIONAL INTERBANK PAYMENTS

    Directory of Open Access Journals (Sweden)

    Dmitry N. Bolotov

    2013-01-01

    Full Text Available The article deals with the main form of international payment, the bank transfer, and with the correspondent fees charged by banks for funds transiting their correspondent accounts. In order to optimize the cost of international money transfers, there is a need to develop a model and toolkit for the automatic generation of the total amount of commissions in international interbank settlements. Accordingly, an approach to the construction of such a model was developed based on graph theory.
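
    As one possible illustration of the graph-theoretic idea, the sketch below models correspondent banks as nodes and per-hop commissions as edge weights and finds the transfer route with the smallest total commission via Dijkstra's algorithm; the article's actual model may differ, and the fee figures are made up.

```python
import heapq

def cheapest_route(commissions, source, target):
    """Dijkstra search where edge weights are per-hop correspondent-bank fees;
    returns (total commission, route) for the cheapest transfer path."""
    heap = [(0.0, source, [source])]
    visited = set()
    while heap:
        cost, bank, route = heapq.heappop(heap)
        if bank in visited:
            continue
        visited.add(bank)
        if bank == target:
            return cost, route
        for nxt, fee in commissions.get(bank, {}).items():
            if nxt not in visited:
                heapq.heappush(heap, (cost + fee, nxt, route + [nxt]))
    return float("inf"), []

# Toy correspondent network with made-up fees:
fees = {"A": {"B": 15.0, "C": 25.0}, "B": {"D": 20.0}, "C": {"D": 5.0}}
print(cheapest_route(fees, "A", "D"))  # (30.0, ['A', 'C', 'D'])
```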

  3. Computerized Generation and Simulation of Meshing and Contact of New Type of Novikov-Wildhaber Helical Gears

    National Research Council Canada - National Science Library

    Litvin, Faydor

    2000-01-01

    .... Such a function results in the reduction of noise and vibrations. Methods for the generation of the proposed gear tooth surfaces by grinding and hobbing are considered, and a tooth contact analysis (TCA...

  4. Development of user interface to support automatic program generation of nuclear power plant analysis by module-based simulation system

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Mizutani, Naoki; Nakaya, Ken-ichiro; Wakabayashi, Jiro

    1988-01-01

    Module-based Simulation System (MSS) has been developed to realize a new software work environment enabling versatile dynamic simulation of a complex nuclear power system flexibly. MSS makes full use of modern software technology to replace a large fraction of human software work in complex, large-scale program development by computer automation. The fundamental methods utilized in MSS and a developmental study on the human interface system SESS-1, which helps users generate integrated simulation programs automatically, are summarized as follows: (1) To enhance usability and 'communality' of program resources, the basic mathematical models of common usage in nuclear power plant analysis are programmed as 'modules' and stored in a module library. The information on the usage of individual modules is stored in a module database with easy registration, update and retrieval by the interactive management system. (2) Target simulation programs and the input/output files are automatically generated with simple block-wise languages by a precompiler system for module integration purposes. (3) Working time for program development and analysis in an example study of an LMFBR plant thermal-hydraulic transient analysis was demonstrated to be remarkably shortened with the introduction of the interface system SESS-1, developed as an automatic program generation environment. (author)

  5. An automatic method to generate domain-specific investigator networks using PubMed abstracts

    Directory of Open Access Journals (Sweden)

    Gwinn Marta

    2007-06-01

    Full Text Available Abstract Background Collaboration among investigators has become critical to scientific research. This includes ad hoc collaboration established through personal contacts as well as formal consortia established by funding agencies. Continued growth in online resources for scientific research and communication has promoted the development of highly networked research communities. Extending these networks globally requires identifying additional investigators in a given domain, profiling their research interests, and collecting current contact information. We present a novel strategy for building investigator networks dynamically and producing detailed investigator profiles using data available in PubMed abstracts. Results We developed a novel strategy to obtain detailed investigator information by automatically parsing the affiliation string in PubMed records. We illustrate the results using a published literature database in human genome epidemiology (HuGE Pub Lit) as a test case. Our parsing strategy extracted country information from 92.1% of the affiliation strings in a random sample of PubMed records and from 97.0% of HuGE records, with accuracies of 94.0% and 91.0%, respectively. Institution information was parsed from 91.3% of the general PubMed records (accuracy 86.8%) and from 94.2% of HuGE PubMed records (accuracy 87.0%). We demonstrated the application of our approach to dynamic creation of investigator networks by creating a prototype information system containing a large database of PubMed abstracts relevant to human genome epidemiology (HuGE Pub Lit), indexed using PubMed medical subject headings converted to Unified Medical Language System concepts. Our method was able to identify 70-90% of the investigators/collaborators in three different human genetics fields; it also successfully identified 9 of 10 genetics investigators within the PREBIC network, an existing preterm birth research network. Conclusion We successfully created a
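
    A hedged sketch of the kind of rule-based affiliation parsing described: strip a trailing e-mail address, split the string on commas, and take the first segment as the institution and the last as the country. The heuristics are illustrative only, not the parser used in the paper.

```python
import re

def parse_affiliation(affiliation):
    """Heuristic parse of a PubMed affiliation string into institution,
    country and contact e-mail (rules are illustrative, not the paper's)."""
    email_match = re.search(r"[\w.+-]+@[\w.-]+\.\w+", affiliation)
    email = email_match.group(0) if email_match else None
    cleaned = re.sub(r"[\w.+-]+@[\w.-]+\.\w+", "", affiliation)
    parts = [p.strip(" .;") for p in cleaned.split(",") if p.strip(" .;")]
    institution = parts[0] if parts else None
    country = parts[-1] if len(parts) > 1 else None
    return {"institution": institution, "country": country, "email": email}

print(parse_affiliation(
    "Department of Epidemiology, Emory University, Atlanta, GA, USA. jdoe@emory.edu"))
# {'institution': 'Department of Epidemiology', 'country': 'USA', 'email': 'jdoe@emory.edu'}
```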

  6. A semi-automatic multiple view texture mapping for the surface model extracted by laser scanning

    Science.gov (United States)

    Zhang, Zhichao; Huang, Xianfeng; Zhang, Fan; Chang, Yongmin; Li, Deren

    2008-12-01

    Laser scanning is an effective way to acquire geometry data of cultural heritage objects with complex architecture. After generating the 3D model of the object, it is difficult to perform exact texture mapping for the real object; we therefore strive to create seamless texture maps for a virtual heritage model of arbitrary topology. Texture detail is acquired directly from the real object under lighting conditions kept as uniform as possible. After preprocessing, the images are registered on the 3D mesh in a semi-automatic way. We then divide the mesh into mesh patches that overlap each other according to the valid texture area of each image. An optimal correspondence between mesh patches and sections of the acquired images is built. Then, a smoothing approach based on texture blending is proposed to erase the seams between different images that map onto adjacent mesh patches. The result obtained with a Buddha of the Dunhuang Mogao Grottoes is presented and discussed.

  7. Automatic Generation and Evaluation of Sentence Graphs out of Word Graphs

    NARCIS (Netherlands)

    Reidsma, Dennis; Priss, U.; Corbett, D.; Angelova, G.

    This paper reports on the development of a system that automatically constructs representations of the meaning of sentences using rules of grammar and a dictionary of word meanings. The meanings of words and sentences are expressed using an extension of knowledge graphs, a semantic network

  8. Reduction to spark coordinates of data generated by automatic measurement of spark chamber film

    International Nuclear Information System (INIS)

    Maybury, R.; Hart, J.C.

    1976-09-01

    The initial stage in the data reduction for film from two spark chamber experiments is described. The film was automatically measured at the Rutherford Laboratory. The data from these measurements were reduced to a series of spark coordinates for each gap of the spark chambers. Quality control checks are discussed. (author)

  9. Regulatory analysis for the resolution of Generic Issue 125.II.7 ''Reevaluate Provision to Automatically Isolate Feedwater from Steam Generator During a Line Break''

    International Nuclear Information System (INIS)

    Basdekas, D.L.

    1988-09-01

    Generic Issue 125.II.7 addresses the concern related to the automatic isolation of auxiliary feedwater (AFW) to a steam generator with a broken steam or feedwater line. This regulatory analysis provides a quantitative assessment of the costs and benefits associated with the removal of the AFW automatic isolation and concludes that no new regulatory requirements are warranted. 21 refs., 7 tabs

  10. Design and implementation of a control automatic module for the volume extraction of a 99mTc generator

    International Nuclear Information System (INIS)

    Lopez, Yon; Urquizo, Rafael; Gago, Javier; Mendoza, Pablo

    2014-01-01

    A module for the automatic extraction of volumes from 0.05 mL to 1 mL has been developed using a 3D printer, with acrylonitrile butadiene styrene (ABS) as the base material. The design allows automation of the 99mTc eluate input and ejection processes in the 99Mo/99mTc generator prototype; use in other systems is feasible due to its high degree of versatility, depending on the selection of the main components: a precision syringe and a multi-way solenoid valve. An accuracy equivalent to that of commercial equipment has been obtained, but at lower cost. This article describes the mechanical design, the design calculations of the movement mechanism, the electronics and the automatic syringe dispenser control. (authors).

  11. Unstructured mesh based elastic wave modelling on GPU: a double-mesh grid method

    Science.gov (United States)

    Yang, Kai; Zhang, Jianfeng; Gao, Hongwei

    2017-11-01

    We present an unstructured mesh based numerical technique for modelling elastic wave propagation in heterogeneous media with complex geometrical settings. The scheme is developed by adapting the so-called grid method with a double-mesh implementation. The double mesh is generated by subdividing each triangular cell of the first-level mesh into a group of congruent smaller cells, obtained by dividing each edge of the triangle into equal parts. The resulting double-mesh grid method incorporates the advantages of structured- and unstructured-mesh schemes. The irregular, unstructured first-level mesh, which is generated by centroidal Voronoi tessellation based on Delaunay triangulation with a velocity-dependent density function, can accurately describe the surface topography and interfaces, and the size of the grid cells can vary according to local velocities. Congruent smaller cells within each cell of the first-level mesh greatly reduce the memory requirement for geometrical coefficients compared to a wholly irregular, unstructured mesh. Applying the double-mesh approach also alleviates the discontinuity of memory access caused mainly by the adoption of a fully unstructured mesh. As a result, the GPU implementation of the proposed scheme can obtain a high speedup rate. Numerical examples demonstrate the good behaviour of the double-mesh elastic grid method.
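
    A minimal sketch of the second-level subdivision described above: one first-level triangle is split into n² congruent sub-triangles by dividing each edge into n equal parts. The barycentric indexing scheme is my own illustration, not the authors' data structure.

```python
import numpy as np

def subdivide_triangle(v0, v1, v2, n):
    """Split the triangle (v0, v1, v2) into n**2 congruent sub-triangles by
    dividing each edge into n equal segments.  Returns (vertices, triangles)."""
    verts, index = [], {}
    for i in range(n + 1):
        for j in range(n + 1 - i):
            index[(i, j)] = len(verts)
            verts.append(v0 + (v1 - v0) * i / n + (v2 - v0) * j / n)
    tris = []
    for i in range(n):
        for j in range(n - i):
            # upward-pointing sub-triangle
            tris.append((index[(i, j)], index[(i + 1, j)], index[(i, j + 1)]))
            # downward-pointing sub-triangle (absent on the last diagonal row)
            if j < n - i - 1:
                tris.append((index[(i + 1, j)], index[(i + 1, j + 1)], index[(i, j + 1)]))
    return np.array(verts), tris

verts, tris = subdivide_triangle(np.array([0.0, 0.0]), np.array([1.0, 0.0]),
                                 np.array([0.0, 1.0]), n=4)
print(len(verts), len(tris))  # 15 vertices, 16 sub-triangles
```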

  12. A constrained Delaunay discretization method for adaptively meshing highly discontinuous geological media

    Science.gov (United States)

    Wang, Yang; Ma, Guowei; Ren, Feng; Li, Tuo

    2017-12-01

    A constrained Delaunay discretization method is developed to generate high-quality doubly adaptive meshes of highly discontinuous geological media. Complex features such as three-dimensional discrete fracture networks (DFNs), tunnels, shafts, slopes, boreholes, water curtains, and drainage systems are taken into account in the mesh generation. The constrained Delaunay triangulation method is used to create adaptive triangular elements on planar fractures. Persson's algorithm (Persson, 2005), based on an analogy between triangular elements and spring networks, is enriched to automatically discretize a planar fracture into mesh points with varying density and smooth-quality gradient. The triangulated planar fractures are treated as planar straight-line graphs (PSLGs) to construct piecewise-linear complex (PLC) for constrained Delaunay tetrahedralization. This guarantees the doubly adaptive characteristic of the resulted mesh: the mesh is adaptive not only along fractures but also in space. The quality of elements is compared with the results from an existing method. It is verified that the present method can generate smoother elements and a better distribution of element aspect ratios. Two numerical simulations are implemented to demonstrate that the present method can be applied to various simulations of complex geological media that contain a large number of discontinuities.
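
    Persson's algorithm treats the edges of the triangulation as compressive springs and relaxes the points toward equilibrium. The sketch below shows one relaxation step in that spirit; the force law, uniform rest length and step size are simplified illustrative choices rather than the enriched, density-weighted version used in the paper.

```python
import numpy as np

def spring_smooth_step(points, edges, h0=1.0, fscale=1.2, dt=0.2):
    """One explicit relaxation step of Persson-style spring smoothing.

    points : (n, 2) vertex coordinates on a planar fracture
    edges  : (m, 2) vertex index pairs from the Delaunay triangulation
    Edges shorter than their rest length push their endpoints apart;
    longer edges exert no force (purely repulsive springs).
    """
    vec = points[edges[:, 0]] - points[edges[:, 1]]
    length = np.linalg.norm(vec, axis=1)
    rest = fscale * h0                      # uniform rest length for simplicity
    force = np.maximum(rest - length, 0.0)  # only compressive forces
    fvec = (force / np.maximum(length, 1e-12))[:, None] * vec
    move = np.zeros_like(points)
    np.add.at(move, edges[:, 0],  fvec)
    np.add.at(move, edges[:, 1], -fvec)
    return points + dt * move
```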

  13. Meshing Highly Regular Structures: The Case of Super Carbon Nanotubes of Arbitrary Order

    Directory of Open Access Journals (Sweden)

    Christian Schröppel

    2015-01-01

    Full Text Available Mesh generation is an important step in many numerical methods. We present the “Hierarchical Graph Meshing” (HGM) method as a novel approach to mesh generation, based on algebraic graph theory. The HGM method can be used to systematically construct configurations exhibiting multiple hierarchies and complex symmetry characteristics. The hierarchical description of structures provided by the HGM method can be exploited to increase the efficiency of multiscale and multigrid methods. In this paper, the HGM method is employed for the systematic construction of super carbon nanotubes of arbitrary order, which present a pertinent example of structurally and geometrically complex, yet highly regular, structures. The HGM algorithm is computationally efficient and exhibits good scaling characteristics. In particular, it scales linearly for super carbon nanotube structures and works much faster than geometry-based methods employing neighborhood search algorithms. Its modular character makes it conducive to automation. For the generation of a mesh, the information about the geometry of the structure in a given configuration is added in a way that relates geometric symmetries to structural symmetries. The intrinsically hierarchical description of the resulting mesh greatly reduces the effort of determining mesh hierarchies for multigrid and multiscale applications and helps to exploit symmetry-related methods in the mechanical analysis of complex structures.

  14. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    Science.gov (United States)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for the fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  15. Cloud Detection from Satellite Imagery: A Comparison of Expert-Generated and Automatically-Generated Decision Trees

    Science.gov (United States)

    Shiffman, Smadar

    2004-01-01

    Automated cloud detection and tracking is an important step in assessing global climate change via remote sensing. Cloud masks, which indicate whether individual pixels depict clouds, are included in many of the data products that are based on data acquired on board earth satellites. Many cloud-mask algorithms have the form of decision trees, which employ sequential tests that scientists designed based on empirical astrophysics studies and astrophysics simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In this study we explored the potential benefits of automatically learned decision trees for detecting clouds from images acquired using the Advanced Very High Resolution Radiometer (AVHRR) instrument on board the NOAA-14 weather satellite of the National Oceanic and Atmospheric Administration. We constructed three decision trees for a sample of 8 km daily AVHRR data from 2000 using a decision-tree learning procedure provided within MATLAB(R), and compared the accuracy of the decision trees to the accuracy of the cloud mask. We used ground observations collected by the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System S'COOL project as the gold standard. For the sample data, the accuracy of the automatically learned decision trees was greater than the accuracy of the cloud masks included in the AVHRR data product.
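
    A hedged sketch of what "automatically learned" means here: fit a depth-limited decision tree to labeled pixels and score it against held-out reference labels. The per-pixel features and the synthetic data below are placeholders; the study itself used AVHRR channel data with MATLAB's tree learner and S'COOL ground observations.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# X: per-pixel features (e.g. channel reflectances / brightness temperatures),
# y: reference cloud labels from surface observations -- both placeholders here.
rng = np.random.default_rng(0)
X = rng.random((5000, 5))
y = (X[:, 0] + 0.5 * X[:, 4] > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_train, y_train)
print("accuracy vs. reference labels:", accuracy_score(y_test, tree.predict(X_test)))
```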

  16. Unstructured Polyhedral Mesh Thermal Radiation Diffusion

    International Nuclear Information System (INIS)

    Palmer, T.S.; Zika, M.R.; Madsen, N.K.

    2000-01-01

    Unstructured mesh particle transport and diffusion methods are gaining wider acceptance as mesh generation, scientific visualization and linear solvers improve. This paper describes an algorithm that is currently being used in the KULL code at Lawrence Livermore National Laboratory to solve the radiative transfer equations. The algorithm employs a point-centered diffusion discretization on arbitrary polyhedral meshes in 3D. We present the results of a few test problems to illustrate the capabilities of the radiation diffusion module

  17. Solution Approach to Automatic Generation Control Problem Using Hybridized Gravitational Search Algorithm Optimized PID and FOPID Controllers

    Directory of Open Access Journals (Sweden)

    DAHIYA, P.

    2015-05-01

    Full Text Available This paper presents the application of hybrid opposition based disruption operator in gravitational search algorithm (DOGSA to solve automatic generation control (AGC problem of four area hydro-thermal-gas interconnected power system. The proposed DOGSA approach combines the advantages of opposition based learning which enhances the speed of convergence and disruption operator which has the ability to further explore and exploit the search space of standard gravitational search algorithm (GSA. The addition of these two concepts to GSA increases its flexibility for solving the complex optimization problems. This paper addresses the design and performance analysis of DOGSA based proportional integral derivative (PID and fractional order proportional integral derivative (FOPID controllers for automatic generation control problem. The proposed approaches are demonstrated by comparing the results with the standard GSA, opposition learning based GSA (OGSA and disruption based GSA (DGSA. The sensitivity analysis is also carried out to study the robustness of DOGSA tuned controllers in order to accommodate variations in operating load conditions, tie-line synchronizing coefficient, time constants of governor and turbine. Further, the approaches are extended to a more realistic power system model by considering the physical constraints such as thermal turbine generation rate constraint, speed governor dead band and time delay.

  18. Automatic Generation of Structural Building Descriptions from 3D Point Cloud Scans

    DEFF Research Database (Denmark)

    Ochmann, Sebastian; Vock, Richard; Wessel, Raoul

    2013-01-01

    We present a new method for automatic semantic structuring of 3D point clouds representing buildings. In contrast to existing approaches, which either target the outside appearance like the facade structure or rather low-level geometric structures, we focus on the building’s interior, using indoor scans to derive high-level architectural entities like rooms and doors. Starting with a registered 3D point cloud, we probabilistically model the affiliation of each measured point to a certain room in the building. We solve the resulting clustering problem using an iterative algorithm that relies...

  19. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which

  20. AUTOMATIC TEXTURE RECONSTRUCTION OF 3D CITY MODEL FROM OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    J. Kang

    2016-06-01

    Full Text Available In recent years, photorealistic 3D city models have become increasingly important in various geospatial applications related to virtual city tourism, 3D GIS, urban planning and real-estate management. Besides the acquisition of high-precision 3D geometric data, texture reconstruction is also a crucial step for generating high-quality and visually realistic 3D models. However, most texture reconstruction approaches tend to produce texture fragmentation and use memory inefficiently. In this paper, we introduce an automatic texture reconstruction framework that generates textures from oblique images for photorealistic visualization. Our approach includes three major steps: mesh parameterization, texture atlas generation and texture blending. Firstly, a mesh parameterization procedure consisting of mesh segmentation and mesh unfolding is performed to reduce geometric distortion when mapping 2D textures onto the 3D model. Secondly, in the texture atlas generation step, the texture of each segmented region in the texture domain is reconstructed from all visible images using their exterior and interior orientation parameters. Thirdly, to avoid color discontinuities at boundaries between texture regions, the final texture map is generated by blending texture maps from several corresponding images. We evaluated our texture reconstruction framework on a city dataset. The resulting mesh model can be textured with the created textures without resampling. Experimental results show that our method effectively mitigates texture fragmentation. It is demonstrated that the proposed framework is effective and useful for automatic texture reconstruction of 3D city models.
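
    As a small, hypothetical illustration of one ingredient of the texture atlas step described above, the sketch below picks, for every mesh face, the camera (oblique image) that views it most frontally; camera centres and the triangle list are assumed given, and occlusion handling, parameterization and blending are omitted.

    ```python
    # Hypothetical sketch of one ingredient of texture-atlas generation:
    # choosing, for every mesh face, the image that views it most frontally.
    # Camera centres (exterior orientation) and the triangle list are assumed
    # to be given; occlusion testing is omitted.
    import numpy as np

    def face_normals_and_centres(vertices, faces):
        v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
        n = np.cross(v1 - v0, v2 - v0)
        n /= np.linalg.norm(n, axis=1, keepdims=True)
        return n, (v0 + v1 + v2) / 3.0

    def select_best_images(vertices, faces, camera_centres):
        """Return, per face, the index of the camera with the most frontal view."""
        normals, centres = face_normals_and_centres(vertices, faces)
        best = np.full(len(faces), -1)
        best_cos = np.full(len(faces), -1.0)
        for ci, c in enumerate(camera_centres):
            view_dir = c - centres
            view_dir /= np.linalg.norm(view_dir, axis=1, keepdims=True)
            cos = np.einsum('ij,ij->i', normals, view_dir)   # normal / view alignment
            improved = cos > best_cos
            best[improved], best_cos[improved] = ci, cos[improved]
        return best

    # toy usage: a single upward-facing triangle seen by two cameras
    verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
    tris = np.array([[0, 1, 2]])
    cams = np.array([[0.3, 0.3, 5.0], [5.0, 0.3, 0.5]])
    print(select_best_images(verts, tris, cams))   # -> [0], the overhead camera
    ```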

  1. Automatic Texture Reconstruction of 3d City Model from Oblique Images

    Science.gov (United States)

    Kang, Junhua; Deng, Fei; Li, Xinwei; Wan, Fang

    2016-06-01

    In recent years, photorealistic 3D city models have become increasingly important in various geospatial applications related to virtual city tourism, 3D GIS, urban planning and real-estate management. Besides the acquisition of high-precision 3D geometric data, texture reconstruction is also a crucial step for generating high-quality and visually realistic 3D models. However, most texture reconstruction approaches tend to produce texture fragmentation and use memory inefficiently. In this paper, we introduce an automatic texture reconstruction framework that generates textures from oblique images for photorealistic visualization. Our approach includes three major steps: mesh parameterization, texture atlas generation and texture blending. Firstly, a mesh parameterization procedure consisting of mesh segmentation and mesh unfolding is performed to reduce geometric distortion when mapping 2D textures onto the 3D model. Secondly, in the texture atlas generation step, the texture of each segmented region in the texture domain is reconstructed from all visible images using their exterior and interior orientation parameters. Thirdly, to avoid color discontinuities at boundaries between texture regions, the final texture map is generated by blending texture maps from several corresponding images. We evaluated our texture reconstruction framework on a city dataset. The resulting mesh model can be textured with the created textures without resampling. Experimental results show that our method effectively mitigates texture fragmentation. It is demonstrated that the proposed framework is effective and useful for automatic texture reconstruction of 3D city models.

  2. Automatic selection of informative sentences: The sentences that can generate multiple choice questions

    Directory of Open Access Journals (Sweden)

    Mukta Majumder

    2014-12-01

    Full Text Available Traditional education cannot meet the expectations and requirements of a Smart City; it requires more advanced forms such as active learning and ICT-based education. Multiple choice questions (MCQs) play an important role in educational assessment and active learning, which has a key role in Smart City education. MCQs are effective for assessing the understanding of well-defined concepts. Only a fraction of the sentences of a text contain well-defined concepts or information that can be asked as an MCQ. These informative sentences must first be identified in order to prepare multiple choice questions manually or automatically. In this paper we propose a technique for the automatic identification of such informative sentences that can act as the basis of MCQs. The technique is based on parse structure similarity. A reference set of parse structures is compiled with the help of existing MCQs. The parse structure of a new sentence is compared with the reference structures, and if similarity is found the sentence is considered a potential candidate. Next, a rule-based post-processing module works on these potential candidates to select the final set of informative sentences. The proposed approach is tested in the sports domain, where many MCQs are readily available for preparing the reference set of structures. The quality of the sentences selected by the system is evaluated manually. The experimental results show that the proposed technique is quite promising.
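
    The following sketch is a crude, hypothetical stand-in for the parse structure similarity test described above: POS-tag sequences replace full parse trees, difflib supplies the similarity measure, and the threshold is an arbitrary placeholder rather than a value taken from the paper.

    ```python
    # Hypothetical sketch: a sentence is an MCQ candidate if its (shallow) parse
    # structure is similar to one in a reference set built from existing MCQ stems.
    # POS-tag sequences stand in for full parse trees here.
    import difflib
    import nltk   # assumes the punkt and averaged_perceptron_tagger data are installed

    def structure(sentence):
        tokens = nltk.word_tokenize(sentence)
        return [tag for _, tag in nltk.pos_tag(tokens)]     # e.g. ['NNP', 'VBD', ...]

    def is_candidate(sentence, reference_structures, threshold=0.8):
        sig = structure(sentence)
        for ref in reference_structures:
            if difflib.SequenceMatcher(None, sig, ref).ratio() >= threshold:
                return True
        return False

    references = [structure("Brazil won the World Cup in 2002.")]
    print(is_candidate("Germany won the championship in 2014.", references))  # likely True
    print(is_candidate("What a great match!", references))                    # likely False
    ```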

  3. Framework for automatic generation of facades on free-form surfaces

    Directory of Open Access Journals (Sweden)

    Diego Andrade

    2017-09-01

    Full Text Available New design tools have created a growing interest in presenting complex geometries and patterns. The need to form curved facade geometries without incurring high construction costs and schedule increases presents one of the most complex design challenges for any project. In this paper, we present and demonstrate a new computational framework for the creation of patterns on top of facades, via cladding of panels and honeycomb structures. The tool describes a given region on a base model, dealing in particular with the location, size and orientation of general geometric features on the surface of that model. The user inputs curves that express the desired intention for the panels and a set of seed features that correspond to the initial boundary conditions of a Riemannian metric tensor field. The system interpolates the tensors defined by the input features and input curves by solving a Laplace-Beltrami partial differential equation over the entire domain. We show fast clustering and search operations for correct panel utilization, based on size quantization as a design variable and implemented via Voronoi segmentation. We present honeycomb structures that can be retrieved from the fundamental mesh, providing another option for facade creation and ideation. The system connects to the geometric modeling kernel of a commercial CAD package; it places features on top of the base model facade using Boolean operations from the core geometric engine via its programming interface calls. With this computational tool, thousands of clad panels can be visualized and developed within minutes.
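
    As a hypothetical illustration of the panel-layout step only (the tensor-field interpolation via the Laplace-Beltrami equation is not reproduced), the sketch below quantizes candidate panel centres on a flattened facade with k-means and segments the region with a Voronoi diagram of the resulting centres.

    ```python
    # Hypothetical sketch of the panel-layout step: seed points scattered over a
    # (flattened) facade region are grouped by k-means "size quantisation", and a
    # Voronoi diagram of the representative centres yields the panel cells.
    import numpy as np
    from scipy.cluster.vq import kmeans2
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(0)
    seeds = rng.uniform(0.0, 10.0, size=(200, 2))     # candidate panel centres on a 10x10 facade

    # "size quantisation": reduce the seeds to a small set of representative centres
    n_panels = 40
    centres, labels = kmeans2(seeds, n_panels, minit='++')

    # Voronoi segmentation of the facade by the representative centres
    vor = Voronoi(centres)
    print("panels:", len(centres), "voronoi regions:", len(vor.regions))
    ```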

  4. A NEW APPROACH FOR THE SEMI-AUTOMATIC TEXTURE GENERATION OF THE BUILDINGS FACADES, FROM TERRESTRIAL LASER SCANNER DATA

    Directory of Open Access Journals (Sweden)

    E. Oniga

    2012-07-01

    Full Text Available The result of terrestrial laser scanning is an impressive number of spatial points, each characterized by its position (the X, Y and Z coordinates), its laser reflectance value and its real color, expressed as RGB (Red, Green, Blue) values. The color code for each LIDAR point is taken from the georeferenced digital images acquired with a high-resolution panoramic camera incorporated in the scanner system. In this article a new algorithm is proposed for semi-automatic texture generation, using the color information (the RGB values of every point acquired by terrestrial laser scanning) and the 3D surfaces defining the building facades, generated with the Leica Cyclone software. In the first step, the operator defines the limiting value, i.e. the minimum distance between a point and the closest surface. The second step consists of calculating the distances, i.e. the perpendiculars drawn from each point to the closest surface. In the third step, each point with known 3D coordinates is associated with a surface, depending on the limiting value. The fourth step consists of computing the Voronoi diagram for the points that belong to a surface. The final step automatically associates the RGB color code of each point with the corresponding polygon of the Voronoi diagram. The advantage of using this algorithm is that a photorealistic 3D model of the building can be obtained in a semi-automatic manner.
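
    A minimal, hypothetical sketch of steps one to three of the algorithm described above: facade surfaces are simplified to planes (a point plus a unit normal each), and every scanned point is assigned to the nearest plane provided its perpendicular distance stays below the operator-chosen limiting value; the per-surface Voronoi and colour-assignment steps are omitted.

    ```python
    # Hypothetical sketch of steps 1-3: assign each scanned point to the closest
    # facade plane, but only if its perpendicular distance is below the limit.
    import numpy as np

    def assign_points_to_surfaces(points, plane_points, plane_normals, limit):
        """points: (N,3); plane_points/plane_normals: (M,3); returns (N,) surface index or -1."""
        # perpendicular distance of every point to every plane: |(p - p0) . n|
        diff = points[:, None, :] - plane_points[None, :, :]
        dist = np.abs(np.einsum('nmk,mk->nm', diff, plane_normals))
        nearest = dist.argmin(axis=1)
        nearest_dist = dist[np.arange(len(points)), nearest]
        nearest[nearest_dist > limit] = -1     # farther than the limiting value: unassigned
        return nearest

    # toy usage: two vertical facade planes and three scanned points
    planes_p = np.array([[0., 0., 0.], [5., 0., 0.]])
    planes_n = np.array([[1., 0., 0.], [1., 0., 0.]])
    pts = np.array([[0.02, 1.0, 2.0], [5.03, 0.5, 1.0], [2.5, 0.0, 0.0]])
    print(assign_points_to_surfaces(pts, planes_p, planes_n, limit=0.05))   # -> [0 1 -1]
    ```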

  5. Wind power integration into the automatic generation control of power systems with large-scale wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2014-01-01

    Transmission system operators have an increased interest in the active participation of wind power plants (WPP) in the power balance control of power systems with large wind power penetration. The emphasis in this study is on the integration of WPPs into the automatic generation control (AGC......) of the power system. The present paper proposes a coordinated control strategy for the AGC between combined heat and power plants (CHPs) and WPPs to enhance the security and the reliability of a power system operation in the case of a large wind power penetration. The proposed strategy, described...... and exemplified for the future Danish power system, takes the hour-ahead regulating power plan for generation and power exchange with neighbouring power systems into account. The performance of the proposed strategy for coordinated secondary control is assessed and discussed by means of simulations for different...

  6. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    distribution govern groundwater flow. The coupling between hydrological and geophysical parameters is managed using a translator function with spatially variable parameters followed by a 3D zonation. The translator function translates geophysical resistivities into clay fractions and is calibrated...... with observed lithological data. Principal components are computed for the translated clay fractions and geophysical resistivities. Zonation is carried out by k-means clustering on the principal components. The hydraulic parameters of the zones are determined in a hydrological model calibration using head...... and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above....
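
    A hypothetical sketch of the zonation chain described above: a placeholder (uncalibrated) translator function maps resistivities to clay fractions, PCA is run on the combined attributes, and k-means clustering of the principal components yields the zones; the hydrological calibration of zone parameters is not reproduced.

    ```python
    # Hypothetical sketch: resistivity -> clay fraction (placeholder translator),
    # then PCA on the combined attributes, then k-means zonation of the scores.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def translator(log_resistivity, low=1.3, high=2.0):
        """Placeholder translator: clay fraction falls from 1 to 0 between two log-rho bounds."""
        return np.clip((high - log_resistivity) / (high - low), 0.0, 1.0)

    rng = np.random.default_rng(0)
    log_rho = rng.normal(loc=1.7, scale=0.4, size=(5000, 1))   # synthetic log10-resistivity samples
    clay = translator(log_rho)

    features = np.hstack([log_rho, clay])
    scores = PCA(n_components=2).fit_transform(features)        # principal components
    zones = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    print("cells per zone:", np.bincount(zones))
    ```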

  7. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    and heterogeneity, which spatially scarce borehole lithology data may overlook, are well resolved in AEM surveys. This study presents a semi-automatic sequential hydrogeophysical inversion method for the integration of AEM and borehole data into regional groundwater models in sedimentary areas, where sand/ clay...... distribution govern groundwater flow. The coupling between hydrological and geophysical parameters is managed using a translator function with spatially variable parameters followed by a 3D zonation. The translator function translates geophysical resistivities into clay fractions and is calibrated...... with observed lithological data. Principal components are computed for the translated clay fractions and geophysical resistivities. Zonation is carried out by k-means clustering on the principal components. The hydraulic parameters of the zones are determined in a hydrological model calibration using head...

  8. Automatic Generation of Structural Building Descriptions from 3D Point Cloud Scans

    DEFF Research Database (Denmark)

    Ochmann, Sebastian; Vock, Richard; Wessel, Raoul

    2013-01-01

    We present a new method for automatic semantic structuring of 3D point clouds representing buildings. In contrast to existing approaches which either target the outside appearance like the facade structure or rather low-level geometric structures, we focus on the building’s interior using indoor...... scans to derive high-level architectural entities like rooms and doors. Starting with a registered 3D point cloud, we probabilistically model the affiliation of each measured point to a certain room in the building. We solve the resulting clustering problem using an iterative algorithm that relies...... on the estimated visibilities between any two locations within the point cloud. With the segmentation into rooms at hand, we subsequently determine the locations and extents of doors between adjacent rooms. In our experiments, we demonstrate the feasibility of our method by applying it to synthetic as well...

  9. CarSim: Automatic 3D Scene Generation of a Car Accident Description

    NARCIS (Netherlands)

    Egges, A.; Nijholt, A.; Nugues, P.

    2001-01-01

    The problem of generating a 3D simulation of a car accident from a written description can be divided into two subtasks: the linguistic analysis and the virtual scene generation. As a means of communication between these two system parts, we designed a template formalism to represent a written

  10. Automatically Generating Questions to Support the Acquisition of Particle Verbs: Evaluating via Crowdsourcing

    Science.gov (United States)

    Chinkina, Maria; Ruiz, Simón; Meurers, Detmar

    2017-01-01

    We integrate insights from research in Second Language Acquisition (SLA) and Computational Linguistics (CL) to generate text-based questions. We discuss the generation of wh- questions as functionally-driven input enhancement facilitating the acquisition of particle verbs and report the results of two crowdsourcing studies. The first study shows…

  11. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

    This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  12. Mesh network simulation

    OpenAIRE

    Pei, Ping; Petrenko, Y. N.

    2015-01-01

    A mesh network simulation framework which provides a powerful and concise modeling chain for a network structure is introduced in this report. Mesh networks have a special topological structure. The paper investigates message transfer in wireless mesh network simulation and how it works in cellular network simulation. Finally, the experimental results show that mesh networks follow a different transmission principle than cellular networks, and multi...

  13. Connectivity editing for quadrilateral meshes

    KAUST Repository

    Peng, Chihan

    2011-12-01

    We propose new connectivity editing operations for quadrilateral meshes with the unique ability to explicitly control the location, orientation, type, and number of the irregular vertices (valence not equal to four) in the mesh while preserving sharp edges. We provide theoretical analysis on what editing operations are possible and impossible and introduce three fundamental operations to move and re-orient a pair of irregular vertices. We argue that our editing operations are fundamental, because they only change the quad mesh in the smallest possible region and involve the fewest irregular vertices (i.e., two). The irregular vertex movement operations are supplemented by operations for the splitting, merging, canceling, and aligning of irregular vertices. We explain how the proposed high-level operations are realized through graph-level editing operations such as quad collapses, edge flips, and edge splits. The utility of these mesh editing operations is demonstrated by improving the connectivity of quad meshes generated from state-of-the-art quadrangulation techniques.
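
    The operations above revolve around irregular vertices (valence not equal to four). As a small, hypothetical companion sketch, the helper below merely locates such vertices in a quad mesh given as a list of four-vertex faces; the collapse, flip and split operations themselves are not reproduced.

    ```python
    # Hypothetical sketch: locate irregular vertices (valence != 4) in a quad mesh
    # described as a list of 4-vertex faces.
    from collections import defaultdict

    def irregular_vertices(quads):
        """quads: iterable of 4-tuples of vertex ids; returns {vertex: valence} for valence != 4."""
        neighbours = defaultdict(set)
        for a, b, c, d in quads:
            cycle = (a, b, c, d, a)
            for u, v in zip(cycle, cycle[1:]):
                neighbours[u].add(v)
                neighbours[v].add(u)
        return {v: len(nbrs) for v, nbrs in neighbours.items() if len(nbrs) != 4}

    # toy usage: a 3x3 grid of vertices forming 4 quads; only the centre vertex is regular
    grid = lambda r, c: 3 * r + c
    quads = [(grid(r, c), grid(r, c + 1), grid(r + 1, c + 1), grid(r + 1, c))
             for r in range(2) for c in range(2)]
    print(irregular_vertices(quads))   # boundary vertices show valence 2 or 3; vertex 4 is absent
    ```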

  14. Connectivity editing for quadrilateral meshes

    KAUST Repository

    Peng, Chihan

    2011-12-12

    We propose new connectivity editing operations for quadrilateral meshes with the unique ability to explicitly control the location, orientation, type, and number of the irregular vertices (valence not equal to four) in the mesh while preserving sharp edges. We provide theoretical analysis on what editing operations are possible and impossible and introduce three fundamental operations to move and re-orient a pair of irregular vertices. We argue that our editing operations are fundamental, because they only change the quad mesh in the smallest possible region and involve the fewest irregular vertices (i.e., two). The irregular vertex movement operations are supplemented by operations for the splitting, merging, canceling, and aligning of irregular vertices. We explain how the proposed high-level operations are realized through graph-level editing operations such as quad collapses, edge flips, and edge splits. The utility of these mesh editing operations is demonstrated by improving the connectivity of quad meshes generated from state-of-the-art quadrangulation techniques. © 2011 ACM.

  15. Mesh control information of windmill designed by Solidwork program

    Science.gov (United States)

    Mulyana, T.; Sebayang, D.; Rafsanjani, A. M. D.; Adani, J. H. D.; Muhyiddin, Y. S.

    2017-12-01

    This paper presents the mesh control information imposed on a previously designed windmill. The accuracy of simulation results is influenced by the quality of the created mesh. However, as the quality of the mesh increases, the simulation running time also increases. The smaller the elements created when making the mesh, the better the resulting mesh quality. When adjusting the mesh size, a slider acts as the density regulator of the elements. SolidWorks Simulation also has a Mesh Control facility, a feature that can adjust the mesh density only in the desired part. The best mesh control results obtained for both static and thermal simulation have a ratio of 1.5.

  16. Highly Symmetric and Congruently Tiled Meshes for Shells and Domes

    Science.gov (United States)

    Rasheed, Muhibur; Bajaj, Chandrajit

    2016-01-01

    We describe the generation of all possible shell and dome shapes that can be uniquely meshed (tiled) using a single type of mesh face (tile), and following a single meshing (tiling) rule that governs the mesh (tile) arrangement with maximal vertex, edge and face symmetries. Such tiling arrangements, or congruently tiled meshed shapes, are frequently found in chemical forms (fullerenes or Bucky balls, crystals, quasi-crystals, virus nano shells or capsids), and synthetic shapes (cages, sports domes, modern architectural facades). Congruently tiled meshes are both aesthetic and complete, as they support maximal mesh symmetries with minimal complexity and possess simple generation rules. Here, we generate congruent tilings and meshed shape layouts that satisfy these optimality conditions. Further, the congruent meshes are uniquely mappable to an almost regular 3D polyhedron (or its dual polyhedron), which exhibits face-transitive (and edge-transitive) congruency with at most two types of vertices (each type transitive to the other). The family of all such congruently meshed polyhedra creates a new class of meshed shapes, beyond the well-studied regular, semi-regular and quasi-regular classes and their duals (Platonic, Catalan and Johnson). While our new mesh class is infinite, we prove that there exists a unique mesh parametrization, where each member of the class can be represented by two integer lattice variables, and that it is moreover efficiently constructible. PMID:27563368

  17. Automatic Generation of Overlays and Offset Values Based on Visiting Vehicle Telemetry and RWS Visuals

    Science.gov (United States)

    Dunne, Matthew J.

    2011-01-01

    The development of computer software as a tool to generate visual displays has led to an overall expansion of automated computer generated images in the aerospace industry. These visual overlays are generated by combining raw data with pre-existing data on the object or objects being analyzed on the screen. The National Aeronautics and Space Administration (NASA) uses this computer software to generate on-screen overlays when a Visiting Vehicle (VV) is berthing with the International Space Station (ISS). In order for Mission Control Center personnel to be a contributing factor in the VV berthing process, computer software similar to that on the ISS must be readily available on the ground to be used for analysis. In addition, this software must perform engineering calculations and save data for further analysis.

  18. Automatic navigation path generation based on two-phase adaptive region-growing algorithm for virtual angioscopy.

    Science.gov (United States)

    Kim, Do-Yeon; Chung, Sung-Mo; Park, Jong-Won

    2006-05-01

    In this paper, we propose a fast and automated navigation path generation algorithm to visualize the inside of the carotid artery using MR angiography images. The carotid artery is one of the body regions not accessible by a real optical probe, but it can be visualized with virtual endoscopy. By applying a two-phase adaptive region-growing algorithm, carotid artery segmentation is started at an initial seed located on the initially thresholded binary image. The segmentation algorithm automatically detects branch positions using a stack feature. Combined with a priori knowledge of the anatomic structure of the carotid artery, the detected branch position is used to separate the carotid artery into the internal carotid artery and the external carotid artery. A fly-through path is determined to automatically move the virtual camera, based on the intersecting coordinates of two bisectors on the circumscribed quadrangle of the segmented carotid artery. In consideration of interactive rendering speed and the usability of standard graphics hardware, the endoscopic view of the carotid artery is generated using a surface rendering algorithm with the perspective projection method. In addition, the endoscopic view is provided with a ray casting algorithm for off-line navigation of the carotid artery. Experiments have been conducted on both mathematical phantom and clinical data sets. This algorithm is more effective than key-framing and topological thinning methods in terms of automation and computing time. This algorithm is also applicable to generating the centerline of the renal artery, coronary artery, and airway tree, which have tree-like cylindrical organ structures in medical imagery.
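
    A minimal, hypothetical sketch of the core region-growing step only: starting from a seed voxel, 6-connected neighbours are added while their intensity stays within a band around the seed value. The paper's two-phase adaptive thresholding, branch detection and path computation are not reproduced.

    ```python
    # Hypothetical sketch of basic intensity-based region growing in a 3D volume.
    import numpy as np
    from collections import deque

    def region_grow(volume, seed, tolerance):
        grown = np.zeros(volume.shape, dtype=bool)
        seed_val = float(volume[seed])
        queue = deque([seed])
        grown[seed] = True
        offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        while queue:
            z, y, x = queue.popleft()
            for dz, dy, dx in offsets:
                n = (z + dz, y + dy, x + dx)
                inside = all(0 <= n[i] < volume.shape[i] for i in range(3))
                if inside and not grown[n] and abs(float(volume[n]) - seed_val) <= tolerance:
                    grown[n] = True
                    queue.append(n)
        return grown

    # toy usage: a bright "vessel" along one axis inside a dark volume
    vol = np.zeros((5, 5, 20), dtype=np.float32)
    vol[2, 2, :] = 100.0
    mask = region_grow(vol, seed=(2, 2, 0), tolerance=10.0)
    print(mask.sum())   # -> 20 voxels, the whole synthetic vessel
    ```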

  19. CarSim: Automatic 3D Scene Generation of a Car Accident Description

    OpenAIRE

    Egges, A.; Nijholt, A.; Nugues, P.

    2001-01-01

    The problem of generating a 3D simulation of a car accident from a written description can be divided into two subtasks: the linguistic analysis and the virtual scene generation. As a means of communication between these two system parts, we designed a template formalism to represent a written accident report. The CarSim system processes formal descriptions of accidents and creates corresponding 3D simulations. A planning component models the trajectories and temporal values of every vehicle ...

  20. Effective System for Automatic Bundle Block Adjustment and Ortho Image Generation from Multi Sensor Satellite Imagery

    Science.gov (United States)

    Akilan, A.; Nagasubramanian, V.; Chaudhry, A.; Reddy, D. Rajesh; Sudheer Reddy, D.; Usha Devi, R.; Tirupati, T.; Radhadevi, P. V.; Varadan, G.

    2014-11-01

    Block adjustment is a technique for large area mapping with images obtained from different remote sensing satellites. The challenge in this process is to handle, at the system level, a huge number of satellite images from different sources with different resolutions and accuracies. This paper explains a system with various tools and techniques to effectively handle the end-to-end chain in large area mapping and production with a good level of automation, along with provisions for intuitive analysis of the final results in 3D and 2D environments. In addition, the interface for using open source ortho and DEM references, viz. ETM, SRTM etc., and for displaying ESRI shapes for the image footprints is explained. Rigorous theory, mathematical modelling, workflow automation and sophisticated software engineering tools are included to ensure high photogrammetric accuracy and productivity. Major building blocks like the Georeferencing, Geo-capturing and Geo-Modelling tools included in the block adjustment solution are explained in this paper. To provide an optimal bundle block adjustment solution with high-precision results, the system has been optimized in many stages to fully exploit the available hardware resources. The robustness of the system is ensured by handling failures in the automatic procedure and saving the process state at every stage for subsequent restoration from the point of interruption. The results obtained from the various stages of the system are presented in the paper.

  1. Automatic generation of virtual worlds from architectural and mechanical CAD models

    International Nuclear Information System (INIS)

    Szepielak, D.

    2003-12-01

    Accelerator projects like the XFEL or the planned linear collider TESLA involve extensive architectural and mechanical design work, resulting in a variety of CAD models. The CAD models show different parts of the project, e.g. the different accelerator components or parts of the building complexes, and they are created and stored by different groups in different formats. A complete CAD model of the accelerator and its buildings is thus difficult to obtain and would also be extremely large and difficult to handle. This thesis describes the design and prototype development of a tool which automatically creates virtual worlds from different CAD models. The tool enables the user to select a required area for visualization on a map, and then creates a 3D model of the selected area which can be displayed in a web browser. The thesis first discusses the system requirements and provides some background on data visualization. Then, it introduces the system architecture, the algorithms and the technologies used, and finally demonstrates the capabilities of the system using two case studies. (orig.)

  2. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    Science.gov (United States)

    2013-01-01

    Background The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength relies on its generality and modularity, since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture following a component-based approach. The architecture is described in depth and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has then been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology implemented in terms of the proposed architecture. PMID:23368970

  3. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    Science.gov (United States)

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength relies on its generality and modularity, since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture following a component-based approach. The architecture is described in depth and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has then been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the proposed architecture.

  4. Automatic generation of aesthetic patterns on fractal tilings by means of dynamical systems

    International Nuclear Information System (INIS)

    Chung, K.W.; Ma, H.M.

    2005-01-01

    A fractal tiling or f-tiling is a tiling which possesses self-similarity and whose boundary is a fractal. In this paper, we investigate the classification of fractal tilings with kite-shaped and dart-shaped prototiles, from which three new f-tilings are found. Invariant mappings are constructed for the creation of aesthetic patterns on such tilings. A modified convergence time scheme is described, which reflects the rate of convergence of various orbits and, at the same time, enhances the artistic appeal of a generated image. A scheme based on the frequency of visits to a pixel is used to generate chaotic attractors.
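
    As a generic, hypothetical illustration of a convergence-time colouring scheme (not the invariant mappings on f-tilings constructed in the paper), the sketch below colours each pixel by the number of iterations its orbit needs to settle under an arbitrary example map.

    ```python
    # Hypothetical sketch: colour pixels by the iteration count until the orbit of
    # an example map stops moving; non-converging orbits saturate at max_iter.
    import numpy as np

    def convergence_time(c, max_iter=60, eps=1e-6):
        """Iterate z <- 0.5*sin(z) + c and count steps until the orbit settles."""
        z = 0.0 + 0.0j
        for k in range(max_iter):
            z_next = 0.5 * np.sin(z) + c
            if abs(z_next - z) < eps:
                return k
            z = z_next
        return max_iter

    # colour a small grid of pixels by convergence time
    xs, ys = np.meshgrid(np.linspace(-2, 2, 100), np.linspace(-2, 2, 100))
    colours = np.empty(xs.shape, dtype=int)
    for i in range(xs.shape[0]):
        for j in range(xs.shape[1]):
            colours[i, j] = convergence_time(complex(xs[i, j], ys[i, j]))
    print(colours.min(), colours.max())   # spread of convergence times used as a palette index
    ```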

  5. Accuracy of automatic tube compensation in new-generation mechanical ventilators.

    Science.gov (United States)

    Elsasser, Serge; Guttmann, Josef; Stocker, Reto; Mols, Georg; Priebe, Hans-Joachim; Haberthür, Christoph

    2003-11-01

    To compare performance of flow-adapted compensation of endotracheal tube resistance (automatic tube compensation, ATC) between the original ATC system and ATC systems incorporated in commercially available ventilators. Bench study. University research laboratory. The original ATC system, Dräger Evita 2 prototype, Dräger Evita 4, Puritan-Bennett 840. The four ventilators under investigation were alternately connected via different-sized endotracheal tubes and an artificial trachea to an active lung model. Test conditions consisted of two ventilatory modes (ATC vs. continuous positive airway pressure), three different-sized endotracheal tubes (inner diameter 7.0, 8.0, and 9.0 mm), two ventilatory rates (15/min and 30/min), and four levels of positive end-expiratory pressure (0, 5, 10, and 15 cm H2O). Performance of tube compensation was assessed by the amount of tube-related (additional) work of breathing (WOBadd), which was calculated on the basis of the pressure gradient across the endotracheal tube. Compared with continuous positive airway pressure, ATC reduced inspiratory WOBadd by 58%, 68%, 50%, and 97% when using the Evita 4, the Evita 2 prototype, the Puritan-Bennett 840, and the original ATC system, respectively. Depending on endotracheal tube diameter and ventilatory pattern, inspiratory WOBadd was 0.12-5.2 J/L with the original ATC system, 1.5-28.9 J/L with the Puritan-Bennett 840, 10.4-21.0 J/L with the Evita 2 prototype, and 10.1-36.1 J/L with the Evita 4 (difference between ventilators at identical test conditions, p < .025). Flow-adapted tube compensation by the original ATC system significantly reduced tube-related inspiratory and expiratory work of breathing. The commercially available ATC modes investigated here may be adequate for inspiratory but probably not for expiratory tube compensation.

  6. On the Automatic Generation of Plans for Life Cycle Assembly Processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.

    2000-01-01

    Designing products for easy assembly and disassembly during their entire life cycles for purposes including product assembly, product upgrade, product servicing and repair, and product disposal is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and by considering its intended life cycle. Different goals and manufacturing plan selection criteria, as compared to initial assembly, require re-visiting significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or to applied studies of life cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes. The study of assembly planning is at the very heart of manufacturing research facilities and academic engineering institutions, and in recent years a number of significant advances in the field of assembly planning have been made. These advances have ranged from the development of automated assembly planning systems, such as Sandia's Automated Assembly Analysis System Archimedes 3.0©, to the startling revolution in microprocessors and computer-controlled production tools such as computer-aided design (CAD), computer-aided manufacturing (CAM), flexible manufacturing systems (FMS), and computer-integrated manufacturing (CIM). These results have kindled considerable interest in the study of algorithms for life cycle related assembly processes and have blossomed into a field of intense interest. The intent of this manuscript is to bring together the fundamental results in this area, so that the unifying principles and underlying concepts of algorithm design may more easily be implemented in practice.

  7. Design of an optimal SMES for automatic generation control of two-area thermal power system using Cuckoo search algorithm

    Directory of Open Access Journals (Sweden)

    Sabita Chaine

    2015-05-01

    Full Text Available This work presents a methodology adopted to tune the controller parameters of a superconducting magnetic energy storage (SMES) system in the automatic generation control (AGC) of a two-area thermal power system. The gains of the integral controllers of the AGC loop, the proportional controller of the SMES loop and the gains of the current feedback loop of the inductor in the SMES are optimized simultaneously in order to achieve the desired performance. A recently proposed intelligent technique known as the Cuckoo search algorithm (CSA) is applied for the optimization. The sensitivity and robustness of the tuned gains, tested at different operating conditions, prove the effectiveness of fast-acting energy storage devices like SMES in damping out oscillations in the power system when their controllers are properly tuned.

  8. MISMATCH: A basis for semi-automatic functional mixed-signal test-pattern generation

    NARCIS (Netherlands)

    Kerkhoff, Hans G.; Tangelder, R.J.W.T.; Speek, Han; Engin, N.

    1996-01-01

    This paper describes a tool which assists the designer in the rapid generation of functional tests for mixed-signal circuits down to the actual test-signals for the tester. The tool is based on manipulating design data, making use of macro-based test libraries and tester resources provided by the

  9. Solution adaptive mesh using moving mesh method

    International Nuclear Information System (INIS)

    Tilak, A.S.; Tong, A.Y.; Liao, G.

    2004-01-01

    This work deals with a mesh adaptation strategy to enhance the accuracy of numerical solutions of partial differential equations. This was achieved economically by employing the moving grid finite difference method. The method was reformulated as a first-order div-curl system. This system was then solved using the least-squares finite element method (LSFEM). The reformulation of the method has two desirable effects. First, it eliminates the expensive gradient computation in the original method, and second, it allows the method to be employed for mesh adaptation with dynamic boundaries. A 2-D general finite element code implementing the mesh adaptation method based on LSFEM, capable of analyzing self-adjoint problems in elasticity and heat transfer with a variety of boundary conditions, sources or sinks, was developed and thoroughly validated. The code was used to analyze and adapt meshes for problems in heat transfer and elasticity. The method was found to perform satisfactorily in all test cases. (author)
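
    For orientation, a generic first-order div-curl system and the least-squares functional minimized by LSFEM can be written as below; the source term g would be supplied by the monitor function driving the node movement, and this is a textbook form rather than the exact formulation of the cited work.

    ```latex
    % generic div-curl system and LSFEM functional (illustrative form only)
    \begin{aligned}
    \nabla \cdot \mathbf{u} &= g \quad \text{in } \Omega,\\
    \nabla \times \mathbf{u} &= \mathbf{0} \quad \text{in } \Omega,\\
    J(\mathbf{u}) &= \|\nabla \cdot \mathbf{u} - g\|_{0,\Omega}^{2}
                   + \|\nabla \times \mathbf{u}\|_{0,\Omega}^{2}
                   \;\longrightarrow\; \min.
    \end{aligned}
    ```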

  10. Extending a User Interface Prototyping Tool with Automatic MISRA C Code Generation

    Directory of Open Access Journals (Sweden)

    Gioacchino Mauro

    2017-01-01

    Full Text Available We are concerned with systems, particularly safety-critical systems, that involve interaction between users and devices, such as the user interface of medical devices. We therefore developed a MISRA C code generator for formal models expressed in the PVSio-web prototyping toolkit. PVSio-web allows developers to rapidly generate realistic interactive prototypes for verifying usability and safety requirements in human-machine interfaces. The visual appearance of the prototypes is based on a picture of a physical device, and the behaviour of the prototype is defined by an executable formal model. Our approach transforms the PVSio-web prototyping tool into a model-based engineering toolkit that, starting from a formally verified user interface design model, will produce MISRA C code that can be compiled and linked into a final product. An initial validation of our tool is presented for the data entry system of an actual medical device.

  11. Automatic generation of design structure matrices through the evolution of product models

    DEFF Research Database (Denmark)

    Gopsill, James A.; Snider, Chris; McMahon, Chris

    2016-01-01

    ......, and lengthy redesigns. Thus, the management and monitoring of these dependencies remains a crucial activity in engineering projects and is becoming ever more challenging with the increase in the number of components, component interactions, and component dependencies, in both a structural and a functional sense. For these reasons, tools and methods to support the identification and monitoring of component interactions and dependencies continue to be an active area of research. In particular, design structure matrices (DSMs) have been extensively applied to identify and visualize product...... update the DSM structure as a product develops. It follows that the proposition of this paper is to investigate whether an automated and continuously evolving DSM can be generated by monitoring the changes in the digital models that represent the product. This includes models that are generated from......

  12. An information retrieval system using weighted descriptors generated by automatic frequency counting

    International Nuclear Information System (INIS)

    Komatsubara, Yasutoshi

    1979-01-01

    An information retrieval system with improved relevance is described, in which a weighted descriptor file, generated by feedback of the requester's relevance judgements on pretest results, is used. This method does not require modification of the search formulas; it works by simply setting weight thresholds and can reduce the searcher's workload, as the examples show. Index word weighting and retrieval word weighting are compared, and some problems to be encountered when retrieval word weighting is combined with operational systems are pointed out. (author)

  13. A tool for automatic generation of RTL-level VHDL description of RNS FIR filters

    DEFF Research Database (Denmark)

    Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    2004-01-01

    Although digital filters based on the Residue Number System (RNS) show high performance and low power dissipation, RNS filters are not widely used in DSP systems because of the complexity of the algorithms involved. We present a tool to design RNS FIR filters which hides the RNS algorithms...... from the designer, and generates a synthesizable VHDL description of the filter taking into account several design constraints such as delay, area and energy....
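
    As background to the record above, the hypothetical sketch below illustrates the arithmetic that RNS FIR filters exploit: values are held as residues modulo pairwise-coprime moduli, multiply-accumulate is done independently per channel, and the result is recovered with the Chinese Remainder Theorem; the moduli set is arbitrary and the VHDL-generation tool itself is not reproduced.

    ```python
    # Hypothetical sketch of residue-number-system arithmetic for FIR accumulation.
    from math import prod

    MODULI = (7, 11, 13, 15)                    # pairwise coprime; dynamic range = 15015

    def to_rns(x):
        return tuple(x % m for m in MODULI)

    def rns_mac(acc, a, b):
        """Channel-wise multiply-accumulate: acc + a*b in each residue channel."""
        return tuple((r + p * q) % m for r, p, q, m in zip(acc, a, b, MODULI))

    def from_rns(residues):
        """Chinese Remainder Theorem reconstruction."""
        big_m = prod(MODULI)
        x = 0
        for r, m in zip(residues, MODULI):
            mi = big_m // m
            x += r * mi * pow(mi, -1, m)        # modular inverse of mi mod m
        return x % big_m

    # toy FIR tap accumulation: sum of coeff*sample pairs, done entirely in RNS
    coeffs, samples = [3, 5, 2], [10, 20, 30]
    acc = to_rns(0)
    for c, s in zip(coeffs, samples):
        acc = rns_mac(acc, to_rns(c), to_rns(s))
    print(from_rns(acc), sum(c * s for c, s in zip(coeffs, samples)))   # both 190
    ```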

  14. Analysis of Wind Speed Forecasting Error Effects on Automatic Generation Control Performance

    Directory of Open Access Journals (Sweden)

    H. Rajabi Mashhadi

    2014-09-01

    Full Text Available The main goal of this paper is to study statistical indices and evaluate AGC indices in power systems with a large penetration of wind turbine generators (WTGs). The increasing penetration of wind turbine generation calls for further study of its impact on power system frequency control. Frequency control is affected by real-time imbalances between system generation and load. Wind turbine generation also fluctuates more, making the system more unbalanced. The AGC loop then helps to adjust the system frequency and the scheduled tie-line powers. The quality of the AGC loop is measured by several indices; a good index properly reflects the AGC performance as the power system operates. One of the well-known measures in the literature, introduced by NERC, is the Control Performance Standard (CPS). It has previously been claimed that a key factor in the CPS index is related to the standard deviation of the generation error, the installed power and the frequency response. This paper focuses on the impact of several-hours-ahead wind speed forecast errors on this factor. Furthermore, the evaluation of conventional control performance in power systems with large-scale wind turbine penetration is studied. The effects of the wind speed standard deviation and the degree of wind farm penetration are analyzed, and the importance of the mentioned factor is examined. In addition, the influence of the mean wind speed forecast error on this factor is investigated. The study system is a two-area system with a significant wind farm in one area. The results show that the mean wind speed forecast error has a considerable effect on AGC performance, while the mentioned key factor is insensitive to this mean error.
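
    The sketch below evaluates one commonly quoted form of the NERC CPS1 metric mentioned above, with CF = mean[(ACE / (-10 B)) ΔF] / ε1² and CPS1 = (2 - CF) · 100%; the bias, ε1 and the synthetic one-day time series are illustrative assumptions, and the NERC standard remains the authoritative definition.

    ```python
    # Hypothetical sketch of the CPS1 control-performance metric in a commonly
    # quoted form; all numbers are illustrative, not from the paper.
    import numpy as np

    def cps1(ace_mw, df_hz, bias_mw_per_01hz, eps1_hz):
        cf = np.mean((ace_mw / (-10.0 * bias_mw_per_01hz)) * df_hz) / eps1_hz ** 2
        return (2.0 - cf) * 100.0

    rng = np.random.default_rng(0)
    df = rng.normal(0.0, 0.02, size=1440)             # one day of 1-minute frequency deviations (Hz)
    ace = 500.0 * df + rng.normal(0.0, 5.0, 1440)     # synthetic ACE loosely correlated with dF (MW)
    print(f"CPS1 = {cps1(ace, df, bias_mw_per_01hz=-50.0, eps1_hz=0.018):.1f} %")
    ```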

  15. Local adaptive mesh refinement for shock hydrodynamics

    International Nuclear Information System (INIS)

    Berger, M.J.; Colella, P. (Lawrence Livermore Laboratory, Livermore, California 94550)

    1989-01-01

    The aim of this work is the development of an automatic, adaptive mesh refinement strategy for solving hyperbolic conservation laws in two dimensions. There are two main difficulties in doing this. The first problem is due to the presence of discontinuities in the solution and the effect on them of discontinuities in the mesh. The second problem is how to organize the algorithm to minimize memory and CPU overhead. This is an important consideration and will continue to be important as more sophisticated algorithms that use data structures other than arrays are developed for use on vector and parallel computers. © 1989 Academic Press, Inc.

  16. Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People

    Directory of Open Access Journals (Sweden)

    Alan F. Smeaton

    2010-02-01

    Full Text Available In sensor research we take advantage of additional contextual sensor information to disambiguate potentially erroneous sensor readings or to make better informed decisions on a single sensor’s output. This use of additional information reinforces, validates, semantically enriches, and augments sensed data. Lifelog data is challenging to augment, as it tracks one’s life with many images including the places they go, making it non-trivial to find associated sources of information. We investigate realising the goal of pervasive user-generated content based on sensors, by augmenting passive visual lifelogs with “Web 2.0” content collected by millions of other individuals.

  17. Automatic Aircraft Structural Topology Generation for Multidisciplinary Optimization and Weight Estimation

    Science.gov (United States)

    Sensmeier, Mark D.; Samareh, Jamshid A.

    2005-01-01

    An approach is proposed for the rapid generation of moderate-fidelity structural finite element models of air vehicle structures to allow more accurate weight estimation earlier in the vehicle design process. This should help to rapidly assess many structural layouts before the start of the preliminary design phase and to eliminate the weight penalties imposed when actual structure weights exceed those estimated during conceptual design. By defining the structural topology in a fully parametric manner, the structure can be mapped to arbitrary vehicle configurations being considered during conceptual design optimization. A demonstration of this process is shown for two sample aircraft wing designs.

  18. AUTOMATIC GENERATION OF ROAD INFRASTRUCTURE IN 3D FOR VEHICLE SIMULATORS

    Directory of Open Access Journals (Sweden)

    Adam Orlický

    2017-12-01

    Full Text Available One of the modern methods for testing new systems and interfaces in vehicles is testing in a vehicle simulator. Providing quality models of virtual scenes is one of the tasks of driver-car interaction interface simulation. Nowadays, there exist many programs for creating 3D models of road infrastructure, but most of these programs are very expensive or cannot export models for further use. Therefore, a plug-in has been developed at the Faculty of Transportation Sciences in Prague. It can generate road infrastructure according to the Czech standard for designing roads (CSN 73 6101). The uniqueness of this plug-in is that it is the first tool for generating road infrastructure in NURBS representation. This type of representation yields more exact models and allows the transfer to be optimized for creating quality models for vehicle simulators. The scenes created by this plug-in were tested on vehicle simulators. The results have shown that drivers had a much better feeling with the newly created scenes in comparison to the previous ones.

  19. Intra-Hour Dispatch and Automatic Generator Control Demonstration with Solar Forecasting - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Coimbra, Carlos F. M. [Univ. of California, San Diego, CA (United States)]

    2016-02-25

    In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation cost by committing an appropriate amount of energy resources and reserves, as well as to provide operators with a prediction of the generation fleet's behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis of the expected solar variability per region for the SMUD system, day-ahead (DA) and real-time (RT) load forecasts for the entire service area, one year of intra-hour CPR forecasts for cluster centers, one year of smart re-forecasting CPR forecasts in real time to determine irreducible errors, and uncertainty quantification of the integrated solar-load for both distributed and central-station PV generation (at selected locations within the service region).

  20. From sequencer to supercomputer: an automatic pipeline for managing and processing next generation sequencing data.

    Science.gov (United States)

    Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun

    2012-01-01

    Next Generation Sequencing is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.

  1. Radon transform based automatic metal artefacts generation for 3D threat image projection

    Science.gov (United States)

    Megherbi, Najla; Breckon, Toby P.; Flitton, Greg T.; Mouton, Andre

    2013-10-01

    Threat Image Projection (TIP) plays an important role in aviation security. In order to evaluate human security screeners in determining threats, TIP systems project images of realistic threat items into the images of the passenger baggage being scanned. In this proof of concept paper, we propose a 3D TIP method which can be integrated within new 3D Computed Tomography (CT) screening systems. In order to make the threat items appear as if they were genuinely located in the scanned bag, appropriate CT metal artefacts are generated in the resulting TIP images according to the scan orientation, the passenger bag content and the material of the inserted threat items. This process is performed in the projection domain using a novel methodology based on the Radon Transform. The obtained results using challenging 3D CT baggage images are very promising in terms of plausibility and realism.
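
    A simplified, hypothetical 2-D illustration of the projection-domain idea described above: a dense object is added to a slice, the contributions are combined in the sinogram (Radon) domain with a crude saturation clip, and filtered back-projection then produces streak-like artefacts; the actual TIP pipeline works on 3-D CT volumes with a far more careful scanner model, and scikit-image is used here purely for illustration.

    ```python
    # Hypothetical 2-D sketch: insert a dense "metal" object in the projection
    # domain and reconstruct to obtain streak-like artefacts.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    bag = shepp_logan_phantom()                       # stand-in for a baggage slice
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)

    threat = np.zeros_like(bag)
    threat[180:200, 180:200] = 50.0                   # small, very dense insert

    sino_bag = radon(bag, theta=theta)
    sino_threat = radon(threat, theta=theta)

    # combine in projection space and clip, crudely mimicking detector saturation
    sino_tip = np.clip(sino_bag + sino_threat, 0.0, np.percentile(sino_bag, 99.5))

    tip_image = iradon(sino_tip, theta=theta)         # reconstruction now shows streaks
    print(tip_image.shape, float(tip_image.max()))
    ```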

  2. LigParGen web server: an automatic OPLS-AA parameter generator for organic ligands

    Science.gov (United States)

    Dodda, Leela S.

    2017-01-01

    Abstract The accurate calculation of protein/nucleic acid–ligand interactions or condensed phase properties by force field-based methods requires a precise description of the energetics of intermolecular interactions. Despite the progress made in force fields, small molecule parameterization remains an open problem due to the magnitude of the chemical space; the most critical issue is the estimation of a balanced set of atomic charges with the ability to reproduce experimental properties. The LigParGen web server provides an intuitive interface for generating OPLS-AA/1.14*CM1A(-LBCC) force field parameters for organic ligands, in the formats of commonly used molecular dynamics and Monte Carlo simulation packages. This server has high value for researchers interested in studying any phenomena based on intermolecular interactions with ligands via molecular mechanics simulations. It is free and open to all at jorgensenresearch.com/ligpargen, and has no login requirements. PMID:28444340

  3. Automatic generation of active coordinates for quantum dynamics calculations: Application to the dynamics of benzene photochemistry

    International Nuclear Information System (INIS)

    Lasorne, Benjamin; Sicilia, Fabrizio; Bearpark, Michael J.; Robb, Michael A.; Worth, Graham A.; Blancafort, Lluis

    2008-01-01

    A new practical method to generate a subspace of active coordinates for quantum dynamics calculations is presented. These reduced coordinates are obtained as the normal modes of an analytical quadratic representation of the energy difference between excited and ground states within the complete active space self-consistent field method. At the Franck-Condon point, the largest negative eigenvalues of this Hessian correspond to the photoactive modes: those that reduce the energy difference and lead to the conical intersection; eigenvalues close to 0 correspond to bath modes, while modes with large positive eigenvalues are photoinactive vibrations, which increase the energy difference. The efficacy of quantum dynamics run in the subspace of the photoactive modes is illustrated with the photochemistry of benzene, where theoretical simulations are designed to assist optimal control experiments.
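
    A small, hypothetical sketch of the mode-classification step described above: given the Hessian of the excited-state/ground-state energy difference (a random symmetric matrix stands in for it here), its eigenvectors are the candidate coordinates and the sign of each eigenvalue labels the mode as photoactive, bath, or photoinactive; the quantum-chemistry machinery for obtaining the Hessian is not reproduced.

    ```python
    # Hypothetical sketch: classify modes by the eigenvalues of a (stand-in) Hessian
    # of the excited-ground energy difference.
    import numpy as np

    def classify_modes(hessian, tol=1e-2):
        evals, evecs = np.linalg.eigh(hessian)          # ascending eigenvalues
        labels = np.where(evals < -tol, "photoactive",
                 np.where(evals > tol, "photoinactive", "bath"))
        return evals, evecs, labels

    rng = np.random.default_rng(0)
    a = rng.normal(size=(9, 9))
    hessian = 0.5 * (a + a.T)                           # symmetric stand-in Hessian
    evals, evecs, labels = classify_modes(hessian)
    for lam, lab in zip(evals, labels):
        print(f"{lam:+.3f}  {lab}")
    ```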

  4. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    Science.gov (United States)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the

  5. Automatic string generation for estimating in vivo length changes of the medial patellofemoral ligament during knee flexion.

    Science.gov (United States)

    Graf, Matthias; Diether, Salomon; Vlachopoulos, Lazaros; Fucentese, Sandro; Fürnstahl, Philipp

    2014-06-01

    Modeling ligaments as three-dimensional strings is a popular method for in vivo estimation of ligament length. The purpose of this study was to develop an algorithm for automated generation of non-penetrating strings between insertion points and to evaluate its feasibility for estimating length changes of the medial patellofemoral ligament during normal knee flexion. Three-dimensional knee models were generated from computed tomography (CT) scans of 10 healthy subjects. The knee joint under weight-bearing was acquired in four flexion positions (0°-120°). The path between insertion points was computed in each position to quantify string length and isometry. The average string length was maximal in 0° of flexion (64.5 ± 3.9 mm between femoral and proximal patellar point; 62.8 ± 4.0 mm between femoral and distal patellar point). It was minimal in 30° (60.0 ± 2.6 mm) for the proximal patellar string and in 120° (58.7 ± 4.3 mm) for the distal patellar string. The insertion points were considered to be isometric in 4 of the 10 subjects. The proposed algorithm appears to be feasible for estimating string lengths between insertion points in an automatic fashion. The length measurements based on CT images acquired under physiological loading conditions may give further insights into knee kinematics.

  6. Wind power integration into the automatic generation control of power systems with large-scale wind power

    Directory of Open Access Journals (Sweden)

    Abdul Basit

    2014-10-01

    Full Text Available Transmission system operators have an increased interest in the active participation of wind power plants (WPPs) in the power balance control of power systems with large wind power penetration. The emphasis in this study is on the integration of WPPs into the automatic generation control (AGC) of the power system. The present paper proposes a coordinated control strategy for the AGC between combined heat and power plants (CHPs) and WPPs to enhance the security and the reliability of power system operation in the case of large wind power penetration. The proposed strategy, described and exemplified for the future Danish power system, takes the hour-ahead regulating power plan for generation and power exchange with neighbouring power systems into account. The performance of the proposed strategy for coordinated secondary control is assessed and discussed by means of simulations for different possible future scenarios, when wind power production in the power system is high and conventional production from CHPs is at a minimum level. The investigation results of the proposed control strategy have shown that the WPPs can actively help the AGC, and reduce the real-time power imbalance in the power system, by down-regulating their production when CHPs are unable to provide the required response.
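
    For orientation, the secondary (AGC) loop that the WPPs are integrated into is conventionally driven by the area control error (ACE). The expression below is the standard textbook form, given here only as background; the exact control law used for the Danish system in the paper may differ.

```latex
\mathrm{ACE}_i \;=\; \Delta P_{\mathrm{tie},i} \;+\; B_i \,\Delta f_i
```

    Here ΔP_tie,i is the deviation of area i's net tie-line exchange from the scheduled (e.g. hour-ahead) plan, Δf_i is the frequency deviation and B_i is the frequency bias factor; the AGC regulator acts on this signal and distributes the correction among the participating CHPs and WPPs.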

  7. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    Science.gov (United States)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  8. Automatic generation control of multi-area power systems with diverse energy sources using Teaching Learning Based Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Rabindra Kumar Sahu

    2016-03-01

    Full Text Available This paper presents the design and analysis of a Proportional-Integral-Double Derivative (PIDD) controller for Automatic Generation Control (AGC) of multi-area power systems with diverse energy sources using the Teaching Learning Based Optimization (TLBO) algorithm. At first, a two-area reheat thermal power system with an appropriate Generation Rate Constraint (GRC) is considered. The design problem is formulated as an optimization problem and TLBO is employed to optimize the parameters of the PIDD controller. The superiority of the proposed TLBO-based PIDD controller has been demonstrated by comparing the results with recently published optimization techniques such as hybrid Firefly Algorithm and Pattern Search (hFA-PS), Firefly Algorithm (FA), Bacteria Foraging Optimization Algorithm (BFOA), Genetic Algorithm (GA) and conventional Ziegler Nichols (ZN) for the same interconnected power system. Also, the proposed approach has been extended to a two-area power system with diverse sources of generation such as thermal, hydro, wind and diesel units. The system model includes boiler dynamics, GRC and Governor Dead Band (GDB) non-linearity. It is observed from simulation results that the proposed approach provides better dynamic responses than those recently published in the literature. Further, the study is extended to a three unequal-area thermal power system with different controllers in each area, and the results are compared with a published FA-optimized PID controller for the same system. Finally, sensitivity analysis is performed by varying the system parameters and operating load conditions in the range of ±25% from their nominal values to test the robustness.
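
    For readers unfamiliar with the controller named above, a generic proportional-integral-double-derivative law has the transfer function sketched below. This is only the textbook form implied by the name; the exact (possibly filtered or cascaded) structure whose gains are tuned by TLBO in the paper may differ.

```latex
G_c(s) \;=\; K_p \;+\; \frac{K_i}{s} \;+\; K_{d1}\,s \;+\; K_{d2}\,s^{2}
```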

  9. Hernia Surgical Mesh Implants

    Science.gov (United States)

    ... knitted mesh or non-knitted sheet forms. The synthetic materials used can be absorbable, non-absorbable or a combination of absorbable and non-absorbable materials. Animal-derived meshes are made of animal tissue, such as intestine or skin, that has been processed and disinfected to be ...

  10. Urogynecologic Surgical Mesh Implants

    Science.gov (United States)

    ... knitted mesh or non-knitted sheet forms. The synthetic materials used can be either absorbable, non-absorbable, or a combination of absorbable and non-absorbable materials. Animal-derived meshes are made of animal tissue, such as intestine or skin, that has been processed and disinfected to be ...

  11. Hand-eye coordination of a robot for the automatic inspection of steam-generator tubes in nuclear power plants

    International Nuclear Information System (INIS)

    Choi, D.H.; Song, Y.C.; Kim, J.H.; Kim, J.G.

    2004-01-01

    The inspection of steam-generator tubes in nuclear power plants requires collecting test signals in a highly radiated region that is not accessible to humans. In general, a robot equipped with a camera and a test probe is used to handle such a dangerous environment. The robot moves the probe to right below the tube to be inspected and then the probe is inserted into the tube. The inspection signals are acquired while the probe is pulled back. Currently, an operator in a control room controls the entire process remotely. To make a fully automatic inspection system, first of all, a control mechanism is needed to position the probe at the proper location. This is the so-called hand-eye coordination problem. In this paper, a hand-eye coordination method for a robot is presented. The proposed method consists of two consecutive control modes: rough positioning and fine-tuning. The rough positioning controller tries to position the probe near the target place using kinematics information and the known environment, and then the fine-tuning controller adjusts the probe to the target using the image acquired by the camera attached to the robot. The usefulness of the proposed method has been tested and verified through experiments. (orig.)

  12. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    Science.gov (United States)

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.

  13. Automatic CT-based finite element model generation for temperature-based death time estimation: feasibility study and sensitivity analysis.

    Science.gov (United States)

    Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Erdmann, Bodo; Weiser, Martin; Zachow, Stefan; Heinrich, Andreas; Güttler, Felix Victor; Teichgräber, Ulf; Mall, Gita

    2017-05-01

    Temperature-based death time estimation is based either on simple phenomenological models of corpse cooling or on detailed physical heat transfer models. The latter are much more complex but allow a higher accuracy of death time estimation, as in principle all relevant cooling mechanisms can be taken into account. Here, a complete workflow for finite element-based cooling simulation is presented. The following steps are demonstrated on a CT phantom: (1) computed tomography (CT) scan; (2) segmentation of the CT images for thermodynamically relevant features of individual geometries and compilation in a geometric computer-aided design (CAD) model; (3) conversion of the segmentation result into a finite element (FE) simulation model; (4) computation of the model cooling curve (MOD); (5) calculation of the cooling time (CTE). For the first time in FE-based cooling time estimation, the steps from the CT image over segmentation to FE model generation are performed semi-automatically. The cooling time calculation results are compared to cooling measurements performed on the phantoms under controlled conditions. In this context, the method is validated using a CT phantom. Some of the phantoms' thermodynamic material parameters had to be determined via independent experiments. Moreover, the impact of geometry and material parameter uncertainties on the estimated cooling time is investigated by a sensitivity analysis.
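
    As background, the detailed physical heat transfer models referred to above discretize a transient heat conduction problem. A standard form of the governing equation is given below; the specific tissue parameters and boundary conditions (e.g. convection and radiation at the body surface) used in the paper are not stated in this abstract.

```latex
\rho c \,\frac{\partial T}{\partial t} \;=\; \nabla \cdot \left( k \,\nabla T \right)
```

    with ρ the density, c the specific heat capacity and k the thermal conductivity of each segmented material.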

  14. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR, type reactor

    International Nuclear Information System (INIS)

    Lefevre, F.; Baumaire, A.; Comby, R.; Benas, J.C.

    1990-01-01

    The automation of steam generator tube monitoring required developments in the field of data processing. The monitoring is performed by means of eddy current tests. Improvements in signal processing and in pattern recognition associated with artificial intelligence techniques led EDF (the French electricity company) to develop an automatic signal processing system. The system, named EXTRACSION (French acronym for Expert System for the Processing and Classification of Signals of Nuclear Nature), ensures coherence between the different fields of knowledge (metallurgy, measurement, signals) during data processing by applying an object-oriented representation [fr

  15. Boundary denoising for open surface meshes

    Science.gov (United States)

    Lee, Wei Zhe; Lim, Wee Keong; Soo, Wooi King

    2013-04-01

    Recently, applications of open surfaces in 3D have emerged as an interesting research topic due to the popularity of range cameras such as the Microsoft Kinect. However, surface meshes representing such open surfaces are often corrupted with noise, especially at the boundary. Such deformities need to be treated to facilitate further applications such as texture mapping and zippering of multiple open surface meshes. Conventional methods perform denoising by removing components with high frequencies, thus smoothing the boundaries. However, this may result in loss of information, as not all high-frequency transitions at the boundaries correspond to noise. To overcome this shortcoming, we propose a combination of local information and geometric features to single out the noisy or unusual vertices at the mesh boundaries. The local shape of the selected boundary regions, characterized by the mean curvature value, is compared with that of the neighbouring interior region. The neighbouring interior region is chosen such that it is the closest to the corresponding boundary region, while the curvature evaluation is independent of the boundary. Smoothing is then performed via Laplacian smoothing with our modified weights to reduce boundary shrinkage. The algorithm is evaluated on noisy meshes generated from clean model meshes under controlled conditions. The Hausdorff distance is used as the measurement between the meshes. We show that our method produces better results than conventional smoothing of the whole boundary loop.
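
    The final smoothing step is a weighted Laplacian update restricted to the flagged boundary vertices. The sketch below shows plain weighted Laplacian smoothing under that assumption; the function and argument names are hypothetical, uniform weights stand in for the authors' shrinkage-reducing weights, and the curvature-based vertex selection is not reproduced here.

```python
import numpy as np

def smooth_boundary(vertices, noisy_idx, neighbors, weights, n_iter=10):
    """Weighted Laplacian smoothing applied only to flagged boundary vertices.

    vertices  : (N, 3) array of vertex positions
    noisy_idx : indices of boundary vertices selected as noisy
    neighbors : dict mapping a vertex index to its 1-ring neighbor indices
    weights   : dict mapping a vertex index to one weight per neighbor
    """
    v = vertices.copy()
    for _ in range(n_iter):
        new_v = v.copy()
        for i in noisy_idx:
            nbrs = neighbors[i]
            w = np.asarray(weights[i], dtype=float)
            w /= w.sum()
            centroid = (w[:, None] * v[nbrs]).sum(axis=0)
            new_v[i] = 0.5 * v[i] + 0.5 * centroid  # damped update
        v = new_v
    return v
```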

  16. 22nd International Meshing Roundtable

    CERN Document Server

    Staten, Matthew

    2014-01-01

    This volume contains the articles presented at the 22nd International Meshing Roundtable (IMR), organized in part by Sandia National Laboratories and held on October 13-16, 2013 in Orlando, Florida, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts in a variety of disciplines, from all over the world, to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics and visualization.

  17. 21st International Meshing Roundtable

    CERN Document Server

    Weill, Jean-Christophe

    2013-01-01

    This volume contains the articles presented at the 21st International Meshing Roundtable (IMR), organized in part by Sandia National Laboratories and held on October 7-10, 2012 in San Jose, CA, USA. The first IMR was held in 1992, and the conference series has been held annually since. Each year the IMR brings together researchers, developers, and application experts in a variety of disciplines, from all over the world, to present and discuss ideas on mesh generation and related topics. The technical papers in this volume present theoretical and novel ideas and algorithms with practical potential, as well as technical applications in science and engineering, geometric modeling, computer graphics, and visualization.

  18. Wireless mesh networks

    CERN Document Server

    Held, Gilbert

    2005-01-01

    Wireless mesh networking is a new technology that has the potential to revolutionize how we access the Internet and communicate with co-workers and friends. Wireless Mesh Networks examines the concept and explores its advantages over existing technologies. This book explores existing and future applications, and examines how some of the networking protocols operate.The text offers a detailed analysis of the significant problems affecting wireless mesh networking, including network scale issues, security, and radio frequency interference, and suggests actual and potential solutions for each pro

  19. Mesh construction for the 2-dimensional computational fracture mechanics using the I-DEAS

    International Nuclear Information System (INIS)

    Kim, Jong Wook; Kim, Tae Wan; Park, Keun Bae

    2000-09-01

    Recently, research activities have been reported regarding the generation of input data for crack problems with a minimum of effort, utilizing the general characteristics of the finite element modeling technique. Several automatic FE mesh generation methods for cracked structures of particular geometries and boundary conditions have been proposed, using commercial codes or developing in-house programs. In general, development of software to deal with a special crack problem can maximize the efficiency and accuracy for a specific environment. However, the applicable range of such a scheme is usually very restricted and a new program must be developed for each case. On the other hand, commercial codes can be used for the automatic mesh generation of a variety of geometries, but with an additional effort to accommodate the singular elements for the cracked-body analysis. In the present study, a procedure for the generation of input data for optimized computational fracture mechanics is developed as part of the effort to establish the structural integrity evaluation procedure for the SMART reactor vessel assembly. Input data for the finite element analysis are prepared using the commercial code I-DEAS. The midpoint nodes near the crack front are shifted to the quarter-points. The complete finite element model generated is passed to another commercial finite element code, ABAQUS, for the stress analysis. The stress intensity factors are calculated using the J-integral method. To demonstrate the validity of the present procedure, a double-edge crack in a plate subjected to uniform tension is solved, and the effects of mesh construction are discussed in detail. The structural integrity evaluation procedure through 2-D crack modeling is then established.
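
    As context for the last step above, converting the computed J-integral into a mode-I stress intensity factor for a linear elastic problem uses the standard relation below (textbook material, not a detail specific to this report):

```latex
K_I \;=\; \sqrt{J\,E'}, \qquad
E' =
\begin{cases}
E, & \text{plane stress},\\[4pt]
\dfrac{E}{1-\nu^{2}}, & \text{plane strain}.
\end{cases}
```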

  20. Automatic Test Data Generation Using Data Flow Information = Veri Akışı Bilgisi Kullanılarak Otomatik Test Verisi Üretimi

    Directory of Open Access Journals (Sweden)

    Rana ABDELAZIZ

    2000-06-01

    Full Text Available This paper presents a tool for automatically generating test data for Pascal programs that satisfy the data flow criteria. Unlike existing tools, our tool is not limited to Pascal programs whose program flow graph contains read statements in only one node, but rather deals with read statements appearing in any node of the program flow graph. Moreover, our tool handles loops and arrays; these two features are traditionally difficult to handle in test data generation systems. This allows us to generate tests for larger programs than those previously reported in the literature.

  1. A neural network model for the automatic detection and forecast of convective cells based on meteosat second generation data

    Science.gov (United States)

    Puca, S.; de Leonibus, L.; Zauli, F.; Rosci, P.; Musmanno, L.

    Mesoscale Convective Systems (MCSs) are often correlated with heavy rainfall, thunderstorms and hail showers, frequently causing significant damage. The most intensive weather activity occurs during the maturing stage of the development, which, in the case of a multi-cell storm, is found in the centre of the convective complex system. These convective systems may occur in several different unstable air masses: in a cold air mass behind a polar cold front, in the frontal zone of a polar front, and in warm air ahead of a polar warm front. To understand the meteorological situation and apply the best conceptual model, knowledge of the convective cluster alone is often not enough; in many cases forecasters need to know the distribution of the convective cells within the cloudy cluster. A model for the automatic detection and forecast of convective cells, running in operational mode at the Italian Air Force Meteorological Service (UGM/CNMCA), is proposed here. The application relies on the Meteosat Second Generation infrared (IR) window channels (10.8 μm, 7.3 μm) and the two water vapour (WV) channels (6.2 μm and 7.3 μm), giving as output the detection of the convective cells and their evolution for the next 15 and 30 minutes. The output of the product is the last IR (10.8 μm) image, on which the detected cells, their development and their tracking are represented. This multispectral method, based on a variable-threshold method during the detection phase and a neural network algorithm during the forecast phase, allowed us to define a model able to detect the convective cells present in a convective cluster, plot their distribution, and forecast their evolution for the next 15 and 30 minutes with good efficiency. To analyse the performance of the model with Meteosat Second Generation data, different error functions have been evaluated for various meteorological cloud contexts (i.e. high-layer and cirrus clouds). Some methods for

  2. Polygon mesh processing

    CERN Document Server

    Botsch, Mario; Pauly, Mark; Alliez, Pierre; Levy, Bruno

    2010-01-01

    Geometry processing, or mesh processing, is a fast-growing area of research that uses concepts from applied mathematics, computer science, and engineering to design efficient algorithms for the acquisition, reconstruction, analysis, manipulation, simulation, and transmission of complex 3D models. Applications of geometry processing algorithms already cover a wide range of areas from multimedia, entertainment, and classical computer-aided design, to biomedical computing, reverse engineering, and scientific computing. Over the last several years, triangle meshes have become increasingly popular,

  3. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  4. Modeling and simulation of the generation automatic control of electric power systems; Modelado y simulacion del control automatico de generacion de sistemas electricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Caballero Ortiz, Ezequiel

    2002-12-01

    This work is devoted to the analysis of the automatic generation control of electric power systems, based on the information generated by the load-frequency control loop and the automatic voltage regulator loop. The analysis applies concepts from classical control theory and feedback control systems, as well as concepts from modern control theory. The studies are carried out on a digital computer using the MATLAB program and the simulation techniques available in the SIMULINK tool. In this thesis the theoretical and physical concepts of automatic generation control are established, dividing it into the load-frequency control and automatic voltage regulator loops. The mathematical models of the two control loops are established. The models of the elements are then interconnected in order to integrate the load-frequency control loop, and the digital simulation of the system is carried out. First, the behaviour of the primary control in single-area single-machine, single-area multi-machine and multi-area multi-machine power systems is analyzed. Then, the automatic generation control of single-area and multi-area power systems is studied. The economic dispatch concept is established and, with this plan, the multi-area power system is simulated; thereafter the energy exchange among areas in steady state is studied. The mathematical models of the component elements of the automatic voltage regulator loop are interconnected. Data according to the nature of each component are generated and their behavior is simulated to analyze the system response. The two control loops are then interconnected and a simulation is carried out with the previously generated data, examining the performance of the automatic generation control and the interaction between the two control loops. Finally, the pole placement and optimal control techniques of modern control theory are applied to the automatic generation control of a single area

  5. Evaluation of plan quality assurance models for prostate cancer patients based on fully automatically generated Pareto-optimal treatment plans.

    Science.gov (United States)

    Wang, Yibing; Breedveld, Sebastiaan; Heijmen, Ben; Petit, Steven F

    2016-06-07

    IMRT planning with commercial Treatment Planning Systems (TPSs) is a trial-and-error process. Consequently, the quality of treatment plans may not be consistent among patients, planners and institutions. Recently, different plan quality assurance (QA) models have been proposed that could flag and guide improvement of suboptimal treatment plans. However, the performance of these models was validated using plans that were created using the conventional trial-and-error treatment planning process. Consequently, it is challenging to assess and compare quantitatively the accuracy of different treatment planning QA models. Therefore, we created a golden standard dataset of consistently planned Pareto-optimal IMRT plans for 115 prostate patients. Next, the dataset was used to assess the performance of a treatment planning QA model that uses the overlap volume histogram (OVH). 115 prostate IMRT plans were fully automatically planned using our in-house developed TPS Erasmus-iCycle. An existing OVH model was trained on the plans of 58 of the patients. Next it was applied to predict DVHs of the rectum, bladder and anus of the remaining 57 patients. The predictions were compared with the achieved values of the golden standard plans for the rectum D mean, V 65, and V 75, and D mean of the anus and the bladder. For the rectum, the prediction errors (predicted-achieved) were only -0.2 ± 0.9 Gy (mean ± 1 SD) for D mean, -1.0 ± 1.6% for V 65, and -0.4 ± 1.1% for V 75. For D mean of the anus and the bladder, the prediction errors were 0.1 ± 1.6 Gy and 4.8 ± 4.1 Gy, respectively. Increasing the training cohort to 114 patients only led to minor improvements. A dataset of consistently planned Pareto-optimal prostate IMRT plans was generated. This dataset can be used to train new, and validate and compare existing, treatment planning QA models, and has been made publicly available. The OVH model was highly accurate

  6. Parametric Quadrilateral Meshes for the Design and Optimization of Superconducting Magnets

    CERN Document Server

    Aleksa, Martin; Völlinger, Christine

    2000-01-01

    The program package ROXIE [1] has been developed at CERN for the design and optimization of the superconducting magnets for the LHC. The necessity of extremely uniform (coil dominated) fields in accelerator magnets requires very accurate methods of field computation. For this purpose a coupled boundary-element/finite-element technique (BEM-FEM) is used [2]. Quadrilateral higher order finite elements are used for the discretization of the iron domain. This is necessary for the accurate modeling of the iron contours and is favorable for 3D meshes. A new quadrilateral mesh generator using geometrically optimized domain decomposition, which was developed at the University of Stuttgart, Germany [3], has been implemented into the ROXIE program, providing fully automatic and user-friendly mesh generation. The frequent application of mathematical optimization techniques requires parametric models which are set up using a feature-based approach. The structure of the magnet cross-section can be modeled using parametric object...

  7. A mass-redistributed finite element method (MR-FEM) for acoustic problems using triangular mesh

    Science.gov (United States)

    He, Z. C.; Li, Eric; Liu, G. R.; Li, G. Y.; Cheng, A. G.

    2016-10-01

    The accuracy of numerical results obtained with the standard finite element method (FEM) in acoustic problems deteriorates with increasing frequency due to the 'dispersion error'. This dispersion error depends on the balance between the 'stiffness' and 'mass' of the discretized equation system. This paper reports an improved finite element method for solving acoustic problems that re-distributes the mass in the mass matrix to 'tune' the balance, aiming to minimize the dispersion error. This is done by shifting the integration point locations when computing the entries of the mass matrix, while ensuring mass conservation. The new method is verified through a detailed numerical error analysis, and a strategy is also proposed for the best mass redistribution in terms of minimizing the dispersion error. The relative dispersion error of the present mass-redistributed finite element method (MR-FEM) is found to be much smaller than that of the FEM solution, in both theoretical prediction and numerical examination. The present MR-FEM works well with linear triangular elements that can be generated automatically, which enables automation in computation and saves computational cost in mesh generation. Numerical examples demonstrate the advantages of MR-FEM in comparison with the standard FEM using the same triangular and quadrilateral meshes.
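
    The stiffness/mass balance can be made concrete with the mass matrix of a single 3-node triangle. The sketch below blends the consistent and lumped mass matrices, a common, mass-conserving way of redistributing mass; note this is only an illustration of the idea, not the integration-point shifting actually used by MR-FEM.

```python
import numpy as np

def triangle_mass_matrix(area, rho, alpha=0.5):
    """Mass matrix of a linear triangle with redistributed mass.

    alpha = 1 gives the consistent matrix, alpha = 0 the fully lumped one;
    any blend keeps the total element mass rho * area unchanged.
    """
    m_consistent = rho * area / 12.0 * np.array([[2.0, 1.0, 1.0],
                                                 [1.0, 2.0, 1.0],
                                                 [1.0, 1.0, 2.0]])
    m_lumped = rho * area / 3.0 * np.eye(3)
    return alpha * m_consistent + (1.0 - alpha) * m_lumped
```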

  8. An algorithm for generating data accessibility recommendations for flight deck Automatic Dependent Surveillance-Broadcast (ADS-B) applications

    Science.gov (United States)

    2014-09-09

    Automatic Dependent Surveillance-Broadcast (ADS-B) In technology supports the display of traffic data on Cockpit Displays of Traffic Information (CDTIs). The data are used by flightcrews to perform defined self-separation procedures, such as the in-t...

  9. Automatic Substitute Computed Tomography Generation and Contouring for Magnetic Resonance Imaging (MRI)-Alone External Beam Radiation Therapy From Standard MRI Sequences

    International Nuclear Information System (INIS)

    Dowling, Jason A.; Sun, Jidi; Pichler, Peter; Rivest-Hénault, David; Ghose, Soumya; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Chandra, Shekhar S.; Fripp, Jurgen; Menk, Frederick W.; Greer, Peter B.

    2015-01-01

    Purpose: To validate automatic substitute computed tomography CT (sCT) scans generated from standard T2-weighted (T2w) magnetic resonance (MR) pelvic scans for MR-Sim prostate treatment planning. Patients and Methods: A Siemens Skyra 3T MR imaging (MRI) scanner with laser bridge, flat couch, and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole-pelvis MRI scan (1.6 mm 3-dimensional isotropic T2w SPACE [Sampling Perfection with Application optimized Contrasts using different flip angle Evolution] sequence) was acquired. Three additional small field of view scans were acquired: T2w, T2*w, and T1w flip angle 80° for gold fiducials. Patients received a routine planning CT scan. Manual contouring of the prostate, rectum, bladder, and bones was performed independently on the CT and MR scans. Three experienced observers contoured each organ on MRI, allowing interobserver quantification. To generate a training database, each patient CT scan was coregistered to their whole-pelvis T2w using symmetric rigid registration and structure-guided deformable registration. A new multi-atlas local weighted voting method was used to generate automatic contours and sCT results. Results: The mean error in Hounsfield units between the sCT and corresponding patient CT (within the body contour) was 0.6 ± 14.7 (mean ± 1 SD), with a mean absolute error of 40.5 ± 8.2 Hounsfield units. Automatic contouring results were very close to the expert interobserver level (Dice similarity coefficient): prostate 0.80 ± 0.08, bladder 0.86 ± 0.12, rectum 0.84 ± 0.06, bones 0.91 ± 0.03, and body 1.00 ± 0.003. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same dose prescription was found to be 0.3% ± 0.8%. The 3-dimensional γ pass rate was 1.00 ± 0.00 (2 mm/2%). Conclusions: The MR-Sim setup and automatic s

  10. Automatic Substitute Computed Tomography Generation and Contouring for Magnetic Resonance Imaging (MRI)-Alone External Beam Radiation Therapy From Standard MRI Sequences

    Energy Technology Data Exchange (ETDEWEB)

    Dowling, Jason A., E-mail: jason.dowling@csiro.au [CSIRO Australian e-Health Research Centre, Herston, Queensland (Australia); University of Newcastle, Callaghan, New South Wales (Australia); Sun, Jidi [University of Newcastle, Callaghan, New South Wales (Australia); Pichler, Peter [Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Rivest-Hénault, David; Ghose, Soumya [CSIRO Australian e-Health Research Centre, Herston, Queensland (Australia); Richardson, Haylea [Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Wratten, Chris; Martin, Jarad [University of Newcastle, Callaghan, New South Wales (Australia); Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Arm, Jameen [Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Best, Leah [Department of Radiology, Hunter New England Health, New Lambton, New South Wales (Australia); Chandra, Shekhar S. [School of Information Technology and Electrical Engineering, University of Queensland, Brisbane, Queensland (Australia); Fripp, Jurgen [CSIRO Australian e-Health Research Centre, Herston, Queensland (Australia); Menk, Frederick W. [University of Newcastle, Callaghan, New South Wales (Australia); Greer, Peter B. [University of Newcastle, Callaghan, New South Wales (Australia); Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia)

    2015-12-01

    Purpose: To validate automatic substitute computed tomography CT (sCT) scans generated from standard T2-weighted (T2w) magnetic resonance (MR) pelvic scans for MR-Sim prostate treatment planning. Patients and Methods: A Siemens Skyra 3T MR imaging (MRI) scanner with laser bridge, flat couch, and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole-pelvis MRI scan (1.6 mm 3-dimensional isotropic T2w SPACE [Sampling Perfection with Application optimized Contrasts using different flip angle Evolution] sequence) was acquired. Three additional small field of view scans were acquired: T2w, T2*w, and T1w flip angle 80° for gold fiducials. Patients received a routine planning CT scan. Manual contouring of the prostate, rectum, bladder, and bones was performed independently on the CT and MR scans. Three experienced observers contoured each organ on MRI, allowing interobserver quantification. To generate a training database, each patient CT scan was coregistered to their whole-pelvis T2w using symmetric rigid registration and structure-guided deformable registration. A new multi-atlas local weighted voting method was used to generate automatic contours and sCT results. Results: The mean error in Hounsfield units between the sCT and corresponding patient CT (within the body contour) was 0.6 ± 14.7 (mean ± 1 SD), with a mean absolute error of 40.5 ± 8.2 Hounsfield units. Automatic contouring results were very close to the expert interobserver level (Dice similarity coefficient): prostate 0.80 ± 0.08, bladder 0.86 ± 0.12, rectum 0.84 ± 0.06, bones 0.91 ± 0.03, and body 1.00 ± 0.003. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same dose prescription was found to be 0.3% ± 0.8%. The 3-dimensional γ pass rate was 1.00 ± 0.00 (2 mm/2%). Conclusions: The MR-Sim setup and automatic s

  11. Extraction: a system for automatic eddy current diagnosis of steam generator tubes in nuclear power plants; Extracsion: un systeme de controle automatique par courants de Foucault des tubes de generateurs de vapeur de centrales nucleaires

    Energy Technology Data Exchange (ETDEWEB)

    Georgel, B.; Zorgati, R.

    1994-12-31

    Improving the speed and quality of eddy current non-destructive testing of steam generator tubes calls for automating all processes that contribute to the diagnosis. This paper describes how we use signal processing, pattern recognition and artificial intelligence to build a software package that is able to automatically provide an efficient diagnosis. (authors). 2 figs., 5 refs.

  12. A robust moving mesh finite volume method applied to 1D hyperbolic conservation laws from magnetohydrodynamics

    NARCIS (Netherlands)

    Dam, A. van; Zegeling, P.A.

    2006-01-01

    In this paper we describe a one-dimensional adaptive moving mesh method and its application to hyperbolic conservation laws from magnetohydrodynamics (MHD). The method is robust, because it employs automatic control of mesh adaptation when a new model is considered, without manually-set

  13. A semi-automatic computer-aided method for surgical template design.

    Science.gov (United States)

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-02-04

    This paper presents a generalized, integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection and ruled surface generation, and a dedicated software package named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. First, mesh segmentation based on a signed scalar value per vertex is used to partition the inner surface from the input surface mesh according to the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface, and segmented to generate the outer surface. Ruled surfaces are employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. The framework has been applied to template design for various kinds of surgery, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method.
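
    The offset-surface step described above (contouring the distance field of the inner surface) can be prototyped with standard tools. The sketch below assumes a voxelized inner region and uses a Euclidean distance transform plus marching cubes; it is a schematic re-creation for illustration, not the TemDesigner implementation, and the function and argument names are hypothetical.

```python
import numpy as np
from scipy import ndimage
from skimage import measure

def offset_surface(occupancy, offset_mm, voxel_mm):
    """Extract an offset surface from a voxelized inner region.

    occupancy : 3D boolean array, True inside the segmented inner region
    offset_mm : desired offset distance of the outer surface
    voxel_mm  : isotropic voxel spacing in millimetres
    """
    # Distance from every outside voxel to the inner region, in millimetres.
    dist = ndimage.distance_transform_edt(~occupancy, sampling=voxel_mm)
    # Iso-surface of the distance field at the requested offset.
    verts, faces, normals, values = measure.marching_cubes(dist, level=offset_mm)
    return verts * voxel_mm, faces
```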

  14. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  15. MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, C. M.; Boyle, K. L.; Reagan, M.; Johnson, J.; Rycroft, C.; Moridis, G. J.

    2013-09-30

    Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and these have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
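
    To make the Voronoi-cell idea concrete, the sketch below builds cells from seed points with SciPy. It is illustrative only: MeshVoro itself is built on the Voro library, clips cells to the model domain and writes TOUGH-format grid and connection data, none of which is attempted here.

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_cells(seed_points):
    """Return, for each seed point, the vertex coordinates of its Voronoi
    region, or None for unbounded regions (illustrative sketch only)."""
    vor = Voronoi(np.asarray(seed_points, dtype=float))
    cells = []
    for region_index in vor.point_region:
        region = vor.regions[region_index]
        if len(region) == 0 or -1 in region:  # empty or unbounded region
            cells.append(None)
        else:
            cells.append(vor.vertices[region])
    return cells
```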

  16. Effective Generation and Update of a Building Map Database Through Automatic Building Change Detection from LiDAR Point Cloud Data

    Directory of Open Access Journals (Sweden)

    Mohammad Awrangjeb

    2015-10-01

    Full Text Available Periodic building change detection is important for many applications, including disaster management. Building map databases need to be updated based on detected changes so as to ensure their currency and usefulness. This paper first presents a graphical user interface (GUI) developed to support the creation of a building database from building footprints automatically extracted from LiDAR (light detection and ranging) point cloud data. An automatic building change detection technique, by which buildings are automatically extracted from newly available LiDAR point cloud data and compared to those within an existing building database, is then presented. Buildings identified as totally new or demolished are directly added to the change detection output. However, for part-building demolition or extension, a connected component analysis algorithm is applied, and for each connected building component, the area, width and height are estimated in order to ascertain whether it can be considered a demolished or new building part. Using the developed GUI, a user can quickly examine each suggested change and indicate his/her decision to update the database, with a minimum number of mouse clicks. In experimental tests, the proposed change detection technique was found to produce almost no omission errors, and when compared to the number of reference building corners, it reduced the human interaction to 14% for initial building map generation and to 3% for map updating. Thus, the proposed approach can be exploited for enhanced automated building information updating within a topographic database.
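
    The part-building decision described above rests on a connected component analysis with area, width and height checks. The sketch below shows that filtering step on a rasterized change mask; the threshold names and values are placeholders, and the paper's own decision rules are more involved.

```python
import numpy as np
from scipy import ndimage

def candidate_building_parts(change_mask, height_diff, cell_size,
                             min_area=10.0, min_width=2.0, min_height=2.5):
    """Label connected components of a building-change raster and keep those
    whose area, width and mean height change exceed the given thresholds.

    change_mask : 2D boolean raster where the two epochs' building masks differ
    height_diff : 2D array of surface-height differences on the same raster
    cell_size   : raster cell size in metres
    """
    labels, n_components = ndimage.label(change_mask)
    accepted = []
    for comp in range(1, n_components + 1):
        rows, cols = np.nonzero(labels == comp)
        area = rows.size * cell_size ** 2
        width = (min(rows.ptp(), cols.ptp()) + 1) * cell_size
        mean_height = np.abs(height_diff[rows, cols]).mean()
        if area >= min_area and width >= min_width and mean_height >= min_height:
            accepted.append(comp)
    return labels, accepted
```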

  17. Toward An Unstructured Mesh Database

    Science.gov (United States)

    Rezaei Mahdiraji, Alireza; Baumann, Peter Peter

    2014-05-01

    Unstructured meshes are used in several application domains such as earth sciences (e.g., seismology), medicine, oceanography, climate modeling and GIS, as approximate representations of physical objects. Meshes subdivide a domain into smaller geometric elements (called cells) which are glued together by incidence relationships. The subdivision of a domain allows computational manipulation of complicated physical structures. For instance, seismologists model earthquakes using elastic wave propagation solvers on hexahedral meshes. Such a hexahedral mesh contains several hundred million grid points and millions of hexahedral cells, and each vertex node stores a multitude of data fields. To run a simulation on such meshes, one needs to iterate over all the cells, iterate over the cells incident to a given cell, retrieve coordinates of cells, assign data values to cells, etc. Although meshes are used in many application domains, to the best of our knowledge there is no database vendor that supports unstructured mesh features. Currently, the main tools for querying and manipulating unstructured meshes are mesh libraries, e.g., CGAL and GRAL. Mesh libraries are dedicated libraries which include mesh algorithms and can be run on mesh representations. These libraries do not scale with dataset size, do not have a declarative query language, and need deep C++ knowledge for query implementations. Furthermore, due to the high coupling between the implementations and the input file structure, the implementations are less reusable and costly to maintain. A dedicated mesh database offers the following advantages: 1) declarative querying, 2) ease of maintenance, 3) hiding mesh storage structure from applications, and 4) transparent query optimization. To design a mesh database, the first challenge is to define a suitable generic data model for unstructured meshes. We proposed the ImG-Complexes data model as a generic topological mesh data model which extends the incidence graph model to multi

  18. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR type reactor

    International Nuclear Information System (INIS)

    Benoist, P.; David, B.; Pigeon, M.

    1990-01-01

    An expert system for the automatic analysis of signals from Eddy currents is presented. The system was developed in order to detect and analyse the defects which may exist in vapor generators. The extraction of a signal from a high level background noise is possible. The organization of the work during the system's development, the results of the technique for the extraction of the signal from the background noise, and an example concerning the interpretation of the signal from a defect are presented [fr

  19. LiDAR The Generation of Automatic Mapping for Buildings, Using High Spatial Resolution Digital Vertical Aerial Photography and LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    William Barragán Zaque

    2015-06-01

    Full Text Available The aim of this paper is to generate photogrammetric products and to automatically map buildings in the area of interest in vector format. The research was conducted in Bogotá using high-resolution digital vertical aerial photographs and point clouds obtained using LiDAR technology. Image segmentation was also used, alongside radiometric and geometric digital processes. The process took into account aspects including building height, segmentation algorithms, and spectral band combination. The results had an effectiveness of 97.2%, validated through ground-truthing.

  20. NASA Lewis Meshed VSAT Workshop meeting summary

    Science.gov (United States)

    Ivancic, William

    1993-11-01

    NASA Lewis Research Center's Space Electronics Division (SED) hosted a workshop to address specific topics related to future meshed very small-aperture terminal (VSAT) satellite communications networks. The ideas generated by this workshop will help to identify potential markets and focus technology development within the commercial satellite communications industry and NASA. The workshop resulted in recommendations concerning these principal points of interest: the window of opportunity for a meshed VSAT system; system availability; ground terminal antenna sizes; recommended multifrequency for time division multiple access (TDMA) uplink; a packet switch design concept for narrowband; and fault tolerance design concepts. This report presents a summary of group presentations and discussion associated with the technological, economic, and operational issues of meshed VSAT architectures that utilize processing satellites.

  1. Finite Macro-Element Mesh Deformation in a Structured Multi-Block Navier-Stokes Code

    Science.gov (United States)

    Bartels, Robert E.

    2005-01-01

    A mesh deformation scheme consisting of two steps is developed for a structured multi-block Navier-Stokes code. The first step is a finite element solution of either user-defined or automatically generated macro-elements. Macro-elements are hexagonal finite elements created from a subset of points from the full mesh. When assembled, the finite element system spans the complete flow domain. Macro-element moduli vary according to the distance to the nearest surface, resulting in extremely stiff elements near a moving surface and very pliable elements away from boundaries. Solution of the finite element system for the imposed boundary deflections generally produces smoothly varying nodal deflections. The manner in which the distance to the nearest surface is used has been found to critically influence the quality of the element deformation. The second step is a transfinite interpolation which distributes the macro-element nodal deflections to the remaining fluid mesh points. The scheme is demonstrated for several two-dimensional applications.

  2. From intraperitoneal onlay mesh repair to preperitoneal onlay mesh repair.

    Science.gov (United States)

    Yang, George Pei Cheung

    2017-05-01

    Laparoscopic repair for ventral and incisional hernias was first reported in the early 1990s. It uses intraperitoneal only mesh placement to achieve a tension-free repair of the hernia. However, in recent years, there has been greater concern about long-term complication involving intraperitoneal mesh placement. Many case reports and case series have found evidence of mesh adhesion, mesh fistulation, and mesh migration into hollow organs including the esophagus, small bowel, and large bowel, resulting in various major acute abdominal events. Subsequent management of these complications may require major surgery that is technically demanding and difficult; in such cases, laparotomy and bowel resection have often been performed. Because of these significant, but not common, adverse events, many surgeons favor open sublay repair for ventral and incisional hernias. Investigators are therefore searching for a laparoscopic approach for ventral and incisional hernias that might overcome the mesh-induced visceral complications seen after intraperitoneal only mesh placement repair. Laparoscopic preperitoneal onlay mesh is one such approach. This article will explore the fundamental of intraperitoneal only mesh placement and its problems, the currently available peritoneal visceral-compatible meshes, and upcoming developments in laparoscopic ventral and incisional hernia repair. The technical details of preperitoneal onlay mesh, as well as its potential advantages and disadvantages, will also be discussed. © 2017 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and John Wiley & Sons Australia, Ltd.

  3. Mathematics and computational methods development in U.S. department of energy-sponsored research (nuclear energy research initiative and nuclear engineering education research). 4. Development of an Expert System for Generation of an Effective Mesh Distribution for the SN Method

    International Nuclear Information System (INIS)

    Patchimpattapong, Apisit; Haghighat, Alireza

    2001-01-01

    The discrete ordinates (SN) method is widely used to obtain numerical solutions of the transport equation. The method calls for discretization of spatial, energy, and angular variables. To generate an 'effective' spatial mesh distribution, one has to consider various factors including particle mean free path (mfp), material and source discontinuities, and problem objectives. This becomes more complicated if we consider the effect of numerics such as differencing schemes, parallel processing strategies, and computation resources. As a result, one may often over/under-mesh depending upon limitations on accuracy, computing resources, and time allotted. To overcome the foregoing issues, we are developing an expert system for input preparation of the discrete ordinates (SN) method. This project is a part of an ongoing project sponsored by Nuclear Engineering Education Research. Our expert system consists of two parts: (a) an algorithm for generation of a mesh distribution for a serial calculation and (b) an algorithm for extension to parallel computing, which accounts for parallelization parameters including granularity, load balancing, parallel algorithms, and possible architectural issues. Thus far, we have developed a stand-alone algorithm for generation of an 'effective' mesh distribution for a serial calculation. The algorithm has been successfully tested with the Parallel Environment Neutral-Particle Transport (PENTRAN) code system. In this paper, we discuss the structure of our algorithm and present its use for simulating the VENUS-3 experimental facility. To date, we have developed and tested part 1 of this system. This part comprises four steps: creation of a geometric model and coarse meshes, calculation of un-collided flux, selection of differencing schemes, and generation of a fine-mesh distribution. For the un-collided flux calculation, we have developed a parallel code called PENFC. It is capable of calculating un-collided and first-collision fluxes

  4. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    RISS S.r.l. is a spin-off company recently born from the initiative of the research group constituting the Seismology Laboratory of the Department of Physics of the University of Naples Federico II. RISS is an innovative start-up, based on the decade-long experience in earthquake monitoring systems and seismic data analysis of its members, and has the major goal of transforming the most recent innovations of scientific research into technological products and prototypes. With this aim, RISS has recently started the development of new software, which offers an elegant solution to manage and analyse seismic data and to create automatic earthquake bulletins. The software was initially developed to manage data recorded at the ISNet network (Irpinia Seismic Network), a network of seismic stations deployed in the Southern Apennines along the active fault system responsible for the November 23, 1980, MS 6.9 Irpinia earthquake. The software, however, is fully exportable and can be used to manage data from different networks, with any kind of station geometry or network configuration, and is able to provide reliable estimates of earthquake source parameters, whatever the background seismicity level of the area of interest. Here we present the real-time automated procedures and the analyses performed by the software package, which is essentially a chain of different modules, each of them aimed at the automatic computation of a specific source parameter. The P-wave arrival times are first detected on the real-time streaming of data and then the software performs the phase association and earthquake binding. As soon as an event is automatically detected by the binder, the earthquake location coordinates and the origin time are rapidly estimated, using a probabilistic, non-linear, exploration algorithm. Then, the software is able to automatically provide three different magnitude estimates. First, the local magnitude (Ml) is computed, using the peak-to-peak amplitude
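
    Of the three magnitude estimates mentioned, the local magnitude has the simplest generic definition, reproduced below for reference. The distance-correction term is network- and region-specific, and the calibration actually used by the software is not specified in this abstract.

```latex
M_L \;=\; \log_{10} A \;-\; \log_{10} A_0(\Delta)
```

    where A is the maximum (typically simulated Wood-Anderson) displacement amplitude and -log10 A0(Δ) is the empirical attenuation correction at hypocentral distance Δ.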

  5. From Finite Element Meshes to Clouds of Points: A Review of Methods for Generation of Computational Biomechanics Models for Patient-Specific Applications.

    Science.gov (United States)

    Wittek, Adam; Grosland, Nicole M; Joldes, Grand Roman; Magnotta, Vincent; Miller, Karol

    2016-01-01

    It has been envisaged that advances in computing and engineering technologies could extend surgeons' ability to plan and carry out surgical interventions more accurately and with less trauma. The progress in this area depends crucially on the ability to create robustly and rapidly patient-specific biomechanical models. We focus on methods for generation of patient-specific computational grids used for solving partial differential equations governing the mechanics of the body organs. We review state-of-the-art in this area and provide suggestions for future research. To provide a complete picture of the field of patient-specific model generation, we also discuss methods for identifying and assigning patient-specific material properties of tissues and boundary conditions.

  6. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    International Nuclear Information System (INIS)

    Yang, D; Li, X; Li, H; Wooten, H; Green, O; Rodriguez, V; Mutic, S

    2014-01-01

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report by including all required information for MR-IGRT and physics weekly review - the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustable. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and expedite physician daily IGRT review and physicist weekly chart
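
    A minimal sketch of the kind of check described above, under assumed data layouts (an aperture mask and a beam-on time per segment); it is not the commercial MR-IGRT system's code. It accumulates a composite primary fluence map and reports simple error statistics on the plan-versus-delivery difference map.

```python
import numpy as np

def fluence_map(segments, shape=(40, 40)):
    """segments: list of (aperture_mask, beam_on_time_s) per beam segment.

    aperture_mask is a binary array of the assumed fluence-grid shape; open
    leaves contribute in proportion to the segment's beam-on time.
    """
    fluence = np.zeros(shape)
    for aperture, beam_on_time in segments:
        fluence += beam_on_time * aperture
    return fluence

def delivery_error_stats(planned, delivered):
    """Simple statistics on the planned-vs-delivered fluence difference map."""
    diff = delivered - planned
    return {"max_abs": float(np.abs(diff).max()),
            "mean_abs": float(np.abs(diff).mean()),
            "rms": float(np.sqrt((diff ** 2).mean()))}
```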

  7. Conforming to interface structured adaptive mesh refinement: 3D algorithm and implementation

    Science.gov (United States)

    Nagarajan, Anand; Soghrati, Soheil

    2018-03-01

    A new non-iterative mesh generation algorithm named conforming to interface structured adaptive mesh refinement (CISAMR) is introduced for creating 3D finite element models of problems with complex geometries. CISAMR transforms a structured mesh composed of tetrahedral elements into a conforming mesh with low element aspect ratios. The construction of the mesh begins with the structured adaptive mesh refinement of elements in the vicinity of material interfaces. An r-adaptivity algorithm is then employed to relocate selected nodes of nonconforming elements, followed by face-swapping a small fraction of them to eliminate tetrahedrons with high aspect ratios. The final conforming mesh is constructed by sub-tetrahedralizing remaining nonconforming elements, as well as tetrahedrons with hanging nodes. In addition to studying the convergence and analyzing element-wise errors in meshes generated using CISAMR, several example problems are presented to show the ability of this method for modeling 3D problems with intricate morphologies.

  8. Energy mesh optimization for multi-level calculation schemes

    International Nuclear Information System (INIS)

    Mosca, P.; Taofiki, A.; Bellier, P.; Prevost, A.

    2011-01-01

    The industrial calculations of third-generation nuclear reactors are based on sophisticated strategies of homogenization and collapsing at different spatial and energy levels. An important issue for ensuring the quality of these calculation models is the choice of the collapsing energy mesh. In this work, we show a new approach to generate optimized energy meshes starting from the SHEM 281-group library. The optimization model is applied to 1D cylindrical cells and consists of finding an energy mesh which minimizes the errors between two successive collision probability calculations. The former is realized over the fine SHEM mesh with Livolant-Jeanpierre self-shielded cross sections and the latter is performed with collapsed cross sections over the energy mesh being optimized. The optimization is done by the particle swarm algorithm implemented in the code AEMC, and multigroup flux solutions are obtained from standard APOLLO2 solvers. With this new approach, a set of new optimized meshes ranging from 10 to 50 groups has been defined for PWR and BWR calculations. This set will allow users to adapt the energy detail of the solution to the complexity of the calculation (assembly, multi-assembly, two-dimensional whole core). Some preliminary verifications, in which the accuracy of the new meshes is measured against a direct 281-group calculation, show that the 30-group optimized mesh offers a good compromise between simulation time and accuracy for a standard 17 x 17 UO2 assembly with and without control rods. (author)
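
    As background for how any candidate coarse mesh is evaluated, the sketch below shows a standard flux-weighted collapse of fine-group cross sections onto a coarse group structure; the data layout is an assumption and this is not the AEMC/APOLLO2 implementation.

```python
import numpy as np

def collapse_xs(fine_flux, fine_xs, group_map):
    """Flux-weighted collapse of fine-group cross sections to a coarse mesh.

    fine_flux : fine-group scalar flux (e.g. 281 values)
    fine_xs   : fine-group cross sections, same length
    group_map : coarse-group index assigned to each fine group
    """
    fine_flux = np.asarray(fine_flux, dtype=float)
    fine_xs = np.asarray(fine_xs, dtype=float)
    group_map = np.asarray(group_map)
    coarse = np.zeros(group_map.max() + 1)
    for g in range(coarse.size):
        sel = group_map == g
        coarse[g] = np.sum(fine_xs[sel] * fine_flux[sel]) / np.sum(fine_flux[sel])
    return coarse

# Toy example: 6 fine groups collapsed onto a 2-group candidate mesh
print(collapse_xs([1, 2, 3, 3, 2, 1], [0.5, 0.6, 0.7, 1.0, 1.2, 1.5],
                  [0, 0, 0, 1, 1, 1]))
```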

  9. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework based on a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)
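
    A small sketch of one common way to compute Fourier descriptors of a closed 2D contour, the kind of shape feature mentioned above; the normalization and the number of coefficients are assumptions, and the paper's exact feature construction may differ.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_coeffs=10):
    """Rotation/translation/scale-insensitive shape features of a closed contour.

    contour_xy : (N, 2) array of boundary points ordered along the contour.
    """
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # complex boundary representation
    coeffs = np.fft.fft(z)
    # Drop the DC term (translation) and normalize by the first harmonic
    # (scale) so the descriptors depend mainly on shape.
    return np.abs(coeffs[1:n_coeffs + 1]) / np.abs(coeffs[1])

# Example: descriptors of a noisy circle; these could feed an SVM classifier.
t = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)]) + 0.01 * np.random.randn(128, 2)
print(fourier_descriptors(circle))
```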

  10. Some Behavioral Considerations on the GPS4GEF Cloud-Based Generator of Evaluation Forms with Automatic Feedback and References to Interactive Support Content

    Directory of Open Access Journals (Sweden)

    Daniel HOMOCIANU

    2015-01-01

    Full Text Available The paper introduces some considerations on a previously defined general-purpose system used to dynamically generate online evaluation forms with automatic feedback immediately after responses are submitted. The system works with a simple, well-known data source format able to store questions, answers and links to additional support materials, in order to increase the productivity of evaluation and assessment. Beyond giving a short description of the prototype's components and underlining the advantages and limitations of using it for any user involved in assessment and evaluation processes, the paper promotes the use of such a system together with a simple technique for generating and referencing interactive support content, cited within the paper and defined together with the LIVES4IT approach. This type of content consists of scenarios with ad hoc documentation and interactive simulation components, useful when emulating concrete examples of working with real-world objects, operating devices or using software applications from any activity field.

  11. Overlap Areas of a Square Box on a Square Mesh

    Science.gov (United States)

    2017-04-01

    To aid in a data-reduction process, an algorithm was generated to calculate, on a square mesh (elements with sides of length 2m), the area of overlap for...
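
    A minimal sketch of one way to compute such overlap areas for an axis-aligned box on a square mesh; the report's actual algorithm is not reproduced here and the interface is illustrative.

```python
import numpy as np

def overlap_areas(box_min, box_size, cell_size, grid_shape):
    """Overlap area between an axis-aligned square box and each mesh cell."""
    box_min = np.asarray(box_min, dtype=float)
    areas = np.zeros(grid_shape)
    for i in range(grid_shape[0]):
        for j in range(grid_shape[1]):
            cell_min = np.array([i, j], dtype=float) * cell_size
            cell_max = cell_min + cell_size
            lo = np.maximum(cell_min, box_min)            # lower clip corner
            hi = np.minimum(cell_max, box_min + box_size)  # upper clip corner
            areas[i, j] = np.prod(np.clip(hi - lo, 0.0, None))
    return areas

# Example: a unit box offset by half a cell on a 3x3 mesh of unit cells
print(overlap_areas(box_min=(0.5, 0.5), box_size=1.0, cell_size=1.0,
                    grid_shape=(3, 3)))
```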

  12. A Survey of Solver-Related Geometry and Meshing Issues

    Science.gov (United States)

    Masters, James; Daniel, Derick; Gudenkauf, Jared; Hine, David; Sideroff, Chris

    2016-01-01

    There is a concern in the computational fluid dynamics community that mesh generation is a significant bottleneck in the CFD workflow. This is one of several papers that will help set the stage for a moderated panel discussion addressing this issue. Although certain general "rules of thumb" and a priori mesh metrics can be used to ensure that some base level of mesh quality is achieved, inadequate consideration is often given to the type of solver or particular flow regime on which the mesh will be utilized. This paper explores how an analyst may want to think differently about a mesh based on considerations such as if a flow is compressible vs. incompressible or hypersonic vs. subsonic or if the solver is node-centered vs. cell-centered. This paper is a high-level investigation intended to provide general insight into how considering the nature of the solver or flow when performing mesh generation has the potential to increase the accuracy and/or robustness of the solution and drive the mesh generation process to a state where it is no longer a hindrance to the analysis process.

  13. Stacouf: A new system for automatic processing of eddy current signal from steam generator testing of PWR power plants

    International Nuclear Information System (INIS)

    Ducreux, J.; Eyrolles, P.; Meylogan, T.

    1990-01-01

    A new system called STACOUF will soon be industrialized. The aim is to improve on-site signal processing for eddy current testing of steam generators. Testing time, quality and productivity will be improved [fr]

  14. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    Science.gov (United States)

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents.

  15. SummitView 1.0: a code to automatically generate 3D solid models of surface micro-machining based MEMS designs.

    Energy Technology Data Exchange (ETDEWEB)

    McBride, Cory L. (Elemental Technologies, American Fort, UT); Yarberry, Victor R.; Schmidt, Rodney Cannon; Meyers, Ray J. (Elemental Technologies, American Fort, UT)

    2006-11-01

    This report describes the SummitView 1.0 computer code developed at Sandia National Laboratories. SummitView is designed to generate a 3D solid model, amenable to visualization and meshing, that represents the end state of a microsystem fabrication process such as the SUMMiT (Sandia Ultra-Planar Multilevel MEMS Technology) V process. Functionally, SummitView performs essentially the same computational task as an earlier code called the 3D Geometry modeler [1]. However, because SummitView is based on 2D instead of 3D data structures and operations, it has significant speed and robustness advantages. As input it requires a definition of both the process itself and the collection of individual 2D masks created by the designer and associated with each of the process steps. The definition of the process is contained in a special process definition file [2] and the 2D masks are contained in MEM format files [3]. The code is written in C++ and consists of a set of classes and routines. The classes represent the geometric data and the SUMMiT V process steps. Classes are provided for the following process steps: Planar Deposition, Planar Etch, Conformal Deposition, Dry Etch, Wet Etch and Release Etch. SummitView is built upon the 2D Boolean library GBL-2D [4], and thus contains all of that library's functionality.

  16. Improving MeSH classification of biomedical articles using citation contexts.

    Science.gov (United States)

    Aljaber, Bader; Martinez, David; Stokes, Nicola; Bailey, James

    2011-10-01

    Medical Subject Headings (MeSH) are used to index the majority of databases generated by the National Library of Medicine. Essentially, MeSH terms are designed to make information, such as scientific articles, more retrievable and accessible to users of systems such as PubMed. This paper proposes a novel method for automating the assignment of biomedical publications with MeSH terms that takes advantage of citation references to these publications. Our findings show that analysing the citation references that point to a document can provide a useful source of terms that are not present in the document. The use of these citation contexts, as they are known, can thus help to provide a richer document feature representation, which in turn can help improve text mining and information retrieval applications, in our case MeSH term classification. In this paper, we also explore new methods of selecting and utilising citation contexts. In particular, we assess the effect of weighting the importance of citation terms (found in the citation contexts) according to two aspects: (i) the section of the paper they appear in and (ii) their distance to the citation marker. We conduct intrinsic and extrinsic evaluations of citation term quality. For the intrinsic evaluation, we rely on the UMLS Metathesaurus conceptual database to explore the semantic characteristics of the mined citation terms. We also analyse the "informativeness" of these terms using a class-entropy measure. For the extrinsic evaluation, we run a series of automatic document classification experiments over MeSH terms. Our experimental evaluation shows that citation contexts contain terms that are related to the original document, and that the integration of this knowledge results in better classification performance compared to two state-of-the-art MeSH classification systems: MeSHUP and MTI. Our experiments also demonstrate that the consideration of Section and Distance factors can lead to statistically
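
    A hedged sketch of the weighting idea described above: citation-context terms are weighted by the section they appear in and down-weighted with distance from the citation marker. The section weights and decay constant are invented for illustration, not the values used in the paper.

```python
# Illustrative section weights; the paper's actual factors are not reproduced.
SECTION_WEIGHTS = {"introduction": 0.5, "methods": 1.0,
                   "results": 0.8, "discussion": 0.6}

def term_weight(section, distance_in_tokens, decay=0.05):
    """Combine a per-section factor with a decay over distance to the marker."""
    section_w = SECTION_WEIGHTS.get(section.lower(), 0.5)
    distance_w = 1.0 / (1.0 + decay * distance_in_tokens)
    return section_w * distance_w

def weighted_citation_terms(citation_contexts):
    """citation_contexts: iterable of (term, section, distance_in_tokens)."""
    weights = {}
    for term, section, distance in citation_contexts:
        weights[term] = weights.get(term, 0.0) + term_weight(section, distance)
    return weights

# Example: the same term cited from Methods (close) and Discussion (far)
print(weighted_citation_terms([("mesh generation", "methods", 3),
                               ("mesh generation", "discussion", 40)]))
```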

  17. The Generator of the Event Structure Lexicon (GESL): Automatic Annotation of Event Structure for Textual Inference Tasks

    Science.gov (United States)

    Im, Seohyun

    2013-01-01

    This dissertation aims to develop the Generator of the Event Structure Lexicon (GESL) which is a tool to automate annotating the event structure of verbs in text to support textual inference tasks related to lexically entailed subevents. The output of the GESL is the Event Structure Lexicon (ESL), which is a lexicon of verbs in text which includes…

  18. Comparison of Intensity-Modulated Radiotherapy Planning Based on Manual and Automatically Generated Contours Using Deformable Image Registration in Four-Dimensional Computed Tomography of Lung Cancer Patients

    International Nuclear Information System (INIS)

    Weiss, Elisabeth; Wijesooriya, Krishni; Ramakrishnan, Viswanathan; Keall, Paul J.

    2008-01-01

    Purpose: To evaluate the implications of differences between contours drawn manually and contours generated automatically by deformable image registration for four-dimensional (4D) treatment planning. Methods and Materials: In 12 lung cancer patients intensity-modulated radiotherapy (IMRT) planning was performed for both manual contours and automatically generated ('auto') contours in mid and peak expiration of 4D computed tomography scans, with the manual contours in peak inspiration serving as the reference for the displacement vector fields. Manual and auto plans were analyzed with respect to their coverage of the manual contours, which were assumed to represent the anatomically correct volumes. Results: Auto contours were on average larger than manual contours by up to 9%. Objective scores, D2% and D98% of the planning target volume, homogeneity and conformity indices, and coverage of normal tissue structures (lungs, heart, esophagus, spinal cord) at defined dose levels were not significantly different between plans (p = 0.22-0.94). Differences were statistically insignificant for the generalized equivalent uniform dose of the planning target volume (p = 0.19-0.94) and normal tissue complication probabilities for lung and esophagus (p = 0.13-0.47). Dosimetric differences >2% or >1 Gy were more frequent in patients with auto/manual volume differences ≥10% (p = 0.04). Conclusions: The applied deformable image registration algorithm produces clinically plausible auto contours in the majority of structures. At this stage clinical supervision of the auto contouring process is required, and manual interventions may become necessary. Before routine use, further investigations are required, particularly to reduce imaging artifacts

  19. Generation of alloreactivity-reduced donor lymphocyte products retaining memory function by fully automatic depletion of CD45RA-positive cells.

    Science.gov (United States)

    Müller, Nina; Landwehr, Katharina; Langeveld, Kirsten; Stenzel, Joanna; Pouwels, Walter; van der Hoorn, Menno A W G; Seifried, Erhard; Bonig, Halvard

    2018-02-28

    For patients needing allogeneic stem cell transplantation but lacking a major histocompatibility complex (MHC)-matched donor, haplo-identical (family) donors may be an alternative. Stringent T-cell depletion required in these cases to avoid lethal graft-versus-host disease (GVHD) can delay immune reconstitution, thus impairing defense against virus reactivation and attenuating graft-versus-leukemia (GVL) activity. Several groups reported that GVHD is caused by cells residing within the naive (CD45RA+) T-cell compartment and proposed use of CD45RA-depleted donor lymphocyte infusion (DLI) to accelerate immune reconstitution. We developed and tested the performance of a CD45RA depletion module for the automatic cell-processing device CliniMACS Prodigy and investigated quality attributes of the generated products. Unstimulated apheresis products from random volunteer donors were depleted of CD45RA+ cells on CliniMACS Prodigy, using Good Manufacturing Practice (GMP)-compliant reagents and methods throughout. Using phenotypic and functional in vitro assays, we assessed the cellular constitution of CD45RA-depleted products, including T-cell subset analyses, immunological memory function and allo-reactivity. Selections were technically uneventful and proceeded automatically with minimal hands-on time beyond tubing set installation. Products were near-qualitatively CD45RA+ depleted, that is, largely devoid of CD45RA+ T cells but also of almost all B and natural killer cells. Naive and effector as well as γ/δ T cells were greatly reduced. The CD4:CD8 ratio was fivefold increased. Mixed lymphocyte reaction assays of the product against third-party leukocytes revealed reduced allo-reactivity compared to starting material. Anti-pathogen responses were retained. The novel, closed, fully GMP-compatible process on Prodigy generates highly CD45RA-depleted cellular products predicted to be clinically meaningfully depleted of GvH reactivity.

  20. Reenganche automático en circuitos de distribución con generación distribuida; Automatic reclosing in distribution circuits with distributed generation

    Directory of Open Access Journals (Sweden)

    Marta Bravo de las Casas

    2015-04-01

    Full Text Available Distribution networks have traditionally been designed so that power flows in one direction only. The introduction of distributed generation units means this assumption is no longer valid, which brings new challenges for the operation and design of these networks. One of the areas affected is electrical protection, above all anti-islanding (separation) protection, and especially when automatic reclosing is used, as is typical in medium-voltage networks. This article studies automatic reclosing in a typical Cuban substation with fuel-oil and diesel distributed generation. A brief review of the literature is first presented, and the results are obtained by means of simulations in the Matlab-Simulink (version 7.4) software. The simulation confirms the existence of the problem, and possible solutions are proposed.

  1. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has been successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  2. Numerical convergence of discrete exterior calculus on arbitrary surface meshes

    KAUST Repository

    Mohamed, Mamdouh S.

    2018-02-13

    Discrete exterior calculus (DEC) is a structure-preserving numerical framework for the solution of partial differential equations, particularly suitable for simplicial meshes. A longstanding and widespread assumption has been that DEC requires special (Delaunay) triangulations, which complicated the mesh generation process, especially for curved surfaces. This paper presents numerical evidence demonstrating that this restriction is unnecessary. Convergence experiments are carried out for various physical problems using both Delaunay and non-Delaunay triangulations. A signed diagonal definition for the key DEC operator (the Hodge star) is adopted. The errors converge as expected for all considered meshes and experiments. This relieves the DEC paradigm of an unnecessary triangulation limitation.

  3. AVID: Automatic Visualization Interface Designer

    National Research Council Canada - National Science Library

    Chuah, Mei

    2000-01-01

    .... Automatic generation offers great flexibility in performing data and information analysis tasks, because new designs are generated on a case by case basis to suit current and changing future needs...

  4. Quadrilateral finite element mesh coarsening

    Science.gov (United States)

    Staten, Matthew L; Dewey, Mark W; Benzley, Steven E

    2012-10-16

    Techniques for coarsening a quadrilateral mesh are described. These techniques include identifying a coarsening region within the quadrilateral mesh to be coarsened. Quadrilateral elements along a path through the coarsening region are removed. Node pairs along opposite sides of the path are identified. The node pairs along the path are then merged to collapse the path.

  5. Fog-harvesting Mesh Surfaces

    Science.gov (United States)

    Park, Kyoo-Chul; Chhatre, Shreerang S.; Srinivasan, Siddarth; Cohen, Robert E.; McKinley, Gareth H.

    2012-11-01

    Fog represents a large, untapped source of potable water, especially in arid climates. Various plants and animals use morphological as well as chemical features on their surfaces to harvest this precious resource. In this work, we investigate the influence of surface wettability, structural length scale, and relative openness of the weave on the fog harvesting ability of mesh surfaces. We choose simple woven meshes as a canonical family of model permeable surfaces due to the ability to systematically vary periodicity, porosity, mechanical robustness and ease of fabrication. We measure the fog collecting capacity of a set of meshes with a directed aqueous aerosol stream to simulate a natural foggy environment. Further, we strive to develop and test appropriate scalings and correlations that quantify the collection of water on the mesh surfaces. These design rules can be deployed as an a priori design chart for designing optimal performance meshes for given environmental/operating conditions.

  6. The ear, the eye, earthquakes and feature selection: listening to automatically generated seismic bulletins for clues as to the differences between true and false events.

    Science.gov (United States)

    Kuzma, H. A.; Arehart, E.; Louie, J. N.; Witzleben, J. L.

    2012-04-01

    Listening to the waveforms generated by earthquakes is not new. The recordings of seismometers have been sped up and played to generations of introductory seismology students, published on educational websites and even included in the occasional symphony. The modern twist on earthquakes as music is an interest in using state-of-the-art computer algorithms for seismic data processing and evaluation. Algorithms such as Hidden Markov Models, Bayesian Network models and Support Vector Machines have been highly developed for applications in speech recognition, and might also be adapted for automatic seismic data analysis. Over the last three years, the International Data Centre (IDC) of the Comprehensive Test Ban Treaty Organization (CTBTO) has supported an effort to apply computer learning and data mining algorithms to IDC data processing, particularly to the problem of weeding through automatically generated event bulletins to find events which are non-physical and would otherwise have to be eliminated by the hand of highly trained human analysts. Analysts are able to evaluate events, distinguish between phases, pick new phases and build new events by looking at waveforms displayed on a computer screen. Human ears, however, are much better suited to waveform processing than are the eyes. Our hypothesis is that combining an auditory representation of seismic events with visual waveforms would reduce the time it takes to train an analyst and the time they need to evaluate an event. Since it takes almost two years for a person of extraordinary diligence to become a professional analyst and IDC contracts are limited to seven years by Treaty, faster training would significantly improve IDC operations. Furthermore, once a person learns to distinguish between true and false events by ear, various forms of audio compression can be applied to the data. The compression scheme which yields the smallest data set in which relevant signals can still be heard is likely an

  7. Automatic measurement of contact angle in pore-space images

    Science.gov (United States)

    AlRatrout, Ahmed; Raeini, Ali Q.; Bijeljic, Branko; Blunt, Martin J.

    2017-11-01

    A new approach is presented to measure the in-situ contact angle (θ) between immiscible fluids, applied to segmented pore-scale X-ray images. We first identify and mesh the fluid/fluid and fluid/solid interfaces. A Gaussian smoothing is applied to this mesh to eliminate artifacts associated with the voxelized nature of the image, while preserving large-scale features of the rock surface. Then, for the fluid/fluid interface we apply an additional smoothing and adjustment of the mesh to impose a constant curvature. We then track the three-phase contact line, and the two vectors that have a direction perpendicular to both surfaces: the contact angle is found from the dot product of these vectors where they meet at the contact line. This calculation can be applied at every point on the mesh at the contact line. We automatically generate contact angle values representing each invaded pore-element in the image with high accuracy. To validate the approach, we first study synthetic three-dimensional images of a spherical droplet of oil residing on a tilted flat solid surface surrounded by brine and show that our results are accurate to within 3° if the sphere diameter is 2 or more voxels. We then apply this method to oil/brine systems imaged at ambient temperature and reservoir pressure (10 MPa) using X-ray microtomography (Singh et al., 2016). We analyse an image volume of diameter approximately 4.6 mm and 10.7 mm long, obtaining hundreds of thousands of values from a dataset with around 700 million voxels. We show that in a system of altered wettability, contact angles both less than and greater than 90° can be observed. This work provides a rapid method for accurate characterization of pore-scale wettability, which is important for the design and assessment of hydrocarbon recovery and carbon dioxide storage.
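
    A minimal sketch of the final geometric step described above: the contact angle follows from the dot product of the two vectors, perpendicular to the contact line, that lie in the fluid/fluid and fluid/solid surfaces at a contact-line point. The construction of those vectors from the smoothed meshes is omitted and assumed given.

```python
import numpy as np

def contact_angle_deg(v_fluid_interface, v_solid_surface):
    """Contact angle from two vectors meeting at a contact-line point.

    Both inputs are 3-vectors perpendicular to the contact line, lying in
    their respective surfaces and pointing away from the contact point.
    """
    a = np.asarray(v_fluid_interface, dtype=float)
    b = np.asarray(v_solid_surface, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example: two vectors 40 degrees apart give a 40 degree contact angle
print(round(contact_angle_deg([1.0, 0.0, 0.0],
                              [np.cos(np.radians(40)), np.sin(np.radians(40)), 0.0]), 1))
```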

  8. Applying fractional order PID to design TCSC-based damping controller in coordination with automatic generation control of interconnected multi-source power system

    Directory of Open Access Journals (Sweden)

    Javad Morsali

    2017-02-01

    Full Text Available In this paper, fractional order proportional-integral-differential (FOPID controller is employed in the design of thyristor controlled series capacitor (TCSC-based damping controller in coordination with the secondary integral controller as automatic generation control (AGC loop. In doing so, the contribution of the TCSC in tie-line power exchange is extracted mathematically for small load disturbance. Adjustable parameters of the proposed FOPID-based TCSC damping controller and the AGC loop are optimized concurrently via an improved particle swarm optimization (IPSO algorithm which is reinforced by chaotic parameter and crossover operator to obtain a globally optimal solution. The powerful FOMCON toolbox is used along with MATLAB for handling fractional order modeling and control. An interconnected multi-source power system is simulated regarding the physical constraints of generation rate constraint (GRC nonlinearity and governor dead band (GDB effect. Simulation results using FOMCON toolbox demonstrate that the proposed FOPID-based TCSC damping controller achieves the greatest dynamic performance under different load perturbation patterns in comparison with phase lead-lag and classical PID-based TCSC damping controllers, all in coordination with the integral AGC. Moreover, sensitivity analyses are performed to show the robustness of the proposed controller under various uncertainty scenarios.

  9. Finite element simulation of impact response of wire mesh screens

    Directory of Open Access Journals (Sweden)

    Wang Caizheng

    2015-01-01

    Full Text Available In this paper, the response of wire mesh screens to low velocity impact with blunt objects is investigated using finite element (FE) simulation. The woven wire mesh is modelled with homogeneous shell elements with equivalent smeared mechanical properties. The mechanical behaviour of the woven wire mesh was determined experimentally with tensile tests on steel wire mesh coupons to generate the data for the smeared shell material used in the FE. The effects of impacts with a low mass (4 kg) and a large mass (40 kg) providing the same impact energy are studied. The joint between the wire mesh screen and the aluminium frame surrounding it is modelled using contact elements with friction between the corresponding elements. Damage to the screen of different types compromising its structural integrity, such as mesh separation and pulling out from the surrounding frame is modelled. The FE simulation is validated with results of impact tests conducted on woven steel wire screen meshes.

  10. Dynamic Mesh Adaptation for Front Evolution Using Discontinuous Galerkin Based Weighted Condition Number Mesh Relaxation

    Energy Technology Data Exchange (ETDEWEB)

    Greene, Patrick T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schofield, Samuel P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Nourgaliev, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-06-21

    A new mesh smoothing method designed to cluster mesh cells near a dynamically evolving interface is presented. The method is based on weighted condition number mesh relaxation with the weight function being computed from a level set representation of the interface. The weight function is expressed as a Taylor series based discontinuous Galerkin projection, which makes the computation of the derivatives of the weight function needed during the condition number optimization process a trivial matter. For cases when a level set is not available, a fast method for generating a low-order level set from discrete cell-centered fields, such as a volume fraction or index function, is provided. Results show that the low-order level set works equally well for the weight function as the actual level set. Meshes generated for a number of interface geometries are presented, including cases with multiple level sets. Dynamic cases for moving interfaces are presented to demonstrate the method's potential usefulness to arbitrary Lagrangian Eulerian (ALE) methods.

  11. PLUM: Parallel Load Balancing for Unstructured Adaptive Meshes

    Science.gov (United States)

    Oliker, Leonid

    1998-01-01

    Dynamic mesh adaption on unstructured grids is a powerful tool for computing large-scale problems that require grid modifications to efficiently resolve solution features. Unfortunately, an efficient parallel implementation is difficult to achieve, primarily due to the load imbalance created by the dynamically-changing nonuniform grid. To address this problem, we have developed PLUM, an automatic portable framework for performing adaptive large-scale numerical computations in a message-passing environment. First, we present an efficient parallel implementation of a tetrahedral mesh adaption scheme. Extremely promising parallel performance is achieved for various refinement and coarsening strategies on a realistic-sized domain. Next we describe PLUM, a novel method for dynamically balancing the processor workloads in adaptive grid computations. This research includes interfacing the parallel mesh adaption procedure based on actual flow solutions to a data remapping module, and incorporating an efficient parallel mesh repartitioner. A significant runtime improvement is achieved by observing that data movement for a refinement step should be performed after the edge-marking phase but before the actual subdivision. We also present optimal and heuristic remapping cost metrics that can accurately predict the total overhead for data redistribution. Several experiments are performed to verify the effectiveness of PLUM on sequences of dynamically adapted unstructured grids. Portability is demonstrated by presenting results on the two vastly different architectures of the SP2 and the Origin2000. Additionally, we evaluate the performance of five state-of-the-art partitioning algorithms that can be used within PLUM. It is shown that for certain classes of unsteady adaption, globally repartitioning the computational mesh produces higher quality results than diffusive repartitioning schemes. We also demonstrate that a coarse starting mesh produces high quality load balancing, at

  12. Magnetic Resonance–Based Automatic Air Segmentation for Generation of Synthetic Computed Tomography Scans in the Head Region

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Weili; Kim, Joshua P. [Department of Radiation Oncology, Henry Ford Health Systems, Detroit, Michigan (United States); Kadbi, Mo [Philips Healthcare, Cleveland, Ohio (United States); Movsas, Benjamin; Chetty, Indrin J. [Department of Radiation Oncology, Henry Ford Health Systems, Detroit, Michigan (United States); Glide-Hurst, Carri K., E-mail: churst2@hfhs.org [Department of Radiation Oncology, Henry Ford Health Systems, Detroit, Michigan (United States)

    2015-11-01

    Purpose: To incorporate a novel imaging sequence for robust air and tissue segmentation using ultrashort echo time (UTE) phase images and to implement an innovative synthetic CT (synCT) solution as a first step toward MR-only radiation therapy treatment planning for brain cancer. Methods and Materials: Ten brain cancer patients were scanned with a UTE/Dixon sequence and other clinical sequences on a 1.0 T open magnet with simulation capabilities. Bone-enhanced images were generated from a weighted combination of water/fat maps derived from Dixon images and inverted UTE images. Automated air segmentation was performed using unwrapped UTE phase maps. Segmentation accuracy was assessed by calculating segmentation errors (true-positive rate, false-positive rate, and Dice similarity indices) using CT simulation (CT-SIM) as ground truth. The synCTs were generated using a voxel-based, weighted summation method incorporating T2, fluid attenuated inversion recovery (FLAIR), UTE1, and bone-enhanced images. Mean absolute error (MAE) characterized Hounsfield unit (HU) differences between synCT and CT-SIM. A dosimetry study was conducted, and differences were quantified using γ-analysis and dose-volume histogram analysis. Results: On average, true-positive rate and false-positive rate for the CT and MR-derived air masks were 80.8% ± 5.5% and 25.7% ± 6.9%, respectively. Dice similarity index values were 0.78 ± 0.04 (range, 0.70-0.83). Full field of view MAE between synCT and CT-SIM was 147.5 ± 8.3 HU (range, 138.3-166.2 HU), with the largest errors occurring at bone–air interfaces (MAE 422.5 ± 33.4 HU for bone and 294.53 ± 90.56 HU for air). Gamma analysis revealed pass rates of 99.4% ± 0.04%, with acceptable treatment plan quality for the cohort. Conclusions: A hybrid MRI phase/magnitude UTE image processing technique was introduced that significantly improved bone and air contrast in MRI. Segmented air masks and bone-enhanced images were integrated
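
    The two headline metrics above, the Dice similarity of the air masks and the mean absolute HU error between synCT and CT-SIM, can be sketched as follows; the array shapes and the optional region-of-interest mask are assumptions.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity index of two binary masks of the same shape."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def mean_absolute_error_hu(synct, ct_sim, roi=None):
    """Mean absolute HU difference, optionally restricted to a boolean ROI."""
    diff = np.abs(synct.astype(float) - ct_sim.astype(float))
    return diff[roi].mean() if roi is not None else diff.mean()
```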

  13. A segmentation framework towards automatic generation of boost subvolumes for FDG-PET tumors: A digital phantom study

    International Nuclear Information System (INIS)

    Yang, Fei; Grigsby, Perry W.

    2012-01-01

    Potential benefits of administering nonuniform radiation dose to heterogeneous tumors imaged with FDG-PET have been widely demonstrated, whereas the number of discrete dose levels to be utilized and the corresponding locations for prescription inside tumors vary significantly among existing methods. In this paper, an automated and unsupervised segmentation framework is presented, constituted mainly by an image restoration mechanism based on variational decomposition and a voxel clustering scheme based on spectral clustering, which partitions FDG-PET imaged tumors into subvolumes such that the total intra-subvolume activity similarity and the total inter-subvolume activity dissimilarity are simultaneously maximized. Experiments to evaluate the proposed system were carried out using FDG-PET data generated from a digital phantom that employed SimSET (Simulation System for Emission Tomography) to simulate PET acquisition of tumors. The obtained results show the feasibility of the proposed system in dividing FDG-PET imaged tumor volumes into subvolumes with intratumoral heterogeneity properly characterized, irrespective of variation in tumor morphology as well as diversity in intratumoral heterogeneity pattern.

  14. Ocean modeling on unstructured meshes

    Science.gov (United States)

    Danilov, S.

    2013-09-01

    Unstructured meshes are common in coastal modeling, but still rarely used for modeling the large-scale ocean circulation. Existing and new projects aim at changing this situation by proposing models enabling a regional focus (multiresolution) in global setups, without nesting and open boundaries. Among them, finite-volume models using the C-grid discretization on Voronoi-centroidal meshes or a cell-vertex quasi-B-grid discretization on triangular meshes work well and offer the multiresolution functionality at the price of being 2 to 4 times slower per degree of freedom than structured-mesh models. This is already sufficient for many practical tasks and will be further improved as the number of vertical layers is increased. Approaches based on the finite-element method, both in use and proposed, are as a rule slower at present. Most staggered discretizations on triangular or Voronoi meshes allow spurious modes which are difficult to filter on unstructured meshes. The ongoing research seeks ways to handle them and explores new approaches in which such modes are absent. Issues of numerical efficiency and accurate transport schemes remain important, and the question of parameterizations for multiresolution meshes is hardly explored at all. The review summarizes recent developments, the main practical result of which is the emergence of multiresolution models for simulating large-scale ocean circulation.

  15. Mesh Adaptation and Shape Optimization on Unstructured Meshes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR CRM proposes to implement the entropy adjoint method for solution adaptive mesh refinement into the Loci/CHEM unstructured flow solver. The scheme will...

  16. DEVELOPMENT AND TESTING OF GEO-PROCESSING MODELS FOR THE AUTOMATIC GENERATION OF REMEDIATION PLAN AND NAVIGATION DATA TO USE IN INDUSTRIAL DISASTER REMEDIATION

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

    Full Text Available This paper introduces research done on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model aims at drawing a parcel clean-up plan. The model tests four different parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan in which the clean-up parcels are least numerous, considering that to be the optimal spatial configuration. The second model drifts the clean-up parcels of a work plan both vertically and horizontally following a grid pattern with a sampling distance of a fifth of a parcel width and keeps the most optimal drifted version, again with the aim of reducing the final number of parcel features. The last model aims at drawing a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model we demonstrated that, depending on the size and geometry of the features of the contaminated area layer, the number of clean-up parcels generated by the model varies in a range of 4% to 38% from plan to plan. Such a significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when 1/ the individual features of the contaminated area present a significant orientation in their geometry (features are long), 2/ the size of pollution extent features becomes closer to the size of the parcels (scale effect). The second model shows only 1% difference with the variation of feature number; so this last is less interesting for
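
    A hedged Python sketch of the first model's idea (the original scripts are not reproduced here): try several parcel orientations and keep the one needing the fewest clean-up parcels to cover the polluted footprint. Rotating the pollution polygon by minus the angle stands in for rotating the parcel grid; the use of shapely and the example geometry are assumptions.

```python
from shapely.geometry import Polygon, box
from shapely.affinity import rotate

def parcel_count(pollution, parcel_w, parcel_h, angle_deg):
    """Count axis-aligned parcels intersecting the polygon rotated by -angle."""
    rotated = rotate(pollution, -angle_deg, origin='centroid')
    minx, miny, maxx, maxy = rotated.bounds
    count = 0
    y = miny
    while y < maxy:
        x = minx
        while x < maxx:
            if rotated.intersects(box(x, y, x + parcel_w, y + parcel_h)):
                count += 1
            x += parcel_w
        y += parcel_h
    return count

def best_orientation(pollution, parcel_w, parcel_h, angles=(0, 45, 90, 135)):
    """Return the tested orientation that needs the fewest clean-up parcels."""
    return min(angles, key=lambda a: parcel_count(pollution, parcel_w, parcel_h, a))

# Example: an elongated polluted strip favours parcels aligned with its axis.
strip = Polygon([(0, 0), (100, 40), (102, 44), (2, 4)])
print(best_orientation(strip, parcel_w=10, parcel_h=10))
```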

  17. A novel three-dimensional mesh deformation method based on sphere relaxation

    International Nuclear Information System (INIS)

    Zhou, Xuan; Li, Shuixiang

    2015-01-01

    In our previous work (2013) [19], we developed a disk-relaxation-based method for two-dimensional mesh deformation. In this paper, the idea of disk relaxation is extended to sphere relaxation for three-dimensional meshes with large deformations. We develop a node-based pre-displacement procedure to apply initial movements to nodes according to their layer indices. Afterwards, the nodes are moved locally by the improved sphere relaxation algorithm to transfer boundary deformations and increase the mesh quality. A three-dimensional mesh smoothing method is also adopted to prevent the occurrence of negative element volumes and further improve the mesh quality. Numerical applications in three dimensions, including wing rotation, a bending beam and a morphing aircraft, are carried out. The results demonstrate that the sphere relaxation based approach generates deformed meshes of high quality, especially regarding complex boundaries and large deformations

  18. A novel three-dimensional mesh deformation method based on sphere relaxation

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Xuan [Department of Mechanics & Engineering Science, College of Engineering, Peking University, Beijing, 100871 (China); Institute of Applied Physics and Computational Mathematics, Beijing, 100094 (China); Li, Shuixiang, E-mail: lsx@pku.edu.cn [Department of Mechanics & Engineering Science, College of Engineering, Peking University, Beijing, 100871 (China)

    2015-10-01

    In our previous work (2013) [19], we developed a disk-relaxation-based method for two-dimensional mesh deformation. In this paper, the idea of disk relaxation is extended to sphere relaxation for three-dimensional meshes with large deformations. We develop a node-based pre-displacement procedure to apply initial movements to nodes according to their layer indices. Afterwards, the nodes are moved locally by the improved sphere relaxation algorithm to transfer boundary deformations and increase the mesh quality. A three-dimensional mesh smoothing method is also adopted to prevent the occurrence of negative element volumes and further improve the mesh quality. Numerical applications in three dimensions, including wing rotation, a bending beam and a morphing aircraft, are carried out. The results demonstrate that the sphere relaxation based approach generates deformed meshes of high quality, especially regarding complex boundaries and large deformations.

  19. A program for assisting automatic generation control of the ELETRONORTE using artificial neural network; Um programa para assistencia ao controle automatico de geracao da Eletronorte usando rede neuronal artificial

    Energy Technology Data Exchange (ETDEWEB)

    Brito Filho, Pedro Rodrigues de; Nascimento Garcez, Jurandyr do [Para Univ., Belem, PA (Brazil). Centro Tecnologico; Charone Junior, Wady [Centrais Eletricas do Nordeste do Brasil S.A. (ELETRONORTE), Belem, PA (Brazil)

    1994-12-31

    This work presents an application of artificial neural networks as a support to decision making in the automatic generation control (AGC) of ELETRONORTE. It uses software to assist the real-time decisions of the AGC. (author) 2 refs., 6 figs., 1 tab.

  20. A novel surface mesh deformation method for handling wing-fuselage intersections

    Directory of Open Access Journals (Sweden)

    Mario Jaime Martin-Burgos

    2017-02-01

    Full Text Available This paper describes a method for mesh adaptation in the presence of intersections, such as wing-fuselage. Automatic optimization tools using Computational Fluid Dynamics (CFD) simulations face the problem of adapting the computational grid to deformations of the boundary surface. When mesh regeneration is not feasible, due to the high cost of building the computational grid, mesh deformation techniques are considered a cheap approach to adapt the mesh to changes in the geometry. Mesh adaptation is a well-known subject in the literature; however, there is very little work that deals with moving intersections. Without a proper treatment of the intersections, the use of automatic optimization methods for aircraft design is limited to individual components. The proposed method takes advantage of the CAD description, which usually comes in the form of Non-Uniform Rational B-Splines (NURBS) patches. This paper describes an algorithm to recalculate the intersection line between two parametric surfaces. Then, the surface mesh is adapted to the moving intersection in parametric coordinates. Finally, the deformation is propagated through the volumetric mesh. The proposed method is tested with the DLR F6 wing-body configuration.

  1. Mersiline mesh in premaxillary augmentation.

    Science.gov (United States)

    Foda, Hossam M T

    2005-01-01

    Premaxillary retrusion may distort the aesthetic appearance of the columella, lip, and nasal tip. This defect is characteristically seen in, but not limited to, patients with cleft lip nasal deformity. This study investigated 60 patients presenting with premaxillary deficiencies in which Mersiline mesh was used to augment the premaxilla. All the cases had surgery using the external rhinoplasty technique. Two methods of augmentation with Mersiline mesh were used: the Mersiline roll technique, for the cases with central symmetric deficiencies, and the Mersiline packing technique, for the cases with asymmetric deficiencies. Premaxillary augmentation with Mersiline mesh proved to be simple technically, easy to perform, and not associated with any complications. Periodic follow-up evaluation for a mean period of 32 months (range, 12-98 months) showed that an adequate degree of premaxillary augmentation was maintained with no clinically detectable resorption of the mesh implant.

  2. Element Partition Trees For H-Refined Meshes to Optimize Direct Solver Performance. Part I: Dynamic Programming

    KAUST Repository

    AbouEisha, Hassan M.

    2017-07-13

    We consider a class of two-and three-dimensional h-refined meshes generated by an adaptive finite element method. We introduce an element partition tree, which controls the execution of the multi-frontal solver algorithm over these refined grids. We propose and study algorithms with polynomial computational cost for the optimization of these element partition trees. The trees provide an ordering for the elimination of unknowns. The algorithms automatically optimize the element partition trees using extensions of dynamic programming. The construction of the trees by the dynamic programming approach is expensive. These generated trees cannot be used in practice, but rather utilized as a learning tool to propose fast heuristic algorithms. In this first part of our paper we focus on the dynamic programming approach, and draw a sketch of the heuristic algorithm. The second part will be devoted to a more detailed analysis of the heuristic algorithm extended for the case of hp-adaptive

  3. Method and system for mesh network embedded devices

    Science.gov (United States)

    Wang, Ray (Inventor)

    2009-01-01

    A method and system for managing mesh network devices. A mesh network device with integrated features creates an N-way mesh network with a full mesh network topology or a partial mesh network topology.

  4. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples.

  5. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-spe...

  6. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming

    2014-05-07

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c, of a function F is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating/sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.

  7. Enhancing physiologic simulations using supervised learning on coarse mesh solutions.

    Science.gov (United States)

    Kolandaivelu, Kumaran; O'Brien, Caroline C; Shazly, Tarek; Edelman, Elazer R; Kolachalama, Vijaya B

    2015-03-06

    Computational modelling of physical and biochemical processes has emerged as a means of evaluating medical devices, offering new insights that explain current performance, inform future designs and even enable personalized use. Yet resource limitations force one to compromise with reduced order computational models and idealized assumptions that yield either qualitative descriptions or approximate, quantitative solutions to problems of interest. Considering endovascular drug delivery as an exemplary scenario, we used a supervised machine learning framework to process data generated from low fidelity coarse meshes and predict high fidelity solutions on refined mesh configurations. We considered two models simulating drug delivery to the arterial wall: (i) two-dimensional drug-coated balloons and (ii) three-dimensional drug-eluting stents. Simulations were performed on computational mesh configurations of increasing density. Supervised learners based on Gaussian process modelling were constructed from combinations of coarse mesh setting solutions of drug concentrations and nearest neighbourhood distance information as inputs, and higher fidelity mesh solutions as outputs. These learners were then used as computationally inexpensive surrogates to extend predictions using low fidelity information to higher levels of mesh refinement. The cross-validated, supervised learner-based predictions improved fidelity as compared with computational simulations performed at coarse level meshes--a result consistent across all outputs and computational models considered. Supervised learning on coarse mesh solutions can augment traditional physics-based modelling of complex physiologic phenomena. By obtaining efficient solutions at a fraction of the computational cost, this framework has the potential to transform how modelling approaches can be applied in the evaluation of medical technologies and their real-time administration in an increasingly personalized fashion.
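
    A minimal sketch of the surrogate idea, assuming a simple feature vector (coarse-mesh value plus a nearest-neighbourhood distance) and stand-in training data; it is not the authors' model, but it shows how a Gaussian-process regressor can map coarse-mesh solutions toward refined-mesh values.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# X: one row per node = [coarse-mesh concentration, distance to nearest wall node]
# y: matching concentration from the refined-mesh solution (training pairs).
# The analytic function below only stands in for real paired simulation data.
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 2))
y_train = np.sin(3 * X_train[:, 0]) * np.exp(-X_train[:, 1])

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

# Predict refined-mesh values (with uncertainty) from new coarse-mesh features
X_new = rng.uniform(0.0, 1.0, size=(5, 2))
y_pred, y_std = gp.predict(X_new, return_std=True)
print(y_pred, y_std)
```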

  8. Adaptive meshes in ecosystem modelling: a way forward?

    Science.gov (United States)

    Popova, E. E.; Ham, D. A.; Srokosz, M. A.; Piggott, M. D.

    2009-04-01

    The need to resolve physical processes occurring on many different length scales has led to the development of ocean flow models based on unstructured and adaptive meshes. However, thus far models of biological processes have been based on fixed, structured grids which lack the ability to dynamically focus resolution on areas of developing small-scale structure. Here we will present the initial results of coupling a four component biological model to the 3D non-hydrostatic, finite element, adaptive grid ocean model ICOM (the Imperial College Ocean Model). Mesh adaptivity automatically resolves fine-scale physical or biological features as they develop, optimising computational cost by reducing resolution where it is not required. Experiments are carried out within the framework of a horizontally uniform water column. The vertical physical processes in the top 500 m are represented by a two equation turbulence model. The physical model is coupled to a four component biological model, which includes generic phytoplankton, zooplankton, nitrate and particulate organic matter (detritus). The physical and biological model is set up to represent idealised oligotrophic conditions, typical of subtropical gyres. A stable annual cycle is achieved after a number of years of integration. We compare results obtained on a fully adaptive mesh with ones using a high resolution static mesh. We assess the computational efficiency of the adaptive approach for modelling of ecosystem processes such as the dynamics of the phytoplankton spring bloom, formation of the subsurface chlorophyll maximum and nutrient supply to the photic zone.
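    For reference, the "four component biological model" mentioned above is of the NPZD (nutrient-phytoplankton-zooplankton-detritus) type; the sketch below is a generic zero-dimensional NPZD box model whose functional forms and parameter values are common textbook choices, not those of the ICOM coupling described here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def npzd(t, y, mu=1.0, kN=0.5, g=0.4, kP=0.5, mP=0.05, mZ=0.05, r=0.1):
    """Generic NPZD right-hand side (units and parameter values are illustrative)."""
    N, P, Z, D = y
    uptake  = mu * N / (kN + N) * P      # nutrient-limited phytoplankton growth
    grazing = g * P / (kP + P) * Z       # zooplankton grazing on phytoplankton
    dN = -uptake + r * D                 # remineralisation of detritus closes the loop
    dP =  uptake - grazing - mP * P
    dZ =  0.7 * grazing - mZ * Z         # 70% assimilation efficiency (assumed)
    dD =  0.3 * grazing + mP * P + mZ * Z - r * D
    return [dN, dP, dZ, dD]

sol = solve_ivp(npzd, (0.0, 360.0), [8.0, 0.1, 0.05, 0.0])  # one idealised year (days)
print(sol.y[:, -1])
```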

  9. Cache-Oblivious Mesh Layouts

    International Nuclear Information System (INIS)

    Yoon, S; Lindstrom, P; Pascucci, V; Manocha, D

    2005-01-01

    We present a novel method for computing cache-oblivious layouts of large meshes that improve the performance of interactive visualization and geometric processing algorithms. Assuming only that the mesh is accessed in a reasonably coherent manner, we make no assumptions about particular data access patterns or the cache parameters of the memory hierarchy involved in the computation. Furthermore, our formulation extends directly to computing layouts of multi-resolution and bounding volume hierarchies of large meshes. We develop a simple and practical cache-oblivious metric for estimating cache misses. Computing a coherent mesh layout is reduced to a combinatorial optimization problem. We designed and implemented an out-of-core multilevel minimization algorithm and tested its performance on unstructured meshes composed of tens to hundreds of millions of triangles. Our layouts can significantly reduce the number of cache misses. We have observed 2-20 times speedups in view-dependent rendering, collision detection, and isocontour extraction without any modification of the algorithms or runtime applications.

  10. Tetrahedral meshing via maximal Poisson-disk sampling

    KAUST Repository

    Guo, Jianwei

    2016-02-15

    In this paper, we propose a simple yet effective method to generate 3D-conforming tetrahedral meshes from closed 2-manifold surfaces. Our approach is inspired by recent work on maximal Poisson-disk sampling (MPS), which can generate well-distributed point sets in arbitrary domains. We first perform MPS on the boundary of the input domain, we then sample the interior of the domain, and we finally extract the tetrahedral mesh from the samples by using 3D Delaunay or regular triangulation for uniform or adaptive sampling, respectively. We also propose an efficient optimization strategy to protect the domain boundaries and to remove slivers to improve the meshing quality. We present various experimental results to illustrate the efficiency and the robustness of our proposed approach. We demonstrate that the performance and quality (e.g., minimal dihedral angle) of our approach are superior to current state-of-the-art optimization-based approaches.
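    A toy illustration of the interior-sampling step, assuming a unit-cube domain and naive dart throwing rather than the provably maximal scheme of the paper (the stopping rule below only approximates maximality). The accepted samples could then be handed to a Delaunay triangulation, e.g. scipy.spatial.Delaunay, mirroring the pipeline sketched in the abstract.

```python
import numpy as np

def poisson_disk_dart_throwing(r, max_consecutive_misses=5000, seed=0):
    """Accept a random candidate only if it keeps distance >= r from all accepted
    samples; stop after many consecutive rejections (approximate maximality)."""
    rng = np.random.default_rng(seed)
    samples, misses = [], 0
    while misses < max_consecutive_misses:
        p = rng.random(3)
        if all(np.linalg.norm(p - q) >= r for q in samples):
            samples.append(p)
            misses = 0
        else:
            misses += 1
    return np.asarray(samples)

pts = poisson_disk_dart_throwing(0.15)
print(len(pts), "interior samples with minimum spacing 0.15")
```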

  11. Finite element modeling of the human kidney for probabilistic occupant models: Statistical shape analysis and mesh morphing.

    Science.gov (United States)

    Yates, Keegan M; Untaroiu, Costin D

    2018-04-16

    Statistical shape analysis was conducted on 15 pairs (left and right) of human kidneys. It was shown that the left and right kidney were significantly different in size and shape. In addition, several common modes of kidney variation were identified using statistical shape analysis. Semi-automatic mesh morphing techniques have been developed to efficiently create subject specific meshes from a template mesh with a similar geometry. Subject specific meshes as well as probabilistic kidney meshes were created from a template mesh. Mesh quality remained about the same as the template mesh while only taking a fraction of the time to create the mesh from scratch or morph with manually identified landmarks. This technique can help enhance the quality of information gathered from experimental testing with subject specific meshes as well as help to more efficiently predict injury by creating models with the mean shape as well as models at the extremes for each principal component. Copyright © 2018 Elsevier Ltd. All rights reserved.
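    A compact sketch of the statistical-shape-analysis step on corresponding landmarks (the landmark counts and coordinates below are random stand-ins, not the registered kidney surfaces of the study): PCA of the stacked coordinates yields the mean shape and the principal modes, and shapes at the extremes of a mode are obtained by moving along it, which is what drives the template-to-subject morphing described above.

```python
import numpy as np

rng = np.random.default_rng(2)
n_subjects, n_landmarks = 15, 300
# One row per subject: flattened (x1, y1, z1, x2, y2, z2, ...) landmark coordinates.
shapes = rng.random((n_subjects, n_landmarks * 3))

mean_shape = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
modes = Vt                                  # principal modes of shape variation
variance = s**2 / (n_subjects - 1)          # variance explained by each mode

# Shape at +2 standard deviations along the first mode (an "extreme" model).
extreme = mean_shape + 2.0 * np.sqrt(variance[0]) * modes[0]
print(extreme.reshape(n_landmarks, 3)[:2])
```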

  12. A skinning prediction scheme for dynamic 3D mesh compression

    Science.gov (United States)

    Mamou, Khaled; Zaharia, Titus; Prêteux, Françoise

    2006-08-01

    This paper presents a new prediction-based compression technique for dynamic 3D meshes with constant connectivity and time-varying geometry. The core of the proposed algorithm is a skinning model used for motion compensation. The mesh is first partitioned into vertex clusters that can be described by a single affine motion model. The proposed segmentation technique automatically determines the number of clusters and relies on a decimation strategy privileging the simplification of vertices exhibiting the same affine motion over the whole animation sequence. The residual prediction errors are finally compressed using a temporal-DCT representation. The performance of our encoder is objectively evaluated on a data set of eight animation sequences with various sizes, geometries and topologies, and exhibiting both rigid and elastic motions. The experimental evaluation shows that the proposed compression scheme outperforms state of the art techniques such as MPEG-4/AFX, Dynapack, RT, GV, MCGV, TDCT, PCA and RT compression schemes.
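    A small sketch of the per-cluster affine motion model underlying the skinning-based prediction (the cluster segmentation and the temporal DCT of the residuals are not shown, and all data below are synthetic): the best-fit affine map from a cluster's reference positions to its current positions is obtained by linear least squares and used for motion compensation.

```python
import numpy as np

def fit_affine(ref_pts, cur_pts):
    """Least-squares affine map (3x3 linear part plus translation, as a 4x3 matrix)
    taking reference vertex positions to current positions."""
    X = np.hstack([ref_pts, np.ones((len(ref_pts), 1))])   # homogeneous coordinates
    M, *_ = np.linalg.lstsq(X, cur_pts, rcond=None)
    return M

rng = np.random.default_rng(3)
ref = rng.random((50, 3))                                   # one cluster, reference frame
A = np.array([[1.0, 0.1, 0.0], [0.0, 0.9, 0.0], [0.0, 0.0, 1.1]])
cur = ref @ A + np.array([0.2, -0.1, 0.05])                 # same cluster, current frame

M = fit_affine(ref, cur)
pred = np.hstack([ref, np.ones((50, 1))]) @ M               # motion-compensated prediction
print(np.abs(pred - cur).max())                             # residual to be DCT-encoded
```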

  13. Issues in adaptive mesh refinement

    Energy Technology Data Exchange (ETDEWEB)

    Dai, William Wenlong [Los Alamos National Laboratory

    2009-01-01

    In this paper, we present an approach for a patch-based adaptive mesh refinement (AMR) for multi-physics simulations. The approach consists of clustering, symmetry preserving, mesh continuity, flux correction, communications, and management of patches. Among the special features of this patch-based AMR are symmetry preserving, efficiency of refinement, special implementation of flux correction, and patch management in parallel computing environments. Here, higher efficiency of refinement means fewer unnecessarily refined cells for a given set of cells to be refined. To demonstrate the capability of the AMR framework, hydrodynamics simulations with many levels of refinement are shown in both two and three dimensions.

  14. Almost optimal distributed M2M multicasting in wireless mesh networks

    DEFF Research Database (Denmark)

    Xin, Qin; Manne, Fredrik; Zhang, Yan

    2012-01-01

    Wireless Mesh Networking (WMN) is an emerging communication paradigm to enable resilient, cost-efficient and reliable services for the future-generation wireless networks. In this paper, we study the problem of multipoint-to- multipoint (M2M) multicasting in a WMN which aims to use the minimum...... number of time slots to exchange messages among a group of k mesh nodes in a multi-hop WMN with n mesh nodes. We study the M2M multicasting problem in a distributed environment where each participant only knows that there are k participants and it does not know who are other k-1 participants among n mesh...

  15. THM-GTRF: New Spider meshes, New Hydra-TH runs

    Energy Technology Data Exchange (ETDEWEB)

    Bakosi, Jozsef [Los Alamos National Laboratory; Christon, Mark A. [Los Alamos National Laboratory; Francois, Marianne M. [Los Alamos National Laboratory; Lowrie, Robert B. [Los Alamos National Laboratory; Nourgaliev, Robert [Los Alamos National Laboratory

    2012-06-20

    Progress is reported on computational capabilities for the grid-to-rod-fretting (GTRF) problem of pressurized water reactors. Numeca's Hexpress/Hybrid mesh generator is demonstrated as an excellent alternative to generating computational meshes for complex flow geometries, such as in GTRF. Mesh assessment is carried out using standard industrial computational fluid dynamics practices. Hydra-TH, a simulation code developed at LANL for reactor thermal-hydraulics, is demonstrated on hybrid meshes, containing different element types. A series of new Hydra-TH calculations has been carried out collecting turbulence statistics. Preliminary results on the newly generated meshes are discussed; full analysis will be documented in the L3 milestone, THM.CFD.P5.05, Sept. 2012.

  16. A Linear-Elasticity Solver for Higher-Order Space-Time Mesh Deformation

    Science.gov (United States)

    Diosady, Laslo T.; Murman, Scott M.

    2018-01-01

    A linear-elasticity approach is presented for the generation of meshes appropriate for a higher-order space-time discontinuous finite-element method. The equations of linear-elasticity are discretized using a higher-order, spatially-continuous, finite-element method. Given an initial finite-element mesh, and a specified boundary displacement, we solve for the mesh displacements to obtain a higher-order curvilinear mesh. Alternatively, for moving-domain problems we use the linear-elasticity approach to solve for a temporally discontinuous mesh velocity on each time-slab and recover a continuous mesh deformation by integrating the velocity. The applicability of this methodology is presented for several benchmark test cases.

  17. Multigrid for refined triangle meshes

    Energy Technology Data Exchange (ETDEWEB)

    Shapira, Yair

    1997-02-01

    A two-level preconditioning method for the solution of (locally) refined finite element schemes using triangle meshes is introduced. In the isotropic SPD case, it is shown that the condition number of the preconditioned stiffness matrix is bounded uniformly for all sufficiently regular triangulations. This is also verified numerically for an isotropic diffusion problem with highly discontinuous coefficients.

  18. Reactor physics verification of the MCNP6 unstructured mesh capability

    International Nuclear Information System (INIS)

    Burke, T. P.; Kiedrowski, B. C.; Martz, R. L.; Martin, W. R.

    2013-01-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computed-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  19. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    International Nuclear Information System (INIS)

    1995-01-01

    As a continuation of its effort to provide comprehensive and impartial guidance to Member States facing the need for introducing nuclear power, the IAEA has completed a new version of the Wien Automatic System Planning (WASP) Package for carrying out power generation expansion planning studies. WASP was originally developed in 1972 in the USA to meet the IAEA's needs to analyze the economic competitiveness of nuclear power in comparison to other generation expansion alternatives for supplying the future electricity requirements of a country or region. The model was first used by the IAEA to conduct global studies (Market Survey for Nuclear Power Plants in Developing Countries, 1972-1973) and to carry out Nuclear Power Planning Studies for several Member States. The WASP system developed into a very comprehensive planning tool for electric power system expansion analysis. Following these developments, the so-called WASP-III version was produced in 1979. This version introduced important improvements to the system, namely in the treatment of hydroelectric power plants. The WASP-III version has been continually updated and maintained in order to incorporate needed enhancements. In 1981, the Model for Analysis of Energy Demand (MAED) was developed in order to allow the determination of electricity demand, consistent with the overall requirements for final energy, and thus, to provide a more adequate forecast of electricity needs to be considered in the WASP study. MAED and WASP have been used by the Agency for the conduct of Energy and Nuclear Power Planning Studies for interested Member States. More recently, the VALORAGUA model was completed in 1992 as a means for helping in the preparation of the hydro plant characteristics to be input in the WASP study and to verify that the WASP overall optimized expansion plan takes also into account an optimization of the use of water for electricity generation. The combined application of VALORAGUA and WASP permits the

  20. On the Mesh Array for Matrix Multiplication

    OpenAIRE

    Kak, Subhash

    2010-01-01

    This article presents new properties of the mesh array for matrix multiplication. In contrast to the standard array that requires 3n-2 steps to complete its computation, the mesh array requires only 2n-1 steps. Symmetries of the mesh array computed values are presented which enhance the efficiency of the array for specific applications. In multiplying symmetric matrices, the results are obtained in 3n/2+1 steps. The mesh array is examined for its application as a scrambling system.

  1. The mesh network protocol evaluation and development

    OpenAIRE

    Pei, Ping; Petrenko, Y. N.

    2015-01-01

    In this paper, we introduce the evaluation and development of mesh network protocols. Mesh networking has its own special protocols, and we can easily see how different protocols are used in a mesh network. In addition, multi-hop routing protocols can provide robustness and load balancing for communication in wireless mesh networks.

  2. Mesh network achieve its fuction on Linux

    OpenAIRE

    Pei, Ping; Petrenko, Y. N.

    2015-01-01

    In this paper, we introduce the evaluation and development of mesh network protocols on Linux. We can easily understand the Linux operating principles that are in use in the mesh network. In addition, we describe the graph that shows the packet routing path. Finally, according to our testing, we show that the mesh protocol AODV satisfies the performance requirements of the Linux platform.

  3. Development of a automatic positioning system of photovoltaic panels for electric energy generation; Desenvolvimento de um sistema de posicionamento automatico de placas fotovoltaicas para a geracao de energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Alceu F.; Cagnon, Odivaldo Jose [Universidade Estadual Paulista (DEE/FEB/UNESP), Bauru, SP (Brazil). Fac. de Engenharia. Dept. de Engenharia Eletrica; Seraphin, Odivaldo Jose [Universidade Estadual Paulista (DER/FCA/UNESP), Botucatu, SP (Brazil). Fac. de Ciencias Agronomicas. Dept. de Engenharia Rural

    2008-07-01

    This work presents an automatic positioning system for photovoltaic panels, in order to improve the conversion of solar energy to electric energy. A prototype with automatic movement was developed, and its efficiency in generating electric energy was compared to another one with the same characteristics but fixed in space. Preliminary results point to a significant increase in efficiency, obtained from a simplified movement process in which sensors are not used to determine the apparent position of the sun; instead, the relative Sun-Earth position equations are used. An innovative mechanical movement system is also presented, using two stepper motors to move the panel along two axes with independent movement, which contributes to saving energy during positioning. The use of the proposed system in rural areas is suggested. (author)

  4. Surface Mesh Reconstruction from Cardiac MRI Contours

    Directory of Open Access Journals (Sweden)

    Benjamin Villard

    2018-01-01

    We introduce a tool to build a surface mesh able to deal with sparse, heterogeneous, non-parallel, cross-sectional, non-coincidental contours and show its application to reconstruct surfaces of the heart. In recent years, much research has looked at creating personalised 3D anatomical models of the heart. These models usually incorporate a geometrical reconstruction of the anatomy in order to better understand cardiovascular functions as well as predict different cardiac processes. As MRIs are becoming the standard for cardiac medical imaging, we tested our methodology on cardiac MRI data from standard acquisitions. However, the ability to accurately reconstruct heart anatomy in three dimensions commonly comes with fundamental challenges—notably, the trade-off between data fitting and expected visual appearance. Most current techniques can either require contours from parallel slices or, if multiple slice orientations are used, require an exact match between these contours. In addition, some methods introduce a bias by the use of prior shape models or by trade-offs between the data matching terms and the smoothing terms. Our approach uses a composition of smooth approximations towards the maximization of the data fitting, ensuring a good matching to the input data as well as pleasant interpolation characteristics. To assess our method in the task of cardiac mesh generation, we evaluated its performance on synthetic data obtained from a cardiac statistical shape model as well as on real data. Using a statistical shape model, we simulated standard cardiac MRI acquisition planes and contour data. We performed a multi-parameter evaluation study using plausible cardiac shapes generated from the model. We also show that long-axis contours as well as the most extremal slices (basal and apical) contain the greatest amount of structural information, and thus should be taken into account when generating anatomically relevant geometrical cardiovascular

  5. Voltammetry at micro-mesh electrodes

    Directory of Open Access Journals (Sweden)

    Wadhawan Jay D.

    2003-01-01

    The voltammetry at three micro-mesh electrodes is explored. It is found that at sufficiently short experimental durations, the micro-mesh working electrode first behaves as an ensemble of microband electrodes, then follows the behaviour anticipated for an array of diffusion-independent micro-ring electrodes of the same perimeter as individual grid-squares within the mesh. During prolonged electrolysis, the micro-mesh electrode follows that behaviour anticipated theoretically for a cubically-packed partially-blocked electrode. Application of the micro-mesh electrode for the electrochemical determination of carbon dioxide in DMSO electrolyte solutions is further illustrated.

  6. On the flexibility of Kokotsakis meshes

    OpenAIRE

    Karpenkov, Oleg

    2008-01-01

    In this paper we study geometric, algebraic, and computational aspects of flexibility and infinitesimal flexibility of Kokotsakis meshes. A Kokotsakis mesh is a mesh that consists of a face in the middle and a certain band of faces attached to the middle face by its perimeter. In particular any 3x3-mesh made of quadrangles is a Kokotsakis mesh. We express the infinitesimal flexibility condition in terms of Ceva and Menelaus theorems. Further we study semi-algebraic properties of the set of fl...

  7. Adaptive Mesh Refinement in CTH

    International Nuclear Information System (INIS)

    Crawford, David

    1999-01-01

    This paper reports progress on implementing a new capability of adaptive mesh refinement into the Eulerian multimaterial shock-physics code CTH. The adaptivity is block-based with refinement and unrefinement occurring in an isotropic 2:1 manner. The code is designed to run on serial, multiprocessor and massively parallel platforms. An approximate factor of three in memory and performance improvements over comparable resolution non-adaptive calculations has been demonstrated for a number of problems.

  8. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Directory of Open Access Journals (Sweden)

    Jan Wieding

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery solid modelling of screws and drill holes represents a limitation for individual cases and an increase in computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve their application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements as well as screws as structural elements. The latter one offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for the automatic generation in the FE-software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling the computational time could be reduced by 85% using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with

  9. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Science.gov (United States)

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery solid modelling of screws and drill holes represents a limitation for individual cases and an increase in computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve their application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements as well as screws as structural elements. The latter one offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for the automatic generation in the FE-software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling the computational time could be reduced by 85% using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  10. What Information Does Your EHR Contain? Automatic Generation of a Clinical Metadata Warehouse (CMDW) to Support Identification and Data Access Within Distributed Clinical Research Networks.

    Science.gov (United States)

    Bruland, Philipp; Doods, Justin; Storck, Michael; Dugas, Martin

    2017-01-01

    Data dictionaries provide structural meta-information about data definitions in health information technology (HIT) systems. In this regard, reusing healthcare data for secondary purposes offers several advantages (e.g. reduce documentation times or increased data quality). Prerequisites for data reuse are its quality, availability and identical meaning of data. In diverse projects, research data warehouses serve as core components between heterogeneous clinical databases and various research applications. Given the complexity (high number of data elements) and dynamics (regular updates) of electronic health record (EHR) data structures, we propose a clinical metadata warehouse (CMDW) based on a metadata registry standard. Metadata of two large hospitals were automatically inserted into two CMDWs containing 16,230 forms and 310,519 data elements. Automatic updates of metadata are possible as well as semantic annotations. A CMDW allows metadata discovery, data quality assessment and similarity analyses. Common data models for distributed research networks can be established based on similarity analyses.

  11. Link Prediction on a Network of Co-occurring MeSH Terms: Towards Literature-based Discovery.

    Science.gov (United States)

    Kastrin, Andrej; Rindflesch, Thomas C; Hristovski, Dimitar

    2016-08-05

    Literature-based discovery (LBD) is a text mining methodology for automatically generating research hypotheses from existing knowledge. We mimic the process of LBD as a classification problem on a graph of MeSH terms. We employ unsupervised and supervised link prediction methods for predicting previously unknown connections between biomedical concepts. We evaluate the effectiveness of link prediction through a series of experiments using a MeSH network that contains the history of link formation between biomedical concepts. We performed link prediction using proximity measures, such as common neighbor (CN), Jaccard coefficient (JC), Adamic / Adar index (AA) and preferential attachment (PA). Our approach relies on the assumption that similar nodes are more likely to establish a link in the future. Applying an unsupervised approach, the AA measure achieved the best performance in terms of area under the ROC curve (AUC = 0.76), followed by CN, JC, and PA. In a supervised approach, we evaluate whether proximity measures can be combined to define a model of link formation across all four predictors. We applied various classifiers, including decision trees, k-nearest neighbors, logistic regression, multilayer perceptron, naïve Bayes, and random forests. Random forest classifier accomplishes the best performance (AUC = 0.87). The link prediction approach proved to be effective for LBD processing. Supervised statistical learning approaches clearly outperform an unsupervised approach to link prediction.
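    A toy version of the unsupervised proximity-measure step using NetworkX on an invented co-occurrence graph (the node names are placeholders, not actual MeSH terms): each candidate pair receives common-neighbour, Jaccard, Adamic/Adar and preferential-attachment scores, which the supervised variant would feed as features into a classifier such as a random forest.

```python
import networkx as nx

G = nx.Graph()
G.add_edges_from([("aspirin", "headache"), ("aspirin", "inflammation"),
                  ("ibuprofen", "inflammation"), ("ibuprofen", "headache"),
                  ("magnesium", "headache")])

pairs = [("aspirin", "ibuprofen"), ("magnesium", "inflammation")]

cn = {(u, v): len(list(nx.common_neighbors(G, u, v))) for u, v in pairs}
jc = {(u, v): p for u, v, p in nx.jaccard_coefficient(G, pairs)}
aa = {(u, v): p for u, v, p in nx.adamic_adar_index(G, pairs)}
pa = {(u, v): p for u, v, p in nx.preferential_attachment(G, pairs)}

for pair in pairs:
    print(pair, "CN:", cn[pair], "JC:", round(jc[pair], 3),
          "AA:", round(aa[pair], 3), "PA:", pa[pair])
```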

  12. Automatic selective feature retention in patient specific elastic surface registration

    CSIR Research Space (South Africa)

    Jansen van Rensburg, GJ

    2011-01-01

    An intelligent mesh morphing strategy where dissimilar feature surfaces can be extracted automatically also greatly reduces the amount of user input required.

  13. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  14. Mesh Intercomparisons of Fog Water Collected Yield Insight Into the Nature of Fog-Drip Collection Mechanisms

    Science.gov (United States)

    Fernandez, D.; Torregrosa, A.; Weiss-Penzias, P. S.; Oliphant, A. J.; Dodge, C.; Bowman, M.; Wilson, S.; Mairs, A. A.; Gravelle, M.; Barkley, T.

    2016-12-01

    At multiple sites across central CA, several passive fog water collectors have been deployed for the past 3 years. All of the sites employ standard Raschel polypropylene mesh as the fog collection medium, and five of them also integrated a novel polypropylene mesh of German manufacture with a 3-dimensional internal structure. Additionally, six metal meshes of various hole sizes, manufactured by McMaster-Carr, were coated with a POSS-PEMA substance at the Massachusetts Institute of Technology and deployed in parallel with the Raschel mesh at six distinct locations. Finally, fluorine-free versions of the POSS-PEMA substance were generated by NBD Nanotechnology and coated on a much finer mesh substrate. Three of those and one control (uncoated mesh) were deployed at one of the fog collection sites for one season, along with a standard Raschel mesh. Preliminary results from an intercomparison of just one pair of meshes over two seasons suggest a wind-speed and, possibly, a droplet-size dependence of the fog collection efficiency of the mesh. This study will continue to intercompare the various meshes in conjunction with the wind speed and direction data. If a collection efficiency dependence on mesh size or coating is confirmed, it may point to interesting and relevant mechanisms for fog droplet capture and collection hitherto unobserved in field conditions.

  15. TESS: A RELATIVISTIC HYDRODYNAMICS CODE ON A MOVING VORONOI MESH

    International Nuclear Information System (INIS)

    Duffell, Paul C.; MacFadyen, Andrew I.

    2011-01-01

    We have generalized a method for the numerical solution of hyperbolic systems of equations using a dynamic Voronoi tessellation of the computational domain. The Voronoi tessellation is used to generate moving computational meshes for the solution of multidimensional systems of conservation laws in finite-volume form. The mesh-generating points are free to move with arbitrary velocity, with the choice of zero velocity resulting in an Eulerian formulation. Moving the points at the local fluid velocity makes the formulation effectively Lagrangian. We have written the TESS code to solve the equations of compressible hydrodynamics and magnetohydrodynamics for both relativistic and non-relativistic fluids on a dynamic Voronoi mesh. When run in Lagrangian mode, TESS is significantly less diffusive than fixed mesh codes and thus preserves contact discontinuities to high precision while also accurately capturing strong shock waves. TESS is written for Cartesian, spherical, and cylindrical coordinates and is modular so that auxiliary physics solvers are readily integrated into the TESS framework and so that this can be readily adapted to solve general systems of equations. We present results from a series of test problems to demonstrate the performance of TESS and to highlight some of the advantages of the dynamic tessellation method for solving challenging problems in astrophysical fluid dynamics.
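    A minimal illustration of the moving-mesh idea using SciPy (only the regeneration of the Voronoi tessellation from moving generator points is shown; the finite-volume update across moving cell faces performed by TESS is omitted, and the velocity field below is an arbitrary stand-in for the local fluid velocity).

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(6)
points = rng.random((50, 2))                  # mesh-generating points

def velocity(p):
    """Stand-in 'local fluid velocity': solid-body rotation about the centre."""
    return 0.05 * np.column_stack([-(p[:, 1] - 0.5), p[:, 0] - 0.5])

dt = 0.1
for step in range(3):
    vor = Voronoi(points)                     # cells used for the finite-volume update
    points = points + dt * velocity(points)   # Lagrangian motion of the generators
    print("step", step, "cells:", len(vor.point_region))
```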

  16. Contours, 2 foot contours automatically generated from 2008 LIDAR for the purpose of supporting FEMA floodplain mapping. Limited manual editing; breaklines for waterbodies greater than 5 acres created and used. 10' index contours labeled. Published in 2008, 1:1200 (1 in = 100 ft) scale, City of Portage Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Contours dataset current as of 2008. 2 foot contours automatically generated from 2008 LIDAR for the purpose of supporting FEMA floodplain mapping. Limited manual...

  17. Synthetic Versus Biological Mesh-Related Erosion After Laparoscopic Ventral Mesh Rectopexy: A Systematic Review.

    Science.gov (United States)

    Balla, Andrea; Quaresima, Silvia; Smolarek, Sebastian; Shalaby, Mostafa; Missori, Giulia; Sileri, Pierpaolo

    2017-04-01

    This review reports the incidence of mesh-related erosion after ventral mesh rectopexy to determine whether any difference exists in the erosion rate between synthetic and biological mesh. A systematic search of the MEDLINE and the Ovid databases was conducted to identify suitable articles published between 2004 and 2015. The search strategy capture terms were laparoscopic ventral mesh rectopexy, laparoscopic anterior rectopexy, robotic ventral rectopexy, and robotic anterior rectopexy. Eight studies (3,956 patients) were included in this review. Of those patients, 3,517 patients underwent laparoscopic ventral rectopexy (LVR) using synthetic mesh and 439 using biological mesh. Sixty-six erosions were observed with synthetic mesh (26 rectal, 32 vaginal, 8 recto-vaginal fistulae) and one (perineal erosion) with biological mesh. The synthetic and the biological mesh-related erosion rates were 1.87% and 0.22%, respectively. The time between rectopexy and diagnosis of mesh erosion ranged from 1.7 to 124 months. No mesh-related mortalities were reported. The incidence of mesh-related erosion after LVR is low and is more common after the placement of synthetic mesh. The use of biological mesh for LVR seems to be a safer option; however, large, multicenter, randomized, control trials with long follow-ups are required if a definitive answer is to be obtained.

  18. Laparoscopic appendicectomy for suspected mesh-induced appendicitis after laparoscopic transabdominal preperitoneal polypropylene mesh inguinal herniorraphy

    Directory of Open Access Journals (Sweden)

    Jennings Jason

    2010-01-01

    Laparoscopic inguinal herniorraphy via a transabdominal preperitoneal (TAPP) approach using polypropylene mesh (Mesh) and staples is an accepted technique. Mesh induces a localised inflammatory response that may extend to, and involve, adjacent abdominal and pelvic viscera such as the appendix. We present an interesting case of suspected Mesh-induced appendicitis treated successfully with laparoscopic appendicectomy, without Mesh removal, in an elderly gentleman who presented with symptoms and signs of acute appendicitis 18 months after laparoscopic inguinal hernia repair. Possible mechanisms for Mesh-induced appendicitis are briefly discussed.

  19. An interoperable standard system for the automatic generation and publication of the fire risk maps based on Fire Weather Index (FWI)

    Science.gov (United States)

    Julià Selvas, Núria; Ninyerola Casals, Miquel

    2015-04-01

    An automatic system has been implemented to predict the fire risk in the Principality of Andorra, a small country located in the eastern Pyrenees mountain range and bordered by Catalonia and France; owing to its location, its landscape is a set of rugged mountains with an average elevation around 2000 meters. The system is based on the Fire Weather Index (FWI), which consists of different components, each one measuring a different aspect of the fire danger, calculated from the values of the weather variables at midday. CENMA (Centre d'Estudis de la Neu i de la Muntanya d'Andorra) has a network of around 10 automatic meteorological stations, located in different places, peaks and valleys, that measure weather data like relative humidity, wind direction and speed, surface temperature, rainfall and snow cover every ten minutes; these data are sent daily and automatically to the implemented system, where they are processed to filter incorrect measurements and to homogenize measurement units. These data are then used to calculate all components of the FWI at midday at the level of each station, creating a database with the values of the homogeneous measurements and the FWI components for each weather station. In order to extend and model these data over the whole Andorran territory and to obtain a continuous map, an interpolation method based on a multiple regression with spline residual interpolation has been implemented. This interpolation considers the FWI data as well as other relevant predictors such as latitude, altitude, global solar radiation and distance to the sea. The obtained values (maps) are validated using leave-one-out cross-validation. The discrete and continuous maps are rendered as tiled raster maps and published in a web portal conforming to the Web Map Service (WMS) standard of the Open Geospatial Consortium (OGC). Metadata and other reference maps (fuel maps, topographic maps, etc.) are also available from this geoportal.
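    A rough sketch of the "multiple regression with spline residual interpolation" step on synthetic station data (the predictor names follow the abstract, but all values are invented and the thin-plate-spline RBF interpolator used for the residuals is an assumption, not a detail of the operational system): a linear trend on the predictors is fitted first, and the spatially interpolated residuals are added back on the target grid.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(4)
n_stations = 10
X_sta = rng.random((n_stations, 4))     # [altitude, latitude, radiation, sea distance]
xy_sta = rng.random((n_stations, 2))    # station map coordinates
fwi_sta = 20 + 15 * X_sta[:, 0] + rng.normal(0, 2, n_stations)   # station-level FWI

trend = LinearRegression().fit(X_sta, fwi_sta)                    # regression on predictors
residual_interp = RBFInterpolator(xy_sta, fwi_sta - trend.predict(X_sta),
                                  smoothing=1e-6)                 # residual surface

# Evaluate on target grid cells whose predictors and coordinates are known.
X_grid, xy_grid = rng.random((5, 4)), rng.random((5, 2))
fwi_grid = trend.predict(X_grid) + residual_interp(xy_grid)
print(fwi_grid)
# Leave-one-out validation repeats the fit, each time holding one station out and
# comparing the prediction at its location with the observed value.
```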

  20. Method and system for progressive mesh storage and reconstruction using wavelet-encoded height fields

    Science.gov (United States)

    Baxes, Gregory A. (Inventor); Linger, Timothy C. (Inventor)

    2011-01-01

    Systems and methods are provided for progressive mesh storage and reconstruction using wavelet-encoded height fields. A method for progressive mesh storage includes reading raster height field data, and processing the raster height field data with a discrete wavelet transform to generate wavelet-encoded height fields. In another embodiment, a method for progressive mesh storage includes reading texture map data, and processing the texture map data with a discrete wavelet transform to generate wavelet-encoded texture map fields. A method for reconstructing a progressive mesh from wavelet-encoded height field data includes determining terrain blocks, and a level of detail required for each terrain block, based upon a viewpoint. Triangle strip constructs are generated from vertices of the terrain blocks, and an image is rendered utilizing the triangle strip constructs. Software products that implement these methods are provided.
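    An illustrative sketch of wavelet-encoding a raster height field with PyWavelets (the patent's actual codec, file layout and triangle-strip rendering are not reproduced; zeroing the detail bands merely stands in for not transmitting them at a coarse level of detail).

```python
import numpy as np
import pywt

height_field = np.random.default_rng(5).random((256, 256))

# Multi-level 2-D discrete wavelet transform of the raster height field.
coeffs = pywt.wavedec2(height_field, wavelet="haar", level=4)

# Coarse level of detail: keep the approximation band, drop the detail bands.
coarse = [coeffs[0]] + [tuple(np.zeros_like(c) for c in band) for band in coeffs[1:]]
coarse_field = pywt.waverec2(coarse, wavelet="haar")
full_field = pywt.waverec2(coeffs, wavelet="haar")

print(np.abs(full_field - height_field).max())     # ~0: full detail reconstructs exactly
print(np.abs(coarse_field - height_field).mean())  # approximation error at the coarse LOD
```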

  1. Short term post-operative morphing of sacrocolpopexy mesh measured by magnetic resonance imaging.

    Science.gov (United States)

    Sindhwani, Nikhil; Callewaert, Geertje; Deprest, Thomas; Housmans, Susanne; Van Beckevoort, Dirk; Deprest, Jan

    2018-04-01

    Sacrocolpopexy (SC) involves suspension of the vaginal vault or cervix to the sacrum using a mesh. Following insertion, the meshes have been observed to have undergone dimensional changes. To quantify dimensional changes of meshes following implantation and characterize their morphology in vivo. 24 patients underwent SC using PolyVinyliDeneFluoride mesh loaded with Fe3O4 particles. Tailored anterior and posterior mesh flaps were sutured to the respective vaginal walls, uniting at the apex. The posterior flap continued to the sacrum and was attached there. Meshes were visualized on magnetic resonance (MR) imaging at 12 [3-12] (median [range]) months postoperatively and 3D models of the mesh were generated. Dynamic MR sequences were acquired during valsalva to record mesh mobility. The area of the vagina effectively supported by the mesh (Effective Support Area (ESA)) was calculated. The 3D models' wall thickness map was analyzed to identify the locations of mesh folding. Intraclass correlation (ICC) was calculated to test the reliability of the methods. To measure the laxity and flatness of the mesh, the curvature and the ellipticity of the sacral flap were calculated. The ESA calculation methodology had ICC = 0.97. A reduction of 75.49 [61.55-78.67] % (median [IQR]) in area, 47.64 [38.07-59.81] % in anterior flap, and of 23.95 [10.96-27.21] % in the posterior flap was measured. The mesh appeared thicker near its attachment at the sacral promontory (n = 19) and near the vaginal apex (n = 22). The laxity of the mesh was 1.13 [1.10-1.16] and 60.55 [49.76-76.25] % of the sacral flap was flat. We could not reliably measure mesh mobility (ICC = 0.16). A methodology for complete 3D characterization of SC meshes using MR images was presented. After implantation, the supported area is much lower than what is prepared prior to implantation. We propose this happened during the surgery itself. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Evaluation of mechanical properties in metal wire mesh supported selective catalytic reduction (SCR) catalyst structures

    Science.gov (United States)

    Rajath, S.; Siddaraju, C.; Nandakishora, Y.; Roy, Sukumar

    2018-04-01

    The objective of this research is to evaluate specific mechanical properties of stainless steel wire mesh supported selective catalytic reduction (SCR) catalyst structures, in which the physical properties of the metal wire mesh and its surface treatments play a vital role and thereby influence the mechanical properties. As the adhesion between the stainless steel wire mesh and the catalyst material determines the bond strength and the erosion resistance of the catalyst structures, surface modification of the metal wire mesh to facilitate interface bonding is very important for achieving an enhanced level of mechanical properties. One way to enhance such adhesion is to treat the stainless steel wire mesh with various acids, i.e., chromic acid, phosphoric acid and certain mineral acids, and combinations of all of these in various molar ratios, which can generate surface-active groups on the metal surface that promote a good interface structure between the metal wire mesh and the metal-oxide-based catalyst material; the stainless steel wire mesh is then dipped in a glass powder slurry containing some amount of organic binder. As a result, the catalyst material adheres to the metal wire mesh surface more effectively, which improves the erosion profile of the supported catalyst structure, including its bond strength.

  3. Assessment of the anti-biofouling potentials of a copper iodide-doped nylon mesh.

    Science.gov (United States)

    Sato, Tetsuya; Fujimori, Yoshie; Nakayama, Tsuruo; Gotoh, Yasuo; Sunaga, Yoshihiko; Nemoto, Michiko; Matsunaga, Tadashi; Tanaka, Tsuyoshi

    2012-08-01

    We propose a copper iodide (CuI)-doped nylon mesh, prepared using polyiodide ions as a precursor, as an anti-biofouling polymer textile. The CuI-doped nylon mesh was tested for the prevention of biofouling in marine environments. The attachment of marine organisms was markedly inhibited on the CuI-doped nylon mesh surface until 249 days. Scanning electron microscopy-energy dispersive X-ray analysis indicated that copper compounds were retained in the nylon mesh after the field experiment, although the copper content in the nylon mesh was reduced. Therefore, copper ions slowly dissolving from the nylon mesh will contribute to the long-term prevention of biofouling. Furthermore, electron spin resonance analysis revealed the generation of reactive oxygen species (ROS) from the CuI-doped nylon mesh after the field experiment. One possible mechanism for the toxic action of copper ions is the direct effect of Cu+-induced ROS on the biofilm forming on the nylon mesh surface. The proposed polymer textile can be applied to fishing and aquafarming nets, mooring ropes for ships, or silt fences to restrict polluted water in marine environments.

  4. Bluetooth Low Energy Mesh Networks: A Survey.

    Science.gov (United States)

    Darroudi, Seyed Mahdi; Gomez, Carles

    2017-06-22

    Bluetooth Low Energy (BLE) has gained significant momentum. However, the original design of BLE focused on star topology networking, which limits network coverage range and precludes end-to-end path diversity. In contrast, other competing technologies overcome such constraints by supporting the mesh network topology. For these reasons, academia, industry, and standards development organizations have been designing solutions to enable BLE mesh networks. Nevertheless, the literature lacks a consolidated view on this emerging area. This paper comprehensively surveys state of the art BLE mesh networking. We first provide a taxonomy of BLE mesh network solutions. We then review the solutions, describing the variety of approaches that leverage existing BLE functionality to enable BLE mesh networks. We identify crucial aspects of BLE mesh network solutions and discuss their advantages and drawbacks. Finally, we highlight currently open issues.

  5. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  6. Floating shock fitting via Lagrangian adaptive meshes

    Science.gov (United States)

    Vanrosendale, John

    1995-01-01

    In recent work we have formulated a new approach to compressible flow simulation, combining the advantages of shock-fitting and shock-capturing. Using a cell-centered Roe scheme discretization on unstructured meshes, we warp the mesh while marching to steady state, so that mesh edges align with shocks and other discontinuities. This new algorithm, the Shock-fitting Lagrangian Adaptive Method (SLAM), is, in effect, a reliable shock-capturing algorithm which yields shock-fitted accuracy at convergence.

  7. Performance of Implementation IBR-DTN and Batman-Adv Routing Protocol in Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Herman Yuliandoko

    2016-03-01

    Wireless mesh networking provides high mobility and flexibility. In wireless mesh networks, nodes are free to move and are able to automatically build connections with other nodes. High mobility, heterogeneous conditions and intermittent network connectivity cause data packets to be dropped during wireless communication, and this is a problem in wireless mesh networks. This condition can happen because wireless mesh networks use a connectionless networking type, such as the IP protocol, which is not tolerant of delay. Solving it requires a technology that keeps data packets when the network is disconnected. Delay-tolerant networking is a technology that provides a store-and-forward mechanism and can prevent data packets from being dropped during communication. In our research, we propose a test-bed wireless mesh network implementation using a proactive routing protocol combined with delay-tolerant technology. We used the Batman-adv routing protocol and IBR-DTN in our research. We measured particular networking performance aspects such as packet loss, delay, and throughput of the network. We identified that the delay-tolerant approach could keep data packets from being dropped better than the current wireless mesh network in intermittent network conditions. We also proved that IBR-DTN and Batman-adv can run together on the wireless mesh network. In the experiments, the throughput of IBR-DTN was higher than that of current TCP both in line-of-sight (LoS) conditions and in an environment with obstacles. Keywords: Delay Tolerant, IBR-DTN, Wireless Mesh, Batman-adv, Performance

  8. Parallel Performance Optimizations on Unstructured Mesh-based Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; Huck, Kevin; Hollingsworth, Jeffrey; Malony, Allen; Williams, Samuel; Oliker, Leonid

    2015-01-01

    © The Authors. Published by Elsevier B.V. This paper addresses two key parallelization challenges for the unstructured mesh-based ocean modeling code, MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh, and develops methods to generate mesh partitioning with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication reduction approaches. We present detailed performance data when running on thousands of cores using the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.

  9. Unstructured mesh adaptivity for urban flooding modelling

    Science.gov (United States)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process. For example, the high-resolution meshes around the buildings and steep regions are placed when the flooding water reaches these regions. In this work a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and then results have been compared with those published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  10. Meshes optimized for discrete exterior calculus (DEC).

    Energy Technology Data Exchange (ETDEWEB)

    Mousley, Sarah C. [Univ. of Illinois, Urbana-Champaign, IL (United States); Deakin, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Knupp, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    We study the optimization of an energy function used by the meshing community to measure and improve mesh quality. This energy is non-traditional because it is dependent on both the primal triangulation and its dual Voronoi (power) diagram. The energy is a measure of the mesh's quality for usage in Discrete Exterior Calculus (DEC), a method for numerically solving PDEs. In DEC, the PDE domain is triangulated and this mesh is used to obtain discrete approximations of the continuous operators in the PDE. The energy of a mesh gives an upper bound on the error of the discrete diagonal approximation of the Hodge star operator. In practice, one begins with an initial mesh and then makes adjustments to produce a mesh of lower energy. However, we have discovered several shortcomings in directly optimizing this energy, e.g. its non-convexity, and we show that the search for an optimized mesh may lead to mesh inversion (malformed triangles). We propose a new energy function to address some of these issues.

  11. Parallel adaptive simulations on unstructured meshes

    International Nuclear Information System (INIS)

    Shephard, M S; Jansen, K E; Sahni, O; Diachin, L A

    2007-01-01

    This paper discusses methods being developed by the ITAPS center to support the execution of parallel adaptive simulations on unstructured meshes. The paper first outlines the ITAPS approach to the development of interoperable mesh, geometry and field services to support the needs of SciDAC applications in these areas. The paper then demonstrates the ability of unstructured adaptive meshing methods built on such interoperable services to effectively solve important physics problems. Attention is then focused on ITAPS' developing ability to solve adaptive unstructured mesh problems on massively parallel computers.

  12. Application of numerical analysis techniques to eddy current testing for steam generator tubes

    International Nuclear Information System (INIS)

    Morimoto, Kazuo; Satake, Koji; Araki, Yasui; Morimura, Koichi; Tanaka, Michio; Shimizu, Naoya; Iwahashi, Yoichi

    1994-01-01

    This paper describes the application of numerical analysis to eddy current testing (ECT) for steam generator tubes. A symmetrical and three-dimensional sinusoidal steady state eddy current analysis code was developed. This code is formulated by finite element method-boundary element method coupling techniques, in order not to regenerate the mesh data in the tube domain at every movement of the probe. The calculations were carried out under various conditions including those for various probe types, defect orientations and so on. Compared with the experimental data, it was shown that it is feasible to apply this code to actual use. Furthermore, we have developed a total eddy current analysis system which consists of an ECT calculation code, an automatic mesh generator for analysis, a database and display software for calculated results. ((orig.))

  13. Biologic mesh versus synthetic mesh in open inguinal hernia repair: system review and meta-analysis.

    Science.gov (United States)

    Fang, Zhixue; Ren, Feng; Zhou, Jianping; Tian, Jiao

    2015-12-01

    Biologic meshes are mostly used for abdominal wall reinforcement in infected fields, but no consensus has been reached on their use for inguinal hernia repair. The purpose of this study was to compare biologic mesh with synthetic mesh in open inguinal herniorrhaphy. A systematic literature review and meta-analysis was undertaken to identify studies comparing the outcomes of biologic mesh and synthetic mesh in open inguinal hernia repair. Published studies were identified in the PubMed, EMBASE and Cochrane Library databases. A total of 382 patients in five randomized controlled trials were reviewed (179 patients in the biologic mesh group; 203 patients in the synthetic mesh group). The two groups did not significantly differ in chronic groin pain (P = 0.06) or recurrence (P = 0.38). The incidence of seroma trended higher in the biologic mesh group (P = 0.03). Operating time was significantly longer with biologic mesh (P = 0.03). There was no significant difference in hematomas (P = 0.23) between the two groups. From the data of this study, biologic mesh showed no superiority to synthetic mesh in open inguinal hernia repair, with similar recurrence rates and incidence of chronic groin pain but a higher rate of seroma and a longer operating time. However, this mesh still needs to be assessed in a large, multicentre, well-designed randomized controlled trial. © 2015 Royal Australasian College of Surgeons.

  14. Tensile Behaviour of Welded Wire Mesh and Hexagonal Metal Mesh for Ferrocement Application

    Science.gov (United States)

    Tanawade, A. G.; Modhera, C. D.

    2017-08-01

    Tension tests were conducted on welded wire mesh and hexagonal metal mesh. Welded mesh is available in the market in different sizes. Two types were analysed, viz. Ø 2.3 mm and Ø 2.7 mm welded mesh, having opening sizes of 31.75 mm × 31.75 mm and 25.4 mm × 25.4 mm respectively. Tensile strength tests were performed on samples of welded mesh in three different orientations, namely 0°, 30° and 45° with respect to the loading axis, and on hexagonal metal mesh of Ø 0.7 mm, having an opening of 19.05 mm × 19.05 mm. The objective of this study was to investigate the behaviour of the welded mesh and the hexagonal metal mesh. The results show that the tension load carrying capacity of the Ø 2.7 mm welded mesh in the 0° orientation is good compared with the Ø 2.3 mm mesh, and that the hexagonal metal mesh exhibits good ductility.

  15. Automatic NC-Data generation method for 5-axis cutting of turbine-blades by finding Safe heel-angles and adaptive path-intervals

    International Nuclear Information System (INIS)

    Piao, Cheng Dao; Lee, Cheol Soo; Cho, Kyu Zong; Park, Gwang Ryeol

    2004-01-01

    In this paper, an efficient method for generating 5-axis cutting data for a turbine blade is presented. The interference elimination of 5-axis cutting is currently very complicated and takes up a lot of time. The proposed method can generate an interference-free tool path within an allowance range. Generating the cutting data for the cutting process and then obtaining the NC data by calculating the feed rate allows the proper feed rate of the 5-axis machine to be maintained. This paper includes the algorithms for: (1) CL data generation by detecting an interference-free heel angle, (2) finding the optimal tool path interval considering the cusp height, (3) finding the adaptive feed rate values for each cutter path, and (4) the inverse kinematics, depending on the structure of the 5-axis machine, for generating the NC data.
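
    For item (2), the path interval that keeps the scallop (cusp) height below a limit follows, for a ball-end cutter on a locally flat surface, from h = R - sqrt(R^2 - (s/2)^2), i.e. s = 2*sqrt(h*(2R - h)). The sketch below applies this flat-surface approximation as an illustrative assumption; it is not the paper's adaptive interval computation.

    import math

    def path_interval_for_cusp(tool_radius, max_cusp):
        """Largest path interval s keeping the cusp height below max_cusp on a
        locally flat surface: h = R - sqrt(R^2 - (s/2)^2)  =>  s = 2*sqrt(h*(2R - h))."""
        if not 0.0 < max_cusp < tool_radius:
            raise ValueError("cusp height must lie between 0 and the tool radius")
        return 2.0 * math.sqrt(max_cusp * (2.0 * tool_radius - max_cusp))

    if __name__ == "__main__":
        R = 5.0    # ball-end cutter radius [mm] (assumed)
        h = 0.01   # allowed cusp height [mm] (assumed)
        print(f"path interval: {path_interval_for_cusp(R, h):.3f} mm")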

  16. Flexible CFD meshing strategy for prediction of ship resistance and propulsion performance

    Directory of Open Access Journals (Sweden)

    Jeong Hwa Seo

    2010-09-01

    Full Text Available In the present study, we conducted a resistance test, a propeller open water test and a self-propulsion test for a ship's resistance and propulsion performance, using computational fluid dynamics techniques, where a Reynolds-averaged Navier-Stokes equations solver was employed. For convenience of mesh generation, unstructured meshes were used in the bow and stern regions of the ship, where the hull shape is formed of delicate curved surfaces. On the other hand, structured meshes were generated for the middle part of the hull and the rest of the domain, i.e., the region of relatively simple geometry. To facilitate the rotating propeller for the propeller open water test and the self-propulsion test, a sliding mesh technique was adopted. Free-surface effects were included by employing the volume of fluid method for multi-phase flows. The computational results were validated by comparison with existing experimental data.

  17. Next Generation Hydro Software

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; Van Dam, A.; De Goede, E.; Icke, J.; Putten, H.

    2014-01-01

    This overview paper describes the motivation and main deliverables of the Next Generation Hydro Software (NGHS) project. Important technological innovations include the development of the new computational core Delft3D Flexible Mesh, as well as the open modelling environment Delta Shell.

  18. Adaptive mesh refinement in titanium

    Energy Technology Data Exchange (ETDEWEB)

    Colella, Phillip; Wen, Tong

    2005-01-21

    In this paper, we evaluate Titanium's usability as a high-level parallel programming language through a case study, where we implement a subset of Chombo's functionality in Titanium. Chombo is a software package applying the Adaptive Mesh Refinement methodology to numerical Partial Differential Equations at the production level. In Chombo, a library approach to parallel programming is used (C++ and Fortran, with MPI), whereas Titanium is a Java dialect designed for high-performance scientific computing. The performance of our implementation is studied and compared with that of Chombo in solving Poisson's equation based on two grid configurations from a real application. Also provided are the counts of lines of code from both sides.

  19. Converting skeletal structures to quad dominant meshes

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Misztal, Marek Krzysztof; Welnicka, Katarzyna

    2012-01-01

    We propose the Skeleton to Quad-dominant polygonal Mesh algorithm (SQM), which converts skeletal structures to meshes composed entirely of polar and annular regions. Both types of regions have a regular structure where all faces are quads except for a single ring of triangles at the center of eac...

  20. Parallel mesh management using interoperable tools.

    Energy Technology Data Exchange (ETDEWEB)

    Tautges, Timothy James (Argonne National Laboratory); Devine, Karen Dragon

    2010-10-01

    This presentation included a discussion of challenges arising in parallel mesh management, as well as demonstrated solutions. They also described the broad range of software for mesh management and modification developed by the Interoperable Technologies for Advanced Petascale Simulations (ITAPS) team, and highlighted applications successfully using the ITAPS tool suite.

  1. 7th International Meshing Roundtable '98

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, T.J.

    1998-10-01

    The goal of the 7th International Meshing Roundtable is to bring together researchers and developers from industry, academia, and government labs in a stimulating, open environment for the exchange of technical information related to the meshing process. In the past, the Roundtable has enjoyed significant participation from each of these groups from a wide variety of countries.

  2. Laparoscopic Pelvic Floor Repair Using Polypropylene Mesh

    Directory of Open Access Journals (Sweden)

    Shih-Shien Weng

    2008-09-01

    Conclusion: Laparoscopic pelvic floor repair using a single piece of polypropylene mesh combined with uterosacral ligament suspension appears to be a feasible procedure for the treatment of advanced vaginal vault prolapse and enterocele. Fewer mesh erosions and postoperative pain syndromes were seen in patients who had no previous pelvic floor reconstructive surgery.

  3. Wrinkling prediction with adaptive mesh refinement

    NARCIS (Netherlands)

    Selman, A.; Meinders, Vincent T.; van den Boogaard, Antonius H.; Huetink, Han

    2000-01-01

    An adaptive mesh refinement procedure for wrinkling prediction analyses is presented. First the critical values are determined using Hutchinson’s bifurcation functional. A wrinkling risk factor is then defined and used to determine areas of potential wrinkling risk. Finally, a mesh refinement is

  4. A Comparative Study of Navigation Meshes

    NARCIS (Netherlands)

    van Toll, W.G.; Triesscheijn, Roy; Kallmann, Marcelo; Oliva, Ramon; Pelechano, Nuria; Pettré, Julien; Geraerts, R.J.

    2016-01-01

    A navigation mesh is a representation of a 2D or 3D virtual environment that enables path planning and crowd simulation for walking characters. Various state-of-the-art navigation meshes exist, but there is no standardized way of evaluating or comparing them. Each implementation is in a different

  5. A fast and automatic full-potential finite volume solver on Cartesian grids for unconventional configurations

    Directory of Open Access Journals (Sweden)

    Fanxi LYU

    2017-06-01

    Full Text Available To meet the requirements of fast and automatic computation of subsonic and transonic aerodynamics in aircraft conceptual design, a novel finite volume solver for full potential flows on adaptive Cartesian grids is developed in this paper. Cartesian grids with geometric adaptation are first generated automatically, with boundary cells processed by cell-cutting and cell-merging algorithms. The nonlinear full potential equation is discretized by a finite volume scheme on these Cartesian grids and iteratively solved in an implicit fashion with a generalized minimum residual (GMRES) algorithm. During computation, solution-based mesh adaptation is also applied so as to capture flow features more accurately. An improved ghost-cell method is proposed to implement the non-penetration wall boundary condition, in which the velocity potential of a ghost cell is instead modified by an analytic method. According to the characteristics of the Cartesian grids, the Kutta condition is applied by specially computing the gradients on Kutta faces without directly assigning the potential jump to cells adjacent to wake faces, which can significantly improve the solution convergence speed. The feasibility and accuracy of the proposed method are validated by several typical cases of sub/transonic flows around an ONERA M6 wing, a DLR-F4 wing-body, and an unconventional configuration of a blended wing body (BWB). The validation cases demonstrate fast convergence with fully automatic grid treatment and computation, and the results suggest its capacity for application in aircraft conceptual design.

  6. ImageParser: a tool for finite element generation from three-dimensional medical images.

    Science.gov (United States)

    Yin, H M; Sun, L Z; Wang, G; Yamada, T; Wang, J; Vannier, M W

    2004-10-01

    The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information.
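
    ImageParser itself is not reproduced in the record; the following minimal Python sketch only illustrates the generic pipeline the abstract describes: threshold an ROI from a 3-D image, keep the largest connected component, and extract a triangulated boundary surface that a volume mesher could take as input. The threshold, voxel spacing and library choices are assumptions.

    import numpy as np
    from scipy import ndimage
    from skimage import measure

    def roi_surface(volume, threshold, spacing=(1.0, 1.0, 1.0)):
        """Threshold a 3-D image, keep the largest connected component as the ROI,
        and return a triangulated boundary surface (vertices, faces)."""
        mask = volume > threshold                     # crude intensity segmentation
        labels, n = ndimage.label(mask)               # connected components
        if n == 0:
            raise ValueError("no voxels above threshold")
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        roi = labels == (np.argmax(sizes) + 1)        # largest component only
        verts, faces, _, _ = measure.marching_cubes(roi.astype(np.float32),
                                                    level=0.5, spacing=spacing)
        return verts, faces

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        vol = ndimage.gaussian_filter(rng.random((40, 40, 40)), sigma=3)  # fake image
        v, f = roi_surface(vol, threshold=vol.mean())
        print(f"surface: {len(v)} vertices, {len(f)} triangles")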

  7. ImageParser: a tool for finite element generation from three-dimensional medical images

    Directory of Open Access Journals (Sweden)

    Yamada T

    2004-10-01

    Full Text Available Abstract Background The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information.

  8. SU-E-J-87: Lung Deformable Image Registration Using Surface Mesh Deformation for Dose Distribution Combination

    International Nuclear Information System (INIS)

    Labine, A; Carrier, J; Bedwani, S; Chav, R; DeGuise, J

    2014-01-01

    Purpose: To allow a reliable deformable image registration (DIR) method for dose calculation in radiation therapy. This work proposes a performance assessment of a morphological segmentation algorithm that generates a deformation field from lung surface displacements with 4DCT datasets. Methods: From the 4DCT scans of 15 selected patients, the deep exhale phase of the breathing cycle is identified as the reference scan. The Varian TPS Eclipse™ is used to draw lung contours, which are given as input to the morphological segmentation algorithm. Voxelized contours are smoothed by a Gaussian filter and then transformed into a surface mesh representation. This mesh is adapted by rigid and elastic deformations to match each subsequent lung volume. The segmentation efficiency is assessed by comparing the segmented lung contour and the TPS contour considering two volume metrics, defined as Volumetric Overlap Error (VOE) [%] and Relative Volume Difference (RVD) [%], and three surface metrics, defined as Average Symmetric Surface Distance (ASSD) [mm], Root Mean Square Symmetric Surface Distance (RMSSD) [mm] and Maximum Symmetric Surface Distance (MSSD) [mm]. Then, the surface deformation between two breathing phases is determined by the displacement of corresponding vertices in each deformed surface. The lung surface deformation is linearly propagated in the lung volume to generate 3D deformation fields for each breathing phase. Results: The metrics were averaged over the 15 patients and calculated with the same segmentation parameters. The volume metrics obtained are a VOE of 5.2% and a RVD of 2.6%. The surface metrics computed are an ASSD of 0.5 mm, a RMSSD of 0.8 mm and a MSSD of 6.9 mm. Conclusion: This study shows that the morphological segmentation algorithm can provide an automatic method to capture organ motion from 4DCT scans and translate it into a volume deformation grid needed by the DIR method for dose distribution combination.
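
    As a small illustration, the two volume metrics named above can be computed from binary masks using their standard definitions (assumed here); the sketch below is not part of the study's software.

    import numpy as np

    def volumetric_overlap_error(seg, ref):
        """VOE [%] = 100 * (1 - |A intersect B| / |A union B|)."""
        seg, ref = seg.astype(bool), ref.astype(bool)
        inter = np.logical_and(seg, ref).sum()
        union = np.logical_or(seg, ref).sum()
        return 100.0 * (1.0 - inter / union)

    def relative_volume_difference(seg, ref):
        """RVD [%] = 100 * (|A| - |B|) / |B|, with B the reference contour."""
        seg, ref = seg.astype(bool), ref.astype(bool)
        return 100.0 * (seg.sum() - ref.sum()) / ref.sum()

    if __name__ == "__main__":
        ref = np.zeros((50, 50, 50), dtype=bool)
        ref[10:40, 10:40, 10:40] = True               # reference (TPS) contour
        seg = np.zeros_like(ref)
        seg[12:40, 10:40, 10:40] = True               # segmented contour
        print(f"VOE = {volumetric_overlap_error(seg, ref):.1f} %")
        print(f"RVD = {relative_volume_difference(seg, ref):.1f} %")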

  9. Smart-Home Architecture Based on Bluetooth mesh Technology

    Science.gov (United States)

    Wan, Qing; Liu, Jianghua

    2018-03-01

    This paper describes a smart home network system based on the Nordic nRF52832 device. The nRF52832 is a new-generation RF SoC device focused on sensor monitoring and low-power Bluetooth connection applications. In this smart home system, we set up a self-organizing network that consists of one control node and many monitor nodes. The control node manages the whole network; the monitor nodes collect sensor information such as light intensity, temperature, humidity, PM2.5, etc., and then upload it to the control node over the Bluetooth mesh network. The design results show that the Bluetooth mesh wireless network system is flexible and its construction cost is low, which suits the communication characteristics of a smart home network. We believe it will be widely used in the future.

  10. Proceedings of the 20th International Meshing Roundtable

    CERN Document Server

    2012-01-01

    This volume contains the articles presented at the 20th International Meshing Roundtable (IMR) organized, in part, by Sandia National Laboratories and held in Paris, France on Oct 23-26, 2011. This is the first year the IMR was held outside the United States territory. Other sponsors of the 20th IMR are Systematic Paris Region Systems & ICT Cluster, AIAA, NAFEMS, CEA, and NSF. Sandia National Laboratories started the first IMR in 1992, and the conference has been held annually since. Each year the IMR brings together researchers, developers, and application experts, from a variety of disciplines, to present and discuss ideas on mesh generation and related topics. The topics covered by the IMR have applications in numerical analysis, computational geometry, computer graphics, as well as other areas, and the presentations describe novel work ranging from theory to application.

  11. Implementation of an Automatic System for the Monitoring of Start-up and Operating Regimes of the Cooling Water Installations of a Hydro Generator

    Directory of Open Access Journals (Sweden)

    Ioan Pădureanu

    2015-07-01

    Full Text Available The safe operation of a hydro generator depends on its thermal regime, the basic condition being that the temperature in the stator winding falls within the limits of the insulation class. As the losses in the copper depend on the square of the current in the stator winding, it is necessary that the cooling water flow rate be adapted to the values of these losses, so that the winding temperature falls within the range of values prescribed in the specifications. This paper presents an efficient solution for commanding and monitoring the water cooling installations of two high-power hydro generators.

  12. An Automatic Mosaicking Algorithm for the Generation of a Large-Scale Forest Height Map Using Spaceborne Repeat-Pass InSAR Correlation Magnitude

    Directory of Open Access Journals (Sweden)

    Yang Lei

    2015-05-01

    Full Text Available This paper describes an automatic mosaicking algorithm for creating large-scale mosaic maps of forest height. In contrast to existing mosaicking approaches through using SAR backscatter power and/or InSAR phase, this paper utilizes the forest height estimates that are inverted from spaceborne repeat-pass cross-pol InSAR correlation magnitude. By using repeat-pass InSAR correlation measurements that are dominated by temporal decorrelation, it has been shown that a simplified inversion approach can be utilized to create a height-sensitive measure over the whole interferometric scene, where two scene-wide fitting parameters are able to characterize the mean behavior of the random motion and dielectric changes of the volume scatterers within the scene. In order to combine these single-scene results into a mosaic, a matrix formulation is used with nonlinear least squares and observations in adjacent-scene overlap areas to create a self-consistent estimate of forest height over the larger region. This automated mosaicking method has the benefit of suppressing the global fitting error and, thus, mitigating the “wallpapering” problem in the manual mosaicking process. The algorithm is validated over the U.S. state of Maine by using InSAR correlation magnitude data from ALOS/PALSAR and comparing the inverted forest height with Laser Vegetation Imaging Sensor (LVIS) height and National Biomass and Carbon Dataset (NBCD) basal area weighted (BAW) height. This paper serves as a companion work to previously demonstrated results, the combination of which is meant to be an observational prototype for NASA’s DESDynI-R (now called NISAR) and JAXA’s ALOS-2 satellite missions.
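
    A much simplified, hypothetical sketch of the overlap-constrained adjustment idea is given below: one additive height offset per scene is estimated so that adjacent scenes agree in their overlap areas, with one scene anchored to remove the rank deficiency. The record describes a richer nonlinear fit of two parameters per scene; the data here are invented.

    import numpy as np

    def solve_scene_offsets(n_scenes, overlaps):
        """overlaps: list of (i, j, mean_height_i - mean_height_j) over overlap areas.
        Returns per-scene offsets c so that corrected heights (h_i - c_i) agree."""
        rows, rhs = [], []
        for i, j, diff in overlaps:
            row = np.zeros(n_scenes)
            row[i], row[j] = 1.0, -1.0        # c_i - c_j should equal the observed difference
            rows.append(row)
            rhs.append(diff)
        anchor = np.zeros(n_scenes)           # anchor scene 0 to remove the rank deficiency
        anchor[0] = 1.0
        rows.append(anchor)
        rhs.append(0.0)
        A, b = np.vstack(rows), np.asarray(rhs)
        offsets, *_ = np.linalg.lstsq(A, b, rcond=None)
        return offsets

    if __name__ == "__main__":
        # Three scenes with observed pairwise height differences [m] (invented data).
        obs = [(0, 1, 1.2), (1, 2, -0.7), (0, 2, 0.4)]
        print(np.round(solve_scene_offsets(3, obs), 3))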

  13. A Reinforcement Learning Model Equipped with Sensors for Generating Perception Patterns: Implementation of a Simulated Air Navigation System Using ADS-B (Automatic Dependent Surveillance-Broadcast) Technology

    Directory of Open Access Journals (Sweden)

    Santiago Álvarez de Toledo

    2017-01-01

    Full Text Available Over the last few decades, a number of reinforcement learning techniques have emerged, and different reinforcement learning-based applications have proliferated. However, such techniques tend to specialize in a particular field. This is an obstacle to their generalization and extrapolation to other areas. Besides, neither the reward-punishment (r-p) learning process nor the convergence of results is fast and efficient enough. To address these obstacles, this research proposes a general reinforcement learning model. This model is independent of input and output types and based on general bioinspired principles that help to speed up the learning process. The model is composed of a perception module based on sensors whose specific perceptions are mapped as perception patterns. In this manner, similar perceptions (even if perceived at different positions in the environment) are accounted for by the same perception pattern. Additionally, the model includes a procedure that statistically associates perception-action pattern pairs depending on the positive or negative results output by executing the respective action in response to a particular perception during the learning process. To do this, the model is fitted with a mechanism that reacts positively or negatively to particular sensory stimuli in order to rate results. The model is supplemented by an action module that can be configured depending on the maneuverability of each specific agent. The model has been applied in the air navigation domain, a field with strong safety restrictions, which led us to implement a simulated system equipped with the proposed model. Accordingly, the perception sensors were based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology, which is described in this paper. The results were quite satisfactory, and it outperformed traditional methods existing in the literature with respect to learning reliability and efficiency.
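
    A minimal, hypothetical sketch of the statistical perception-action association described above is given below: each perception pattern keeps positive and negative outcome counts per action, updated by the reward-punishment signal, and actions are chosen by their empirical success ratio. Pattern names, actions and rewards are invented for illustration and are not taken from the paper.

    import random
    from collections import defaultdict

    class PerceptionActionTable:
        def __init__(self, actions):
            self.actions = list(actions)
            # counts[pattern][action] = [positive outcomes, negative outcomes]
            self.counts = defaultdict(lambda: {a: [0, 0] for a in self.actions})

        def choose(self, pattern, explore=0.1):
            """Epsilon-greedy choice by empirical success ratio (Laplace-smoothed)."""
            if random.random() < explore:
                return random.choice(self.actions)
            stats = self.counts[pattern]
            return max(self.actions,
                       key=lambda a: (stats[a][0] + 1.0) / (sum(stats[a]) + 2.0))

        def update(self, pattern, action, positive):
            self.counts[pattern][action][0 if positive else 1] += 1

    if __name__ == "__main__":
        table = PerceptionActionTable(["climb", "descend", "hold"])
        for _ in range(200):
            pattern = "intruder_ahead"          # hypothetical perception pattern
            action = table.choose(pattern)
            table.update(pattern, action, positive=(action == "climb"))
        print(table.choose("intruder_ahead", explore=0.0))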

  14. A Reinforcement Learning Model Equipped with Sensors for Generating Perception Patterns: Implementation of a Simulated Air Navigation System Using ADS-B (Automatic Dependent Surveillance-Broadcast) Technology.

    Science.gov (United States)

    Álvarez de Toledo, Santiago; Anguera, Aurea; Barreiro, José M; Lara, Juan A; Lizcano, David

    2017-01-19

    Over the last few decades, a number of reinforcement learning techniques have emerged, and different reinforcement learning-based applications have proliferated. However, such techniques tend to specialize in a particular field. This is an obstacle to their generalization and extrapolation to other areas. Besides, neither the reward-punishment (r-p) learning process nor the convergence of results is fast and efficient enough. To address these obstacles, this research proposes a general reinforcement learning model. This model is independent of input and output types and based on general bioinspired principles that help to speed up the learning process. The model is composed of a perception module based on sensors whose specific perceptions are mapped as perception patterns. In this manner, similar perceptions (even if perceived at different positions in the environment) are accounted for by the same perception pattern. Additionally, the model includes a procedure that statistically associates perception-action pattern pairs depending on the positive or negative results output by executing the respective action in response to a particular perception during the learning process. To do this, the model is fitted with a mechanism that reacts positively or negatively to particular sensory stimuli in order to rate results. The model is supplemented by an action module that can be configured depending on the maneuverability of each specific agent. The model has been applied in the air navigation domain, a field with strong safety restrictions, which led us to implement a simulated system equipped with the proposed model. Accordingly, the perception sensors were based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology, which is described in this paper. The results were quite satisfactory, and it outperformed traditional methods existing in the literature with respect to learning reliability and efficiency.

  15. A randomized controlled experimental study comparing chitosan coated polypropylene mesh and Proceed™ mesh for abdominal wall defect closure

    Directory of Open Access Journals (Sweden)

    S.T. Jayanth

    2015-12-01

    Conclusion: Chitosan coated polypropylene mesh was found to have similar efficacy to Proceed™ mesh. Chitosan coated polypropylene mesh can act as an anti-adhesive barrier when used in the repair of incisional hernias and abdominal wall defects.

  16. [CLINICAL EVALUATION OF THE NEW ANTISEPTIC MESHES].

    Science.gov (United States)

    Gogoladze, M; Kiladze, M; Chkhikvadze, T; Jiqia, D

    2016-12-01

    Improving the results of hernia treatment and the prevention of complications became the goal of our research, which included two parts - experimental and clinical. Histomorphological and bacteriological studies showed that the best result among the three control groups was obtained when the implant was covered with "Coladerm" + chlorhexidine. Based on the experimental results, work continued in the clinic in order to test and introduce new "Coladerm" + chlorhexidine covered polypropylene meshes into practice. For clinical illustration, 60 patients were enrolled in the research who underwent hernioplasty with different meshes: group I - standard meshes + "Coladerm" + chlorhexidine, 35 patients; group II - standard meshes + "Coladerm", 15 patients; group III - standard meshes, 10 patients. Assessment of the wound and echo-control was done post-surgery on the 8th, 30th and 90th days. This clinical research, based on the experimental results, once again showed the superior anti-microbial properties of the new antiseptic polymeric biocomposite meshes (standard meshes + "Coladerm" + chlorhexidine), with timely completion of regeneration and reparation processes and without any post-surgery suppurative complications. We hope that the new antiseptic polymeric biocomposite meshes presented by us will be successfully used in the surgical practice of hernia treatment, based on and supported by experimental-clinical research.

  17. Fog water collection effectiveness: Mesh intercomparisons

    Science.gov (United States)

    Fernandez, Daniel; Torregrosa, Alicia; Weiss-Penzias, Peter; Zhang, Bong June; Sorensen, Deckard; Cohen, Robert; McKinley, Gareth; Kleingartner, Justin; Oliphant, Andrew; Bowman, Matthew

    2018-01-01

    To explore fog water harvesting potential in California, we conducted long-term measurements involving three types of mesh using standard fog collectors (SFC). Volumetric fog water measurements from SFCs and wind data were collected and recorded in 15-minute intervals over three summertime fog seasons (2014–2016) at four California sites. SFCs were deployed with: standard 1.00 m2 double-layer 35% shade coefficient Raschel; stainless steel mesh coated with the MIT-14 hydrophobic formulation; and FogHa-Tin, a German-manufactured, 3-dimensional spacer fabric deployed in two orientations. Analysis of 3419 volumetric samples from all sites showed strong relationships between mesh efficiency and wind speed. Raschel mesh collected 160% more fog water than FogHa-Tin at wind speeds less than 1 m s–1 and 45% less at wind speeds greater than 5 m s–1. MIT-14 coated stainless-steel mesh collected more fog water than Raschel mesh at all wind speeds; at wind speeds of 4–5 m s–1, it collected 41% more. FogHa-Tin collected 5% more fog water when the warp of the weave was oriented vertically, per manufacturer specification, than when the warp of the weave was oriented horizontally. Time series measurements of the three distinct meshes across similar wind regimes revealed inconsistent lags in fog water collection and inconsistent performance. Since such differences occurred under similar wind-speed regimes, we conclude that other factors play important roles in mesh performance, including in-situ fog event and aerosol dynamics that affect droplet-size spectra and droplet-to-mesh surface interactions.

  18. Mesh optimization for microbial fuel cell cathodes constructed around stainless steel mesh current collectors

    KAUST Repository

    Zhang, Fang

    2011-02-01

    Mesh current collectors made of stainless steel (SS) can be integrated into microbial fuel cell (MFC) cathodes constructed of a reactive carbon black and Pt catalyst mixture and a poly(dimethylsiloxane) (PDMS) diffusion layer. It is shown here that the mesh properties of these cathodes can significantly affect performance. Cathodes made from the coarsest mesh (30-mesh) achieved the highest maximum power of 1616 ± 25 mW m-2 (normalized to cathode projected surface area; 47.1 ± 0.7 W m-3 based on liquid volume), while the finest mesh (120-mesh) had the lowest power density (599 ± 57 mW m-2). Electrochemical impedance spectroscopy showed that charge transfer and diffusion resistances decreased with increasing mesh opening size. In MFC tests, the cathode performance was primarily limited by reaction kinetics, and not mass transfer. Oxygen permeability increased with mesh opening size, accounting for the decreased diffusion resistance. At higher current densities, diffusion became a limiting factor, especially for fine mesh with low oxygen transfer coefficients. These results demonstrate the critical nature of the mesh size used for constructing MFC cathodes. © 2010 Elsevier B.V. All rights reserved.

  19. Markov random fields on triangle meshes

    DEFF Research Database (Denmark)

    Andersen, Vedrana; Aanæs, Henrik; Bærentzen, Jakob Andreas

    2010-01-01

    In this paper we propose a novel anisotropic smoothing scheme based on Markov Random Fields (MRF). Our scheme is formulated as two coupled processes. A vertex process is used to smooth the mesh by displacing the vertices according to a MRF smoothness prior, while an independent edge process labels ... mesh edges according to a feature detecting prior. Since we should not smooth across a sharp feature, we use edge labels to control the vertex process. In a Bayesian framework, MRF priors are combined with the likelihood function related to the mesh formation method. The output of our algorithm...

  20. Engagement of Metal Debris into Gear Mesh

    Science.gov (United States)

    handschuh, Robert F.; Krantz, Timothy L.

    2010-01-01

    A series of bench-top experiments was conducted to determine the effects of metallic debris being dragged through meshing gear teeth. A test rig that is typically used to conduct contact fatigue experiments was used for these tests. Several sizes of drill material, shim stock and pieces of gear teeth were introduced and then driven through the meshing region. The level of torque required to drive the "chip" through the gear mesh was measured. From the data gathered, chip size sufficient to jam the mechanism can be determined.

  1. Osteoblast functions in functionally graded Ti-6Al-4 V mesh structures.

    Science.gov (United States)

    Nune, K C; Kumar, A; Misra, R D K; Li, S J; Hao, Y L; Yang, R

    2016-03-01

    We describe here the combined efforts of engineering and biological sciences as a systemic approach to fundamentally elucidate osteoblast functions in functionally graded Ti-6Al-4 V mesh structures in relation to uniform/monolithic mesh arrays. First, the interconnecting porous architecture of functionally graded mesh arrays was conducive to cellular functions including attachment, proliferation, and mineralization. The underlying reason is that the graded fabricated structure with cells seeded from the large pore size side provided a channel for efficient transfer of nutrients to other end of the structure (small pore size), leading to the generation of mineralized extracellular matrix by differentiating pre-osteoblasts. Second, a comparative and parametric study indicated that gradient mesh structure had a pronounced effect on cell adhesion and mineralization, and strongly influenced the proliferation phase. High intensity and near-uniform distribution of proteins (actin and vinculin) on struts of the gradient mesh structure (cells seeded from large pore side) implied signal transduction during cell adhesion and was responsible for superior cellular activity, in comparison to the uniform mesh structure and non-porous titanium alloy. Cells adhered to the mesh struts by forming a sheet, bridging the pores through numerous cytoplasmic extensions, in the case of porous mesh structures. Intercellular interaction in porous structures provided a pathway for cells to communicate and mature to a differentiated phenotype. Furthermore, the capability of cells to migrate through the interconnecting porous architecture on mesh structures led to colonization of the entire structure. Cells were embedded layer-by-layer in the extracellular matrix as the matrix mineralized. The outcomes of the study are expected to address challenges associated with the treatment of segmental bone defects and bone-remodeling through favorable modulation of cellular response. Moreover, the study

  2. Charged particle tracking through electrostatic wire meshes using the finite element method

    Energy Technology Data Exchange (ETDEWEB)

    Devlin, L. J.; Karamyshev, O.; Welsch, C. P., E-mail: carsten.welsch@cockcroft.ac.uk [The Cockcroft Institute, Daresbury Laboratory, Warrington (United Kingdom); Department of Physics, University of Liverpool, Liverpool (United Kingdom)

    2016-06-15

    Wire meshes are used across many disciplines to accelerate and focus charged particles; however, analytical solutions are non-exact and few codes exist which simulate the exact fields around a mesh of physical size. A tracking code, based in Matlab-Simulink and using field maps generated with finite element software, has been developed which tracks electrons or ions through electrostatic wire meshes. The fields around such a geometry are presented as an analytical expression using several basic assumptions; however, it is apparent that computational calculations are required to obtain realistic values of electric potential and fields, particularly when multiple wire meshes are deployed. The tracking code is flexible in that any quantitatively describable particle distribution can be used for both electrons and ions, and it offers other benefits such as ease of export to other programs for analysis. The code is made freely available and physical examples are highlighted where this code could be beneficial for different applications.
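
    The Matlab-Simulink code itself is not reproduced in the record; the following Python sketch only illustrates the general idea of tracking a charged particle through a gridded field map by interpolating the field and integrating the non-relativistic equation of motion with small explicit steps. The field map, geometry and step sizes are assumptions for illustration.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    Q_E, M_E = -1.602e-19, 9.109e-31            # electron charge [C] and mass [kg]

    def track(phi, x, y, r0, v0, dt=1e-12, steps=2000):
        """Step one electron through the potential map phi(y, x) with Euler steps."""
        ey, ex = np.gradient(-phi, y, x)         # E = -grad(phi) on the grid
        e_interp = RegularGridInterpolator((y, x), np.stack((ex, ey), axis=-1),
                                           bounds_error=False, fill_value=0.0)
        r, v = np.array(r0, float), np.array(v0, float)
        path = [r.copy()]
        for _ in range(steps):
            e_field = e_interp((r[1], r[0]))     # (Ex, Ey) at the particle position
            v += (Q_E / M_E) * e_field * dt      # non-relativistic acceleration
            r += v * dt
            path.append(r.copy())
        return np.array(path)

    if __name__ == "__main__":
        x = np.linspace(0.0, 0.01, 101)          # 1 cm x 1 cm domain (assumed)
        y = np.linspace(0.0, 0.01, 101)
        X, _ = np.meshgrid(x, y)
        phi = 1000.0 * X / x[-1]                 # uniform 1 kV gap as a stand-in field map
        traj = track(phi, x, y, r0=(0.001, 0.005), v0=(0.0, 0.0))
        print("final position [m]:", traj[-1])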

  3. Note: Radial-thrust combo metal mesh foil bearing for microturbomachinery.

    Science.gov (United States)

    Park, Cheol Hoon; Choi, Sang Kyu; Hong, Doo Euy; Yoon, Tae Gwang; Lee, Sung Hwi

    2013-10-01

    This Note proposes a novel radial-thrust combo metal mesh foil bearing (MMFB). Although MMFBs have advantages such as higher stiffness and damping over conventional air foil bearings, studies related to MMFBs have been limited to radial MMFBs. The novel combo MMFB is composed of a radial top foil, thrust top foils, and a ring-shaped metal mesh damper--fabricated by compressing a copper wire mesh--with metal mesh thrust pads for the thrust bearing at both side faces. In this study, the combo MMFB was fabricated in half-split type to support the rotor for a micro gas turbine generator. The manufacture and assembly process for the half-split-type combo MMFB is presented. In addition, to verify the proposed combo MMFB, motoring test results up to 250,000 rpm and axial displacements as a function of rotational speed are presented.

  4. Acquiring Plausible Predications from MEDLINE by Clustering MeSH Annotations.

    Science.gov (United States)

    Miñarro-Giménez, Jose Antonio; Kreuzthaler, Markus; Bernhardt-Melischnig, Johannes; Martínez-Costa, Catalina; Schulz, Stefan

    2015-01-01

    The massive accumulation of biomedical knowledge is reflected by the growth of the literature database MEDLINE with over 23 million bibliographic records. All records are manually indexed by MeSH descriptors, many of them refined by MeSH subheadings. We use subheading information to cluster types of MeSH descriptor co-occurrences in MEDLINE by processing co-occurrence information provided by the UMLS. The goal is to infer plausible predicates to each resulting cluster. In an initial experiment this was done by grouping disease-pharmacologic substance co-occurrences into six clusters. Then, a domain expert manually performed the assignment of meaningful predicates to the clusters. The mean accuracy of the best ten generated biomedical facts of each cluster was 85%. This result supports the evidence of the potential of MeSH subheadings for extracting plausible medical predications from MEDLINE.

  5. Stress adapted embroidered meshes with a graded pattern design for abdominal wall hernia repair

    Science.gov (United States)

    Hahn, J.; Bittrich, L.; Breier, A.; Spickenheuer, A.

    2017-10-01

    Abdominal wall hernias are one of the most relevant injuries of the digestive system, with 25 million patients in 2013. Surgery is recommended, primarily using allogenic non-absorbable warp-knitted meshes. These meshes have in common that their stress-strain behaviour is not adapted to the anisotropic behaviour of native abdominal wall tissue. The ideal mesh should possess an adequate mechanical behaviour and a suitable porosity at the same time. An alternative fabrication method to warp-knitting is embroidery technology, with a high flexibility in pattern design and adaption of mechanical properties. In this study, a pattern generator was created for pattern designs consisting of a base and a reinforcement pattern. The embroidered mesh structures demonstrated different structural and mechanical characteristics. Additionally, the investigation of the mechanical properties exhibited an anisotropic mechanical behaviour for the embroidered meshes. As a result, the investigated pattern generator and the embroidery technology allow the production of stress adapted mesh structures that are a promising approach for hernia reconstruction.

  6. Effects of Chitosan Coatings on Polypropylene Mesh for Implantation in a Rat Abdominal Wall Model

    Science.gov (United States)

    Udpa, Natasha; Iyer, Shama R.; Rajoria, Rohit; Breyer, Kate E.; Valentine, Helen; Singh, Bhupinder; McDonough, Sean P.; Brown, Bryan N.; Bonassar, Lawrence J.

    2013-01-01

    Hernia repair and pelvic floor reconstruction are usually accompanied by the implantation of a surgical mesh, which frequently results in a foreign body response with associated complications. An ideal surgical mesh that allows force generation of muscle tissues without significant granulation tissue and/or fibrosis is of significant clinical interest. The objective of the present study was to evaluate the in vitro and in vivo responses to a chitosan coating on polypropylene mesh (Ch-PPM) in comparison with commercially available meshes. We found that application of a 0.5% (w/v) Ch-PPM elicited preferential attachment of myoblasts over fibroblasts in vitro. Therefore, we tested the hypothesis that 0.5% Ch-PPM would encourage skeletal muscle tissue ingrowth and decrease fibrosis formation in vivo. We implanted 0.5% Ch-PPM, collagen-coated polypropylene mesh (Pelvitex™; C.R. Bard), and polypropylene (Avaulta Solo®; C.R. Bard) alone using a rat abdominal defect model. Force generation capacity and the inflammatory response of each mesh were evaluated 2, 4, and 12 weeks postimplantation. We found that the chitosan coating is associated with the restoration of functional skeletal muscle with histomorphologic characteristics that resemble native muscle and an early macrophage phenotypic response that has previously been shown to lead to more functional outcomes. PMID:23859182

  7. Solution of the neutron transport equation by the Method of Characteristics using a linear representation of the source within a mesh

    International Nuclear Information System (INIS)

    Mazumdar, Tanay; Degweker, S.B.

    2017-01-01

    Highlights: • In Method of Characteristics, the neutron source within a mesh is expanded up to linear term. • This expansion reduces the number of meshes as compared to flat source assumption. • Poor representation of circular geometry with coarser meshes is corrected. • Few benchmark problems are solved to show the advantages of linear expansion of source. • The advantage of the present formalism is quite visible in problems with large flux gradient. - Abstract: A common assumption in the solution of the neutron transport equation by the Method of Characteristics (MOC) is that the source (or flux) is constant within a mesh. This assumption is adequate provided the meshes are small enough so that the spatial variation of flux within a mesh may be ignored. Whether a mesh is small enough or not depends upon the flux gradient across a mesh, which in turn depends on factors like the presence of strong absorbers, localized sources or vacuum boundaries. The flat flux assumption often requires a very large number of meshes for solving the neutron transport equation with acceptable accuracy as was observed in our earlier work on the subject. A significant reduction in the required number of meshes is attainable by using a higher order representation of the flux within a mesh. In this paper, we expand the source within a mesh up to first order (linear) terms, which permits the use of larger sized (and therefore fewer) meshes and thereby reduces the computation time without compromising the accuracy of calculation. Since the division of the geometry into meshes is through an automatic triangulation procedure using the Bowyer-Watson algorithm, representation of circular objects (cylindrical fuel rods) with coarse meshes is poorer and causes geometry related errors. A numerical recipe is presented to make a correction to the automatic triangulation process and thereby eliminate this source of error. A number of benchmark problems are analyzed to emphasize the
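
    The geometry-discretisation issue mentioned in the highlights (poor representation of circular fuel rods by coarse triangulations) can be illustrated with a small sketch: triangulating boundary points of a circle underestimates its area until the boundary is refined. scipy's Qhull-based Delaunay routine stands in here for the Bowyer-Watson triangulation used by the code in the record; the point counts are arbitrary.

    import numpy as np
    from scipy.spatial import Delaunay

    def circle_area_deficit(radius, n_boundary):
        """Percentage of circle area lost when the circle is represented by a
        Delaunay triangulation of n_boundary boundary points plus the centre."""
        theta = np.linspace(0.0, 2.0 * np.pi, n_boundary, endpoint=False)
        pts = np.column_stack((radius * np.cos(theta), radius * np.sin(theta)))
        pts = np.vstack((pts, [[0.0, 0.0]]))
        tri = Delaunay(pts)
        area = 0.0
        for a, b, c in pts[tri.simplices]:        # sum triangle areas
            area += 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) -
                              (b[1] - a[1]) * (c[0] - a[0]))
        exact = np.pi * radius ** 2
        return 100.0 * (exact - area) / exact

    if __name__ == "__main__":
        for n in (8, 16, 64):
            print(f"{n:3d} boundary points: area deficit {circle_area_deficit(1.0, n):.2f} %")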

  8. LR: Compact connectivity representation for triangle meshes

    Energy Technology Data Exchange (ETDEWEB)

    Gurung, T; Luffel, M; Lindstrom, P; Rossignac, J

    2011-01-28

    We propose LR (Laced Ring) - a simple data structure for representing the connectivity of manifold triangle meshes. LR provides the option to store on average either 1.08 references per triangle or 26.2 bits per triangle. Its construction, from an input mesh that supports constant-time adjacency queries, has linear space and time complexity, and involves ordering most vertices along a nearly-Hamiltonian cycle. LR is best suited for applications that process meshes with fixed connectivity, as any changes to the connectivity require the data structure to be rebuilt. We provide an implementation of the set of standard random-access, constant-time operators for traversing a mesh, and show that LR often saves both space and traversal time over competing representations.

  9. Obtuse triangle suppression in anisotropic meshes

    KAUST Repository

    Sun, Feng

    2011-12-01

    Anisotropic triangle meshes are used for efficient approximation of surfaces and flow data in finite element analysis, and in these applications it is desirable to have as few obtuse triangles as possible to reduce the discretization error. We present a variational approach to suppressing obtuse triangles in anisotropic meshes. Specifically, we introduce a hexagonal Minkowski metric, which is sensitive to triangle orientation, to give a new formulation of the centroidal Voronoi tessellation (CVT) method. Furthermore, we prove several relevant properties of the CVT method with the newly introduced metric. Experiments show that our algorithm produces anisotropic meshes with much fewer obtuse triangles than using existing methods while maintaining mesh anisotropy. © 2011 Elsevier B.V. All rights reserved.

  10. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  11. Pectus excavatum repair using Prolene polypropylene mesh.

    Science.gov (United States)

    Rasihashemi, Seyed Ziaeddin; Ramouz, Ali

    2016-02-01

    We aimed to assess the clinical outcomes of our surgical technique for repair of pectus excavatum using Prolene polypropylene mesh. Among 29 patients with pectus excavatum, the major complaint was cosmetic dissatisfaction, and the main symptom was exercise dyspnea, present in 15 patients. The Haller index was used to assess pectus excavatum severity; it was significant in 22 patients. In all patients, a 2-layer sheet of Prolene polypropylene mesh was placed behind the sternum. No serious complication was observed postoperatively, and all patients were satisfied with the cosmetic result. Mitral valve prolapse improved in all cases after 3 months. Spirometry revealed improved pulmonary function after surgery. Given the advantages of Prolene polypropylene mesh, such as remaining permanently in place, adapting to various stresses encountered in the body, resisting degradation by tissue enzymes, and allowing trimming without unraveling, we concluded that this mesh is suitable for use as posterior sternal support in pectus excavatum patients. © The Author(s) 2016.

  12. Shape space exploration of constrained meshes

    KAUST Repository

    Yang, Yongliang

    2011-12-12

    We present a general computational framework to locally characterize any shape space of meshes implicitly prescribed by a collection of non-linear constraints. We computationally access such manifolds, typically of high dimension and co-dimension, through first and second order approximants, namely tangent spaces and quadratically parameterized osculant surfaces. Exploration and navigation of desirable subspaces of the shape space with regard to application specific quality measures are enabled using approximants that are intrinsic to the underlying manifold and directly computable in the parameter space of the osculant surface. We demonstrate our framework on shape spaces of planar quad (PQ) meshes, where each mesh face is constrained to be (nearly) planar, and circular meshes, where each face has a circumcircle. We evaluate our framework for navigation and design exploration on a variety of inputs, while keeping context specific properties such as fairness, proximity to a reference surface, etc. © 2011 ACM.

  13. Reduced order modelling techniques for mesh movement strategies as applied to fluid structure interactions

    CSIR Research Space (South Africa)

    Bogaers, Alfred EJ

    2010-01-01

    Full Text Available In this paper, we implement the method of Proper Orthogonal Decomposition (POD) to generate a reduced order model (ROM) of an optimization based mesh movement technique. In the study it is shown that POD can be used effectively to generate a ROM...
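
    The POD step behind such a ROM can be sketched generically: collect snapshots, subtract the mean, and take the leading left singular vectors as the reduced basis. The snapshot data below are synthetic stand-ins rather than the mesh-movement fields of the paper, and the mode count is an arbitrary choice.

    import numpy as np

    def pod_basis(snapshots, n_modes):
        """snapshots: (n_dof, n_snapshots) matrix; returns mean, basis and singular values."""
        mean = snapshots.mean(axis=1, keepdims=True)
        u, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
        return mean, u[:, :n_modes], s

    if __name__ == "__main__":
        x = np.linspace(0.0, 1.0, 200)
        params = np.linspace(0.1, 1.0, 30)
        # Synthetic snapshots standing in for mesh displacement fields.
        snaps = np.column_stack([p * np.sin(2 * np.pi * p * x) for p in params])
        mean, basis, s = pod_basis(snaps, n_modes=4)
        coeffs = basis.T @ (snaps - mean)         # project snapshots onto the basis
        recon = mean + basis @ coeffs             # reduced-order reconstruction
        err = np.linalg.norm(recon - snaps) / np.linalg.norm(snaps)
        print(f"relative reconstruction error with 4 modes: {err:.2e}")
        print("energy captured by 4 modes:", (s[:4] ** 2).sum() / (s ** 2).sum())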

  14. Evaluation of mesh morphing and mapping techniques in patient specific modeling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2013-01-01

    Robust generation of pelvic finite element models is necessary to understand the variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmarked-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis and their strain distributions evaluated. Morphing and mapping techniques were effectively applied to generate good quality geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Evaluation of mesh morphing and mapping techniques in patient specific modelling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2012-08-01

    Robust generation of pelvic finite element models is necessary to understand variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmarked-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity-based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis, and their strain distributions were evaluated. Morphing and mapping techniques were effectively applied to generate good quality and geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.
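
    The surface-morphing step that both records describe (shifting exterior source nodes toward the target surface while constraining the displacement along a normal) can be illustrated with a minimal geometric sketch; the node data, normals and nearest-vertex pairing below are assumptions, not the authors' pipeline.

    import numpy as np
    from scipy.spatial import cKDTree

    def morph_nodes(source_nodes, source_normals, target_vertices):
        """Move each source node toward its nearest target vertex, keeping only the
        displacement component along the node's outward normal."""
        tree = cKDTree(target_vertices)
        _, nearest = tree.query(source_nodes)            # nearest target vertex per node
        offsets = target_vertices[nearest] - source_nodes
        normals = source_normals / np.linalg.norm(source_normals, axis=1, keepdims=True)
        along = np.einsum("ij,ij->i", offsets, normals)[:, None] * normals
        return source_nodes + along

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        src = rng.normal(size=(100, 3))
        src /= np.linalg.norm(src, axis=1, keepdims=True)             # unit-sphere "source surface"
        tgt = rng.normal(size=(500, 3))
        tgt = 1.3 * tgt / np.linalg.norm(tgt, axis=1, keepdims=True)  # target sphere, radius 1.3
        moved = morph_nodes(src, src.copy(), tgt)        # on a sphere, normals = positions
        print("mean radius after morphing:", np.linalg.norm(moved, axis=1).mean())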

  16. Progressive compression of generic surface meshes

    OpenAIRE

    Caillaud , Florian; Vidal , Vincent; Dupont , Florent; Lavoué , Guillaume

    2015-01-01

    This paper presents a progressive compression method for generic surface meshes (non-manifold and/or polygonal). Two major contributions are proposed: (1) generic edge collapse and vertex split operators allowing surface simplification and refinement of a mesh, whatever its connectivity; (2) a distortion-aware collapse clustering strategy that adapts the decimation granularity in order to optimize the rate-distortion tradeoff.

  17. Towards Blockchain-enabled Wireless Mesh Networks

    OpenAIRE

    Selimi, Mennan; Kabbinale, Aniruddh Rao; Ali, Anwaar; Navarro, Leandro; Sathiaseelan, Arjuna

    2018-01-01

    Recently, mesh networking and blockchain are two of the hottest technologies in the telecommunications industry. Combining both can reformulate internet access and make connecting to the Internet not only easy, but affordable too. Hyperledger Fabric (HLF) is a blockchain framework implementation and one of the Hyperledger projects hosted by The Linux Foundation. We evaluate HLF in a real production mesh network and in the laboratory, quantify its performance, bottlenecks and limitations of th...

  18. Quadrilateral/hexahedral finite element mesh coarsening

    Science.gov (United States)

    Staten, Matthew L; Dewey, Mark W; Scott, Michael A; Benzley, Steven E

    2012-10-16

    A technique for coarsening a finite element mesh ("FEM") is described. This technique includes identifying a coarsening region within the FEM to be coarsened. Perimeter chords running along perimeter boundaries of the coarsening region are identified. The perimeter chords are redirected to create an adaptive chord separating the coarsening region from a remainder of the FEM. The adaptive chord runs through mesh elements residing along the perimeter boundaries of the coarsening region. The adaptive chord is then extracted to coarsen the FEM.

  19. Gear Mesh Loss-of-Lubrication Experiments and Analytical Simulation

    Science.gov (United States)

    Handschuh, Robert F.; Polly, Joseph; Morales, Wilfredo

    2011-01-01

    An experimental program to determine the loss-of-lubrication (LOL) characteristics of spur gears in an aerospace simulation test facility has been completed. Tests were conducted using two different emergency lubricant types: (1) an oil mist system (two different misted lubricants) and (2) a grease injection system (two different grease types). Tests were conducted using a NASA Glenn test facility normally used for conducting contact fatigue. Tests were run at rotational speeds up to 10000 rpm using two different gear designs and two different gear materials. For the tests conducted using an air-oil misting system, a minimum lubricant injection rate was determined to permit the gear mesh to operate without failure for at least 1 hr. The tests allowed an elevated steady state temperature to be established. A basic 2-D heat transfer simulation has been developed to investigate temperatures of a simulated gear as a function of frictional behavior. The friction (heat generation source) between the meshing surfaces is related to the position in the meshing cycle, the load applied, and the amount of lubricant in the contact. Experimental conditions will be compared to those from the 2-D simulation.

  20. [Using polypropylene mesh in surgery for stress urinary incontinence].

    Science.gov (United States)

    Murguía-Flores, Erick Arturo; Quintero-Granados, Fernando; Torres-Gómez, Luis Guillermo; Chávez-Navarro, Mariela Mariela; Vázquez-Gómez, Martha Berenice; Rodríguez-Rodríguez, Elizabeth

    2017-01-01

    Stress urinary incontinence (SUI) is defined as the involuntary leakage of urine on effort, such as coughing, sneezing or physical activity. Since SUI generates high costs and affects quality of life, it is important to make a proper diagnosis and, consequently, manage SUI efficiently. The objective was to determine whether it is appropriate to use polypropylene mesh for SUI. A historical cohort study was conducted by reviewing the records of patients with SUI treated with polypropylene mesh during 2013, with a follow-up of 12 months. Urinary continence was achieved in 98% of patients at one year. The complication rate was 2%. Only 12% of patients had normal weight. The most commonly used procedure was the placement of tension-free transobturator tape. The average cure rate reported worldwide is 90%, while the average complication rate is 10%. In this study we achieved similar results. Using polypropylene mesh for surgical correction of SUI is a safe and effective alternative; however, studies with larger populations and more extensive follow-up are required to clarify this.