WorldWideScience

Sample records for unstructured triangulated surface

  1. Looseness and Independence Number of Triangulations on Closed Surfaces

    Directory of Open Access Journals (Sweden)

    Nakamoto Atsuhiro

    2016-08-01

    Full Text Available The looseness of a triangulation G on a closed surface F², denoted by ξ(G), is defined as the minimum number k such that for any surjection c : V(G) → {1, 2, . . . , k + 3}, there is a face uvw of G with c(u), c(v) and c(w) all distinct. We shall bound ξ(G) for triangulations G on closed surfaces by the independence number of G, denoted by α(G). In particular, for a triangulation G on the sphere, we have ...
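
    Restated in display form (a paraphrase of the definition quoted above, not notation taken verbatim from the paper):

        \[
          \xi(G) \;=\; \min\Bigl\{\, k \;:\; \text{every surjection } c : V(G) \to \{1,\dots,k+3\}
          \text{ yields a face } uvw \text{ of } G \text{ with } c(u),\, c(v),\, c(w) \text{ pairwise distinct} \,\Bigr\}
        \]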

  2. Dynamical triangulated fermionic surfaces

    International Nuclear Information System (INIS)

    Ambjoern, J.; Varsted, S.

    1990-12-01

    We perform Monte Carlo simulations of randomly triangulated random surfaces which have fermionic world-sheet scalars θ_i associated with each vertex i, in addition to the usual bosonic world-sheet scalar χ_i^μ. The fermionic degrees of freedom force the internal metrics of the string to be less singular than the internal metric of the pure bosonic string. (orig.)

  3. Gaussian vector fields on triangulated surfaces

    DEFF Research Database (Denmark)

    Ipsen, John H

    2016-01-01

    ... proven to be very useful to resolve the complex interplay between in-plane ordering of membranes and membrane conformations. In the present work we have developed a procedure for realistic representations of Gaussian models with in-plane vector degrees of freedom on a triangulated surface. The method...

  4. Summations over equilaterally triangulated surfaces and the critical string measure

    International Nuclear Information System (INIS)

    Smit, D.J.; Lawrence Berkeley Lab., CA

    1992-01-01

    We propose a new approach to the summation over dynamically triangulated Riemann surfaces which does not rely on properties of the potential in a matrix model. Instead, we formulate a purely algebraic discretization of the critical string path integral. This is combined with a technique which assigns to each equilateral triangulation of a two-dimensional surface a Riemann surface defined over a certain finite extension of the field of rational numbers, i.e. an arithmetic surface. We thus establish a new formulation in which the sum over randomly triangulated surfaces defines an invariant measure on the moduli space of arithmetic surfaces. It is shown that, because of this, it is far from obvious that this measure at large genus approximates the measure defined by the continuum theory, i.e. Liouville theory or critical string theory. At low genus this subtlety does not exist. In the case of critical string theory we explicitly compute the volume of the moduli space of arithmetic surfaces in terms of the modular height function and show that at low genus it correctly approximates the continuum measure. We also discuss a continuum limit which bears some resemblance to a double scaling limit in matrix models. (orig.)

  5. Dynamically triangulated surfaces - some analytical results

    International Nuclear Information System (INIS)

    Kostov, I.K.

    1987-01-01

    We give a brief review of the analytical results concerning the model of dynamically triangulated surfaces. We discuss the possible types of critical behaviour (depending on the dimension D of the embedding space) and the exact solutions obtained for D=0 and D=-2. The latter are important as a check of the Monte Carlo simulations applied to study the model in more physical dimensions. They also give some general insight into its critical properties.

  6. A grand-canonical ensemble of randomly triangulated surfaces

    International Nuclear Information System (INIS)

    Jurkiewicz, J.; Krzywicki, A.; Petersson, B.

    1986-01-01

    An algorithm is presented generating the grand-canonical ensemble of discrete, randomly triangulated Polyakov surfaces. The algorithm is used to calculate the susceptibility exponent, which controls the existence of the continuum limit of the considered model, for the dimensionality of the embedding space ranging from 0 to 20. (orig.)

  7. The Ising model on the dynamically triangulated random surface

    International Nuclear Information System (INIS)

    Aleinov, I.D.; Migelal, A.A.; Zmushkow, U.V.

    1990-01-01

    The critical properties of the Ising model on a dynamically triangulated random surface embedded in D-dimensional Euclidean space are investigated. The strong coupling expansion method is used. The transition to the thermodynamic limit is performed by means of continued fractions.

  8. Reconstructing Surface Triangulations by Their Intersection Matrices

    Directory of Open Access Journals (Sweden)

    Arocha Jorge L.

    2015-08-01

    Full Text Available The intersection matrix of a simplicial complex has entries equal to the rank of the intersection of its facets. We prove that this matrix is enough to determine a triangulation of a surface up to isomorphism.

  9. Smooth Bézier surfaces over unstructured quadrilateral meshes

    CERN Document Server

    Bercovier, Michel

    2017-01-01

    Using an elegant mixture of geometry, graph theory and linear analysis, this monograph completely solves a problem lying at the interface of Isogeometric Analysis (IgA) and Finite Element Methods (FEM). The recent explosion of IgA, strongly tying Computer Aided Geometry Design to Analysis, does not easily apply to the rich variety of complex shapes that engineers have to design and analyse. Therefore new developments have studied the extension of IgA to unstructured unions of meshes, similar to those one can find in FEM. The following problem arises: given an unstructured planar quadrilateral mesh, construct a C1-surface, by piecewise Bézier or B-Spline patches defined over this mesh. This problem is solved for C1-surfaces defined over plane bilinear Bézier patches, the corresponding results for B-Splines then being simple consequences. The method can be extended to higher-order quadrilaterals and even to three dimensions, and the most recent developments in this direction are also mentioned here.

  10. Accurate measurement of surface areas of anatomical structures by computer-assisted triangulation of computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Allardice, J.T.; Jacomb-Hood, J.; Abulafi, A.M.; Williams, N.S. (Royal London Hospital (United Kingdom)); Cookson, J.; Dykes, E.; Holman, J. (London Hospital Medical College (United Kingdom))

    1993-05-01

    There is a need for accurate surface area measurement of internal anatomical structures in order to define light dosimetry in adjunctive intraoperative photodynamic therapy (AIOPDT). The authors investigated whether computer-assisted triangulation of serial sections generated by computed tomography (CT) scanning can give an accurate assessment of the surface area of the walls of the true pelvis after anterior resection and before colorectal anastomosis. They show that the technique of paper density tessellation is an acceptable method of measuring the surface areas of phantom objects, with a maximum error of 0.5%, and is used as the gold standard. Computer-assisted triangulation of CT images of standard geometric objects and accurately constructed pelvic phantoms gives a surface area assessment with a maximum error of 2.5% compared with the gold standard. The CT images of 20 patients' pelves have been analysed by computer-assisted triangulation and this shows the surface area of the walls varies from 143 cm² to 392 cm². (Author).
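
    For readers who want the underlying computation made concrete, here is a minimal sketch (not the authors' software; the array inputs are hypothetical) of how the area of a triangulated surface is accumulated facet by facet:

        import numpy as np

        def triangulated_surface_area(vertices, faces):
            """Sum the areas of all triangles of a surface mesh.

            vertices : (n, 3) array of 3D point coordinates
            faces    : (m, 3) array of vertex indices, one row per triangle
            """
            v = np.asarray(vertices, dtype=float)
            f = np.asarray(faces, dtype=int)
            e1 = v[f[:, 1]] - v[f[:, 0]]          # first edge vector of each triangle
            e2 = v[f[:, 2]] - v[f[:, 0]]          # second edge vector of each triangle
            # each triangle contributes half the norm of the cross product of its edges
            return 0.5 * np.linalg.norm(np.cross(e1, e2), axis=1).sum()

        # example: a unit square in the xy-plane split into two triangles -> area 1.0
        verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
        tris = [(0, 1, 2), (0, 2, 3)]
        print(triangulated_surface_area(verts, tris))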

  11. Degree-regular triangulations of torus and Klein bottle

    Indian Academy of Sciences (India)

    A triangulation of a connected closed surface is called degree-regular if each of its vertices has the same degree. ... In [5], Datta and Nilakantan have classified all the degree-regular triangulations of closed surfaces on at most 11 vertices.

  12. A General and Robust Ray-Casting-Based Algorithm for Triangulating Surfaces at the Nanoscale

    Science.gov (United States)

    Decherchi, Sergio; Rocchia, Walter

    2013-01-01

    We present a general, robust, and efficient ray-casting-based approach to triangulating complex manifold surfaces arising in the nano-bioscience field. This feature is inserted in a more extended framework that: i) builds the molecular surface of nanometric systems according to several existing definitions, ii) can import external meshes, iii) performs accurate surface area estimation, iv) performs volume estimation, cavity detection, and conditional volume filling, and v) can color the points of a grid according to their locations with respect to the given surface. We implemented our methods in the publicly available NanoShaper software suite (www.electrostaticszone.eu). Robustness is achieved using the CGAL library and an ad hoc ray-casting technique. Our approach can deal with any manifold surface (including nonmolecular ones). Those explicitly treated here are the Connolly-Richards (SES), the Skin, and the Gaussian surfaces. Test results indicate that it is robust to rotation, scale, and atom displacement. This last aspect is evidenced by cavity detection of the highly symmetric structure of fullerene, which fails when attempted by MSMS and has problems in EDTSurf. In terms of timings, NanoShaper builds the Skin surface three times faster than the single-threaded version in Lindow et al. on a 100,000-atom protein and triangulates it at least ten times more rapidly than the Kruithof algorithm. NanoShaper was integrated with the DelPhi Poisson-Boltzmann equation solver. Its SES grid coloring outperformed the DelPhi counterpart. To test the viability of our method on large systems, we chose one of the biggest molecular structures in the Protein Data Bank, namely the 1VSZ entry, which corresponds to the human adenovirus (180,000 atoms after hydrogen addition). We were able to triangulate the corresponding SES and Skin surfaces (6.2 and 7.0 million triangles, respectively, at a scale of 2 grids per Å) on a middle-range workstation. PMID:23577073
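
    As an illustration of the ray-casting idea behind grid "coloring" (classifying grid points as inside or outside a closed surface), the following is a minimal, unoptimized sketch; it is not NanoShaper code, and the function names are invented for the example. Degenerate cases (a ray grazing an edge or vertex) are ignored here:

        import numpy as np

        def ray_hits_triangle(origin, direction, a, b, c, eps=1e-12):
            """Moeller-Trumbore test: does the ray origin + t*direction (t > 0) cross triangle abc?"""
            e1, e2 = b - a, c - a
            p = np.cross(direction, e2)
            det = np.dot(e1, p)
            if abs(det) < eps:                 # ray parallel to the triangle plane
                return False
            inv = 1.0 / det
            s = origin - a
            u = np.dot(s, p) * inv
            if u < 0.0 or u > 1.0:
                return False
            q = np.cross(s, e1)
            v = np.dot(direction, q) * inv
            if v < 0.0 or u + v > 1.0:
                return False
            return np.dot(e2, q) * inv > eps   # hit must lie in front of the origin

        def point_is_inside(point, vertices, faces, direction=(1.0, 0.0, 0.0)):
            """Parity rule: an odd number of crossings means the point lies inside the closed surface."""
            point = np.asarray(point, float)
            d = np.asarray(direction, float)
            v = np.asarray(vertices, float)
            hits = sum(ray_hits_triangle(point, d, v[i], v[j], v[k]) for i, j, k in faces)
            return hits % 2 == 1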

  13. Ising model of a randomly triangulated random surface as a definition of fermionic string theory

    International Nuclear Information System (INIS)

    Bershadsky, M.A.; Migdal, A.A.

    1986-01-01

    Fermionic degrees of freedom are added to randomly triangulated planar random surfaces. It is shown that the Ising model on a fixed graph is equivalent to a certain Majorana fermion theory on the dual graph. (orig.)

  14. Software Module for Constructing the Intersection of Triangulated Surfaces

    Directory of Open Access Journals (Sweden)

    Vladimir V. Kurgansky

    2018-03-01

    Full Text Available An effective algorithm, together with its software implementation, is proposed for performing Boolean operations over triangulated surfaces, namely disjunction, conjunction and Boolean difference. The idea is as follows. The first step is to determine pairs of intersecting triangles: the intersection of the two surfaces is localized using bounding parallelepipeds, which are then tested against each other. The second step is to construct an intersection line for each pair of triangles: a pair of intersecting triangles is selected, and the segment along which they intersect is constructed. Then, thanks to the data structure introduced, "adjacent" triangles are selected, and among them those that form an intersecting pair. The process described above continues as long as such triangles can be detected. After that the triangles involved in the intersection are retriangulated. For each triangle, all the edges along which it intersects triangles from the other surface are known; these edges act as constraint edges in the constrained triangulation problem for the given triangle. The third step is to combine all surfaces into one surface: subsurfaces are constructed along the intersection loops that have been found. Since the intersection line of the surfaces was constructed in sequence, the direction of each edge can be specified. An edge from the intersection line is selected, and the triangle that contains this edge with the same orientation as the edge direction is added to the subsurface under construction. The selected edge is then deleted from the intersection line, while the two remaining edges of the added triangle are added to it. ...
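
    A minimal sketch of the first step described above, pruning candidate triangle pairs with axis-aligned bounding boxes before any exact intersection test (the array inputs are assumed; this is not the module's code):

        import numpy as np

        def triangle_aabbs(vertices, faces):
            """Per-triangle axis-aligned bounding boxes: (min corners, max corners)."""
            tri = np.asarray(vertices, float)[np.asarray(faces, int)]   # shape (m, 3, 3)
            return tri.min(axis=1), tri.max(axis=1)

        def candidate_pairs(verts_a, faces_a, verts_b, faces_b, tol=0.0):
            """Indices (i, j) of triangles from surfaces A and B whose boxes overlap.

            Only these pairs need to be passed to an exact triangle-triangle
            intersection routine when building the intersection line.
            """
            lo_a, hi_a = triangle_aabbs(verts_a, faces_a)
            lo_b, hi_b = triangle_aabbs(verts_b, faces_b)
            pairs = []
            for i in range(len(faces_a)):
                # boxes overlap iff their extents overlap on every coordinate axis
                overlap = np.all((lo_a[i] <= hi_b + tol) & (hi_a[i] >= lo_b - tol), axis=1)
                pairs.extend((i, int(j)) for j in np.nonzero(overlap)[0])
            return pairs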

  15. Generation of triangulated random surfaces by the Monte Carlo method in the grand canonical ensemble

    International Nuclear Information System (INIS)

    Zmushko, V.V.; Migdal, A.A.

    1987-01-01

    A model of triangulated random surfaces which is the discrete analog of the Polyakov string is considered. An algorithm is proposed which enables one to study the model by the Monte Carlo method in the grand canonical ensemble. Preliminary results on the determination of the critical index γ are presented

  16. A Novel Model of Conforming Delaunay Triangulation for Sensor Network Configuration

    Directory of Open Access Journals (Sweden)

    Yan Ma

    2015-01-01

    Full Text Available Delaunay refinement is a technique for generating unstructured triangle meshes for sensor network configuration in engineering practice. A new method for solving the Delaunay triangulation problem, called the endpoint triangle's circumcircle model (ETCM), is proposed in this paper. Compared with the original fractional node refinement algorithms, the proposed algorithm achieves good refinement stability at the lowest time cost. Simulations are performed for five aspects: refinement stability, the number of additional nodes, time cost, mesh quality after introducing additional nodes, and the aspect ratio improvement obtained from a single additional node. All experimental results show the advantages of the proposed algorithm over existing algorithms and sufficiently confirm the algorithm analysis.
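
    For reference, a minimal sketch of the circumcircle computation on which circumcircle-based Delaunay methods such as the ETCM described above rely; the function is illustrative only, not taken from the paper:

        def circumcircle(a, b, c):
            """Circumcenter and circumradius of a triangle with 2D vertices a, b, c."""
            ax, ay = a
            bx, by = b
            cx, cy = c
            d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
            if abs(d) < 1e-14:
                raise ValueError("degenerate (collinear) triangle")
            ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
                  + (cx**2 + cy**2) * (ay - by)) / d
            uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
                  + (cx**2 + cy**2) * (bx - ax)) / d
            r = ((ax - ux)**2 + (ay - uy)**2) ** 0.5
            return (ux, uy), r

        # a mesh satisfies the Delaunay property when no point lies strictly inside
        # the circumcircle of any of its triangles
        print(circumcircle((0.0, 0.0), (1.0, 0.0), (0.0, 1.0)))   # ((0.5, 0.5), 0.707...)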

  17. Multigrid and multilevel domain decomposition for unstructured grids

    Energy Technology Data Exchange (ETDEWEB)

    Chan, T.; Smith, B.

    1994-12-31

    Multigrid has proven itself to be a very versatile method for the iterative solution of linear and nonlinear systems of equations arising from the discretization of PDEs. In some applications, however, no natural multilevel structure of grids is available, and these must be generated as part of the solution procedure. In this presentation the authors will consider the problem of generating a multigrid algorithm when only a fine, unstructured grid is given. Their techniques generate a sequence of coarser grids by first forming an approximate maximal independent set of the vertices and then applying a Cavendish-type algorithm to form the coarser triangulation. Numerical tests indicate that convergence using this approach can be as fast as standard multigrid on a structured mesh, at least in two dimensions.
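
    A minimal sketch of the coarsening idea mentioned above: greedily pick an approximate maximal independent set of fine-grid vertices (the graph input is assumed; the Cavendish-type retriangulation of the selected vertices is not shown):

        def maximal_independent_set(num_vertices, edges):
            """Greedy maximal independent set of a graph given as an edge list.

            No two selected vertices are adjacent, and no further vertex can be added;
            the selected vertices can serve as the next coarser grid.
            """
            neighbors = {v: set() for v in range(num_vertices)}
            for u, v in edges:
                neighbors[u].add(v)
                neighbors[v].add(u)
            selected, blocked = [], set()
            for v in range(num_vertices):      # any vertex ordering yields a maximal set
                if v not in blocked:
                    selected.append(v)         # keep v as a coarse-grid vertex
                    blocked.add(v)
                    blocked |= neighbors[v]    # exclude its fine-grid neighbors
            return selected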

  18. Generation of triangulated random surfaces by means of the Monte Carlo method in the grand canonical ensemble

    International Nuclear Information System (INIS)

    Zmushko, V.V.; Migdal, A.A.

    1987-01-01

    A model of triangulated random surfaces which is the discrete analogue of the Polyakov string is considered in the work. An algorithm is proposed which enables one to study the model by means of the Monte Carlo method in the grand canonical ensemble. Preliminary results are presented on the evaluation of the critical index γ

  19. The use of triangulation in qualitative research.

    Science.gov (United States)

    Carter, Nancy; Bryant-Lukosius, Denise; DiCenso, Alba; Blythe, Jennifer; Neville, Alan J

    2014-09-01

    Triangulation refers to the use of multiple methods or data sources in qualitative research to develop a comprehensive understanding of phenomena (Patton, 1999). Triangulation also has been viewed as a qualitative research strategy to test validity through the convergence of information from different sources. Denzin (1978) and Patton (1999) identified four types of triangulation: (a) method triangulation, (b) investigator triangulation, (c) theory triangulation, and (d) data source triangulation. The current article will present the four types of triangulation followed by a discussion of the use of focus groups (FGs) and in-depth individual (IDI) interviews as an example of data source triangulation in qualitative inquiry.

  20. A REST Service for Triangulation of Point Sets Using Oriented Matroids

    Directory of Open Access Journals (Sweden)

    José Antonio Valero Medina

    2014-05-01

    Full Text Available This paper describes the implementation of a prototype REST service for triangulation of point sets collected by mobile GPS receivers. The first objective of this paper is to test functionalities of an application, which exploits mobile devices’ capabilities to get data associated with their spatial location. A triangulation of a set of points provides a mechanism through which it is possible to produce an accurate representation of spatial data. Such triangulation may be used for representing surfaces by Triangulated Irregular Networks (TINs, and for decomposing complex two-dimensional spatial objects into simpler geometries. The second objective of this paper is to promote the use of oriented matroids for finding alternative solutions to spatial data processing and analysis tasks. This study focused on the particular case of the calculation of triangulations based on oriented matroids. The prototype described in this paper used a wrapper to integrate and expose several tools previously implemented in C++.
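
    As a usage illustration only (the service itself wraps previously implemented C++ tools and oriented-matroid code, not the library below), a planar Delaunay triangulation of surveyed points can be obtained with SciPy; the coordinates here are made up:

        import numpy as np
        from scipy.spatial import Delaunay

        # hypothetical projected GPS fixes (x, y) with elevations z
        xy = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0], [5.0, 4.0]])
        z = np.array([102.3, 101.7, 103.1, 102.9, 104.0])

        tin = Delaunay(xy)               # 2D Delaunay triangulation of the point set
        for tri in tin.simplices:        # each row holds the vertex indices of one triangle
            print("triangle", tri, "mean elevation", z[tri].mean())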

  1. Triangulated categories (AM-148)

    CERN Document Server

    Neeman, Amnon

    2014-01-01

    The first two chapters of this book offer a modern, self-contained exposition of the elementary theory of triangulated categories and their quotients. The simple, elegant presentation of these known results makes these chapters eminently suitable as a text for graduate students. The remainder of the book is devoted to new research, providing, among other material, some remarkable improvements on Brown's classical representability theorem. In addition, the author introduces a class of triangulated categories, the "well generated triangulated categories", and studies their properties. ...

  2. Summing Feynman graphs by Monte Carlo: Planar φ³-theory and dynamically triangulated random surfaces

    International Nuclear Information System (INIS)

    Boulatov, D.V.

    1988-01-01

    New combinatorial identities are suggested relating the ratio of the (n-1)th and nth orders of (planar) perturbation expansion for any quantity to some average over the ensemble of all planar graphs of the nth order. These identities are used for a Monte Carlo calculation of the critical exponent γ_str (string susceptibility) in planar φ³-theory and in the dynamically triangulated random surface (DTRS) model near the convergence circle for various dimensions. In the solvable case D=1 the exact critical properties of the theory are reproduced numerically. (orig.)

  3. Non-degenerated Ground States and Low-degenerated Excited States in the Antiferromagnetic Ising Model on Triangulations

    Science.gov (United States)

    Jiménez, Andrea

    2014-02-01

    We study the unexpected asymptotic behavior of the degeneracy of the first few energy levels in the antiferromagnetic Ising model on triangulations of closed Riemann surfaces. There are strong mathematical and physical reasons to expect that the number of ground states (i.e., degeneracy) of the antiferromagnetic Ising model on the triangulations of a fixed closed Riemann surface is exponential in the number of vertices. In the set of plane triangulations, the degeneracy equals the number of perfect matchings of the geometric duals, and thus it is exponential by a recent result of Chudnovsky and Seymour. From the physics point of view, antiferromagnetic triangulations are geometrically frustrated systems, and in such systems exponential degeneracy is predicted. We present results that contradict these predictions. We prove that for each closed Riemann surface S of positive genus, there are sequences of triangulations of S with exactly one ground state. One possible explanation of this phenomenon is that exponential degeneracy would be found in the excited states with energy close to the ground state energy. However, as our second result, we show the existence of a sequence of triangulations of a closed Riemann surface of genus 10 with exactly one ground state such that the degeneracy of each of the 1st, 2nd, 3rd and 4th excited energy levels belongs to O(n), O(n²), O(n³) and O(n⁴), respectively.

  4. Finite Volume Method for Unstructured Grid

    International Nuclear Information System (INIS)

    Casmara; Kardana, N.D.

    1997-01-01

    The success of a computational method depends on the solution algorithm and mesh generation techniques. Cell distributions are needed which allow the solution to be calculated over the entire body surface with sufficient accuracy. To handle mesh generation for multi-connected regions such as multi-element bodies, the unstructured finite volume method is applied. The advantages of unstructured meshes are that they provide a great deal more flexibility for generating meshes about complex geometries and provide a natural setting for the use of adaptive meshing. The governing equations to be discretized are the inviscid and rotational Euler equations. Applications of the method are evaluated on flow around single and multi-component bodies.

  5. MHD simulations on an unstructured mesh

    International Nuclear Information System (INIS)

    Strauss, H.R.; Park, W.

    1996-01-01

    We describe work on a full MHD code using an unstructured mesh. MH3D++ is an extension of the PPPL MH3D resistive full MHD code. MH3D++ replaces the structured mesh and finite difference / Fourier discretization of MH3D with an unstructured mesh and finite element / Fourier discretization. Low level routines which perform differential operations, solution of PDEs such as Poisson's equation, and graphics, are encapsulated in C++ objects to isolate the finite element operations from the higher level code. The high level code is the same, whether it is run in the structured or unstructured mesh version. This allows the unstructured mesh version to be benchmarked against the structured mesh version. As a preliminary example, disruptions in DIII-D reverse shear equilibria are studied numerically with the MH3D++ code. Numerical equilibria were first produced starting with an EQDSK file containing equilibrium data of a DIII-D L-mode negative central shear discharge. Using these equilibria, the linearized equations are time advanced to get the toroidal mode number n = 1 linear growth rate and eigenmode, which is resistively unstable. The equilibrium and linear mode are used to initialize 3D nonlinear runs. An example shows poloidal slices of 3D pressure surfaces: initially, on the left, and at an intermediate time, on the right

  6. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum ...

  7. Simulating triangulations. Graphs, manifolds and (quantum) spacetime

    International Nuclear Information System (INIS)

    Krueger, Benedikt

    2016-01-01

    Triangulations, which can intuitively be described as a tessellation of space into simplicial building blocks, are structures that arise in various different branches of physics: They can be used for describing complicated and curved objects in a discretized way, e.g., in foams, gels or porous media, or for discretizing curved boundaries for fluid simulations or dissipative systems. Interpreting triangulations as (maximal planar) graphs makes it possible to use them in graph theory or statistical physics, e.g., as small-world networks, as networks of spins or in biological physics as actin networks. Since one can find an analogue of the Einstein-Hilbert action on triangulations, they can even be used for formulating theories of quantum gravity. Triangulations have also important applications in mathematics, especially in discrete topology. Despite their wide occurrence in different branches of physics and mathematics, there are still some fundamental open questions about triangulations in general. It is a priori unknown how many triangulations there are for a given set of points or a given manifold, or even whether there are exponentially many triangulations or more, a question that relates to a well-defined behavior of certain quantum geometry models. Another major unknown question is whether elementary steps transforming triangulations into each other, which are used in computer simulations, are ergodic. Using triangulations as a model for spacetime, it is not clear whether there is a meaningful continuum limit that can be identified with the usual and well-tested theory of general relativity. Within this thesis some of these fundamental questions about triangulations are answered by the use of Markov chain Monte Carlo simulations, which are a probabilistic method for calculating statistical expectation values, or more generally a tool for calculating high-dimensional integrals. Additionally, some details about the Wang-Landau algorithm, which is the primarily used ...

  8. Simulating triangulations. Graphs, manifolds and (quantum) spacetime

    Energy Technology Data Exchange (ETDEWEB)

    Krueger, Benedikt

    2016-07-01

    Triangulations, which can intuitively be described as a tessellation of space into simplicial building blocks, are structures that arise in various different branches of physics: They can be used for describing complicated and curved objects in a discretized way, e.g., in foams, gels or porous media, or for discretizing curved boundaries for fluid simulations or dissipative systems. Interpreting triangulations as (maximal planar) graphs makes it possible to use them in graph theory or statistical physics, e.g., as small-world networks, as networks of spins or in biological physics as actin networks. Since one can find an analogue of the Einstein-Hilbert action on triangulations, they can even be used for formulating theories of quantum gravity. Triangulations have also important applications in mathematics, especially in discrete topology. Despite their wide occurrence in different branches of physics and mathematics, there are still some fundamental open questions about triangulations in general. It is a priori unknown how many triangulations there are for a given set of points or a given manifold, or even whether there are exponentially many triangulations or more, a question that relates to a well-defined behavior of certain quantum geometry models. Another major unknown question is whether elementary steps transforming triangulations into each other, which are used in computer simulations, are ergodic. Using triangulations as a model for spacetime, it is not clear whether there is a meaningful continuum limit that can be identified with the usual and well-tested theory of general relativity. Within this thesis some of these fundamental questions about triangulations are answered by the use of Markov chain Monte Carlo simulations, which are a probabilistic method for calculating statistical expectation values, or more generally a tool for calculating high-dimensional integrals. Additionally, some details about the Wang-Landau algorithm, which is the primarily used ...

  9. Triangulation positioning system network

    Directory of Open Access Journals (Sweden)

    Sfendourakis Marios

    2017-01-01

    Full Text Available This paper presents ongoing work on localization and positioning through a triangulation procedure for a Fixed Sensors Network (FSN). The FSN has to work as a system. As the triangulation problem becomes highly complicated in cases with large numbers of sensors and transmitters, an adequate grid topology is needed in order to tackle the detection complexity. For that reason a network grid topology is presented, and areas that are problematic and need further analysis are discussed. In order to deal with problems of saturation and False Triangulations (FTRNs), the network system will have to find adequate methods in every sub-area of the Area Of Interest (AOI). Concepts like sensor blindness and overall network blindness are also presented. All these concepts affect the network detection rate and its performance, and ought to be considered in a way that the overall network performance won't be degraded. Network performance should be monitored continuously, with the right algorithms and methods. It is also shown that as the number of TRNs and FTRNs increases, the Detection Complexity (DC) increases. It is hoped that with further research all the characteristics of a triangulation system network for positioning will be obtained and the system will be able to perform autonomously with a high detection rate.

  10. Triangulation-based 3D surveying borescope

    Science.gov (United States)

    Pulwer, S.; Steglich, P.; Villringer, C.; Bauer, J.; Burger, M.; Franz, M.; Grieshober, K.; Wirth, F.; Blondeau, J.; Rautenberg, J.; Mouti, S.; Schrader, S.

    2016-04-01

    In this work, a measurement concept based on triangulation was developed for borescopic 3D surveying of surface defects. The integration of such a measurement system into a borescope environment requires excellent space utilization. The triangulation angle, the projected pattern, the numerical apertures of the optical system, and the viewing angle were calculated using partial coherence imaging and geometric optical raytracing methods. Additionally, optical aberrations and defocus were considered by the integration of Zernike polynomial coefficients. The measurement system is able to measure objects with a size of 50 μm in all dimensions with an accuracy of ±5 μm. To manage the issue of a low depth of field while using a high-resolution optical system, a wavelength-dependent aperture was integrated. Thereby, we are able to control the depth of field and resolution of the optical system and can use the borescope in measurement mode with high resolution and low depth of field or in inspection mode with low resolution and higher depth of field. First measurements of a demonstrator system are in good agreement with our simulations.
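
    For orientation only (a textbook relation, not a result of the paper): in the simplest triangulation geometry, the distance z to the surface follows from the baseline b between projector and camera, the focal length f, and the lateral displacement d of the imaged spot,

        \[
          z \;\approx\; \frac{f\, b}{d},
          \qquad\text{so the depth uncertainty scales as}\qquad
          \delta z \;\approx\; \frac{z^{2}}{f\, b}\;\delta d ,
        \]

    which is why a larger effective baseline improves depth resolution, at the cost of occlusion and of the packaging constraints inside a borescope.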

  11. Robotic tool positioning process using a multi-line off-axis laser triangulation sensor

    Science.gov (United States)

    Pinto, T. C.; Matos, G.

    2018-03-01

    Proper positioning of a friction stir welding head for pin insertion, driven by a closed chain robot, is important to ensure quality repair of cracks. A multi-line off-axis laser triangulation sensor was designed to be integrated to the robot, allowing relative measurements of the surface to be repaired. This work describes the sensor characteristics, its evaluation and the measurement process for tool positioning to a surface point of interest. The developed process uses a point of interest image and a measured point cloud to define the translation and rotation for tool positioning. Sensor evaluation and tests are described. Keywords: laser triangulation, 3D measurement, tool positioning, robotics.

  12. Optimization-based Fluid Simulation on Unstructured Meshes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Bridson, Robert; Erleben, Kenny

    2010-01-01

    ... for solving the fluid dynamics equations as well as direct access to the interface geometry data, making inclusion of a new surface energy term feasible. Furthermore, using an unstructured mesh makes it straightforward to handle curved solid boundaries and gives us a possibility to explore several fluid...

  13. Mixed Methods, Triangulation, and Causal Explanation

    Science.gov (United States)

    Howe, Kenneth R.

    2012-01-01

    This article distinguishes a disjunctive conception of mixed methods/triangulation, which brings different methods to bear on different questions, from a conjunctive conception, which brings different methods to bear on the same question. It then examines a more inclusive, holistic conception of mixed methods/triangulation that accommodates…

  14. Hamiltonian Cycles on Random Eulerian Triangulations

    DEFF Research Database (Denmark)

    Guitter, E.; Kristjansen, C.; Nielsen, Jakob Langgaard

    1998-01-01

    ... Considering the case n → 0, this implies that the system of random Eulerian triangulations equipped with Hamiltonian cycles describes a c=-1 matter field coupled to 2D quantum gravity, as opposed to the system of usual random triangulations equipped with Hamiltonian cycles, which has c=-2. Hence, in this case...

  15. Natively unstructured loops differ from other loops.

    Directory of Open Access Journals (Sweden)

    Avner Schlessinger

    2007-07-01

    Full Text Available Natively unstructured or disordered protein regions may increase the functional complexity of an organism; they are particularly abundant in eukaryotes and often evade structure determination. Many computational methods predict unstructured regions by training on outliers in otherwise well-ordered structures. Here, we introduce an approach that uses a neural network in a very different and novel way. We hypothesize that very long contiguous segments with nonregular secondary structure (NORS regions differ significantly from regular, well-structured loops, and that a method detecting such features could predict natively unstructured regions. Training our new method, NORSnet, on predicted information rather than on experimental data yielded three major advantages: it removed the overlap between testing and training, it systematically covered entire proteomes, and it explicitly focused on one particular aspect of unstructured regions with a simple structural interpretation, namely that they are loops. Our hypothesis was correct: well-structured and unstructured loops differ so substantially that NORSnet succeeded in their distinction. Benchmarks on previously used and new experimental data of unstructured regions revealed that NORSnet performed very well. Although it was not the best single prediction method, NORSnet was sufficiently accurate to flag unstructured regions in proteins that were previously not annotated. In one application, NORSnet revealed previously undetected unstructured regions in putative targets for structural genomics and may thereby contribute to increasing structural coverage of large eukaryotic families. NORSnet found unstructured regions more often in domain boundaries than expected at random. In another application, we estimated that 50%-70% of all worm proteins observed to have more than seven protein-protein interaction partners have unstructured regions. The comparative analysis between NORSnet and DISOPRED2 suggested

  16. Triangulation in rewriting

    NARCIS (Netherlands)

    Oostrom, V. van; Zantema, Hans

    2012-01-01

    We introduce a process, dubbed triangulation, turning any rewrite relation into a confluent one. It is more direct than usual completion, in the sense that objects connected by a peak are directly oriented rather than their normal forms. We investigate conditions under which this process preserves

  17. Numerical experiments on unstructured PIC stability.

    Energy Technology Data Exchange (ETDEWEB)

    Day, David Minot

    2011-04-01

    Particle-In-Cell (PIC) is a method for plasma simulation. Particles are pushed with Verlet time integration. Fields are modeled using finite differences on a tensor product mesh (cells). The unstructured PIC methods studied here use instead finite element discretizations on unstructured (simplicial) meshes. PIC is constrained by stability limits (upper bounds) on mesh and time step sizes. Numerical evidence (2D) and analysis will be presented showing that similar bounds constrain unstructured PIC.

  18. A TQFT of Turaev-Viro type on shaped triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Kashaev, Rinat [Geneva Univ. (Switzerland); Luo, Feng [Rutgers Univ., Piscataway, NJ (United States). Dept. of Mathematics; Vartanov, Grigory [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-10-15

    A shaped triangulation is a finite triangulation of an oriented pseudo three manifold where each tetrahedron carries dihedral angles of an ideal hyperbolic tetrahedron. To each shaped triangulation, we associate a quantum partition function in the form of an absolutely convergent state integral which is invariant under shaped 3-2 Pachner moves and invariant with respect to shape gauge transformations generated by total dihedral angles around internal edges through the Neumann-Zagier Poisson bracket. Similarly to Turaev-Viro theory, the state variables live on edges of the triangulation but take their values on the whole real axis. The tetrahedral weight functions are composed of three hyperbolic gamma functions in a way that they enjoy a manifest tetrahedral symmetry. We conjecture that for shaped triangulations of closed 3-manifolds, our partition function is twice the absolute value squared of the partition function of the Teichmueller TQFT defined by Andersen and Kashaev. This is similar to the known relationship between the Turaev-Viro and the Witten-Reshetikhin-Turaev invariants of three manifolds. We also discuss interpretations of our construction in terms of three-dimensional supersymmetric field theories related to triangulated three-dimensional manifolds.

  19. A TQFT of Turaev-Viro type on shaped triangulations

    International Nuclear Information System (INIS)

    Kashaev, Rinat; Luo, Feng

    2012-10-01

    A shaped triangulation is a finite triangulation of an oriented pseudo three manifold where each tetrahedron carries dihedral angles of an ideal hyperbolic tetrahedron. To each shaped triangulation, we associate a quantum partition function in the form of an absolutely convergent state integral which is invariant under shaped 3-2 Pachner moves and invariant with respect to shape gauge transformations generated by total dihedral angles around internal edges through the Neumann-Zagier Poisson bracket. Similarly to Turaev-Viro theory, the state variables live on edges of the triangulation but take their values on the whole real axis. The tetrahedral weight functions are composed of three hyperbolic gamma functions in a way that they enjoy a manifest tetrahedral symmetry. We conjecture that for shaped triangulations of closed 3-manifolds, our partition function is twice the absolute value squared of the partition function of the Teichmueller TQFT defined by Andersen and Kashaev. This is similar to the known relationship between the Turaev-Viro and the Witten-Reshetikhin-Turaev invariants of three manifolds. We also discuss interpretations of our construction in terms of three-dimensional supersymmetric field theories related to triangulated three-dimensional manifolds.

  20. Balanced Central Schemes for the Shallow Water Equations on Unstructured Grids

    Science.gov (United States)

    Bryson, Steve; Levy, Doron

    2004-01-01

    We present a two-dimensional, well-balanced, central-upwind scheme for approximating solutions of the shallow water equations in the presence of a stationary bottom topography on triangular meshes. Our starting point is the recent central scheme of Kurganov and Petrova (KP) for approximating solutions of conservation laws on triangular meshes. In order to extend this scheme from systems of conservation laws to systems of balance laws one has to find an appropriate discretization of the source terms. We first show that for general triangulations there is no discretization of the source terms that corresponds to a well-balanced form of the KP scheme. We then derive a new variant of a central scheme that can be balanced on triangular meshes. We note in passing that it is straightforward to extend the KP scheme to general unstructured conformal meshes. This extension allows us to recover our previous well-balanced scheme on Cartesian grids. We conclude with several simulations, verifying the second-order accuracy of our scheme as well as its well-balanced properties.

  1. Aerial Triangulation Close-range Images with Dual Quaternion

    Directory of Open Access Journals (Sweden)

    SHENG Qinghong

    2015-05-01

    Full Text Available A new method for the aerial triangulation of close-range images based on dual quaternions is presented. Dual quaternions are used to represent the spiral screw motion of the beams in space: the real part of the dual quaternion represents the angular elements of all the beams in the close-range network, while the real and dual parts together represent the linear elements. Finally, an aerial triangulation adjustment model based on dual quaternions is established, and the elements of interior and exterior orientation and the object coordinates of the ground points are calculated. Real images and simulated images with large attitude angles are selected for the aerial triangulation experiments. The experimental results show that the new method for the aerial triangulation of close-range images based on dual quaternions can obtain higher accuracy.

  2. Efficient 3D geometric and Zernike moments computation from unstructured surface meshes.

    Science.gov (United States)

    Pozo, José María; Villa-Uriol, Maria-Cruz; Frangi, Alejandro F

    2011-03-01

    This paper introduces and evaluates a fast exact algorithm and a series of faster approximate algorithms for the computation of 3D geometric moments from an unstructured surface mesh of triangles. Being based on the object surface reduces the computational complexity of these algorithms with respect to volumetric grid-based algorithms. In contrast, it can only be applied for the computation of geometric moments of homogeneous objects. This advantage and restriction is shared with other proposed algorithms based on the object boundary. The proposed exact algorithm reduces the computational complexity for computing geometric moments up to order N with respect to previously proposed exact algorithms, from N^9 to N^6. The approximate series algorithm appears as a power series in the ratio between triangle size and object size, which can be truncated at any desired degree. The higher the number and quality of the triangles, the better the approximation. This approximate algorithm reduces the computational complexity to N^3. In addition, the paper introduces a fast algorithm for the computation of 3D Zernike moments from the computed geometric moments, with a computational complexity of N^4, while the previously proposed algorithm is of order N^6. The error introduced by the proposed approximate algorithms is evaluated on different shapes, and the cost-benefit ratio in terms of error and computational time is analyzed for different moment orders.
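
    To make the surface-based idea concrete, here is a minimal sketch (far simpler than the paper's general moment algorithm) of the lowest-order geometric moments, volume and centroid, of a closed, consistently oriented triangle mesh, accumulated from signed tetrahedra spanned by each facet and the origin:

        import numpy as np

        def volume_and_centroid(vertices, faces):
            """Volume and centroid of a closed, outward-oriented triangle mesh.

            Each face (a, b, c) contributes the signed volume of the tetrahedron
            (origin, a, b, c); the centroid of that tetrahedron is (a + b + c) / 4.
            """
            v = np.asarray(vertices, float)
            total_vol = 0.0
            weighted_centroid = np.zeros(3)
            for i, j, k in faces:
                a, b, c = v[i], v[j], v[k]
                vol = np.dot(a, np.cross(b, c)) / 6.0         # signed tetrahedron volume
                total_vol += vol
                weighted_centroid += vol * (a + b + c) / 4.0  # tetra centroid times its volume
            return total_vol, weighted_centroid / total_vol

    Higher-order geometric moments (and from them the Zernike moments) follow the same pattern, with closed-form polynomial integrals over each tetrahedron replacing the volume and centroid terms.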

  3. On-Line Metrology with Conoscopic Holography: Beyond Triangulation

    Directory of Open Access Journals (Sweden)

    Ignacio Álvarez

    2009-09-01

    Full Text Available On-line non-contact surface inspection with high precision is still an open problem. Laser triangulation techniques are the most common solution for this kind of systems, but there exist fundamental limitations to their applicability when high precisions, long standoffs or large apertures are needed, and when there are difficult operating conditions. Other methods are, in general, not applicable in hostile environments or inadequate for on-line measurement. In this paper we review the latest research in Conoscopic Holography, an interferometric technique that has been applied successfully in this kind of applications, ranging from submicrometric roughness measurements, to long standoff sensors for surface defect detection in steel at high temperatures.

  4. Strongly minimal triangulations of (S × S )#3 and (S S

    Indian Academy of Sciences (India)

    We show that there are exactly 12 such triangulations up to isomorphism, 10 of which are orientable. Keywords: stacked sphere; tight neighbourly triangulation; minimal triangulation.

  5. Measuring and Controlling Fairness of Triangulations

    KAUST Repository

    Jiang, Caigui

    2016-09-30

    The fairness of meshes that represent geometric shapes is a topic that has been studied extensively and thoroughly. However, the focus in such considerations often is not on the mesh itself, but rather on the smooth surface approximated by it, and fairness essentially expresses a mesh’s suitability for purposes such as visualization or simulation. This paper focusses on meshes in the architectural context, where vertices, edges, and faces of meshes are often highly visible, and any notion of fairness must take new aspects into account. We use concepts from discrete differential geometry (star-shaped Gauss images) to express fairness, and we also demonstrate how fairness can be incorporated into interactive geometric design of triangulated freeform skins.

  6. Efficient triangulation of Poisson-disk sampled point sets

    KAUST Repository

    Guo, Jianwei

    2014-05-06

    In this paper, we present a simple yet efficient algorithm for triangulating a 2D input domain containing a Poisson-disk sampled point set. The proposed algorithm combines a regular grid and a discrete clustering approach to speedup the triangulation. Moreover, our triangulation algorithm is flexible and performs well on more general point sets such as adaptive, non-maximal Poisson-disk sets. The experimental results demonstrate that our algorithm is robust for a wide range of input domains and achieves significant performance improvement compared to the current state-of-the-art approaches. © 2014 Springer-Verlag Berlin Heidelberg.

  7. Observation, innovation and triangulation

    DEFF Research Database (Denmark)

    Hetmar, Vibeke

    2007-01-01

    ... on experiences from a pilot project in three different classrooms, methodological possibilities and problems are presented and discussed: 1) educational criticism, including the concepts of positions, perspectives and connoisseurship, 2) classroom observations and 3) triangulation as a methodological tool.

  8. Parallel adaptive simulations on unstructured meshes

    International Nuclear Information System (INIS)

    Shephard, M S; Jansen, K E; Sahni, O; Diachin, L A

    2007-01-01

    This paper discusses methods being developed by the ITAPS center to support the execution of parallel adaptive simulations on unstructured meshes. The paper first outlines the ITAPS approach to the development of interoperable mesh, geometry and field services to support the needs of SciDAC applications in these areas. The paper then demonstrates the ability of unstructured adaptive meshing methods built on such interoperable services to effectively solve important physics problems. Attention is then focused on ITAPS' developing ability to solve adaptive unstructured mesh problems on massively parallel computers.

  9. I/O-Efficient Construction of Constrained Delaunay Triangulations

    DEFF Research Database (Denmark)

    Agarwal, Pankaj Kumar; Arge, Lars; Yi, Ke

    2005-01-01

    In this paper, we designed and implemented an I/O-efficient algorithm for constructing constrained Delaunay triangulations. If the number of constraining segments is smaller than the memory size, our algorithm runs in expected O((N/B) log_{M/B}(N/B)) I/Os for triangulating N points in the plane, where...

  10. Quantitative evaluation for small surface damage based on iterative difference and triangulation of 3D point cloud

    Science.gov (United States)

    Zhang, Yuyan; Guo, Quanli; Wang, Zhenchun; Yang, Degong

    2018-03-01

    This paper proposes a non-contact, non-destructive evaluation method for the surface damage of high-speed sliding electrical contact rails. The proposed method establishes a model of damage identification and calculation. A laser scanning system is built to obtain the 3D point cloud data of the rail surface. In order to extract the damage region of the rail surface, the 3D point cloud data are processed using iterative difference, nearest-neighbour search and a data registration algorithm. The curvature of the point cloud data in the damage region is mapped to RGB color information, which can directly reflect the trend of the curvature of the point cloud data in the damage region. The extracted damage region is divided into triangular prism elements by a method of triangulation. The volume and mass of a single element are calculated by the method of geometric segmentation. Finally, the total volume and mass of the damage region are obtained by the principle of superposition. The proposed method is applied to several typical injuries and the results are discussed. The experimental results show that the algorithm can identify damage shapes and calculate damage mass with milligram precision, which is useful for evaluating the damage in a further research stage.
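
    A minimal sketch (not the authors' code; the inputs are assumed) of the superposition step: each triangle of the damaged region, together with the vertex depths relative to the undamaged reference surface, defines a prism element whose volume is the triangle's projected area times the mean vertex depth:

        import numpy as np

        def damage_volume(xy, depths, faces, density=None):
            """Total volume (and optionally mass) of a triangulated damage region.

            xy      : (n, 2) planar coordinates of the measured points
            depths  : (n,) depth of each point below the reference surface
            faces   : (m, 3) vertex indices of the triangulation
            density : material density; if given, the mass is returned as well
            """
            xy = np.asarray(xy, float)
            h = np.asarray(depths, float)
            total = 0.0
            for i, j, k in faces:
                a, b, c = xy[i], xy[j], xy[k]
                # projected triangle area via the 2D cross product
                area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1]))
                total += area * (h[i] + h[j] + h[k]) / 3.0    # volume of one prism element
            if density is not None:
                return total, total * density                  # mass = volume * density
            return total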

  11. Solving the Einstein constraint equations on multi-block triangulations using finite element methods

    Energy Technology Data Exchange (ETDEWEB)

    Korobkin, Oleg; Pazos, Enrique [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, LA 70803 (United States); Aksoylu, Burak [Center for Computation and Technology, Louisiana State University, Baton Rouge, LA 70803 (United States); Holst, Michael [Department of Mathematics, University of California at San Diego 9500 Gilman Drive La Jolla, CA 92093-0112 (United States); Tiglio, Manuel [Department of Physics, University of Maryland, College Park, MD 20742 (United States)

    2009-07-21

    In order to generate initial data for nonlinear relativistic simulations, one needs to solve the Einstein constraints, which can be cast into a coupled set of nonlinear elliptic equations. Here we present an approach for solving these equations on three-dimensional multi-block domains using finite element methods. We illustrate our approach on a simple example of Brill wave initial data, with the constraints reducing to a single linear elliptic equation for the conformal factor psi. We use quadratic Lagrange elements on semi-structured simplicial meshes, obtained by triangulation of multi-block grids. In the case of uniform refinement the scheme is superconvergent at most mesh vertices, due to local symmetry of the finite element basis with respect to local spatial inversions. We show that in the superconvergent case subsequent unstructured mesh refinements do not improve the quality of our initial data. As proof of concept that this approach is feasible for generating multi-block initial data in three dimensions, after constructing the initial data we evolve them in time using a high-order finite-differencing multi-block approach and extract the gravitational waves from the numerical solution.

  12. Solving the Einstein constraint equations on multi-block triangulations using finite element methods

    International Nuclear Information System (INIS)

    Korobkin, Oleg; Pazos, Enrique; Aksoylu, Burak; Holst, Michael; Tiglio, Manuel

    2009-01-01

    In order to generate initial data for nonlinear relativistic simulations, one needs to solve the Einstein constraints, which can be cast into a coupled set of nonlinear elliptic equations. Here we present an approach for solving these equations on three-dimensional multi-block domains using finite element methods. We illustrate our approach on a simple example of Brill wave initial data, with the constraints reducing to a single linear elliptic equation for the conformal factor ψ. We use quadratic Lagrange elements on semi-structured simplicial meshes, obtained by triangulation of multi-block grids. In the case of uniform refinement the scheme is superconvergent at most mesh vertices, due to local symmetry of the finite element basis with respect to local spatial inversions. We show that in the superconvergent case subsequent unstructured mesh refinements do not improve the quality of our initial data. As proof of concept that this approach is feasible for generating multi-block initial data in three dimensions, after constructing the initial data we evolve them in time using a high-order finite-differencing multi-block approach and extract the gravitational waves from the numerical solution.

  13. HIRENASD coarse unstructured

    Data.gov (United States)

    National Aeronautics and Space Administration — Unstructured HIRENASD mesh: - coarse size (5.7 million nodes, 14.4 million elements) - for node centered solvers - 01.06.2011 - caution: dimensions in mm

  14. Label triangulation

    International Nuclear Information System (INIS)

    May, R.P.

    1983-01-01

    Label Triangulation (LT) with neutrons allows the investigation of the quaternary structure of biological multicomponent complexes under native conditions. Provided that the complex can be fully separated into and reconstituted from its single - protonated and deuterated - components, small angle neutron scattering (SANS) can give selective information on shapes and pair distances of these components. Following basic geometrical rules, the spatial arrangement of the components can be reconstructed from these data. LT has so far been successfully applied to the small and large ribosomal subunits and the transcriptase of E. coli. (author)

  15. Onomatopoeia characters extraction from comic images using constrained Delaunay triangulation

    Science.gov (United States)

    Liu, Xiangping; Shoji, Kenji; Mori, Hiroshi; Toyama, Fubito

    2014-02-01

    A method for extracting onomatopoeia characters from comic images was developed based on the stroke width feature of characters, since such characters have a nearly constant stroke width in many cases. An image was segmented with a constrained Delaunay triangulation. Connected component grouping was performed based on the triangles generated by the constrained Delaunay triangulation. Stroke width calculation of the connected components was conducted based on the altitude of the triangles generated with the constrained Delaunay triangulation. The experimental results proved the effectiveness of the proposed method.

  16. Domain Discretization and Circle Packings

    DEFF Research Database (Denmark)

    Dias, Kealey

    A circle packing is a configuration of circles which are tangent with one another in a prescribed pattern determined by a combinatorial triangulation, where the configuration fills a planar domain or a two-dimensional surface. The vertices in the triangulation correspond to centers of circles...... to domain discretization problems such as triangulation and unstructured mesh generation techniques. We wish to ask ourselves the question: given a cloud of points in the plane (we restrict ourselves to planar domains), is it possible to construct a circle packing preserving the positions of the vertices...... and constrained meshes having predefined vertices as constraints. A standard method of two-dimensional mesh generation involves conformal mapping of the surface or domain to standardized shapes, such as a disk. Since circle packing is a new technique for constructing discrete conformal mappings, it is possible...

  17. Stereo-tomography in triangulated models

    Science.gov (United States)

    Yang, Kai; Shao, Wei-Dong; Xing, Feng-yuan; Xiong, Kai

    2018-04-01

    Stereo-tomography is a distinctive tomographic method. It is capable of estimating the scatterer position, the local dip of the scatterer and the background velocity simultaneously. Building a geologically consistent velocity model is always appealing for applied and earthquake seismologists. Differing from previous work, which incorporated various regularization techniques into the cost function of stereo-tomography, we think extending stereo-tomography to a triangulated model is the most straightforward way to achieve this goal. In this paper, we provide all the Fréchet derivatives of the stereo-tomographic data components with respect to the model components for the slowness-squared triangulated model (or sloth model) in 2D Cartesian coordinates, based on the ray perturbation theory for interfaces. A sloth model representation means a sparser model representation when compared with the conventional B-spline model representation. A sparser model representation leads to a smaller stereo-tomographic (Fréchet) matrix, a higher-accuracy solution when solving linear equations, a faster convergence rate and a lower requirement on the amount of data. Moreover, a quantitative representation of interfaces strengthens the relationships among different model components, which makes cross regularizations among these model components, such as node coordinates, scatterer coordinates and scattering angles, more straightforward and easier to implement. The sensitivity analysis, the model resolution matrix analysis and a series of synthetic data examples demonstrate the correctness of the Fréchet derivatives, the applicability of the regularization terms and the robustness of stereo-tomography in the triangulated model. This provides a solid theoretical foundation for real applications in the future.

  18. Random surfaces and strings

    International Nuclear Information System (INIS)

    Ambjoern, J.

    1987-08-01

    The theory of strings is the theory of random surfaces. I review the present attempts to regularize the world sheet of the string by triangulation. The corresponding statistical theory of triangulated random surfaces has a surprising rich structure, but the connection to conventional string theory seems non-trivial. (orig.)

  19. MHD simulations on an unstructured mesh

    International Nuclear Information System (INIS)

    Strauss, H.R.; Park, W.; Belova, E.; Fu, G.Y.; Sugiyama, L.E.

    1998-01-01

    Two reasons for using an unstructured computational mesh are adaptivity, and alignment with arbitrarily shaped boundaries. Two codes which use finite element discretization on an unstructured mesh are described. FEM3D solves 2D and 3D RMHD using an adaptive grid. MH3D++, which incorporates methods of FEM3D into the MH3D generalized MHD code, can be used with shaped boundaries, which might be 3D

  20. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2014-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards. We show that k-guardable polygons generalize two previously identified classes of realistic input. Following this, we give two simple algorithms for triangulating

  1. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    Science.gov (United States)

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  2. Mobile-robot navigation with complete coverage of unstructured environments

    OpenAIRE

    García Armada, Elena; González de Santos, Pablo

    2004-01-01

    There are some mobile-robot applications that require complete coverage of an unstructured environment. Examples are humanitarian de-mining and floor-cleaning tasks. A complete-coverage algorithm is then used, a path-planning technique that allows the robot to pass over all points in the environment while avoiding unknown obstacles. Different coverage algorithms exist, but they fail in unstructured environments. This paper details a complete-coverage algorithm for unstructured environm...

  3. Triangulating and guarding realistic polygons

    NARCIS (Netherlands)

    Aloupis, G.; Bose, P.; Dujmovic, V.; Gray, C.M.; Langerman, S.; Speckmann, B.

    2008-01-01

    We propose a new model of realistic input: k-guardable objects. An object is k-guardable if its boundary can be seen by k guards in the interior of the object. In this abstract, we describe a simple algorithm for triangulating k-guardable polygons. Our algorithm, which is easily implementable, takes

  4. Measuring and Controlling Fairness of Triangulations

    KAUST Repository

    Jiang, Caigui; Gü nther, Felix; Wallner, Johannes; Pottmann, Helmut

    2016-01-01

    of fairness must take new aspects into account. We use concepts from discrete differential geometry (star-shaped Gauss images) to express fairness, and we also demonstrate how fairness can be incorporated into interactive geometric design of triangulated

  5. Path integral measure and triangulation independence in discrete gravity

    Science.gov (United States)

    Dittrich, Bianca; Steinhaus, Sebastian

    2012-02-01

    A path integral measure for gravity should also preserve the fundamental symmetry of general relativity, which is diffeomorphism symmetry. In previous work, we argued that a successful implementation of this symmetry into discrete quantum gravity models would imply discretization independence. We therefore consider the requirement of triangulation independence for the measure in (linearized) Regge calculus, which is a discrete model for quantum gravity, appearing in the semi-classical limit of spin foam models. To this end we develop a technique to evaluate the linearized Regge action associated to Pachner moves in 3D and 4D and show that it has a simple, factorized structure. We succeed in finding a local measure for 3D (linearized) Regge calculus that leads to triangulation independence. This measure factor coincides with the asymptotics of the Ponzano Regge Model, a 3D spin foam model for gravity. We furthermore discuss to which extent one can find a triangulation independent measure for 4D Regge calculus and how such a measure would be related to a quantum model for 4D flat space. To this end, we also determine the dependence of classical Regge calculus on the choice of triangulation in 3D and 4D.

  6. Accuracy enhancement of point triangulation probes for linear displacement measurement

    Science.gov (United States)

    Kim, Kyung-Chan; Kim, Jong-Ahn; Oh, SeBaek; Kim, Soo Hyun; Kwak, Yoon Keun

    2000-03-01

    Point triangulation probes (PTBs) fall into a general category of noncontact height or displacement measurement devices. PTBs are widely used for their simple structure, high resolution, and long operating range. However, several factors must be taken into account in order to obtain high accuracy and reliability: measurement errors from inclination of the object surface, probe signal fluctuations generated by speckle effects, power variation of the light source, electronic noise, and so on. In this paper, we propose a novel signal processing algorithm, named EASDF (expanded average square difference function), for a newly designed PTB which is composed of an incoherent source (LED), a line scan array detector, a specially selected diffuse reflecting surface, and several optical components. The EASDF, which is a modified correlation function, is able to calculate the displacement between the probe and the object surface effectively even in the presence of surface inclination, power fluctuations, and noise.
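
    The exact form of EASDF is not reproduced in this record; as a rough orientation, the sketch below implements a plain average square difference function (ASDF) and takes the lag minimizing it between a reference detector profile and a new profile as the integer-pixel displacement. Names and parameters here are illustrative only.

```python
# Minimal sketch, not the authors' EASDF: a plain average square difference
# function (ASDF); the lag minimizing it gives the integer-pixel displacement.
import numpy as np

def asdf_displacement(reference, signal, max_lag):
    """Return the lag in [-max_lag, max_lag] minimizing mean((ref[i] - sig[i+lag])^2)."""
    n = len(reference)
    best_lag, best_val = 0, np.inf
    for lag in range(-max_lag, max_lag + 1):
        lo, hi = max(0, -lag), min(n, n - lag)
        if hi <= lo:
            continue
        diff = reference[lo:hi] - signal[lo + lag:hi + lag]
        val = np.mean(diff ** 2)
        if val < best_val:
            best_lag, best_val = lag, val
    return best_lag

# Example: a Gaussian spot on a 512-pixel line-scan detector, shifted by 7 pixels.
x = np.arange(512)
ref = np.exp(-0.5 * ((x - 200) / 8.0) ** 2)
sig = np.exp(-0.5 * ((x - 207) / 8.0) ** 2)
print(asdf_displacement(ref, sig, max_lag=20))  # -> 7
```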

  7. Triangulation in Friedmann's cosmological model

    International Nuclear Information System (INIS)

    Fagundes, H.V.

    1977-01-01

    In Friedmann's model, physical 3-space has a curvature K = constant. In the cases of greatest interest (K different from 0), triangulation for the measurement of great distances should be based on non-Euclidean geometries: Riemannian (or doubly elliptic) geometry for a closed universe and Bolyai-Lobachevsky's (or hyperbolic) geometry for an open universe [pt

  8. Unstructured Navier-Stokes Analysis of Full TCA Configuration

    Science.gov (United States)

    Frink, Neal T.; Pirzadeh, Shahyar Z.

    1999-01-01

    This paper presents an Unstructured Navier-Stokes Analysis of Full TCA (Technology Concept Airplane) Configuration. The topics include: 1) Motivation; 2) Milestone and approach; 3) Overview of the unstructured-grid system; 4) Results on full TCA W/B/N/D/E configuration; 5) Concluding remarks; and 6) Future directions.

  9. An overview of the stereo correlation and triangulation formulations used in DICe.

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Daniel Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This document provides a detailed overview of the stereo correlation algorithm and triangulation formulation used in the Digital Image Correlation Engine (DICe) to triangulate three dimensional motion in space given the image coordinates and camera calibration parameters.

  10. Quantum triangulations. Moduli spaces, strings, and quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Carfora, Mauro; Marzouli, Annalisa [Univ. degli Studi di Pavia (Italy). Dipt. Fisica Nucleare e Teorica; Istituto Nazionale di Fisica Nucleare e Teorica, Pavia (Italy)

    2012-07-01

    Research on polyhedral manifolds often points to unexpected connections between very distinct aspects of Mathematics and Physics. In particular, triangulated manifolds play quite a distinguished role in such settings as Riemann moduli space theory, strings and quantum gravity, topological quantum field theory, condensed matter physics, and critical phenomena. Not only do they provide a natural discrete analogue to the smooth manifolds on which physical theories are typically formulated, but their appearance is rather often a consequence of an underlying structure which naturally calls into play non-trivial aspects of representation theory, of complex analysis and topology in a way which makes manifest the basic geometric structures of the physical interactions involved. Yet, in most of the existing literature, triangulated manifolds are still merely viewed as a convenient discretization of a given physical theory to make it more amenable for numerical treatment. The motivation for these lecture notes is thus to provide an approachable introduction to this topic, emphasizing the conceptual aspects and probing, through a set of case studies, the connection between triangulated manifolds and quantum physics in depth. This volume addresses applied mathematicians and theoretical physicists working in the field of quantum geometry and its applications. (orig.)

  11. Multiphase flow of immiscible fluids on unstructured moving meshes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Erleben, Kenny; Bargteil, Adam

    2012-01-01

    In this paper, we present a method for animating multiphase flow of immiscible fluids using unstructured moving meshes. Our underlying discretization is an unstructured tetrahedral mesh, the deformable simplicial complex (DSC), that moves with the flow in a Lagrangian manner. Mesh optimization op...

  12. Multiphase Flow of Immiscible Fluids on Unstructured Moving Meshes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Erleben, Kenny; Bargteil, Adam

    2013-01-01

    In this paper, we present a method for animating multiphase flow of immiscible fluids using unstructured moving meshes. Our underlying discretization is an unstructured tetrahedral mesh, the deformable simplicial complex (DSC), that moves with the flow in a Lagrangian manner. Mesh optimization op...

  13. The Unstructured Clinical Interview

    Science.gov (United States)

    Jones, Karyn Dayle

    2010-01-01

    In mental health, family, and community counseling settings, master's-level counselors engage in unstructured clinical interviewing to develop diagnoses based on the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; "DSM-IV-TR"; American Psychiatric Association, 2000). Although counselors receive education about…

  14. The chromatic class and the chromatic number of the planar conjugated triangulation

    OpenAIRE

    Malinina, Natalia

    2013-01-01

    This material is dedicated to the estimation of the chromatic number and chromatic class of the conjugated triangulation (first conversion) and also of the second conversion of the planar triangulation. This paper also introduces some new hypotheses, which are equivalent to the Four Color Problem.

  15. Computing Flows Using Chimera and Unstructured Grids

    Science.gov (United States)

    Liou, Meng-Sing; Zheng, Yao

    2006-01-01

    DRAGONFLOW is a computer program that solves the Navier-Stokes equations of flows in complexly shaped three-dimensional regions discretized by use of a direct replacement of arbitrary grid overlapping by nonstructured (DRAGON) grid. A DRAGON grid (see figure) is a combination of a chimera grid (a composite of structured subgrids) and a collection of unstructured subgrids. DRAGONFLOW incorporates modified versions of two prior Navier-Stokes-equation-solving programs: OVERFLOW, which is designed to solve on chimera grids; and USM3D, which is used to solve on unstructured grids. A master module controls the invocation of individual modules in the libraries. At each time step of a simulated flow, DRAGONFLOW is invoked on the chimera portion of the DRAGON grid in alternation with USM3D, which is invoked on the unstructured subgrids of the DRAGON grid. The USM3D and OVERFLOW modules then immediately exchange their solutions and other data. As a result, USM3D and OVERFLOW are coupled seamlessly.

  16. On Adding Structure to Unstructured Overlay Networks

    Science.gov (United States)

    Leitão, João; Carvalho, Nuno A.; Pereira, José; Oliveira, Rui; Rodrigues, Luís

    Unstructured peer-to-peer overlay networks are very resilient to churn and topology changes, while requiring little maintenance cost. Therefore, they are an infrastructure to build highly scalable large-scale services in dynamic networks. Typically, the overlay topology is defined by a peer sampling service that aims at maintaining, in each process, a random partial view of peers in the system. The resulting random unstructured topology is suboptimal when a specific performance metric is considered. On the other hand, structured approaches (for instance, a spanning tree) may optimize a given target performance metric but are highly fragile. In fact, the cost for maintaining structures with strong constraints may easily become prohibitive in highly dynamic networks. This chapter discusses different techniques that aim at combining the advantages of unstructured and structured networks. Namely we focus on two distinct approaches, one based on optimizing the overlay and another based on optimizing the gossip mechanism itself.
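
    As a toy illustration of the peer sampling idea discussed above (hypothetical, not a protocol from the chapter), two peers can periodically exchange random halves of their partial views and keep the merged result bounded in size:

```python
# Hypothetical gossip-style view shuffle between two peers; names and the view
# size are illustrative, not taken from the chapter.
import random

VIEW_SIZE = 6

def shuffle_views(view_a, view_b, self_a, self_b):
    """One symmetric exchange of random view halves between peers self_a and self_b."""
    sample_a = random.sample(view_a, min(len(view_a), VIEW_SIZE // 2))
    sample_b = random.sample(view_b, min(len(view_b), VIEW_SIZE // 2))
    # Merge what was received, drop self-references and duplicates, truncate.
    new_a = list(dict.fromkeys(p for p in view_a + sample_b if p != self_a))[:VIEW_SIZE]
    new_b = list(dict.fromkeys(p for p in view_b + sample_a if p != self_b))[:VIEW_SIZE]
    return new_a, new_b

a_view = ["n2", "n3", "n4", "n5"]
b_view = ["n1", "n6", "n7", "n8"]
print(shuffle_views(a_view, b_view, self_a="n1", self_b="n2"))
```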

  17. Drug repurposing by integrated literature mining and drug–gene–disease triangulation

    DEFF Research Database (Denmark)

    Sun, Peng; Guo, Jiong; Winnenburg, Rainer

    2017-01-01

    recent developments in computational drug repositioning and introduce the utilized data sources. Afterwards, we introduce a new data fusion model based on n-cluster editing as a novel multi-source triangulation strategy, which was further combined with semantic literature mining. Our evaluation suggests...... that utilizing drug–gene–disease triangulation coupled to sophisticated text analysis is a robust approach for identifying new drug candidates for repurposing....

  18. Triangulation-based edge measurement using polyview optics

    Science.gov (United States)

    Li, Yinan; Kästner, Markus; Reithmeier, Eduard

    2018-04-01

    Laser triangulation sensors are non-contact measurement devices widely used in industry and research for profile measurements and quantitative inspections. Some technical applications, e.g. edge measurements, usually require a configuration of a single sensor and a translation stage, or a configuration of multiple sensors, so that they can cover a measurement range that is beyond the scope of a single sensor. However, the cost of both configurations is high, due to the additional rotational axis or additional sensor. This paper presents a measurement system for large curved surfaces based on a single-sensor configuration. Utilizing self-designed polyview optics and a calibration process, the proposed measurement system provides a field of view (FOV) of more than 180° with high measurement accuracy at low cost. The detailed capability of this measurement system is discussed in this paper on the basis of experimental data.

  19. Unstructured mesh adaptivity for urban flooding modelling

    Science.gov (United States)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and of the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process. For example, high-resolution meshes around buildings and steep regions are placed when the flood water reaches these regions. In this work a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom, has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  20. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure finds the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result for d = 2 dimensions by proving that this procedure takes expected time close to O(n^{1/(d+1)}) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.
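
    The paper itself ships no code here; the sketch below is a minimal 2D illustration of the jump-and-walk idea under stated assumptions: sample a few sites to pick a nearby starting triangle, then walk across neighboring triangles using orientation tests until the query point is contained. SciPy is used only to build the triangulation.

```python
# Minimal 2D sketch of "jump and walk" point location (not the paper's code).
import numpy as np
from scipy.spatial import Delaunay

def orient(a, b, c):
    """Positive if a, b, c are counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def jump_and_walk(tri, q, n_samples=15, rng=np.random.default_rng(0)):
    pts = tri.points
    # Jump: start from a triangle incident to the sampled site closest to q.
    sample = rng.choice(len(pts), size=min(n_samples, len(pts)), replace=False)
    start_site = sample[np.argmin(np.linalg.norm(pts[sample] - q, axis=1))]
    simplex = np.flatnonzero((tri.simplices == start_site).any(axis=1))[0]
    # Walk: cross any edge that separates the current triangle from q.
    while True:
        verts = tuple(pts[v] for v in tri.simplices[simplex])
        for k in range(3):
            # Edge opposite vertex k; neighbors[simplex, k] lies across it.
            e1, e2 = verts[(k + 1) % 3], verts[(k + 2) % 3]
            if orient(e1, e2, q) * orient(e1, e2, verts[k]) < 0:
                simplex = tri.neighbors[simplex, k]
                if simplex == -1:
                    return None  # q is outside the triangulation
                break
        else:
            return simplex  # no separating edge: q lies in this triangle

points = np.random.default_rng(1).random((500, 2))
tri = Delaunay(points)
q = np.array([0.4, 0.6])
print(jump_and_walk(tri, q), tri.find_simplex(q))  # the two should agree
```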

  1. Triangulating' AMPATH: Demonstration of a multi-perspective ...

    African Journals Online (AJOL)

    For strategic planning, the Kenyan HIV/AIDS programme AMPATH (Academic Model Providing Access to Healthcare) sought to evaluate its performance in 2006. The method used for this evaluation was termed 'triangulation,' because it used information from three different sources – patients, communities, and programme ...

  2. A Sweepline Algorithm for Generalized Delaunay Triangulations

    DEFF Research Database (Denmark)

    Skyum, Sven

    We give a deterministic O(n log n) sweepline algorithm to construct the generalized Voronoi diagram for n points in the plane or rather its dual the generalized Delaunay triangulation. The algorithm uses no transformations and it is developed solely from the sweepline paradigm together...

  3. Triangulation applied to Jan H. van Bemmel

    NARCIS (Netherlands)

    Hasman, A.; Bergemann, D.; McCray, A. T.; Talmon, J. L.; Zvárová, J.

    2006-01-01

    OBJECTIVE: To describe the person of Jan H. van Bemmel from different points of view. METHOD: Triangulation. RESULTS AND CONCLUSIONS: Jan H. van Bemmel successfully contributed to research and education in medical informatics. He inspired a lot of people in The Netherlands and internationally

  4. Tradeoffs in Design Research: Development Oriented Triangulation

    NARCIS (Netherlands)

    Koen van Turnhout; Sabine Craenmehr; Robert Holwerda; Mark Menijn; Jan-Pieter Zwart; René Bakker

    2013-01-01

    The Development Oriented Triangulation (DOT) framework in this paper can spark and focus the debate about mixed-method approaches in HCI. The framework can be used to classify HCI methods, create mixed-method designs, and to align research activities in multidisciplinary projects. The framework is

  5. Constructing Delaunay triangulations along space-filling curves

    NARCIS (Netherlands)

    Buchin, K.; Fiat, A.; Sanders, P.

    2009-01-01

    Incremental construction con BRIO using a space-filling curve order for insertion is a popular algorithm for constructing Delaunay triangulations. So far, it has only been analyzed for the case that a worst-case optimal point location data structure is used which is often avoided in implementations.

  6. High Performance Parallel Multigrid Algorithms for Unstructured Grids

    Science.gov (United States)

    Frederickson, Paul O.

    1996-01-01

    We describe a high performance parallel multigrid algorithm for a rather general class of unstructured grid problems in two and three dimensions. The algorithm PUMG, for parallel unstructured multigrid, is related in structure to the parallel multigrid algorithm PSMG introduced by McBryan and Frederickson, for they both obtain a higher convergence rate through the use of multiple coarse grids. Another reason for the high convergence rate of PUMG is its smoother, an approximate inverse developed by Baumgardner and Frederickson.

  7. Semantic Annotation of Unstructured Documents Using Concepts Similarity

    Directory of Open Access Journals (Sweden)

    Fernando Pech

    2017-01-01

    Full Text Available There is a large amount of information in the form of unstructured documents, which poses challenges for information storage, search, and retrieval. This situation has given rise to several information search approaches. Some proposals take into account the contextual meaning of the terms specified in the query. Semantic annotation techniques can help to retrieve and extract information in unstructured documents. We propose a semantic annotation strategy for unstructured documents as part of a semantic search engine. In this proposal, ontologies are used to determine the context of the entities specified in the query. Our strategy for extracting the context is focused on concept similarity. Each relevant term of the document is associated with an instance in the ontology. The similarity between each of the explicit relationships is measured through the combination of two types of associations: the association between each pair of concepts and the calculation of the weight of the relationships.

  8. Parallel Performance Optimizations on Unstructured Mesh-based Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; Huck, Kevin; Hollingsworth, Jeffrey; Malony, Allen; Williams, Samuel; Oliker, Leonid

    2015-01-01

    This paper addresses two key parallelization challenges in the unstructured mesh-based ocean modeling code MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh, and develops methods to generate mesh partitionings with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication reduction approaches. We present detailed performance data when running on thousands of cores using the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.
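
    The specific ordering scheme used in the paper is not reproduced here; as one concrete illustration of reordering unstructured cells for cache locality, the hypothetical sketch below renumbers cells with reverse Cuthill-McKee so that mesh neighbors end up close together in memory.

```python
# Illustrative sketch (not the MPAS-Ocean implementation): renumber mesh cells
# so neighboring cells sit close together in memory, improving cache reuse in
# sweeps over cell neighborhoods.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Hypothetical cell-to-cell adjacency (pairs of neighboring cell indices).
edges = np.array([[0, 7], [7, 3], [3, 5], [5, 1], [1, 6], [6, 2], [2, 4], [4, 0]])
n_cells = 8
rows = np.concatenate([edges[:, 0], edges[:, 1]])
cols = np.concatenate([edges[:, 1], edges[:, 0]])
adj = csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(n_cells, n_cells))

perm = reverse_cuthill_mckee(adj, symmetric_mode=True)
print("new cell order:", perm)

# Apply the permutation to per-cell data and renumber the adjacency list.
cell_data = np.arange(n_cells, dtype=float)      # stand-in for per-cell fields
cell_data_reordered = cell_data[perm]
inverse = np.empty(n_cells, dtype=int)
inverse[perm] = np.arange(n_cells)
edges_renumbered = inverse[edges]
```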

  9. Altitude, Orthocenter of a Triangle and Triangulation

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2016-03-01

    Full Text Available We introduce the altitudes of a triangle (the cevians perpendicular to the opposite sides). Using the generalized Ceva's Theorem, we prove the existence and uniqueness of the orthocenter of a triangle [7]. Finally, we formalize in Mizar [1] some formulas [2] to calculate distance using triangulation.
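
    For orientation, the classical plane-triangulation distance relation reads as follows (this is the textbook form, assumed here rather than quoted from the Mizar formalization): with a baseline of length b and base angles α and β measured at its endpoints, the perpendicular distance d from the target to the baseline is

```latex
d \;=\; \frac{b \,\sin\alpha \,\sin\beta}{\sin(\alpha + \beta)}
```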

  10. Multigrid on unstructured grids using an auxiliary set of structured grids

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, C.C.; Malhotra, S.; Schultz, M.H. [Yale Univ., New Haven, CT (United States)

    1996-12-31

    Unstructured grids do not have a convenient and natural multigrid framework for actually computing and maintaining a high floating point rate on standard computers. In fact, just the coarsening process is expensive for many applications. Since unstructured grids play a vital role in many scientific computing applications, many modifications have been proposed to solve this problem. One suggested solution is to map the original unstructured grid onto a structured grid. This can be used as a fine grid in a standard multigrid algorithm to precondition the original problem on the unstructured grid. We show that unless extreme care is taken, this mapping can lead to a system with a high condition number which eliminates the usefulness of the multigrid method. Theorems with lower and upper bounds are provided. Simple examples show that the upper bounds are sharp.

  11. An efficient approach to unstructured mesh hydrodynamics on the cell broadband engine

    Energy Technology Data Exchange (ETDEWEB)

    Ferenbaugh, Charles R [Los Alamos National Laboratory

    2010-01-01

    Unstructured mesh physics for the Cell Broadband Engine (CBE) has received little or no attention to date, largely because the CBE architecture poses particular challenges for unstructured mesh algorithms. The most common SPU memory management strategies cannot be applied to the irregular memory access patterns of unstructured meshes, and the SPU vector instruction set does not support the indirect addressing needed by connectivity arrays. This paper presents an approach to unstructured mesh physics that addresses these challenges, by creating a new mesh data structure and reorganizing code to give efficient CBE performance. The approach is demonstrated on the FLAG production hydrodynamics code using standard test problems, and results show an average speedup of more than 5x over the original code.

  12. Development and verification of unstructured adaptive mesh technique with edge compatibility

    International Nuclear Information System (INIS)

    Ito, Kei; Ohshima, Hiroyuki; Kunugi, Tomoaki

    2010-01-01

    In the design study of the large-sized sodium-cooled fast reactor (JSFR), one key issue is the suppression of gas entrainment (GE) phenomena at the gas-liquid interface. Therefore, the authors have developed a high-precision CFD algorithm to evaluate the GE phenomena accurately. The CFD algorithm has been developed on unstructured meshes to establish an accurate modeling of the JSFR system. For two-phase interfacial flow simulations, a high-precision volume-of-fluid algorithm is employed. It was confirmed that the developed CFD algorithm could reproduce the GE phenomena in a simple GE experiment. Recently, the authors have developed an important technique for the simulation of the GE phenomena in JSFR, namely an unstructured adaptive mesh technique which can apply fine cells dynamically to the region where GE occurs in JSFR. In this paper, as a part of this development, a two-dimensional unstructured adaptive mesh technique is discussed. In the two-dimensional adaptive mesh technique, each cell is refined isotropically to reduce distortions of the mesh. In addition, connection cells are formed to eliminate the edge incompatibility between refined and non-refined cells. The two-dimensional unstructured adaptive mesh technique is verified by solving the well-known lid-driven cavity flow problem. As a result, the two-dimensional unstructured adaptive mesh technique succeeds in providing a high-precision solution, even when a poor-quality, distorted initial mesh is employed. In addition, the simulation error on the two-dimensional unstructured adaptive mesh is much smaller than the error on a structured mesh with a larger number of cells. (author)

  13. Enhancement of Friction against a Rough Surface by a Ridge-Channel Surface Microstructure.

    Science.gov (United States)

    Bai, Ying; Hui, Chung-Yuen; Levrard, Benjamin; Jagota, Anand

    2015-07-14

    We report on a study of the sliding friction of elastomeric surfaces patterned with ridges and channels (and unstructured flat controls), against both smooth and roughened spherical indenters. Against the smooth spherical indenter, all of the structured surfaces have highly reduced sliding friction due to the reduction in actual area of contact. Against roughened spherical indenters, however, the sliding force for structured samples can be up to 50% greater than that of an unstructured flat control. The mechanism of enhanced friction against a rough surface is due to a combination of increased actual area of contact, interlocking between roughness and the surface structure, and attendant dynamic instabilities that dissipate energy.

  14. The relationships between stressful life events during childhood and differentiation of self and intergenerational triangulation in adulthood.

    Science.gov (United States)

    Peleg, Ora

    2014-12-01

    This study examined the relationships between stressful life events in childhood and differentiation of self and intergenerational triangulation in adulthood. The sample included 217 students (173 females and 44 males) from a college in northern Israel. Participants completed the Hebrew versions of Life Events Checklist (LEC), Differentiation of Self Inventory-Revised (DSI-R) and intergenerational triangulation (INTRI). The main findings were that levels of stressful life events during childhood and adolescence among both genders were positively correlated with the levels of fusion with others and intergenerational triangulation. The levels of positive life events were negatively related to levels of emotional reactivity, emotional cut-off and intergenerational triangulation. Levels of stressful life events in females were positively correlated with emotional reactivity. Intergenerational triangulation was correlated with emotional reactivity, emotional cut-off, fusion with others and I-position. Findings suggest that families that experience higher levels of stressful life events may be at risk for higher levels of intergenerational triangulation and lower levels of differentiation of self. © 2014 International Union of Psychological Science.

  15. TRIANGULATION OF METHODS OF CAREER EDUCATION

    Directory of Open Access Journals (Sweden)

    Marija Turnsek Mikacic

    2015-09-01

    Full Text Available This paper is an overview of the current research in the field of career education and career planning. The presented results constitute a model based on insight into different theories and empirical studies about career planning as a building block of personal excellence. We established the credibility, transferability and reliability of the research by means of triangulation. As data sources for triangulation we included essays of education participants and questionnaires. Qualitative analysis provided the framework for the construction of the paradigmatic model and the formulation of the final theory. We formulated a questionnaire on the basis of our own experience in the education of individuals. The quantitative analysis, based on the results of the interviews, confirms the following three hypotheses: the individuals who elaborated a personal career plan and acted accordingly changed their attitudes towards their careers and took control over their lives; in addition, they achieved a high level of self-esteem and self-confidence, in tandem with the perception of personal excellence, in contrast to the individuals who did not participate in career education and did not elaborate a career plan. We used the tools of NLP (neurolinguistic programming) as an additional learning technique.

  16. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text mining is a well-known technique for discovering interesting patterns and trends, i.e. non-trivial knowledge, from massive unstructured text data. Text mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), frequency graph, network analysis graph, word cloud and dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
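
    VisualUrText itself is not distributed with this record; the snippet below only illustrates one of the listed outputs, a Document Term Matrix, using scikit-learn (the tool's actual pipeline and parameters are assumptions here).

```python
# Small sketch of a Document-Term Matrix (DTM), one of the outputs named above.
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "unstructured text mining finds patterns in text",
    "text analytics visualizes unstructured data",
    "word clouds summarize frequent terms",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)          # sparse documents-by-terms matrix
terms = vectorizer.get_feature_names_out()

print(terms)
print(dtm.toarray())                          # rows: documents, columns: term counts
```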

  17. Internet information triangulation: Design theory and prototype evaluation

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Brinkhuis, Michel

    2014-01-01

    Many discussions exist regarding the credibility of information on the Internet. Similar discussions happen on the interpretation of social scientific research data, for which information triangulation has been proposed as a useful method. In this article, we explore a design theory—consisting of a

  18. Incompressible Navier-Stokes inverse design method based on adaptive unstructured meshes

    International Nuclear Information System (INIS)

    Rahmati, M.T.; Charlesworth, D.; Zangeneh, M.

    2005-01-01

    An inverse method for blade design based on the Navier-Stokes equations on adaptive unstructured meshes has been developed. Unlike methods based on inviscid equations, the effect of viscosity is directly taken into account. In the method, the pressure (or pressure loading) is prescribed. The design method then computes the blade shape that would accomplish the target prescribed pressure distribution. The method is implemented using a cell-centered finite volume method, which solves the incompressible Navier-Stokes equations on unstructured meshes. An adaptive unstructured mesh method based on grid subdivision and local mesh adaptation is utilized to increase the accuracy. (author)

  19. Relating covariant and canonical approaches to triangulated models of quantum gravity

    International Nuclear Information System (INIS)

    Arnsdorf, Matthias

    2002-01-01

    In this paper we explore the relation between covariant and canonical approaches to quantum gravity and BF theory. We will focus on the dynamical triangulation and spin-foam models, which have in common that they can be defined in terms of sums over spacetime triangulations. Our aim is to show how we can recover these covariant models from a canonical framework by providing two regularizations of the projector onto the kernel of the Hamiltonian constraint. This link is important for the understanding of the dynamics of quantum gravity. In particular, we will see how in the simplest dynamical triangulation model we can recover the Hamiltonian constraint via our definition of the projector. Our discussion of spin-foam models will show how the elementary spin-network moves in loop quantum gravity, which were originally assumed to describe the Hamiltonian constraint action, are in fact related to the time-evolution generated by the constraint. We also show that the Immirzi parameter is important for the understanding of a continuum limit of the theory

  20. Triangulation and the importance of establishing valid methods for food safety culture evaluation.

    Science.gov (United States)

    Jespersen, Lone; Wallace, Carol A

    2017-10-01

    The research evaluates maturity of food safety culture in five multi-national food companies using method triangulation, specifically self-assessment scale, performance documents, and semi-structured interviews. Weaknesses associated with each individual method are known but there are few studies in food safety where a method triangulation approach is used for both data collection and data analysis. Significantly, this research shows that individual results taken in isolation can lead to wrong conclusions, resulting in potentially failing tactics and wasted investments. However, by applying method triangulation and reviewing results from a range of culture measurement tools it is possible to better direct investments and interventions. The findings add to the food safety culture paradigm beyond a single evaluation of food safety culture using generic culture surveys. Copyright © 2017. Published by Elsevier Ltd.

  1. Flattening of the electrocardiographic T-wave is a sign of proarrhythmic risk and a reflection of action potential triangulation

    DEFF Research Database (Denmark)

    Bhuiyan, Tanveer Ahmed; Graff, Claus; Kanters, J.K.

    2013-01-01

    Drug-induced triangulation of the cardiac action potential is associated with increased risk of arrhythmic events. It has been suggested that triangulation causes a flattening of the electrocardiographic T-wave but the relationship between triangulation, T-wave flattening and onset of arrhythmia ...

  2. Toward An Unstructured Mesh Database

    Science.gov (United States)

    Rezaei Mahdiraji, Alireza; Baumann, Peter Peter

    2014-05-01

    Unstructured meshes are used in several application domains such as earth sciences (e.g., seismology), medicine, oceanography, climate modeling and GIS as approximate representations of physical objects. Meshes subdivide a domain into smaller geometric elements (called cells) which are glued together by incidence relationships. The subdivision of a domain allows computational manipulation of complicated physical structures. For instance, seismologists model earthquakes using elastic wave propagation solvers on hexahedral meshes. Such a mesh contains several hundred million grid points and millions of hexahedral cells, and each vertex node stores a multitude of data fields. To run simulations on such meshes, one needs to iterate over all the cells, iterate over the cells incident to a given cell, retrieve coordinates of cells, assign data values to cells, etc. Although meshes are used in many application domains, to the best of our knowledge there is no database vendor that supports unstructured mesh features. Currently, the main tools for querying and manipulating unstructured meshes are mesh libraries, e.g., CGAL and GRAL. Mesh libraries are dedicated libraries which include mesh algorithms and can be run on mesh representations. The libraries do not scale with dataset size, do not have a declarative query language, and need deep C++ knowledge for query implementations. Furthermore, due to high coupling between the implementations and the input file structure, the implementations are less reusable and costly to maintain. A dedicated mesh database offers the following advantages: 1) declarative querying, 2) ease of maintenance, 3) hiding the mesh storage structure from applications, and 4) transparent query optimization. To design a mesh database, the first challenge is to define a suitable generic data model for unstructured meshes. We proposed the ImG-Complexes data model as a generic topological mesh data model which extends the incidence graph model to multi

  3. Implementation of LDG method for 3D unstructured meshes

    Directory of Open Access Journals (Sweden)

    Filander A. Sequeira Chavarría

    2012-07-01

    Full Text Available This paper describes an implementation of the Local Discontinuous Galerkin (LDG) method applied to elliptic problems in 3D. The implementation of the major operators is discussed, in particular the use of higher-order approximations and unstructured meshes. Efficient data structures that allow fast assembly of the linear system in the mixed formulation are described in detail. Keywords: Discontinuous finite element methods, high-order approximations, unstructured meshes, object-oriented programming. Mathematics Subject Classification: 65K05, 65N30, 65N55.

  4. An efficient approach to unstructured mesh hydrodynamics on the cell broadband engine (u)

    Energy Technology Data Exchange (ETDEWEB)

    Ferenbaugh, Charles R [Los Alamos National Laboratory

    2010-12-14

    Unstructured mesh physics for the Cell Broadband Engine (CBE) has received little or no attention to date, largely because the CBE architecture poses particular challenges for unstructured mesh algorithms. SPU memory management strategies such as data preloading cannot be applied to the irregular memory storage patterns of unstructured meshes; and the SPU vector instruction set does not support the indirect addressing needed by connectivity arrays. This paper presents an approach to unstructured mesh physics that addresses these challenges, by creating a new mesh data structure and reorganizing code to give efficient CBE performance. The approach is demonstrated on the FLAG production hydrodynamics code using standard test problems, and results show an average speedup of more than 5x over the original code.

  5. Quantum search of a real unstructured database

    Science.gov (United States)

    Broda, Bogusław

    2016-02-01

    A simple circuit implementation of the oracle for Grover's quantum search of a real unstructured classical database is proposed. The oracle contains a kind of quantumly accessible classical memory, which stores the database.

  6. Finite volume methods for the incompressible Navier-Stokes equations on unstructured grids

    Energy Technology Data Exchange (ETDEWEB)

    Meese, Ernst Arne

    1998-07-01

    Most solution methods of computational fluid dynamics (CFD) use structured grids based on curvilinear coordinates for compliance with complex geometries. In a typical industry application, about 80% of the time used to produce the results is spent constructing computational grids. Recently, the use of unstructured grids has been strongly advocated. For unstructured grids there are methods for generating them automatically on quite complex domains. This thesis focuses on the design of Navier-Stokes solvers that can cope with unstructured grids and 'low quality grids', thus reducing the need for human intervention in the grid generation.

  7. Triangulation Made Easy

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, P

    2009-12-23

    We describe a simple and efficient algorithm for two-view triangulation of 3D points from approximate 2D matches based on minimizing the L2 reprojection error. Our iterative algorithm improves on the one by Kanatani et al. by ensuring that in each iteration the epipolar constraint is satisfied. In the case where the two cameras are pointed in the same direction, the method provably converges to an optimal solution in exactly two iterations. For more general camera poses, two iterations are sufficient to achieve convergence to machine precision, which we exploit to devise a fast, non-iterative method. The resulting algorithm amounts to little more than solving a quadratic equation, and involves a fixed, small number of simple matrix-vector operations and no conditional branches. We demonstrate that the method computes solutions that agree to very high precision with those of Hartley and Sturm's original polynomial method, while achieving higher numerical stability and 1-4 orders of magnitude greater speed.
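
    The paper's fast two-iteration solver is not reproduced in this record; as a point of reference, the sketch below implements the standard linear (DLT) two-view triangulation that such methods are typically compared against, with a toy pair of cameras pointed in the same direction.

```python
# Reference sketch: textbook linear (DLT) two-view triangulation with NumPy.
# This is the classical baseline, not the paper's fast two-iteration method.
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """P1, P2: 3x4 projection matrices; x1, x2: matching pixel coordinates (u, v)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # inhomogeneous 3D point

# Toy example: two parallel cameras (baseline 0.5 along x) observing (0.2, 0.1, 5).
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 5.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_dlt(P1, P2, x1, x2))   # ~ [0.2, 0.1, 5.0]
```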

  8. A Denotational Semantics for Communicating Unstructured Code

    Directory of Open Access Journals (Sweden)

    Nils Jähnig

    2015-03-01

    Full Text Available An important property of programming language semantics is that they should be compositional. However, unstructured low-level code contains goto-like commands, making it hard to define a semantics that is compositional. In this paper, we follow the ideas of Saabas and Uustalu to structure low-level code. This gives us the possibility to define a compositional denotational semantics based on least fixed points to allow for the use of inductive verification methods. We capture the semantics of communication using finite traces similar to the denotations of CSP. In addition, we examine properties of this semantics and give an example that demonstrates reasoning about communication and jumps. With this semantics, we lay the foundations for a proof calculus that captures both the semantics of unstructured low-level code and communication.

  9. The use of Triangulation in Social Sciences Research : Can qualitative and quantitative methods be combined?

    Directory of Open Access Journals (Sweden)

    Ashatu Hussein

    2015-03-01

    Full Text Available This article refers to a study in Tanzania on fringe benefits or welfare via the work contract, in which we work both quantitatively and qualitatively. My focus is on the vital issue of combining methods or methodologies. There have been mixed views on the use of triangulation in research. Some authors argue that triangulation serves only to widen and deepen understanding of the studied phenomenon, while others have argued that triangulation is actually used to increase study accuracy; in this case triangulation is one of the validity measures. Triangulation is defined as the use of multiple methods, mainly qualitative and quantitative methods, in studying the same phenomenon for the purpose of increasing study credibility. This implies that triangulation is the combination of two or more methodological approaches, theoretical perspectives, data sources, investigators and analysis methods to study the same phenomenon. However, using both qualitative and quantitative paradigms in the same study has resulted in debate, with some researchers arguing that the two paradigms differ epistemologically and ontologically. Nevertheless, both paradigms are designed towards understanding a particular subject area of interest, and both of them have strengths and weaknesses. Thus, when combined, there is a great possibility of neutralizing the flaws of one method and strengthening the benefits of the other for better research results. Thus, to reap the benefits of the two paradigms while minimizing the drawbacks of each, the combination of the two approaches is advocated in this article. The quality of our studies on welfare to combat poverty is crucial, especially when we want our conclusions to matter in practice.

  10. Random discrete Morse theory and a new library of triangulations

    DEFF Research Database (Denmark)

    Benedetti, Bruno; Lutz, Frank Hagen

    2014-01-01

    We introduce random discrete Morse theory as a computational scheme to measure the complexity of a triangulation. The idea is to try to quantify the frequency of discrete Morse matchings with few critical cells. Our measure will depend on the topology of the space, but also on how nicely the space...... is triangulated. The scheme we propose looks for optimal discrete Morse functions with an elementary random heuristic. Despite its naiveté, this approach turns out to be very successful even in the case of huge inputs. In our view, the existing libraries of examples in computational topology are “too easy......” for testing algorithms based on discrete Morse theory. We propose a new library containing more complicated (and thus more meaningful) test examples....

  11. Employee-satisfaction: A triangulation approach

    Directory of Open Access Journals (Sweden)

    P. J. Visser

    1997-06-01

    Full Text Available The research on employee-satisfaction was conducted in the manufacturing industry. The sample consisted of 543 employees. The methodology can be described as a "triangulation approach", in which a combination of quantitative and qualitative measurements was utilised and the results of both types of measurement integrated in the study of the construct. The research confirms existing findings that, although the measurement of dimensions such as equitable rewards, working conditions, supportive colleagues, job content, etc. yields results on the level of employee-satisfaction, a single question, namely "How satisfied are you with your job?", compares favourably with the general index. The findings also suggest the advantage of complementing the quantitative data with qualitative information. The conclusions confirm the value of a qualitative method in cross-cultural research in an African environment. Summary: The research on employee satisfaction was carried out in the manufacturing industry. The sample consisted of 543 employees. The method of investigation can be described as a "triangulation approach", in which quantitative and qualitative measurements were used and the results integrated in the study of the construct. The research confirms existing findings that, although the measurement of dimensions such as equitable rewards, working conditions, supportive colleagues, job content, etc. yields results regarding the level of employee satisfaction, a single question, namely "How satisfied are you with your job?", compares favourably with the general index. The findings also point to the advantages of an approach in which the quantitative data are complemented by qualitative information obtained from individual interviews. The conclusions confirm the value of the qualitative research method for cross-cultural research in an African context.

  12. Multi-region unstructured volume segmentation using tetrahedron filling

    Energy Technology Data Exchange (ETDEWEB)

    Willliams, Sean Jamerson [Los Alamos National Laboratory; Dillard, Scott E [Los Alamos National Laboratory; Thoma, Dan J [MDI, INSTITUTES; Hlawitschka, Mario [UC DAVIS; Hamann, Bernd [UC DAVIS

    2010-01-01

    Segmentation is one of the most common operations in image processing, and while there are several solutions already present in the literature, they each have their own benefits and drawbacks that make them well-suited for some types of data and not for others. We focus on the problem of breaking an image into multiple regions in a single segmentation pass, while supporting both voxel and scattered point data. To solve this problem, we begin with a set of potential boundary points and use a Delaunay triangulation to complete the boundaries. We use heuristic- and interaction-driven Voronoi clustering to find reasonable groupings of tetrahedra. Apart from the computation of the Delaunay triangulation, our algorithm has linear time complexity with respect to the number of tetrahedra.

  13. Visualization research of 3D radiation field based on Delaunay triangulation

    International Nuclear Information System (INIS)

    Xie Changji; Chen Yuqing; Li Shiting; Zhu Bo

    2011-01-01

    Based on the characteristics of three-dimensional partitioning, the triangulation of discrete data sets is improved by the method of point-by-point insertion. The discrete radiation-field data obtained by theoretical calculation or actual measurement are restructured, and a continuous distribution of the radiation field data is obtained. Finally, the 3D virtual scene of the nuclear facilities is built with VR simulation techniques, and the visualization of the 3D radiation field is achieved by visualization mapping techniques. It is shown that the method combining VR and Delaunay triangulation can greatly improve the quality and efficiency of 3D radiation field visualization. (authors)
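
    The cited implementation is not available here; the sketch below illustrates the same restructuring idea with SciPy, which triangulates scattered samples (Delaunay internally) and linearly interpolates them onto a regular grid for volume visualization. The field and geometry are hypothetical.

```python
# Simplified sketch of the reconstruction step (not the cited code): scattered
# dose-rate samples -> Delaunay-based linear interpolation -> regular grid.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(0)
sample_xyz = rng.uniform(0, 10, size=(200, 3))                # measured/calculated positions
dose_rate = np.exp(-np.linalg.norm(sample_xyz - 5, axis=1))   # hypothetical field values

interp = LinearNDInterpolator(sample_xyz, dose_rate, fill_value=0.0)

# Evaluate a continuous field on a coarse grid for volume visualization.
grid = np.stack(np.meshgrid(*[np.linspace(0, 10, 16)] * 3, indexing="ij"), axis=-1)
field = interp(grid.reshape(-1, 3)).reshape(16, 16, 16)
print(field.shape, float(field.max()))
```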

  14. Indirect measurement of molten steel level in tundish based on laser triangulation

    Science.gov (United States)

    Su, Zhiqi; He, Qing; Xie, Zhi

    2016-03-01

    For real-time and precise measurement of the molten steel level in the tundish during continuous casting, the slag level and slag thickness are needed. The problem of slag thickness measurement was solved in our previous work. In this paper, a systematic solution for slag level measurement based on laser triangulation is proposed. Unlike traditional laser triangulation, several measures have been taken to improve precision and robustness. First, a laser line is adopted for multi-position measurement to overcome the deficiency of a single-point laser range finder caused by the uneven surface of the slag. Second, the key parameters, such as the installation angle and the minimum required laser power, are analyzed and determined based on gray-body radiation theory to fulfill the rigorous requirement of measurement accuracy. Third, two kinds of severe noise in the acquired images, caused respectively by heat radiation and electromagnetic interference (EMI), are removed using the morphological characteristics of the liquid slag and the color difference between the EMI and the laser signals, respectively. Fourth, since false targets created by stationary slag usually disturb the measurement, valid slag signals are distinguished from false ones to calculate the slag level. The molten steel level is then obtained as the slag level minus the slag thickness. The measurement error of this solution is verified by applications in steel plants: ±2.5 mm during steady casting and ±3.2 mm at the end of casting.
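
    As background for the measuring principle (a textbook first-order relation, not a formula quoted from the paper): a change Δz of the surface level along the laser beam shifts the imaged laser spot on the detector by an amount proportional to the sine of the triangulation angle,

```latex
\Delta x \;\approx\; m \,\Delta z \,\sin\theta
\qquad\Longrightarrow\qquad
\Delta z \;\approx\; \frac{\Delta x}{m\,\sin\theta},
```

    where m is the imaging magnification and θ the angle between the laser beam and the camera's viewing direction.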

  15. Indirect measurement of molten steel level in tundish based on laser triangulation

    Energy Technology Data Exchange (ETDEWEB)

    Su, Zhiqi; He, Qing, E-mail: heqing@ise.neu.edu.cn; Xie, Zhi [State Key Laboratory of Synthetical Automation for Process Industries, School of Information Science and Engineering, Northeastern University, Shenyang 110819 (China)

    2016-03-15

    For real-time and precise measurement of the molten steel level in the tundish during continuous casting, the slag level and slag thickness are needed. The problem of slag thickness measurement was solved in our previous work. In this paper, a systematic solution for slag level measurement based on laser triangulation is proposed. Unlike traditional laser triangulation, several measures have been taken to improve precision and robustness. First, a laser line is adopted for multi-position measurement to overcome the deficiency of a single-point laser range finder caused by the uneven surface of the slag. Second, the key parameters, such as the installation angle and the minimum required laser power, are analyzed and determined based on gray-body radiation theory to fulfill the rigorous requirement of measurement accuracy. Third, two kinds of severe noise in the acquired images, caused respectively by heat radiation and electromagnetic interference (EMI), are removed using the morphological characteristics of the liquid slag and the color difference between the EMI and the laser signals, respectively. Fourth, since false targets created by stationary slag usually disturb the measurement, valid slag signals are distinguished from false ones to calculate the slag level. The molten steel level is then obtained as the slag level minus the slag thickness. The measurement error of this solution is verified by applications in steel plants: ±2.5 mm during steady casting and ±3.2 mm at the end of casting.

  16. Large N Limits in Tensor Models: Towards More Universality Classes of Colored Triangulations in Dimension d≥2

    Science.gov (United States)

    Bonzom, Valentin

    2016-07-01

    We review an approach which aims at studying discrete (pseudo-)manifolds in dimension d≥ 2 and called random tensor models. More specifically, we insist on generalizing the two-dimensional notion of p-angulations to higher dimensions. To do so, we consider families of triangulations built out of simplices with colored faces. Those simplices can be glued to form new building blocks, called bubbles which are pseudo-manifolds with boundaries. Bubbles can in turn be glued together to form triangulations. The main challenge is to classify the triangulations built from a given set of bubbles with respect to their numbers of bubbles and simplices of codimension two. While the colored triangulations which maximize the number of simplices of codimension two at fixed number of simplices are series-parallel objects called melonic triangulations, this is not always true anymore when restricting attention to colored triangulations built from specific bubbles. This opens up the possibility of new universality classes of colored triangulations. We present three existing strategies to find those universality classes. The first two strategies consist in building new bubbles from old ones for which the problem can be solved. The third strategy is a bijection between those colored triangulations and stuffed, edge-colored maps, which are some sort of hypermaps whose hyperedges are replaced with edge-colored maps. We then show that the present approach can lead to enumeration results and identification of universality classes, by working out the example of quartic tensor models. They feature a tree-like phase, a planar phase similar to two-dimensional quantum gravity and a phase transition between them which is interpreted as a proliferation of baby universes. While this work is written in the context of random tensors, it is almost exclusively of combinatorial nature and we hope it is accessible to interested readers who are not familiar with random matrices, tensors and quantum

  17. Putting a cap on causality violations in causal dynamical triangulations

    International Nuclear Information System (INIS)

    Ambjoern, Jan; Loll, Renate; Westra, Willem; Zohren, Stefan

    2007-01-01

    The formalism of causal dynamical triangulations (CDT) provides us with a non-perturbatively defined model of quantum gravity, where the sum over histories includes only causal space-time histories. Path integrals of CDT and their continuum limits have been studied in two, three and four dimensions. Here we investigate a generalization of the two-dimensional CDT model, where the causality constraint is partially lifted by introducing branching points with a weight g_s, and demonstrate that the system can be solved analytically in the genus-zero sector. The solution is analytic in a neighborhood around weight g_s = 0 and cannot be analytically continued to g_s = ∞, where the branching is entirely geometric and where one would formally recover standard Euclidean two-dimensional quantum gravity defined via dynamical triangulations or Liouville theory

  18. A Krylov Subspace Method for Unstructured Mesh SN Transport Computation

    International Nuclear Information System (INIS)

    Yoo, Han Jong; Cho, Nam Zin; Kim, Jong Woon; Hong, Ser Gi; Lee, Young Ouk

    2010-01-01

    Hong et al. have developed a computer code MUST (Multi-group Unstructured geometry S_N Transport) for neutral particle transport calculations in three-dimensional unstructured geometry. In this code, the discrete ordinates transport equation is solved using the discontinuous finite element method (DFEM) or subcell balance methods with linear discontinuous expansion. In this paper, the conventional source iteration in the MUST code is replaced by a Krylov subspace method to reduce computing time, and the numerical test results are given
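
    A minimal sketch of the general idea of replacing fixed-point source iteration with a Krylov solve: the transport fixed point phi = K·phi + q is handed to GMRES as the linear system (I − K)·phi = q. The dense matrix K below is only a stand-in for the sweep-plus-scattering operator of an actual transport code such as MUST.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Source iteration solves phi = K @ phi + q by fixed-point iteration.
# The same problem can be handed to a Krylov method as (I - K) phi = q.
rng = np.random.default_rng(0)
n = 200
K = 0.4 * rng.random((n, n)) / n      # stand-in for "sweep applied to the scattering source"
q = rng.random(n)                     # stand-in for the external source

def apply_I_minus_K(phi):
    return phi - K @ phi

A = LinearOperator((n, n), matvec=apply_I_minus_K)
phi, info = gmres(A, q)               # Krylov solve replaces the fixed-point loop
assert info == 0
print(np.linalg.norm(phi - (K @ phi + q)))   # small residual: same fixed point recovered
```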

  19. Methodological triangulation in work life research

    DEFF Research Database (Denmark)

    Warring, Niels

    Based on examples from two research projects on preschool teachers' work, the paper will discuss potentials and challenges in methodological triangulation in work life research. Analysis of ethnographic and phenomenological inspired observations of everyday life in day care centers formed the basis...... for individual interviews and informal talks with employees. The interviews and conversations were based on a critical hermeneutic approach. The analysis of observations and interviews constituted a knowledge base as the project went in to the last phase: action research workshops. In the workshops findings from...

  20. GEOPOSITIONING PRECISION ANALYSIS OF MULTIPLE IMAGE TRIANGULATION USING LRO NAC LUNAR IMAGES

    Directory of Open Access Journals (Sweden)

    K. Di

    2016-06-01

    This paper presents an empirical analysis of the geopositioning precision of multiple image triangulation using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images at the Chang’e-3 (CE-3) landing site. Nine LROC NAC images are selected for comparative analysis of geopositioning precision. Rigorous sensor models of the images are established based on collinearity equations with interior and exterior orientation elements retrieved from the corresponding SPICE kernels. Rational polynomial coefficients (RPCs) of each image are derived by least squares fitting using a vast number of virtual control points generated according to the rigorous sensor models. Experiments with different combinations of images are performed for comparison. The results demonstrate that the plane coordinates can achieve a precision of 0.54 m to 2.54 m, with a height precision of 0.71 m to 8.16 m, when only two images are used for three-dimensional triangulation. There is a general trend that the geopositioning precision, especially the height precision, improves as the convergence angle of the two images increases from several degrees to about 50°. However, the image matching precision should also be taken into consideration when choosing image pairs for triangulation. The precisions using all 9 images are 0.60 m, 0.50 m, and 1.23 m in the along-track, cross-track, and height directions, which are better than most combinations of two or more images. However, triangulation with fewer, selected images could produce better precision than using all the images.

  1. Classification and Filtering of Constrained Delaunay Triangulation for Automated Building Aggregation

    Directory of Open Access Journals (Sweden)

    GUO Peipei

    2016-08-01

    Building aggregation is an important part of research on large-scale map generalization. A triangulation-based approach is proposed from the perspective of shape features, and six measure parameters for triangles in a constrained Delaunay triangulation are defined. First, the six measure parameters are used to determine which triangles are retained and which are erased. Then, the contours of the retained triangles, as bridge areas between buildings, are automatically identified and right-angle processed. The buildings are then aggregated, with right-angle features retained, by merging the bridge areas with the connected buildings. Finally, the approach is verified on real data. Experimental results show that it is efficient and practical.
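
    The abstract does not specify the six measure parameters, so the sketch below only illustrates the filtering pattern: build a Delaunay triangulation (unconstrained here, via SciPy, whereas the paper uses a constrained triangulation over building outlines), compute per-triangle shape measures, and retain triangles passing illustrative thresholds.

```python
import numpy as np
from scipy.spatial import Delaunay

def triangle_measures(pts, tri):
    """Longest-edge length and minimum interior angle (degrees) of one triangle."""
    a, b, c = pts[tri]
    e = np.array([np.linalg.norm(b - c), np.linalg.norm(c - a), np.linalg.norm(a - b)])
    # law of cosines for the three interior angles
    cosines = [(e[1]**2 + e[2]**2 - e[0]**2) / (2 * e[1] * e[2]),
               (e[2]**2 + e[0]**2 - e[1]**2) / (2 * e[2] * e[0]),
               (e[0]**2 + e[1]**2 - e[2]**2) / (2 * e[0] * e[1])]
    return e.max(), np.degrees(np.arccos(np.clip(cosines, -1.0, 1.0))).min()

# Building corner points (hypothetical); a constrained triangulation of the
# real building outlines would be used in practice.
pts = np.random.default_rng(1).random((40, 2)) * 100.0
dt = Delaunay(pts)

retained = []
for tri in dt.simplices:
    longest, min_angle = triangle_measures(pts, tri)
    if longest < 25.0 and min_angle > 15.0:     # illustrative thresholds
        retained.append(tri)
print(len(retained), "of", len(dt.simplices), "triangles retained as bridge candidates")
```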

  2. UAV PHOTOGRAMMETRY: BLOCK TRIANGULATION COMPARISONS

    Directory of Open Access Journals (Sweden)

    R. Gini

    2013-08-01

    UAV systems represent a flexible technology able to collect a large amount of high-resolution information, both for metric and interpretation uses. In the frame of experimental tests carried out at Dept. ICA of Politecnico di Milano to validate vector-sensor systems and to assess the metric accuracy of images acquired by UAVs, a block of photos taken by a fixed-wing system is triangulated with several software packages. The test field is a rural area included in an Italian park ("Parco Adda Nord"), useful to study flight and imagery performance on buildings, roads, and cultivated and uncultivated vegetation. The UAV SenseFly, equipped with a Canon Ixus 220HS camera, flew autonomously over the area at a height of 130 m, yielding a block of 49 images divided into 5 strips. Sixteen pre-signalized Ground Control Points, surveyed in the area through GPS (NRTK survey), allowed the referencing of the block and accuracy analyses. Approximate values for exterior orientation parameters (positions and attitudes) were recorded by the flight control system. The block was processed with several software packages: Erdas-LPS, EyeDEA (Univ. of Parma), Agisoft Photoscan, and Pix4UAV, in an assisted or automatic way. Comparisons of results are given in terms of differences among digital surface models, differences in orientation parameters, and accuracies, when available. Moreover, image and ground point coordinates obtained by the various packages were independently used as initial values in a comparative adjustment made with scientific in-house software, which can apply constraints to evaluate the effectiveness of the different methods of point extraction and the accuracies on ground check points.

  3. Grey signal processing and data reconstruction in the non-diffracting beam triangulation measurement system

    Science.gov (United States)

    Meng, Hao; Wang, Zhongyu; Fu, Jihua

    2008-12-01

    The non-diffracting beam triangulation measurement system possesses the advantages of longer measurement range, higher theoretical measurement accuracy and higher resolution over the traditional laser triangulation measurement system. Unfortunately the measurement accuracy of the system is greatly degraded due to the speckle noise, the CCD photoelectric noise and the background light noise in practical applications. Hence, some effective signal processing methods must be applied to improve the measurement accuracy. In this paper a novel effective method for removing the noises in the non-diffracting beam triangulation measurement system is proposed. In the method the grey system theory is used to process and reconstruct the measurement signal. Through implementing the grey dynamic filtering based on the dynamic GM(1,1), the noises can be effectively removed from the primary measurement data and the measurement accuracy of the system can be improved as a result.
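
    A minimal sketch of the GM(1,1) building block behind the grey dynamic filtering mentioned above; the sliding-window smoothing strategy and all numbers are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def gm11_predict(x0):
    """One-step-ahead prediction from a standard GM(1,1) grey model.

    x0 : 1-D array of recent (positive) measurements.
    Returns the predicted next value of the original series.
    """
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                   # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]     # development / grey input coefficients
    x1_hat = lambda t: (x0[0] - b / a) * np.exp(-a * t) + b / a
    k = len(x0)
    return x1_hat(k) - x1_hat(k - 1)                     # back-difference to the original series

# Assumed grey dynamic filtering scheme: slide a short window over the noisy
# range signal and replace each new sample by the GM(1,1) prediction.
rng = np.random.default_rng(0)
signal = 50.0 + 0.1 * np.arange(100) + rng.normal(0, 0.5, 100)   # synthetic noisy ranges
window = 6
filtered = [gm11_predict(signal[i - window:i]) for i in range(window, len(signal))]
print(np.std(np.array(filtered) - (50.0 + 0.1 * np.arange(window, len(signal)))))
```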

  4. A software platform for continuum modeling of ion channels based on unstructured mesh

    International Nuclear Information System (INIS)

    Tu, B; Bai, S Y; Xie, Y; Zhang, L B; Lu, B Z; Chen, M X

    2014-01-01

    Most traditional continuum molecular modeling adopted finite difference or finite volume methods which were based on a structured mesh (grid). Unstructured meshes were only occasionally used, but an increasing number of applications are emerging in molecular simulations. To facilitate the continuum modeling of biomolecular systems based on unstructured meshes, we are developing a software platform with tools which are particularly beneficial to those approaches. This work describes the software system specifically for the simulation of a typical, complex molecular procedure: ion transport through a three-dimensional channel system that consists of a protein and a membrane. The platform contains three parts: a meshing tool chain for ion channel systems, a parallel finite element solver for the Poisson–Nernst–Planck equations describing the electrodiffusion process of ion transport, and a visualization program for continuum molecular modeling. The meshing tool chain in the platform, which consists of a set of mesh generation tools, is able to generate high-quality surface and volume meshes for ion channel systems. The parallel finite element solver in our platform is based on the parallel adaptive finite element package PHG, which was developed by one of the authors [1]. As a featured component of the platform, a new visualization program, VCMM, has specifically been developed for continuum molecular modeling with an emphasis on providing useful facilities for unstructured mesh-based methods and for their output analysis and visualization. VCMM provides a graphical user interface and consists of three modules: a molecular module, a meshing module and a numerical module. A demonstration of the platform is provided with a study of two real proteins, the connexin 26 and hemolysin ion channels. (paper)
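
    For reference, the steady-state Poisson–Nernst–Planck system solved by such platforms can be written in the standard textbook form below (this formulation is not quoted from the paper):

```latex
% Standard steady-state Poisson--Nernst--Planck system (textbook form, for reference)
\begin{aligned}
  -\nabla\cdot\big(\epsilon(\mathbf{r})\,\nabla\phi\big) &= \rho_{\mathrm{fixed}} + \textstyle\sum_i q_i c_i, \\
  \nabla\cdot\mathbf{J}_i &= 0, \qquad
  \mathbf{J}_i = -D_i\Big(\nabla c_i + \frac{q_i}{k_B T}\, c_i\, \nabla\phi\Big),
\end{aligned}
```

    where φ is the electrostatic potential and c_i, q_i and D_i are the concentration, charge and diffusion coefficient of ion species i.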

  5. Comparative reliability of structured versus unstructured interviews in the admission process of a residency program.

    Science.gov (United States)

    Blouin, Danielle; Day, Andrew G; Pavlov, Andrey

    2011-12-01

    Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains

  6. Bracketing as a skill in conducting unstructured qualitative interviews.

    Science.gov (United States)

    Sorsa, Minna Anneli; Kiikkala, Irma; Åstedt-Kurki, Päivi

    2015-03-01

    To provide an overview of bracketing as a skill in unstructured qualitative research interviews. Researchers affect the qualitative research process. Bracketing in descriptive phenomenology entails researchers setting aside their pre-understanding and acting non-judgementally. In interpretative phenomenology, previous knowledge is used intentionally to create new understanding. A literature search of bracketing in phenomenology and qualitative research. This is a methodology paper examining the researchers' impact on creating data in qualitative research. Self-knowledge, sensitivity and reflexivity of the researcher enable bracketing. Skilled and experienced researchers are needed to use bracketing in unstructured qualitative research interviews. Bracketing adds scientific rigour and validity to any qualitative study.

  7. Source parameters for the 1952 Kern County earthquake, California: A joint inversion of leveling and triangulation observations

    OpenAIRE

    Bawden, Gerald W.

    2001-01-01

    Coseismic leveling and triangulation observations are used to determine the faulting geometry and slip distribution of the July 21, 1952, Mw 7.3 Kern County earthquake on the White Wolf fault. A singular value decomposition inversion is used to assess the ability of the geodetic network to resolve slip along a multisegment fault and shows that the network is sufficient to resolve slip along the surface rupture to a depth of 10 km. Below 10 km, the network can only resolve dip slip near the fa...

  8. All roads lead to Rome - New search methods for the optimal triangulation problem

    Czech Academy of Sciences Publication Activity Database

    Ottosen, T. J.; Vomlel, Jiří

    2012-01-01

    Roč. 53, č. 9 (2012), s. 1350-1366 ISSN 0888-613X R&D Projects: GA MŠk 1M0572; GA ČR GEICC/08/E010; GA ČR GA201/09/1891 Grant - others:GA MŠk(CZ) 2C06019 Institutional support: RVO:67985556 Keywords : Bayesian networks * Optimal triangulation * Probabilistic inference * Cliques in a graph Subject RIV: BD - Theory of Information Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/vomlel-all roads lead to rome - new search methods for the optimal triangulation problem.pdf

  9. Warehousing Structured and Unstructured Data for Data Mining.

    Science.gov (United States)

    Miller, L. L.; Honavar, Vasant; Barta, Tom

    1997-01-01

    Describes an extensible object-oriented view system that supports the integration of both structured and unstructured data sources in either the multidatabase or data warehouse environment. Discusses related work and data mining issues. (AEF)

  10. MCR2S unstructured mesh capabilities for use in shutdown dose rate analysis

    International Nuclear Information System (INIS)

    Eade, T.; Stonell, D.; Turner, A.

    2015-01-01

    Highlights: • Advancements in shutdown dose rate calculations will be needed as fusion moves from experimental reactors to full scale demonstration reactors in order to ensure the safety of personnel. • The MCR2S shutdown dose rate tool has been modified to allow shutdown dose rate calculations using an unstructured mesh. • The unstructured mesh capability of MCR2S was used on three shutdown dose rate models: a simple sphere, the ITER computational benchmark, and the DEMO computational benchmark. • The results showed a reasonable agreement between an unstructured mesh approach and the CSG approach and highlighted the need to carefully choose the unstructured mesh resolution. - Abstract: As nuclear fusion progresses towards a sustainable energy source and the power of tokamak devices increases, a greater understanding of the radiation fields will be required. As well as on-load radiation fields, off-load or shutdown radiation fields are an important consideration for the safety and economic viability of a commercial fusion reactor. Previously, codes such as MCR2S have been written to predict the shutdown dose rates within, and in regions surrounding, a fusion reactor. MCR2S utilises a constructive solid geometry (CSG) model and a superimposed structured mesh to calculate 3-D maps of the shutdown dose rate. A new approach to MCR2S calculations is proposed and implemented using a single unstructured mesh to replace both the CSG model and the superimposed structured mesh. This new MCR2S approach has been demonstrated on three models of increasing complexity. These models were: a sphere, the ITER computational shutdown dose rate benchmark and the DEMO computational shutdown dose rate benchmark. In each case the results were compared to calculations performed using MCR2S with CSG geometry and a superimposed structured mesh. It was concluded that the results from the unstructured mesh implementation of MCR2S compared well to the CSG structured mesh

  11. Radiation Coupling with the FUN3D Unstructured-Grid CFD Code

    Science.gov (United States)

    Wood, William A.

    2012-01-01

    The HARA radiation code is fully-coupled to the FUN3D unstructured-grid CFD code for the purpose of simulating high-energy hypersonic flows. The radiation energy source terms and surface heat transfer, under the tangent slab approximation, are included within the fluid dynamic flow solver. The Fire II flight test, at the Mach-31, 1643-second trajectory point, is used as a demonstration case. Comparisons are made with an existing structured-grid capability, the LAURA/HARA coupling. The radiative surface heat transfer rates from the present approach match the benchmark values within 6%. Although radiation coupling is the focus of the present work, convective surface heat transfer rates are also reported, and are seen to vary depending upon the choice of mesh connectivity and FUN3D flux reconstruction algorithm. On a tetrahedral-element mesh the convective heating matches the benchmark at the stagnation point, but under-predicts by 15% on the Fire II shoulder. Conversely, on a mixed-element mesh the convective heating over-predicts at the stagnation point by 20%, but matches the benchmark away from the stagnation region.

  12. Quantum Computing in Decoherence-Free Subspace Constructed by Triangulation

    OpenAIRE

    Bi, Qiao; Guo, Liu; Ruda, H. E.

    2010-01-01

    A formalism for quantum computing in decoherence-free subspaces is presented. The constructed subspaces are partially triangulated to an index related to the environment. The quantum states in the subspaces are just projected states which are ruled by a subdynamic kinetic equation. These projected states can be used to perform ideal quantum logical operations without decoherence.

  13. A Parallel Multiblock Structured Grid Method with Automated Interblocked Unstructured Grids for Chemically Reacting Flows

    Science.gov (United States)

    Spiegel, Seth Christian

    An automated method for using unstructured grids to patch non-C0 interfaces between structured blocks has been developed in conjunction with a finite-volume method for solving chemically reacting flows on unstructured grids. Although the standalone unstructured solver, FVFLO-NCSU, is capable of resolving flows for high-speed aeropropulsion devices with complex geometries, unstructured-mesh algorithms are inherently inefficient when compared to their structured counterparts. However, the advantages of structured algorithms in developing a flow solution in a timely manner can be negated by the amount of time required to develop a mesh for complex geometries. The global domain can be split up into numerous smaller blocks during the grid-generation process to alleviate some of the difficulties in creating these complex meshes. An even greater abatement can be found by allowing the nodes on abutting block interfaces to be nonmatching or non-C0 continuous. One code capable of solving chemically reacting flows on these multiblock grids is VULCAN, which uses a nonconservative approach for patching non-C0 block interfaces. The developed automated unstructured-grid patching algorithm has been installed within VULCAN to provide it the capability of a fully conservative approach for patching non-C0 block interfaces. Additionally, the FVFLO-NCSU solver algorithms have been deeply intertwined with the VULCAN source code to solve chemically reacting flows on these unstructured patches. Finally, the CGNS software library was added to the VULCAN postprocessor so structured and unstructured data can be stored in a single compact file. This final upgrade to VULCAN has been successfully installed and verified using test cases with particular interest towards those involving grids with non-C0 block interfaces.

  14. Quantum Computing in Decoherence-Free Subspace Constructed by Triangulation

    Directory of Open Access Journals (Sweden)

    Qiao Bi

    2010-01-01

    A formalism for quantum computing in decoherence-free subspaces is presented. The constructed subspaces are partially triangulated to an index related to the environment. The quantum states in the subspaces are just projected states which are ruled by a subdynamic kinetic equation. These projected states can be used to perform ideal quantum logical operations without decoherence.

  15. Marginal elasticity of periodic triangulated origami

    Science.gov (United States)

    Chen, Bryan; Sussman, Dan; Lubensky, Tom; Santangelo, Chris

    Origami, the classical art of folding paper, has inspired much recent work on assembling complex 3D structures from planar sheets. Origami, and more generally hinged structures with rigid panels in which all faces are triangles, have special properties due to having a bulk balance of mechanical degrees of freedom and constraints. We study two families of periodic triangulated origami structures, one based on the Miura ori and one based on a kagome-like pattern due to Ron Resch. We point out the consequences of the balance of degrees of freedom and constraints for these "metamaterial plates" and show how the elasticity can be tuned by changing the unit cell geometry.

  16. Spectral triangulation: a 3D method for locating single-walled carbon nanotubes in vivo

    Science.gov (United States)

    Lin, Ching-Wei; Bachilo, Sergei M.; Vu, Michael; Beckingham, Kathleen M.; Bruce Weisman, R.

    2016-05-01

    Nanomaterials with luminescence in the short-wave infrared (SWIR) region are of special interest for biological research and medical diagnostics because of favorable tissue transparency and low autofluorescence backgrounds in that region. Single-walled carbon nanotubes (SWCNTs) show well-known sharp SWIR spectral signatures and therefore have potential for noninvasive detection and imaging of cancer tumours, when linked to selective targeting agents such as antibodies. However, such applications face the challenge of sensitively detecting and localizing the source of SWIR emission from inside tissues. A new method, called spectral triangulation, is presented for three dimensional (3D) localization using sparse optical measurements made at the specimen surface. Structurally unsorted SWCNT samples emitting over a range of wavelengths are excited inside tissue phantoms by an LED matrix. The resulting SWIR emission is sampled at points on the surface by a scanning fibre optic probe leading to an InGaAs spectrometer or a spectrally filtered InGaAs avalanche photodiode detector. Because of water absorption, attenuation of the SWCNT fluorescence in tissues is strongly wavelength-dependent. We therefore gauge the SWCNT-probe distance by analysing differential changes in the measured SWCNT emission spectra. SWCNT fluorescence can be clearly detected through at least 20 mm of tissue phantom, and the 3D locations of embedded SWCNT test samples are found with sub-millimeter accuracy at depths up to 10 mm. Our method can also distinguish and locate two embedded SWCNT sources at distinct positions.

  17. The Status Quo of Ontology Learning from Unstructured Knowledge Sources for Knowledge Management

    OpenAIRE

    Scheuermann , Andreas; Obermann , Jens

    2012-01-01

    In the global race for competitive advantage, Knowledge Management gains increasing importance for companies. The purposeful and systematic creation, maintenance, and transfer of unstructured knowledge sources demands advanced Information Technology. Ontologies constitute a basic ingredient of Knowledge Management; thus, ontology learning from unstructured knowledge sources is of particular interest since it bears the potential to bring significant advantages for Kn...

  18. Computational Complexity of Combinatorial Surfaces

    NARCIS (Netherlands)

    Vegter, Gert; Yap, Chee K.

    1990-01-01

    We investigate the computational problems associated with combinatorial surfaces. Specifically, we present an algorithm (based on the Brahana-Dehn-Heegaard approach) for transforming the polygonal schema of a closed triangulated surface into its canonical form in O(n log n) time, where n is the

  19. Chromatic polynomials of planar triangulations, the Tutte upper bound and chromatic zeros

    International Nuclear Information System (INIS)

    Shrock, Robert; Xu Yan

    2012-01-01

    Tutte proved that if G_pt is a planar triangulation and P(G_pt, q) is its chromatic polynomial, then |P(G_pt, τ + 1)| ⩽ (τ − 1)^(n−5), where τ = (1+√5)/2 and n is the number of vertices in G_pt. Here we study the ratio r(G_pt) = |P(G_pt, τ + 1)|/(τ − 1)^(n−5) for a variety of planar triangulations. We construct infinite recursive families of planar triangulations G_pt,m depending on a parameter m linearly related to n and show that if P(G_pt,m, q) only involves a single power of a polynomial, then r(G_pt,m) approaches zero exponentially fast as n → ∞. We also construct infinite recursive families for which P(G_pt,m, q) is a sum of powers of certain functions and show that for these, r(G_pt,m) may approach a finite nonzero constant as n → ∞. The connection between the Tutte upper bound and the observed chromatic zero(s) near to τ + 1 is investigated. We report the first known graph for which the zero(s) closest to τ + 1 is not real, but instead is a complex-conjugate pair. Finally, we discuss connections with the nonzero ground-state entropy of the Potts antiferromagnet on these families of graphs. (paper)

  20. Quantum gravity from simplices: analytical investigations of causal dynamical triangulations

    NARCIS (Netherlands)

    Benedetti, D.

    2007-01-01

    A potentially powerful approach to quantum gravity has been developed over the last few years under the name of Causal Dynamical Triangulations. Although these models can be solved exactly in a variety of ways in the case of pure gravity in (1+1) dimensions,it is difficult to extend any of the

  1. Domain decomposition multigrid for unstructured grids

    Energy Technology Data Exchange (ETDEWEB)

    Shapira, Yair

    1997-01-01

    A two-level preconditioning method for the solution of elliptic boundary value problems using finite element schemes on possibly unstructured meshes is introduced. It is based on a domain decomposition and a Galerkin scheme for the coarse level vertex unknowns. For both the implementation and the analysis, it is not required that the curves of discontinuity in the coefficients of the PDE match the interfaces between subdomains. Generalizations to nonmatching or overlapping grids are made.

  2. Investigation of point triangulation methods for optimality and performance in Structure from Motion systems

    DEFF Research Database (Denmark)

    Structure from Motion (SFM) systems are composed of cameras and structure in the form of 3D points and other features. Most often, the structure components outnumber the cameras by a great margin; it is not uncommon to have a configuration with 3 cameras observing more than 500 3D points...... an overview of existing triangulation methods with emphasis on performance versus optimality, and will suggest a fast triangulation algorithm based on linear constraints. The structure and camera motion estimation in an SFM system is based on the minimization of some norm of the reprojection error between...
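
    A minimal sketch of the classical linear (DLT) triangulation that such methods build on: each observation contributes two linear constraints on the homogeneous 3D point, which is recovered from the smallest singular vector. The camera matrices and observations below are hypothetical, and the paper's own linear-constraint algorithm may differ.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices.
    x1, x2 : (u, v) pixel observations of the same point in each view.
    Returns the 3D point minimizing the algebraic (not reprojection) error.
    """
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical test: two cameras looking at a known point.
K = np.diag([1000.0, 1000.0, 1.0])                                  # simple intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                   # camera at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])   # 0.5 m baseline
X_true = np.array([0.2, -0.1, 5.0, 1.0])
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
print(triangulate_dlt(P1, P2, x1, x2))                               # ~ [0.2, -0.1, 5.0]
```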

  3. Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.

    Science.gov (United States)

    Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H

    2015-09-01

    This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design with qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase using grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (of 5). The TCI was associated with all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team.

  4. Triangulation and Mixed Methods Designs: Data Integration with New Research Technologies

    Science.gov (United States)

    Fielding, Nigel G.

    2012-01-01

    Data integration is a crucial element in mixed methods analysis and conceptualization. It has three principal purposes: illustration, convergent validation (triangulation), and the development of analytic density or "richness." This article discusses such applications in relation to new technologies for social research, looking at three…

  5. Diffusion on unstructured triangular grids using Lattice Boltzmann

    NARCIS (Netherlands)

    Sman, van der R.G.M.

    2004-01-01

    In this paper, we present a Lattice Boltzmann scheme for diffusion on unstructured triangular grids. In this formulation there is no need for interpolation, as is required in other LB schemes on irregular grids. At the end of the propagation step, the lattice gas particles arrive exactly at

  6. Determination of Shift/Bias in Digital Aerial Triangulation of UAV Imagery Sequences

    Science.gov (United States)

    Wierzbicki, Damian

    2017-12-01

    UAV photogrammetry is currently characterized by largely automated and efficient data processing. Imaging from low altitude is gaining importance in applications such as city mapping, corridor mapping, road and pipeline inspection, and mapping of large areas, e.g. forests. Additionally, high-resolution video imagery (HD and larger) is increasingly used for low-altitude acquisition: on the one hand it delivers many details and characteristics of ground surface features, and on the other hand it presents new challenges in data processing. Therefore, the determination of the exterior orientation elements plays a substantial role in the detail of Digital Terrain Models and in artefact-free orthophoto generation. In parallel, research is conducted on the quality of images acquired from UAVs and on the quality of products such as orthophotos. Despite the fast development of UAV photogrammetry, Automatic Aerial Triangulation (AAT) based on GPS/INS observations and ground control points is still necessary. During a low-altitude photogrammetric flight, the approximate elements of exterior orientation registered by the UAV are burdened by shift/bias errors. In this article, methods for determining the shift/bias error are presented. In the digital aerial triangulation process two solutions are applied. In the first method, the shift/bias error was determined together with the drift/bias error, the exterior orientation elements and the coordinates of the ground control points. In the second method, the shift/bias error was determined together with the exterior orientation elements and the coordinates of the ground control points, with the drift/bias error set equal to 0. When the two methods were compared, the difference in the shift/bias error was more than ±0.01 m for all terrain coordinates XYZ.

  7. Numerical convergence of discrete exterior calculus on arbitrary surface meshes

    KAUST Repository

    Mohamed, Mamdouh S.

    2018-02-13

    Discrete exterior calculus (DEC) is a structure-preserving numerical framework for partial differential equations solution, particularly suitable for simplicial meshes. A longstanding and widespread assumption has been that DEC requires special (Delaunay) triangulations, which complicated the mesh generation process especially for curved surfaces. This paper presents numerical evidence demonstrating that this restriction is unnecessary. Convergence experiments are carried out for various physical problems using both Delaunay and non-Delaunay triangulations. Signed diagonal definition for the key DEC operator (Hodge star) is adopted. The errors converge as expected for all considered meshes and experiments. This relieves the DEC paradigm from unnecessary triangulation limitation.
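
    For context, a common diagonal Hodge star for primal 1-forms on a triangle mesh is the cotangent (circumcentric-dual) construction sketched below; on non-Delaunay meshes individual entries can become negative, which is where a signed definition such as the one adopted in the paper matters. This is a generic sketch under those standard assumptions, not the paper's code.

```python
import numpy as np

def hodge_star_1(vertices, faces):
    """Diagonal Hodge star entries for primal 1-forms on a triangle mesh.

    Uses the standard cotangent formula: the entry for edge (i, j) is
    (cot(alpha) + cot(beta)) / 2, where alpha and beta are the angles
    opposite the edge in its one or two incident triangles.
    """
    star = {}
    for tri in faces:
        for k in range(3):
            i, j, o = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]   # edge (i, j), opposite vertex o
            u, v = vertices[i] - vertices[o], vertices[j] - vertices[o]
            cot = np.dot(u, v) / np.linalg.norm(np.cross(u, v))
            key = tuple(sorted((i, j)))
            star[key] = star.get(key, 0.0) + 0.5 * cot
    return star

# Two triangles sharing an edge (hypothetical planar mesh).
V = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
F = [(0, 1, 2), (0, 2, 3)]
print(hodge_star_1(V, F))   # shared edge (0, 2) receives contributions from both triangles
```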

  8. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming; Wonka, Peter

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which

  9. A survey of simultaneous localization and mapping on unstructured lunar complex environment

    Science.gov (United States)

    Wang, Yiqiao; Zhang, Wei; An, Pei

    2017-10-01

    Simultaneous localization and mapping (SLAM) technology is the key to realizing a lunar rover's intelligent perception and autonomous navigation. It embodies the autonomy of mobile robots and has attracted considerable attention from researchers over the past thirty years. Visual sensors are valuable to SLAM research because they can provide a wealth of information. Visual SLAM uses only images as external information to estimate the location of the robot and construct the environment map. SLAM technology still has problems when applied in large-scale, unstructured and complex environments. Based on the latest technology in the field of visual SLAM, this paper investigates and summarizes SLAM technology for use in the unstructured, complex environment of the lunar surface. In particular, we focus on summarizing and comparing feature detection and matching with SIFT, SURF and ORB, while discussing their advantages and disadvantages. We analyze the three main methods: SLAM based on the Extended Kalman Filter, SLAM based on the Particle Filter, and SLAM based on Graph Optimization (EKF-SLAM, PF-SLAM and Graph-based SLAM). Finally, this article summarizes and discusses the key scientific and technical difficulties that visual SLAM faces in the lunar context. We also explore frontier issues such as multi-sensor fusion SLAM and multi-robot cooperative SLAM, predict the development trend of lunar rover SLAM technology, and put forward some ideas for further research.
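
    ORB detection and matching, one of the feature pipelines compared in the survey, can be sketched with OpenCV as below; the image file names and parameter values are placeholders.

```python
import cv2

# Detect and match ORB features between two consecutive rover images
# (file names are placeholders).
img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Hamming distance is appropriate for ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(len(matches), "matches; best distance:", matches[0].distance)
```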

  10. Interferometer predictions with triangulated images

    DEFF Research Database (Denmark)

    Brinch, Christian; Dullemond, C. P.

    2014-01-01

    the synthetic model images. To get the correct values of these integrals, the model images must have the right size and resolution. Insufficient care in these choices can lead to wrong results. We present a new general-purpose scheme for the computation of visibilities of radiative transfer images. Our method...... requires a model image that is a list of intensities at arbitrarily placed positions on the image-plane. It creates a triangulated grid from these vertices, and assumes that the intensity inside each triangle of the grid is a linear function. The Fourier integral over each triangle is then evaluated...... with an analytic expression and the complex visibility of the entire image is then the sum of all triangles. The result is a robust Fourier transform that does not suffer from aliasing effects due to grid regularities. The method automatically ensures that all structure contained in the model gets reflected...

  11. Interprofessional collaboration from nurses and physicians – A triangulation of quantitative and qualitative data

    Science.gov (United States)

    Schärli, Marianne; Müller, Rita; Martin, Jacqueline S; Spichiger, Elisabeth; Spirig, Rebecca

    2017-01-01

    Background: Interprofessional collaboration between nurses and physicians is a recurrent challenge in daily clinical practice. To ameliorate the situation, quantitative or qualitative studies are conducted. However, the results of these studies have often been limited by the methods chosen. Aim: To describe the synthesis of interprofessional collaboration from the nursing perspective by triangulating quantitative and qualitative data. Method: Data triangulation was performed as a sub-project of the interprofessional Sinergia DRG Research program. Initially, quantitative and qualitative data were analyzed separately in a mixed methods design. By means of triangulation a „meta-matrix“ resulted in a four-step process. Results: The „meta-matrix“ displays all relevant quantitative and qualitative results as well as their interrelations on one page. Relevance, influencing factors as well as consequences of interprofessional collaboration for patients, relatives and systems become visible. Conclusion: For the first time, the interprofessional collaboration from the nursing perspective at five Swiss hospitals is shown in a „meta-matrix“. The consequences of insufficient collaboration between nurses and physicians are considerable. This is why it’s necessary to invest in interprofessional concepts. In the „meta-matrix“ the factors which influence the interprofessional collaboration positively or negatively are visible.

  12. Alternative model of random surfaces

    International Nuclear Information System (INIS)

    Ambartzumian, R.V.; Sukiasian, G.S.; Savvidy, G.K.; Savvidy, K.G.

    1992-01-01

    We analyse models of triangulated random surfaces and demand that geometrically nearby configurations of these surfaces must have close actions. The inclusion of this principle drives us to suggest a new action, which is a modified Steiner functional. General arguments, based on the Minkowski inequality, show that the maximal contribution to the partition function comes from surfaces close to the sphere. (orig.)

  13. Parallel Sn Sweeps on Unstructured Grids: Algorithms for Prioritization, Grid Partitioning, and Cycle Detection

    International Nuclear Information System (INIS)

    Plimpton, Steven J.; Hendrickson, Bruce; Burns, Shawn P.; McLendon, William III; Rauchwerger, Lawrence

    2005-01-01

    The method of discrete ordinates is commonly used to solve the Boltzmann transport equation. The solution in each ordinate direction is most efficiently computed by sweeping the radiation flux across the computational grid. For unstructured grids this poses many challenges, particularly when implemented on distributed-memory parallel machines where the grid geometry is spread across processors. We present several algorithms relevant to this approach: (a) an asynchronous message-passing algorithm that performs sweeps simultaneously in multiple ordinate directions, (b) a simple geometric heuristic to prioritize the computational tasks that a processor works on, (c) a partitioning algorithm that creates columnar-style decompositions for unstructured grids, and (d) an algorithm for detecting and eliminating cycles that sometimes exist in unstructured grids and can prevent sweeps from successfully completing. Algorithms (a) and (d) are fully parallel; algorithms (b) and (c) can be used in conjunction with (a) to achieve higher parallel efficiencies. We describe our message-passing implementations of these algorithms within a radiation transport package. Performance and scalability results are given for unstructured grids with up to 3 million elements (500 million unknowns) running on thousands of processors of Sandia National Laboratories' Intel Tflops machine and DEC-Alpha CPlant cluster
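
    The cycle-detection step can be illustrated with a toy directed sweep-dependence graph: cells involved in a cycle show up as strongly connected components of size greater than one. The graph, the edge-breaking choice, and the use of networkx here are illustrative assumptions, not the paper's parallel algorithm.

```python
import networkx as nx

# Directed "sweep dependence" graph: an edge u -> v means cell u must be
# solved before cell v for a given ordinate direction (toy example with a cycle).
G = nx.DiGraph([(0, 1), (1, 2), (2, 3), (3, 1), (2, 4)])

# Cells involved in cycles appear as strongly connected components of size > 1.
for scc in list(nx.strongly_connected_components(G)):
    if len(scc) > 1:
        print("cycle detected among cells:", sorted(scc))
        # Break the cycle by removing one dependence edge inside the component
        # (a real solver would pick the edge with the weakest coupling).
        u, v = next((u, v) for u, v in G.edges() if u in scc and v in scc)
        G.remove_edge(u, v)

print("acyclic:", nx.is_directed_acyclic_graph(G))
```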

  14. Mesh Generation via Local Bisection Refinement of Triangulated Grids

    Science.gov (United States)

    2015-06-01

    Defence Science and Technology Organisation report DSTO–TR–3095. This report provides a comprehensive implementation of an unstructured mesh generation method... their behaviour is critically linked to Maubach’s method and the data structures N and T. The top-level mesh refinement algorithm is also presented

  15. Unstructured Socializing with Peers and Delinquent Behavior: A Genetically Informed Analysis.

    Science.gov (United States)

    Meldrum, Ryan C; Barnes, J C

    2017-09-01

    A large body of research finds that unstructured socializing with peers is positively associated with delinquency during adolescence. Yet, existing research has not ruled out the potential for confounding due to genetic factors and factors that can be traced to environments shared between siblings. To fill this void, the current study examines whether the association between unstructured socializing with peers and delinquent behavior remains when accounting for genetic factors, shared environmental influences, and a variety of non-shared environmental covariates. We do so by using data from the twin subsample of the National Longitudinal Study of Adolescent to Adult Health (n = 1200 at wave 1 and 1103 at wave 2; 51% male; mean age at wave 1 = 15.63). Results from both cross-sectional and lagged models indicate the association between unstructured socializing with peers and delinquent behavior remains when controlling for both genetic and environmental influences. Supplementary analyses examining the association under different specifications offer additional, albeit qualified, evidence supportive of this finding. The study concludes with a discussion highlighting the importance of limiting free time with friends in the absence of authority figures as a strategy for reducing delinquency during adolescence.

  16. From Pore Scale to Turbulent Flow with the Unstructured Lattice Boltzmann Method

    DEFF Research Database (Denmark)

    Matin, Rastin

    Abstract: The lattice Boltzmann method is a class of methods in computational fluid dynamics for simulating fluid flow. Implementations on unstructured grids are particularly relevant for various engineering applications, where geometric flexibility or high resolution near a body or a wall...... is required. The main topic of this thesis is to further develop unstructured lattice Boltzmann methods for simulations of Newtonian fluid flow in three dimensions, in particular porous flow. Two methods are considered in this thesis based on the finite volume method and finite element method, respectively...

  17. (2+1)-dimensional quantum gravity as the continuum limit of causal dynamical triangulations

    International Nuclear Information System (INIS)

    Benedetti, D.; Loll, R.; Zamponi, F.

    2007-01-01

    We perform a nonperturbative sum over geometries in a (2+1)-dimensional quantum gravity model given in terms of causal dynamical triangulations. Inspired by the concept of triangulations of product type introduced previously, we impose an additional notion of order on the discrete, causal geometries. This simplifies the combinatorial problem of counting geometries just enough to enable us to calculate the transfer matrix between boundary states labeled by the area of the spatial universe, as well as the corresponding quantum Hamiltonian of the continuum theory. This is the first time in dimension larger than 2 that a Hamiltonian has been derived from such a model by mainly analytical means, and it opens the way for a better understanding of scaling and renormalization issues

  18. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  19. Three-Dimensional Reconstruction Optical System Using Shadows Triangulation

    Science.gov (United States)

    Barba, J. Leiner; Vargas, Q. Lorena; Torres, M. Cesar; Mattos, V. Lorenzo

    2008-04-01

    In this work a three-dimensional reconstruction system is developed using the Shades3D tool of the Matlab® programming language and low-cost materials: a webcam, a stick, a weak structured-lighting system consisting of a desk lamp, and an observation plane on which the object is located. The reconstruction is obtained through a triangulation process executed after acquiring a sequence of images of the scene with a shadow projected on the object; additionally, an image filtering process is applied to retain only the part of the scene that will be reconstructed. Beforehand, a calibration process is needed to determine the internal geometric and optical characteristics of the camera (intrinsic parameters) and the 3D position and orientation of the camera frame relative to a certain world coordinate system (extrinsic parameters). The lamp and the stick are used to produce a shadow which scans the object; in this technique it is not necessary to know the position of the light source, since the triangulation uses the shadow plane produced by the intersection between the stick and the illumination pattern. The webcam captures all images while the shadow scans the object, and the Shades3D tool processes this information using the captured images and the calibration parameters. The technique is evaluated on the reconstruction of parts of the human body and its application in the detection of external abnormalities and in the design of prostheses or implants.
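
    The core geometric step of shadow-based triangulation is intersecting the back-projected camera ray with the estimated shadow plane; a minimal sketch with placeholder calibration values follows (the actual Shades3D processing chain is more involved).

```python
import numpy as np

def intersect_ray_plane(ray_dir, plane_point, plane_normal):
    """3D point where a camera ray (through the origin) meets the shadow plane."""
    t = np.dot(plane_normal, plane_point) / np.dot(plane_normal, ray_dir)
    return t * ray_dir

# Hypothetical calibration: camera at the origin with focal length f (pixels).
f = 800.0
u, v = 40.0, -25.0                      # pixel where the shadow edge crosses the object
ray = np.array([u / f, v / f, 1.0])     # back-projected viewing ray

# Shadow plane estimated from the stick and the lamp (placeholder values).
plane_point = np.array([0.0, 0.0, 1.5])
plane_normal = np.array([0.7, 0.0, -0.3])

print(intersect_ray_plane(ray, plane_point, plane_normal))   # reconstructed 3D point
```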

  20. Mesh Adaptation and Shape Optimization on Unstructured Meshes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR CRM proposes to implement the entropy adjoint method for solution adaptive mesh refinement into the Loci/CHEM unstructured flow solver. The scheme will...

  1. The Application of a Multiphase Triangulation Approach to Mixed Methods: The Research of an Aspiring School Principal Development Program

    Science.gov (United States)

    Youngs, Howard; Piggot-Irvine, Eileen

    2012-01-01

    Mixed methods research has emerged as a credible alternative to unitary research approaches. The authors show how a combination of a triangulation convergence model with a triangulation multilevel model was used to research an aspiring school principal development pilot program. The multilevel model is used to show the national and regional levels…

  2. Wave Resource Characterization Using an Unstructured Grid Modeling Approach

    Directory of Open Access Journals (Sweden)

    Wei-Cheng Wu

    2018-03-01

    This paper presents a modeling study conducted on the central Oregon coast for wave resource characterization, using the unstructured grid Simulating WAve Nearshore (SWAN) model coupled with a nested grid WAVEWATCH III® (WWIII) model. The flexibility of models with various spatial resolutions and the effects of open boundary conditions simulated by a nested grid WWIII model with different physics packages were evaluated. The model results demonstrate the advantage of the unstructured grid-modeling approach for flexible model resolution and good model skill in simulating the six wave resource parameters recommended by the International Electrotechnical Commission, in comparison to the data observed in 2009 at National Data Buoy Center Buoy 46050. Notably, spectral analysis indicates that the ST4 physics package improves upon the ST2 physics package’s ability to predict wave power density for large waves, which is important for wave resource assessment, load calculation of devices, and risk management. In addition, bivariate distributions show that the simulated sea state of maximum occurrence with the ST4 physics package matched the observed data better than with the ST2 physics package. This study demonstrated that the unstructured grid wave modeling approach, driven by regional nested grid WWIII outputs along with the ST4 physics package, can efficiently provide accurate wave hindcasts to support wave resource characterization. Our study also suggests that wind effects need to be considered if the dimension of the model domain is greater than approximately 100 km, or O(10^2 km).
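
    One of the IEC wave resource parameters mentioned above is the omnidirectional wave power density; under a deep-water assumption it can be estimated from a 1-D frequency spectrum as J = ρ g ∫ c_g(f) S(f) df with c_g = g/(4πf). The sketch below uses a synthetic spectrum; real applications would use SWAN output or buoy spectra.

```python
import numpy as np

def trapz(y, x):
    """Simple trapezoidal integration (avoids NumPy-version-specific names)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

rho, g = 1025.0, 9.81                          # sea-water density (kg/m^3), gravity (m/s^2)

# Placeholder 1-D variance density spectrum S(f) in m^2/Hz.
f = np.linspace(0.05, 0.5, 46)                 # frequencies, Hz
S = 5.0 * np.exp(-((f - 0.1) / 0.03) ** 2)     # synthetic single-peaked spectrum

cg = g / (4.0 * np.pi * f)                     # deep-water group velocity, m/s
J = rho * g * trapz(cg * S, f)                 # omnidirectional wave power density, W/m
Hs = 4.0 * np.sqrt(trapz(S, f))                # significant wave height from the zeroth moment

print(f"Hs = {Hs:.2f} m, J = {J / 1000:.1f} kW/m")
```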

  3. Organized and Unstructured Activity Participation among Adolescents Involved with Child Protective Services in the United States

    Science.gov (United States)

    Kwak, Yoonyoung; Lu, Ting; Christ, Sharon L.

    2017-01-01

    Background: Many adolescents are referred to Child Protective Services for possible maltreatment every year, but not much is known about their organized and unstructured activity participation. Objective: The purposes of this study are to provide a description of organized and unstructured activity participation for adolescents who were possible…

  4. Implicit Unstructured Aerodynamics on Emerging Multi- and Many-Core HPC Architectures

    KAUST Repository

    Al Farhan, Mohammed A.

    2017-03-13

    Shared memory parallelization of PETSc-FUN3D, an unstructured tetrahedral mesh Euler code previously characterized for distributed memory Single Program, Multiple Data (SPMD) for thousands of nodes, is hybridized with shared memory Single Instruction, Multiple Data (SIMD) for hundreds of threads per node. We explore thread-level performance optimizations on state-of-the-art multi- and many-core Intel processors, including the second generation of Xeon Phi, Knights Landing (KNL). We study the performance on the KNL with different configurations of memory and cluster modes, with code optimizations to minimize indirect addressing and enhance the cache locality. The optimizations employed are expected to be of value to other unstructured applications as many-core architectures evolve.

  5. A hybrid Boundary Element Unstructured Transmission-line (BEUT) method for accurate 2D electromagnetic simulation

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, Daniel, E-mail: daniel.simmons@nottingham.ac.uk; Cools, Kristof; Sewell, Phillip

    2016-11-01

    Time domain electromagnetic simulation tools have the ability to model transient, wide-band applications, and non-linear problems. The Boundary Element Method (BEM) and the Transmission Line Modeling (TLM) method are both well established numerical techniques for simulating time-varying electromagnetic fields. The former surface based method can accurately describe outwardly radiating fields from piecewise uniform objects and efficiently deals with large domains filled with homogeneous media. The latter volume based method can describe inhomogeneous and non-linear media and has been proven to be unconditionally stable. Furthermore, the Unstructured TLM (UTLM) enables modelling of geometrically complex objects by using triangular meshes which removes staircasing and unnecessary extensions of the simulation domain. The hybridization of BEM and UTLM which is described in this paper is named the Boundary Element Unstructured Transmission-line (BEUT) method. It incorporates the advantages of both methods. The theory and derivation of the 2D BEUT method is described in this paper, along with any relevant implementation details. The method is corroborated by studying its correctness and efficiency compared to the traditional UTLM method when applied to complex problems such as the transmission through a system of Luneburg lenses and the modelling of antenna radomes for use in wireless communications.

  6. Restrictions on Measurement of Roughness of Textile Fabrics by Laser Triangulation: A Phenomenological Approach

    International Nuclear Information System (INIS)

    Berberi, Pellumb; Tabaku, Burhan

    2010-01-01

    The laser triangulation method is one of the methods used for contactless measurement of the roughness of textile fabrics. The method is based on measuring the distance between the sensor and the object by imaging the light scattered from the surface. However, experimental results, especially for high values of roughness, show a strong dependence on the duration of exposure to the laser pulses. The use of very short and very long exposure times causes the appearance, on the surface of the scanned textile, of pixels with Active peak heights. The number of Active peaks increases as the exposure time decreases towards 0.1 ms, and increases as the exposure time increases towards 100 ms. The appearance of Active peaks leads to a nonrealistic increase of the surface roughness both for short and for long exposure times, reaching a minimum somewhere in the region of medium exposure times, 1 to 2 ms. The above effect calls for careful analysis of the experimental data and also becomes an important restriction on the method. In this paper we attempt a phenomenological approach to the mechanisms leading to these effects. We suppose that the effect is related both to the scattering properties of the scanned surface and to the physical parameters of CCD sensors. The first factor becomes more important in the region of long exposure times, while the second factor becomes more important in the region of short exposure times.

  7. Continuous-time quantum algorithms for unstructured problems

    International Nuclear Information System (INIS)

    Hen, Itay

    2014-01-01

    We consider a family of unstructured optimization problems, for which we propose a method for constructing analogue, continuous-time (not necessarily adiabatic) quantum algorithms that are faster than their classical counterparts. In this family of problems, which we refer to as ‘scrambled input’ problems, one has to find a minimum-cost configuration of a given integer-valued n-bit black-box function whose input values have been scrambled in some unknown way. Special cases within this set of problems are Grover’s search problem of finding a marked item in an unstructured database, certain random energy models, and the functions of the Deutsch–Josza problem. We consider a couple of examples in detail. In the first, we provide an O(1) deterministic analogue quantum algorithm to solve the seminal problem of Deutsch and Josza, in which one has to determine whether an n-bit boolean function is constant (gives 0 on all inputs or 1 on all inputs) or balanced (returns 0 on half the input states and 1 on the other half). We also study one variant of the random energy model, and show that, as one might expect, its minimum energy configuration can be found quadratically faster with a quantum adiabatic algorithm than with classical algorithms. (paper)
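
    The Grover-type unstructured search mentioned above has a well-known continuous-time formulation (Farhi–Gutmann style) in which the state evolves under H = |s⟩⟨s| + |w⟩⟨w| and reaches the marked item near t = π√N/2. The sketch below is a generic numerical illustration of that textbook construction, not of the paper's specific algorithms.

```python
import numpy as np
from scipy.linalg import expm

N = 64                                   # database size
w = 17                                   # index of the marked item
s = np.full(N, 1.0 / np.sqrt(N))         # uniform superposition |s>
e_w = np.zeros(N); e_w[w] = 1.0          # marked state |w>

# Continuous-time "analog Grover" Hamiltonian: H = |s><s| + |w><w|.
H = np.outer(s, s) + np.outer(e_w, e_w)

# Success probability should peak near t = pi * sqrt(N) / 2.
times = np.linspace(0.0, np.pi * np.sqrt(N), 400)
probs = [abs(expm(-1j * H * t) @ s)[w] ** 2 for t in times]
t_best = times[int(np.argmax(probs))]
print(f"max P(w) = {max(probs):.3f} at t = {t_best:.2f} "
      f"(pi*sqrt(N)/2 = {np.pi * np.sqrt(N) / 2:.2f})")
```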

  8. Triangulation of written assessments from patients, teachers and students: useful for students and teachers?

    Science.gov (United States)

    Gran, Sarah Frandsen; Braend, Anja Maria; Lindbaek, Morten

    2010-01-01

    Many medical students in general practice clerkships experience lack of observation-based feedback. The StudentPEP project combined written feedback from patients, observing teachers and students. This study analyzes the perceived usefulness of triangulated written feedback. A total of 71 general practitioners and 79 medical students at the University of Oslo completed project evaluation forms after a 6-week clerkship. A principal component analysis was performed to find structures within the questionnaire. Regression analysis was performed regarding students' answers to whether StudentPEP was worthwhile. Free-text answers were analyzed qualitatively. Student and teacher responses were mixed within six subscales, with highest agreement on 'Teachers oral and written feedback' and 'Attitude to patient evaluation'. Fifty-four per cent of the students agreed that the triangulation gave concrete feedback on their weaknesses, and 59% valued the teachers' feedback provided. Two statements regarding the teacher's attitudes towards StudentPEP were significantly associated with the student's perception of worthwhileness. Qualitative analysis showed that patient evaluations were encouraging or distrusted. Some students thought that StudentPEP ensured observation and feedback. The patient evaluations increased the students' awareness of the patient perspective. A majority of the students considered the triangulated written feedback beneficial, although time-consuming. The teacher's attitudes strongly influenced how the students perceived the usefulness of StudentPEP.

  9. Recognition and characterization of unstructured environmental sounds

    Science.gov (United States)

    Chu, Selina

    2011-12-01

    Environmental sounds are what we hear every day, or, more generally, the ambient or background audio that surrounds us. Humans utilize both vision and hearing to respond to their surroundings, a capability still quite limited in machine processing. The first step toward achieving multimodal input applications is the ability to process unstructured audio and recognize audio scenes (or environments). Such ability would have applications in content analysis and mining of multimedia data or in improving robustness in context-aware applications through multi-modality, such as in assistive robotics, surveillance, or mobile device-based services. The goal of this thesis is the characterization of unstructured environmental sounds for understanding and predicting the context surrounding an agent or device. Most research on audio recognition has focused primarily on speech and music; less attention has been paid to the challenges and opportunities of using audio to characterize unstructured audio. My research focuses on investigating the challenging issues in characterizing unstructured environmental audio and on developing novel algorithms for modeling the variations of the environment. The first step in building a recognition system for unstructured auditory environments was to investigate techniques and audio features suited to such audio data. In my initial investigation into the feasibility of designing an automatic environment recognition system using audio information, I found that traditional recognition and feature extraction techniques for audio were not suitable for environmental sounds, which lack the formantic and harmonic structures of speech and music, thus dispelling the notion that traditional speech and music recognition techniques can simply be carried over.

  10. Exploring Torus Universes in Causal Dynamical Triangulations

    DEFF Research Database (Denmark)

    Budd, Timothy George; Loll, R.

    2013-01-01

    Motivated by the search for new observables in nonperturbative quantum gravity, we consider Causal Dynamical Triangulations (CDT) in 2+1 dimensions with the spatial topology of a torus. This system is of particular interest, because one can study not only the global scale factor, but also global shape variables in the presence of arbitrary quantum fluctuations of the geometry. Our initial investigation focusses on the dynamics of the scale factor and uncovers a qualitatively new behaviour, which leads us to investigate a novel type of boundary conditions for the path integral. Apart from setting the stage for the analysis of shape dynamics on the torus, the new set-up highlights the role of nontrivial boundaries and topology.

  11. WebDat: bridging the gap between unstructured and structured data

    International Nuclear Information System (INIS)

    Nogiec, Jerzy M.; Trombly-Freytag, Kelley; Carcagno, Ruben

    2008-01-01

    Accelerator R and D environments produce data characterized by different levels of organization. Whereas some systems produce repetitively predictable and standardized structured data, others may produce data of unknown or changing structure. In addition, structured data, typically sets of numeric values, are frequently logically connected with unstructured content (e.g., images, graphs, comments). Despite these different characteristics, a coherent, organized and integrated view of all information is sought out. WebDat is a system conceived as a result of efforts in this direction. It provides a uniform and searchable view of structured and unstructured data via common metadata, regardless of the repository used (DBMS or file system). It also allows for processing data and creating interactive reports. WebDat supports metadata management, administration, data and content access, application integration via Web services, and Web-based collaborative analysis

  12. WebDat: bridging the gap between unstructured and structured data

    Energy Technology Data Exchange (ETDEWEB)

    Nogiec, Jerzy M.; Trombly-Freytag, Kelley; Carcagno, Ruben; /Fermilab

    2008-11-01

    Accelerator R&D environments produce data characterized by different levels of organization. Whereas some systems produce repetitively predictable and standardized structured data, others may produce data of unknown or changing structure. In addition, structured data, typically sets of numeric values, are frequently logically connected with unstructured content (e.g., images, graphs, comments). Despite these different characteristics, a coherent, organized and integrated view of all information is sought out. WebDat is a system conceived as a result of efforts in this direction. It provides a uniform and searchable view of structured and unstructured data via common metadata, regardless of the repository used (DBMS or file system). It also allows for processing data and creating interactive reports. WebDat supports metadata management, administration, data and content access, application integration via Web services, and Web-based collaborative analysis.

  13. Optimizing 3D Triangulations to Recapture Sharp Edges

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas

    2006-01-01

    In this report, a technique for optimizing 3D triangulations is proposed. The method seeks to minimize an energy defined as a sum of energy terms for each edge in a triangle mesh. The main contribution is a novel per edge energy which strikes a balance between penalizing dihedral angle yet allowing sharp edges. The energy is minimized using edge swapping, and this can be done either in a greedy fashion or using simulated annealing. The latter is more costly, but effectively avoids local minima. The method has been used on a number of models. Particularly good results have been obtained on digital ...
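
    The report's exact per-edge energy is not given in the abstract; the sketch below only illustrates the ingredient it describes, a dihedral-angle penalty per interior edge that saturates so that genuinely sharp edges remain affordable. The saturation threshold, the specific penalty and the toy two-triangle mesh are illustrative, and the edge-swap optimizer itself is not shown.

      import numpy as np

      # Per-edge dihedral-angle energy for a triangle mesh (illustrative sketch).

      def face_normals(V, F):
          n = np.cross(V[F[:, 1]] - V[F[:, 0]], V[F[:, 2]] - V[F[:, 0]])
          return n / np.linalg.norm(n, axis=1, keepdims=True)

      def edge_to_faces(F):
          """Map each undirected edge (i, j), i < j, to the faces containing it."""
          edges = {}
          for fi, (a, b, c) in enumerate(F):
              for u, v in ((a, b), (b, c), (c, a)):
                  edges.setdefault((min(u, v), max(u, v)), []).append(fi)
          return edges

      def dihedral_energy(V, F, saturation=np.pi / 3):
          """Sum over interior edges of min(angle, saturation)**2.

          The cap keeps one sharp edge cheaper than many slightly bent ones,
          which is the balance the abstract describes.
          """
          N = face_normals(V, F)
          total = 0.0
          for (i, j), faces in edge_to_faces(F).items():
              if len(faces) == 2:
                  cosang = np.clip(np.dot(N[faces[0]], N[faces[1]]), -1.0, 1.0)
                  angle = np.arccos(cosang)        # 0 for coplanar faces
                  total += min(angle, saturation) ** 2
          return total

      # toy mesh: two triangles sharing an edge, folded by 90 degrees
      V = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0.5, 0.5, 1.0]])
      F = np.array([[0, 1, 2], [1, 3, 2]])
      print(dihedral_energy(V, F))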

  14. Transmission probability method based on triangle meshes for solving unstructured geometry neutron transport problem

    Energy Technology Data Exchange (ETDEWEB)

    Wu Hongchun [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China)]. E-mail: hongchun@mail.xjtu.edu.cn; Liu Pingping [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China); Zhou Yongqiang [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China); Cao Liangzhi [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China)

    2007-01-15

    In advanced reactors, fuel assemblies or cores with unstructured geometry are frequently used, and the transmission probability method (TPM) has been widely used for the corresponding fuel assembly calculations. However, TPM codes have mainly used rectangular or hexagonal meshes suited to regular core structures. Triangular meshes are the most useful for representing complicated unstructured geometry. Although the finite element and Monte Carlo methods handle unstructured geometry problems well, they are very time consuming. We therefore developed a TPM code based on triangular meshes. The code was applied to a hybrid fuel geometry and compared with the MCNP code and other codes; the results were consistent with each other. The TPM with triangular meshes is thus expected to be applicable to arbitrary two-dimensional fuel assemblies.

  15. An assessment of unstructured grid finite volume schemes for cold gas hypersonic flow calculations

    Directory of Open Access Journals (Sweden)

    João Luiz F. Azevedo

    2009-06-01

    Full Text Available A comparison of five different spatial discretization schemes is performed considering a typical high speed flow application. Flowfields are simulated using the 2-D Euler equations, discretized in a cell-centered finite volume procedure on unstructured triangular meshes. The algorithms studied include a central difference-type scheme, and 1st- and 2nd-order van Leer and Liou flux-vector splitting schemes. These methods are implemented in an efficient, edge-based, unstructured grid procedure which allows for adaptive mesh refinement based on flow property gradients. Details of the unstructured grid implementation of the methods are presented together with a discussion of the data structure and of the adaptive refinement strategy. The application of interest is the cold gas flow through a typical hypersonic inlet. Results for different entrance Mach numbers and mesh topologies are discussed in order to assess the comparative performance of the various spatial discretization schemes.
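
    A minimal sketch of the edge-based residual gather used by cell-centered unstructured finite volume solvers of this kind, with a first-order upwind flux for a scalar advected quantity standing in for the paper's van Leer and Liou flux-vector splittings of the Euler equations; the toy edge list and face normals are illustrative.

      import numpy as np

      # Edge-based residual accumulation for a cell-centered finite volume scheme
      # (illustrative; the Euler fluxes of the paper are replaced by first-order
      # upwinding of a scalar q advected with constant velocity vel).

      vel = np.array([1.0, 0.0])

      def upwind_flux(qL, qR, normal):
          """Flux through a face with outward normal from L to R (area-scaled)."""
          vn = vel @ normal                 # normal includes the face length
          return vn * (qL if vn >= 0.0 else qR)

      def accumulate_residuals(q, edges, normals, n_cells):
          """Gather face fluxes into cell residuals R_i = net outgoing flux."""
          R = np.zeros(n_cells)
          for (cL, cR), n in zip(edges, normals):
              f = upwind_flux(q[cL], q[cR], n)
              R[cL] += f                    # flux leaves the left cell...
              R[cR] -= f                    # ...and enters the right cell
          return R

      # toy "mesh": three cells in a row, faces oriented from lower to higher index
      edges = [(0, 1), (1, 2)]
      normals = [np.array([1.0, 0.0]), np.array([1.0, 0.0])]   # unit-length faces
      q = np.array([1.0, 0.0, 0.0])
      print(accumulate_residuals(q, edges, normals, 3))         # -> [ 1. -1.  0.]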

  16. A study on the effect of different image centres on stereo triangulation accuracy

    CSIR Research Space (South Africa)

    De Villiers, J

    2015-11-01

    Full Text Available This paper evaluates the effect of mixing the distortion centre, principal point and arithmetic image centre on the distortion correction, focal length determination and resulting real-world stereo vision triangulation. A robotic arm is used...

  17. Triangulation of the monophasic action potential causes flattening of the electrocardiographic T-wave

    DEFF Research Database (Denmark)

    Bhuiyan, Tanveer Ahmed; Graff, Claus; Thomsen, Morten Bækgaard

    2012-01-01

    It has been proposed that triangulation on the cardiac action potential manifests as a broadened, more flat and notched T-wave on the ECG, but to what extent such morphology characteristics are indicative of triangulation is more unclear. In this paper, we have analyzed the morphological changes of the action potential under the effect of the IKr blocker sertindole and associated these changes to concurrent changes in the morphology of electrocardiographic T-waves in dogs. We show that, under the effect of sertindole, the peak changes in the morphology of action potentials occur at time points similar to those observed for the peak changes in T-wave morphology on the ECG. We further show that the association between action potential shape and ECG shape is dose-dependent and most prominent at the time corresponding to phase 3 of the action potential.

  18. Linear systems with unstructured multiplicative uncertainty: Modeling and robust stability analysis.

    Directory of Open Access Journals (Sweden)

    Radek Matušů

    Full Text Available This article deals with continuous-time Linear Time-Invariant (LTI) Single-Input Single-Output (SISO) systems affected by unstructured multiplicative uncertainty. More specifically, its aim is to present an approach to the construction of uncertain models based on the appropriate selection of a nominal system and a weight function, and to apply the fundamentals of robust stability investigation for the considered sort of systems. The initial theoretical parts are followed by three extensive illustrative examples in which first-order time-delay, second-order and third-order plants with parametric uncertainty are modeled as systems with unstructured multiplicative uncertainty and, subsequently, the robust stability of selected feedback loops containing the constructed models and chosen controllers is analyzed and the obtained results are discussed.
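
    A minimal numerical sketch of the standard robust stability test for unstructured multiplicative uncertainty, sup_w |W(jw)T(jw)| < 1 with T the complementary sensitivity of the nominal loop; the plant, controller and weight below are illustrative and are not the systems analyzed in the article.

      import numpy as np

      # Robust stability test for unstructured multiplicative uncertainty
      # (illustrative plant/controller/weight; the condition is the standard
      # sup_w |W(jw) T(jw)| < 1 with T = L / (1 + L)).

      w = np.logspace(-2, 3, 2000)        # frequency grid [rad/s]
      s = 1j * w

      def tf(num, den, s):
          """Evaluate a rational transfer function num(s)/den(s) on the grid."""
          return np.polyval(num, s) / np.polyval(den, s)

      G = tf([1.0], [1.0, 2.0, 1.0], s)        # nominal plant 1 / (s^2 + 2s + 1)
      C = tf([2.0, 1.0], [1.0, 0.0], s)        # PI controller (2s + 1) / s
      W = tf([0.5, 0.05], [0.05, 1.0], s)      # multiplicative weight (illustrative)

      L = G * C
      T = L / (1.0 + L)                        # nominal complementary sensitivity
      peak = np.max(np.abs(W * T))

      print(f"sup |W T| = {peak:.3f} ->",
            "robustly stable" if peak < 1.0 else "not guaranteed")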

  19. Unsupervised Ontology Generation from Unstructured Text. CRESST Report 827

    Science.gov (United States)

    Mousavi, Hamid; Kerr, Deirdre; Iseli, Markus R.

    2013-01-01

    Ontologies are a vital component of most knowledge acquisition systems, and recently there has been a huge demand for generating ontologies automatically since manual or supervised techniques are not scalable. In this paper, we introduce "OntoMiner", a rule-based, iterative method to extract and populate ontologies from unstructured or…

  20. GRIZ: Visualization of finite element analysis results on unstructured grids

    International Nuclear Information System (INIS)

    Dovey, D.; Loomis, M.D.

    1994-01-01

    GRIZ is a general-purpose post-processing application that supports interactive visualization of finite element analysis results on three-dimensional unstructured grids. GRIZ includes direct-to-videodisc animation capabilities and is being used as a production tool for creating engineering animations

  1. Smooth random surfaces from tight immersions?

    International Nuclear Information System (INIS)

    Baillie, C.F.; Johnston, D.A.

    1994-01-01

    We investigate actions for dynamically triangulated random surfaces that consist of a Gaussian or area term plus the modulus of the Gaussian curvature and compare their behavior with both Gaussian plus extrinsic curvature and ''Steiner'' actions

  2. The finite body triangulation: algorithms, subgraphs, homogeneity estimation and application.

    Science.gov (United States)

    Carson, Cantwell G; Levine, Jonathan S

    2016-09-01

    The concept of a finite body Dirichlet tessellation has been extended to that of a finite body Delaunay 'triangulation' to provide a more meaningful description of the spatial distribution of nonspherical secondary phase bodies in 2- and 3-dimensional images. A finite body triangulation (FBT) consists of a network of minimum edge-to-edge distances between adjacent objects in a microstructure. From this is also obtained the characteristic object chords formed by the intersection of the object boundary with the finite body tessellation. These two sets of distances form the basis of a parsimonious homogeneity estimation. The characteristics of the spatial distribution are then evaluated with respect to the distances between objects and the distances within them. Quantitative analysis shows that more physically representative distributions can be obtained by selecting subgraphs, such as the relative neighbourhood graph and the minimum spanning tree, from the finite body tessellation. To demonstrate their potential, we apply these methods to 3-dimensional X-ray computed tomographic images of foamed cement and their 2-dimensional cross sections. The Python computer code used to estimate the FBT is made available. Other applications for the algorithm - such as porous media transport and crack-tip propagation - are also discussed. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
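
    A minimal 2D sketch of the distance computation underlying a finite body triangulation: minimum boundary-to-boundary (edge-to-edge) distances between labelled bodies, from which a minimum spanning tree is extracted as one of the subgraphs mentioned above. The toy image is illustrative, and this is not the authors' published code.

      import numpy as np
      from scipy import ndimage
      from scipy.spatial.distance import cdist
      from scipy.sparse.csgraph import minimum_spanning_tree

      # Minimum edge-to-edge distances between discrete bodies in a 2D image
      # and a minimum-spanning-tree subgraph (illustrative sketch).

      image = np.zeros((40, 40), dtype=bool)
      image[5:12, 5:12] = True        # body 1
      image[5:10, 25:35] = True       # body 2
      image[28:36, 10:30] = True      # body 3

      labels, n = ndimage.label(image)

      # boundary pixels: object pixels with at least one background neighbour
      boundary = image & ~ndimage.binary_erosion(image)
      coords = [np.argwhere(boundary & (labels == k)) for k in range(1, n + 1)]

      D = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              d = cdist(coords[i], coords[j]).min()   # minimum edge-to-edge distance
              D[i, j] = D[j, i] = d

      mst = minimum_spanning_tree(D)                  # one of the FBT subgraphs
      print(np.round(D, 2))
      print(mst.toarray())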

  3. Simulating multi-scale oceanic processes around Taiwan on unstructured grids

    Science.gov (United States)

    Yu, Hao-Cheng; Zhang, Yinglong J.; Yu, Jason C. S.; Terng, C.; Sun, Weiling; Ye, Fei; Wang, Harry V.; Wang, Zhengui; Huang, Hai

    2017-11-01

    We validate a 3D unstructured-grid (UG) model for simulating multi-scale processes as occurred in Northwestern Pacific around Taiwan using recently developed new techniques (Zhang et al., Ocean Modeling, 102, 64-81, 2016) that require no bathymetry smoothing even for this region with prevalent steep bottom slopes and many islands. The focus is on short-term forecast for several months instead of long-term variability. Compared with satellite products, the errors for the simulated Sea-surface Height (SSH) and Sea-surface Temperature (SST) are similar to a reference data-assimilated global model. In the nearshore region, comparison with 34 tide gauges located around Taiwan indicates an average RMSE of 13 cm for the tidal elevation. The average RMSE for SST at 6 coastal buoys is 1.2 °C. The mean transport and eddy kinetic energy compare reasonably with previously published values and the reference model used to provide boundary and initial conditions. The model suggests ∼2-day interruption of Kuroshio east of Taiwan during a typhoon period. The effect of tidal mixing is shown to be significant nearshore. The multi-scale model is easily extendable to target regions of interest due to its UG framework and a flexible vertical gridding system, which is shown to be superior to terrain-following coordinates.

  4. Introductory review on `Flying Triangulation': a motion-robust optical 3D measurement principle

    Science.gov (United States)

    Ettl, Svenja

    2015-04-01

    'Flying Triangulation' (FlyTri) is a recently developed principle which allows for a motion-robust optical 3D measurement of rough surfaces. It combines a simple sensor with sophisticated algorithms: a single-shot sensor acquires 2D camera images. From each camera image, a 3D profile is generated. The series of 3D profiles generated are aligned to one another by algorithms, without relying on any external tracking device. It delivers real-time feedback of the measurement process which enables an all-around measurement of objects. The principle has great potential for small-space acquisition environments, such as the measurement of the interior of a car, and motion-sensitive measurement tasks, such as the intraoral measurement of teeth. This article gives an overview of the basic ideas and applications of FlyTri. The main challenges and their solutions are discussed. Measurement examples are also given to demonstrate the potential of the measurement principle.

  5. Scaling analyses of the spectral dimension in 3-dimensional causal dynamical triangulations

    Science.gov (United States)

    Cooperman, Joshua H.

    2018-05-01

    The spectral dimension measures the dimensionality of a space as witnessed by a diffusing random walker. Within the causal dynamical triangulations approach to the quantization of gravity (Ambjørn et al 2000 Phys. Rev. Lett. 85 347, 2001 Nucl. Phys. B 610 347, 1998 Nucl. Phys. B 536 407), the spectral dimension exhibits novel scale-dependent dynamics: reducing towards a value near 2 on sufficiently small scales, matching closely the topological dimension on intermediate scales, and decaying in the presence of positive curvature on sufficiently large scales (Ambjørn et al 2005 Phys. Rev. Lett. 95 171301, Ambjørn et al 2005 Phys. Rev. D 72 064014, Benedetti and Henson 2009 Phys. Rev. D 80 124036, Cooperman 2014 Phys. Rev. D 90 124053, Cooperman et al 2017 Class. Quantum Grav. 34 115008, Coumbe and Jurkiewicz 2015 J. High Energy Phys. JHEP03(2015)151, Kommu 2012 Class. Quantum Grav. 29 105003). I report the first comprehensive scaling analysis of the small-to-intermediate scale spectral dimension for the test case of the causal dynamical triangulations of 3-dimensional Einstein gravity. I find that the spectral dimension scales trivially with the diffusion constant. I find that the spectral dimension is completely finite in the infinite volume limit, and I argue that its maximal value is exactly consistent with the topological dimension of 3 in this limit. I find that the spectral dimension reduces further towards a value near 2 as this case’s bare coupling approaches its phase transition, and I present evidence against the conjecture that the bare coupling simply sets the overall scale of the quantum geometry (Ambjørn et al 2001 Phys. Rev. D 64 044011). On the basis of these findings, I advance a tentative physical explanation for the dynamical reduction of the spectral dimension observed within causal dynamical triangulations: branched polymeric quantum geometry on sufficiently small scales. My analyses should facilitate attempts to employ the spectral
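
    A generic sketch of how a spectral dimension estimate is extracted from the return probability of a diffusing walker, P(sigma) ~ sigma^(-d_s/2), so d_s(sigma) = -2 d ln P / d ln sigma; it is demonstrated on a periodic 2D lattice, where the estimate should plateau near the topological dimension of 2, and it does not reproduce the causal dynamical triangulations ensemble analysis itself.

      import numpy as np

      # Spectral dimension from the return probability of a random walk
      # (generic estimator on a periodic 2D lattice; illustrative only).

      L = 31                                   # lattice side; d_s should plateau near 2
      n = L * L

      def neighbours(i):
          x, y = divmod(i, L)
          return [((x + 1) % L) * L + y, ((x - 1) % L) * L + y,
                  x * L + (y + 1) % L, x * L + (y - 1) % L]

      p = np.zeros(n)                          # walker started at site 0
      p[0] = 1.0
      sigmas, returns = [], []
      for sigma in range(1, 400):
          new = np.zeros(n)
          for i in np.nonzero(p)[0]:
              for j in neighbours(i):
                  new[j] += 0.25 * p[i]        # unbiased nearest-neighbour hop
          p = new
          if sigma % 2 == 0:                   # return probability vanishes at odd times
              sigmas.append(sigma)
              returns.append(p[0])

      lnP, lns = np.log(returns), np.log(sigmas)
      ds = -2.0 * np.gradient(lnP, lns)        # running spectral dimension estimate
      print(ds[10:20])                         # values near 2 on intermediate scales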

  6. The Family System and Depressive Symptoms during the College Years: Triangulation, Parental Differential Treatment, and Sibling Warmth as Predictors.

    Science.gov (United States)

    Ponappa, Sujata; Bartle-Haring, Suzanne; Holowacz, Eugene; Ferriby, Megan

    2017-01-01

    Guided by Bowen theory, we investigated the relationships between parent-child triangulation, parental differential treatment (PDT), sibling warmth, and individual depressive symptoms in a sample of 77 sibling dyads, aged 18-25 years, recruited through undergraduate classes at a U.S. public University. Results of the actor-partner interdependence models suggested that being triangulated into parental conflict was positively related to both siblings' perception of PDT; however, as one sibling felt triangulated, the other perceived reduced levels of PDT. For both siblings, the perception of higher levels of PDT was related to decreased sibling warmth and higher sibling warmth was associated with fewer depressive symptoms. The implications of these findings for research and the treatment of depression in the college-aged population are discussed. © 2016 American Association for Marriage and Family Therapy.

  7. Using unstructured diaries for primary data collection.

    Science.gov (United States)

    Thomas, Juliet Anne

    2015-05-01

    To give a reflective account of using unstructured handwritten diaries as a method of collecting qualitative data. Diaries are primarily used in research as a method of collecting qualitative data. There are some challenges associated with their use, including compliance rates. However, they can provide a rich source of meaningful data and can avoid the difficulties of participants trying to precisely recall events after some time has elapsed. The author used unstructured handwritten diaries as her primary method of collecting data during her grounded theory doctoral study, when she examined the professional socialisation of nursing students. Over two years, 26 participants selected from four consecutive recruited groups of nursing students volunteered to take part in the study and were asked to keep a daily diary throughout their first five weeks of clinical experience. When using open-ended research questions, grounded theory's pragmatic approach permits the examination of processes thereby creating conceptual interpretive understanding of data. A wealth of rich, detailed data was obtained from the diaries that permitted the development of new theories regarding the effects early clinical experiences have on nursing students' professional socialisation. Diaries were found to provide insightful in-depth qualitative data in a resource-friendly manner. Nurse researchers should consider using diaries as an alternative to more commonly used approaches to collecting qualitative data.

  8. Exploring Shared-Memory Optimizations for an Unstructured Mesh CFD Application on Modern Parallel Systems

    KAUST Repository

    Mudigere, Dheevatsa

    2015-05-01

    In this work, we revisit the 1999 Gordon Bell Prize winning PETSc-FUN3D aerodynamics code, extending it with highly-tuned shared-memory parallelization and detailed performance analysis on modern highly parallel architectures. An unstructured-grid implicit flow solver, which forms the backbone of computational aerodynamics, poses particular challenges due to its large irregular working sets, unstructured memory accesses, and variable/limited amount of parallelism. This code, based on a domain decomposition approach, exposes tradeoffs between the number of threads assigned to each MPI-rank sub domain, and the total number of domains. By applying several algorithm- and architecture-aware optimization techniques for unstructured grids, we show a 6.9X speed-up in performance on a single-node Intel Xeon E5-2690 v2 processor relative to the out-of-the-box compilation. Our scaling studies on the TACC Stampede supercomputer show that our optimizations continue to provide performance benefits over the baseline implementation as we scale up to 256 nodes.

  9. Development of 3-D Flow Analysis Code for Fuel Assembly using Unstructured Grid System

    Energy Technology Data Exchange (ETDEWEB)

    Myong, Hyon Kook; Kim, Jong Eun; Ahn, Jong Ki; Yang, Seung Yong [Kookmin Univ., Seoul (Korea, Republic of)

    2007-03-15

    The flow through a nuclear rod bundle with mixing vanes is very complex and requires a suitable turbulence model to be predicted accurately. The final objective of this study is to develop a CFD code for fluid flow and heat transfer analysis in a nuclear fuel assembly using an unstructured grid system. In order to develop such a code, the following research was carried out: development of the numerical algorithm for the CFD code's solver; grid and geometric connectivity data; development of software (PowerCFD code) for fluid flow and heat transfer analysis in a nuclear fuel assembly using an unstructured grid system; modulation of the software (PowerCFD code); development of a turbulence model; development of an analysis module for RANS/LES hybrid models; analysis of turbulent flow and heat transfer; basic study on LES analysis; development of the main frame of the pre/post processors based on a GUI; and an algorithm for fully-developed flow.

  10. Development of 3-D Flow Analysis Code for Fuel Assembly using Unstructured Grid System

    International Nuclear Information System (INIS)

    Myong, Hyon Kook; Kim, Jong Eun; Ahn, Jong Ki; Yang, Seung Yong

    2007-03-01

    The flow through a nuclear rod bundle with mixing vanes is very complex and requires a suitable turbulence model to be predicted accurately. The final objective of this study is to develop a CFD code for fluid flow and heat transfer analysis in a nuclear fuel assembly using an unstructured grid system. In order to develop such a code, the following research was carried out: development of the numerical algorithm for the CFD code's solver; grid and geometric connectivity data; development of software (PowerCFD code) for fluid flow and heat transfer analysis in a nuclear fuel assembly using an unstructured grid system; modulation of the software (PowerCFD code); development of a turbulence model; development of an analysis module for RANS/LES hybrid models; analysis of turbulent flow and heat transfer; basic study on LES analysis; development of the main frame of the pre/post processors based on a GUI; and an algorithm for fully-developed flow

  11. Incremental Reconstruction of Urban Environments by Edge-Points Delaunay Triangulation

    OpenAIRE

    Romanoni, Andrea; Matteucci, Matteo

    2016-01-01

    Urban reconstruction from a video captured by a surveying vehicle constitutes a core module of automated mapping. When computational power represents a limited resource and a detailed map is not the primary goal, the reconstruction can be performed incrementally, from a monocular video, carving a 3D Delaunay triangulation of sparse points; this allows online incremental mapping for tasks such as traversability analysis or obstacle avoidance. To exploit the sharp edges of urban landscape, we ...

  12. Theoretical triangulation as an approach for revealing the complexity of a classroom discussion

    NARCIS (Netherlands)

    van Drie, J.; Dekker, R.

    2013-01-01

    In this paper we explore the value of theoretical triangulation as a methodological approach for the analysis of classroom interaction. We analyze an excerpt of a whole-class discussion in history from three theoretical perspectives: interactivity of the discourse, conceptual level raising and

  13. Quantum triangulations moduli space, quantum computing, non-linear sigma models and Ricci flow

    CERN Document Server

    Carfora, Mauro

    2017-01-01

    This book discusses key conceptual aspects and explores the connection between triangulated manifolds and quantum physics, using a set of case studies ranging from moduli space theory to quantum computing to provide an accessible introduction to this topic. Research on polyhedral manifolds often reveals unexpected connections between very distinct aspects of mathematics and physics. In particular, triangulated manifolds play an important role in settings such as Riemann moduli space theory, strings and quantum gravity, topological quantum field theory, condensed matter physics, critical phenomena and complex systems. Not only do they provide a natural discrete analogue to the smooth manifolds on which physical theories are typically formulated, but their appearance is also often a consequence of an underlying structure that naturally calls into play non-trivial aspects of representation theory, complex analysis and topology in a way that makes the basic geometric structures of the physical interactions involv...

  14. Shared decision-making in medical encounters regarding breast cancer treatment: the contribution of methodological triangulation.

    Science.gov (United States)

    Durif-Bruckert, C; Roux, P; Morelle, M; Mignotte, H; Faure, C; Moumjid-Ferdjaoui, N

    2015-07-01

    The aim of this study on shared decision-making in the doctor-patient encounter about surgical treatment for early-stage breast cancer, conducted in a regional cancer centre in France, was to further the understanding of patient perceptions on shared decision-making. The study used methodological triangulation to collect data (both quantitative and qualitative) about patient preferences in the context of a clinical consultation in which surgeons followed a shared decision-making protocol. Data were analysed from a multi-disciplinary research perspective (social psychology and health economics). The triangulated data collection methods were questionnaires (n = 132), longitudinal interviews (n = 47) and observations of consultations (n = 26). Methodological triangulation revealed levels of divergence and complementarity between qualitative and quantitative results that suggest new perspectives on the three inter-related notions of decision-making, participation and information. Patients' responses revealed important differences between shared decision-making and participation per se. The authors note that subjecting patients to a normative behavioural model of shared decision-making in an era when paradigms of medical authority are shifting may undermine the patient's quest for what he or she believes is a more important right: a guarantee of the best care available. © 2014 John Wiley & Sons Ltd.

  15. Segmentation of Large Unstructured Point Clouds Using Octree-Based Region Growing and Conditional Random Fields

    Science.gov (United States)

    Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.

    2017-11-01

    Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key aspect in the automated procedure from point cloud data to BIM. Current approaches typically only segment a single type of primitive such as planes or cylinders. Also, current algorithms suffer from oversegmenting the data and are often sensor or scene dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method. The growing is conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments prove that the used method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds up to 40,000 points per second are recorded for the region growing. Additionally, the recall and precision of the graph clustering is approximately 80%. Overall, nearly 22% of oversegmentation is reduced by clustering the data. These clusters will be classified and used as a basis for the reconstruction of BIM models.

  16. Recent development of micro-triangulation for magnet fiducialisation

    CERN Document Server

    Vlachakis, Vasileios; Mainaud Durand, Helene; CERN. Geneva. ATS Department

    2016-01-01

    The micro-triangulation method is proposed as an alternative for magnet fiducialisation. The main objective is to measure horizontal and vertical angles to fiducial points and stretched wires, utilising theodolites equipped with cameras. This study aims to develop various methods, algorithms and software tools to enable the data acquisition and processing. In this paper, we present the first test measurement as an attempt to demonstrate the feasibility of the method and to evaluate the accuracy. The preliminary results are very promising, with accuracy always better than 20 μm for the wire position, and of about 40 μm/m for the wire orientation, compared with a coordinate measuring machine.

  17. Public health triangulation: approach and application to synthesizing data to understand national and local HIV epidemics

    Directory of Open Access Journals (Sweden)

    Aberle-Grasse John

    2010-07-01

    Full Text Available Abstract Background Public health triangulation is a process for reviewing, synthesising and interpreting secondary data from multiple sources that bear on the same question to make public health decisions. It can be used to understand the dynamics of HIV transmission and to measure the impact of public health programs. While traditional intervention research and meta-analysis would be ideal sources of information for public health decision making, they are infrequently available, and often decisions can be based only on surveillance and survey data. Methods The process involves examination of a wide variety of data sources, including biological, behavioral and program data, and seeks input from stakeholders to formulate meaningful public health questions. Finally and most importantly, it uses the results to inform public health decision-making. There are 12 discrete steps in the triangulation process, which include identification and assessment of key questions, identification of data sources, refining questions, gathering data and reports, assessing the quality of those data and reports, formulating hypotheses to explain trends in the data, corroborating or refining working hypotheses, drawing conclusions, communicating results and recommendations and taking public health action. Results Triangulation can be limited by the quality of the original data, the potential for ecological fallacy and "data dredging", and the reproducibility of results. Conclusions Nonetheless, we believe that public health triangulation allows for the interpretation of data sets that cannot be analyzed using meta-analysis and can be a helpful adjunct to surveillance, to formal public health intervention research and to monitoring and evaluation, which in turn lead to improved national strategic planning and resource allocation.

  18. Unstructured Spectral Element Model for Dispersive and Nonlinear Wave Propagation

    DEFF Research Database (Denmark)

    Engsig-Karup, Allan Peter; Eskilsson, Claes; Bigoni, Daniele

    2016-01-01

    We introduce a new stabilized high-order and unstructured numerical model for modeling fully nonlinear and dispersive water waves. The model is based on a nodal spectral element method of arbitrary order in space and a σ-transformed formulation due to Cai, Langtangen, Nielsen and Tveito (1998). In...

  19. Improved laser-based triangulation sensor with enhanced range and resolution through adaptive optics-based active beam control.

    Science.gov (United States)

    Reza, Syed Azer; Khwaja, Tariq Shamim; Mazhar, Mohsin Ali; Niazi, Haris Khan; Nawab, Rahma

    2017-07-20

    Various existing target ranging techniques are limited in terms of the dynamic range of operation and measurement resolution. These limitations arise as a result of a particular measurement methodology, the finite processing capability of the hardware components deployed within the sensor module, and the medium through which the target is viewed. Generally, improving the sensor range adversely affects its resolution and vice versa. Often, a distance sensor is designed for an optimal range/resolution setting depending on its intended application. Optical triangulation is broadly classified as a spatial-signal-processing-based ranging technique and measures target distance from the location of the reflected spot on a position sensitive detector (PSD). In most triangulation sensors that use lasers as a light source, beam divergence-which severely affects sensor measurement range-is often ignored in calculations. In this paper, we first discuss in detail the limitations to ranging imposed by beam divergence, which, in effect, sets the sensor dynamic range. Next, we show how the resolution of laser-based triangulation sensors is limited by the interpixel pitch of a finite-sized PSD. In this paper, through the use of tunable focus lenses (TFLs), we propose a novel design of a triangulation-based optical rangefinder that improves both the sensor resolution and its dynamic range through adaptive electronic control of beam propagation parameters. We present the theory and operation of the proposed sensor and clearly demonstrate a range and resolution improvement with the use of TFLs. Experimental results in support of our claims are shown to be in strong agreement with theory.

  20. Towards a supervised rescoring system for unstructured data bases used to build specialized dictionaries

    Directory of Open Access Journals (Sweden)

    Antonio Rico-Sulayes

    2014-12-01

    Full Text Available This article proposes the architecture for a system that uses previously learned weights to sort query results from unstructured data bases when building specialized dictionaries. A common resource in the construction of dictionaries, unstructured data bases have been especially useful in providing information about lexical items frequencies and examples in use. However, when building specialized dictionaries, whose selection of lexical items does not rely on frequency, the use of these data bases gets restricted to a simple provider of examples. Even in this task, the information unstructured data bases provide may not be very useful when looking for specialized uses of lexical items with various meanings and very long lists of results. In the face of this problem, long lists of hits can be rescored based on a supervised learning model that relies on previously helpful results. The allocation of a vast set of high quality training data for this rescoring system is reported here. Finally, the architecture of such a system, an unprecedented tool in specialized lexicography, is proposed.

  1. A Parallel Non-Overlapping Domain-Decomposition Algorithm for Compressible Fluid Flow Problems on Triangulated Domains

    Science.gov (United States)

    Barth, Timothy J.; Chan, Tony F.; Tang, Wei-Pai

    1998-01-01

    This paper considers an algebraic preconditioning algorithm for hyperbolic-elliptic fluid flow problems. The algorithm is based on a parallel non-overlapping Schur complement domain-decomposition technique for triangulated domains. In the Schur complement technique, the triangulation is first partitioned into a number of non-overlapping subdomains and interfaces. This suggests a reordering of triangulation vertices which separates subdomain and interface solution unknowns. The reordering induces a natural 2 x 2 block partitioning of the discretization matrix. Exact LU factorization of this block system yields a Schur complement matrix which couples subdomains and the interface together. The remaining sections of this paper present a family of approximate techniques for both constructing and applying the Schur complement as a domain-decomposition preconditioner. The approximate Schur complement serves as an algebraic coarse space operator, thus avoiding the known difficulties associated with the direct formation of a coarse space discretization. In developing Schur complement approximations, particular attention has been given to improving sequential and parallel efficiency of implementations without significantly degrading the quality of the preconditioner. A computer code based on these developments has been tested on the IBM SP2 using MPI message passing protocol. A number of 2-D calculations are presented for both scalar advection-diffusion equations as well as the Euler equations governing compressible fluid flow to demonstrate performance of the preconditioning algorithm.
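
    A small dense sketch of the exact 2x2 block elimination that the Schur complement preconditioner approximates: unknowns are reordered into subdomain and interface sets, the interface Schur complement is formed and solved, and the subdomain unknowns are recovered by back-substitution. The random SPD test matrix and block sizes below are illustrative.

      import numpy as np

      # Dense illustration of the non-overlapping Schur complement solve
      # (the paper approximates this operator for use as a preconditioner).
      #
      #   [A_ss  A_si] [x_s]   [b_s]
      #   [A_is  A_ii] [x_i] = [b_i]
      #
      # Eliminating the subdomain block gives the interface system
      #   S x_i = b_i - A_is A_ss^{-1} b_s,   S = A_ii - A_is A_ss^{-1} A_si.

      rng = np.random.default_rng(0)
      ns, ni = 8, 3                                # subdomain / interface unknowns
      A = rng.standard_normal((ns + ni, ns + ni))
      A = A @ A.T + (ns + ni) * np.eye(ns + ni)    # SPD test matrix
      b = rng.standard_normal(ns + ni)

      A_ss, A_si = A[:ns, :ns], A[:ns, ns:]
      A_is, A_ii = A[ns:, :ns], A[ns:, ns:]
      b_s, b_i = b[:ns], b[ns:]

      Y = np.linalg.solve(A_ss, A_si)              # A_ss^{-1} A_si (these solves are
      z = np.linalg.solve(A_ss, b_s)               # independent per subdomain in practice)
      S = A_ii - A_is @ Y                          # interface Schur complement
      x_i = np.linalg.solve(S, b_i - A_is @ z)     # interface solve
      x_s = z - Y @ x_i                            # subdomain back-substitution

      print(np.allclose(np.concatenate([x_s, x_i]), np.linalg.solve(A, b)))   # True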

  2. Numerical study of Taylor bubbles with adaptive unstructured meshes

    Science.gov (United States)

    Xie, Zhihua; Pavlidis, Dimitrios; Percival, James; Pain, Chris; Matar, Omar; Hasan, Abbas; Azzopardi, Barry

    2014-11-01

    The Taylor bubble is a single long bubble which nearly fills the entire cross section of a liquid-filled circular tube. This type of bubble flow regime often occurs in gas-liquid slug flows in many industrial applications, including oil-and-gas production, chemical and nuclear reactors, and heat exchangers. The objective of this study is to investigate the fluid dynamics of Taylor bubbles rising in a vertical pipe filled with oils of extremely high viscosity (mimicking the ``heavy oils'' found in the oil-and-gas industry). A modelling and simulation framework is presented here which can modify and adapt anisotropic unstructured meshes to better represent the underlying physics of bubble rise and reduce the computational effort without sacrificing accuracy. The numerical framework consists of a mixed control-volume and finite-element formulation, a ``volume of fluid''-type method for the interface capturing based on a compressive control volume advection method, and a force-balanced algorithm for the surface tension implementation. Numerical examples of some benchmark tests and the dynamics of Taylor bubbles are presented to show the capability of this method. EPSRC Programme Grant, MEMPHIS, EP/K0039761/1.

  3. Optimizations of Unstructured Aerodynamics Computations for Many-core Architectures

    KAUST Repository

    Al Farhan, Mohammed Ahmed

    2018-04-13

    We investigate several state-of-the-practice shared-memory optimization techniques applied to key routines of an unstructured computational aerodynamics application with irregular memory accesses. We illustrate for the Intel KNL processor, as a representative of the processors in contemporary leading supercomputers, identifying and addressing performance challenges without compromising the floating point numerics of the original code. We employ low and high-level architecture-specific code optimizations involving thread and data-level parallelism. Our approach is based upon a multi-level hierarchical distribution of work and data across both the threads and the SIMD units within every hardware core. On a 64-core KNL chip, we achieve nearly 2.9x speedup of the dominant routines relative to the baseline. These exhibit almost linear strong scalability up to 64 threads, and thereafter some improvement with hyperthreading. At substantially fewer Watts, we achieve up to 1.7x speedup relative to the performance of 72 threads of a 36-core Haswell CPU and roughly equivalent performance to 112 threads of a 56-core Skylake scalable processor. These optimizations are expected to be of value for many other unstructured mesh PDE-based scientific applications as multi and many-core architecture evolves.

  4. The Virtual Family-development of surface-based anatomical models of two adults and two children for dosimetric simulations

    Energy Technology Data Exchange (ETDEWEB)

    Christ, Andreas; Honegger, Katharina; Zefferer, Marcel; Neufeld, Esra; Oberle, Michael; Szczerba, Dominik; Kuster, Niels [Foundation for Research on Information Technologies in Society (IT' IS), Zeughausstr. 43, 8004 Zuerich (Switzerland); Kainz, Wolfgang; Guag, Joshua W [US Food and Drug Administration (FDA), Center for Devices and Radiological Health (CDRH), Silver Spring, MD 20993 (United States); Hahn, Eckhart G; Rascher, Wolfgang; Janka, Rolf; Bautz, Werner [Universitaetsklinikum Erlangen, Friedrich-Alexander Universitaet Erlangen-Nuernberg, 91054 Erlangen (Germany); Chen, Ji; Shen, Jianxiang [Department of Electrical and Computer Engineering, The University of Houston, Houston, TX 77204 (United States); Kiefer, Berthold; Schmitt, Peter; Hollenbach, Hans-Peter [Siemens Healthcare, MR-Application Development, 91052 Erlangen (Germany); Kam, Anthony [Department of Imaging, Johns Hopkins Bayview Medical Center, Baltimore, MD 21224 (United States)], E-mail: christ@itis.ethz.ch

    2010-01-21

    The objective of this study was to develop anatomically correct whole body human models of an adult male (34 years old), an adult female (26 years old) and two children (an 11-year-old girl and a six-year-old boy) for the optimized evaluation of electromagnetic exposure. These four models are referred to as the Virtual Family. They are based on high resolution magnetic resonance (MR) images of healthy volunteers. More than 80 different tissue types were distinguished during the segmentation. To improve the accuracy and the effectiveness of the segmentation, a novel semi-automated tool was used to analyze and segment the data. All tissues and organs were reconstructed as three-dimensional (3D) unstructured triangulated surface objects, yielding high precision images of individual features of the body. This greatly enhances the meshing flexibility and the accuracy with respect to thin tissue layers and small organs in comparison with the traditional voxel-based representation of anatomical models. Conformal computational techniques were also applied. The techniques and tools developed in this study can be used to more effectively develop future models and further improve the accuracy of the models for various applications. For research purposes, the four models are provided for free to the scientific community. (note)

  5. The Virtual Family-development of surface-based anatomical models of two adults and two children for dosimetric simulations

    International Nuclear Information System (INIS)

    Christ, Andreas; Honegger, Katharina; Zefferer, Marcel; Neufeld, Esra; Oberle, Michael; Szczerba, Dominik; Kuster, Niels; Kainz, Wolfgang; Guag, Joshua W; Hahn, Eckhart G; Rascher, Wolfgang; Janka, Rolf; Bautz, Werner; Chen, Ji; Shen, Jianxiang; Kiefer, Berthold; Schmitt, Peter; Hollenbach, Hans-Peter; Kam, Anthony

    2010-01-01

    The objective of this study was to develop anatomically correct whole body human models of an adult male (34 years old), an adult female (26 years old) and two children (an 11-year-old girl and a six-year-old boy) for the optimized evaluation of electromagnetic exposure. These four models are referred to as the Virtual Family. They are based on high resolution magnetic resonance (MR) images of healthy volunteers. More than 80 different tissue types were distinguished during the segmentation. To improve the accuracy and the effectiveness of the segmentation, a novel semi-automated tool was used to analyze and segment the data. All tissues and organs were reconstructed as three-dimensional (3D) unstructured triangulated surface objects, yielding high precision images of individual features of the body. This greatly enhances the meshing flexibility and the accuracy with respect to thin tissue layers and small organs in comparison with the traditional voxel-based representation of anatomical models. Conformal computational techniques were also applied. The techniques and tools developed in this study can be used to more effectively develop future models and further improve the accuracy of the models for various applications. For research purposes, the four models are provided for free to the scientific community. (note)

  6. MOCUM: A two-dimensional method of characteristics code based on constructive solid geometry and unstructured meshing for general geometries

    International Nuclear Information System (INIS)

    Yang Xue; Satvat, Nader

    2012-01-01

    Highlight: ► A two-dimensional numerical code based on the method of characteristics is developed. ► The complex arbitrary geometries are represented by constructive solid geometry and decomposed by unstructured meshing. ► Excellent agreement between Monte Carlo and the developed code is observed. ► High efficiency is achieved by parallel computing. - Abstract: A transport theory code MOCUM based on the method of characteristics as the flux solver with an advanced general geometry processor has been developed for two-dimensional rectangular and hexagonal lattice and full core neutronics modeling. In the code, the core structure is represented by the constructive solid geometry that uses regularized Boolean operations to build complex geometries from simple polygons. Arbitrary-precision arithmetic is also used in the process of building geometry objects to eliminate the round-off error from the commonly used double precision numbers. Then, the constructed core frame will be decomposed and refined into a Conforming Delaunay Triangulation to ensure the quality of the meshes. The code is fully parallelized using OpenMP and is verified and validated by various benchmarks representing rectangular, hexagonal, plate type and CANDU reactor geometries. Compared with Monte Carlo and deterministic reference solution, MOCUM results are highly accurate. The mentioned characteristics of the MOCUM make it a perfect tool for high fidelity full core calculation for current and GenIV reactor core designs. The detailed representation of reactor physics parameters can enhance the safety margins with acceptable confidence levels, which lead to more economically optimized designs.

  7. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    Science.gov (United States)

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  8. A nonlinear model predictive control formulation for obstacle avoidance in high-speed autonomous ground vehicles in unstructured environments

    Science.gov (United States)

    Liu, Jiechao; Jayakumar, Paramsothy; Stein, Jeffrey L.; Ersal, Tulga

    2018-06-01

    This paper presents a nonlinear model predictive control (MPC) formulation for obstacle avoidance in high-speed, large-size autonomous ground vehicles (AGVs) with high centre of gravity (CoG) that operate in unstructured environments, such as military vehicles. The term 'unstructured' in this context denotes that there are no lanes or traffic rules to follow. Existing MPC formulations for passenger vehicles in structured environments do not readily apply to this context. Thus, a new nonlinear MPC formulation is developed to navigate an AGV from its initial position to a target position at high speed safely. First, a new cost function formulation is used that aims to find the shortest path to the target position, since no reference trajectory exists in unstructured environments. Second, a region partitioning approach is used in conjunction with a multi-phase optimal control formulation to accommodate the complicated forms the obstacle-free region can assume due to the presence of multiple obstacles in the prediction horizon in an unstructured environment. Third, the no-wheel-lift-off condition, which is the major dynamical safety concern for high-speed, high-CoG AGVs, is ensured by limiting the steering angle within a range obtained offline using a 14 degrees-of-freedom vehicle dynamics model. Thus, a safe, high-speed navigation is enabled in an unstructured environment. Simulations of an AGV approaching multiple obstacles are provided to demonstrate the effectiveness of the algorithm.

  9. RECONSTRUCTION, QUANTIFICATION, AND VISUALIZATION OF FOREST CANOPY BASED ON 3D TRIANGULATIONS OF AIRBORNE LASER SCANNING POINT DATA

    Directory of Open Access Journals (Sweden)

    J. Vauhkonen

    2015-03-01

    Full Text Available Reconstruction of three-dimensional (3D) forest canopy is described and quantified using airborne laser scanning (ALS) data with densities of 0.6–0.8 points m-2 and field measurements aggregated at resolutions of 400–900 m2. The reconstruction was based on computational geometry, topological connectivity, and numerical optimization. More precisely, triangulations and their filtrations, i.e. ordered sets of simplices belonging to the triangulations, based on the point data were analyzed. Triangulating the ALS point data corresponds to subdividing the underlying space of the points into weighted simplicial complexes with weights quantifying the (empty) space delimited by the points. Reconstructing the canopy volume populated by biomass will thus likely require filtering to exclude that volume from canopy voids. The approaches applied for this purpose were (i) to optimize the degree of filtration with respect to the field measurements, and (ii) to predict this degree by means of analyzing the persistent homology of the obtained triangulations, which is applied for the first time for vegetation point clouds. When derived from optimized filtrations, the total tetrahedral volume had a high degree of determination (R2) with the stem volume considered, both alone (R2 = 0.65) and together with other predictors (R2 = 0.78). When derived by analyzing the topological persistence of the point data and without any field input, the R2 were lower, but the predictions still showed a correlation with the field-measured stem volumes. Finally, producing realistic visualizations of a forested landscape using the persistent homology approach is demonstrated.

  10. Laser triangulation method for measuring the size of parking claw

    Science.gov (United States)

    Liu, Bo; Zhang, Ming; Pang, Ying

    2017-10-01

    With the development of science and technology and the maturing of measurement techniques, 3D profile measurement technology has developed rapidly. Three-dimensional measurement technology is widely used in mold manufacturing, industrial inspection, automated processing and manufacturing, etc. In many situations in scientific research and industrial production, it is necessary to transform original mechanical parts into 3D data models on the computer quickly and accurately. At present, many methods have been developed to measure contour size; laser triangulation is one of the most widely used.

  11. Automated Photogrammetric Image Matching with Sift Algorithm and Delaunay Triangulation

    DEFF Research Database (Denmark)

    Karagiannis, Georgios; Antón Castro, Francesc/François; Mioc, Darka

    2016-01-01

    An algorithm for image matching of multi-sensor and multi-temporal satellite images is developed. The method is based on the SIFT feature detector proposed by Lowe in (Lowe, 1999). First, SIFT feature points are detected independently in two images (reference and sensed image). The features detec...... of each feature set for each image are computed. The isomorphism of the Delaunay triangulations is determined to guarantee the quality of the image matching. The algorithm is implemented in Matlab and tested on World-View 2, SPOT6 and TerraSAR-X image patches....
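
    A rough sketch of such a pipeline (SIFT matching followed by Delaunay triangulation of the matched points) is given below using OpenCV and SciPy; the file names are placeholders and the isomorphism check is reduced to a simple shared-edge count, which is only a loose analogue of the test described above.

      # Sketch: detect SIFT features in two images, match them, triangulate both matched
      # point sets with Delaunay, and compare the triangulations edge by edge.
      import cv2
      import numpy as np
      from scipy.spatial import Delaunay

      img1 = cv2.imread("reference.tif", cv2.IMREAD_GRAYSCALE)   # placeholder file names
      img2 = cv2.imread("sensed.tif", cv2.IMREAD_GRAYSCALE)

      sift = cv2.SIFT_create()
      kp1, des1 = sift.detectAndCompute(img1, None)
      kp2, des2 = sift.detectAndCompute(img2, None)

      matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
      matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

      pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
      pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
      tri1, tri2 = Delaunay(pts1), Delaunay(pts2)

      def edge_set(tri):
          edges = set()
          for a, b, c in tri.simplices:
              edges |= {frozenset(p) for p in ((a, b), (b, c), (a, c))}
          return edges

      shared = len(edge_set(tri1) & edge_set(tri2))
      print(f"matched points: {len(matches)}, shared triangulation edges: {shared}")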

  12. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  13. Restoration of an object from its complex cross sections and surface smoothing of the object

    International Nuclear Information System (INIS)

    Agui, Takeshi; Arai, Kiyoshi; Nakajima, Masayuki

    1990-01-01

    In clinical medicine, restoring the surface of a three-dimensional object from a set of parallel cross sections obtained by CT or MRI is useful in diagnosis. A method that connects pairs of contours on neighboring cross sections by triangular patches is generally used for this restoration. This method, however, suffers from the complexity of the triangulation algorithm and requires a large number of calculations when surface smoothing is executed. In our new method, the positions of sampling points are expressed in cylindrical coordinates. Sampling points, including auxiliary points, are extracted and connected using a simple algorithm, and surface smoothing is executed by moving sampling points. This method extends the application scope of restoring objects by triangulation. (author)
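
    The cylindrical-coordinate idea can be sketched as follows: resampling each contour at the same fixed angular positions makes corresponding points on neighbouring cross sections explicit, so they can be connected by triangles with a trivial rule and smoothed by simply moving the sampled points. This is an illustration of the general approach, not the authors' exact algorithm; the contours are invented.

      # Sketch: resample two stacked contours at common angles (cylindrical coordinates)
      # and connect them with triangular patches.
      import numpy as np

      N_ANGLES = 36
      angles = np.linspace(0.0, 2.0 * np.pi, N_ANGLES, endpoint=False)

      def resample(radius_of_theta, z):
          """Sample a contour given as r(theta) at fixed angles, returning xyz points."""
          r = radius_of_theta(angles)
          return np.column_stack([r * np.cos(angles), r * np.sin(angles), np.full(N_ANGLES, z)])

      lower = resample(lambda t: 10.0 + np.cos(3 * t), z=0.0)     # placeholder contour
      upper = resample(lambda t: 9.0 + np.sin(2 * t), z=1.0)      # placeholder contour

      triangles = []                        # index triples into np.vstack([lower, upper])
      for i in range(N_ANGLES):
          j = (i + 1) % N_ANGLES
          triangles.append((i, j, N_ANGLES + i))             # lower edge + upper vertex
          triangles.append((j, N_ANGLES + j, N_ANGLES + i))  # upper edge + lower vertex

      vertices = np.vstack([lower, upper])
      print(f"{len(vertices)} vertices, {len(triangles)} triangular patches")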

  14. Qualitative to quantitative: linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India.

    Science.gov (United States)

    Bailey, Ajay; Hutter, Inge

    2008-10-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply HBM have been largely quantitative and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge is in triangulating the methods. Method triangulation has been largely used to combine insights from the qualitative and quantitative methods but not to link both the methods. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims to first gather individual level information through in-depth interviews and then to present the information as vignettes in focus group discussions. We thus validate information obtained from in-depth interviews and gather emic concepts that arise from the interaction. We thus capture both the interpretation and the interaction angles of the qualitative method. Further, using the qualitative information gained, a survey is designed. In doing so, the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies, first an explorative qualitative study (2003), second a larger study (2004-2005), including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative to quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.

  15. Nonlinear Projective-Iteration Methods for Solving Transport Problems on Regular and Unstructured Grids

    International Nuclear Information System (INIS)

    Dmitriy Y. Anistratov; Adrian Constantinescu; Loren Roberts; William Wieselquist

    2007-01-01

    This is a project in the field of fundamental research on numerical methods for solving the particle transport equation. Numerous practical problems require the use of unstructured meshes, for example, detailed nuclear reactor assembly-level calculations, large-scale reactor core calculations, radiative hydrodynamics problems where the mesh is determined by hydrodynamic processes, and well-logging problems in which the media structure has a very complicated geometry. Currently this is an area of very active research in numerical transport theory. The main issues in developing numerical methods for solving the transport equation are the accuracy of the numerical solution and the effectiveness of the iteration procedure. The problem in the case of unstructured grids is that it is very difficult to derive an iteration algorithm that will be unconditionally stable

  16. Unstructured Grid Euler Method Assessment for Aerodynamics Performance Prediction of the Complete TCA Configuration at Supersonic Cruise Speed

    Science.gov (United States)

    Ghaffari, Farhad

    1999-01-01

    Unstructured grid Euler computations, performed at supersonic cruise speed, are presented for a proposed high speed civil transport configuration, designated as the Technology Concept Airplane (TCA) within the High Speed Research (HSR) Program. The numerical results are obtained for the complete TCA cruise configuration which includes the wing, fuselage, empennage, diverters, and flow through nacelles at Mach 2.4 for a range of angles-of-attack and sideslip. The computed surface and off-surface flow characteristics are analyzed and the pressure coefficient contours on the wing lower surface are shown to correlate reasonably well with the available pressure sensitive paint results, particularly, for the complex shock wave structures around the nacelles. The predicted longitudinal and lateral/directional performance characteristics are shown to correlate very well with the measured data across the examined range of angles-of-attack and sideslip. The results from the present effort have been documented into a NASA Controlled-Distribution report which is being presently reviewed for publication.

  17. A Solution Adaptive Structured/Unstructured Overset Grid Flow Solver with Applications to Helicopter Rotor Flows

    Science.gov (United States)

    Duque, Earl P. N.; Biswas, Rupak; Strawn, Roger C.

    1995-01-01

    This paper summarizes a method that solves both the three dimensional thin-layer Navier-Stokes equations and the Euler equations using overset structured and solution adaptive unstructured grids with applications to helicopter rotor flowfields. The overset structured grids use an implicit finite-difference method to solve the thin-layer Navier-Stokes/Euler equations while the unstructured grid uses an explicit finite-volume method to solve the Euler equations. Solutions on a helicopter rotor in hover show the ability to accurately convect the rotor wake. However, isotropic subdivision of the tetrahedral mesh rapidly increases the overall problem size.

  18. A Level Set Discontinuous Galerkin Method for Free Surface Flows

    DEFF Research Database (Denmark)

    Grooss, Jesper; Hesthaven, Jan

    2006-01-01

    We present a discontinuous Galerkin method on a fully unstructured grid for the modeling of unsteady incompressible fluid flows with free surfaces. The surface is modeled by embedding and represented by a levelset. We discuss the discretization of the flow equations and the level set equation...

  19. Coronary artery disease risk assessment from unstructured electronic health records using text mining.

    Science.gov (United States)

    Jonnagaddala, Jitendra; Liaw, Siaw-Teng; Ray, Pradeep; Kumar, Manish; Chang, Nai-Wen; Dai, Hong-Jie

    2015-12-01

    Coronary artery disease (CAD) often leads to myocardial infarction, which may be fatal. Risk factors can be used to predict CAD, which may subsequently lead to prevention or early intervention. Patient data such as co-morbidities, medication history, social history and family history are required to determine the risk factors for a disease. However, risk factor data are usually embedded in unstructured clinical narratives if the data is not collected specifically for risk assessment purposes. Clinical text mining can be used to extract data related to risk factors from unstructured clinical notes. This study presents methods to extract Framingham risk factors from unstructured electronic health records using clinical text mining and to calculate 10-year coronary artery disease risk scores in a cohort of diabetic patients. We developed a rule-based system to extract risk factors: age, gender, total cholesterol, HDL-C, blood pressure, diabetes history and smoking history. The results showed that the output from the text mining system was reliable, but there was a significant amount of missing data to calculate the Framingham risk score. A systematic approach for understanding missing data was followed by implementation of imputation strategies. An analysis of the 10-year Framingham risk scores for coronary artery disease in this cohort has shown that the majority of the diabetic patients are at moderate risk of CAD. Copyright © 2015 Elsevier Inc. All rights reserved.
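
    A toy sketch of the rule-based extraction step is shown below: regular expressions over a clinical narrative pull out a few risk-factor mentions. The patterns and the example note are invented for illustration and are far simpler than the system described above, which also needs negation handling, section detection and unit normalization.

      # Sketch: rule-based extraction of a few risk-factor mentions from free text.
      import re

      NOTE = ("68 yo male with type 2 diabetes. BP 142/88 today. "
              "Total cholesterol 212 mg/dL, HDL 38 mg/dL. Former smoker, quit 2005.")

      RULES = {                                   # hypothetical patterns, not the paper's
          "age":               r"\b(\d{2})\s*(?:yo|y/o|years? old)\b",
          "blood_pressure":    r"\bBP\s*(\d{2,3})\s*/\s*(\d{2,3})\b",
          "total_cholesterol": r"\btotal cholesterol\s*(\d{2,3})\b",
          "hdl":               r"\bHDL\s*(\d{2,3})\b",
          "smoking":           r"\b(current|former|never)\s+smoker\b",
          "diabetes":          r"\bdiabet\w*\b",
      }

      def extract(note):
          found = {}
          for factor, pattern in RULES.items():
              m = re.search(pattern, note, flags=re.IGNORECASE)
              if m:
                  found[factor] = m.groups() if m.groups() else m.group(0)
          return found

      print(extract(NOTE))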

  20. The Marginalized "Model" Minority: An Empirical Examination of the Racial Triangulation of Asian Americans

    Science.gov (United States)

    Xu, Jun; Lee, Jennifer C.

    2013-01-01

    In this article, we propose a shift in race research from a one-dimensional hierarchical approach to a multidimensional system of racial stratification. Building upon Claire Kim's (1999) racial triangulation theory, we examine how the American public rates Asians relative to blacks and whites along two dimensions of racial stratification: racial…

  1. Using Text Analytics to Derive Customer Service Management Benefits from Unstructured Data

    DEFF Research Database (Denmark)

    Müller, Oliver; Junglas, Iris; Debortoli, Stefan

    2016-01-01

    Deriving value from structured data is now commonplace. The value of unstructured textual data, however, remains mostly untapped and often unrecognized. This article describes the text analytics journeys of three organizations in the customer service management area. Based on their experiences, we...

  2. Lymphoscintigraphy and triangulated body marking for morbidity reduction during sentinel node biopsy in breast cancer.

    Science.gov (United States)

    Krynyckyi, Borys R; Shafir, Michail K; Kim, Suk Chul; Kim, Dong Wook; Travis, Arlene; Moadel, Renee M; Kim, Chun K

    2005-11-08

    Current trends in patient care include the desire for minimizing invasiveness of procedures and interventions. This aim is reflected in the increasing utilization of sentinel lymph node biopsy, which results in a lower level of morbidity in breast cancer staging, in comparison to extensive conventional axillary dissection. Optimized lymphoscintigraphy with triangulated body marking is a clinical option that can further reduce morbidity, more than when a hand held gamma probe alone is utilized. Unfortunately it is often either overlooked or not fully understood, and thus not utilized. This results in the unnecessary loss of an opportunity to further reduce morbidity. Optimized lymphoscintigraphy and triangulated body marking provides a detailed 3 dimensional map of the number and location of the sentinel nodes, available before the first incision is made. The number, location, relevance based on time/sequence of appearance of the nodes, all can influence 1) where the incision is made, 2) how extensive the dissection is, and 3) how many nodes are removed. In addition, complex patterns can arise from injections. These include prominent lymphatic channels, pseudo-sentinel nodes, echelon and reverse echelon nodes and even contamination, which are much more difficult to access with the probe only. With the detailed information provided by optimized lymphoscintigraphy and triangulated body marking, the surgeon can approach the axilla in a more enlightened fashion, in contrast to when the less informed probe only method is used. This allows for better planning, resulting in the best cosmetic effect and less trauma to the tissues, further reducing morbidity while maintaining adequate sampling of the sentinel node(s).

  3. MPI to Coarray Fortran: Experiences with a CFD Solver for Unstructured Meshes

    Directory of Open Access Journals (Sweden)

    Anuj Sharma

    2017-01-01

    High-resolution numerical methods and unstructured meshes are required in many applications of Computational Fluid Dynamics (CFD). These methods are quite computationally expensive and hence benefit from being parallelized. Message Passing Interface (MPI) has been utilized traditionally as a parallelization strategy. However, the inherent complexity of MPI contributes further to the existing complexity of the CFD scientific codes. The Partitioned Global Address Space (PGAS) parallelization paradigm was introduced in an attempt to improve the clarity of the parallel implementation. We present our experiences of converting an unstructured high-resolution compressible Navier-Stokes CFD solver from MPI to PGAS Coarray Fortran. We present the challenges, methodology, and performance measurements of our approach using Coarray Fortran. With the Cray compiler, we observe Coarray Fortran as a viable alternative to MPI. We are hopeful that Intel and open-source implementations could be utilized in the future.

  4. Linear Discontinuous Expansion Method using the Subcell Balances for Unstructured Geometry SN Transport

    International Nuclear Information System (INIS)

    Hong, Ser Gi; Kim, Jong Woon; Lee, Young Ouk; Kim, Kyo Youn

    2010-01-01

    The subcell balance methods have been developed for one- and two-dimensional SN transport calculations. In this paper, a linear discontinuous expansion method using sub-cell balances (LDEM-SCB) is developed for neutral particle SN transport calculations in 3D unstructured geometrical problems. At present, this method is applied to tetrahedral meshes. As the name implies, this method assumes a linear distribution of the particle flux in each tetrahedral mesh and uses the balance equations for the four sub-cells of each tetrahedral mesh to obtain the equations for the four sub-cell average fluxes, which are the unknowns. This method was implemented in the computer code MUST (Multi-group Unstructured geometry SN Transport). The numerical tests show that this method gives more robust solutions than DFEM (Discontinuous Finite Element Method)

  5. First Instances of Generalized Expo-Rational Finite Elements on Triangulations

    Science.gov (United States)

    Dechevsky, Lubomir T.; Zanaty, Peter; Lakså, Arne; Bang, Børre

    2011-12-01

    In this communication we consider a construction of simplicial finite elements on triangulated two-dimensional polygonal domains. This construction is, in some sense, dual to the construction of generalized expo-rational B-splines (GERBS). The main result is the construction of new polynomial simplicial patches of the first several lowest possible total polynomial degrees which exhibit Hermite interpolatory properties. The derivation of these results is based on the theory of piecewise polynomial GERBS called Euler Beta-function B-splines. We also provide 3-dimensional visualization of the graphs of the new polynomial simplicial patches and their control polygons.

  6. Options for a health system researcher to choose in Meta Review (MR) approaches - Meta Narrative (MN) and Meta Triangulation (MT)

    Directory of Open Access Journals (Sweden)

    Sanjeev Davey

    2015-01-01

    Two new approaches in systematic reviewing are ready for adoption in health system research: the meta-narrative review (MNR), which a health researcher can use for topics that are conceptualized and studied differently by different types of researchers for policy decisions, and the meta-triangulation review (MTR), which is done to build theory for studying multifaceted phenomena characterized by expansive and contested research domains. A critical look at which meta-review approach is better, the meta-narrative review or the meta-triangulation review, can therefore give new insights to a health system researcher. A systematic review of the two key words "meta-narrative review" and "meta-triangulation review" in health system research was carried out using key search engines such as PubMed, the Cochrane Library, BioMed Central and Google Scholar, covering the 20 years up to 21 March 2014. Studies from both the developed and the developing world were included in any form and scope to draw final conclusions; however, unpublished data from theses were not included in the systematic review. The meta-narrative review is a type of systematic review that can be used for a wide range of topics and questions involving judgments and inferences in public health. The meta-triangulation review, on the other hand, is a three-phased, qualitative meta-analysis process that can be used to explore variations in the assumptions of alternative paradigms, gain insights into these multiple paradigms at one point in time, and address emerging themes and the resulting theories.

  7. Making the Most of Obesity Research: Developing Research and Policy Objectives through Evidence Triangulation

    Science.gov (United States)

    Oliver, Kathryn; Aicken, Catherine; Arai, Lisa

    2013-01-01

    Drawing lessons from research can help policy makers make better decisions. If a large and methodologically varied body of research exists, as with childhood obesity, this is challenging. We present new research and policy objectives for child obesity developed by triangulating user involvement data with a mapping study of interventions aimed at…

  8. The Study Related to the Execution of a Triangulation Network in the Dump of Rovinari Pit, in Order to be Restored to the Economic Circuit

    Directory of Open Access Journals (Sweden)

    George Popescu

    2016-11-01

    The lignite extraction within the mining perimeter in Rovinari is carried out through open-pit mining works, using large equipment for the excavation, transport and storage of the mined material. These surfaces are currently being set up in the area of level two of the dump, in the west and north-west part of the Rovinari pit. In order to carry out the set-up works and to follow up the stability of the pit levels, it is necessary to maintain the triangulation network.

  9. Unstructured Mesh Movement and Viscous Mesh Generation for CFD-Based Design Optimization, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovations proposed are twofold: 1) a robust unstructured mesh movement method able to handle isotropic (Euler), anisotropic (viscous), mixed element (hybrid)...

  10. A point-centered diffusion differencing for unstructured meshes in 3-D

    International Nuclear Information System (INIS)

    Palmer, T.S.

    1994-01-01

    We describe a point-centered diffusion discretization for 3-D unstructured meshes of polyhedra. The method has several attractive qualities, including second-order accuracy and preservation of linear solutions. A potential drawback to the scheme is that the diffusion matrix is asymmetric, in general. Results of numerical test problems illustrate the behavior of the scheme

  11. Incorporating Unstructured Socializing Into the Study of Secondary Exposure to Community Violence: Etiological and Empirical Implications.

    Science.gov (United States)

    Zimmerman, Gregory M; Messner, Steven F; Rees, Carter

    2014-07-01

    Secondary exposure to community violence, defined as witnessing or hearing violence in the community, has the potential to profoundly impact long-term development, health, happiness, and security. While research has explored pathways to community violence exposure at the individual, family, and neighborhood levels, prior work has largely neglected situational factors conducive to secondary violence exposure. The present study evaluates "unstructured socializing with peers in the absence of authority figures" as a situational process that has implications for secondary exposure to violence. Results indicate that a measure of unstructured socializing was significantly associated with exposure to violence, net of an array of theoretically relevant covariates of violence exposure. Moreover, the relationships between exposure to violence and three of the most well-established correlates of violence exposure in the literature-age, male, and prior violence-were mediated to varying degrees by unstructured socializing. The results suggest a more nuanced approach to the study of secondary violence exposure that expands the focus of attention beyond individual and neighborhood background factors to include situational opportunities presented by patterns of everyday activities. © The Author(s) 2013.

  12. On the application of Chimera/unstructured hybrid grids for conjugate heat transfer

    Science.gov (United States)

    Kao, Kai-Hsiung; Liou, Meng-Sing

    1995-01-01

    A hybrid grid system that combines the Chimera overset grid scheme and an unstructured grid method is developed to study fluid flow and heat transfer problems. With the proposed method, the solid structural region, in which only the heat conduction is considered, can be easily represented using an unstructured grid method. As for the fluid flow region external to the solid material, the Chimera overset grid scheme has been shown to be very flexible and efficient in resolving complex configurations. The numerical analyses require the flow field solution and material thermal response to be obtained simultaneously. A continuous transfer of temperature and heat flux is specified at the interface, which connects the solid structure and the fluid flow as an integral system. Numerical results are compared with analytical and experimental data for a flat plate and a C3X cooled turbine cascade. A simplified drum-disk system is also simulated to show the effectiveness of this hybrid grid system.

  13. Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids

    Science.gov (United States)

    Nielsen, Eric J.; Diskin, Boris

    2012-01-01

    A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.

  14. Discrete Adjoint-Based Design Optimization of Unsteady Turbulent Flows on Dynamic Unstructured Grids

    Science.gov (United States)

    Nielsen, Eric J.; Diskin, Boris; Yamaleev, Nail K.

    2009-01-01

    An adjoint-based methodology for design optimization of unsteady turbulent flows on dynamic unstructured grids is described. The implementation relies on an existing unsteady three-dimensional unstructured grid solver capable of dynamic mesh simulations and discrete adjoint capabilities previously developed for steady flows. The discrete equations for the primal and adjoint systems are presented for the backward-difference family of time-integration schemes on both static and dynamic grids. The consistency of sensitivity derivatives is established via comparisons with complex-variable computations. The current work is believed to be the first verified implementation of an adjoint-based optimization methodology for the true time-dependent formulation of the Navier-Stokes equations in a practical computational code. Large-scale shape optimizations are demonstrated for turbulent flows over a tiltrotor geometry and a simulated aeroelastic motion of a fighter jet.

  15. An overset algorithm for 3D unstructured grids

    International Nuclear Information System (INIS)

    Pishevar, A.R.; Shateri, A.R.

    2004-01-01

    In this paper a new methodology is introduced to simulate flows around complex geometries by using overset unstructured grids. The proposed algorithm can also be used for unsteady flows about objects in relative motion. In such a case, since the elements are not deformed during the computation, the costly part of conventional methods, re-meshing, is avoided. This method relies on the inter-grid boundary definition to establish communications among independent grids in the overset system. At the end, the Euler set of equations is integrated on several overset systems to examine the capabilities of this methodology. (author)

  16. Feminist Approaches to Triangulation: Uncovering Subjugated Knowledge and Fostering Social Change in Mixed Methods Research

    Science.gov (United States)

    Hesse-Biber, Sharlene

    2012-01-01

    This article explores the deployment of triangulation in the service of uncovering subjugated knowledge and promoting social change for women and other oppressed groups. Feminist approaches to mixed methods praxis create a tight link between the research problem and the research design. An analysis of selected case studies of feminist praxis…

  17. Hand-held triangulation laser profilometer with audio output for blind people

    Science.gov (United States)

    Farcy, R.; Damaschini, R.

    1998-06-01

    We describe a device currently under industrial development which will give the blind a means of three-dimensional space perception. It consists of a 350 g hand-held triangulating laser telemeter including electronic parts and batteries, with auditory feedback either inside the apparatus or close to the ear. The microprocessor unit converts the distance measured by the telemeter into a musical note in real time. Scanning the space with an adequate movement of the hand produces musical lines corresponding to the profiles of the environment. We discuss the optical configuration of the system relative to our first year of clinical experimentation.

  18. Divergence-free MHD on unstructured meshes using high order finite volume schemes based on multidimensional Riemann solvers

    Science.gov (United States)

    Balsara, Dinshaw S.; Dumbser, Michael

    2015-10-01

    Several advances have been reported in the recent literature on divergence-free finite volume schemes for Magnetohydrodynamics (MHD). Almost all of these advances are restricted to structured meshes. To retain full geometric versatility, however, it is also very important to make analogous advances in divergence-free schemes for MHD on unstructured meshes. Such schemes utilize a staggered Yee-type mesh, where all hydrodynamic quantities (mass, momentum and energy density) are cell-centered, while the magnetic fields are face-centered and the electric fields, which are so useful for the time update of the magnetic field, are centered at the edges. Three important advances are brought together in this paper in order to make it possible to have high order accurate finite volume schemes for the MHD equations on unstructured meshes. First, it is shown that a divergence-free WENO reconstruction of the magnetic field can be developed for unstructured meshes in two and three space dimensions using a classical cell-centered WENO algorithm, without the need to do a WENO reconstruction for the magnetic field on the faces. This is achieved via a novel constrained L2-projection operator that is used in each time step as a postprocessor of the cell-centered WENO reconstruction so that the magnetic field becomes locally and globally divergence free. Second, it is shown that recently-developed genuinely multidimensional Riemann solvers (called MuSIC Riemann solvers) can be used on unstructured meshes to obtain a multidimensionally upwinded representation of the electric field at each edge. Third, the above two innovations work well together with a high order accurate one-step ADER time stepping strategy, which requires the divergence-free nonlinear WENO reconstruction procedure to be carried out only once per time step. The resulting divergence-free ADER-WENO schemes with MuSIC Riemann solvers give us an efficient and easily-implemented strategy for divergence-free MHD on

  19. An improved triangulation laser rangefinder using a custom CMOS HDR linear image sensor

    Science.gov (United States)

    Liscombe, Michael

    3-D triangulation laser rangefinders are used in many modern applications, from terrain mapping to biometric identification. Although a wide variety of designs have been proposed, laser speckle noise still provides a fundamental limitation on range accuracy. This work proposes a new triangulation laser rangefinder designed specifically to mitigate the effects of laser speckle noise. The proposed rangefinder uses a precision linear translator to laterally reposition the imaging system (e.g., image sensor and imaging lens). For a given spatial location of the laser spot, capturing N spatially uncorrelated laser spot profiles is shown to improve range accuracy by a factor of N. This technique has many advantages over past speckle-reduction technologies, such as a fixed system cost and form factor, and the ability to virtually eliminate laser speckle noise. These advantages are made possible through spatial diversity and come at the cost of increased acquisition time. The rangefinder makes use of the ICFYKWG1 linear image sensor, a custom CMOS sensor developed at the Vision Sensor Laboratory (York University). Tests are performed on the image sensor's innovative high dynamic range technology to determine its effects on range accuracy. As expected, experimental results have shown that the sensor provides a trade-off between dynamic range and range accuracy.

  20. From causal dynamical triangulations to astronomical observations

    Science.gov (United States)

    Mielczarek, Jakub

    2017-09-01

    This letter discusses phenomenological aspects of dimensional reduction predicted by the Causal Dynamical Triangulations (CDT) approach to quantum gravity. The deformed form of the dispersion relation for the fields defined on the CDT space-time is reconstructed. Using the Fermi satellite observations of the GRB 090510 source we find that the energy scale of the dimensional reduction is $E_* > 0.7 \sqrt{4-d_{\text{UV}}} \cdot 10^{10}\,\text{GeV}$ at 95% CL, where $d_{\text{UV}}$ is the value of the spectral dimension in the UV limit. By applying the deformed dispersion relation to the cosmological perturbations it is shown that, for a scenario in which the primordial perturbations are formed in the UV region, the scalar power spectrum is $P_S \propto k^{n_S-1}$, where $n_S - 1 \approx \frac{3 r (d_{\text{UV}}-2)}{(d_{\text{UV}}-1) r - 48}$. Here, $r$ is the tensor-to-scalar ratio. We find that, within the considered model, the deviation from scale invariance ($n_S = 1$) predicted by CDT is in contradiction with the up-to-date Planck and BICEP2 data.

  1. Fixed-topology Lorentzian triangulations: Quantum Regge Calculus in the Lorentzian domain

    Science.gov (United States)

    Tate, Kyle; Visser, Matt

    2011-11-01

    A key insight used in developing the theory of Causal Dynamical Triangulations (CDTs) is to use the causal (or light-cone) structure of Lorentzian manifolds to restrict the class of geometries appearing in the Quantum Gravity (QG) path integral. By exploiting this structure the models developed in CDTs differ from the analogous models developed in the Euclidean domain, models of (Euclidean) Dynamical Triangulations (DT), and the corresponding Lorentzian results are in many ways more "physical". In this paper we use this insight to formulate a Lorentzian signature model that is analogous to the Quantum Regge Calculus (QRC) approach to Euclidean Quantum Gravity. We exploit another crucial fact about the structure of Lorentzian manifolds, namely that certain simplices are not constrained by the triangle inequalities present in Euclidean signature. We show that this model is not related to QRC by a naive Wick rotation; this serves as another demonstration that the sum over Lorentzian geometries is not simply related to the sum over Euclidean geometries. By removing the triangle inequality constraints, there is more freedom to perform analytical calculations, and in addition numerical simulations are more computationally efficient. We first formulate the model in 1 + 1 dimensions, and derive scaling relations for the pure gravity path integral on the torus using two different measures. It appears relatively easy to generate "large" universes, both in spatial and temporal extent. In addition, loop-to-loop amplitudes are discussed, and a transfer matrix is derived. We then also discuss the model in higher dimensions.

  2. Exploring Shared-Memory Optimizations for an Unstructured Mesh CFD Application on Modern Parallel Systems

    KAUST Repository

    Mudigere, Dheevatsa; Sridharan, Srinivas; Deshpande, Anand; Park, Jongsoo; Heinecke, Alexander; Smelyanskiy, Mikhail; Kaul, Bharat; Dubey, Pradeep; Kaushik, Dinesh; Keyes, David E.

    2015-01-01

    -grid implicit flow solver, which forms the backbone of computational aerodynamics, poses particular challenges due to its large irregular working sets, unstructured memory accesses, and variable/limited amount of parallelism. This code, based on a domain

  3. Depth measurements of drilled holes in bone by laser triangulation for the field of oral implantology

    Science.gov (United States)

    Quest, D.; Gayer, C.; Hering, P.

    2012-01-01

    Laser osteotomy is one possible method of preparing beds for dental implants in the human jaw. A major problem in using this contactless treatment modality is the lack of haptic feedback to control the depth while drilling the implant bed. A contactless measurement system called laser triangulation is presented as a new procedure to overcome this problem. Together with a tomographic picture the actual position of the laser ablation in the bone can be calculated. Furthermore, the laser response is sufficiently fast as to pose little risk to surrounding sensitive areas such as nerves and blood vessels. In the jaw two different bone structures exist, namely the cancellous bone and the compact bone. Samples of both bone structures were examined with test drillings performed either by laser osteotomy or by a conventional rotating drilling tool. The depth of these holes was measured using laser triangulation. The results and the setup are reported in this study.

  4. Sensor-based whole-arm obstacle avoidance for unstructured environments

    International Nuclear Information System (INIS)

    Wintenberg, AL.; Butler, P.L.; Babcock, S.M.; Ericson, M.N.; Britton, C.L. Jr.; Hamel, W.R.

    1992-01-01

    Whole-arm obstacle avoidance is needed for a variety of robotic applications in the Environmental Restoration and Waste Management (ER&WM) Program. Typical industrial applications of robotics involve well-defined work spaces, allowing a predetermined knowledge of collision-free paths for manipulator motion. In the unstructured or poorly defined hazardous environments of the ER&WM program, the potential for significant problems resulting from collisions between manipulators and the environment in which they are utilized is great. A sensing system under development, which will provide protection against such collisions, is described in this paper

  5. Node Discovery and Interpretation in Unstructured Resource-Constrained Environments

    DEFF Research Database (Denmark)

    Gechev, Miroslav; Kasabova, Slavyana; Mihovska, Albena D.

    2014-01-01

    for the discovery, linking and interpretation of nodes in unstructured and resource-constrained network environments and their interrelated and collective use for the delivery of smart services. The model is based on a basic mathematical approach, which describes and predicts the success of human interactions...... in the context of long-term relationships and identifies several key variables in the context of communications in resource-constrained environments. The general theoretical model is described and several algorithms are proposed as part of the node discovery, identification, and linking processes in relation...

  6. Spectral triangulation molecular contrast optical coherence tomography with indocyanine green as the contrast agent

    OpenAIRE

    Yang, Changhuei; McGuckin, Laura E. L.; Simon, John D.; Choma, Michael A.; Applegate, Brian E.; Izatt, Joseph A.

    2004-01-01

    We report a new molecular contrast optical coherence tomography (MCOCT) implementation that profiles the contrast agent distribution in a sample by measuring the agent's spectral differential absorption. The method, spectral triangulation MCOCT, can effectively suppress contributions from spectrally dependent scattering from the sample without a priori knowledge of the scattering properties. We demonstrate molecular imaging with this new MCOCT modality by mapping the distribution of indocyani...

  7. The impact of the unstructured contacts component in influenza pandemic modeling.

    Directory of Open Access Journals (Sweden)

    Marco Ajelli

    Individual based models have become a valuable tool for modeling the spatiotemporal dynamics of epidemics, e.g. an influenza pandemic, and for evaluating the effectiveness of intervention strategies. While specific contacts among individuals in diverse environments (family, school/workplace) can be modeled in a standard way by employing available socio-demographic data, all the other (unstructured) contacts can be dealt with by adopting very different approaches. This can be achieved for instance by employing distance-based models or by choosing unstructured contacts in the local communities or by employing commuting data. Here we show how diverse choices can lead to different model outputs and thus to a different evaluation of the effectiveness of the containment/mitigation strategies. Sensitivity analysis has been conducted for different values of the first generation index G(0), which is the average number of secondary infections generated by the first infectious individual in a completely susceptible population, and by varying the seeding municipality. Among the different considered models, the attack rate ranges from 19.1% to 25.7% for G(0) = 1.1, from 47.8% to 50.7% for G(0) = 1.4 and from 62.4% to 67.8% for G(0) = 1.7. Differences of about 15 to 20 days in the peak day have been observed. As regards spatial diffusion, a difference of about 100 days to cover 200 km for different values of G(0) has been observed. To reduce uncertainty in the models it is thus important to employ data, which are starting to become available, on contacts in neglected but important activities (leisure time, sport, malls, restaurants, etc.) and time-use data for improving the characterization of the unstructured contacts. Moreover, all the possible effects of different assumptions should be considered when taking public health decisions: not only should sensitivity analysis to various model parameters be performed, but intervention options should be based on the analysis and
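
    The dependence of the overall attack rate on the first generation index quoted above can be illustrated with the standard homogeneous-mixing final-size relation, a deliberately crude analogue of the individual-based simulations in the paper (the fixed-point iteration and starting guess are implementation choices, not taken from the study).

      # Sketch: solve the classical final-size equation z = 1 - exp(-R0 * z) by fixed-point
      # iteration, as a homogeneous-mixing analogue of the attack rates quoted above.
      import math

      def attack_rate(r0, tol=1e-10):
          z = 0.5                                  # initial guess
          for _ in range(1000):
              z_new = 1.0 - math.exp(-r0 * z)
              if abs(z_new - z) < tol:
                  break
              z = z_new
          return z

      for r0 in (1.1, 1.4, 1.7):
          print(f"index {r0}: homogeneous-mixing attack rate ~ {attack_rate(r0):.1%}")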

  8. How to Measure Quality of Service Using Unstructured Data Analysis: A General Method Design

    Directory of Open Access Journals (Sweden)

    Lucie Sperková

    2015-10-01

    The aim of the paper is to design a general method for measuring the quality of a service from the customer's point of view with the help of content analytics. A large amount of unstructured data is created by customers of the service, and this data can provide valuable feedback on service usage, as customers talk among themselves about their experiences and feelings from consuming the service. The design of the method is based on a systematic literature review in the areas of service quality and unstructured data analysis. Analytics and quality measurement models are collected and critically evaluated regarding their potential use for measuring IT service quality. The method can be used by an IT service provider to measure and monitor service quality based on word-of-mouth in order to support continual service improvement.
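
    A minimal sketch of the word-of-mouth scoring idea is given below: customer comments are scored against small positive and negative lexicons as a crude quality-of-service signal. The lexicons and comments are invented; a real implementation of the method would rest on the reviewed analytics and quality models.

      # Sketch: lexicon-based scoring of customer comments as a quality-of-service signal.
      POSITIVE = {"fast", "reliable", "helpful", "easy", "great"}
      NEGATIVE = {"slow", "outage", "broken", "confusing", "poor"}

      comments = [
          "Support was helpful and the portal is easy to use",
          "Another outage today, the service is slow and the app feels broken",
      ]

      def score(text):
          words = {w.strip(".,!?").lower() for w in text.split()}
          return len(words & POSITIVE) - len(words & NEGATIVE)

      for c in comments:
          print(score(c), "|", c)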

  9. Numerical methods and analysis of the nonlinear Vlasov equation on unstructured meshes of phase space

    International Nuclear Information System (INIS)

    Besse, Nicolas

    2003-01-01

    This work is dedicated to the mathematical and numerical studies of the Vlasov equation on phase-space unstructured meshes. In the first part, new semi-Lagrangian methods are developed to solve the Vlasov equation on unstructured meshes of phase space. As the Vlasov equation describes multi-scale phenomena, we also propose original methods based on a wavelet multi-resolution analysis. The resulting algorithm leads to an adaptive mesh-refinement strategy. The new massively parallel computers make it possible to use these methods with several phase-space dimensions. In particular, these numerical schemes are applied to plasma physics and charged particle beams in the case of two-, three-, and four-dimensional Vlasov-Poisson systems. In the second part we prove the convergence and give error estimates for several numerical schemes applied to the Vlasov-Poisson system when strong and classical solutions are considered. First we show the convergence of a semi-Lagrangian scheme on an unstructured mesh of phase space, when the regularity hypotheses for the initial data are minimal. Then we demonstrate the convergence of classes of high-order semi-Lagrangian schemes in the framework of the regular classical solution. In order to reconstruct the distribution function, we consider symmetrical Lagrange polynomials, B-splines and wavelet bases. Finally we prove the convergence of a semi-Lagrangian scheme with propagation of gradients yielding a high-order and stable reconstruction of the solution. (author)
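
    The backtracking-and-interpolation idea behind semi-Lagrangian schemes can be illustrated with a one-dimensional constant-coefficient advection step on a uniform periodic grid (much simpler than the unstructured phase-space meshes studied in this work; all numbers are arbitrary).

      # Sketch: 1D semi-Lagrangian advection of u_t + a u_x = 0 with periodic boundaries.
      # Each node value is found by tracing its characteristic back and interpolating.
      import numpy as np

      N, L, A, DT, STEPS = 200, 1.0, 0.7, 0.004, 250
      x = np.linspace(0.0, L, N, endpoint=False)
      u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial Gaussian bump

      for _ in range(STEPS):
          departure = (x - A * DT) % L             # foot of the characteristic (periodic)
          u = np.interp(departure, x, u, period=L) # linear interpolation at departure points

      print("total mass before/after:",
            np.exp(-200.0 * (x - 0.3) ** 2).sum(), u.sum())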

  10. Development and comparison of different spatial numerical schemes for the radiative transfer equation resolution using three-dimensional unstructured meshes

    International Nuclear Information System (INIS)

    Capdevila, R.; Perez-Segarra, C.D.; Oliva, A.

    2010-01-01

    In the present work four different spatial numerical schemes have been developed with the aim of reducing the false-scattering of the numerical solutions obtained with the discrete ordinates (DOM) and the finite volume (FVM) methods. These schemes have been designed specifically for unstructured meshes by means of the extrapolation of nodal values of intensity on the studied radiative direction. The schemes have been tested and compared in several 3D benchmark test cases using both structured orthogonal and unstructured grids.

  11. 1:500 Scale Aerial Triangulation Test with Unmanned Airship in Hubei Province

    International Nuclear Information System (INIS)

    Feifei, Xie; Zongjian, Lin; Dezhu, Gui

    2014-01-01

    A new UAVS (Unmanned Aerial Vehicle System) for low-altitude aerial photogrammetry is introduced for fine surveying and mapping, comprising the platform airship, the sensor system (a four-combined wide-angle camera) and the photogrammetry software MAP-AT. Based on a test of this system in Hubei province, covering the working condition of the airship, the quality of the image data and the data processing report, it is demonstrated that this low-altitude aerial photogrammetric system meets the precision requirements of 1:500 scale aerial triangulation. This work provides a possibility for fine surveying and mapping

  12. Triangulating laser profilometer as a navigational aid for the blind: optical aspects

    Science.gov (United States)

    Farcy, R.; Denise, B.; Damaschini, R.

    1996-03-01

    We propose a navigational aid approach for the blind that relies on active optical profilometry with real-time electrotactile interfacing on the skin. Here we are concerned with the optical parts of this system. We point out the particular requirements the profilometer must satisfy to meet the needs of blind people. We show experimentally that an adequate compromise is possible, consisting of a compact class I IR laser-diode triangulation profilometer with the following characteristics: angular resolution, a 20-ms acquisition time per distance measurement, and a 60-degree angular scanning field.

  13. Flow simulations about steady-complex and unsteady moving configurations using structured-overlapped and unstructured grids

    Science.gov (United States)

    Newman, James C., III

    1995-01-01

    The limiting factor in simulating flows past realistic configurations of interest has been the discretization of the physical domain on which the governing equations of fluid flow may be solved. In an attempt to circumvent this problem, many Computational Fluid Dynamic (CFD) methodologies that are based on different grid generation and domain decomposition techniques have been developed. However, due to the costs involved and expertise required, very few comparative studies between these methods have been performed. In the present work, the two CFD methodologies which show the most promise for treating complex three-dimensional configurations as well as unsteady moving boundary problems are evaluated. These are namely the structured-overlapped and the unstructured grid schemes. Both methods use a cell centered, finite volume, upwind approach. The structured-overlapped algorithm uses an approximately factored, alternating direction implicit scheme to perform the time integration, whereas the unstructured algorithm uses an explicit Runge-Kutta method. To examine the accuracy, efficiency, and limitations of each scheme, they are applied to the same steady complex multicomponent configurations and unsteady moving boundary problems. The steady complex cases consist of computing the subsonic flow about a two-dimensional high-lift multielement airfoil and the transonic flow about a three-dimensional wing/pylon/finned store assembly. The unsteady moving boundary problems are a forced pitching oscillation of an airfoil in a transonic freestream and a two-dimensional, subsonic airfoil/store separation sequence. Accuracy was assessed through the comparison of computed and experimentally measured pressure coefficient data on several of the wing/pylon/finned store assembly's components and at numerous angles-of-attack for the pitching airfoil. From this study, it was found that both the structured-overlapped and the unstructured grid schemes yielded flow solutions of

  14. Unstructured Cartesian refinement with sharp interface immersed boundary method for 3D unsteady incompressible flows

    Science.gov (United States)

    Angelidis, Dionysios; Chawdhary, Saurabh; Sotiropoulos, Fotis

    2016-11-01

    A novel numerical method is developed for solving the 3D, unsteady, incompressible Navier-Stokes equations on locally refined fully unstructured Cartesian grids in domains with arbitrarily complex immersed boundaries. Owing to the utilization of the fractional step method on an unstructured Cartesian hybrid staggered/non-staggered grid layout, flux mismatch and pressure discontinuity issues are avoided and the divergence free constraint is inherently satisfied to machine zero. Auxiliary/hanging nodes are used to facilitate the discretization of the governing equations. The second-order accuracy of the solver is ensured by using multi-dimension Lagrange interpolation operators and appropriate differencing schemes at the interface of regions with different levels of refinement. The sharp interface immersed boundary method is augmented with local near-boundary refinement to handle arbitrarily complex boundaries. The discrete momentum equation is solved with the matrix free Newton-Krylov method and the Krylov-subspace method is employed to solve the Poisson equation. The second-order accuracy of the proposed method on unstructured Cartesian grids is demonstrated by solving the Poisson equation with a known analytical solution. A number of three-dimensional laminar flow simulations of increasing complexity illustrate the ability of the method to handle flows across a range of Reynolds numbers and flow regimes. Laminar steady and unsteady flows past a sphere and the oblique vortex shedding from a circular cylinder mounted between two end walls demonstrate the accuracy, the efficiency and the smooth transition of scales and coherent structures across refinement levels. Large-eddy simulation (LES) past a miniature wind turbine rotor, parameterized using the actuator line approach, indicates the ability of the fully unstructured solver to simulate complex turbulent flows. Finally, a geometry resolving LES of turbulent flow past a complete hydrokinetic turbine illustrates

  15. Coupling an Unstructured NoSQL Database with a Geographic Information System

    OpenAIRE

    Holemans, Amandine; Kasprzyk, Jean-Paul; Donnay, Jean-Paul

    2018-01-01

    The management of unstructured NoSQL (Not only Structured Query Language) databases has undergone a great development in the last years mainly thanks to Big Data. Nevertheless, the specificity of spatial information is not purposely taken into account. To overcome this difficulty, we propose to couple a NoSQL database with a spatial Relational Data Base Management System (RDBMS). Exchanges of information between these two systems are illustrated with relevant examples ...

  16. Three-dimensional dynamic rupture simulation with a high-order discontinuous Galerkin method on unstructured tetrahedral meshes

    KAUST Repository

    Pelties, Christian

    2012-02-18

    Accurate and efficient numerical methods to simulate dynamic earthquake rupture and wave propagation in complex media and complex fault geometries are needed to address fundamental questions in earthquake dynamics, to integrate seismic and geodetic data into emerging approaches for dynamic source inversion, and to generate realistic physics-based earthquake scenarios for hazard assessment. Modeling of spontaneous earthquake rupture and seismic wave propagation by a high-order discontinuous Galerkin (DG) method combined with an arbitrarily high-order derivatives (ADER) time integration method was introduced in two dimensions by de la Puente et al. (2009). The ADER-DG method enables high accuracy in space and time and discretization by unstructured meshes. Here we extend this method to three-dimensional dynamic rupture problems. The high geometrical flexibility provided by the usage of tetrahedral elements and the lack of spurious mesh reflections in the ADER-DG method allows the refinement of the mesh close to the fault to model the rupture dynamics adequately while concentrating computational resources only where needed. Moreover, ADER-DG does not generate spurious high-frequency perturbations on the fault and hence does not require artificial Kelvin-Voigt damping. We verify our three-dimensional implementation by comparing results of the SCEC TPV3 test problem with two well-established numerical methods, finite differences, and spectral boundary integral. Furthermore, a convergence study is presented to demonstrate the systematic consistency of the method. To illustrate the capabilities of the high-order accurate ADER-DG scheme on unstructured meshes, we simulate an earthquake scenario, inspired by the 1992 Landers earthquake, that includes curved faults, fault branches, and surface topography. Copyright 2012 by the American Geophysical Union.

  17. A finite element formulation of the Darwin electromagnetic PIC model for unstructured meshes of triangles

    International Nuclear Information System (INIS)

    Sonnendrucker, E.; Ambrosiano, J.; Brandon, S.

    1993-01-01

    The Darwin model for electromagnetic simulation is a reduced form of the Maxwell-Vlasov system that retains all essential physical processes except the propagation of light waves. It is useful in modeling systems for which the light-transit timescales are less important than Alfven wave propagation, or quasistatic effects. The Darwin model is elliptic rather than hyperbolic as are the full set of Maxwell's equations. Appropriate boundary conditions must be chosen for the problems to be well-posed. Finite element techniques are used to apply this method on unstructured meshes of triangles; a mesh made up of unstructured triangles allows realistic device geometries to be modeled without the necessity of using a large number of mesh points. Analyzing the dispersion relation allows us to validate the code as well as the Darwin approximation

  18. Energy transfer in structured and unstructured environments

    DEFF Research Database (Denmark)

    Iles-Smith, Jake; Dijkstra, Arend G.; Lambert, Neill

    2016-01-01

    We explore excitonic energy transfer dynamics in a molecular dimer system coupled to both structured and unstructured oscillator environments. By extending the reaction coordinate master equation technique developed by Iles-Smith et al. [Phys. Rev. A 90, 032114 (2014)], we go beyond the commonly... of motion over a wide range of parameters. Furthermore, we show that the Zusman equations, which may be obtained in a semiclassical limit of the reaction coordinate model, are often incapable of describing the correct dynamical behaviour. This demonstrates the necessity of properly accounting for quantum... correlations generated between the system and its environment when the Born-Markov approximations no longer hold. Finally, we apply the reaction coordinate formalism to the case of a structured environment comprising both underdamped (i.e., sharply peaked) and overdamped (broad) components simultaneously...

  19. Triangulated Proxy Reporting: a technique for improving how communication partners come to know people with severe cognitive impairment.

    Science.gov (United States)

    Lyons, Gordon; De Bortoli, Tania; Arthur-Kelly, Michael

    2017-09-01

    This paper explains and demonstrates the pilot application of Triangulated Proxy Reporting (TPR); a practical technique for enhancing communication around people who have severe cognitive impairment (SCI). An introduction explains SCI and how this impacts on communication; and consequently on quality of care and quality of life. This is followed by an explanation of TPR and its origins in triangulation research techniques. An illustrative vignette explicates its utility and value in a group home for a resident with profound multiple disabilities. The Discussion and Conclusion sections propose the wider application of TPR for different cohorts of people with SCIs, their communication partners and service providers. TPR presents as a practical technique for enhancing communication interactions with people who have SCI. The paper demonstrates the potential of the technique for improving engagement amongst those with profound multiple disabilities, severe acquired brain injury and advanced dementia and their partners in and across different care settings. Implications for Rehabilitation Triangulated Proxy Reporting (TPR) shows potential to improve communications between people with severe cognitive impairments and their communication partners. TPR can lead to improved quality of care and quality of life for people with profound multiple disabilities, very advanced dementia and severe acquired brain injury, who otherwise are very difficult to support. TPR is a relatively simple and inexpensive technique that service providers can incorporate into practice to improving communications between clients with severe cognitive impairments, their carers and other support professionals.

  20. A local level set method based on a finite element method for unstructured meshes

    International Nuclear Information System (INIS)

    Ngo, Long Cu; Choi, Hyoung Gwon

    2016-01-01

    A local level set method for unstructured meshes has been implemented by using a finite element method. A least-square weighted residual method was employed for implicit discretization to solve the level set advection equation. By contrast, a direct re-initialization method, which is directly applicable to the local level set method for unstructured meshes, was adopted to re-correct the level set function to become a signed distance function after advection. The proposed algorithm was constructed such that the advection and direct reinitialization steps were conducted only for nodes inside the narrow band around the interface. Therefore, in the advection step, the Gauss–Seidel method was used to update the level set function using a node-by-node solution method. Some benchmark problems were solved by using the present local level set method. Numerical results have shown that the proposed algorithm is accurate and efficient in terms of computational time.
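    As a rough illustration of the direct re-initialization step described above (not the authors' least-squares advection solver), the following sketch finds the zero crossings of a nodal level set on a triangulation and resets the nodes of a narrow band to signed distances from those crossings; the mesh, the circular level set, and the band width are arbitrary stand-ins.

      import numpy as np
      from scipy.spatial import Delaunay

      # Hypothetical setup: random points in the unit square, Delaunay-triangulated,
      # carrying a level set whose zero contour is a circle of radius 0.25.
      rng = np.random.default_rng(0)
      pts = rng.random((400, 2))
      tri = Delaunay(pts)
      phi = np.linalg.norm(pts - 0.5, axis=1) - 0.25

      # Interface points: linear zero crossings on triangle edges.
      edges = set()
      for t in tri.simplices:
          for a, b in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0])):
              edges.add((min(a, b), max(a, b)))
      crossings = []
      for a, b in edges:
          if phi[a] * phi[b] < 0.0:
              s = phi[a] / (phi[a] - phi[b])
              crossings.append(pts[a] + s * (pts[b] - pts[a]))
      crossings = np.array(crossings)

      # Direct re-initialization, restricted to a narrow band around the interface:
      # each band node is reset to its (sign-preserving) distance to the interface.
      band = np.abs(phi) < 0.08
      dists = np.linalg.norm(pts[band, None, :] - crossings[None, :, :], axis=2).min(axis=1)
      phi[band] = np.sign(phi[band]) * dists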

  1. A local level set method based on a finite element method for unstructured meshes

    Energy Technology Data Exchange (ETDEWEB)

    Ngo, Long Cu; Choi, Hyoung Gwon [School of Mechanical Engineering, Seoul National University of Science and Technology, Seoul (Korea, Republic of)

    2016-12-15

    A local level set method for unstructured meshes has been implemented by using a finite element method. A least-square weighted residual method was employed for implicit discretization to solve the level set advection equation. By contrast, a direct re-initialization method, which is directly applicable to the local level set method for unstructured meshes, was adopted to re-correct the level set function to become a signed distance function after advection. The proposed algorithm was constructed such that the advection and direct reinitialization steps were conducted only for nodes inside the narrow band around the interface. Therefore, in the advection step, the Gauss–Seidel method was used to update the level set function using a node-by-node solution method. Some benchmark problems were solved by using the present local level set method. Numerical results have shown that the proposed algorithm is accurate and efficient in terms of computational time.

  2. Random surfaces: A non-perturbative regularization of strings?

    International Nuclear Information System (INIS)

    Ambjoern, J.

    1989-12-01

    I review the basic properties of the theory of random surfaces. While it is by now well known that the theory of (discretized) random surfaces correctly describes the (perturbative) aspects of non-critical strings in d < 1, the situation is less clear for d > 1. In these lectures I intend to show that the theory of dynamically triangulated random surfaces provides us with a lot of information about the dynamics of both the bosonic string and the superstring even for d > 1. I also briefly review recent attempts to define a string field theory (sum over all genus) in this approach. (orig.)

  3. Wavelet Radiosity on Arbitrary Planar Surfaces

    OpenAIRE

    Holzschuch , Nicolas; Cuny , François; Alonso , Laurent

    2000-01-01

    Wavelet radiosity is, by its nature, restricted to parallelograms or triangles. This paper presents an innovative technique enabling wavelet radiosity computations on planar surfaces of arbitrary shape, including concave contours or contours with holes. This technique replaces the need for triangulating such complicated shapes, greatly reducing the complexity of the wavelet radiosity algorithm and the computati...

  4. Sensor-based whole-arm obstacle avoidance for unstructured environments

    International Nuclear Information System (INIS)

    Wintenberg, A.L.; Butler, P.L.; Babcock, S.M.; Ericson, M.N.; Armstrong, G.A.; Britton, C.L. Jr.; Hamel, W.R.

    1992-01-01

    Whole-arm obstacle avoidance is needed for a variety of robotic applications in the Environmental Restoration and Waste Management (ER&WM) Program. Typical industrial applications of robotics involve well-defined workspaces, allowing a predetermined knowledge of collision-free paths for manipulator motion. However, many hazardous environments are unstructured or poorly defined, providing a significant potential for collisions between manipulators and the environment. In order to allow applications of robotics in such situations, a sensing system is under development which will provide protection against collisions. Specifics of this system, including system architecture and projected implementation, are described.

  5. 3D unstructured mesh discontinuous finite element hydro

    International Nuclear Information System (INIS)

    Prasad, M.K.; Kershaw, D.S.; Shaw, M.J.

    1995-01-01

    The authors present detailed features of the ICF3D hydrodynamics code used for inertial fusion simulations. This code is intended to be a state-of-the-art upgrade of the well-known fluid code LASNEX. ICF3D employs discontinuous finite elements on a discrete unstructured mesh consisting of a variety of 3D polyhedra including tetrahedra, prisms, and hexahedra. The authors discuss details of how the Roe-averaged second-order convection is applied on the discrete elements, and how the C++ coding interface has helped to simplify implementing the many physics and numerics modules within the code package. The authors emphasize the virtues of object-oriented design in large-scale projects such as ICF3D.

  6. Observation of melt surface depressions during electron beam evaporation

    International Nuclear Information System (INIS)

    Ohba, Hironori; Shibata, Takemasa

    2000-08-01

    The depths of the depressed surfaces of liquid gadolinium, cerium and copper during electron beam evaporation were measured by a triangulation method using a CCD camera. The depression depths estimated from the balance of the vapor pressure and the hydrostatic pressure at the evaporation surface agreed with the measured values. Periodic fluctuation of the atomic beam was observed when a depression 3∼4 mm in depth was formed at the evaporation spot. (author)

  7. INTEGRATION OF HETEROGENOUS DIGITAL SURFACE MODELS

    Directory of Open Access Journals (Sweden)

    R. Boesch

    2012-08-01

    distribution can be used to derive a local accuracy measure. For the calculation of a robust point distribution measure, a constrained triangulation of local points (within an area of 100 m²) has been implemented using the Open Source project CGAL. The area of each triangle is a measure for the spatial distribution of raw points in this local area. Combining the FOM-map with the local evaluation of LiDAR points allows an appropriate local accuracy evaluation of both surface models. The currently implemented strategy ("partial replacement") uses the hypothesis that the ADS-DSM is superior due to its better global accuracy of 1 m. If the local analysis of the FOM-map within the 100 m² area shows significant matching errors, the corresponding area of the triangulated LiDAR points is analyzed. If the point density and distribution are sufficient, the LiDAR-DSM will be used in favor of the ADS-DSM at this location. If the local triangulation reflects low point density or the variance of triangle areas exceeds a threshold, the investigated location will be marked as a NODATA area. In a future implementation ("anisotropic fusion"), an anisotropic inverse distance weighting (IDW) will be used, which merges both surface models in the point data space by using the FOM-map and the local triangulation to derive a quality weight for each of the interpolation points. The "partial replacement" implementation and the "fusion" prototype for the anisotropic IDW make use of the Open Source projects CGAL (Computational Geometry Algorithms Library), GDAL (Geospatial Data Abstraction Library) and OpenCV (Open Source Computer Vision).
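    A minimal sketch of the triangle-area point-distribution measure described above is given below, with scipy's plain Delaunay triangulation standing in for the constrained CGAL triangulation; the tile size, point count and acceptance thresholds are assumptions for illustration only.

      import numpy as np
      from scipy.spatial import Delaunay

      # Hypothetical LiDAR raw points (x, y) falling inside one 10 m x 10 m (100 m^2) tile.
      rng = np.random.default_rng(1)
      xy = rng.uniform(0.0, 10.0, size=(200, 2))

      tri = Delaunay(xy)
      a, b, c = (xy[tri.simplices[:, k]] for k in range(3))
      # Triangle areas via the 2D cross product; large or highly variable areas
      # indicate sparse or uneven point coverage within the tile.
      areas = 0.5 * np.abs((b[:, 0] - a[:, 0]) * (c[:, 1] - a[:, 1])
                           - (b[:, 1] - a[:, 1]) * (c[:, 0] - a[:, 0]))

      mean_area, var_area = areas.mean(), areas.var()
      reliable = (len(xy) >= 50) and (var_area < 0.05)   # hypothetical thresholds
      print(mean_area, var_area, reliable)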

  8. Sensation Seeking and Adolescent Alcohol Use: Exploring the Mediating Role of Unstructured Socializing With Peers.

    Science.gov (United States)

    Sznitman, Sharon; Engel-Yeger, Batya

    2017-05-01

    Researchers have theorized that adolescents high in sensation seeking are particularly sensitive to positive reinforcement and the rewarding outcomes of alcohol use, and thus that the personality vulnerability is a direct causal risk factor for alcohol use. In contrast, the routine activity perspective theorizes that part of the effect of sensation seeking on alcohol use goes through the propensity that sensation seekers have towards unstructured socializing with peers. The study tests a model with indirect and direct paths from sensation seeking and participation in unstructured peer socialization to adolescent alcohol use. Cross-sectional data were collected from 360 students in a state-secular Jewish high school (10th to 12th grade) in the center region of Israel. The sample was equally divided between boys (51.9%) and girls (48.1%), respondents' age ranged from 15 to 17 years (mean = 16.02 ± 0.85). Structural equation modeling was used to test the direct and indirect paths. While sensation seeking had a significant direct path to adolescent alcohol use, part of the association was mediated by unstructured socializing with peers. The mediated paths were similar for boys and girls alike. Sensation seeking is primarily biologically determined and prevention efforts are unlikely to modify this personality vulnerability. The results of this study suggest that a promising prevention avenue is to modify extracurricular participation patterns of vulnerable adolescents. © The Author 2017. Medical Council on Alcohol and Oxford University Press. All rights reserved.

  9. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state-of-the-art. © 2012 ACM.
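    For orientation, the sketch below shows plain dart-throwing Poisson-disk sampling in the plane with a spatially varying radius; it does not perform the power-diagram gap analysis that makes the paper's sampling maximal, and the sizing field and conflict rule (samples separated by at least the larger of their two radii) are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)

      def radius(p):
          # Hypothetical sizing field: finer disks near the left edge of the unit square.
          return 0.02 + 0.06 * p[0]

      samples, radii = [], []
      for _ in range(5000):
          p = rng.random(2)
          r = radius(p)
          # Conflict check: reject the dart if it is too close to any accepted sample.
          if all(np.linalg.norm(p - q) >= max(r, rq) for q, rq in zip(samples, radii)):
              samples.append(p)
              radii.append(r)
      print(len(samples), "accepted samples")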

  10. Development of a Two-Phase Flow Analysis Code based on a Unstructured-Mesh SIMPLE Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Tae; Park, Ik Kyu; Cho, Heong Kyu; Yoon, Han Young; Kim, Kyung Doo; Jeong, Jae Jun

    2008-09-15

    For analyses of multi-phase flows in a water-cooled nuclear power plant, a three-dimensional SIMPLE-algorithm-based hydrodynamic solver, CUPID-S, has been developed. As governing equations, it adopts a two-fluid three-field model for the two-phase flows. The three fields represent a continuous liquid, dispersed droplets, and a vapour field. The governing equations are discretized by a finite volume method on an unstructured grid to handle the geometrical complexity of nuclear reactors. The phasic momentum equations are coupled and solved with a sparse block Gauss-Seidel matrix solver to increase numerical stability. The pressure correction equation derived by summing the phasic volume fraction equations is applied on the unstructured mesh in the context of a cell-centered co-located scheme. This paper presents the numerical method and the preliminary results of the calculations.

  11. Long-term versus short-term deformation of the meizoseismal area of the 2008 Achaia-Elia (MW 6.4) earthquake in NW Peloponnese, Greece: Evidence from historical triangulation and morphotectonic data

    Science.gov (United States)

    Stiros, Stathis; Moschas, Fanis; Feng, Lujia; Newman, Andrew

    2013-04-01

    The deformation of the meizoseismal area of the 2008 Achaia-Elia (MW 6.4) earthquake in NW Peloponnese, the first significant strike-slip earthquake in continental Greece, was examined on two time scales: 10² years, based on the analysis of high-accuracy historical triangulation data describing shear, and 10⁵-10⁶ years, based on the analysis of the hydrographic network of the area for signs of streams offset by faulting. Our study revealed pre-seismic accumulation of shear strain of the order of 0.2 μrad/year in the study area, consistent with recent GPS evidence, but no signs of significant strike-slip-induced offsets in the hydrographic network. These results confirm the hypothesis that the 2008 rupture, which did not reach the surface and was not associated with significant seismic ground deformation, probably because a surface flysch layer filtered high-strain events, was associated with an immature or a dormant, recently activated fault. This fault, about 150 km long and discordant to the morphotectonic trends of the area, seems first to contain segments which have progressively reactivated in a specific direction in the last 20 years, reminiscent of the North Anatolian Fault, and second to limit a 150 km wide (recent?) shear zone in the internal part of the arc, in a region mostly dominated by thrust faulting and strong destructive earthquakes. Highlights: Deformation of the first main strike-slip fault in continental Greece is analyzed. Triangulation data show preseismic shear; the hydrographic network shows no previous faulting. Surface shear deformation occurs only at low strain rates. An immature or reactivated dormant strike-slip fault, with gradual oriented rupturing. Interplay between shear and thrusting along the arc.

  12. An unstructured-mesh finite-volume MPDATA for compressible atmospheric dynamics

    International Nuclear Information System (INIS)

    Kühnlein, Christian; Smolarkiewicz, Piotr K.

    2017-01-01

    An advancement of the unstructured-mesh finite-volume MPDATA (Multidimensional Positive Definite Advection Transport Algorithm) is presented that formulates the error-compensative pseudo-velocity of the scheme to rely only on face-normal advective fluxes to the dual cells, in contrast to the full vector employed in previous implementations. This is essentially achieved by expressing the temporal truncation error underlying the pseudo-velocity in a form consistent with the flux-divergence of the governing conservation law. The development is especially important for integrating fluid dynamics equations on non-rectilinear meshes whenever face-normal advective mass fluxes are employed for transport compatible with mass continuity—the latter being essential for flux-form schemes. In particular, the proposed formulation enables large-time-step semi-implicit finite-volume integration of the compressible Euler equations using MPDATA on arbitrary hybrid computational meshes. Furthermore, it facilitates multiple error-compensative iterations of the finite-volume MPDATA and improved overall accuracy. The advancement combines straightforwardly with earlier developments, such as the nonoscillatory option, the infinite-gauge variant, and moving curvilinear meshes. A comprehensive description of the scheme is provided for a hybrid horizontally-unstructured vertically-structured computational mesh for efficient global atmospheric flow modelling. The proposed finite-volume MPDATA is verified using selected 3D global atmospheric benchmark simulations, representative of hydrostatic and non-hydrostatic flow regimes. Besides the added capabilities, the scheme retains fully the efficacy of established finite-volume MPDATA formulations.
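    The record above describes the unstructured finite-volume MPDATA with face-normal pseudo-velocities; the following sketch shows only the basic one-dimensional MPDATA idea (an upwind pass followed by an antidiffusive pass driven by the error-compensative pseudo-velocity), with a constant Courant number, a non-negative profile, and periodic boundaries assumed for simplicity.

      import numpy as np

      def donor_cell(psi, C):
          # One upwind (donor-cell) step with Courant numbers C[i] at face i+1/2 (periodic).
          psi_r = np.roll(psi, -1)
          flux = np.maximum(C, 0.0) * psi + np.minimum(C, 0.0) * psi_r   # flux at face i+1/2
          return psi - (flux - np.roll(flux, 1))

      def mpdata_step(psi, C, eps=1e-15):
          psi1 = donor_cell(psi, C)                     # first pass: plain upwind
          psi_r = np.roll(psi1, -1)
          # Error-compensative pseudo-velocity at the faces (basic second-order MPDATA form).
          V = (np.abs(C) - C**2) * (psi_r - psi1) / (psi_r + psi1 + eps)
          return donor_cell(psi1, V)                    # second pass: antidiffusive correction

      n = 100
      x = (np.arange(n) + 0.5) / n
      psi = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)   # non-negative test profile
      C = np.full(n, 0.4)                                # constant Courant number
      for _ in range(250):
          psi = mpdata_step(psi, C)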

  13. An unstructured-mesh finite-volume MPDATA for compressible atmospheric dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Kühnlein, Christian, E-mail: christian.kuehnlein@ecmwf.int; Smolarkiewicz, Piotr K., E-mail: piotr.smolarkiewicz@ecmwf.int

    2017-04-01

    An advancement of the unstructured-mesh finite-volume MPDATA (Multidimensional Positive Definite Advection Transport Algorithm) is presented that formulates the error-compensative pseudo-velocity of the scheme to rely only on face-normal advective fluxes to the dual cells, in contrast to the full vector employed in previous implementations. This is essentially achieved by expressing the temporal truncation error underlying the pseudo-velocity in a form consistent with the flux-divergence of the governing conservation law. The development is especially important for integrating fluid dynamics equations on non-rectilinear meshes whenever face-normal advective mass fluxes are employed for transport compatible with mass continuity—the latter being essential for flux-form schemes. In particular, the proposed formulation enables large-time-step semi-implicit finite-volume integration of the compressible Euler equations using MPDATA on arbitrary hybrid computational meshes. Furthermore, it facilitates multiple error-compensative iterations of the finite-volume MPDATA and improved overall accuracy. The advancement combines straightforwardly with earlier developments, such as the nonoscillatory option, the infinite-gauge variant, and moving curvilinear meshes. A comprehensive description of the scheme is provided for a hybrid horizontally-unstructured vertically-structured computational mesh for efficient global atmospheric flow modelling. The proposed finite-volume MPDATA is verified using selected 3D global atmospheric benchmark simulations, representative of hydrostatic and non-hydrostatic flow regimes. Besides the added capabilities, the scheme retains fully the efficacy of established finite-volume MPDATA formulations.

  14. Partitioning of unstructured meshes for load balancing

    International Nuclear Information System (INIS)

    Martin, O.C.; Otto, S.W.

    1994-01-01

    Many large-scale engineering and scientific calculations involve repeated updating of variables on an unstructured mesh. To do these types of computations on distributed memory parallel computers, it is necessary to partition the mesh among the processors so that the load balance is maximized and inter-processor communication time is minimized. This can be approximated by the problem of partitioning a graph so as to obtain a minimum cut, a well-studied combinatorial optimization problem. Graph partitioning algorithms are discussed that give good but not necessarily optimum solutions. These algorithms include local search methods, recursive spectral bisection, and more general-purpose methods such as simulated annealing. It is shown that a general procedure enables simulated annealing to be combined with Kernighan-Lin. The resulting algorithm is both very fast and extremely effective. (authors) 23 refs., 3 figs., 1 tab
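    As a concrete illustration of one of the methods mentioned (a single level of recursive spectral bisection; the Kernighan-Lin and simulated-annealing refinements are not reproduced), the sketch below splits a small, hypothetical mesh dual graph using the Fiedler vector of its Laplacian.

      import numpy as np

      def spectral_bisection(adj):
          # Split a connected graph in two halves using the Fiedler vector of its Laplacian.
          lap = np.diag(adj.sum(axis=1)) - adj
          vals, vecs = np.linalg.eigh(lap)
          fiedler = vecs[:, 1]                   # eigenvector of the second-smallest eigenvalue
          order = np.argsort(fiedler)
          part = np.zeros(len(adj), dtype=int)
          part[order[len(adj) // 2:]] = 1        # median split balances the two parts
          return part

      # Hypothetical mesh dual graph: a 2 x 4 grid of cells, edges between face neighbours.
      n = 8
      adj = np.zeros((n, n))
      for i in range(n):
          for j in range(n):
              if abs(i // 4 - j // 4) + abs(i % 4 - j % 4) == 1:
                  adj[i, j] = 1.0
      print(spectral_bisection(adj))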

  15. Reactor physics verification of the MCNP6 unstructured mesh capability

    International Nuclear Information System (INIS)

    Burke, T. P.; Kiedrowski, B. C.; Martz, R. L.; Martin, W. R.

    2013-01-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  16. Reactor physics verification of the MCNP6 unstructured mesh capability

    Energy Technology Data Exchange (ETDEWEB)

    Burke, T. P. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States); Kiedrowski, B. C.; Martz, R. L. [X-Computational Physics Division, Monte Carlo Codes Group, Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Martin, W. R. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States)

    2013-07-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  17. Transport and dynamics in toroidal fusion systems. Final report, 1992--1995

    International Nuclear Information System (INIS)

    Schnack, D.D.

    1995-01-01

    This document is organized as follows. Discussions are presented on the properties of structured and unstructured meshes, and the data structures useful for describing them. Issues related to the triangulation of an arbitrary set of points in a plane are also discussed. A derivation is made of a finite volume approximation to the resistive MHD equations suitable for use on an unstructured, triangular mesh in toroidal geometry. Boundary conditions are discussed. The specific MHD model, and its implementation on the unstructured mesh, is discussed. A discussion is presented of methods of time integration, and descriptions are given for implementation of semi-implicit and fully implicit algorithms. Examples of the application of the method are given. Included are standard, two- dimensional hydrodynamic and MHD shock problems, as well as applications of the method to the equilibrium and stability of toroidal fusion plasmas in two and three dimensions. The initial results with mesh adaptation are also described

  18. An Algorithm for Parallel Sn Sweeps on Unstructured Meshes

    International Nuclear Information System (INIS)

    Pautz, Shawn D.

    2002-01-01

    A new algorithm for performing parallel S n sweeps on unstructured meshes is developed. The algorithm uses a low-complexity list ordering heuristic to determine a sweep ordering on any partitioned mesh. For typical problems and with 'normal' mesh partitionings, nearly linear speedups on up to 126 processors are observed. This is an important and desirable result, since although analyses of structured meshes indicate that parallel sweeps will not scale with normal partitioning approaches, no severe asymptotic degradation in the parallel efficiency is observed with modest (≤100) levels of parallelism. This result is a fundamental step in the development of efficient parallel S n methods
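    A much-simplified sketch of a list-ordering heuristic for one sweep direction is given below: cells are topologically ordered so that each is processed after its upwind neighbours. The centroid-based upwind test, the tiny example mesh, and the assumption that the dependency graph is acyclic are simplifications; a production sweep scheduler would use face normals and handle cycles and parallel partitions.

      import numpy as np
      from collections import deque

      def sweep_order(centroids, neighbours, omega):
          # Build the downwind dependency graph for direction omega and order it (Kahn's algorithm).
          n = len(centroids)
          downwind = [[] for _ in range(n)]
          indeg = np.zeros(n, dtype=int)
          for i in range(n):
              for j in neighbours[i]:
                  if np.dot(centroids[j] - centroids[i], omega) > 0.0:   # i is upwind of j
                      downwind[i].append(j)
                      indeg[j] += 1
          ready = deque(i for i in range(n) if indeg[i] == 0)
          order = []
          while ready:
              i = ready.popleft()
              order.append(i)
              for j in downwind[i]:
                  indeg[j] -= 1
                  if indeg[j] == 0:
                      ready.append(j)
          return order                      # len(order) < n would signal a dependency cycle

      centroids = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
      neighbours = [[1, 2], [0, 3], [0, 3], [1, 2]]       # hypothetical quad of cells
      print(sweep_order(centroids, neighbours, omega=np.array([1.0, 0.3])))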

  19. Methods in the analysis of mobile robots behavior in unstructured environment

    Science.gov (United States)

    Mondoc, Alina; Dolga, Valer; Gorie, Nina

    2012-11-01

    A mobile robot can be described as a mechatronic system that must execute an application in a working environment. Starting from the mechatronic concept, the authors highlight the structure of the mechatronic system based on its secondary functions. A mobile robot moves either in a known (structured) environment, which can be described by an appropriate mathematical model, or in an unfamiliar (unstructured) environment, in which random aspects prevail. Starting from a START point, the robot must reach a STOP point subject to functional constraints imposed on the one hand by the application and on the other hand by the working environment. The authors focus their presentation on the unstructured environment. In this case the evolution of the mobile robot is based on obtaining information about the working environment, processing it, and integrating the results into an action strategy. The number of sensory elements used is subject to optimization. Starting from a known mobile robot structure, the authors analyze the possibility of developing variants of the mathematical model of wheel-ground contact. Various types of soil are analyzed, together with the possibility of obtaining a "signature" for each based on sensory information. Theoretical aspects of the problem are compared with experimental results obtained during the robot's evolution. The mathematical model of the robot system allowed the robot and its environment to be simulated and the simulated evolution to be compared with the estimated experimental results.

  20. A volume-based method for denoising on curved surfaces

    KAUST Repository

    Biddle, Harry

    2013-09-01

    We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.
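    A minimal structured-grid sketch of the Closest Point Method for Gaussian diffusion on the unit circle is given below (the closest-point map is known analytically there); the grid spacing, time step and initial data are arbitrary choices, and the paper additionally treats Perona-Malik diffusion and general triangulated surfaces.

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      h = 0.02
      dt = 0.2 * h**2                        # explicit heat step, within the 2D stability limit
      x = np.arange(-1.5, 1.5 + h, h)
      X, Y = np.meshgrid(x, x, indexing="ij")
      R = np.maximum(np.sqrt(X**2 + Y**2), 1e-12)
      CPx, CPy = X / R, Y / R                # closest point on the unit circle (origin degenerate but harmless)

      u = np.cos(3 * np.arctan2(CPy, CPx))   # initial data, constant along normals to the circle
      for _ in range(200):
          lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                 + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / h**2
          u = u + dt * lap                   # one Gaussian diffusion step in the embedding plane
          interp = RegularGridInterpolator((x, x), u)
          u = interp(np.stack([CPx, CPy], axis=-1))   # closest point extension step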

  1. A volume-based method for denoising on curved surfaces

    KAUST Repository

    Biddle, Harry; von Glehn, Ingrid; Macdonald, Colin B.; Marz, Thomas

    2013-01-01

    We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.

  2. A resistor interpretation of general anisotropic cardiac tissue.

    Science.gov (United States)

    Shao, Hai; Sampson, Kevin J; Pormann, John B; Rose, Donald J; Henriquez, Craig S

    2004-02-01

    This paper describes a spatial discretization scheme for partial differential equation systems that contain anisotropic diffusion. The discretization method uses unstructured finite volumes, or boxes, formed as a secondary geometric structure from an underlying triangular mesh. We show how the discretization can be interpreted as a resistive circuit network, where a resistor is assigned to each edge of the triangular element. The resistor is computed as an anisotropy-dependent geometric quantity of the local mesh structure. Finally, we show that under certain conditions, the discretization gives rise to negative resistors that can produce non-physical hyperpolarizations near depolarizing stimuli. We discuss how the proper choice of triangulation (anisotropic Delaunay triangulation) can ensure monotonicity (i.e. all resistors are positive).
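    The paper derives anisotropy-dependent edge resistors from a finite-volume box scheme; as a simplified isotropic illustration, the sketch below computes the familiar cotangent edge weights of a triangulation and flags negative ones, which play the role of the "negative resistors" discussed above. The random point set is a stand-in for a cardiac mesh.

      import numpy as np
      from scipy.spatial import Delaunay

      def edge_weights(pts, triangles):
          # Isotropic edge conductance w_ij = 0.5 * (cot(alpha) + cot(beta)),
          # where alpha and beta are the angles opposite edge (i, j).
          w = {}
          for tri in triangles:
              for k in range(3):
                  i, j, o = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
                  u, v = pts[i] - pts[o], pts[j] - pts[o]
                  cross = u[0] * v[1] - u[1] * v[0]
                  key = (min(i, j), max(i, j))
                  w[key] = w.get(key, 0.0) + 0.5 * np.dot(u, v) / abs(cross)
          return w

      rng = np.random.default_rng(3)
      pts = rng.random((30, 2))
      w = edge_weights(pts, Delaunay(pts).simplices)
      # Interior edges of a Delaunay triangulation have non-negative weight;
      # a negative weight corresponds to a "negative resistor".
      print([e for e, val in w.items() if val < 0.0])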

  3. A Delaunay Triangulation Approach For Segmenting Clumps Of Nuclei

    International Nuclear Information System (INIS)

    Wen, Quan; Chang, Hang; Parvin, Bahram

    2009-01-01

    Cell-based fluorescence imaging assays have the potential to generate massive amounts of data, which require detailed quantitative analysis. Often, as a result of fixation, labeled nuclei overlap and create a clump of cells. However, it is important to quantify the phenotypic readout on a cell-by-cell basis. In this paper, we propose a novel method for decomposing clumps of nuclei using high-level geometric constraints that are derived from low-level features of maximum curvature computed along the contour of each clump. Points of maximum curvature are used as vertices for Delaunay triangulation (DT), which provides a set of edge hypotheses for decomposing a clump of nuclei. Each hypothesis is subsequently tested against a constraint satisfaction network for a near-optimum decomposition. The proposed method is compared with other traditional techniques such as the watershed method with/without markers. The experimental results show that our approach can overcome the deficiencies of the traditional methods and is very effective in separating severely touching nuclei.
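    The sketch below illustrates only the edge-hypothesis step: a Delaunay triangulation of (here invented) points of maximum curvature yields candidate cut edges. Scoring each hypothesis against the constraint satisfaction network, as the paper does, is not reproduced.

      import numpy as np
      from scipy.spatial import Delaunay

      # Hypothetical points of maximum curvature detected along the contour of a clump.
      cands = np.array([[0.0, 0.0], [2.0, 0.1], [4.0, 0.0], [2.1, 1.0],
                        [0.1, 2.0], [2.0, 2.1], [3.9, 2.0], [1.9, -1.0]])

      tri = Delaunay(cands)
      edges = set()
      for s in tri.simplices:
          for a, b in ((s[0], s[1]), (s[1], s[2]), (s[2], s[0])):
              edges.add((min(a, b), max(a, b)))

      # Each Delaunay edge joining two concavity points is one hypothesis for a cut
      # separating two touching nuclei.
      for a, b in sorted(edges):
          print(a, b, np.linalg.norm(cands[a] - cands[b]))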

  4. Introducing a distributed unstructured mesh into gyrokinetic particle-in-cell code, XGC

    Science.gov (United States)

    Yoon, Eisung; Shephard, Mark; Seol, E. Seegyoung; Kalyanaraman, Kaushik

    2017-10-01

    XGC has shown good scalability for large leadership supercomputers. The current production version uses a copy of the entire unstructured finite element mesh on every MPI rank. Although an obvious scalability issue if the mesh sizes are to be dramatically increased, the current approach is also not optimal with respect to data locality of particles and mesh information. To address these issues we have initiated the development of a distributed mesh PIC method. This approach directly addresses the base scalability issue with respect to mesh size and, through the use of a mesh entity centric view of the particle mesh relationship, provides opportunities to address data locality needs of many core and GPU supported heterogeneous systems. The parallel mesh PIC capabilities are being built on the Parallel Unstructured Mesh Infrastructure (PUMI). The presentation will first overview the form of mesh distribution used and indicate the structures and functions used to support the mesh, the particles and their interaction. Attention will then focus on the node-level optimizations being carried out to ensure performant operation of all PIC operations on the distributed mesh. Partnership for Edge Physics Simulation (EPSI) Grant No. DE-SC0008449 and Center for Extended Magnetohydrodynamic Modeling (CEMM) Grant No. DE-SC0006618.

  5. A study on the unstructured music database—Taking the Bo people’s music and its music iconography database as an example

    Directory of Open Access Journals (Sweden)

    Liu Yutong

    2015-01-01

    An unstructured music iconography data system constructed with key technologies such as Dublin Core, Lucene and the MVC framework is studied in this paper. Results indicate that the traditional directory tree and the existing indexing and searching tools are severely insufficient for the organization and management of massive unstructured data. Relevant documents can be searched effectively and rapidly through the index provided by BeFS. Key technologies such as Dublin Core, Lucene and the MVC framework can be applied to the construction of an enormous unstructured database of music and image resources. Testing of the database system can be divided into two parts, a functional test and a performance test. The test results of the Bo people’s music and image database system, obtained through the designed test scheme, indicate that the performance of the system is relatively high and able to support concurrent access to massive data with a good user experience.

  6. Unstructured Computational Aerodynamics on Many Integrated Core Architecture

    KAUST Repository

    Al Farhan, Mohammed A.

    2016-06-08

    Shared memory parallelization of the flux kernel of PETSc-FUN3D, an unstructured tetrahedral mesh Euler flow code previously studied for distributed memory and multi-core shared memory, is evaluated on up to 61 cores per node and up to 4 threads per core. We explore several thread-level optimizations to improve flux kernel performance on the state-of-the-art many integrated core (MIC) Intel processor Xeon Phi “Knights Corner,” with a focus on strong thread scaling. While the linear algebraic kernel is bottlenecked by memory bandwidth for even modest numbers of cores sharing a common memory, the flux kernel, which arises in the control volume discretization of the conservation law residuals and in the formation of the preconditioner for the Jacobian by finite-differencing the conservation law residuals, is compute-intensive and is known to exploit effectively contemporary multi-core hardware. We extend study of the performance of the flux kernel to the Xeon Phi in three thread affinity modes, namely scatter, compact, and balanced, in both offload and native mode, with and without various code optimizations to improve alignment and reduce cache coherency penalties. Relative to baseline “out-of-the-box” optimized compilation, code restructuring optimizations provide about 3.8x speedup using the offload mode and about 5x speedup using the native mode. Even with these gains for the flux kernel, with respect to execution time the MIC simply achieves par with optimized compilation on a contemporary multi-core Intel CPU, the 16-core Sandy Bridge E5 2670. Nevertheless, the optimizations employed to reduce the data motion and cache coherency protocol penalties of the MIC are expected to be of value for CFD and many other unstructured applications as many-core architecture evolves. We explore large-scale distributed-shared memory performance on the Cray XC40 supercomputer, to demonstrate that optimizations employed on Phi hybridize to this context, where each of

  7. Unstructured Computational Aerodynamics on Many Integrated Core Architecture

    KAUST Repository

    Al Farhan, Mohammed A.; Kaushik, Dinesh K.; Keyes, David E.

    2016-01-01

    Shared memory parallelization of the flux kernel of PETSc-FUN3D, an unstructured tetrahedral mesh Euler flow code previously studied for distributed memory and multi-core shared memory, is evaluated on up to 61 cores per node and up to 4 threads per core. We explore several thread-level optimizations to improve flux kernel performance on the state-of-the-art many integrated core (MIC) Intel processor Xeon Phi “Knights Corner,” with a focus on strong thread scaling. While the linear algebraic kernel is bottlenecked by memory bandwidth for even modest numbers of cores sharing a common memory, the flux kernel, which arises in the control volume discretization of the conservation law residuals and in the formation of the preconditioner for the Jacobian by finite-differencing the conservation law residuals, is compute-intensive and is known to exploit effectively contemporary multi-core hardware. We extend study of the performance of the flux kernel to the Xeon Phi in three thread affinity modes, namely scatter, compact, and balanced, in both offload and native mode, with and without various code optimizations to improve alignment and reduce cache coherency penalties. Relative to baseline “out-of-the-box” optimized compilation, code restructuring optimizations provide about 3.8x speedup using the offload mode and about 5x speedup using the native mode. Even with these gains for the flux kernel, with respect to execution time the MIC simply achieves par with optimized compilation on a contemporary multi-core Intel CPU, the 16-core Sandy Bridge E5 2670. Nevertheless, the optimizations employed to reduce the data motion and cache coherency protocol penalties of the MIC are expected to be of value for CFD and many other unstructured applications as many-core architecture evolves. We explore large-scale distributed-shared memory performance on the Cray XC40 supercomputer, to demonstrate that optimizations employed on Phi hybridize to this context, where each of

  8. Simulation of geothermal water extraction in heterogeneous reservoirs using dynamic unstructured mesh optimisation

    Science.gov (United States)

    Salinas, P.; Pavlidis, D.; Jacquemyn, C.; Lei, Q.; Xie, Z.; Pain, C.; Jackson, M.

    2017-12-01

    It is well known that the pressure gradient into a production well increases with decreasing distance to the well. To properly capture the local pressure drawdown into the well, a high grid or mesh resolution is required; moreover, the location of the well must be captured accurately. In conventional simulation models, the user must interact with the model to modify grid resolution around wells of interest, and the well location is approximated on a grid defined early in the modelling process. We report a new approach for improved simulation of near-wellbore flow in reservoir-scale models through the use of dynamic mesh optimisation and the recently presented double control volume finite element method. Time is discretized using an adaptive, implicit approach. Heterogeneous geologic features are represented as volumes bounded by surfaces. Within these volumes, termed geologic domains, the material properties are constant. Up-, cross- or down-scaling of material properties during dynamic mesh optimization is not required, as the properties are uniform within each geologic domain. A given model typically contains numerous such geologic domains. Wells are implicitly coupled with the domain, and the fluid flow is modelled inside the wells. The method is novel for two reasons. First, a fully unstructured tetrahedral mesh is used to discretize space, and the spatial location of the well is specified via a line vector, ensuring its location even if the mesh is modified during the simulation. The well location is therefore accurately captured, and the approach allows complex well trajectories and wells with many laterals to be modelled. Second, computational efficiency is increased by use of dynamic mesh optimization, in which an unstructured mesh adapts in space and time to key solution fields (preserving the geometry of the geologic domains), such as pressure, velocity or temperature. This also increases the quality of the solutions by placing higher resolution where required.

  9. Modeling and simulation of xylitol production in bioreactor by Debaryomyces nepalensis NCYC 3413 using unstructured and artificial neural network models.

    Science.gov (United States)

    Pappu, J Sharon Mano; Gummadi, Sathyanarayana N

    2016-11-01

    This study examines the use of unstructured kinetic model and artificial neural networks as predictive tools for xylitol production by Debaryomyces nepalensis NCYC 3413 in bioreactor. An unstructured kinetic model was proposed in order to assess the influence of pH (4, 5 and 6), temperature (25°C, 30°C and 35°C) and volumetric oxygen transfer coefficient kLa (0.14 h⁻¹, 0.28 h⁻¹ and 0.56 h⁻¹) on growth and xylitol production. A feed-forward back-propagation artificial neural network (ANN) has been developed to investigate the effect of process condition on xylitol production. ANN configuration of 6-10-3 layers was selected and trained with 339 experimental data points from bioreactor studies. Results showed that simulation and prediction accuracy of ANN was apparently higher when compared to unstructured mechanistic model under varying operational conditions. ANN was found to be an efficient data-driven tool to predict the optimal harvest time in xylitol production. Copyright © 2016 Elsevier Ltd. All rights reserved.
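    For concreteness, a feed-forward network with the 6-10-3 shape mentioned above could be set up as below; the synthetic data stand in for the 339 experimental points, and the choice of input/output variables, scaling and solver are assumptions rather than the authors' configuration.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler

      # Synthetic stand-in for the 339 bioreactor samples: 6 inputs and 3 outputs.
      # Only the 6-10-3 network shape is taken from the record; the variables are hypothetical.
      rng = np.random.default_rng(4)
      X = rng.random((339, 6))
      Y = rng.random((339, 3))

      scaler = StandardScaler().fit(X)
      ann = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                         solver="lbfgs", max_iter=5000, random_state=0)
      ann.fit(scaler.transform(X), Y)          # feed-forward network trained by back-propagation
      print(ann.predict(scaler.transform(X[:5])))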

  10. Implicit flux-split Euler schemes for unsteady aerodynamic analysis involving unstructured dynamic meshes

    Science.gov (United States)

    Batina, John T.

    1990-01-01

    Improved algorithms for the solution of the time-dependent Euler equations are presented for unsteady aerodynamic analysis involving unstructured dynamic meshes. The improvements, developed recently, apply to the spatial and temporal discretizations used by unstructured-grid flow solvers. The spatial discretization involves a flux-split approach which is naturally dissipative and captures shock waves sharply with at most one grid point within the shock structure. The temporal discretization involves an implicit time-integration scheme using a Gauss-Seidel relaxation procedure which is computationally efficient for either steady or unsteady flow problems. For example, very large time steps may be used for rapid convergence to steady state, and the step size for unsteady cases may be selected for temporal accuracy rather than for numerical stability. Steady and unsteady flow results are presented for the NACA 0012 airfoil to demonstrate applications of the new Euler solvers. The unsteady results were obtained for the airfoil pitching harmonically about the quarter chord. The resulting instantaneous pressure distributions and lift and moment coefficients during a cycle of motion compare well with experimental data. A description of the Euler solvers is presented along with results and comparisons which assess the capability.

  11. Runge-Kutta discontinuous Galerkin method using a new type of WENO limiters on unstructured meshes

    Science.gov (United States)

    Zhu, Jun; Zhong, Xinghui; Shu, Chi-Wang; Qiu, Jianxian

    2013-09-01

    In this paper we generalize a new type of limiters based on the weighted essentially non-oscillatory (WENO) finite volume methodology for the Runge-Kutta discontinuous Galerkin (RKDG) methods solving nonlinear hyperbolic conservation laws, which were recently developed in [32] for structured meshes, to two-dimensional unstructured triangular meshes. The key idea of such limiters is to use the entire polynomials of the DG solutions from the troubled cell and its immediate neighboring cells, and then apply the classical WENO procedure to form a convex combination of these polynomials based on smoothness indicators and nonlinear weights, with suitable adjustments to guarantee conservation. The main advantage of this new limiter is its simplicity in implementation, especially for the unstructured meshes considered in this paper, as only information from immediate neighbors is needed and the usage of complicated geometric information of the meshes is largely avoided. Numerical results for both scalar equations and Euler systems of compressible gas dynamics are provided to illustrate the good performance of this procedure.

  12. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development Project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal, as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.

  13. Emergence of Unstructured Data and Scope of Big Data in Indian Education

    OpenAIRE

    S S Kolhatkar; M Y Patil; S P Kolhatkar; M S Paranjape

    2017-01-01

    The Indian Education sector has grown exponentially in the last few decades as per various official reports[22]. Large amount of information pertaining to education sector is generated every year. This has led to the requirement for managing and analyzing the structured and unstructured information related to various stakeholders. At the same time there is a need to adapt to the dynamic global world by channelizing young talent in appropriate domains by cognizing and deriving the knowledge ab...

  14. Finsler Geometry Modeling of an Orientation-Asymmetric Surface Model for Membranes

    Science.gov (United States)

    Proutorov, Evgenii; Koibuchi, Hiroshi

    2017-12-01

    In this paper, a triangulated surface model is studied in the context of Finsler geometry (FG) modeling. This FG model is an extended version of a recently reported model for two-component membranes, and it is asymmetric under surface inversion. We show that the definition of the model is independent of how the Finsler length of a bond is defined. This leads us to understand that the canonical (or Euclidean) surface model is obtained from the FG model such that it is uniquely determined as a trivial model from the viewpoint of well-definedness.

  15. Controlling cell adhesion via replication of laser micro/nano-textured surfaces on polymers

    Energy Technology Data Exchange (ETDEWEB)

    Koufaki, Niki; Ranella, Anthi; Barberoglou, Marios; Psycharakis, Stylianos; Fotakis, Costas; Stratakis, Emmanuel [Institute of Electronic Structure and Laser (IESL), Foundation for Research and Technology-Hellas (FORTH), 711 10, Heraklion, Crete (Greece); Aifantis, Katerina E, E-mail: stratak@iesl.forth.gr [Lab of Mechanics and Materials, Aristotle University of Thessaloniki, Thessaloniki (Greece)

    2011-12-15

    The aim of this study is to investigate cell adhesion and viability on highly rough polymeric surfaces with gradient roughness ratios and wettabilities prepared by microreplication of laser micro/nano-textured Si surfaces. Negative replicas on polydimethylsiloxane as well as positive ones on a photocurable (organically modified ceramic) and a biodegradable (poly(lactide-co-glycolide)) polymer have been successfully reproduced. The final culture substrates, comprising forests of micron-sized conical spikes exhibiting a range of roughness ratios and wettabilities, were achieved by changing the laser fluence used to fabricate the original template surfaces. Cell culture experiments were performed with the fibroblast NIH/3T3 and PC12 neuronal cell lines in order to investigate how these surfaces are capable of modulating different types of cellular responses including, viability, adhesion and morphology. The results showed a preferential adhesion of both cell types on the microstructured surfaces compared to the unstructured ones. In particular, the fibroblast NIH/3T3 cells show optimal adhesion for small roughness ratios, independent of the surface wettability and polymer type, indicating a non-monotonic dependence of cell adhesion on surface energy. In contrast, the PC12 cells were observed to adhere well to the patterned surfaces independent of the roughness ratio and wettability. These experimental findings are correlated with micromechanical measurements performed on the unstructured and replicated surfaces and discussed on the basis of previous observations describing the relation of cell response to surface energy and rigidity.

  16. Controlling cell adhesion via replication of laser micro/nano-textured surfaces on polymers

    International Nuclear Information System (INIS)

    Koufaki, Niki; Ranella, Anthi; Barberoglou, Marios; Psycharakis, Stylianos; Fotakis, Costas; Stratakis, Emmanuel; Aifantis, Katerina E

    2011-01-01

    The aim of this study is to investigate cell adhesion and viability on highly rough polymeric surfaces with gradient roughness ratios and wettabilities prepared by microreplication of laser micro/nano-textured Si surfaces. Negative replicas on polydimethylsiloxane as well as positive ones on a photocurable (organically modified ceramic) and a biodegradable (poly(lactide-co-glycolide)) polymer have been successfully reproduced. The final culture substrates, comprising forests of micron-sized conical spikes exhibiting a range of roughness ratios and wettabilities, were achieved by changing the laser fluence used to fabricate the original template surfaces. Cell culture experiments were performed with the fibroblast NIH/3T3 and PC12 neuronal cell lines in order to investigate how these surfaces are capable of modulating different types of cellular responses including, viability, adhesion and morphology. The results showed a preferential adhesion of both cell types on the microstructured surfaces compared to the unstructured ones. In particular, the fibroblast NIH/3T3 cells show optimal adhesion for small roughness ratios, independent of the surface wettability and polymer type, indicating a non-monotonic dependence of cell adhesion on surface energy. In contrast, the PC12 cells were observed to adhere well to the patterned surfaces independent of the roughness ratio and wettability. These experimental findings are correlated with micromechanical measurements performed on the unstructured and replicated surfaces and discussed on the basis of previous observations describing the relation of cell response to surface energy and rigidity.

  17. Intrinsic and extrinsic geometry of random surfaces

    International Nuclear Information System (INIS)

    Jonsson, T.

    1992-01-01

    We prove that the extrinsic Hausdorff dimension is always greater than or equal to the intrinsic Hausdorff dimension in models of triangulated random surfaces with action which is quadratic in the separation of vertices. We furthermore derive a few naive scaling relations which relate the intrinsic Hausdorff dimension to other critical exponents. These relations suggest that the intrinsic Hausdorff dimension is infinite if the susceptibility does not diverge at the critical point. (orig.)

  18. Hanging out with Which Friends? Friendship-Level Predictors of Unstructured and Unsupervised Socializing in Adolescence

    Science.gov (United States)

    Siennick, Sonja E.; Osgood, D. Wayne

    2012-01-01

    Companions are central to explanations of the risky nature of unstructured and unsupervised socializing, yet we know little about whom adolescents are with when hanging out. We examine predictors of how often friendship dyads hang out via multilevel analyses of longitudinal friendship-level data on over 5,000 middle schoolers. Adolescents hang out…

  19. Implicit Unstructured Computational Aerodynamics on Many-Integrated Core Architecture

    KAUST Repository

    Al Farhan, Mohammed A.

    2014-05-04

    This research aims to understand the performance of PETSc-FUN3D, a fully nonlinear implicit unstructured grid incompressible or compressible Euler code with origins at NASA and the U.S. DOE, on many-integrated core architecture and how a hybrid programming paradigm (MPI+OpenMP) can exploit Intel Xeon Phi hardware with upwards of 60 cores per node and 4 threads per core. For the current contribution, we focus on strong scaling with many-integrated core hardware. In most implicit PDE-based codes, while the linear algebraic kernel is limited by the bottleneck of memory bandwidth, the flux kernel arising in control volume discretization of the conservation law residuals and the preconditioner for the Jacobian exploits the Phi hardware well.

  20. An efficient unstructured WENO method for supersonic reactive flows

    Science.gov (United States)

    Zhao, Wen-Geng; Zheng, Hong-Wei; Liu, Feng-Jun; Shi, Xiao-Tian; Gao, Jun; Hu, Ning; Lv, Meng; Chen, Si-Cong; Zhao, Hong-Da

    2018-03-01

    An efficient high-order numerical method for supersonic reactive flows is proposed in this article. The reactive source term and convection term are solved separately by splitting scheme. In the reaction step, an adaptive time-step method is presented, which can improve the efficiency greatly. In the convection step, a third-order accurate weighted essentially non-oscillatory (WENO) method is adopted to reconstruct the solution in the unstructured grids. Numerical results show that our new method can capture the correct propagation speed of the detonation wave exactly even in coarse grids, while high order accuracy can be achieved in the smooth region. In addition, the proposed adaptive splitting method can reduce the computational cost greatly compared with the traditional splitting method.
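    A toy sketch of the reaction sub-step with adaptive time stepping is shown below; the one-step Arrhenius source term, the tolerance, and the forward-Euler sub-integrator are placeholders, and the convection step (handled by the unstructured WENO scheme in the paper) is omitted.

      import numpy as np

      def reaction_rhs(u, T):
          # Hypothetical one-step Arrhenius source term for a progress variable u in one cell.
          return 50.0 * (1.0 - u) * np.exp(-15.0 / T)

      def react_adaptive(u, T, dt, tol=1e-3):
          # Integrate the stiff reaction source over the splitting interval dt with
          # adaptive sub-steps that limit the change of u per sub-step.
          t = 0.0
          while t < dt:
              r = reaction_rhs(u, T)
              h = min(dt - t, tol / (abs(r) + 1e-12))
              u = min(1.0, u + h * r)          # forward-Euler sub-step, clipped to [0, 1]
              t += h
          return u

      print(react_adaptive(u=0.0, T=3.0, dt=1e-2))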

  1. Chequered surfaces and complex matrices

    International Nuclear Information System (INIS)

    Morris, T.R.; Southampton Univ.

    1991-01-01

    We investigate a large-N matrix model involving general complex matrices. It can be reinterpreted as a model of two hermitian matrices with specific couplings, and as a model of positive definite hermitian matrices. Large-N perturbation theory generates dynamical triangulations in which the triangles can be chequered (i.e. coloured so that neighbours are opposite colours). On a sphere there is a simple relation between such triangulations and those generated by the single hermitian matrix model. For the torus (and a quartic potential) we solve the counting problem for the number of triangulations that cannot be chequered. The critical physics of chequered triangulations is the same as that of the hermitian matrix model. We show this explicitly by solving non-perturbatively pure two-dimensional "chequered" gravity. The interpretative framework given here applies to a number of other generalisations of the hermitian matrix model. (orig.)

  2. A three-dimensional electrostatic particle-in-cell methodology on unstructured Delaunay-Voronoi grids

    International Nuclear Information System (INIS)

    Gatsonis, Nikolaos A.; Spirkin, Anton

    2009-01-01

    The mathematical formulation and computational implementation of a three-dimensional particle-in-cell methodology on unstructured Delaunay-Voronoi tetrahedral grids is presented. The method allows simulation of plasmas in complex domains and incorporates the duality of the Delaunay-Voronoi in all aspects of the particle-in-cell cycle. Charge assignment and field interpolation weighting schemes of zero- and first-order are formulated based on the theory of long-range constraints. Electric potential and fields are derived from a finite-volume formulation of Gauss' law using the Voronoi-Delaunay dual. Boundary conditions and the algorithms for injection, particle loading, particle motion, and particle tracking are implemented for unstructured Delaunay grids. Error and sensitivity analysis examines the effects of particles/cell, grid scaling, and timestep on the numerical heating, the slowing-down time, and the deflection times. The problem of current collection by cylindrical Langmuir probes in collisionless plasmas is used for validation. Numerical results compare favorably with previous numerical and analytical solutions for a wide range of probe radius to Debye length ratios, probe potentials, and electron to ion temperature ratios. The versatility of the methodology is demonstrated with the simulation of a complex plasma microsensor, a directional micro-retarding potential analyzer that includes a low transparency micro-grid.
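    A condensed sketch of first-order (linear) charge assignment on a tetrahedral Delaunay grid is given below, using scipy's barycentric transform; the node set, particle distribution and unit macro-particle charge are placeholders, and the division by dual (Voronoi) cell volumes needed to turn accumulated charge into a density is only noted in a comment.

      import numpy as np
      from scipy.spatial import Delaunay

      rng = np.random.default_rng(5)
      nodes = rng.random((50, 3))
      mesh = Delaunay(nodes)                        # Delaunay tetrahedralisation of the nodes
      particles = rng.random((1000, 3)) * 0.8 + 0.1
      q = 1.0                                       # charge carried by each macro-particle

      rho = np.zeros(len(nodes))                    # node-centred charge accumulator
      cells = mesh.find_simplex(particles)
      for p, c in zip(particles, cells):
          if c < 0:
              continue                              # particle outside the mesh
          verts = mesh.simplices[c]
          T = mesh.transform[c]                     # affine map to barycentric coordinates
          b = T[:3].dot(p - T[3])
          w = np.append(b, 1.0 - b.sum())           # first-order (linear) weights, sum to 1
          rho[verts] += q * w                       # scatter the charge to the four vertices

      # Dividing rho by the Voronoi (dual-cell) volume of each node would give the charge
      # density; field interpolation back to a particle reuses the same barycentric weights.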

  3. Error Correction of Measured Unstructured Road Profiles Based on Accelerometer and Gyroscope Data

    Directory of Open Access Journals (Sweden)

    Jinhua Han

    2017-01-01

    This paper describes a noncontact acquisition system composed of several time-synchronized laser height sensors, accelerometers, a gyroscope, and so forth, used to collect the road profiles experienced by a vehicle riding on unstructured roads. A method of correcting road profiles based on the accelerometer and gyroscope data is proposed to eliminate the adverse impacts of vehicle vibration and attitude change. Because the power spectral density (PSD) of the gyro attitudes concentrates in the low-frequency band, a method called frequency division is presented to divide the road profiles into two parts: a high-frequency part and a low-frequency part. The vibration error of the road profiles is corrected using displacement data obtained by double integration of the measured acceleration data. After building the mathematical model relating gyro attitudes and road profiles, the gyro attitude signals are separated from the low-frequency road profile by a sliding-block overlap method based on correlation analysis. The accuracy and limitations of the system have been analyzed, and its validity has been verified by implementing the system on wheeled equipment for road profile measurement at a vehicle testing ground. The paper offers an accurate and practical approach to obtaining unstructured road profiles for road simulation tests.
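    The vibration-correction idea (double integration of acceleration to displacement, which is then subtracted from the raw laser profile) can be sketched as below; the sampling rate, the synthetic acceleration signal, the zero placeholder profile, the detrending by mean removal and the sign convention are all assumptions.

      import numpy as np

      # Hypothetical accelerometer record: 500 Hz vertical acceleration of the sensor body.
      fs = 500.0
      t = np.arange(0.0, 10.0, 1.0 / fs)
      acc = 0.2 * np.sin(2 * np.pi * 1.5 * t) + 0.01 * np.random.default_rng(6).standard_normal(t.size)

      def integrate_detrended(sig, fs):
          # Cumulative trapezoidal integration with mean removal, to limit drift.
          sig = sig - sig.mean()
          out = np.concatenate(([0.0], np.cumsum(0.5 * (sig[1:] + sig[:-1]) / fs)))
          return out - out.mean()

      disp = integrate_detrended(integrate_detrended(acc, fs), fs)   # body displacement
      raw_profile = np.zeros_like(t)          # laser height readings (placeholder signal)
      corrected = raw_profile - disp          # remove the vehicle-vibration component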

  4. Modification of the laser triangulation method for measuring the thickness of optical layers

    Science.gov (United States)

    Khramov, V. N.; Adamov, A. A.

    2018-04-01

    The problem of determining the thickness of thin films by the method of laser triangulation is considered. An expression is derived relating the film thickness to the distance between the focused beams on the photodetector. The chosen method is applicable for measuring thicknesses in the range [0.1; 1] mm. Two individual light marks could be resolved down to a minimum film thickness of 0.23 mm; with the help of computer processing of the photographs, a resolution of 0.10 mm was achieved. The obtained results can be used in ophthalmology for express diagnostics during surgical operations on the corneal layer.

  5. Development of the delayed-neutron triangulation technique for locating failed fuel in LMFBR

    International Nuclear Information System (INIS)

    Kryter, R.C.

    1975-01-01

    Two major accomplishments of the ORNL delayed neutron triangulation program are (1) an analysis of anticipated detector counting rates and sensitivities to unclad fuel and erosion types of pin failure, and (2) an experimental assessment of the accuracy with which the position of failed fuel can be determined in the FFTF (this was performed in a quarter-scale water mockup of realistic outlet plenum geometry using electrolyte injections and conductivity cells to simulate delayed-neutron precursor releases and detections, respectively). The major results and conclusions from these studies are presented, along with plans for further DNT development work at ORNL for the FFTF and CRBR. (author)

  6. The structure of chromatic polynomials of planar triangulations and implications for chromatic zeros and asymptotic limiting quantities

    International Nuclear Information System (INIS)

    Shrock, Robert; Xu Yan

    2012-01-01

    We present an analysis of the structure and properties of chromatic polynomials P(G_{pt,m}, q) of one-parameter and multi-parameter families of planar triangulation graphs G_{pt,m}, where m = (m_1, ..., m_p) is a vector of integer parameters. We use these to study the ratio of |P(G_{pt,m}, τ+1)| to the Tutte upper bound (τ − 1)^{n−5}, where τ = (1+√5)/2 and n is the number of vertices in G_{pt,m}. In particular, we calculate limiting values of this ratio as n → ∞ for various families of planar triangulations. We also use our calculations to analyze zeros of these chromatic polynomials. We study a large class of families G_{pt,m} with p = 1 and p = 2 and show that these have a structure of the form P(G_{pt,m}, q) = c_{G_pt,1} λ_1^m + c_{G_pt,2} λ_2^m + c_{G_pt,3} λ_3^m for p = 1, where λ_1 = q − 2, λ_2 = q − 3, and λ_3 = −1, and P(G_{pt,m}, q) = Σ_{i_1=1}^{3} Σ_{i_2=1}^{3} c_{G_pt,i_1 i_2} λ_{i_1}^{m_1} λ_{i_2}^{m_2} for p = 2. We derive properties of the coefficients c_{G_pt,i} and show that P(G_{pt,m}, q) has a real chromatic zero that approaches (1/2)(3+√5) as one or more of the m_i → ∞. The generalization to p ⩾ 3 is given. Further, we present a one-parameter family of planar triangulations with real zeros that approach 3 from below as m → ∞. Implications for the ground-state entropy of the Potts antiferromagnet are discussed. (paper)
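
    For readability, the structural form quoted above can be restated in display notation; this is only a re-typesetting of the abstract's formulas, nothing is added.

```latex
% p = 1 families: a sum over three eigenvalue terms
P(G_{pt,m},q) = \sum_{i=1}^{3} c_{G_{pt},i}\,\lambda_i^{m},
\qquad \lambda_1 = q-2,\quad \lambda_2 = q-3,\quad \lambda_3 = -1,
% p = 2 families: a double sum over the same eigenvalues
P(G_{pt,\vec m},q) = \sum_{i_1=1}^{3}\sum_{i_2=1}^{3}
    c_{G_{pt},i_1 i_2}\,\lambda_{i_1}^{m_1}\,\lambda_{i_2}^{m_2}.
```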

  7. Accuracy of an unstructured-grid upwind-Euler algorithm for the ONERA M6 wing

    Science.gov (United States)

    Batina, John T.

    1991-01-01

    Improved algorithms for the solution of the three-dimensional, time-dependent Euler equations are presented for aerodynamic analysis involving unstructured dynamic meshes. The improvements, developed recently, concern the spatial and temporal discretizations used by unstructured-grid flow solvers. The spatial discretization involves a flux-split approach that is naturally dissipative and captures shock waves sharply, with at most one grid point within the shock structure. The temporal discretization involves either an explicit time-integration scheme using a multistage Runge-Kutta procedure or an implicit time-integration scheme using a Gauss-Seidel relaxation procedure, which is computationally efficient for either steady or unsteady flow problems. With the implicit Gauss-Seidel procedure, very large time steps may be used for rapid convergence to steady state, and the step size for unsteady cases may be selected for temporal accuracy rather than for numerical stability. Steady flow results are presented for both the NACA 0012 airfoil and the Office National d'Etudes et de Recherches Aerospatiales M6 wing to demonstrate applications of the new Euler solvers. The paper presents a description of the Euler solvers along with results and comparisons that assess their capability.
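
    As a schematic of the explicit option mentioned above, a multistage Runge-Kutta update of a cell-centred finite-volume state can be written as follows. This is a generic sketch, not the paper's solver; the residual function, stage coefficients and time step are assumptions.

```python
import numpy as np

def rk_multistage(u, dt, residual, alphas=(0.25, 0.5, 1.0)):
    """Generic low-storage multistage Runge-Kutta step.

    u        : array of conserved variables per cell
    residual : function returning R(u), i.e. minus the cell flux balance / volume
    alphas   : assumed stage coefficients (3-stage example)
    """
    u0 = u.copy()
    for a in alphas:
        u = u0 + a * dt * residual(u)
    return u

# Toy usage with a linear decay "residual" standing in for the flux balance.
u = np.ones(100)
u = rk_multistage(u, dt=0.01, residual=lambda v: -v)
```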

  8. Assessment of behavioral changes associated with oral meloxicam administration at time of dehorning in calves using a remote triangulation device and accelerometers

    Directory of Open Access Journals (Sweden)

    Theurer Miles E

    2012-04-01

    Full Text Available Abstract Background Dehorning is common in the cattle industry, and there is a need for research evaluating pain mitigation techniques. The objective of this study was to determine the effects of oral meloxicam, a non-steroidal anti-inflammatory, on cattle behavior post-dehorning by monitoring the percent of time spent standing, walking, and lying in specific locations within the pen using accelerometers and a remote triangulation device. Twelve calves approximately ten weeks of age were randomized into 2 treatment groups (meloxicam or control) in a complete block design by body weight. Six calves were orally administered 0.5 mg/kg meloxicam at the time of dehorning and six calves served as negative controls. All calves were dehorned using thermocautery, and the behavior of each calf was continuously monitored for 7 days after dehorning using accelerometers and a remote triangulation device. Accelerometers monitored lying behavior, and the remote triangulation device was used to monitor each calf’s movement within the pen. Results Analysis of behavioral data revealed significant interactions between treatment (meloxicam vs. control) and the number of days post dehorning. Calves that received meloxicam spent more time at the grain bunk on trial days 2 and 6 post-dehorning; spent more time lying down on days 1, 2, 3, and 4; and less time at the hay feeder on days 0 and 1 compared to the control group. Meloxicam calves tended to walk more at the beginning and end of the trial compared to the control group. By day 5, the meloxicam and control groups exhibited similar behaviors. Conclusions The noted behavioral changes provide evidence of differences associated with meloxicam administration. More studies need to be performed to evaluate the relationship of behavior monitoring and post-operative pain. To our knowledge this is the first published report demonstrating behavioral changes following dehorning using a remote triangulation device in conjunction

  9. TRIANGULATION OF THE INTERSTELLAR MAGNETIC FIELD

    Energy Technology Data Exchange (ETDEWEB)

    Schwadron, N. A.; Moebius, E. [University of New Hampshire, Durham, NH 03824 (United States); Richardson, J. D. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Burlaga, L. F. [Goddard Space Flight Center, Greenbelt, MD 20771 (United States); McComas, D. J. [Southwest Research Institute, San Antonio, TX 78228 (United States)

    2015-11-01

    Determining the direction of the local interstellar magnetic field (LISMF) is important for understanding the heliosphere’s global structure, the properties of the interstellar medium, and the propagation of cosmic rays in the local galactic medium. Measurements of interstellar neutral atoms by Ulysses for He and by SOHO/SWAN for H provided some of the first observational insights into the LISMF direction. Because secondary neutral H is partially deflected by the interstellar flow in the outer heliosheath and this deflection is influenced by the LISMF, the relative deflection of H versus He provides a plane, the so-called B–V plane, in which the LISMF direction should lie. Interstellar Boundary Explorer (IBEX) subsequently discovered a ribbon, the center of which is conjectured to be the LISMF direction. The most recent He velocity measurements from IBEX and those from Ulysses yield a B–V plane with uncertainty limits that contain the centers of the IBEX ribbon at 0.7–2.7 keV. The possibility that Voyager 1 has moved into the outer heliosheath now suggests that Voyager 1's direct observations provide another independent determination of the LISMF. We show that the LISMF direction measured by Voyager 1 is >40° off from the IBEX ribbon center and the B–V plane. Taking into account the temporal gradient of the field direction measured by Voyager 1, we extrapolate to a field direction that passes directly through the IBEX ribbon center (0.7–2.7 keV) and the B–V plane, allowing us to triangulate the LISMF direction and estimate the gradient scale size of the magnetic field.

  10. Unstructured socialization and territorialization. A street-ethnographic take on urban youth in a medium-sized town in Denmark

    DEFF Research Database (Denmark)

    Gravesen, David Thore; Frostholm, Peter Hornbæk

    In 2013, the municipality of Horsens, a medium-sized provincial town in Denmark, bestowed on the city's children and young people a skater/parkour/ball-cage facility right on the city's central squares. The facility serves as a territorial meeting place for a number of conflicting groups of adolescents with different codes of behavior based on their cultural orientation and sense of belonging to certain districts of the city. Through positioning battles of various kinds, the groups fight for space and place for their unstructured socialization processes with their peers. Officially, the municipality donated the facility to give local children and young people an opportunity...

  11. The Extraction of Road Boundary from Crowdsourcing Trajectory Using Constrained Delaunay Triangulation

    Directory of Open Access Journals (Sweden)

    YANG Wei

    2017-02-01

    Full Text Available Accurately extracting road boundaries from crowdsourced trajectory lines is still difficult. This study therefore presents a new approach that uses vehicle trajectory lines to extract road boundaries. First, a constrained Delaunay triangulation is constructed within the interpolated track lines, and road boundary descriptors are calculated from triangle edge lengths and Voronoi cells. A road boundary recognition model is established by integrating the two boundary descriptors. Then, based on seed polygons, a region-growing method is proposed to extract the road boundary. Finally, taxi GPS traces in Beijing were used to verify the validity of the novel method, and the results showed that the method is suitable for GPS traces with disparate density, complex road structure and different time intervals.
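
    The edge-length descriptor used above can be illustrated with a plain (unconstrained) Delaunay triangulation: triangles with long edges tend to bridge gaps beside or across the road and can be discarded, and the edges left on the rim of the remaining triangles approximate the boundary. The sketch below uses scipy rather than a constrained triangulation, and the point set and threshold are assumptions made only for the example.

```python
import numpy as np
from scipy.spatial import Delaunay
from collections import Counter

pts = np.random.rand(2000, 2) * 100.0          # assumed interpolated GPS track points
tri = Delaunay(pts)
max_edge = 5.0                                  # assumed edge-length threshold [m]

edge_count = Counter()
for s in tri.simplices:
    edges = [(s[0], s[1]), (s[1], s[2]), (s[2], s[0])]
    # keep only "small" triangles, i.e. those likely lying on the road surface
    if all(np.linalg.norm(pts[a] - pts[b]) <= max_edge for a, b in edges):
        for a, b in edges:
            edge_count[tuple(sorted((a, b)))] += 1

# edges used by exactly one kept triangle approximate the road boundary
boundary_edges = [e for e, c in edge_count.items() if c == 1]
```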

  12. Managing Competition “Unstructured Decision Making” Benefits for Executives, Enterprise and Society

    Directory of Open Access Journals (Sweden)

    Carlos Ernesto Martín-Pérez

    2016-06-01

    Full Text Available In an increasingly uncertain and changing environment, the success of decisions, and thus the attainment of the expected results, is limited. Hence the need to manage professional competences in executives, as these encourage and contribute to job performance. For this reason, we discuss the impact of managing the competence of "unstructured decision making" for executives, enterprises and Cuban society. The results are expressed in a description of the process and of the advantages that executives, enterprises and society in general could obtain.

  13. An Interpreted Language and System for the Visualization of Unstructured Meshes

    Science.gov (United States)

    Moran, Patrick J.; Gerald-Yamasaki, Michael (Technical Monitor)

    1998-01-01

    We present an interpreted language and system supporting the visualization of unstructured meshes and the manipulation of shapes defined in terms of mesh subsets. The language features primitives inspired by geometric modeling, mathematical morphology and algebraic topology. The adaptation of the topology ideas to an interpreted environment, along with support for programming constructs such as user function definition, provides a flexible system for analyzing a mesh and for calculating with shapes defined in terms of the mesh. We present results demonstrating some of the capabilities of the language, based on an implementation called the Shape Calculator, for tetrahedral meshes in R^3.

  14. A positional estimation technique for an autonomous land vehicle in an unstructured environment

    Science.gov (United States)

    Talluri, Raj; Aggarwal, J. K.

    1990-01-01

    This paper presents a solution to the positional estimation problem of an autonomous land vehicle navigating in an unstructured mountainous terrain. A Digital Elevation Map (DEM) of the area in which the robot is to navigate is assumed to be given. It is also assumed that the robot is equipped with a camera that can be panned and tilted, and a device to measure the elevation of the robot above the ground surface. No recognizable landmarks are assumed to be present in the environment in which the robot is to navigate. The solution presented makes use of the DEM information, and structures the problem as a heuristic search in the DEM for the possible robot location. The shape and position of the horizon line in the image plane and the known camera geometry of the perspective projection are used as parameters to search the DEM. Various heuristics drawn from the geometric constraints are used to prune the search space significantly. The algorithm is made robust to errors in the imaging process by accounting for worst-case errors. The approach is tested using DEM data of areas in Colorado and Texas. The method is suitable for use in outdoor mobile robots and planetary rovers.
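
    The heuristic search described above scores candidate robot positions by how well the horizon predicted from the DEM matches the horizon extracted from the image. A minimal version of the prediction step is sketched below; the DEM, grid spacing, camera height and ray-sampling parameters are assumptions, not the paper's values.

```python
import numpy as np

def horizon_profile(dem, cell, cam_xy, cam_h, n_az=90, max_r=2000.0, dr=10.0):
    """Maximum elevation angle seen from cam_xy in each azimuth direction.

    dem   : 2D array of terrain heights [m], cell = grid spacing [m]
    cam_h : camera height above the terrain at cam_xy [m]
    """
    x0, y0 = cam_xy
    z0 = dem[int(y0 / cell), int(x0 / cell)] + cam_h
    angles = np.zeros(n_az)
    for k, az in enumerate(np.linspace(0.0, 2 * np.pi, n_az, endpoint=False)):
        best = -np.pi / 2
        for r in np.arange(dr, max_r, dr):
            i = int((y0 + r * np.sin(az)) / cell)
            j = int((x0 + r * np.cos(az)) / cell)
            if 0 <= i < dem.shape[0] and 0 <= j < dem.shape[1]:
                best = max(best, np.arctan2(dem[i, j] - z0, r))
        angles[k] = best
    return angles

# Candidate positions can then be ranked, e.g., by the sum of squared differences
# between this predicted profile and the horizon measured in the image.
dem = np.random.rand(200, 200) * 50.0          # assumed synthetic DEM
profile = horizon_profile(dem, cell=30.0, cam_xy=(3000.0, 3000.0), cam_h=2.0)
```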

  15. 5D ¹³C-detected experiments for backbone assignment of unstructured proteins with a very low signal dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Novacek, Jiri [Masaryk University, Faculty of Science, NCBR, and CEITEC (Czech Republic); Zawadzka-Kazimierczuk, Anna [University of Warsaw, Faculty of Chemistry (Poland); Papouskova, Veronika; Zidek, Lukas, E-mail: lzidek@chemi.muni.cz [Masaryk University, Faculty of Science, NCBR, and CEITEC (Czech Republic); Sanderova, Hana; Krasny, Libor [Institute of Microbiology, Academy of Sciences of the Czech Republic, Laboratory of Molecular Genetics of Bacteria and Department of Bacteriology (Czech Republic); Kozminski, Wiktor [University of Warsaw, Faculty of Chemistry (Poland); Sklenar, Vladimir [Masaryk University, Faculty of Science, NCBR, and CEITEC (Czech Republic)

    2011-05-15

    Two novel 5D NMR experiments (CACONCACO, NCOCANCO) for backbone assignment of disordered proteins are presented. The pulse sequences exploit relaxation properties of the unstructured proteins and combine the advantages of ¹³C-direct detection, non-uniform sampling, and longitudinal relaxation optimization to maximize the achievable resolution and minimize the experimental time. The pulse sequences were successfully tested on a sample of the partially disordered delta subunit of RNA polymerase from Bacillus subtilis. The unstructured part of this 20 kDa protein consists of 81 amino acids with frequent sequential repeats. A collection of 0.0003% of the data needed for a conventional experiment with linear sampling was sufficient to perform an unambiguous assignment of the disordered part of the protein from a single 5D spectrum.

  16. Assessment of the Unstructured Grid Software TetrUSS for Drag Prediction of the DLR-F4 Configuration

    Science.gov (United States)

    Pirzadeh, Shahyar Z.; Frink, Neal T.

    2002-01-01

    An application of the NASA unstructured grid software system TetrUSS is presented for the prediction of aerodynamic drag on a transport configuration. The paper briefly describes the underlying methodology and summarizes the results obtained on the DLR-F4 transport configuration recently presented in the first AIAA computational fluid dynamics (CFD) Drag Prediction Workshop. TetrUSS is a suite of loosely coupled unstructured grid CFD codes developed at the NASA Langley Research Center. The meshing approach is based on the advancing-front and the advancing-layers procedures. The flow solver employs a cell-centered, finite volume scheme for solving the Reynolds Averaged Navier-Stokes equations on tetrahedral grids. For the present computations, flow in the viscous sublayer has been modeled with an analytical wall function. The emphasis of the paper is placed on the practicality of the methodology for accurately predicting aerodynamic drag data.

  17. A matrix-free implicit unstructured multigrid finite volume method for simulating structural dynamics and fluid structure interaction

    Science.gov (United States)

    Lv, X.; Zhao, Y.; Huang, X. Y.; Xia, G. H.; Su, X. H.

    2007-07-01

    A new three-dimensional (3D) matrix-free implicit unstructured multigrid finite volume (FV) solver for structural dynamics is presented in this paper. The solver is first validated using classical 2D and 3D cantilever problems. It is shown that very accurate predictions of the fundamental natural frequencies of the problems can be obtained by the solver with fast convergence rates. This method has been integrated into our existing FV compressible solver [X. Lv, Y. Zhao, et al., An efficient parallel/unstructured-multigrid preconditioned implicit method for simulating 3d unsteady compressible flows with moving objects, Journal of Computational Physics 215(2) (2006) 661-690] based on the immersed membrane method (IMM) [X. Lv, Y. Zhao, et al., as mentioned above]. Results for the interaction between the fluid and an immersed fixed-free cantilever are also presented to demonstrate the potential of this integrated fluid-structure interaction approach.

  18. Learning to Take an Inquiry Stance in Teacher Research: An Exploration of Unstructured Thought-Partner Spaces

    Science.gov (United States)

    Lawton-Sticklor, Nastasia; Bodamer, Scott F.

    2016-01-01

    This article explores a research partnership between a university-based researcher and a middle school science teacher. Our partnership began with project-based inquiry and continued with unstructured thought-partner spaces: meetings with no agenda where we wrestled with problems of practice. Framed as incubation periods, these meetings allowed us…

  19. A Survey on Methods for Reconstructing Surfaces from Unorganized Point Sets

    Directory of Open Access Journals (Sweden)

    Vilius Matiukas

    2011-08-01

    Full Text Available This paper addresses the issue of reconstructing and visualizing surfaces from unorganized point sets. These can be acquired using different techniques, such as 3D laser scanning, computerized tomography, magnetic resonance imaging and multi-camera imaging. The problem of reconstructing surfaces from unorganized point sets is common to many diverse areas, including computer graphics, computer vision, computational geometry and reverse engineering. The paper presents three alternative methods that all use variations in complementary cones to triangulate and reconstruct the tested 3D surfaces. The article evaluates and contrasts the three alternatives. Article in English
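
    As a very small illustration of the general problem (not of the complementary-cone methods compared in the survey), a 2.5D point set can be triangulated by projecting the points onto the xy-plane and running a standard Delaunay triangulation; the synthetic height field below is an assumption made only for the example.

```python
import numpy as np
from scipy.spatial import Delaunay

# Synthetic unorganized points sampled from a height field z = f(x, y).
xy = np.random.rand(1000, 2) * 10.0
z = np.sin(xy[:, 0]) * np.cos(xy[:, 1])
points = np.column_stack([xy, z])

# Triangulate in the projection plane; each simplex indexes three 3D points,
# so tri.simplices together with `points` defines a triangulated surface.
tri = Delaunay(xy)
triangles = points[tri.simplices]        # shape (n_triangles, 3, 3)
```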

  20. Euclidean Dynamical Triangulation revisited: is the phase transition really 1st order?

    International Nuclear Information System (INIS)

    Rindlisbacher, Tobias; Forcrand, Philippe de

    2015-01-01

    The transition between the two phases of 4D Euclidean Dynamical Triangulation (http://dx.doi.org/10.1016/0370-2693(92)90709-D) was long believed to be of second order until in 1996 first-order behavior was found for sufficiently large systems (http://dx.doi.org/10.1016/0550-3213(96)00214-3, http://dx.doi.org/10.1016/S0370-2693(96)01277-4). However, one may wonder if this finding was affected by the numerical methods used: to control volume fluctuations, in both studies (http://dx.doi.org/10.1016/0550-3213(96)00214-3, http://dx.doi.org/10.1016/S0370-2693(96)01277-4) an artificial harmonic potential was added to the action, and in (http://dx.doi.org/10.1016/S0370-2693(96)01277-4) measurements were taken after a fixed number of accepted instead of attempted moves, which introduces an additional error. Finally the simulations suffer from strong critical slowing down, which may have been underestimated. In the present work, we address the above weaknesses: we allow the volume to fluctuate freely within a fixed interval; we take measurements after a fixed number of attempted moves; and we overcome critical slowing down by using an optimized parallel tempering algorithm (http://dx.doi.org/10.1088/1742-5468/2010/01/P01020). With these improved methods, on systems of size up to N_4 = 64k 4-simplices, we confirm that the phase transition is 1st order. In addition, we discuss a local criterion to decide whether parts of a triangulation are in the elongated or crumpled state and describe a new correspondence between EDT and the balls-in-boxes model. The latter gives rise to a modified partition function with an additional, third coupling. Finally, we propose and motivate a class of modified path-integral measures that might remove the metastability of the Markov chain and turn the phase transition into 2nd order.
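
    The parallel-tempering ingredient mentioned above exchanges configurations between replicas run at neighbouring couplings, with the usual Metropolis swap acceptance. The sketch below is generic; the coupling ladder and the replica "actions" are placeholders, not the EDT action.

```python
import numpy as np

rng = np.random.default_rng(0)

def try_swap(couplings, actions, i):
    """Metropolis swap between replicas i and i+1.

    Accept with probability min(1, exp[(k_i - k_{i+1}) * (S_i - S_{i+1})]),
    which leaves the joint distribution prod_j exp(-k_j S_j) invariant.
    """
    d = (couplings[i] - couplings[i + 1]) * (actions[i] - actions[i + 1])
    if rng.random() < min(1.0, np.exp(d)):
        actions[i], actions[i + 1] = actions[i + 1], actions[i]
        return True
    return False

couplings = np.linspace(1.0, 2.0, 8)      # assumed ladder of couplings
actions = rng.normal(100.0, 5.0, 8)       # placeholder "actions" of the replicas
accepted = [try_swap(couplings, actions, i) for i in range(7)]
```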

  1. An Efficient and Robust Method for Lagrangian Magnetic Particle Tracking in Fluid Flow Simulations on Unstructured Grids

    NARCIS (Netherlands)

    Cohen Stuart, D.C.; Kleijn, C.R.; Kenjeres, S.

    2010-01-01

    In this paper we report on a newly developed particle tracking scheme for fluid flow simulations on 3D unstructured grids, aiming to provide detailed insights in the particle behaviour in complex geometries. A possible field of applications is the Magnetic Drug Targeting (MDT) technique, on which

  2. Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.

    Science.gov (United States)

    Renz, Susan M; Carrington, Jane M; Badger, Terry A

    2018-04-01

    The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article will describe a novel process of combining two methods of qualitative data analysis, or Intramethod triangulation, as a means to provide a deeper analysis of text.

  3. Electromagnetic forward modelling for realistic Earth models using unstructured tetrahedral meshes and a meshfree approach

    Science.gov (United States)

    Farquharson, C.; Long, J.; Lu, X.; Lelievre, P. G.

    2017-12-01

    Real-life geology is complex, and so, even when allowing for the diffusive, low resolution nature of geophysical electromagnetic methods, we need Earth models that can accurately represent this complexity when modelling and inverting electromagnetic data. This is particularly the case for the scales, detail and conductivity contrasts involved in mineral and hydrocarbon exploration and development, but also for the larger scale of lithospheric studies. Unstructured tetrahedral meshes provide a flexible means of discretizing a general, arbitrary Earth model. This is important when wanting to integrate a geophysical Earth model with a geological Earth model parameterized in terms of surfaces. Finite-element and finite-volume methods can be derived for computing the electric and magnetic fields in a model parameterized using an unstructured tetrahedral mesh. A number of such variants have been proposed and have proven successful. However, the efficiency and accuracy of these methods can be affected by the "quality" of the tetrahedral discretization, that is, how many of the tetrahedral cells in the mesh are long, narrow and pointy. This is particularly the case if one wants to use an iterative technique to solve the resulting linear system of equations. One approach to deal with this issue is to develop sophisticated model and mesh building and manipulation capabilities in order to ensure that any mesh built from geological information is of sufficient quality for the electromagnetic modelling. Another approach is to investigate other methods of synthesizing the electromagnetic fields. One such example is a "meshfree" approach in which the electromagnetic fields are synthesized using a mesh that is distinct from the mesh used to parameterized the Earth model. There are then two meshes, one describing the Earth model and one used for the numerical mathematics of computing the fields. This means that there are no longer any quality requirements on the model mesh, which

  4. An unstructured shock-fitting solver for hypersonic plasma flows in chemical non-equilibrium

    Science.gov (United States)

    Pepe, R.; Bonfiglioli, A.; D'Angola, A.; Colonna, G.; Paciorri, R.

    2015-11-01

    A CFD solver, using Residual Distribution Schemes on unstructured grids, has been extended to deal with inviscid chemical non-equilibrium flows. The conservative equations have been coupled with a kinetic model for argon plasma which includes the argon metastable state as an independent species, taking into account electron-atom and atom-atom processes. Results for the case of a hypersonic flow around an infinite cylinder, obtained by using both shock-capturing and shock-fitting approaches, show the higher accuracy of the shock-fitting approach.

  5. Energy transfer in structured and unstructured environments: Master equations beyond the Born-Markov approximations

    Energy Technology Data Exchange (ETDEWEB)

    Iles-Smith, Jake, E-mail: Jakeilessmith@gmail.com [Controlled Quantum Dynamics Theory, Imperial College London, London SW7 2PG (United Kingdom); Photon Science Institute and School of Physics and Astronomy, The University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Department of Photonics Engineering, DTU Fotonik, Ørsteds Plads, 2800 Kongens Lyngby (Denmark); Dijkstra, Arend G. [Max Planck Institute for the Structure and Dynamics of Matter, Luruper Chaussee 149, 22761 Hamburg (Germany); Lambert, Neill [CEMS, RIKEN, Saitama 351-0198 (Japan); Nazir, Ahsan, E-mail: ahsan.nazir@manchester.ac.uk [Photon Science Institute and School of Physics and Astronomy, The University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom)

    2016-01-28

    We explore excitonic energy transfer dynamics in a molecular dimer system coupled to both structured and unstructured oscillator environments. By extending the reaction coordinate master equation technique developed by Iles-Smith et al. [Phys. Rev. A 90, 032114 (2014)], we go beyond the commonly used Born-Markov approximations to incorporate system-environment correlations and the resultant non-Markovian dynamical effects. We obtain energy transfer dynamics for both underdamped and overdamped oscillator environments that are in perfect agreement with the numerical hierarchical equations of motion over a wide range of parameters. Furthermore, we show that the Zusman equations, which may be obtained in a semiclassical limit of the reaction coordinate model, are often incapable of describing the correct dynamical behaviour. This demonstrates the necessity of properly accounting for quantum correlations generated between the system and its environment when the Born-Markov approximations no longer hold. Finally, we apply the reaction coordinate formalism to the case of a structured environment comprising both underdamped (i.e., sharply peaked) and overdamped (broad) components simultaneously. We find that though an enhancement of the dimer energy transfer rate can be obtained when compared to an unstructured environment, its magnitude is rather sensitive to both the dimer-peak resonance conditions and the relative strengths of the underdamped and overdamped contributions.

  6. Commutative discrete filtering on unstructured grids based on least-squares techniques

    International Nuclear Information System (INIS)

    Haselbacher, Andreas; Vasilyev, Oleg V.

    2003-01-01

    The present work is concerned with the development of commutative discrete filters for unstructured grids and contains two main contributions. First, building on the work of Marsden et al. [J. Comp. Phys. 175 (2002) 584], a new commutative discrete filter based on least-squares techniques is constructed. Second, a new analysis of the discrete commutation error is carried out. The analysis indicates that the discrete commutation error is not only dependent on the number of vanishing moments of the filter weights, but also on the order of accuracy of the discrete gradient operator. The results of the analysis are confirmed by grid-refinement studies
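
    The least-squares construction can be illustrated in one dimension: choose filter weights over a stencil so that prescribed moment conditions are met, taking the minimum-norm solution of the underdetermined system. This is a schematic of the idea only; the stencil and target moments are assumptions, not the filters derived in the paper.

```python
import numpy as np

# Stencil offsets around a node of an (irregular) 1D grid.
x = np.array([-1.3, -0.6, 0.0, 0.7, 1.5])      # assumed node positions

# Moment conditions: weights sum to 1 (zeroth moment), the first moment
# vanishes, and the second moment takes a target value that sets the
# filter width.
target = np.array([1.0, 0.0, 0.4])              # assumed (m0, m1, m2)
V = np.vander(x, N=3, increasing=True).T        # rows: x**0, x**1, x**2

# Minimum-norm weights satisfying the moment conditions in a least-squares sense.
w, *_ = np.linalg.lstsq(V, target, rcond=None)
```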

  7. LanguageNet: A Novel Framework for Processing Unstructured Text Information

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    In this paper we present LanguageNet, a novel framework for processing unstructured text information from human generated content. The state-of-the-art information processing frameworks have some shortcomings: they are modeled in generalized form, trained on fixed (limited) data sets, and leave the specialization necessary for information consolidation to the end users. The proposed framework is the first major attempt to address these shortcomings. LanguageNet provides extended support of graphical methods, contributing added value to the capabilities of information processing. We discuss the benefits of the framework and compare it with the available state of the art. We also describe how the framework improves the information gathering process and contributes towards building systems with better performance in the domain of Open Source Intelligence.

  8. Branches of Triangulated Origami Near the Unfolded State

    Directory of Open Access Journals (Sweden)

    Bryan Gin-ge Chen

    2018-02-01

    Full Text Available Origami structures are characterized by a network of folds and vertices joining unbendable plates. For applications to mechanical design and self-folding structures, it is essential to understand the interplay between the set of folds in the unfolded origami and the possible 3D folded configurations. When deforming a structure that has been folded, one can often linearize the geometric constraints, but the degeneracy of the unfolded state makes a linear approach impossible there. We derive a theory for the second-order infinitesimal rigidity of an initially unfolded triangulated origami structure and use it to study the set of nearly unfolded configurations of origami with four boundary vertices. We find that locally, this set consists of a number of distinct “branches” which intersect at the unfolded state, and that the number of these branches is exponential in the number of vertices. We find numerical and analytical evidence that suggests that the branches are characterized by choosing each internal vertex to either “pop up” or “pop down.” The large number of pathways along which one can fold an initially unfolded origami structure strongly indicates that a generic structure is likely to become trapped in a “misfolded” state. Thus, new techniques for creating self-folding origami are likely necessary; controlling the popping state of the vertices may be one possibility.

  10. A superlinearly convergent finite volume method for the incompressible Navier-Stokes equations on staggered unstructured grids

    International Nuclear Information System (INIS)

    Vidovic, D.; Segal, A.; Wesseling, P.

    2004-01-01

    A method for linear reconstruction of staggered vector fields with special treatment of the divergence is presented. An upwind-biased finite volume scheme for solving the unsteady incompressible Navier-Stokes equations on staggered unstructured triangular grids that uses this reconstruction is described. The scheme is applied to three benchmark problems and is found to be superlinearly convergent in space

  11. Discrepancies between qualitative and quantitative evaluation of randomised controlled trial results: achieving clarity through mixed methods triangulation.

    Science.gov (United States)

    Tonkin-Crine, Sarah; Anthierens, Sibyl; Hood, Kerenza; Yardley, Lucy; Cals, Jochen W L; Francis, Nick A; Coenen, Samuel; van der Velden, Alike W; Godycki-Cwirko, Maciek; Llor, Carl; Butler, Chris C; Verheij, Theo J M; Goossens, Herman; Little, Paul

    2016-05-12

    Mixed methods are commonly used in health services research; however, data are not often integrated to explore complementarity of findings. A triangulation protocol is one approach to integrating such data. A retrospective triangulation protocol was carried out on mixed methods data collected as part of a process evaluation of a trial. The multi-country randomised controlled trial found that a web-based training in communication skills (including use of a patient booklet) and the use of a C-reactive protein (CRP) point-of-care test decreased antibiotic prescribing by general practitioners (GPs) for acute cough. The process evaluation investigated GPs' and patients' experiences of taking part in the trial. Three analysts independently compared findings across four data sets: qualitative data collected via semi-structured interviews with (1) 62 patients and (2) 66 GPs and quantitative data collected via questionnaires with (3) 2886 patients and (4) 346 GPs. Pairwise comparisons were made between data sets and were categorised as agreement, partial agreement, dissonance or silence. Three instances of dissonance occurred in 39 independent findings. GPs and patients reported different views on the use of a CRP test. GPs felt that the test was useful in convincing patients to accept a no-antibiotic decision, but patient data suggested that this was unnecessary if a full explanation was given. Whilst qualitative data indicated all patients were generally satisfied with their consultation, quantitative data indicated highest levels of satisfaction for those receiving a detailed explanation from their GP with a booklet giving advice on self-care. Both qualitative and quantitative data sets indicated higher patient enablement for those in the communication groups who had received a booklet. Use of CRP tests does not appear to engage patients or influence illness perceptions and its effect is more centred on changing clinician behaviour. Communication skills and the patient

  12. elsA-Hybrid: an all-in-one structured/unstructured solver for the simulation of internal and external flows. Application to turbomachinery

    Science.gov (United States)

    de la Llave Plata, M.; Couaillier, V.; Le Pape, M.-C.; Marmignon, C.; Gazaix, M.

    2013-03-01

    This paper reports recent work on the extension of the multiblock structured solver elsA to deal with hybrid grids. The new hybrid-grid solver, called elsA-H (elsA-Hybrid), is based on a new unstructured-grid module that has been built within the original elsA CFD (computational fluid dynamics) system. The implementation benefits from the flexibility of the object-oriented design. The aim of elsA-H is to take advantage of the full potential of structured solvers and unstructured mesh generation by allowing any type of grid to be used within the same simulation process. The main challenge lies in the numerical treatment of the hybrid-grid interfaces where blocks of different types meet. In particular, one must pay attention to the transfer of information across these boundaries, so that the accuracy of the numerical scheme is preserved and flux conservation is guaranteed. In this paper, the numerical approach that allows this to be achieved is presented. A comparison between the hybrid and the structured-grid methods is also carried out by considering a fully hexahedral multiblock mesh for which a few blocks have been transformed into unstructured blocks. The performance of elsA-H for the simulation of internal flows is demonstrated on a number of turbomachinery configurations.

  13. Zur Rekonstruktion einer Typologie jugendlichen Medienhandelns gemäß dem Leitbild der Triangulation

    Directory of Open Access Journals (Sweden)

    Klaus Peter Treumann

    2017-09-01

    Full Text Available The results presented here were obtained within the DFG-funded research project "A study of the media usage behaviour of 12- to 20-year-olds and the development of media competence in adolescence", which is jointly directed by Klaus Peter Treumann, Uwe Sander and Dorothee Meister. The research project investigates the media practices of adolescents with respect to both new and old media. On the one hand, we ask how media competence manifests itself in its various dimensions; on the other, we concentrate on developing an empirically grounded typology of adolescents' media practices. Methodologically, the study follows the guiding principle of triangulation and combines qualitative and quantitative approaches to the research field in the form of group discussions, guided individual interviews and a representative survey.

  14. A numerical formulation using unstructured grids for modeling two-phase flows in porous media considering heterogeneities and capillarity effects

    International Nuclear Information System (INIS)

    Hurtado, F.S.V.; Maliska, C.R.

    2005-01-01

    This paper briefly describes a two-dimensional numerical formulation using unstructured grids, developed for simulating two-phase immiscible displacements in porous media. The Element-based Finite Volume Method (EbFVM) is used for discretizing the model differential equations. (authors)

  16. Parallel FE Electron-Photon Transport Analysis on 2-D Unstructured Mesh

    International Nuclear Information System (INIS)

    Drumm, C.R.; Lorenz, J.

    1999-01-01

    A novel solution method has been developed to solve the coupled electron-photon transport problem on an unstructured triangular mesh. Instead of tackling the first-order form of the linear Boltzmann equation, this approach is based on the second-order form in conjunction with the conventional multi-group discrete-ordinates approximation. The highly forward-peaked electron scattering is modeled with a multigroup Legendre expansion derived from the Goudsmit-Saunderson theory. The finite element method is used to treat the spatial dependence. The solution method is unique in that the space-direction dependence is solved simultaneously, eliminating the need for the conventional inner iterations, a method that is well suited for massively parallel computers

  17. The Unstructured Data Sharing System for Natural resources and Environment Science Data of the Chinese Academy of Science

    Directory of Open Access Journals (Sweden)

    Dafang Zhuang

    2007-10-01

    Full Text Available The data sharing system for the resource and environment science databases of the Chinese Academy of Sciences (CAS) has an open three-tiered architecture, which integrates the geographical databases of about 9 CAS institutes through mechanisms of distributed unstructured data management, metadata integration, catalogue services, and security control. The data tier consists of several distributed data servers located in each CAS institute, supporting unstructured data formats such as vector files, remote-sensing images or other raster files, documents, multimedia files, tables, and other file formats. For spatial data files, a format transformation service is provided. The middle tier involves a centralized metadata server, which stores metadata records for the data on all data servers. The primary function of this tier is the catalogue service, supporting the creation, search, browsing, updating, and deletion of catalogues. The client tier involves an integrated client that provides end-users with interfaces to search, browse, and download data or to create a catalogue and upload data.

  18. Conversion of a Surface Model of a Structure of Interest into a Volume Model for Medical Image Retrieval

    Directory of Open Access Journals (Sweden)

    Sarmad ISTEPHAN

    2015-06-01

    Full Text Available Volumetric medical image datasets contain vital information for noninvasive diagnosis, treatment planning and prognosis. However, direct and unlimited query of such datasets is hindered by the unstructured nature of the imaging data. This study is a step towards the unlimited query of medical image datasets by focusing on specific Structures of Interest (SOI). A requirement for achieving this objective is having both the surface and volume models of the SOI. However, typically only the surface model is available. Therefore, this study focuses on creating a fast method to convert a surface model into a volume model. Three methods (1D, 2D and 3D) are proposed and evaluated using simulated and real data of the Deep Perisylvian Area (DPSA) within the human brain. The 1D method takes 80 msec for the DPSA model, about 4 times faster than the 2D method and 7.4-fold faster than the 3D method, with over 97% accuracy. The proposed 1D method is feasible for surface-to-volume conversion in computer-aided diagnosis, treatment planning and prognosis systems containing large amounts of unstructured medical images.
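
    The 1D conversion described above can be pictured as casting a ray along one axis for every (y, z) column of the voxel grid and filling the voxels between successive surface crossings (even-odd rule). The sketch below assumes the crossing depths for a column have already been computed from the surface mesh by ray-triangle intersection and shows only the fill step.

```python
import numpy as np

def fill_column(nx, dx, crossings):
    """Mark voxels inside the surface along one x-column.

    crossings : sorted x-positions where the ray enters/leaves the surface
                (assumed precomputed by ray-triangle intersection).
    """
    col = np.zeros(nx, dtype=bool)
    for x_in, x_out in zip(crossings[0::2], crossings[1::2]):
        i0, i1 = int(np.ceil(x_in / dx)), int(np.floor(x_out / dx))
        col[max(i0, 0):min(i1 + 1, nx)] = True
    return col

# Example column with two crossing pairs (object occupies two x-intervals).
col = fill_column(nx=100, dx=0.1, crossings=[0.85, 2.3, 5.0, 7.75])
```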

  19. Anisotropic three-dimensional inversion of CSEM data using finite-element techniques on unstructured grids

    Science.gov (United States)

    Wang, Feiyan; Morten, Jan Petter; Spitzer, Klaus

    2018-05-01

    In this paper, we present a recently developed anisotropic 3-D inversion framework for interpreting controlled-source electromagnetic (CSEM) data in the frequency domain. The framework integrates a high-order finite-element forward operator and a Gauss-Newton inversion algorithm. Conductivity constraints are applied using a parameter transformation. We discretize the continuous forward and inverse problems on unstructured grids for a flexible treatment of arbitrarily complex geometries. Moreover, an unstructured mesh is more desirable in comparison to a single rectilinear mesh for multisource problems because local grid refinement will not significantly influence the mesh density outside the region of interest. The non-uniform spatial discretization facilitates parametrization of the inversion domain at a suitable scale. For a rapid simulation of multisource EM data, we opt to use a parallel direct solver. We further accelerate the inversion process by decomposing the entire data set into subsets with respect to frequencies (and transmitters if memory requirement is affordable). The computational tasks associated with each data subset are distributed to different processes and run in parallel. We validate the scheme using a synthetic marine CSEM model with rough bathymetry, and finally, apply it to an industrial-size 3-D data set from the Troll field oil province in the North Sea acquired in 2008 to examine its robustness and practical applicability.

  20. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent

  1. Motion planning for autonomous vehicle based on radial basis function neural network in unstructured environment.

    Science.gov (United States)

    Chen, Jiajia; Zhao, Pan; Liang, Huawei; Mei, Tao

    2014-09-18

    The autonomous vehicle is an automated system equipped with features like environment perception, decision-making, motion planning, and control and execution technology. Navigating in an unstructured and complex environment is a huge challenge for autonomous vehicles, due to the irregular shape of the road, the requirement of real-time planning, and the nonholonomic constraints of the vehicle. This paper presents a motion planning method, based on the Radial Basis Function (RBF) neural network, to guide the autonomous vehicle in unstructured environments. The proposed algorithm extracts the drivable region from the perception grid map based on the global path, which is available in the road network. Sample points are randomly selected in the drivable region, and a gradient descent method is used to train the RBF network. The parameters of the motion-planning algorithm are verified through simulation and experiment. It is observed that the proposed approach produces a flexible, smooth, and safe path that can fit any road shape. The method is implemented on an autonomous vehicle and verified in many outdoor scenes; furthermore, a comparison of the proposed method with the existing well-known Rapidly-exploring Random Tree (RRT) method is presented. The experimental results show that the proposed method is highly effective in planning the vehicle path and offers better motion quality.
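
    A bare-bones version of the RBF mapping used by the planner can be written with Gaussian basis functions whose output weights are fitted by gradient descent. The training data, centre placement, basis width and learning rate below are assumptions made only for illustration, not values from the paper.

```python
import numpy as np

# Assumed training data: sampled longitudinal positions -> lateral path offsets.
X = np.linspace(0.0, 50.0, 200)
Y = 2.0 * np.sin(X / 8.0)

centers = np.linspace(0.0, 50.0, 15)     # RBF centres spread over the drivable region
sigma = 4.0                              # assumed basis width
W = np.zeros(centers.size)               # output weights

def phi(x):
    # Gaussian basis functions evaluated at all centres for each sample.
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * sigma ** 2))

lr = 0.05
for _ in range(2000):                    # plain gradient descent on squared error
    pred = phi(X) @ W
    grad = phi(X).T @ (pred - Y) / X.size
    W -= lr * grad

path = phi(np.linspace(0.0, 50.0, 500)) @ W    # smooth planned lateral profile
```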

  2. An Arbitrary Lagrangian-Eulerian Discretization of MHD on 3D Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Rieben, R N; White, D A; Wallin, B K; Solberg, J M

    2006-06-12

    We present an arbitrary Lagrangian-Eulerian (ALE) discretization of the equations of resistive magnetohydrodynamics (MHD) on unstructured hexahedral grids. The method is formulated using an operator-split approach with three distinct phases: electromagnetic diffusion, Lagrangian motion, and Eulerian advection. The resistive magnetic dynamo equation is discretized using a compatible mixed finite element method with a 2nd order accurate implicit time differencing scheme which preserves the divergence-free nature of the magnetic field. At each discrete time step, electromagnetic force and heat terms are calculated and coupled to the hydrodynamic equations to compute the Lagrangian motion of the conducting materials. By virtue of the compatible discretization method used, the invariants of Lagrangian MHD motion are preserved in a discrete sense. When the Lagrangian motion of the mesh causes significant distortion, that distortion is corrected with a relaxation of the mesh, followed by a 2nd order monotonic remap of the electromagnetic state variables. The remap is equivalent to Eulerian advection of the magnetic flux density with a fictitious mesh relaxation velocity. The magnetic advection is performed using a novel variant of constrained transport (CT) that is valid for unstructured hexahedral grids with arbitrary mesh velocities. The advection method maintains the divergence free nature of the magnetic field and is second order accurate in regions where the solution is sufficiently smooth. For regions in which the magnetic field is discontinuous (e.g. MHD shocks) the method is limited using a novel variant of algebraic flux correction (AFC) which is local extremum diminishing (LED) and divergence preserving. Finally, we verify each stage of the discretization via a set of numerical experiments.

  3. Tle Triangulation Campaign by Japanese High School Students as a Space Educational Project of the Ssh Consortium Kochi

    Science.gov (United States)

    Yamamoto, Masa-Yuki; Okamoto, Sumito; Miyoshi, Terunori; Takamura, Yuzaburo; Aoshima, Akira; Hinokuchi, Jin

    As one of the space educational projects in Japan, a triangulation observation project of TLE (Transient Luminous Events: sprites, elves, blue jets, etc.) has been carried out since 2006 in collaboration between 29 Super Science High-schools (SSH) and Kochi University of Technology (KUT). Following the previous success of sprite observations by "Astro High-school" since 2004, the SSH consortium Kochi was established as a national space educational project supported by the Japan Science and Technology Agency (JST). A high-sensitivity CCD camera (Watec, Neptune-100) with a 6 mm F/1.4 C-mount lens (Fujinon) and motion-detection software (UFOCapture, SonotaCo) were given to each participating team in order to monitor the northern night sky of Japan with almost full coverage. During each school year (from April to March in Japan) since 2006, thousands of TLE images were taken by many student teams, with a considerably large number of successful triangulations; (school year, number of TLE observations, number of triangulations) are (2006, 43, 3), (2007, 441, 95), (2008, 734, 115), and (2009, 337, 78). Note that the school year in Japan begins on April 1 and ends on March 31; the observation campaign began in December 2006, and the numbers are as of Feb. 28, 2010. Recently, some high schools started wide-field observations using multiple cameras, and others started VLF observations using handmade loop antennae and amplifiers. Information exchange within the SSH consortium Kochi is frequently communicated, with scientific discussion, via KUT's mailing lists. Also, interactions with amateur observers in Japan are made through the internet forum of "SonotaCo Network Japan" (http://sonotaco.jp). The project is a success not only as an educational project but also as a scientific one. In February 2008, simultaneous observations of Elves were obtained; in November 2009 a giant "Graft-shaped" sprite driven by jets was clearly imaged together with VLF signals. Most recently, observations of Elves
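
    For a rough sense of the triangulation geometry behind the campaign: if two stations a known baseline apart both measure the elevation angle of the same luminous event, and the event lies approximately in the vertical plane through the baseline, its altitude follows from the two angles. This is a simplified textbook relation, not the project's actual reduction procedure, and the numbers in the example are assumptions.

```python
import math

def altitude_between_stations(d_km, elev1_deg, elev2_deg):
    """Altitude of an event seen at elevation angles elev1 and elev2 from two
    stations a distance d apart, assuming the event lies between the stations
    in their common vertical plane (flat-Earth approximation)."""
    a1 = math.radians(elev1_deg)
    a2 = math.radians(elev2_deg)
    return d_km / (1.0 / math.tan(a1) + 1.0 / math.tan(a2))

# Assumed example: stations 150 km apart, elevation angles 30 and 45 degrees.
h = altitude_between_stations(150.0, 30.0, 45.0)   # about 55 km, a sprite-like altitude
```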

  5. Triangulating case-finding tools for patient safety surveillance: a cross-sectional case study of puncture/laceration.

    Science.gov (United States)

    Taylor, Jennifer A; Gerwin, Daniel; Morlock, Laura; Miller, Marlene R

    2011-12-01

    To evaluate the need for triangulating case-finding tools in patient safety surveillance. This study applied four case-finding tools to error-associated patient safety events to identify and characterise the spectrum of events captured by these tools, using puncture or laceration as an example for in-depth analysis. Retrospective hospital discharge data were collected for calendar year 2005 (n=48,418) from a large, urban medical centre in the USA. The study design was cross-sectional and used data linkage to identify the cases captured by each of four case-finding tools. Three case-finding tools (International Classification of Diseases external (E) and nature (N) of injury codes, Patient Safety Indicators (PSI)) were applied to the administrative discharge data to identify potential patient safety events. The fourth tool was Patient Safety Net, a web-based voluntary patient safety event reporting system. The degree of mutual exclusion among detection methods was substantial. For example, when linking puncture or laceration on unique identifiers, out of 447 potential events, 118 were identical between PSI and E-codes, 152 were identical between N-codes and E-codes and 188 were identical between PSI and N-codes. Only 100 events that were identified by PSI, E-codes and N-codes were identical. Triangulation of multiple tools through data linkage captures potential patient safety events most comprehensively. Existing detection tools target patient safety domains differently, and consequently capture different occurrences, necessitating the integration of data from a combination of tools to fully estimate the total burden.

  6. Energy harvesting through gas dynamics in the free molecular flow regime between structured surfaces at different temperatures

    DEFF Research Database (Denmark)

    Baier, Tobias; Dölger, Julia; Hardt, Steffen

    2014-01-01

    For a gas confined between surfaces held at different temperatures the velocity distribution shows a significant deviation from the Maxwell distribution when the mean free path of the molecules is comparable to or larger than the channel dimensions. If one of the surfaces is suitably structured...... from the thermal creep flow that has gained more attention so far. This situation is studied in the limit of free-molecular flow for the case that an unstructured surface is allowed to move tangentially with respect to a structured surface. Parameter studies are conducted, and configurations...

  7. EMPHASIS(TM)/Nevada Unstructured FEM Implementation Version 2.1.1.

    Energy Technology Data Exchange (ETDEWEB)

    Turner, C. David; Pointon, Timothy D.; Cartwright, Keith

    2014-08-01

    EMPHASIS(TM)/NEVADA is the SIERRA/NEVADA toolkit implementation of portions of the EMPHASIS(TM) code suite. The purpose of the toolkit implementation is to facilitate coupling to other physics drivers such as radiation transport as well as to better manage code design, implementation, complexity, and important verification and validation processes. This document describes the theory and implementation of the unstructured finite-element method solver, associated algorithms, and selected verification and validation. Acknowledgement: The author would like to recognize all of the ALEGRA team members for their gracious and willing support through this initial Nevada toolkit-implementation process. Although much of the knowledge needed was gleaned from documentation and code context, they were always willing to consult personally on some of the less obvious issues and enhancements necessary.

  8. MONTE CARLO ANALYSES OF THE YALINA THERMAL FACILITY WITH SERPENT STEREOLITHOGRAPHY GEOMETRY MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, A.; Gohar, Y.

    2015-01-01

    This paper analyzes the YALINA Thermal subcritical assembly of Belarus using two different Monte Carlo transport programs, SERPENT and MCNP. The MCNP model is based on combinatorial geometry and a universes hierarchy, while the SERPENT model is based on Stereolithography geometry. The latter consists of unstructured triangulated surfaces defined by their normals and vertices. This geometry format is used by 3D printers; here it was created using the CUBIT software, MATLAB scripts, and C code. All the Monte Carlo simulations have been performed using the ENDF/B-VII.0 nuclear data library. Both MCNP and SERPENT share the same geometry specifications, which describe the facility details without using any material homogenization. Three configurations with different numbers of fuel rods (216, 245, or 280) have been studied. The numerical simulations show that the agreement between SERPENT and MCNP results is within a few tens of pcm.
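
    The Stereolithography (STL) format mentioned above represents a body as a flat list of triangular facets, each stored as an outward normal plus three vertices. The sketch below is a generic ASCII STL writer illustrating that layout; it is not the CUBIT/MATLAB/C tool chain used for the YALINA model, and the file name and example triangle are placeholders.

```python
import numpy as np

def write_ascii_stl(path, vertices, triangles, name="surface"):
    """Write a triangulated surface as ASCII STL: one normal + three vertices per facet."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for i, j, k in triangles:
            a, b, c = vertices[i], vertices[j], vertices[k]
            n = np.cross(b - a, c - a)          # facet normal from the vertex loop
            n = n / (np.linalg.norm(n) or 1.0)  # guard against degenerate facets
            f.write(f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}\n")
            f.write("    outer loop\n")
            for v in (a, b, c):
                f.write(f"      vertex {v[0]:e} {v[1]:e} {v[2]:e}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# Example: a single triangle in the z = 0 plane (placeholder geometry)
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
write_ascii_stl("triangle.stl", verts, [(0, 1, 2)])
```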

  9. Unstructured grids and an element based conservative approach for a black-oil reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nogueira, Regis Lopes; Fernandes, Bruno Ramon Batista [Federal University of Ceara, Fortaleza, CE (Brazil). Dept. of Chemical Engineering; Araujo, Andre Luiz de Souza [Federal Institution of Education, Science and Technology of Ceara - IFCE, Fortaleza (Brazil). Industry Department], e-mail: andre@ifce.edu.br; Marcondes, Francisco [Federal University of Ceara, Fortaleza, CE (Brazil). Dept. of Metallurgical Engineering and Material Science], e-mail: marcondes@ufc.br

    2010-07-01

    Unstructured meshes represent an important advance in modeling key reservoir features such as discrete fractures, faults, and irregular boundaries. Among the available methodologies, the Element-based Finite Volume Method (EbFVM), in conjunction with unstructured meshes, deserves particular attention. In this approach, the reservoir, for 2D domains, is discretized with a mixed two-dimensional mesh of quadrilateral and triangular elements. After the initial discretization step, each element is divided into sub-elements and the mass balance for each component is developed for each sub-element. The equations for each control volume, using a cell-vertex construction, are formulated through the contributions of the different neighbouring elements. This paper presents an investigation of an element-based approach using the black-oil model formulated in terms of pressure and global mass fractions. In this formulation, even when the gas phase is fully dissolved in the oil phase, the global mass fraction of gas remains different from zero; therefore, no additional numerical procedure is necessary to treat the appearance or disappearance of the gas phase. The approach is applied to multiphase flows involving oil, gas, and water. The mass balance equations in terms of the global mass fractions of oil, gas, and water are discretized through the EbFVM and linearized by Newton's method. The results are presented in terms of volumetric rates of oil, gas, and water and phase saturations. (author)

  10. Delaunay Triangulation as a New Coverage Measurement Method in Wireless Sensor Network

    Science.gov (United States)

    Chizari, Hassan; Hosseini, Majid; Poston, Timothy; Razak, Shukor Abd; Abdullah, Abdul Hanan

    2011-01-01

    Sensing and communication coverage are among the most important trade-offs in Wireless Sensor Network (WSN) design. A minimum bound of sensing coverage is vital in scheduling, target tracking and redeployment phases, as well as providing communication coverage. Some methods measure the coverage as a percentage value, but detailed information has been missing. Two scenarios with equal coverage percentage may not have the same Quality of Coverage (QoC). In this paper, we propose a new coverage measurement method using Delaunay Triangulation (DT). This can provide the value for all coverage measurement tools. Moreover, it categorizes sensors as ‘fat’, ‘healthy’ or ‘thin’ to show the dense, optimal and scattered areas. It can also yield the largest empty area of sensors in the field. Simulation results show that the proposed DT method can achieve accurate coverage information, and provides many tools to compare QoC between different scenarios. PMID:22163792
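
    One plausible way to realise this idea is to triangulate the sensor positions with a Delaunay triangulation and compare each triangle's circumradius with the sensing radius; large circumcircles flag poorly covered regions. The thresholds, sensing radius, and random positions in the sketch below are illustrative assumptions, not the classification rule defined in the paper.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
sensors = rng.uniform(0.0, 100.0, size=(50, 2))   # random sensor positions (placeholder field)
r_sense = 12.0                                     # assumed sensing radius

tri = Delaunay(sensors)

def circumradius(p, q, r):
    """Circumradius of a triangle; a large value flags a locally empty region."""
    a = np.linalg.norm(q - r)
    b = np.linalg.norm(p - r)
    c = np.linalg.norm(p - q)
    area = 0.5 * abs((q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0]))
    return a * b * c / (4.0 * area) if area > 0 else np.inf

radii = np.array([circumradius(*sensors[s]) for s in tri.simplices])

# Illustrative thresholds classifying triangles as dense ('fat'), optimal ('healthy'),
# or scattered ('thin'); a per-sensor classification as in the paper would aggregate these.
labels = np.where(radii < 0.5 * r_sense, "fat",
         np.where(radii <= r_sense, "healthy", "thin"))
print(dict(zip(*np.unique(labels, return_counts=True))))
print("largest empty circle radius:", radii.max())
```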

  11. Triangulation and Gender Perspectives in ‘Falling Man’ by Don DeLillo

    Directory of Open Access Journals (Sweden)

    Noemi Abe

    2011-09-01

    Susannah Radstone argues that the rhetorical response to 9/11 by the Bush administration is based on the opposition of two father figures: “the 'chastened' but powerful 'good' patriarchal father” vs. “the 'bad' archaic father”. She explains: “In this Manichean fantasy can be glimpsed the continuing battle between competing versions of masculinity” (2002:459) that leaves women on the margins. The battle of the fathers of Bush’s rhetoric is counterposed in Falling Man by a battle between two men that stands for an unaccomplished fatherhood. Furthermore, the dualistic vision engendered by post-9/11 rhetoric and reflected in the novel should be evaluated in a trilateral dimension, given that at its core lies a triangulation built upon three stereotypical representations: the white middle-class man; the Arab terrorist; and a composite character in the middle, the woman, who shifts from ally, to victim, to a plausible supporter of the enemy.

  12. Visualization of 2-D and 3-D fields from its value in a finite number of points

    International Nuclear Information System (INIS)

    Dari, E.A.; Venere, M.J.

    1990-01-01

    This work describes a method for the visualization of two- and three-dimensional fields, given their values at a finite number of points. These data may originate from experimental measurements, numerical results, or any other source. For the field interpolation, the space is divided into simplices (triangles or tetrahedra), using the Watson algorithm to obtain the Delaunay triangulation. Inside each simplex, linear interpolation is assumed. The visualization is accomplished by means of Finite Element post-processors capable of handling unstructured meshes, which were also developed by the authors. (Author)
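
    The interpolation step described here, a Delaunay triangulation of the sample points followed by linear interpolation inside each simplex, is also what SciPy's LinearNDInterpolator provides. The sketch below illustrates the idea on synthetic 2-D data and does not reproduce the authors' Watson-algorithm implementation.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Scattered sample points of a 2-D field (synthetic stand-in for measured/computed values)
rng = np.random.default_rng(1)
pts = rng.uniform(-1.0, 1.0, size=(200, 2))
vals = np.sin(np.pi * pts[:, 0]) * np.cos(np.pi * pts[:, 1])

# Delaunay triangulation of the points + piecewise-linear interpolation inside each triangle
field = LinearNDInterpolator(pts, vals)

# Evaluate on a regular grid for visualisation (points outside the convex hull return NaN)
xg, yg = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
zg = field(xg, yg)
print("interpolated field shape:", zg.shape, " NaNs outside hull:", int(np.isnan(zg).sum()))
```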

  13. A Deep Penetration Problem Calculation Using AETIUS:An Easy Modeling Discrete Ordinates Transport Code UsIng Unstructured Tetrahedral Mesh, Shared Memory Parallel

    Science.gov (United States)

    KIM, Jong Woon; LEE, Young-Ouk

    2017-09-01

    As computing power gets better and better, computer codes that use a deterministic method seem to be less useful than those using the Monte Carlo method. In addition, users do not like to think about space, angle, and energy discretization for deterministic codes. However, a deterministic method is still powerful in that we can obtain a solution of the flux throughout the problem, particularly when particles can barely penetrate, such as in a deep penetration problem with small detection volumes. Recently, a new state-of-the-art discrete-ordinates code, ATTILA, was developed and has been widely used in several applications. ATTILA provides the capability to solve geometrically complex 3-D transport problems by using an unstructured tetrahedral mesh. Since 2009, we have been developing our own code by benchmarking ATTILA. AETIUS is a discrete ordinates code that uses an unstructured tetrahedral mesh, like ATTILA. For pre- and post-processing, Gmsh is used to generate an unstructured tetrahedral mesh by importing a CAD file (*.step) and to visualize the calculation results of AETIUS. Using a CAD tool, the geometry can be modeled very easily. In this paper, we give a brief overview of AETIUS and provide numerical results from both AETIUS and a Monte Carlo code, MCNP5, for a deep penetration problem with small detection volumes. The results demonstrate the effectiveness and efficiency of AETIUS for such calculations.
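
    For the pre-processing step described above, the Gmsh Python API can import a CAD file and build an unstructured tetrahedral mesh. The sketch below is a generic illustration of that workflow under assumed file names and mesh-size settings; it is not part of AETIUS.

```python
# Generic sketch: build an unstructured tetrahedral mesh from a CAD (*.step) file with Gmsh.
# "part.step", "part.msh" and the mesh-size value are placeholders.
import gmsh

gmsh.initialize()
gmsh.model.add("deep_penetration_example")
gmsh.model.occ.importShapes("part.step")   # import CAD geometry via the OpenCASCADE kernel
gmsh.model.occ.synchronize()

# Global element size (older Gmsh versions call this option Mesh.CharacteristicLengthMax)
gmsh.option.setNumber("Mesh.MeshSizeMax", 5.0)
gmsh.model.mesh.generate(3)                # 3 = generate the volume (tetrahedral) mesh
gmsh.write("part.msh")                     # mesh file handed to the transport solver
gmsh.finalize()
```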

  14. Optical microtopographic inspection of the surface of tooth subjected to stripping reduction

    Science.gov (United States)

    Costa, Manuel F.; Pereira, Pedro B.

    2011-05-01

    In orthodontics, reducing tooth size by stripping interproximal enamel surfaces is a common procedure that allows dental alignment with minimal changes in the facial profile and no arch expansion. To achieve smooth surfaces, clinicians have tested various methods and progressively improved this therapeutic technique. To evaluate the surface roughness of teeth subjected to interproximal reduction by the five most commonly used methods, teeth were inspected by scanning electron microscopy and measured microtopographically using the active-triangulation-based optical microtopographer MICROTOP.06.MFC. The metrological procedure is presented, together with comparative results identifying the most suitable tooth interproximal reduction method.

  15. Ifcwall Reconstruction from Unstructured Point Clouds

    Science.gov (United States)

    Bassier, M.; Klein, R.; Van Genechten, B.; Vergauwen, M.

    2018-05-01

    The automated reconstruction of Building Information Modeling (BIM) objects from point cloud data is still ongoing research. A key aspect is the creation of accurate wall geometry, as it forms the basis for further reconstruction of objects in a BIM. After segmenting and classifying the initial point cloud, the labelled segments are processed and the wall topology is reconstructed. However, the procedure is challenging due to noise, occlusions and the complexity of the input data. In this work, a method is presented to automatically reconstruct consistent wall geometry from point clouds. More specifically, the use of room information is proposed to aid the wall topology creation. First, a set of partial walls is constructed based on classified planar primitives. Next, the rooms are identified using the retrieved wall information along with the floors and ceilings. The wall topology is computed by the intersection of the partial walls conditioned on the room information. The final wall geometry is defined by creating IfcWallStandardCase objects conforming to the IFC4 standard. The result is a set of walls according to the as-built conditions of a building. The experiments show that the proposed method is a reliable framework for wall reconstruction from unstructured point cloud data. Also, the implementation of room information reduces the rate of false positives for the wall topology. Given the walls, ceilings and floors, 94% of the rooms are correctly identified. A key advantage of the proposed method is that it deals with complex rooms and is not bound to single storeys.

  16. Quantisation of super Teichmueller theory

    International Nuclear Information System (INIS)

    Aghaei, Nezhla; Hamburg Univ.; Pawelkiewicz, Michal; Teschner, Joerg

    2015-12-01

    We construct a quantisation of the Teichmueller spaces of super Riemann surfaces using coordinates associated to ideal triangulations of super Riemann surfaces. A new feature is the non-trivial dependence on the choice of a spin structure which can be encoded combinatorially in a certain refinement of the ideal triangulation. By constructing a projective unitary representation of the groupoid of changes of refined ideal triangulations we demonstrate that the dependence of the resulting quantum theory on the choice of a triangulation is inessential.

  17. Simulations of four-dimensional simplicial quantum gravity as dynamical triangulation

    International Nuclear Information System (INIS)

    Agishtein, M.E.; Migdal, A.A.

    1992-01-01

    In this paper, Four-Dimensional Simplicial Quantum Gravity is simulated using the dynamical triangulation approach. The authors studied simplicial manifolds of spherical topology and found the critical line for the cosmological constant as a function of the gravitational one, separating the phases of open and closed Universe. When the bare cosmological constant approaches this line from above, the four-volume grows: the authors reached about 5 x 10^4 simplexes, which proved to be sufficient for the statistical limit of infinite volume. However, for the genuine continuum theory of gravity, the parameters of the lattice model should be further adjusted to reach the second-order phase transition point, where the correlation length grows to infinity. The authors varied the gravitational constant and found a first-order phase transition, similar to the one found in the three-dimensional model, except that in 4D the fluctuations are rather large at the transition point, so that it is close to a second-order phase transition. The average curvature in cutoff units is large and positive in one phase (gravity), and small and negative in the other (antigravity). The authors studied the fractal geometry of both phases, using the heavy particle propagator to define the geodesic map, as well as with the old approach using the shortest lattice paths.

  18. Thermal Protection System Cavity Heating for Simplified and Actual Geometries Using Computational Fluid Dynamics Simulations with Unstructured Grids

    Science.gov (United States)

    McCloud, Peter L.

    2010-01-01

    Thermal Protection System (TPS) Cavity Heating is predicted using Computational Fluid Dynamics (CFD) on unstructured grids for both simplified cavities and actual cavity geometries. Validation was performed using comparisons to wind tunnel experimental results and CFD predictions using structured grids. Full-scale predictions were made for simplified and actual geometry configurations on the Space Shuttle Orbiter in a mission support timeframe.

  19. Potentiation of E-4031-induced torsade de pointes by HMR1556 or ATX-II is not predicted by action potential short-term variability or triangulation.

    Science.gov (United States)

    Michael, G; Dempster, J; Kane, K A; Coker, S J

    2007-12-01

    Torsade de pointes (TdP) can be induced by a reduction in cardiac repolarizing capacity. The aim of this study was to assess whether IKs blockade or enhancement of INa could potentiate TdP induced by IKr blockade and to investigate whether short-term variability (STV) or triangulation of action potentials preceded TdP. Experiments were performed in open-chest, pentobarbital-anaesthetized, alpha 1-adrenoceptor-stimulated, male New Zealand White rabbits, which received three consecutive i.v. infusions of either the IKr blocker E-4031 (1, 3 and 10 nmol kg(-1) min(-1)), the IKs blocker HMR1556 (25, 75 and 250 nmol kg(-1) min(-1)) or E-4031 and HMR1556 combined. In a second study rabbits received either the same doses of E-4031, the INa enhancer, ATX-II (0.4, 1.2 and 4.0 nmol kg(-1)) or both of these drugs. ECGs and epicardial monophasic action potentials were recorded. HMR1556 alone did not cause TdP but increased E-4031-induced TdP from 25 to 80%. ATX-II alone caused TdP in 38% of rabbits, as did E-4031; 75% of rabbits receiving both drugs had TdP. QT intervals were prolonged by all drugs but the extent of QT prolongation was not related to the occurrence of TdP. No changes in STV were detected and triangulation was only increased after TdP occurred. Giving modulators of ion channels in combination substantially increased TdP but, in this model, neither STV nor triangulation of action potentials could predict TdP.

  20. Discretization of the Joule heating term for plasma discharge fluid models in unstructured meshes

    International Nuclear Information System (INIS)

    Deconinck, T.; Mahadevan, S.; Raja, L.L.

    2009-01-01

    The fluid (continuum) approach is commonly used for simulation of plasma phenomena in electrical discharges at moderate to high pressures (>10's mTorr). The description comprises governing equations for charged and neutral species transport and energy equations for electrons and the heavy species, coupled to equations for the electromagnetic fields. The coupling of energy from the electrostatic field to the plasma species is modeled by the Joule heating term which appears in the electron and heavy species (ion) energy equations. Proper numerical discretization of this term is necessary for accurate description of discharge energetics; however, discretization of this term poses a special problem in the case of unstructured meshes owing to the arbitrary orientation of the faces enclosing each cell. We propose a method for the numerical discretization of the Joule heating term using a cell-centered finite volume approach on unstructured meshes with closed convex cells. The Joule heating term is computed by evaluating both the electric field and the species flux at the cell center. The dot product of these two vector quantities is computed to obtain the Joule heating source term. We compare two methods to evaluate the species flux at the cell center. One is based on reconstructing the fluxes at the cell centers from the fluxes at the face centers. The other recomputes the flux at the cell center using the common drift-diffusion approximation. The reconstructed flux scheme is the most stable method and yields reasonably accurate results on coarse meshes.
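
    As a rough illustration of the cell-centred evaluation described above, the sketch below reconstructs a cell-centre flux vector from the face-normal flux components by least squares (one plausible reconstruction; the paper compares a face-flux reconstruction against a drift-diffusion re-evaluation at the centre) and forms the Joule heating source as the dot product with the cell-centre electric field. All numerical values are placeholders.

```python
import numpy as np

def reconstruct_cell_flux(face_normals, face_normal_fluxes):
    """Least-squares reconstruction of a cell-centre flux vector from the
    face-normal components Gamma . n known on each face of the cell."""
    N = np.asarray(face_normals, dtype=float)        # (n_faces, 3) unit normals
    g = np.asarray(face_normal_fluxes, dtype=float)  # (n_faces,) Gamma . n per face
    gamma_c, *_ = np.linalg.lstsq(N, g, rcond=None)
    return gamma_c

def joule_heating(q, gamma_c, E_c):
    """Joule heating source S = q * (Gamma_c . E_c), both evaluated at the cell centre."""
    return q * float(np.dot(gamma_c, E_c))

# Placeholder data for a single hexahedral cell with axis-aligned faces
normals = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1], [0, 0, -1]], float)
normal_fluxes = np.array([2.0e22, -2.0e22, 5.0e21, -5.0e21, 0.0, 0.0])  # Gamma . n per face
E_centre = np.array([1.0e4, 2.0e3, 0.0])                                # V/m, placeholder

gamma_centre = reconstruct_cell_flux(normals, normal_fluxes)
print(joule_heating(1.602e-19, gamma_centre, E_centre), "W/m^3")
```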

  1. Counting Cats: Spatially Explicit Population Estimates of Cheetah (Acinonyx jubatus) Using Unstructured Sampling Data.

    Directory of Open Access Journals (Sweden)

    Femke Broekhuis

    Full Text Available Many ecological theories and species conservation programmes rely on accurate estimates of population density. Accurate density estimation, especially for species facing rapid declines, requires the application of rigorous field and analytical methods. However, obtaining accurate density estimates of carnivores can be challenging as carnivores naturally exist at relatively low densities and are often elusive and wide-ranging. In this study, we employ an unstructured spatial sampling field design along with a Bayesian sex-specific spatially explicit capture-recapture (SECR) analysis, to provide the first rigorous population density estimates of cheetahs (Acinonyx jubatus) in the Maasai Mara, Kenya. We estimate adult cheetah density to be between 1.28 ± 0.315 and 1.34 ± 0.337 individuals/100km2 across four candidate models specified in our analysis. Our spatially explicit approach revealed 'hotspots' of cheetah density, highlighting that cheetah are distributed heterogeneously across the landscape. The SECR models incorporated a movement range parameter which indicated that male cheetah moved four times as much as females, possibly because female movement was restricted by their reproductive status and/or the spatial distribution of prey. We show that SECR can be used for spatially unstructured data to successfully characterise the spatial distribution of a low density species and also estimate population density when sample size is small. Our sampling and modelling framework will help determine spatial and temporal variation in cheetah densities, providing a foundation for their conservation and management. Based on our results we encourage other researchers to adopt a similar approach in estimating densities of individually recognisable species.

  2. Counting Cats: Spatially Explicit Population Estimates of Cheetah (Acinonyx jubatus) Using Unstructured Sampling Data.

    Science.gov (United States)

    Broekhuis, Femke; Gopalaswamy, Arjun M

    2016-01-01

    Many ecological theories and species conservation programmes rely on accurate estimates of population density. Accurate density estimation, especially for species facing rapid declines, requires the application of rigorous field and analytical methods. However, obtaining accurate density estimates of carnivores can be challenging as carnivores naturally exist at relatively low densities and are often elusive and wide-ranging. In this study, we employ an unstructured spatial sampling field design along with a Bayesian sex-specific spatially explicit capture-recapture (SECR) analysis, to provide the first rigorous population density estimates of cheetahs (Acinonyx jubatus) in the Maasai Mara, Kenya. We estimate adult cheetah density to be between 1.28 ± 0.315 and 1.34 ± 0.337 individuals/100km2 across four candidate models specified in our analysis. Our spatially explicit approach revealed 'hotspots' of cheetah density, highlighting that cheetah are distributed heterogeneously across the landscape. The SECR models incorporated a movement range parameter which indicated that male cheetah moved four times as much as females, possibly because female movement was restricted by their reproductive status and/or the spatial distribution of prey. We show that SECR can be used for spatially unstructured data to successfully characterise the spatial distribution of a low density species and also estimate population density when sample size is small. Our sampling and modelling framework will help determine spatial and temporal variation in cheetah densities, providing a foundation for their conservation and management. Based on our results we encourage other researchers to adopt a similar approach in estimating densities of individually recognisable species.

  3. A bias correction for covariance estimators to improve inference with generalized estimating equations that use an unstructured correlation matrix.

    Science.gov (United States)

    Westgate, Philip M

    2013-07-20

    Generalized estimating equations (GEEs) are routinely used for the marginal analysis of correlated data. The efficiency of GEE depends on how closely the working covariance structure resembles the true structure, and therefore accurate modeling of the working correlation of the data is important. A popular approach is the use of an unstructured working correlation matrix, as it is not as restrictive as simpler structures such as exchangeable and AR-1 and thus can theoretically improve efficiency. However, because of the potential for having to estimate a large number of correlation parameters, variances of regression parameter estimates can be larger than theoretically expected when utilizing the unstructured working correlation matrix. Therefore, standard error estimates can be negatively biased. To account for this additional finite-sample variability, we derive a bias correction that can be applied to typical estimators of the covariance matrix of parameter estimates. Via simulation and in application to a longitudinal study, we show that our proposed correction improves standard error estimation and statistical inference. Copyright © 2012 John Wiley & Sons, Ltd.

  4. Cosmos++: relativistic magnetohydrodynamics on unstructured grids with local adaptive refinement

    International Nuclear Information System (INIS)

    Salmonson, Jay D; Anninos, Peter; Fragile, P Chris; Camarda, Karen

    2007-01-01

    A code and methodology are introduced for solving the fully general relativistic magnetohydrodynamic (GRMHD) equations using time-explicit, finite-volume discretization. The code has options for solving the GRMHD equations using traditional artificial-viscosity (AV) or non-oscillatory central difference (NOCD) methods, or a new extended AV (eAV) scheme using artificial-viscosity together with a dual energy-flux-conserving formulation. The dual energy approach allows for accurate modeling of highly relativistic flows at boost factors well beyond what has been achieved to date by standard artificial viscosity methods. It provides the benefit of Godunov methods in capturing high Lorentz boosted flows but without complicated Riemann solvers, and the advantages of traditional artificial viscosity methods in their speed and flexibility. Additionally, the GRMHD equations are solved on an unstructured grid that supports local adaptive mesh refinement using a fully threaded oct-tree (in three dimensions) network to traverse the grid hierarchy across levels and immediate neighbors. Some recent studies will be summarized

  5. Automatic vertebral identification using surface-based registration

    Science.gov (United States)

    Herring, Jeannette L.; Dawant, Benoit M.

    2000-06-01

    This work introduces an enhancement to currently existing methods of intra-operative vertebral registration by allowing the portion of the spinal column surface that correctly matches a set of physical vertebral points to be automatically selected from several possible choices. Automatic selection is made possible by the shape variations that exist among lumbar vertebrae. In our experiments, we register vertebral points representing physical space to spinal column surfaces extracted from computed tomography images. The vertebral points are taken from the posterior elements of a single vertebra to represent the region of surgical interest. The surface is extracted using an improved version of the fully automatic marching cubes algorithm, which results in a triangulated surface that contains multiple vertebrae. We find the correct portion of the surface by registering the set of physical points to multiple surface areas, including all vertebral surfaces that potentially match the physical point set. We then compute the standard deviation of the surface error for the set of points registered to each vertebral surface that is a possible match, and the registration that corresponds to the lowest standard deviation designates the correct match. We have performed our current experiments on two plastic spine phantoms and one patient.
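
    The selection rule described above reduces to an argmin over candidate registrations once the point-to-surface residuals are available. The sketch below illustrates only that final step; the residual values are placeholders and the registration itself is omitted.

```python
import numpy as np

def select_vertebra(residuals_per_candidate):
    """Pick the candidate surface whose registration gives the lowest standard
    deviation of point-to-surface distances (illustrative selection step only)."""
    stds = {name: float(np.std(r)) for name, r in residuals_per_candidate.items()}
    best = min(stds, key=stds.get)
    return best, stds

# Placeholder point-to-surface residuals (mm) after registering to three candidate vertebrae
residuals = {
    "L3": np.array([1.8, 2.4, 2.9, 1.1, 3.0]),
    "L4": np.array([0.4, 0.5, 0.6, 0.3, 0.5]),   # correct level: small, uniform errors
    "L5": np.array([1.5, 0.2, 2.7, 0.9, 2.1]),
}
level, spread = select_vertebra(residuals)
print("matched level:", level, spread)
```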

  6. Unstructured 3D core calculations with the descartes system application to the JHR research reactor

    International Nuclear Information System (INIS)

    Baudron, A. M.; Doderlein, C.; Guerin, P.; Lautard, J. J.; Moreau, F.

    2007-01-01

    Recent developments in the DESCARTES system enable neutronics calculations dealing with very complex unstructured geometrical configurations. The discretization can be made either by using a very fine Cartesian mesh and the fast simplified transport (SPN) solver MINOS, or a discretization based on triangles and the SP1 solver MINARET. In order to perform parallel calculations dealing with a very fine mesh in 3D, a domain decomposition with non overlapping domains has been implemented. To illustrate these capabilities, we present an application on the future European research reactor JHR dedicated to technological irradiations. (authors)

  7. Triangulation of Qualitative Methods for the Exploration of Activity Systems in Ergonomics

    Directory of Open Access Journals (Sweden)

    Monika Hackel

    2008-08-01

    Full Text Available Research concerning ergonomic issues in interdisciplinary projects often raises several very specific questions depending on project objectives. To answer these questions the application of research methods should be thoroughly considered, regarding both the expenditure and the options within the scope of the given resources. The project AQUIMO develops an adaptable modelling tool for mechatronical engineering and creates a related qualification program. The task of social scientific research within this project is to identify requirements viewed from the perspective of the subsequent users. This formative evaluation is based on the approach of "developmental work research" as set forth by ENGESTRÖM and, thus, is a form of "action research". This paper discusses the triangulation of several qualitative methods addressing the examination of difficulties in interdisciplinary collaboration in mechatronical engineering. After a description of both background and analytic approach within the project AQUIMO, the methods are briefly described concerning their advantages and critical points. Their application within the research project AQUIMO is explained from an activity theoretical perspective. URN: urn:nbn:de:0114-fqs0803158

  8. Parallel SOR methods with a parabolic-diffusion acceleration technique for solving an unstructured-grid Poisson equation on 3D arbitrary geometries

    Science.gov (United States)

    Zapata, M. A. Uh; Van Bang, D. Pham; Nguyen, K. D.

    2016-05-01

    This paper presents a parallel algorithm for the finite-volume discretisation of the Poisson equation on three-dimensional arbitrary geometries. The proposed method is formulated by using a 2D horizontal block domain decomposition and interprocessor data communication techniques with the message passing interface. The horizontal unstructured-grid cells are reordered according to the neighbouring relations and decomposed into blocks using a load-balanced distribution to give all processors an equal number of elements. In this algorithm, two parallel successive over-relaxation methods are presented: a multi-colour ordering technique for unstructured grids based on distributed memory, and a block method using a reordering index following ideas similar to the partitioning for structured grids. In all cases, the parallel algorithms are combined with an accelerated iterative solver based on a parabolic-diffusion equation, introduced to obtain faster solutions of the linear systems arising from the discretisation. Numerical results are given to evaluate the performance of the methods, showing speedups better than linear.
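
    Below is a minimal sketch of the multi-colour SOR idea, assuming a sparse matrix whose graph is greedily coloured so that unknowns of the same colour are uncoupled and can be relaxed together. It only illustrates the ordering concept on a small serial test problem; it is not the paper's MPI implementation and does not include the parabolic-diffusion acceleration.

```python
import numpy as np
import scipy.sparse as sp

def greedy_colouring(A):
    """Greedy colouring of the matrix graph: rows sharing a nonzero off-diagonal
    entry receive different colours (simple sequential variant, for illustration)."""
    A = A.tocsr()
    colours = -np.ones(A.shape[0], dtype=int)
    for i in range(A.shape[0]):
        neighbours = A.indices[A.indptr[i]:A.indptr[i + 1]]
        used = {colours[j] for j in neighbours if j != i and colours[j] >= 0}
        c = 0
        while c in used:
            c += 1
        colours[i] = c
    return colours

def coloured_sor(A, b, omega=1.5, sweeps=2000):
    """SOR sweeps ordered by colour: unknowns of one colour are uncoupled and can
    be updated together, which is what makes the ordering parallelisable."""
    A = A.tocsr()
    x = np.zeros_like(b, dtype=float)
    colours = greedy_colouring(A)
    groups = [np.where(colours == c)[0] for c in range(colours.max() + 1)]
    diag = A.diagonal()
    for _ in range(sweeps):
        for g in groups:
            residual = b[g] - A[g, :] @ x
            x[g] += omega * residual / diag[g]
    return x

# Small test problem: 1D Poisson matrix as a stand-in for one horizontal grid block
n = 50
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = coloured_sor(A, b, omega=1.7)
print("residual norm:", np.linalg.norm(b - A @ x))
```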

  9. De-identification of unstructured paper-based health records for privacy-preserving secondary use.

    Science.gov (United States)

    Fenz, Stefan; Heurix, Johannes; Neubauer, Thomas; Rella, Antonio

    2014-07-01

    Whenever personal data is processed, privacy is a serious issue. Especially in the document-centric e-health area, the patients' privacy must be preserved in order to prevent any negative repercussions for the patient. Clinical research, for example, demands structured health records to carry out efficient clinical trials, whereas legislation (e.g. HIPAA) regulates that only de-identified health records may be used for research. However, unstructured and often paper-based data dominates information technology, especially in the healthcare sector. Existing approaches are geared towards data in English-language documents only and have not been designed to handle the recognition of erroneous personal data which is the result of the OCR-based digitization of paper-based health records.

  10. Three-dimensional Gravity Inversion with a New Gradient Scheme on Unstructured Grids

    Science.gov (United States)

    Sun, S.; Yin, C.; Gao, X.; Liu, Y.; Zhang, B.

    2017-12-01

    Stabilized gradient-based methods have been proved to be efficient for inverse problems. Based on these methods, setting the gradient close to zero can effectively minimize the objective function; thus the gradient of the objective function determines the inversion results. By analyzing the cause of the poor depth resolution of gradient-based gravity inversion methods, we find that imposing a depth-weighting functional on the conventional gradient can improve the depth resolution to some extent. However, the improvement is affected by the regularization parameter, and the effect of the regularization term becomes smaller with increasing depth (shown as Figure 1 (a)). In this paper, we propose a new gradient scheme for gravity inversion by introducing a weighted model vector. The new gradient improves the depth resolution more efficiently, is independent of the regularization parameter, and the effect of the regularization term is not weakened as depth increases. Besides, the fuzzy c-means clustering method and a smoothing operator are both used as regularization terms to yield an internal consecutive inverse model with sharp boundaries (Sun and Li, 2015). We have tested our new gradient scheme with unstructured grids on synthetic data to illustrate the effectiveness of the algorithm. Gravity forward modeling with unstructured grids is based on the algorithm proposed by Okabe (1979). We use a linear conjugate gradient inversion scheme to solve the inversion problem. The numerical experiments show a great improvement in depth resolution compared with the regular gradient scheme, and the inverse model is compact at all depths (shown as Figure 1 (b)). Acknowledgement: This research is supported by the Key Program of the National Natural Science Foundation of China (41530320), the China Natural Science Foundation for Young Scientists (41404093), and the Key National Research Project of China (2016YFC0303100, 2017YFC0601900). References: Sun J, Li Y. 2015. Multidomain petrophysically constrained inversion and

  11. Parallel CFD Algorithms for Aerodynamical Flow Solvers on Unstructured Meshes. Parts 1 and 2

    Science.gov (United States)

    Barth, Timothy J.; Kwak, Dochan (Technical Monitor)

    1995-01-01

    The Advisory Group for Aerospace Research and Development (AGARD) has requested my participation in the lecture series entitled Parallel Computing in Computational Fluid Dynamics to be held at the von Karman Institute in Brussels, Belgium on May 15-19, 1995. In addition, a request has been made from the US Coordinator for AGARD at the Pentagon for NASA Ames to hold a repetition of the lecture series on October 16-20, 1995. I have been asked to be a local coordinator for the Ames event. All AGARD lecture series events have attendance limited to NATO allied countries. A brief of the lecture series is provided in the attached enclosure. Specifically, I have been asked to give two lectures of approximately 75 minutes each on the subject of parallel solution techniques for the fluid flow equations on unstructured meshes. The title of my lectures is "Parallel CFD Algorithms for Aerodynamical Flow Solvers on Unstructured Meshes" (Parts I-II). The contents of these lectures will be largely review in nature and will draw upon previously published work in this area. Topics of my lectures will include: (1) Mesh partitioning algorithms. Recursive techniques based on coordinate bisection, Cuthill-McKee level structures, and spectral bisection. (2) Newton's method for large scale CFD problems. Size and complexity estimates for Newton's method, modifications for ensuring global convergence. (3) Techniques for constructing the Jacobian matrix. Analytic and numerical techniques for Jacobian matrix-vector products, constructing the transposed matrix, extensions to optimization and homotopy theories. (4) Iterative solution algorithms. Practical experience with GMRES and BiCGSTAB matrix solvers. (5) Parallel matrix preconditioning. Incomplete Lower-Upper (ILU) factorization, domain-decomposed ILU, approximate Schur complement strategies.
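
    Items (3) and (4) above are often combined in a matrix-free Newton-Krylov loop: the Jacobian-vector product is approximated by a one-sided finite difference of the residual and handed to GMRES. The sketch below shows that idea on a toy nonlinear system; the residual function and parameters are placeholders, not the lecture-series code.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def residual(u):
    """Toy nonlinear system F(u) = 0 standing in for a discretised flow residual."""
    F = np.empty_like(u)
    F[0] = u[0]                    # left boundary value fixed at 0
    F[-1] = u[-1] - 1.0            # right boundary value fixed at 1
    F[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2] - 0.1 * np.exp(u[1:-1])
    return F

def newton_step(u):
    """One matrix-free Newton step: solve J(u) du = -F(u) with GMRES, where the
    Jacobian-vector product J v is approximated by a finite difference of F."""
    F0 = residual(u)
    eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(u))

    def jac_vec(v):
        nv = np.linalg.norm(v)
        if nv == 0.0:
            return np.zeros_like(v)
        h = eps / nv
        return (residual(u + h * v) - F0) / h

    J = LinearOperator((u.size, u.size), matvec=jac_vec)
    du, info = gmres(J, -F0)
    return u + du, np.linalg.norm(F0)

u = np.linspace(0.0, 1.0, 41)          # initial guess satisfying the boundary values
for _ in range(6):
    u, res_norm = newton_step(u)
    print(f"|F| = {res_norm:.3e}")
```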

  12. Advantages and disadvantages of unstructured cardiovascular risk factor screening for follow-up in primary care.

    Science.gov (United States)

    de Boer, Anna W; de Mutsert, Renée; den Heijer, Martin; Rosendaal, Frits R; Jukema, Johan W; Blom, Jeanet W; Numans, Mattijs E

    2016-07-01

    In contrast to structured, integrated risk assessment in primary care, unstructured risk factor screening outside primary care and corresponding recommendations to consult a general practitioner (GP) are often based on one abnormal value of a single risk factor. This study investigates the advantages and disadvantages of unstructured screening of blood pressure and cholesterol outside primary care. After the baseline visit of the Netherlands Epidemiology of Obesity study (population-based prospective cohort study in persons aged 45-65 years, recruited 2008-2012) all participants received a letter with results of blood pressure and cholesterol, and a recommendation to consult a GP if results were abnormal. Four years after the start of the study, participants received a questionnaire about the follow-up of their results. The study population consisted of 6343 participants, 48% men, mean age 56 years, mean body mass index 30 kg/m(2). Of all participants 66% had an abnormal result and, of these, 49% had a treatment indication based on the risk estimation system SCORE-NL 2006. Of the 25% of the participants who did not consult a GP, 40% had a treatment indication. Of the participants with an abnormal result 19% were worried, of whom 60% had no treatment indication. In this population 51% of the participants with an abnormal result had unnecessarily received a recommendation to consult a GP, and 10% were unnecessarily worried. GPs should be informed about the complete risk assessment, and only participants at intermediate or high risk should receive a recommendation to consult a GP. © The European Society of Cardiology 2015.

  13. Thermal Entanglement and Critical Behavior of Magnetic Properties on a Triangulated Kagomé Lattice

    Directory of Open Access Journals (Sweden)

    N. Ananikian

    2011-01-01

    Full Text Available The equilibrium magnetic and entanglement properties in a spin-1/2 Ising-Heisenberg model on a triangulated Kagomé lattice are analyzed by means of the effective field for the Gibbs-Bogoliubov inequality. The calculation is reduced to decoupled individual (cluster) trimers due to the separable character of the Ising-type exchange interactions between the Heisenberg trimers. The concurrence in terms of the three-qubit isotropic Heisenberg model in the effective Ising field in the absence of a magnetic field is non-zero. The magnetic and entanglement properties exhibit common (plateau, peak) features driven by a magnetic field and (anti)ferromagnetic exchange interaction. The (quantum) entangled and non-entangled phases can be exploited as a useful tool for signalling the quantum phase transitions and crossovers at finite temperatures. The critical temperature of order-disorder coincides with the threshold temperature of thermal entanglement.

  14. Time-domain analysis of planar microstrip devices using a generalized Yee-algorithm based on unstructured grids

    Science.gov (United States)

    Gedney, Stephen D.; Lansing, Faiza

    1993-01-01

    The generalized Yee-algorithm is presented for the temporal full-wave analysis of planar microstrip devices. This algorithm has the significant advantage over the traditional Yee-algorithm that it is based on unstructured and irregular grids. The strength of the generalized Yee-algorithm is that structures containing curved conductors or complex three-dimensional geometries can be modeled more accurately, and much more conveniently, using standard automatic grid generation techniques. The generalized Yee-algorithm is based on the time-marching solution of the discrete form of Maxwell's equations in their integral form. To this end, the electric and magnetic fields are discretized over a dual, irregular, and unstructured grid. The primary grid is assumed to be composed of general fitted polyhedra distributed throughout the volume. The secondary grid (or dual grid) is built up of the closed polyhedra whose edges connect the centroids of adjacent primary cells, penetrating shared faces. Faraday's law and Ampere's law are used to update the fields normal to the primary and secondary grid faces, respectively. Subsequently, a correction scheme is introduced to project the normal fields onto the grid edges. It is shown that this scheme is stable, maintains second-order accuracy, and preserves the divergenceless nature of the flux densities. Finally, for computational efficiency the algorithm is structured as a series of sparse matrix-vector multiplications. Based on this scheme, the generalized Yee-algorithm has been implemented on vector and parallel high-performance computers in a highly efficient manner.

  15. Finite volume method for radiative heat transfer in an unstructured flow solver for emitting, absorbing and scattering media

    International Nuclear Information System (INIS)

    Gazdallah, Moncef; Feldheim, Véronique; Claramunt, Kilian; Hirsch, Charles

    2012-01-01

    This paper presents the implementation of the finite volume method to solve the radiative transfer equation in a commercial code. The particularity of this work is that the method, applied on unstructured hexahedral meshes, does not need a pre-processing step establishing a particular marching order to visit all the control volumes. The solver simply visits the faces of the control volumes as numbered in the hexahedral unstructured mesh. A cell-centred mesh and a step spatial differencing scheme relating facial radiative intensities to nodal intensities are used. The developed computer code based on the FVM has been integrated in the CFD solver FINE/Open from NUMECA Int. Radiative heat transfer can be evaluated within systems containing a uniform, grey, emitting, absorbing and/or isotropically or linearly anisotropically scattering medium bounded by diffuse grey walls. This code has been validated on three test cases. The first one is a three-dimensional rectangular enclosure filled with an emitting, absorbing and anisotropically scattering medium. The second is the differentially heated cubic cavity. The third one is the L-shaped enclosure. For these three test cases a good agreement is observed when temperature and heat flux predictions are compared with reference results taken from the literature.

  16. Improved Degree Search Algorithms in Unstructured P2P Networks

    Directory of Open Access Journals (Sweden)

    Guole Liu

    2012-01-01

    Full Text Available Searching for and retrieving the required information is an important problem in networks; in particular, designing an efficient search algorithm is a key challenge in unstructured peer-to-peer (P2P) networks. Breadth-first search (BFS) and depth-first search (DFS) are the two typical search methods currently in use. BFS-based algorithms perform very well in terms of the search success rate for network resources, but generate a huge number of search messages. DFS-based algorithms, on the contrary, reduce the number of search messages but also lower the search success rate. To address the problem that only one of these performance measures is good at a time, we propose two memory-function degree search algorithms: the memory function maximum degree algorithm (MD) and the memory function preference degree algorithm (PD). We study their performance, including the search success rate and the number of search messages, in different networks: scale-free networks, random graph networks, and small-world networks. Simulations show that both performance measures are good at the same time and are improved by at least a factor of 10.
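
    As a rough reading of the MD strategy (an assumption on my part, not the authors' exact algorithm), a query could be forwarded at each hop to the highest-degree neighbour that has not been visited yet, with the visited set playing the role of the memory function:

```python
import networkx as nx

def memory_max_degree_search(G, source, target, max_hops=50):
    """Forward a query to the highest-degree unvisited neighbour at each hop,
    falling back to any neighbour once all have been visited (illustrative only)."""
    visited = {source}              # the 'memory': peers already queried
    node, hops = source, 0
    while node != target and hops < max_hops:
        neighbours = list(G.neighbors(node))
        unvisited = [n for n in neighbours if n not in visited] or neighbours
        node = max(unvisited, key=G.degree)   # prefer well-connected peers
        visited.add(node)
        hops += 1
    return hops if node == target else None

# Scale-free test topology as a stand-in for an unstructured P2P overlay
G = nx.barabasi_albert_graph(500, 3, seed=1)
print("hops to reach target:", memory_max_degree_search(G, source=0, target=321))
```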

  17. Surface meshing with curvature convergence

    KAUST Repository

    Li, Huibin; Zeng, Wei; Morvan, Jean-Marie; Chen, Liming; Gu, Xianfengdavid

    2014-01-01

    Surface meshing plays a fundamental role in graphics and visualization. Many geometric processing tasks involve solving geometric PDEs on meshes. The numerical stability, convergence rates and approximation errors are largely determined by the mesh qualities. In practice, Delaunay refinement algorithms offer satisfactory solutions to high quality mesh generations. The theoretical proofs for volume based and surface based Delaunay refinement algorithms have been established, but those for conformal parameterization based ones remain wide open. This work focuses on the curvature measure convergence for the conformal parameterization based Delaunay refinement algorithms. Given a metric surface, the proposed approach triangulates its conformal uniformization domain by the planar Delaunay refinement algorithms, and produces a high quality mesh. We give explicit estimates for the Hausdorff distance, the normal deviation, and the differences in curvature measures between the surface and the mesh. In contrast to the conventional results based on volumetric Delaunay refinement, our stronger estimates are independent of the mesh structure and directly guarantee the convergence of curvature measures. Meanwhile, our result on Gaussian curvature measure is intrinsic to the Riemannian metric and independent of the embedding. In practice, our meshing algorithm is much easier to implement and much more efficient. The experimental results verified our theoretical results and demonstrated the efficiency of the meshing algorithm. © 2014 IEEE.

  18. Surface meshing with curvature convergence

    KAUST Repository

    Li, Huibin

    2014-06-01

    Surface meshing plays a fundamental role in graphics and visualization. Many geometric processing tasks involve solving geometric PDEs on meshes. The numerical stability, convergence rates and approximation errors are largely determined by the mesh qualities. In practice, Delaunay refinement algorithms offer satisfactory solutions to high quality mesh generations. The theoretical proofs for volume based and surface based Delaunay refinement algorithms have been established, but those for conformal parameterization based ones remain wide open. This work focuses on the curvature measure convergence for the conformal parameterization based Delaunay refinement algorithms. Given a metric surface, the proposed approach triangulates its conformal uniformization domain by the planar Delaunay refinement algorithms, and produces a high quality mesh. We give explicit estimates for the Hausdorff distance, the normal deviation, and the differences in curvature measures between the surface and the mesh. In contrast to the conventional results based on volumetric Delaunay refinement, our stronger estimates are independent of the mesh structure and directly guarantee the convergence of curvature measures. Meanwhile, our result on Gaussian curvature measure is intrinsic to the Riemannian metric and independent of the embedding. In practice, our meshing algorithm is much easier to implement and much more efficient. The experimental results verified our theoretical results and demonstrated the efficiency of the meshing algorithm. © 2014 IEEE.

  19. Parallel discontinuous Galerkin FEM for computing hyperbolic conservation law on unstructured grids

    Science.gov (United States)

    Ma, Xinrong; Duan, Zhijian

    2018-04-01

    High-order discontinuous Galerkin finite element methods (DGFEM) are known to be well suited for solving the Euler and Navier-Stokes equations on unstructured grids, but they demand considerable computational resources. An efficient parallel algorithm is presented for solving the compressible Euler equations. Moreover, a multigrid strategy based on a three-stage, third-order TVD Runge-Kutta scheme is used to improve the computational efficiency of the DGFEM and accelerate the convergence of the solution of the unsteady compressible Euler equations. To keep the processors load-balanced, a domain decomposition method is employed. Numerical experiments were performed for inviscid transonic flow around the NACA0012 airfoil and the M6 wing. The results indicate that the parallel algorithm achieves significant speedup and efficiency and is suitable for computing complex flows.

  20. Structural studies of human Naked2: A biologically active intrinsically unstructured protein

    International Nuclear Information System (INIS)

    Hu Tianhui; Krezel, Andrzej M.; Li Cunxi; Coffey, Robert J.

    2006-01-01

    Naked1 and 2 are two mammalian orthologs of Naked Cuticle, a canonical Wnt signaling antagonist in Drosophila. Naked2, but not Naked1, interacts with transforming growth factor-α (TGFα) and escorts TGFα-containing vesicles to the basolateral membrane of polarized epithelial cells. Full-length Naked2 is poorly soluble. Since most functional domains, including the Dishevelled binding region, EF-hand, vesicle recognition, and membrane targeting motifs, reside in the N-terminal half of the protein, we expressed and purified the first 217 residues of human Naked2 and performed a functional analysis of this fragment. Its circular dichroism (CD) and nuclear magnetic resonance (NMR) spectra showed no evidence of secondary and/or tertiary structure. The fragment did not bind calcium or zinc. These results indicate that the N-terminal half of Naked2 behaves as an intrinsically unstructured protein

  1. Laser ray tracing and power deposition on an unstructured three-dimensional grid

    International Nuclear Information System (INIS)

    Kaiser, Thomas B.

    2000-01-01

    A scheme is presented for laser beam evolution and power deposition on three-dimensional unstructured grids composed of hexahedra, prisms, pyramids, and tetrahedra. The geometrical-optics approximation to the electromagnetic wave equation is used to follow propagation of a collection of discrete rays used to represent the beam(s). Ray trajectory equations are integrated using a method that is second order in time, exact for a constant electron-density gradient, and capable of dealing with density discontinuities that arise in certain hydrodynamics formulations. Power deposition by inverse-bremsstrahlung is modeled with a scheme based on Gaussian quadrature to accommodate a deposition rate whose spatial variation is highly nonuniform. Comparisons with analytic results are given for a density ramp in three dimensions, and a "quadratic-well" density trough in two dimensions. (c) 2000 The American Physical Society
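
    In the geometrical-optics limit the ray obeys d²x/dt² = -(c²/2)∇(n_e/n_c), so for a constant density gradient the acceleration is constant and a second-order velocity-Verlet step reproduces the trajectory exactly, consistent with the integrator property quoted above. Below is a single-ray sketch for an assumed linear density ramp, with placeholder parameters and without the inverse-bremsstrahlung deposition step.

```python
import numpy as np

C = 2.99792458e8          # speed of light (m/s)
L = 1.0e-3                # ramp scale length (m), assumed value

def grad_ne_over_nc(x):
    """Linear density ramp n_e/n_c = z/L for z > 0, vacuum below (placeholder profile)."""
    return np.array([0.0, 0.0, 1.0 / L]) if x[2] > 0.0 else np.zeros(3)

def trace_ray(x0, direction, n_steps=2000, dt=5.0e-15):
    """Velocity-Verlet integration of d2x/dt2 = -(c^2/2) grad(n_e/n_c);
    exact for a constant gradient, second-order accurate otherwise."""
    x = np.array(x0, dtype=float)
    v = C * np.asarray(direction, dtype=float) / np.linalg.norm(direction)
    a = -0.5 * C**2 * grad_ne_over_nc(x)
    path = [x.copy()]
    for _ in range(n_steps):
        x += v * dt + 0.5 * a * dt**2
        a_new = -0.5 * C**2 * grad_ne_over_nc(x)
        v += 0.5 * (a + a_new) * dt
        a = a_new
        path.append(x.copy())
    return np.array(path)

# Ray launched at 45 degrees into the ramp; it turns around near n_e/n_c = cos^2(45) = 0.5
path = trace_ray(x0=[0.0, 0.0, 0.0], direction=[1.0, 0.0, 1.0])
print("turning-point depth z (m):", path[:, 2].max())   # expected about L/2 = 5e-4 m
```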

  2. Machine Vision for Object Detection and Profiling in an Unstructured Environment

    Energy Technology Data Exchange (ETDEWEB)

    Walton, Miles Conley; Kinoshita, Robert Arthur

    2002-08-01

    The Handling and Sorting System for 55-Gallon Drums (HANDSS-55) is a DOE project to develop an automated method for retrieving items that are not acceptable at the Waste Isolation Pilot Plant (WIPP) from 55-gallon drums of low-level waste. The HANDSS-55 is a modular system that opens drums, sorts the waste, and then repackages the remaining waste in WIPP compliant barrels. The Sorting Station module relies on a non-contact measurement system to quickly provide a 3D profile of the sorting area. It then analyses the 3D profile and a color image to determine the position and orientation of an operator selected waste item. The item is then removed from the sorting area by a robotic arm. The use of both image and profile information for object determination provides a fast, effective method of finding and retrieving selected objects in the unstructured environment of the sorting module.

  3. Machine Vision for Object Detection and Profiling in an Unstructured Environment

    Energy Technology Data Exchange (ETDEWEB)

    Kinoshita, R.A.; Walton, M.C.

    2002-05-23

    The Handling and Sorting System for 55-Gallon Drums (HANDSS-55) is a DOE project to develop an automated method for retrieving items that are not acceptable at the Waste Isolation Pilot Plant (WIPP) from 55-gallon drums of low-level waste. The HANDSS-55 is a modular system that opens drums, sorts the waste, and then repackages the remaining waste in WIPP compliant barrels. The Sorting Station module relies on a non-contact measurement system to quickly provide a 3D profile of the sorting area. It then analyses the 3D profile and a color image to determine the position and orientation of an operator selected waste item. The item is then removed from the sorting area by a robotic arm. The use of both image and profile information for object determination provides a fast, effective method of finding and retrieving selected objects in the unstructured environment of the sorting module.

  4. Machine Vision for Object Detection and Profiling in an Unstructured Environment

    International Nuclear Information System (INIS)

    Kinoshita, R.A.; Walton, M.C.

    2002-01-01

    The Handling and Sorting System for 55-Gallon Drums (HANDSS-55) is a DOE project to develop an automated method for retrieving items that are not acceptable at the Waste Isolation Pilot Plant (WIPP) from 55-gallon drums of low-level waste. The HANDSS-55 is a modular system that opens drums, sorts the waste, and then repackages the remaining waste in WIPP compliant barrels. The Sorting Station module relies on a non-contact measurement system to quickly provide a 3D profile of the sorting area. It then analyses the 3D profile and a color image to determine the position and orientation of an operator selected waste item. The item is then removed from the sorting area by a robotic arm. The use of both image and profile information for object determination provides a fast, effective method of finding and retrieving selected objects in the unstructured environment of the sorting module

  5. Health, utilisation of health services, 'core' information, and reasons for non-participation: a triangulation study amongst non-respondents.

    Science.gov (United States)

    Näslindh-Ylispangar, Anita; Sihvonen, Marja; Kekki, Pertti

    2008-11-01

    To explore health, use of health services, 'core' information and reasons for non-participation amongst males. Gender may provide an explanation for non-participation in the healthcare system. A growing body of research suggests that males are less likely than females to seek help from health professionals for their problems. The current research had its beginnings with the low response rate in a prior voluntary survey and health examination for Finnish males born in 1961. Data triangulation among 28 non-respondent middle-aged males in Helsinki was used. The methods involved structured and in-depth interviews and health measurements to explore the views of these males concerning their health-related behaviours and use of health services. Non-respondent males seldom used healthcare services. Despite clinical risk factors (e.g. obesity and blood pressure) and various symptoms, males perceived their health status as good. Work was widely experienced as excessively demanding, causing insomnia and other stress symptoms. Males expressed sensitive messages when a session was ending and when the participant was close to the door and leaving the room. This 'core' information included major causes of concern, anxiety, fears and loneliness. This triangulation study showed that by using an in-depth interview as one research strategy, more sensitive 'feminist' expressions of health and ill-health were elicited from the men. The results emphasise a male's self-perception of his masculinity that may have relevance to the health experience of the male population. Nurses and physicians need to pay special attention to the requirements of gender-specific healthcare to be most effective in the delivery of healthcare to males.

  6. Multiphase flow modelling of volcanic ash particle settling in water using adaptive unstructured meshes

    Science.gov (United States)

    Jacobs, C. T.; Collins, G. S.; Piggott, M. D.; Kramer, S. C.; Wilson, C. R. G.

    2013-02-01

    Small-scale experiments of volcanic ash particle settling in water have demonstrated that ash particles can either settle slowly and individually, or rapidly and collectively as a gravitationally unstable ash-laden plume. This has important implications for the emplacement of tephra deposits on the seabed. Numerical modelling has the potential to extend the results of laboratory experiments to larger scales and explore the conditions under which plumes may form and persist, but many existing models are computationally restricted by the fixed mesh approaches that they employ. In contrast, this paper presents a new multiphase flow model that uses an adaptive unstructured mesh approach. As a simulation progresses, the mesh is optimized to focus numerical resolution in areas important to the dynamics and decrease it where it is not needed, thereby potentially reducing computational requirements. Model verification is performed using the method of manufactured solutions, which shows the correct solution convergence rates. Model validation and application considers 2-D simulations of plume formation in a water tank which replicate published laboratory experiments. The numerically predicted settling velocities for both individual particles and plumes, as well as instability behaviour, agree well with experimental data and observations. Plume settling is clearly hindered by the presence of a salinity gradient, and its influence must therefore be taken into account when considering particles in bodies of saline water. Furthermore, individual particles settle in the laminar flow regime while plume settling is shown (by plume Reynolds numbers greater than unity) to be in the turbulent flow regime, which has a significant impact on entrainment and settling rates. Mesh adaptivity maintains solution accuracy while providing a substantial reduction in computational requirements when compared to the same simulation performed using a fixed mesh, highlighting the benefits of an

  7. Exploring Forms of Triangulation to Facilitate Collaborative Research Practice: Reflections From a Multidisciplinary Research Group

    Directory of Open Access Journals (Sweden)

    Tarja Tiainen

    2006-10-01

    Full Text Available This article contains critical reflections of a multidisciplinary research group studying the human and technological dynamics around some newly offered electronic services in a specific rural area of Finland. For their research, the group adopted ethnography. On facing the challenges of doing ethnographic research in a multidisciplinary setting, the group evolved its own breed of research practice based on multiple forms of triangulation. This implied the use of multiple data sources, methods, theories, and researchers, in different combinations. One of the outcomes of the work is a model for collaborative research. It highlights, among others, the importance of creating a climate for collaboration within the research group and following a process of individual and collaborative writing to achieve the potential benefits of such research. The article also identifies a set of remaining challenges relevant to collaborative research.

  8. Multiomics Data Triangulation for Asthma Candidate Biomarkers and Precision Medicine.

    Science.gov (United States)

    Pecak, Matija; Korošec, Peter; Kunej, Tanja

    2018-06-01

    Asthma is a common complex disorder and has been subject to intensive omics research for disease susceptibility and therapeutic innovation. Candidate biomarkers of asthma and its precision treatment demand that they stand the test of multiomics data triangulation before they can be prioritized for clinical applications. We classified the biomarkers of asthma after a search of the literature, based on whether or not a given biomarker candidate is reported in multiple omics platforms and methodologies. Using PubMed and Web of Science, we identified omics studies of asthma conducted on diverse platforms using keywords such as asthma, genomics, metabolomics, and epigenomics. We extracted data about asthma candidate biomarkers from 73 articles and developed a catalog of 190 potential asthma biomarkers (167 human, 23 animal data), comprising DNA loci, transcripts, proteins, metabolites, epimutations, and noncoding RNAs. The data were sorted according to 13 omics types: genomics, epigenomics, transcriptomics, proteomics, interactomics, metabolomics, ncRNAomics, glycomics, lipidomics, environmental omics, pharmacogenomics, phenomics, and integrative omics. Importantly, we found that 10 candidate biomarkers were apparent in two or more omics levels, thus promising potential for further biomarker research and development and precision medicine applications. This multiomics catalog reported herein for the first time contributes to future decision-making on prioritization of biomarkers and validation efforts for precision medicine in asthma. The findings may also facilitate meta-analyses and integrative omics studies in the future.

  9. Adaptive unstructured simulations of diaphragm rupture and perforation opening to start hypersonic air inlets

    International Nuclear Information System (INIS)

    Timofeev, E.V.; Tahir, R.B.; Voinovich, P.A.; Moelder, S.

    2004-01-01

    The concept of 'twin' grid nodes is discussed in the context of unstructured, adaptive meshes that are suitable for highly unsteady flows. The concept is applicable to internal boundary contours (within the computational domain) where the boundary conditions may need to be changed dynamically; for instance, an impermeable solid wall segment can be redefined as a fully permeable invisible boundary segment during the course of the simulation. This can be used to simulate unsteady gas flows with internal boundaries where the flow conditions may change rapidly and drastically. As a demonstration, the idea is applied to study the starting process in hypersonic air inlets by rupturing a diaphragm or by opening wall-perforations. (author)

  10. Data governance requirements for distributed clinical research networks: triangulating perspectives of diverse stakeholders.

    Science.gov (United States)

    Kim, Katherine K; Browe, Dennis K; Logan, Holly C; Holm, Roberta; Hack, Lori; Ohno-Machado, Lucila

    2014-01-01

    There is currently limited information on best practices for the development of governance requirements for distributed research networks (DRNs), an emerging model that promotes clinical data reuse and improves timeliness of comparative effectiveness research. Much of the existing information is based on a single type of stakeholder such as researchers or administrators. This paper reports on a triangulated approach to developing DRN data governance requirements based on a combination of policy analysis with experts, interviews with institutional leaders, and patient focus groups. This approach is illustrated with an example from the Scalable National Network for Effectiveness Research, which resulted in 91 requirements. These requirements were analyzed against the Fair Information Practice Principles (FIPPs) and Health Insurance Portability and Accountability Act (HIPAA) protected versus non-protected health information. The requirements addressed all FIPPs, showing how a DRN's technical infrastructure is able to fulfill HIPAA regulations, protect privacy, and provide a trustworthy platform for research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  11. Development and acceleration of unstructured mesh-based cfd solver

    Science.gov (United States)

    Emelyanov, V.; Karpenko, A.; Volkov, K.

    2017-06-01

    The study was undertaken as part of a larger effort to establish a common computational fluid dynamics (CFD) code for simulation of internal and external flows and involves some basic validation studies. The governing equations are solved with a finite volume code on unstructured meshes. The computational procedure involves reconstruction of the solution in each control volume and extrapolation of the unknowns to find the flow variables on the faces of the control volume, solution of the Riemann problem for each face of the control volume, and evolution of the time step. The nonlinear CFD solver works in an explicit time-marching fashion, based on a three-step Runge-Kutta stepping procedure. Convergence to a steady state is accelerated by the use of a geometric technique and by the application of Jacobi preconditioning for high-speed flows, with a separate low Mach number preconditioning method for use with low-speed flows. The CFD code is implemented on graphics processing units (GPUs). Speedup of solution on GPUs with respect to solution on central processing units (CPU) is compared with the use of different meshes and different methods of distribution of input data into blocks. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
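    The record describes an explicit three-step Runge-Kutta time-marching procedure for the semi-discrete finite-volume system. The sketch below shows the generic form of such a multistage update for du/dt = R(u); the stage coefficients are a common illustrative choice and are not necessarily those used in the solver described here.

    ```python
    import numpy as np

    def rk3_step(u, residual, dt, alphas=(1.0 / 3.0, 0.5, 1.0)):
        """One explicit three-stage Runge-Kutta step for du/dt = R(u).

        `residual` returns R(u) for the semi-discrete finite-volume system.
        The stage coefficients are a common illustrative choice; the cited
        solver's coefficients may differ.
        """
        u0 = u.copy()
        for a in alphas:
            u = u0 + a * dt * residual(u)
        return u

    # Usage sketch: march a toy linear system du/dt = A u toward steady state.
    A = -np.eye(4)                      # stand-in for the discretized residual
    u = np.ones(4)
    for _ in range(50):
        u = rk3_step(u, lambda v: A @ v, dt=0.1)
    print(u)                            # decays toward zero, the steady state
    ```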

  12. A matrix-free implicit treatment for all speed flows on unstructured grids

    International Nuclear Information System (INIS)

    Kloczko, Th.

    2006-03-01

    The aim of this research work is the development of an efficient implicit scheme for computing compressible and low-speed flows on unstructured meshes. The first part is devoted to the review and analysis of some standard block-implicit treatments for the two-dimensional Euler and Navier-Stokes equations with a view to identify the best candidate for a fair comparison with the matrix-free treatment. The second part forms the main original contribution of this research work. It describes and analyses a matrix-free treatment that can be applied to any type of flow (inviscid/viscous, low Mach/highly compressible, steady/unsteady). The third part deals with the implementation of this treatment within the CAST3M code, and the demonstration of its advantages over existing techniques for computing applications of interest for the CEA: low-Mach number steady and unsteady flows in a Tee junction for example

  13. Riding Bare-Back on unstructured meshes for 21st century criticality calculations - 244

    International Nuclear Information System (INIS)

    Kelley, K.C.; Martz, R.L.; Crane, D.L.

    2010-01-01

    MCNP has a new capability that permits tracking of neutrons and photons on an unstructured mesh which is embedded as a mesh universe within its legacy geometry capability. The mesh geometry is created through Abaqus/CAE using its solid modeling capabilities. Transport results are calculated for mesh elements through a path length estimator while element to element tracking is performed on the mesh. The results from MCNP can be exported to Abaqus/CAE for visualization or other-physics analysis. The simple Godiva criticality benchmark problem was tested with this new mesh capability. Computer run time is proportional to the number of mesh elements used. Both first and second order polyhedrons are used. Models that used second order polyhedrons produced slightly better results without significantly increasing computer run time. Models that used first order hexahedrons had shorter runtimes than models that used first order tetrahedrons. (authors)

  14. Portable Parallel Programming for the Dynamic Load Balancing of Unstructured Grid Applications

    Science.gov (United States)

    Biswas, Rupak; Das, Sajal K.; Harvey, Daniel; Oliker, Leonid

    1999-01-01

    The ability to dynamically adapt an unstructured grid (or mesh) is a powerful tool for solving computational problems with evolving physical features; however, an efficient parallel implementation is rather difficult, particularly from the viewpoint of portability on various multiprocessor platforms. We address this problem by developing PLUM, an automatic and architecture-independent framework for adaptive numerical computations in a message-passing environment. Portability is demonstrated by comparing performance on an SP2, an Origin2000, and a T3E, without any code modifications. We also present a general-purpose load balancer that utilizes symmetric broadcast networks (SBN) as the underlying communication pattern, with the goal of providing a global view of system loads across processors. Experiments on an SP2 and an Origin2000 demonstrate the portability of our approach, which achieves superb load balance at the cost of minimal extra overhead.

  15. A New Concept to Transport a Droplet on Horizontal Hydrophilic/Hydrophobic Surfaces

    International Nuclear Information System (INIS)

    Myong, Hyon Kook

    2014-01-01

    A fluid transport technique is a key issue for the development of microfluidic systems. In this paper, a new concept for transporting a droplet without external power sources is proposed and verified numerically. The proposed device is a heterogeneous surface which has both hydrophilic and hydrophobic horizontal surfaces. The numerical simulation to demonstrate the new concept is conducted by an in-house solution code (PowerCFD) which employs an unstructured cell-centered method based on a conservative pressure-based finite-volume method with interface capturing method (CICSAM) in a volume of fluid (VOF) scheme for phase interface capturing. It is found that the proposed concept for droplet transport shows superior performance for droplet transport in microfluidic systems

  16. Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.

    Science.gov (United States)

    Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai

    2005-10-01

    A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.
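    The record represents a 3-D surface topography by a set of vertex coordinates connected through a Delaunay triangulation. As a minimal illustration of that representation (not of the genetic algorithm or the SE signal model), the sketch below triangulates a hypothetical set of scattered sample sites carrying height values.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    # Hypothetical scattered sample sites (x, y) with candidate heights z, as a
    # stand-in for the vertex coordinates encoded on a string in this record.
    rng = np.random.default_rng(0)
    xy = rng.random((200, 2))
    z = np.sin(4 * xy[:, 0]) * np.cos(4 * xy[:, 1])   # toy "surface topography"

    # Delaunay triangulation of the planar sites; each vertex carries its height,
    # giving a piecewise-linear 3-D surface model over which a fitness function
    # (e.g. agreement between simulated and measured SE signals) could be scored.
    tri = Delaunay(xy)
    print("triangles:", len(tri.simplices))
    print("first triangle vertex indices:", tri.simplices[0])
    ```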

  17. Linear time algorithm for finding the convex ropes between two vertices of a simple polygon without triangulation

    International Nuclear Information System (INIS)

    Phan Thanh An

    2008-06-01

    The convex rope problem, posed by Peshkin and Sanderson in IEEE J. Robotics Automat, 2 (1986) pp. 53-58, is to find the counterclockwise and clockwise convex ropes starting at the vertex a and ending at the vertex b of a simple polygon, where a is on the boundary of the convex hull of the polygon and b is visible from infinity. In this paper, we present a linear time algorithm for solving this problem without resorting to a linear-time triangulation algorithm and without resorting to a convex hull algorithm for the polygon. The counterclockwise (clockwise, respectively) convex rope consists of two polylines obtained in a basic incremental strategy described in convex hull algorithms for the polylines forming the polygon from a to b. (author)
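    The algorithm builds the convex ropes through a basic incremental strategy of convex hull construction. The sketch below shows only that generic building block, a Graham-scan-style left-turn test applied to an ordered polyline; it is not the authors' linear-time convex rope algorithm itself.

    ```python
    def cross(o, a, b):
        """Z-component of (a - o) x (b - o); > 0 means a counterclockwise turn."""
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def convex_chain(polyline):
        """Incrementally build a convex (counterclockwise) chain over an ordered
        polyline, popping vertices that would create a clockwise turn.

        This is only the generic incremental building block used by convex hull
        algorithms; the cited linear-time method applies such passes to the two
        polylines of the polygon between the vertices a and b.
        """
        chain = []
        for p in polyline:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain

    print(convex_chain([(0, 0), (1, 1), (2, 0.5), (3, 2), (4, 1), (5, 3)]))
    ```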

  18. Detailed Aerodynamic Analysis of a Shrouded Tail Rotor Using an Unstructured Mesh Flow Solver

    Science.gov (United States)

    Lee, Hee Dong; Kwon, Oh Joon

    The detailed aerodynamics of a shrouded tail rotor in hover has been numerically studied using a parallel inviscid flow solver on unstructured meshes. The numerical method is based on a cell-centered finite-volume discretization and an implicit Gauss-Seidel time integration. The calculation was made for a single blade by imposing a periodic boundary condition between adjacent rotor blades. The grid periodicity was also imposed at the periodic boundary planes to avoid numerical inaccuracy resulting from solution interpolation. The results were compared with available experimental data and those from a disk vortex theory for validation. It was found that realistic three-dimensional modeling is important for the prediction of detailed aerodynamics of shrouded rotors including the tip clearance gap flow.

  19. A 3D unstructured grid nearshore hydrodynamic model based on the vortex force formalism

    Science.gov (United States)

    Zheng, Peng; Li, Ming; van der A, Dominic A.; van der Zanden, Joep; Wolf, Judith; Chen, Xueen; Wang, Caixia

    2017-08-01

    A new three-dimensional nearshore hydrodynamic model system is developed based on the unstructured-grid version of the third generation spectral wave model SWAN (Un-SWAN) coupled with the three-dimensional ocean circulation model FVCOM to enable the full representation of the wave-current interaction in the nearshore region. A new wave-current coupling scheme is developed by adopting the vortex-force (VF) scheme to represent the wave-current interaction. The GLS turbulence model is also modified to better reproduce wave-breaking enhanced turbulence, together with a roller transport model to account for the effect of surface wave roller. This new model system is validated first against a theoretical case of obliquely incident waves on a planar beach, and then applied to three test cases: a laboratory scale experiment of normal waves on a beach with a fixed breaker bar, a field experiment of oblique incident waves on a natural, sandy barred beach (Duck'94 experiment), and a laboratory study of normal-incident waves propagating around a shore-parallel breakwater. Overall, the model predictions agree well with the available measurements in these tests, illustrating the robustness and efficiency of the present model for very different spatial scales and hydrodynamic conditions. Sensitivity tests indicate the importance of roller effects and wave energy dissipation on the mean flow (undertow) profile over the depth. These tests further suggest to adopt a spatially varying value for roller effects across the beach. In addition, the parameter values in the GLS turbulence model should be spatially inhomogeneous, which leads to better prediction of the turbulent kinetic energy and an improved prediction of the undertow velocity profile.

  20. Influence of surface position along the working range of conoscopic holography sensors on dimensional verification of AISI 316 wire EDM machined surfaces.

    Science.gov (United States)

    Fernández, Pedro; Blanco, David; Rico, Carlos; Valiño, Gonzalo; Mateos, Sabino

    2014-03-06

    Conoscopic holography (CH) is a non-contact interferometric technique used for surface digitization which presents several advantages over other optical techniques such as laser triangulation. Among others, the ability for the reconstruction of high-sloped surfaces stands out, and so does its lower dependence on surface optical properties. Nevertheless, similarly to other optical systems, adjustment of CH sensors requires an adequate selection of configuration parameters for ensuring a high quality surface digitizing. This should be done on a surface located as close as possible to the stand-off distance by tuning frequency (F) and power (P) until the quality indicators Signal-to-Noise Ratio (SNR) and signal envelope (Total) meet proper values. However, not all the points of an actual surface are located at the stand-off distance, but they could be located throughout the whole working range (WR). Thus, the quality of a digitized surface may not be uniform. The present work analyses how the quality of a reconstructed surface is affected by its relative position within the WR under different combinations of the parameters F and P. Experiments have been conducted on AISI 316 wire EDM machined flat surfaces. The number of high-quality points digitized as well as distance measurements between different surfaces throughout the WR allowed for comparing the metrological behaviour of the CH sensor with respect to a touch probe (TP) on a CMM.

  1. PDF modeling of turbulent flows on unstructured grids

    Science.gov (United States)

    Bakosi, Jozsef

    In probability density function (PDF) methods of turbulent flows, the joint PDF of several flow variables is computed by numerically integrating a system of stochastic differential equations for Lagrangian particles. Because the technique solves a transport equation for the PDF of the velocity and scalars, a mathematically exact treatment of advection, viscous effects and arbitrarily complex chemical reactions is possible; these processes are treated without closure assumptions. A set of algorithms is proposed to provide an efficient solution of the PDF transport equation modeling the joint PDF of turbulent velocity, frequency and concentration of a passive scalar in geometrically complex configurations. An unstructured Eulerian grid is employed to extract Eulerian statistics, to solve for quantities represented at fixed locations of the domain and to track particles. All three aspects regarding the grid make use of the finite element method. Compared to hybrid methods, the current methodology is stand-alone, therefore it is consistent both numerically and at the level of turbulence closure without the use of consistency conditions. Since both the turbulent velocity and scalar concentration fields are represented in a stochastic way, the method allows for a direct and close interaction between these fields, which is beneficial in computing accurate scalar statistics. Boundary conditions implemented along solid bodies are of the free-slip and no-slip type without the need for ghost elements. Boundary layers at no-slip boundaries are either fully resolved down to the viscous sublayer, explicitly modeling the high anisotropy and inhomogeneity of the low-Reynolds-number wall region without damping or wall-functions or specified via logarithmic wall-functions. As in moment closures and large eddy simulation, these wall-treatments provide the usual trade-off between resolution and computational cost as required by the given application. Particular attention is focused on
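    The thesis integrates stochastic differential equations for Lagrangian particles. As a hedged illustration of the numerical idea, the sketch below applies an Euler-Maruyama update to a strongly simplified Langevin (Ornstein-Uhlenbeck-type) velocity model with placeholder coefficients; the full joint velocity-frequency-scalar model of the work is considerably richer.

    ```python
    import numpy as np

    def langevin_step(x, u, dt, T_L=0.5, sigma_u=1.0, rng=None):
        """One Euler-Maruyama step of a simplified Langevin model for a particle.

        du = -(u / T_L) dt + sqrt(2 sigma_u^2 / T_L) dW,   dx = u dt
        T_L is a Lagrangian time scale and sigma_u a velocity r.m.s.; both are
        placeholders for the modelled coefficients of the full PDF method.
        """
        if rng is None:
            rng = np.random.default_rng()
        dW = rng.normal(0.0, np.sqrt(dt), size=u.shape)
        u_new = u - (u / T_L) * dt + np.sqrt(2.0 * sigma_u**2 / T_L) * dW
        x_new = x + u * dt
        return x_new, u_new

    # Usage sketch: evolve an ensemble of particles and check velocity statistics.
    rng = np.random.default_rng(1)
    x = np.zeros(10_000)
    u = rng.normal(0.0, 1.0, size=10_000)
    for _ in range(200):
        x, u = langevin_step(x, u, dt=0.01, rng=rng)
    print("velocity variance ~", u.var())   # stays near sigma_u^2 = 1
    ```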

  2. Gradient Calculation Methods on Arbitrary Polyhedral Unstructured Meshes for Cell-Centered CFD Solvers

    Science.gov (United States)

    Sozer, Emre; Brehm, Christoph; Kiris, Cetin C.

    2014-01-01

    A survey of gradient reconstruction methods for cell-centered data on unstructured meshes is conducted within the scope of accuracy assessment. Formal order of accuracy, as well as error magnitudes for each of the studied methods, are evaluated on a complex mesh of various cell types through consecutive local scaling of an analytical test function. The tests highlighted several gradient operator choices that can consistently achieve 1st order accuracy regardless of cell type and shape. The tests further offered error comparisons for given cell types, leading to the observation that the "ideal" gradient operator choice is not universal. Practical implications of the results are explored via CFD solutions of a 2D inviscid standing vortex, portraying the discretization error properties. A relatively naive, yet largely unexplored, approach of local curvilinear stencil transformation exhibited surprisingly favorable properties
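    One family of gradient operators surveyed in this record is the weighted linear least-squares reconstruction from cell-centered data. The sketch below is a minimal, dimension-agnostic version with an assumed inverse-distance weighting; it illustrates the reconstruction principle only and does not reproduce the specific operators assessed in the paper.

    ```python
    import numpy as np

    def lsq_gradient(xc, phi_c, xn, phi_n, power=1.0):
        """Weighted linear least-squares gradient at a cell center.

        xc, phi_c : centroid and value of the cell of interest
        xn, phi_n : centroids and values of the stencil (neighbour) cells
        power     : inverse-distance weighting exponent (an assumed choice)

        Minimizes sum_k [w_k * (grad . (x_k - xc) - (phi_k - phi_c))]^2.
        """
        dx = xn - xc                        # (K, dim) displacement vectors
        dphi = phi_n - phi_c                # (K,)  value differences
        w = 1.0 / np.linalg.norm(dx, axis=1) ** power
        A = dx * w[:, None]
        b = dphi * w
        grad, *_ = np.linalg.lstsq(A, b, rcond=None)
        return grad

    # Usage sketch on a linear field phi = 2x + 3y, whose exact gradient is (2, 3).
    xc = np.array([0.0, 0.0]); phi_c = 0.0
    xn = np.array([[1.0, 0.2], [-0.4, 0.9], [0.3, -1.1], [-0.8, -0.5]])
    phi_n = 2 * xn[:, 0] + 3 * xn[:, 1]
    print(lsq_gradient(xc, phi_c, xn, phi_n))   # ~ [2. 3.]
    ```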

  3. Axisymmetric charge-conservative electromagnetic particle simulation algorithm on unstructured grids: Application to microwave vacuum electronic devices

    Science.gov (United States)

    Na, Dong-Yeop; Omelchenko, Yuri A.; Moon, Haksu; Borges, Ben-Hur V.; Teixeira, Fernando L.

    2017-10-01

    We present a charge-conservative electromagnetic particle-in-cell (EM-PIC) algorithm optimized for the analysis of vacuum electronic devices (VEDs) with cylindrical symmetry (axisymmetry). We exploit the axisymmetry present in the device geometry, fields, and sources to reduce the dimensionality of the problem from 3D to 2D. Further, we employ 'transformation optics' principles to map the original problem in polar coordinates with metric tensor diag(1, ρ², 1) to an equivalent problem on a Cartesian metric tensor diag(1, 1, 1) with an effective (artificial) inhomogeneous medium introduced. The resulting problem in the meridian (ρz) plane is discretized using an unstructured 2D mesh considering TEϕ-polarized fields. Electromagnetic field and source (node-based charges and edge-based currents) variables are expressed as differential forms of various degrees, and discretized using Whitney forms. Using leapfrog time integration, we obtain a mixed E-B finite-element time-domain scheme for the full-discrete Maxwell's equations. We achieve a local and explicit time update for the field equations by employing the sparse approximate inverse (SPAI) algorithm. Interpolating field values to particles' positions for solving Newton-Lorentz equations of motion is also done via Whitney forms. Particles are advanced using the Boris algorithm with relativistic correction. A recently introduced charge-conserving scatter scheme tailored for 2D unstructured grids is used in the scatter step. The algorithm is validated considering cylindrical cavity and space-charge-limited cylindrical diode problems. We use the algorithm to investigate the physical performance of VEDs designed to harness particle bunching effects arising from the coherent (resonance) Cerenkov electron beam interactions within micro-machined slow wave structures.
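    The particle push referenced in this record is the Boris algorithm. The sketch below shows the standard non-relativistic Boris velocity update (half electric kick, magnetic rotation, half electric kick); the relativistic correction used in the cited solver is omitted for brevity, and the charge, mass and field values in the usage example are placeholders.

    ```python
    import numpy as np

    def boris_push(v, E, B, q, m, dt):
        """One non-relativistic Boris velocity update.

        Half electric kick -> magnetic rotation -> half electric kick.
        The cited solver adds a relativistic correction, omitted here.
        """
        qmdt2 = q * dt / (2.0 * m)
        v_minus = v + qmdt2 * E                  # first half acceleration
        t = qmdt2 * B                            # rotation vector
        s = 2.0 * t / (1.0 + np.dot(t, t))
        v_prime = v_minus + np.cross(v_minus, t)
        v_plus = v_minus + np.cross(v_prime, s)  # rotation about B
        return v_plus + qmdt2 * E                # second half acceleration

    # Usage sketch: an electron-like particle gyrating in a uniform magnetic field.
    v = np.array([1.0e5, 0.0, 0.0])
    E = np.zeros(3)
    B = np.array([0.0, 0.0, 1.0e-2])
    for _ in range(1000):
        v = boris_push(v, E, B, q=-1.602e-19, m=9.109e-31, dt=1.0e-12)
    print("speed is conserved:", np.linalg.norm(v))   # ~1.0e5, pure rotation
    ```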

  4. Knowledge and theme discovery across very large biological data sets using distributed queries: a prototype combining unstructured and structured data.

    Directory of Open Access Journals (Sweden)

    Uma S Mudunuri

    Full Text Available As the discipline of biomedical science continues to apply new technologies capable of producing unprecedented volumes of noisy and complex biological data, it has become evident that available methods for deriving meaningful information from such data are simply not keeping pace. In order to achieve useful results, researchers require methods that consolidate, store and query combinations of structured and unstructured data sets efficiently and effectively. As we move towards personalized medicine, the need to combine unstructured data, such as medical literature, with large amounts of highly structured and high-throughput data such as human variation or expression data from very large cohorts, is especially urgent. For our study, we investigated a likely biomedical query using the Hadoop framework. We ran queries using native MapReduce tools we developed as well as other open source and proprietary tools. Our results suggest that the available technologies within the Big Data domain can reduce the time and effort needed to utilize and apply distributed queries over large datasets in practical clinical applications in the life sciences domain. The methodologies and technologies discussed in this paper set the stage for a more detailed evaluation that investigates how various data structures and data models are best mapped to the proper computational framework.
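    As a hedged, in-memory analogue of the distributed map and reduce phases discussed in this record (not the authors' Hadoop tooling), the sketch below counts mentions of invented gene symbols in a few unstructured text snippets.

    ```python
    from collections import Counter
    from functools import reduce

    # Hypothetical unstructured records (e.g. literature snippets); the gene
    # symbols and text are invented for illustration only.
    documents = [
        "BRCA1 variant discussed alongside TP53 in cohort A",
        "TP53 expression elevated; TP53 pathway reviewed",
        "no notable findings",
    ]
    genes_of_interest = {"BRCA1", "TP53"}

    # Map phase: each document independently emits (gene, 1) pairs.
    def map_doc(doc):
        return [(tok, 1) for tok in doc.split() if tok in genes_of_interest]

    # Reduce phase: merge the partial counts from all mappers.
    def reduce_counts(acc, pairs):
        for gene, n in pairs:
            acc[gene] += n
        return acc

    counts = reduce(reduce_counts, map(map_doc, documents), Counter())
    print(counts)   # Counter({'TP53': 3, 'BRCA1': 1})
    ```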

  5. Baseline Validation of Unstructured Grid Reynolds-Averaged Navier-Stokes Toward Flow Control

    Science.gov (United States)

    Joslin, Ronald D.; Viken, Sally A.

    2001-01-01

    The value of the use of the Reynolds-averaged Navier-Stokes methodology for active flow control applications is assessed. An experimental flow control database exists for a NACA0015 airfoil modified at the leading edge to implement a fluidic actuator; hence, this configuration is used. Computational results are documented for the baseline wing configuration (no control) alongside the experimental results, assuming two-dimensional flow. The baseline wing configuration has discontinuities at the leading edge, trailing edge, and aft of midchord on the upper surface. A limited number of active flow control applications have been tested in the laboratory and in flight. These applications include dynamic stall control using a deformable leading edge, separation control for takeoff and landing flight conditions using piezoelectric devices, pulsed vortex generators, zero-net-mass oscillations, and thrust vectoring with zero-net-mass piezoelectric-driven oscillatory actuation. As yet, there is no definitive comparison with experimental data that indicates current computational capabilities can quantitatively predict the large aerodynamic performance gains achieved with active flow control in the laboratory. However, one study using the Reynolds-averaged Navier-Stokes (RANS) methodology has shown good quantitative agreement with experimental results for an isolated zero-net-mass actuator. In addition, some recent studies have used RANS to demonstrate qualitative performance gains compared with the experimental data for separation control on an airfoil. Those quantitative comparisons for both baseline and flow control cases indicated that computational results were in poor quantitative agreement with the experiments. The current research thrust will investigate the potential use of an unstructured grid RANS approach to predict aerodynamic performance for active flow control applications, building on the early studies. First the computational results must quantitatively match

  6. Stripping scattering of fast atoms on surfaces of metal-oxide crystals and ultrathin films

    International Nuclear Information System (INIS)

    Blauth, David

    2010-01-01

    In the framework of the present dissertation the interactions of fast atoms with surfaces of bulk oxides, metals and thin films on metals were studied. The experiments were performed in the regime of grazing incidence of atoms with energies of some keV. The advantage of this scattering geometry is the high surface sensitivity and thus the possibility to determine the crystallographic and electronic characteristics of the topmost surface layer. In addition to these experiments, the energy loss and the electron emission induced by scattered projectiles were investigated. The energies for electron emission and exciton excitation on alumina/NiAl(110) and SiO2/Mo(112) are determined. By detection of the number of projectile-induced emitted electrons as a function of the azimuthal angle for the rotation of the target surface, the geometrical structure of atoms forming the topmost layer of different adsorbate films on metal surfaces was determined via ion beam triangulation. (orig.)

  7. 'Unstructured Data' Practices in Polar Institutions and Networks: a Case Study with the Arctic Options Project

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2014-10-01

    Full Text Available Arctic Options: Holistic Integration for Arctic Coastal-Marine Sustainability is a new three-year research project to assess future infrastructure associated with the Arctic Ocean regarding: (1) natural and living environment; (2) built environment; (3) natural resource development; and (4) governance. For the assessments, Arctic Options will generate objective relational schema from numeric data as well as textual data. This paper will focus on the ‘long tail of smaller, heterogeneous, and often unstructured datasets’ that ‘usually receive minimal data management consideration’, as observed in the 2013 Communiqué from the International Forum on Polar Data Activities in Global Data Systems.

  8. Role of the EHD2 unstructured loop in dimerization, protein binding and subcellular localization.

    Directory of Open Access Journals (Sweden)

    Kriti Bahl

    Full Text Available The C-terminal Eps 15 Homology Domain proteins (EHD1-4) play important roles in regulating endocytic trafficking. EHD2 is the only family member whose crystal structure has been solved, and it contains an unstructured loop consisting of two proline-phenylalanine (PF) motifs: KPFRKLNPF. In contrast, despite EHD2 having nearly 70% amino acid identity with its paralogs, EHD1, EHD3 and EHD4, the latter proteins contain a single KPF or RPF motif, but no NPF motif. In this study, we sought to define the precise role of each PF motif in EHD2's homo-dimerization, binding with the protein partners, and subcellular localization. To test the role of the NPF motif, we generated an EHD2 NPF-to-NAF mutant to mimic the homologous sequences of EHD1 and EHD3. We demonstrated that this mutant lost both its ability to dimerize and bind to Syndapin2. However, it continued to localize primarily to the cytosolic face of the plasma membrane. On the other hand, EHD2 NPF-to-APA mutants displayed normal dimerization and Syndapin2 binding, but exhibited markedly increased nuclear localization and reduced association with the plasma membrane. We then hypothesized that the single PF motif of EHD1 (that aligns with the KPF of EHD2) might be responsible for both binding and localization functions of EHD1. Indeed, the EHD1 RPF motif was required for dimerization, interaction with MICAL-L1 and Syndapin2, as well as localization to tubular recycling endosomes. Moreover, recycling assays demonstrated that EHD1 RPF-to-APA was incapable of supporting normal receptor recycling. Overall, our data suggest that the EHD2 NPF phenylalanine residue is crucial for EHD2 localization to the plasma membrane, whereas the proline residue is essential for EHD2 dimerization and binding. These studies support the recently proposed model in which the EHD2 N-terminal region may regulate the availability of the unstructured loop for interactions with neighboring EHD2 dimers, thus promoting

  9. Mobile 3D Viewer Supporting RFID System

    International Nuclear Information System (INIS)

    Kim, J. J.; Yang, S. W.; Choi, Y.

    2007-01-01

    As hardware capabilities of mobile devices are being rapidly enhanced, applications based upon mobile devices are also being developed in wider areas. In this paper, a prototype mobile 3D viewer with the object identification through RFID system is presented. To visualize 3D engineering data such as CAD data, we need a process to compute triangulated data from boundary based surface like B-rep solid or trimmed surfaces. Since existing rendering engines on mobile devices do not provide triangulation capability, mobile 3D programs have focused only on an efficient handling with pre-tessellated geometry. We have developed a light and fast triangulation process based on constrained Delaunay triangulation suitable for mobile devices in the previous research. This triangulation software is used as a core for the mobile 3D viewer on a PDA with RFID system that may have potentially wide applications in many areas

  10. Mobile 3D Viewer Supporting RFID System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J J; Yang, S W; Choi, Y [Chungang Univ., Seoul (Korea, Republic of)

    2007-07-01

    As hardware capabilities of mobile devices are being rapidly enhanced, applications based upon mobile devices are also being developed in wider areas. In this paper, a prototype mobile 3D viewer with the object identification through RFID system is presented. To visualize 3D engineering data such as CAD data, we need a process to compute triangulated data from boundary based surface like B-rep solid or trimmed surfaces. Since existing rendering engines on mobile devices do not provide triangulation capability, mobile 3D programs have focused only on an efficient handling with pre-tessellated geometry. We have developed a light and fast triangulation process based on constrained Delaunay triangulation suitable for mobile devices in the previous research. This triangulation software is used as a core for the mobile 3D viewer on a PDA with RFID system that may have potentially wide applications in many areas.

  11. The VIPER project (Visualization Integration Platform for Exploration Research): a biologically inspired autonomous reconfigurable robotic platform for diverse unstructured environments

    Science.gov (United States)

    Schubert, Oliver J.; Tolle, Charles R.

    2004-09-01

    Over the last decade the world has seen numerous autonomous vehicle programs. Wheels and track designs are the basis for many of these vehicles. This is primarily due to four main reasons: a vast preexisting knowledge base for these designs, energy efficiency of power sources, scalability of actuators, and the lack of control systems technologies for handling alternate highly complex distributed systems. Though large efforts seek to improve the mobility of these vehicles, many limitations still exist for these systems within unstructured environments, e.g. limited mobility within industrial and nuclear accident sites where existing plant configurations have been extensively changed. These unstructured operational environments include missions for exploration, reconnaissance, and emergency recovery of objects within reconfigured or collapsed structures, e.g. bombed buildings. More importantly, these environments present a clear and present danger for direct human interactions during the initial phases of recovery operations. Clearly, the current classes of autonomous vehicles are incapable of performing in these environments. Thus the next generation of designs must include highly reconfigurable and flexible autonomous robotic platforms. This new breed of autonomous vehicles will be both highly flexible and environmentally adaptable. Presented in this paper is one of the most successful designs from nature, the snake-eel-worm (SEW). This design implements shape memory alloy (SMA) actuators which allow for scaling of the robotic SEW designs from sub-micron scale to heavy industrial implementations without major conceptual redesigns as required in traditional hydraulic, pneumatic, or motor driven systems. Autonomous vehicles based on the SEW design possess the ability to easily move between air-based environments and fluid-based environments with limited or no reconfiguration. Under a SEW designed vehicle, one not only achieves vastly improved maneuverability within a

  12. Hanging Out with Which Friends? Friendship-Level Predictors of Unstructured and Unsupervised Socializing in Adolescence

    Science.gov (United States)

    Siennick, Sonja E.; Osgood, D. Wayne

    2012-01-01

    Companions are central to explanations of the risky nature of unstructured and unsupervised socializing, yet we know little about whom adolescents are with when hanging out. We examine predictors of how often friendship dyads hang out via multilevel analyses of longitudinal friendship-level data on over 5,000 middle schoolers. Adolescents hang out most with their most available friends and their most generally similar friends, not with their most at-risk or similarly at-risk friends. These findings vary little by gender and wave. Together, the findings suggest that the risks of hanging out stem from the nature of hanging out as an activity, not the nature of adolescents’ companions, and that hanging out is a context for friends’ mutual reinforcement of pre-existing characteristics. PMID:23204811

  13. Hanging Out with Which Friends? Friendship-Level Predictors of Unstructured and Unsupervised Socializing in Adolescence.

    Science.gov (United States)

    Siennick, Sonja E; Osgood, D Wayne

    2012-12-01

    Companions are central to explanations of the risky nature of unstructured and unsupervised socializing, yet we know little about whom adolescents are with when hanging out. We examine predictors of how often friendship dyads hang out via multilevel analyses of longitudinal friendship-level data on over 5,000 middle schoolers. Adolescents hang out most with their most available friends and their most generally similar friends, not with their most at-risk or similarly at-risk friends. These findings vary little by gender and wave. Together, the findings suggest that the risks of hanging out stem from the nature of hanging out as an activity, not the nature of adolescents' companions, and that hanging out is a context for friends' mutual reinforcement of pre-existing characteristics.

  14. Medical and dermatology dictionaries: an examination of unstructured definitions and a proposal for the future.

    Science.gov (United States)

    DeVries, David Todd; Papier, Art; Byrnes, Jennifer; Goldsmith, Lowell A

    2004-01-01

    Medical dictionaries serve to describe and clarify the term set used by medical professionals. In this commentary, we analyze a representative set of skin disease definitions from 2 prominent medical dictionaries, Stedman's Medical Dictionary and Dorland's Illustrated Medical Dictionary. We find that there is an apparent lack of stylistic standards with regard to content and form. We advocate a new standard form for the definition of medical terminology, a standard to complement the easy-to-read yet unstructured style of the traditional dictionary entry. This new form offers a reproducible structure, paving the way for the development of a computer readable "dictionary" of medical terminology. Such a dictionary offers immediate update capability and a fundamental improvement in the ability to search for relationships between terms.

  15. Influence of Surface Position along the Working Range of Conoscopic Holography Sensors on Dimensional Verification of AISI 316 Wire EDM Machined Surfaces

    Directory of Open Access Journals (Sweden)

    Pedro Fernández

    2014-03-01

    Full Text Available Conoscopic holography (CH) is a non-contact interferometric technique used for surface digitization which presents several advantages over other optical techniques such as laser triangulation. Among others, the ability for the reconstruction of high-sloped surfaces stands out, and so does its lower dependence on surface optical properties. Nevertheless, similarly to other optical systems, adjustment of CH sensors requires an adequate selection of configuration parameters for ensuring a high quality surface digitizing. This should be done on a surface located as close as possible to the stand-off distance by tuning frequency (F) and power (P) until the quality indicators Signal-to-Noise Ratio (SNR) and signal envelope (Total) meet proper values. However, not all the points of an actual surface are located at the stand-off distance, but they could be located throughout the whole working range (WR). Thus, the quality of a digitized surface may not be uniform. The present work analyses how the quality of a reconstructed surface is affected by its relative position within the WR under different combinations of the parameters F and P. Experiments have been conducted on AISI 316 wire EDM machined flat surfaces. The number of high-quality points digitized as well as distance measurements between different surfaces throughout the WR allowed for comparing the metrological behaviour of the CH sensor with respect to a touch probe (TP) on a CMM.

  16. Geometrically Flexible and Efficient Flow Analysis of High Speed Vehicles Via Domain Decomposition, Part 1: Unstructured-Grid Solver for High Speed Flows

    Science.gov (United States)

    White, Jeffery A.; Baurle, Robert A.; Passe, Bradley J.; Spiegel, Seth C.; Nishikawa, Hiroaki

    2017-01-01

    The ability to solve the equations governing the hypersonic turbulent flow of a real gas on unstructured grids using a spatially-elliptic, 2nd-order accurate, cell-centered, finite-volume method has been recently implemented in the VULCAN-CFD code. This paper describes the key numerical methods and techniques that were found to be required to robustly obtain accurate solutions to hypersonic flows on non-hex-dominant unstructured grids. The methods and techniques described include: an augmented stencil, weighted linear least squares, cell-average gradient method, a robust multidimensional cell-average gradient-limiter process that is consistent with the augmented stencil of the cell-average gradient method and a cell-face gradient method that contains a cell skewness sensitive damping term derived using hyperbolic diffusion based concepts. A data-parallel matrix-based symmetric Gauss-Seidel point-implicit scheme, used to solve the governing equations, is described and shown to be more robust and efficient than a matrix-free alternative. In addition, a y+ adaptive turbulent wall boundary condition methodology is presented. This boundary condition methodology is designed to automatically switch between a solve-to-the-wall and a wall-matching-function boundary condition based on the local y+ of the 1st cell center off the wall. The aforementioned methods and techniques are then applied to a series of hypersonic and supersonic turbulent flat plate unit tests to examine the efficiency, robustness and convergence behavior of the implicit scheme and to determine the ability of the solve-to-the-wall and y+ adaptive turbulent wall boundary conditions to reproduce the turbulent law-of-the-wall. Finally, the thermally perfect, chemically frozen, Mach 7.8 turbulent flow of air through a scramjet flow-path is computed and compared with experimental data to demonstrate the robustness, accuracy and convergence behavior of the unstructured-grid solver for a realistic 3-D geometry on
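    The y+ adaptive wall treatment described here switches between a solve-to-the-wall formulation and a wall-matching function depending on the first cell-center y+. The sketch below evaluates the corresponding non-dimensional velocity with standard law-of-the-wall constants and an assumed switching value; it illustrates the switching logic only and is not the VULCAN-CFD implementation.

    ```python
    import numpy as np

    def wall_velocity(y_plus, kappa=0.41, B=5.2, y_plus_switch=11.0):
        """Dimensionless near-wall velocity u+ as a function of y+.

        Below the (assumed) switch value the viscous-sublayer relation u+ = y+
        is used (the 'solve-to-the-wall' regime); above it the logarithmic
        law-of-the-wall is used, mimicking the y+ adaptive switching described
        in the record. The constants are standard log-law values, not
        necessarily those of the cited solver.
        """
        if y_plus < y_plus_switch:
            return y_plus
        return np.log(y_plus) / kappa + B

    # Usage sketch: evaluate u+ across a range of first-cell y+ values.
    for yp in (1.0, 5.0, 30.0, 100.0):
        print(f"y+ = {yp:6.1f}  ->  u+ = {wall_velocity(yp):6.2f}")
    ```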

  17. Surface Modeling, Grid Generation, and Related Issues in Computational Fluid Dynamic (CFD) Solutions

    Science.gov (United States)

    Choo, Yung K. (Compiler)

    1995-01-01

    The NASA Steering Committee for Surface Modeling and Grid Generation (SMAGG) sponsored a workshop on surface modeling, grid generation, and related issues in Computational Fluid Dynamics (CFD) solutions at Lewis Research Center, Cleveland, Ohio, May 9-11, 1995. The workshop provided a forum to identify industry needs, strengths, and weaknesses of the five grid technologies (patched structured, overset structured, Cartesian, unstructured, and hybrid), and to exchange thoughts about where each technology will be in 2 to 5 years. The workshop also provided opportunities for engineers and scientists to present new methods, approaches, and applications in SMAGG for CFD. This Conference Publication (CP) consists of papers on industry overview, NASA overview, five grid technologies, new methods/ approaches/applications, and software systems.

  18. The quasidiffusion method for transport problems on unstructured meshes

    Science.gov (United States)

    Wieselquist, William A.

    2009-06-01

    In this work, we develop a quasidiffusion (QD) method for solving radiation transport problems on unstructured quadrilateral meshes in 2D Cartesian geometry, for example hanging-node meshes from adaptive mesh refinement (AMR) applications or skewed quadrilateral meshes from radiation hydrodynamics with Lagrangian meshing. The main result of the work is a new low-order quasidiffusion (LOQD) discretization on arbitrary quadrilaterals and a strategy for the efficient iterative solution which uses Krylov methods and incomplete LU factorization (ILU) preconditioning. The LOQD equations are a non-symmetric set of first-order PDEs that in second-order form resembles convection-diffusion with a diffusion tensor, with the difference that the LOQD equations contain extra cross-derivative terms. Our finite volume (FV) discretization of the LOQD equations is compared with three LOQD discretizations from literature. We then present a conservative, short characteristics discretization based on subcell balances (SCSB) that uses polynomial exponential moments to achieve robust behavior in various limits (e.g. small cells and voids) and is second-order accurate in space. A linear representation of the isotropic component of the scattering source based on face-average and cell-average scalar fluxes is also proposed and shown to be effective in some problems. In numerical tests, our QD method with linear scattering source representation shows some advantages compared to other transport methods. We conclude with avenues for future research and note that this QD method may easily be extended to arbitrary meshes in 3D Cartesian geometry.
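    The iterative strategy described in this record combines Krylov methods with incomplete LU (ILU) preconditioning. The sketch below shows that combination with SciPy on a small stand-in non-symmetric sparse system; the matrix is purely illustrative and not the LOQD operator of the thesis.

    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Stand-in sparse, non-symmetric system playing the role of the discretized
    # LOQD equations (the actual operator in the thesis is of course different).
    n = 200
    main = 2.0 * np.ones(n)
    lower = -1.2 * np.ones(n - 1)   # asymmetry mimics convection-like terms
    upper = -0.8 * np.ones(n - 1)
    A = sp.diags([lower, main, upper], offsets=[-1, 0, 1], format="csc")
    b = np.ones(n)

    # Incomplete LU factorization used as a preconditioner for GMRES.
    ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
    M = spla.LinearOperator(A.shape, matvec=ilu.solve)

    x, info = spla.gmres(A, b, M=M)
    print("converged:", info == 0, " residual:", np.linalg.norm(A @ x - b))
    ```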

  19. A novel consistent and well-balanced algorithm for simulations of multiphase flows on unstructured grids

    Science.gov (United States)

    Patel, Jitendra Kumar; Natarajan, Ganesh

    2017-12-01

    We discuss the development and assessment of a robust numerical algorithm for simulating multiphase flows with complex interfaces and high density ratios on arbitrary polygonal meshes. The algorithm combines the volume-of-fluid method with an incremental projection approach for incompressible multiphase flows in a novel hybrid staggered/non-staggered framework. The key principles that characterise the algorithm are the consistent treatment of discrete mass and momentum transport and the similar discretisation of force terms appearing in the momentum equation. The former is achieved by invoking identical schemes for convective transport of volume fraction and momentum in the respective discrete equations while the latter is realised by representing the gravity and surface tension terms as gradients of suitable scalars which are then discretised in identical fashion resulting in a balanced formulation. The hybrid staggered/non-staggered framework employed herein solves for the scalar normal momentum at the cell faces, while the volume fraction is computed at the cell centroids. This is shown to naturally lead to similar terms for pressure and its correction in the momentum and pressure correction equations respectively, which are again treated discretely in a similar manner. We show that spurious currents that corrupt the solution may arise both from an unbalanced formulation where forces (gravity and surface tension) are discretised in dissimilar manner and from an inconsistent approach where different schemes are used to convect the mass and momentum, with the latter prominent in flows which are convection-dominant with high density ratios. Interestingly, the inconsistent approach is shown to perform as well as the consistent approach even for high density ratio flows in some cases while it exhibits anomalous behaviour for other scenarios, even at low density ratios. Using a plethora of test problems of increasing complexity, we conclusively demonstrate that the

  20. An evaluation of orthopaedic nurses’ participation in an educational intervention promoting research utilization – A triangulation convergence model

    DEFF Research Database (Denmark)

    Berthelsen, Connie Bøttcher; Hølge-Hazelton, Bibi

    2016-01-01

    Aims and objectives To describe the orthopaedic nurses' experiences regarding the relevance of an educational intervention and their personal and contextual barriers to participation in the intervention. Background One of the largest barriers against nurses' research usage in clinical practice...... is the lack of participation. A previous survey identified 32 orthopaedic nurses as interested in participating in nursing research. An educational intervention was conducted to increase the orthopaedic nurses' research knowledge and competencies. However, only an average of six nurses participated. Design...... A triangulation convergence model was applied through a mixed methods design to combine quantitative results and qualitative findings for evaluation. Methods Data were collected from 2013–2014 from 32 orthopaedic nurses in a Danish regional hospital through a newly developed 21-item questionnaire and two focus...

  1. Second order finite volume scheme for Maxwell's equations with discontinuous electromagnetic properties on unstructured meshes

    Energy Technology Data Exchange (ETDEWEB)

    Ismagilov, Timur Z., E-mail: ismagilov@academ.org

    2015-02-01

    This paper presents a second order finite volume scheme for numerical solution of Maxwell's equations with discontinuous dielectric permittivity and magnetic permeability on unstructured meshes. The scheme is based on Godunov scheme and employs approaches of Van Leer and Lax–Wendroff to increase the order of approximation. To keep the second order of approximation near dielectric permittivity and magnetic permeability discontinuities a novel technique for gradient calculation and limitation is applied near discontinuities. Results of test computations for problems with linear and curvilinear discontinuities confirm second order of approximation. The scheme was applied to modelling propagation of electromagnetic waves inside photonic crystal waveguides with a bend.
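    The scheme raises the order of approximation with Van Leer-type reconstruction and limits gradients near discontinuities. As a generic one-dimensional illustration (not the paper's multidimensional limiter for Maxwell's equations), the sketch below applies the classical Van Leer limiter to a cell slope.

    ```python
    def van_leer_limiter(r):
        """Classical Van Leer limiter phi(r) = (r + |r|) / (1 + |r|)."""
        return (r + abs(r)) / (1.0 + abs(r))

    def limited_slope(u_left, u_center, u_right, dx):
        """Limited one-dimensional slope in a cell from its two neighbours."""
        forward = (u_right - u_center) / dx
        backward = (u_center - u_left) / dx
        if forward == 0.0:
            return 0.0
        r = backward / forward
        return van_leer_limiter(r) * forward

    # Near a discontinuity the limiter suppresses the slope, avoiding overshoot:
    print(limited_slope(1.0, 1.0, 0.0, dx=1.0))   # 0.0 (no new extremum created)
    print(limited_slope(0.0, 0.5, 1.0, dx=1.0))   # 0.5 (smooth data keeps 2nd order)
    ```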

  2. Energy harvesting through gas dynamics in the free molecular flow regime between structured surfaces at different temperatures

    Science.gov (United States)

    Baier, Tobias; Dölger, Julia; Hardt, Steffen

    2014-05-01

    For a gas confined between surfaces held at different temperatures the velocity distribution shows a significant deviation from the Maxwell distribution when the mean free path of the molecules is comparable to or larger than the channel dimensions. If one of the surfaces is suitably structured, this nonequilibrium distribution can be exploited for momentum transfer in a tangential direction between the two surfaces. This opens up the possibility to extract work from the system which operates as a heat engine. Since both surfaces are held at constant temperatures, the mode of momentum transfer is different from the thermal creep flow that has gained more attention so far. This situation is studied in the limit of free-molecular flow for the case that an unstructured surface is allowed to move tangentially with respect to a structured surface. Parameter studies are conducted, and configurations with maximum thermodynamic efficiency are identified. Overall, it is shown that significant efficiencies can be obtained by tangential momentum transfer between structured surfaces.

  3. CFD simulation of rotor aerodynamic performance when using additional surface structure array

    Science.gov (United States)

    Wang, Bing; Kong, Deyi

    2017-10-01

    The present work analyses the aerodynamic performance of the rotor with additional surface structure array in an attempt to maximize its performance in hover flight. The unstructured grids and the Reynolds Average Navier-Stokes equations were used to calculate the performance of the prototype rotor and the rotor with additional surface structure array in the air. The computational fluid dynamics software FLUENT was used to simulate the thrust of the rotors. The results of the calculations are in reasonable agreement with experimental data, which shows that the calculation model used in this work is useful in simulating the performance of the rotor with additional surface structure array. With this theoretical calculation model, the thrusts of the rotors with arrays of surface structure in three different shapes were calculated. According to the simulation results and the experimental data, the rotor with triangle surface structure array has better aerodynamic performance than the other rotors. In contrast with the prototype rotor, the thrust of the rotor with triangle surface structure array increases by 5.2% at the operating rotating speed of 3000r/min, and the additional triangle surface structure array has almost no influence on the efficiency of the rotor.

  4. An Automated Approach to the Generation of Structured Building Information Models from Unstructured 3d Point Cloud Scans

    DEFF Research Database (Denmark)

    Tamke, Martin; Evers, Henrik Leander; Wessel, Raoul

    2016-01-01

    In this paper we present and evaluate an approach for the automatic generation of building models in IFC BIM format from unstructured Point Cloud scans, as they result from 3D laser scans of buildings. While the actual measurement process is relatively fast, 85% of the overall time is spent on the interpretation and transformation of the resulting Point Cloud data into information, which can be used in architectural and engineering design workflows. Our approach to tackling this problem is, in contrast to existing ones which work on the level of points, based on the detection of building elements...

  5. On Discrete Killing Vector Fields and Patterns on Surfaces

    KAUST Repository

    Ben-Chen, Mirela

    2010-09-21

    Symmetry is one of the most important properties of a shape, unifying form and function. It encodes semantic information on one hand, and affects the shape's aesthetic value on the other. Symmetry comes in many flavors, amongst the most interesting being intrinsic symmetry, which is defined only in terms of the intrinsic geometry of the shape. Continuous intrinsic symmetries can be represented using infinitesimal rigid transformations, which are given as tangent vector fields on the surface - known as Killing Vector Fields. As exact symmetries are quite rare, especially when considering noisy sampled surfaces, we propose a method for relaxing the exact symmetry constraint to allow for approximate symmetries and approximate Killing Vector Fields, and show how to discretize these concepts for generating such vector fields on a triangulated mesh. We discuss the properties of approximate Killing Vector Fields, and propose an application to utilize them for texture and geometry synthesis. Journal compilation © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  6. Reconstruction of freeform surfaces for metrology

    International Nuclear Information System (INIS)

    El-Hayek, N; Nouira, H; Anwer, N; Damak, M; Gibaru, O

    2014-01-01

    The application of freeform surfaces has increased since their complex shapes closely express a product's functional specifications and their machining is obtained with higher accuracy. In particular, optical surfaces exhibit enhanced performance especially when they take aspheric forms or more complex forms with multi-undulations. This study is mainly focused on the reconstruction of complex shapes such as freeform optical surfaces, and on the characterization of their form. The computer graphics community has proposed various algorithms for constructing a mesh based on the cloud of sample points. The mesh is a piecewise linear approximation of the surface and an interpolation of the point set. The mesh can further be processed for fitting parametric surfaces (Polyworks ® or Geomagic ® ). The metrology community investigates direct fitting approaches. If the surface mathematical model is given, fitting is a straight forward task. Nonetheless, if the surface model is unknown, fitting is only possible through the association of polynomial Spline parametric surfaces. In this paper, a comparative study carried out on methods proposed by the computer graphics community will be presented to elucidate the advantages of these approaches. We stress the importance of the pre-processing phase as well as the significance of initial conditions. We further emphasize the importance of the meshing phase by stating that a proper mesh has two major advantages. First, it organizes the initially unstructured point set and it provides an insight of orientation, neighbourhood and curvature, and infers information on both its geometry and topology. Second, it conveys a better segmentation of the space, leading to a correct patching and association of parametric surfaces

  7. Constructing Social Networks from Unstructured Group Dialog in Virtual Worlds

    Science.gov (United States)

    Shah, Fahad; Sukthankar, Gita

    Virtual worlds and massively multi-player online games are rich sources of information about large-scale teams and groups, offering the tantalizing possibility of harvesting data about group formation, social networks, and network evolution. However these environments lack many of the cues that facilitate natural language processing in other conversational settings and different types of social media. Public chat data often features players who speak simultaneously, use jargon and emoticons, and only erratically adhere to conversational norms. In this paper, we present techniques for inferring the existence of social links from unstructured conversational data collected from groups of participants in the Second Life virtual world. We present an algorithm for addressing this problem, Shallow Semantic Temporal Overlap (SSTO), that combines temporal and language information to create directional links between participants, and a second approach that relies on temporal overlap alone to create undirected links between participants. Relying on temporal overlap is noisy, resulting in a low precision and networks with many extraneous links. In this paper, we demonstrate that we can ameliorate this problem by using network modularity optimization to perform community detection in the noisy networks and severing cross-community links. Although using the content of the communications still results in the best performance, community detection is effective as a noise reduction technique for eliminating the extra links created by temporal overlap alone.
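
    A toy sketch of the temporal-overlap idea described above (not the SSTO algorithm itself): two speakers are linked whenever their utterances fall within a common time window, and modularity-based community detection can then prune noisy cross-community links. The chat log, speaker names and window length are invented for illustration.

        from itertools import combinations
        import networkx as nx

        # Toy chat log: (timestamp in seconds, speaker); real data would come from group chat.
        chat = [(0, "A"), (5, "B"), (8, "A"), (40, "C"), (44, "B"), (300, "D")]
        WINDOW = 30                     # assumed overlap window in seconds

        G = nx.Graph()                  # undirected links, as in the overlap-only baseline
        for (t1, s1), (t2, s2) in combinations(chat, 2):
            if s1 != s2 and abs(t1 - t2) <= WINDOW:
                w = G.get_edge_data(s1, s2, {"weight": 0})["weight"]
                G.add_edge(s1, s2, weight=w + 1)    # accumulate evidence for a social link

        # Community detection as a noise-reduction step on the overlap network.
        communities = nx.algorithms.community.greedy_modularity_communities(G)
        print(list(G.edges(data=True)), [sorted(c) for c in communities])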

  8. Three-Dimensional Incompressible Navier-Stokes Flow Computations about Complete Configurations Using a Multiblock Unstructured Grid Approach

    Science.gov (United States)

    Sheng, Chunhua; Hyams, Daniel G.; Sreenivas, Kidambi; Gaither, J. Adam; Marcum, David L.; Whitfield, David L.

    2000-01-01

    A multiblock unstructured grid approach is presented for solving three-dimensional incompressible inviscid and viscous turbulent flows about complete configurations. The artificial compressibility form of the governing equations is solved by a node-based, finite volume implicit scheme which uses a backward Euler time discretization. Point Gauss-Seidel relaxations are used to solve the linear system of equations at each time step. This work employs a multiblock strategy to the solution procedure, which greatly improves the efficiency of the algorithm by significantly reducing the memory requirements by a factor of 5 over the single-grid algorithm while maintaining a similar convergence behavior. The numerical accuracy of solutions is assessed by comparing with the experimental data for a submarine with stem appendages and a high-lift configuration.

  9. Matrix equation decomposition and parallel solution of systems resulting from unstructured finite element problems in electromagnetics

    Energy Technology Data Exchange (ETDEWEB)

    Cwik, T. [California Institute of Technology, Pasadena, CA (United States); Katz, D.S. [Cray Research, El Segundo, CA (United States)

    1996-12-31

    Finite element modeling has proven useful for accurately simulating scattered or radiated electromagnetic fields from complex three-dimensional objects whose geometry varies on the scale of a fraction of an electrical wavelength. An unstructured finite element model of realistic objects leads to a large, sparse, system of equations that needs to be solved efficiently with regard to machine memory and execution time. Both factorization and iterative solvers can be used to produce solutions to these systems of equations. Factorization leads to high memory requirements that limit the electrical problem size of three-dimensional objects that can be modeled. An iterative solver can be used to efficiently solve the system without excessive memory use and in a minimal amount of time if the convergence rate is controlled.
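
    To make the memory/time trade-off concrete, here is a generic SciPy sketch (a stand-in, not the authors' electromagnetic FEM system): an incomplete-LU-preconditioned GMRES iteration solves a large sparse system without the memory cost of a full factorization. The 2-D Laplacian test matrix is an assumption; a real finite element matrix would be complex-valued and much larger.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # Stand-in sparse system: a 2-D Laplacian on a 50 x 50 grid.
        n = 50
        lap1d = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
        A = sp.csc_matrix(sp.kronsum(lap1d, lap1d))
        b = np.ones(A.shape[0])

        # Incomplete LU keeps memory bounded, unlike a complete factorization.
        ilu = spla.spilu(A, drop_tol=1e-4)
        M = spla.LinearOperator(A.shape, matvec=ilu.solve)

        x, info = spla.gmres(A, b, M=M)
        print("converged" if info == 0 else f"flag {info}",
              "residual:", np.linalg.norm(A @ x - b))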

  10. A higher-order conservation element solution element method for solving hyperbolic differential equations on unstructured meshes

    Science.gov (United States)

    Bilyeu, David

    This dissertation presents an extension of the Conservation Element Solution Element (CESE) method from second- to higher-order accuracy. The new method retains the favorable characteristics of the original second-order CESE scheme, including (i) the use of the space-time integral equation for conservation laws, (ii) a compact mesh stencil, (iii) stability up to a CFL number of unity, (iv) a fully explicit, time-marching integration scheme, (v) true multidimensionality without using directional splitting, and (vi) the ability to handle two- and three-dimensional geometries by using unstructured meshes. This algorithm has been thoroughly tested in one, two and three spatial dimensions and has been shown to obtain the desired order of accuracy for solving both linear and non-linear hyperbolic partial differential equations. The scheme has also shown its ability to accurately resolve discontinuities in the solutions. Higher order unstructured methods such as the Discontinuous Galerkin (DG) method and the Spectral Volume (SV) methods have been developed for one-, two- and three-dimensional application. Although these schemes have seen extensive development and use, certain drawbacks of these methods have been well documented. For example, the explicit versions of these two methods have very stringent stability criteria. These stability criteria require that the time step be reduced as the order of the solver increases, for a given simulation on a given mesh. The research presented in this dissertation builds upon the work of Chang, who developed a fourth-order CESE scheme to solve a scalar one-dimensional hyperbolic partial differential equation. The completed research has resulted in two key deliverables. The first is a detailed derivation of high-order CESE methods on unstructured meshes for solving the conservation laws in two- and three-dimensional spaces. The second is the code implementation of these numerical methods in a computer code. For

  11. The unstructured linker arms of Mlh1-Pms1 are important for interactions with DNA during mismatch repair

    Science.gov (United States)

    Plys, Aaron J.; Rogacheva, Maria V.; Greene, Eric C.; Alani, Eric

    2012-01-01

    DNA mismatch repair (MMR) models have proposed that MSH proteins identify DNA polymerase errors while interacting with the DNA replication fork. MLH proteins (primarily Mlh1-Pms1 in baker’s yeast) then survey the genome for lesion-bound MSH proteins. The resulting MSH-MLH complex formed at a DNA lesion initiates downstream steps in repair. MLH proteins act as dimers and contain long (20 – 30 nanometers) unstructured arms that connect two terminal globular domains. These arms can vary between 100 and 300 amino acids in length, are highly divergent between organisms, and are resistant to amino acid substitutions. To test the roles of the linker arms in MMR, we engineered a protease cleavage site into the Mlh1 linker arm domain of baker’s yeast Mlh1-Pms1. Cleavage of the Mlh1 linker arm in vitro resulted in a defect in Mlh1-Pms1 DNA binding activity, and in vivo proteolytic cleavage resulted in a complete defect in MMR. We then generated a series of truncation mutants bearing Mlh1 and Pms1 linker arms of varying lengths. This work revealed that MMR is greatly compromised when portions of the Mlh1 linker are removed, whereas repair is less sensitive to truncation of the Pms1 linker arm. Purified complexes containing truncations in Mlh1 and Pms1 linker arms were analyzed and found to have differential defects in DNA binding that also correlated with the ability to form a ternary complex with Msh2-Msh6 and mismatch DNA. These observations are consistent with the unstructured linker domains of MLH proteins providing distinct interactions with DNA during MMR. PMID:22659005

  12. Leisure-time physical activity behavior: structured and unstructured choices according to sex, age, and level of physical activity.

    Science.gov (United States)

    Mota, Jorge; Esculcas, Carlos

    2002-01-01

    The main goals of this cross-sectional survey were (a) to describe the associations between sex, age, and physical activity behavior and (b) to describe the age and sex-related associations with the choice of structured (formal) and unstructured (nonformal) physical activity programs. At baseline, data were selected randomly from 1,013 students, from the 7th to the 12th grades. A response rate of 73% (n = 739) was obtained. Accordingly, the sample of this study consisted of 594 adolescents (304 females and 290 males) with mean age of 15.9 years (range 13-20). Physical activity was assessed by means of a questionnaire. A questionnaire about leisure activities was applied to the sample to define the nominal variable "nature of physical activity." The data showed that significantly more girls than boys (p < or = .001) belonged to the sedentary group (80.7% girls) and low activity group (64.5% girls). Boys more frequently belonged to the more active groups (92.1%; p < or = .001). The older participants were more engaged in formal physical activities, whereas the younger mostly chose informal ones whatever their level of physical activity. There were more significant differences in girls' physical activity groups (chi 2 = 20.663, p < or = .001) than in boys' (chi 2 = 7.662, p < or = .05). Furthermore, active girls chose more structured physical activities than their sedentary counterparts (18.8% vs. 83.3%). However, boys preferred unstructured activities regardless of physical activity group (83.7% vs. 58.5%; p < or = .05). It can be concluded that as age increased, organized sports activities became a relatively more important component of total weekly activity for both male and female participants.

  13. Large-Scale Parallel Viscous Flow Computations using an Unstructured Multigrid Algorithm

    Science.gov (United States)

    Mavriplis, Dimitri J.

    1999-01-01

    The development and testing of a parallel unstructured agglomeration multigrid algorithm for steady-state aerodynamic flows is discussed. The agglomeration multigrid strategy uses a graph algorithm to construct the coarse multigrid levels from the given fine grid, similar to an algebraic multigrid approach, but operates directly on the non-linear system using the FAS (Full Approximation Scheme) approach. The scalability and convergence rate of the multigrid algorithm are examined on the SGI Origin 2000 and the Cray T3E. An argument is given which indicates that the asymptotic scalability of the multigrid algorithm should be similar to that of its underlying single grid smoothing scheme. For medium size problems involving several million grid points, near perfect scalability is obtained for the single grid algorithm, while only a slight drop-off in parallel efficiency is observed for the multigrid V- and W-cycles, using up to 128 processors on the SGI Origin 2000, and up to 512 processors on the Cray T3E. For a large problem using 25 million grid points, good scalability is observed for the multigrid algorithm using up to 1450 processors on a Cray T3E, even when the coarsest grid level contains fewer points than the total number of processors.
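
    A greatly simplified sketch of graph-based agglomeration (not the paper's algorithm): unassigned vertices seed coarse cells that absorb their unassigned neighbours, which is the basic idea behind building coarse multigrid levels directly from the fine-grid graph. The adjacency list is a toy one-dimensional chain.

        def agglomerate(adjacency):
            """Greedy one-pass agglomeration: each unassigned vertex seeds a coarse
            cell that absorbs its still-unassigned fine-grid neighbours."""
            coarse_id, n_coarse = {}, 0
            for v in adjacency:
                if v in coarse_id:
                    continue
                coarse_id[v] = n_coarse
                for w in adjacency[v]:
                    if w not in coarse_id:
                        coarse_id[w] = n_coarse
                n_coarse += 1
            return coarse_id, n_coarse

        # Toy fine "grid": a chain of 8 vertices; expect roughly 2:1 coarsening.
        adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < 8] for i in range(8)}
        ids, n = agglomerate(adj)
        print(n, "coarse cells:", ids)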

  14. Positivity-preserving CE/SE schemes for solving the compressible Euler and Navier–Stokes equations on hybrid unstructured meshes

    KAUST Repository

    Shen, Hua

    2018-05-28

    We construct positivity-preserving space–time conservation element and solution element (CE/SE) schemes for solving the compressible Euler and Navier–Stokes equations on hybrid unstructured meshes consisting of triangular and rectangular elements. The schemes use an a posteriori limiter to prevent negative densities and pressures based on the premise of preserving optimal accuracy. The limiter enforces a constraint for spatial derivatives and does not change the conservative property of CE/SE schemes. Several numerical examples suggest that the proposed schemes preserve accuracy for smooth flows and strictly preserve positivity of densities and pressures for the problems involving near vacuum and very strong discontinuities.
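
    The a posteriori idea of constraining spatial derivatives can be illustrated in one dimension. The following is only a sketch under simplifying assumptions (a single scalar cell, not the CE/SE scheme itself): the slope of a linear reconstruction is scaled so that the face values stay above a small floor, leaving the cell average, and hence conservation, untouched.

        import numpy as np

        def limit_slope(u_bar, du_dx, dx, floor=1e-12):
            """Scale the cell slope so that the linear reconstruction
            u_bar + du_dx*(x - x_c) stays >= floor at both faces x_c +/- dx/2."""
            u_min = u_bar - 0.5 * dx * abs(du_dx)      # smallest face value
            if u_min >= floor:
                return du_dx                           # already positive, keep slope
            theta = max(0.0, (u_bar - floor) / (u_bar - u_min))
            return theta * du_dx                       # limited slope; the average is unchanged

        # Near-vacuum cell: average density 1e-3, but the unlimited slope would go negative.
        print(limit_slope(u_bar=1e-3, du_dx=0.5, dx=0.01))   # much smaller, positivity-preserving slope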

  15. The Impact of Unstructured Case Studies on Surface Learners: A Study of Second-Year Accounting Students

    Science.gov (United States)

    Wynn-Williams, Kate; Beatson, Nicola; Anderson, Cameron

    2016-01-01

    The empirical study described here uses the R-SPQ-2F questionnaire [Biggs, J., Kember, D., & Leung, D. Y. (2001). The revised two-factor study process questionnaire: R-SPQ-2F. "British Journal of Educational Psychology," 71(1), 133-149] to test deep and surface approaches to learning in a university intermediate-level accounting…

  16. Studies on aerodynamic interferences between the components of transport airplane using unstructured Navier-Stokes simulations

    International Nuclear Information System (INIS)

    Wang, G.; Ye, Z.

    2005-01-01

    It is well known that the aerodynamic interference flows widely exist between the components of conventional transport airplane, for example, the wing-fuselage juncture flow, wing-pylon-nacelle flow and tail-fuselage juncture flow. The main characteristic of these aerodynamic interferences is flow separation, which will increase the drag, reduce the lift and cause adverse influence on the stability and controllability of the airplane. Therefore, the modern civil transport designers should do their best to eliminate negative effects of aerodynamic interferences, which demands that the aerodynamic interferences between the aircraft components should be predicted and analyzed accurately. Today's CFD techniques provide us powerful and efficient analysis tools to achieve this objective. In this paper, computational investigations of the interferences between transport aircraft components have been carried out by using a viscous flow solver based on mixed element type unstructured meshes. (author)

  17. Unstructured characteristic method embedded with variational nodal method using domain decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Girardi, E.; Ruggieri, J.M. [CEA Cadarache (DER/SPRC/LEPH), 13 - Saint-Paul-lez-Durance (France). Dept. d' Etudes des Reacteurs; Santandrea, S. [CEA Saclay, Dept. Modelisation de Systemes et Structures DM2S/SERMA/LENR, 91 - Gif sur Yvette (France)

    2005-07-01

    This paper describes a recently-developed extension of our 'Multi-methods, multi-domains' (MM-MD) method for the solution of the multigroup transport equation. Based on a domain decomposition technique, our approach allows us to treat the one-group equation by cooperatively employing several numerical methods together. In this work, we describe the coupling of the Method of Characteristics (integro-differential equation, unstructured meshes) with the Variational Nodal Method (even parity equation, Cartesian meshes). Then, the coupling method is applied to the benchmark model of the Phebus experimental facility (CEA Cadarache). Our domain decomposition method gives us the capability to employ a very fine mesh in describing a particular fuel bundle with an appropriate numerical method (MOC), while using a much larger mesh size in the rest of the core, in conjunction with a coarse-mesh method (VNM). This application shows the benefits of our MM-MD approach, in terms of accuracy and computing time: the domain decomposition method allows us to reduce the CPU time, while preserving a good accuracy of the neutronic indicators: reactivity, core-to-bundle power coupling coefficient and flux error. (authors)

  18. Unstructured characteristic method embedded with variational nodal method using domain decomposition techniques

    International Nuclear Information System (INIS)

    Girardi, E.; Ruggieri, J.M.

    2005-01-01

    This paper describes a recently-developed extension of our 'Multi-methods, multi-domains' (MM-MD) method for the solution of the multigroup transport equation. Based on a domain decomposition technique, our approach allows us to treat the one-group equation by cooperatively employing several numerical methods together. In this work, we describe the coupling of the Method of Characteristics (integro-differential equation, unstructured meshes) with the Variational Nodal Method (even parity equation, Cartesian meshes). Then, the coupling method is applied to the benchmark model of the Phebus experimental facility (CEA Cadarache). Our domain decomposition method gives us the capability to employ a very fine mesh in describing a particular fuel bundle with an appropriate numerical method (MOC), while using a much larger mesh size in the rest of the core, in conjunction with a coarse-mesh method (VNM). This application shows the benefits of our MM-MD approach, in terms of accuracy and computing time: the domain decomposition method allows us to reduce the CPU time, while preserving a good accuracy of the neutronic indicators: reactivity, core-to-bundle power coupling coefficient and flux error. (authors)

  19. A new approach for categorizing pig lying behaviour based on a Delaunay triangulation method.

    Science.gov (United States)

    Nasirahmadi, A; Hensel, O; Edwards, S A; Sturm, B

    2017-01-01

    Machine vision-based monitoring of pig lying behaviour is a fast and non-intrusive approach that could be used to improve animal health and welfare. Four pens with 22 pigs in each were selected at a commercial pig farm and monitored for 15 days using top view cameras. Three thermal categories were selected relative to room setpoint temperature. An image processing technique based on Delaunay triangulation (DT) was utilized. Different lying patterns (close, normal and far) were defined regarding the perimeter of each DT triangle and the percentages of each lying pattern were obtained in each thermal category. A method using a multilayer perceptron (MLP) neural network, to automatically classify group lying behaviour of pigs into three thermal categories, was developed and tested for its feasibility. The DT features (mean value of perimeters, maximum and minimum length of sides of triangles) were calculated as inputs for the MLP classifier. The network was trained, validated and tested and the results revealed that MLP could classify lying features into the three thermal categories with high overall accuracy (95.6%). The technique indicates that a combination of image processing, MLP classification and mathematical modelling can be used as a precise method for quantifying pig lying behaviour in welfare investigations.
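
    A compact sketch of the pipeline the abstract describes, with made-up animal centroid coordinates: Delaunay-triangulate the detected positions, derive perimeter and side-length statistics, and feed them to an MLP classifier. The synthetic "huddled vs. spread" training data and all parameter choices are assumptions for illustration only.

        import numpy as np
        from scipy.spatial import Delaunay
        from sklearn.neural_network import MLPClassifier

        def dt_features(centroids):
            """Delaunay features of one image: mean triangle perimeter plus the
            maximum and minimum side length, as listed in the abstract."""
            tri = Delaunay(centroids)
            sides = []
            for a, b, c in tri.simplices:
                p = centroids[[a, b, c]]
                sides.append([np.linalg.norm(p[0] - p[1]),
                              np.linalg.norm(p[1] - p[2]),
                              np.linalg.norm(p[2] - p[0])])
            sides = np.asarray(sides)
            return [sides.sum(axis=1).mean(), sides.max(), sides.min()]

        rng = np.random.default_rng(1)
        # Synthetic images: 22 pigs huddled in a 1 m square (label 0) vs. spread over 5 m (label 2).
        X = [dt_features(rng.uniform(0, s, size=(22, 2))) for s in (1.0,) * 20 + (5.0,) * 20]
        y = [0] * 20 + [2] * 20

        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)
        print(clf.predict([dt_features(rng.uniform(0, 5.0, size=(22, 2)))]))   # expect label 2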

  20. Identifying influenza-like illness presentation from unstructured general practice clinical narrative using a text classifier rule-based expert system versus a clinical expert.

    Science.gov (United States)

    MacRae, Jayden; Love, Tom; Baker, Michael G; Dowell, Anthony; Carnachan, Matthew; Stubbe, Maria; McBain, Lynn

    2015-10-06

    We designed and validated a rule-based expert system to identify influenza like illness (ILI) from routinely recorded general practice clinical narrative to aid a larger retrospective research study into the impact of the 2009 influenza pandemic in New Zealand. Rules were assessed using pattern matching heuristics on routine clinical narrative. The system was trained using data from 623 clinical encounters and validated using a clinical expert as a gold standard against a mutually exclusive set of 901 records. We calculated a 98.2 % specificity and 90.2 % sensitivity across an ILI incidence of 12.4 % measured against clinical expert classification. Peak problem list identification of ILI by clinical coding in any month was 9.2 % of all detected ILI presentations. Our system addressed an unusual problem domain for clinical narrative classification; using notational, unstructured, clinician entered information in a community care setting. It performed well compared with other approaches and domains. It has potential applications in real-time surveillance of disease, and in assisted problem list coding for clinicians. Our system identified ILI presentation with sufficient accuracy for use at a population level in the wider research study. The peak coding of 9.2 % illustrated the need for automated coding of unstructured narrative in our study.
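
    The pattern-matching heuristic can be illustrated with a toy rule (far simpler than the validated rule set): an encounter is flagged as ILI if the free-text note mentions fever together with a respiratory or systemic symptom. The keyword lists and example notes are invented.

        import re

        FEVER = re.compile(r"\b(fever|febrile|pyrexia)\b", re.I)
        SYMPTOM = re.compile(r"\b(cough|sore throat|myalgia|coryza|headache)\b", re.I)

        def is_ili(note: str) -> bool:
            """Toy rule: a fever term AND at least one symptom term in the narrative."""
            return bool(FEVER.search(note)) and bool(SYMPTOM.search(note))

        notes = [
            "3/7 febrile, dry cough, myalgia. Likely viral illness.",   # expect True
            "Ankle sprain after football, no systemic symptoms.",       # expect False
        ]
        print([is_ili(n) for n in notes])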

  1. Split-Cell Exponential Characteristic Transport Method for Unstructured Tetrahedral Meshes

    International Nuclear Information System (INIS)

    Brennan, Charles R.; Miller, Rodney L.; Mathews, Kirk A.

    2001-01-01

    The nonlinear, exponential characteristic (EC) method is extended to unstructured meshes of tetrahedral cells in three-dimensional Cartesian coordinates. The split-cell approach developed for the linear characteristic (LC) method on such meshes is used. Exponential distributions of the source within a cell and of the inflow flux on upstream faces of the cell are assumed. The coefficients of these distributions are determined by nonlinear root solving so as to match the zeroth and first moments of the source or entering flux. Good conditioning is achieved by casting the formulas for the moments of the source, inflow flux, and solution flux as sums of positive functions and by using accurate and robust algorithms for evaluation of those functions. Various test problems are used to compare the performance of the EC and LC methods. The EC method is somewhat less accurate than the LC method in regions of net out-leakage but is strictly positive and retains good accuracy with optically thick cells, as in shielding problems, unlike the LC method. The computational cost per cell is greater for the EC method, but the use of substantially coarser meshes can make the EC method less expensive in total cost. The EC method, unlike the LC method, may fail if negative cross sections or angular quadrature weights are used. It is concluded that the EC and LC methods should be practical, reliable, and complementary schemes for these meshes
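
    To make the moment-matching step concrete, here is a one-dimensional sketch (not the split-cell tetrahedral scheme): a bracketing root solver finds the exponential slope whose normalized first moment on the unit interval equals a prescribed value, after which the amplitude follows from the zeroth moment. Function names, the bracket and the tolerances are assumptions.

        import numpy as np
        from scipy.optimize import brentq

        def exp_mean(lam):
            """Normalized first moment of exp(lam*x) on [0, 1]; tends to 0.5 as lam -> 0."""
            if abs(lam) < 1e-8:
                return 0.5
            return np.exp(lam) / np.expm1(lam) - 1.0 / lam

        def fit_exponential(m0, mean):
            """Return (a, lam) such that q(x) = a*exp(lam*x) on [0, 1] reproduces the
            zeroth moment m0 and the normalized first moment mean, with mean in (0, 1)."""
            lam = brentq(lambda l: exp_mean(l) - mean, -200.0, 200.0)
            a = m0 * lam / np.expm1(lam) if abs(lam) > 1e-8 else m0
            return a, lam

        a, lam = fit_exponential(m0=1.0, mean=0.7)
        print(a, lam, exp_mean(lam))   # the recovered normalized first moment is 0.7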

  2. Adaptation of an unstructured-mesh, finite-element ocean model to the simulation of ocean circulation beneath ice shelves

    Science.gov (United States)

    Kimura, Satoshi; Candy, Adam S.; Holland, Paul R.; Piggott, Matthew D.; Jenkins, Adrian

    2013-07-01

    Several different classes of ocean model are capable of representing floating glacial ice shelves. We describe the incorporation of ice shelves into Fluidity-ICOM, a nonhydrostatic finite-element ocean model with the capacity to utilize meshes that are unstructured and adaptive in three dimensions. This geometric flexibility offers several advantages over previous approaches. The model represents melting and freezing on all ice-shelf surfaces including vertical faces, treats the ice shelf topography as continuous rather than stepped, and does not require any smoothing of the ice topography or any of the additional parameterisations of the ocean mixed layer used in isopycnal or z-coordinate models. The model can also represent a water column that decreases to zero thickness at the 'grounding line', where the floating ice shelf is joined to its tributary ice streams. The model is applied to idealised ice-shelf geometries in order to demonstrate these capabilities. In these simple experiments, arbitrarily coarsening the mesh outside the ice-shelf cavity has little effect on the ice-shelf melt rate, while the mesh resolution within the cavity is found to be highly influential. Smoothing the vertical ice front results in faster flow along the smoothed ice front, allowing greater exchange with the ocean than in simulations with a realistic ice front. A vanishing water-column thickness at the grounding line has little effect in the simulations studied. We also investigate the response of ice shelf basal melting to variations in deep water temperature in the presence of salt stratification.

  3. Outcomes and impact of HIV prevention, ART and TB programs in Swaziland--early evidence from public health triangulation.

    Science.gov (United States)

    van Schalkwyk, Cari; Mndzebele, Sibongile; Hlophe, Thabo; Garcia Calleja, Jesus Maria; Korenromp, Eline L; Stoneburner, Rand; Pervilhac, Cyril

    2013-01-01

    Swaziland's severe HIV epidemic inspired an early national response since the late 1980s, and regular reporting of program outcomes since the onset of a national antiretroviral treatment (ART) program in 2004. We assessed effectiveness outcomes and mortality trends in relation to ART, HIV testing and counseling (HTC), tuberculosis (TB) and prevention of mother to child transmission (PMTCT). Data triangulated include intervention coverage and outcomes according to program registries (2001-2010), hospital admissions and deaths disaggregated by age and sex (2001-2010) and population mortality estimates from the 1997 and 2007 censuses and the 2007 demographic and health survey. By 2010, ART reached 70% of the estimated number of people living with HIV/AIDS with CD4impact to specific interventions (versus natural epidemic dynamics) will require additional data from future household surveys, and improved routine (program, surveillance, and hospital) data at district level.

  4. Assimilation of coastal acoustic tomography data using an unstructured triangular grid ocean model for water with complex coastlines and islands

    Science.gov (United States)

    Zhu, Ze-Nan; Zhu, Xiao-Hua; Guo, Xinyu; Fan, Xiaopeng; Zhang, Chuanzheng

    2017-09-01

    For the first time, we present the application of an unstructured triangular grid to the Finite-Volume Community Ocean Model using the ensemble Kalman filter scheme, to assimilate coastal acoustic tomography (CAT) data. The fine horizontal and vertical current field structures around the island inside the observation region were both reproduced well. The assimilated depth-averaged velocities had better agreement with the independent acoustic Doppler current profiler (ADCP) data than the velocities obtained by inversion and simulation. The root-mean-square difference (RMSD) between depth-averaged current velocities obtained by data assimilation and those obtained by ADCPs was 0.07 m s-1, which was less than the corresponding difference obtained by inversion and simulation (0.12 and 0.17 m s-1, respectively). The assimilated vertical layer velocities also exhibited better agreement with ADCP than the velocities obtained by simulation. RMSDs between assimilated and ADCP data in vertical layers ranged from 0.02 to 0.14 m s-1, while RMSDs between simulation and ADCP data ranged from 0.08 to 0.27 m s-1. These results indicate that assimilation had the highest accuracy. Sensitivity experiments involving the elimination of sound transmission lines showed that missing data had less impact on assimilation than on inversion. Sensitivity experiments involving the elimination of CAT stations showed that the assimilation with four CAT stations was the relatively economical and reasonable procedure in this experiment. These results indicate that, compared with inversion and simulation, data assimilation of CAT data with an unstructured triangular grid is more effective in reconstructing the current field.
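
    A generic stochastic ensemble Kalman filter analysis step in NumPy (not the FVCOM/CAT implementation): the forecast ensemble is nudged toward perturbed observations through the sample Kalman gain. The state size, observation operator and error levels are arbitrary placeholders.

        import numpy as np

        def enkf_update(X, y, H, R, rng):
            """Stochastic EnKF analysis.
            X: (n_state, n_ens) forecast ensemble, y: (n_obs,) observations,
            H: (n_obs, n_state) observation operator, R: (n_obs, n_obs) obs-error covariance."""
            n_ens = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
            Pf = A @ A.T / (n_ens - 1)                       # sample forecast covariance
            K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
            # Perturb the observations so the analysis ensemble keeps a consistent spread.
            Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
            return X + K @ (Y - H @ X)

        rng = np.random.default_rng(0)
        X = rng.normal(size=(10, 50))        # e.g. 10 layer velocities, 50 ensemble members
        H = np.eye(3, 10)                    # observe the first 3 state variables (CAT-like data)
        R = 0.01 * np.eye(3)
        Xa = enkf_update(X, np.array([0.1, -0.2, 0.05]), H, R, rng)
        print(Xa.mean(axis=1)[:3])           # analysis mean pulled toward the observations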

  5. Social influence and adolescent health-related physical activity in structured and unstructured settings: role of channel and type.

    Science.gov (United States)

    Spink, Kevin S; Wilson, Kathleen S; Ulvick, Jocelyn

    2012-08-01

    Social influence channels (e.g., parents) and types (e.g., compliance) have each been related to physical activity independently, but little is known about how these two categories of influence may operate in combination. This study examined the relationships between various combinations of social influence and physical activity among youth across structured and unstructured settings. Adolescents (N=304), classified as high or low active, reported the social influence combinations they received for being active. Participants identified three channels and three types of influence associated with being active. For structured activity, compliance with peers and significant others predicted membership in the high active group (values of psocial influence, when examining health-related physical activity.

  6. Modelling the Influence of Ground Surface Relief on Electric Sounding Curves Using the Integral Equations Method

    Directory of Open Access Journals (Sweden)

    Balgaisha Mukanova

    2017-01-01

    Full Text Available The problem of electrical sounding of a medium with ground surface relief is modelled using the integral equations method. This numerical method is based on the triangulation of the computational domain, which is adapted to the shape of the relief and the measuring line. The numerical algorithm is tested by comparing the results with the known solution for horizontally layered media with two layers. Calculations are also performed to verify the fulfilment of the “reciprocity principle” for the 4-electrode installations in our numerical model. Simulations are then performed for a two-layered medium with a surface relief. The quantitative influences of the relief, the resistivity ratios of the contacting media, and the depth of the second layer on the apparent resistivity curves are established.

  7. Overlapping Schwarz for Nonlinear Problems. An Element Agglomeration Nonlinear Additive Schwarz Preconditioned Newton Method for Unstructured Finite Element Problems

    Energy Technology Data Exchange (ETDEWEB)

    Cai, X C; Marcinkowski, L; Vassilevski, P S

    2005-02-10

    This paper extends previous results on nonlinear Schwarz preconditioning ([4]) to unstructured finite element elliptic problems exploiting now nonlocal (but small) subspaces. The non-local finite element subspaces are associated with subdomains obtained from a non-overlapping element partitioning of the original set of elements and are coarse outside the prescribed element subdomain. The coarsening is based on a modification of the agglomeration based AMGe method proposed in [8]. Then, the algebraic construction from [9] of the corresponding non-linear finite element subproblems is applied to generate the subspace based nonlinear preconditioner. The overall nonlinearly preconditioned problem is solved by an inexact Newton method. Numerical illustration is also provided.

  8. Retrieval Algorithms for Road Surface Modelling Using Laser-Based Mobile Mapping

    Directory of Open Access Journals (Sweden)

    Antero Kukko

    2008-09-01

    Full Text Available Automated processing of the data provided by a laser-based mobile mapping system will be a necessity due to the huge amount of data produced. In the future, vehicle-based laser scanning, here called mobile mapping, should see considerable use for road environment modelling. Since the scanning geometry and point density are different from those of airborne laser scanning, new algorithms are needed for information extraction. In this paper, we propose automatic methods for classifying the road marking and kerbstone points and modelling the road surface as a triangulated irregular network. On the basis of experimental tests, the mean classification accuracies obtained using the automatic method for lines, zebra crossings and kerbstones were 80.6%, 92.3% and 79.7%, respectively.
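
    A minimal sketch of the "road surface as a triangulated irregular network" idea (not the authors' classification pipeline): SciPy's LinearNDInterpolator Delaunay-triangulates the ground points internally and evaluates the TIN by barycentric interpolation. The synthetic road strip with a 2% cross-slope is an assumption.

        import numpy as np
        from scipy.interpolate import LinearNDInterpolator

        # Synthetic ground points: x along the road (0-50 m), y across it (-3 to 3 m),
        # with a crowned 2% cross-slope and a gentle longitudinal grade.
        rng = np.random.default_rng(0)
        xy = rng.uniform([0.0, -3.0], [50.0, 3.0], size=(2000, 2))
        z = -0.02 * np.abs(xy[:, 1]) + 0.001 * xy[:, 0]

        # The interpolator builds the TIN road-surface model from the scattered points.
        tin = LinearNDInterpolator(xy, z)
        print(tin([[25.0, 0.0], [25.0, 2.0]]))   # centreline vs. 2 m off-centre elevation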

  9. Automated matching of corresponding seed images of three simulator radiographs to allow 3D triangulation of implanted seeds

    Science.gov (United States)

    Altschuler, Martin D.; Kassaee, Alireza

    1997-02-01

    To match corresponding seed images in different radiographs so that the 3D seed locations can be triangulated automatically and without ambiguity requires (at least) three radiographs taken from different perspectives, and an algorithm that finds the proper permutations of the seed-image indices. Matching corresponding images in only two radiographs introduces inherent ambiguities which can be resolved only with the use of non-positional information obtained with intensive human effort. Matching images in three or more radiographs is an `NP (Non-deterministic Polynomial time)-complete' problem. Although the matching problem is fundamental, current methods for three-radiograph seed-image matching use `local' (seed-by-seed) methods that may lead to incorrect matchings. We describe a permutation-sampling method which not only gives good `global' (full permutation) matches for the NP-complete three-radiograph seed-matching problem, but also determines the reliability of the radiographic data themselves, namely, whether the patient moved in the interval between radiographic perspectives.
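
    For a handful of seeds the global matching can be brute-forced, which makes the contrast with local seed-by-seed matching easy to see; the sketch below (clearly infeasible for realistic seed counts, unlike the permutation-sampling method of the paper) searches all permutations of the labels in the second and third views for the lowest total cost. The planted cost tensor is a synthetic stand-in for a triangulation residual.

        import numpy as np
        from itertools import permutations

        def best_global_match(cost):
            """Exhaustive global matching for an (n, n, n) triple-wise cost tensor:
            pick permutations sigma, tau of views 2 and 3 minimising sum_i cost[i, sigma[i], tau[i]]."""
            n = cost.shape[0]
            best = (np.inf, None, None)
            for sigma in permutations(range(n)):
                for tau in permutations(range(n)):
                    c = sum(cost[i, sigma[i], tau[i]] for i in range(n))
                    if c < best[0]:
                        best = (c, sigma, tau)
            return best

        rng = np.random.default_rng(0)
        n = 5
        cost = rng.uniform(1.0, 2.0, size=(n, n, n))
        for i in range(n):
            cost[i, i, i] = 0.1                  # planted true correspondences are cheapest
        print(best_global_match(cost))           # recovers sigma = tau = identity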

  10. A C-terminal segment of the V{sub 1}R vasopressin receptor is unstructured in the crystal structure of its chimera with the maltose-binding protein

    Energy Technology Data Exchange (ETDEWEB)

    Adikesavan, Nallini Vijayarangan; Mahmood, Syed Saad; Stanley, Nithianantham; Xu, Zhen; Wu, Nan [Department of Biochemistry, School of Medicine, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, OH 44106-4935 (United States); Thibonnier, Marc [Department of Medicine, School of Medicine, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, OH 44106-4935 (United States); Shoham, Menachem, E-mail: mxs10@case.edu [Department of Biochemistry, School of Medicine, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, OH 44106-4935 (United States)

    2005-04-01

    The 1.8 Å crystal structure of an MBP-fusion protein with the C-terminal cytoplasmic segment of the V1 vasopressin receptor reveals that the receptor segment is unstructured. The V{sub 1} vascular vasopressin receptor (V{sub 1}R) is a G-protein-coupled receptor (GPCR) involved in the regulation of body-fluid osmolality, blood volume and blood pressure. Signal transduction is mediated by the third intracellular loop of this seven-transmembrane protein as well as by the C-terminal cytoplasmic segment. A chimera of the maltose-binding protein (MBP) and the C-terminal segment of V{sub 1}R has been cloned, expressed, purified and crystallized. The crystals belong to space group P2{sub 1}, with unit-cell parameters a = 51.10, b = 66.56, c = 115.72 Å, β = 95.99°. The 1.8 Å crystal structure reveals the conformation of MBP and part of the linker region of this chimera, with the C-terminal segment being unstructured. This may reflect a conformational plasticity in the C-terminal segment that may be necessary for proper function of V{sub 1}R.

  11. Parallel implementation of a dynamic unstructured chimera method in the DLR finite volume TAU-code

    Energy Technology Data Exchange (ETDEWEB)

    Madrane, A.; Raichle, A.; Stuermer, A. [German Aerospace Center, DLR, Numerical Methods, Inst. of Aerodynamics and Flow Technology, Braunschweig (Germany)]. E-mail: aziz.madrane@dlr.de

    2004-07-01

    Aerodynamic problems involving moving geometries have many applications, including store separation, a high-speed train entering a tunnel, simulation of full helicopter configurations, and fast maneuverability. The overset grid method offers the option of calculating such cases. The solution process uses a grid system that discretizes the problem domain by using separately generated but overlapping unstructured grids that update and exchange boundary information through interpolation. However, such computations are complicated and time-consuming. Parallel computing offers a very effective way to improve productivity in computational fluid dynamics (CFD). Therefore the purpose of this study is to develop an efficient parallel computation algorithm for analyzing the flowfield of complex geometries using the overset grid method. The strategy adopted in the parallelization of the overset grid method, including the use of data structures and communication, is described. Numerical results are presented to demonstrate the efficiency of the resulting parallel overset grid method. (author)

  12. Parallel implementation of a dynamic unstructured chimera method in the DLR finite volume TAU-code

    International Nuclear Information System (INIS)

    Madrane, A.; Raichle, A.; Stuermer, A.

    2004-01-01

    Aerodynamic problems involving moving geometries have many applications, including store separation, a high-speed train entering a tunnel, simulation of full helicopter configurations, and fast maneuverability. The overset grid method offers the option of calculating such cases. The solution process uses a grid system that discretizes the problem domain by using separately generated but overlapping unstructured grids that update and exchange boundary information through interpolation. However, such computations are complicated and time-consuming. Parallel computing offers a very effective way to improve productivity in computational fluid dynamics (CFD). Therefore the purpose of this study is to develop an efficient parallel computation algorithm for analyzing the flowfield of complex geometries using the overset grid method. The strategy adopted in the parallelization of the overset grid method, including the use of data structures and communication, is described. Numerical results are presented to demonstrate the efficiency of the resulting parallel overset grid method. (author)

  13. Multidimensional upwind hydrodynamics on unstructured meshes using graphics processing units - I. Two-dimensional uniform meshes

    Science.gov (United States)

    Paardekooper, S.-J.

    2017-08-01

    We present a new method for numerical hydrodynamics which uses a multidimensional generalization of the Roe solver and operates on an unstructured triangular mesh. The main advantage over traditional methods based on Riemann solvers, which commonly use one-dimensional flux estimates as building blocks for a multidimensional integration, is its inherently multidimensional nature, and as a consequence its ability to recognize multidimensional stationary states that are not hydrostatic. A second novelty is the focus on graphics processing units (GPUs). By tailoring the algorithms specifically to GPUs, we are able to get speedups of 100-250 compared to a desktop machine. We compare the multidimensional upwind scheme to a traditional, dimensionally split implementation of the Roe solver on several test problems, and we find that the new method significantly outperforms the Roe solver in almost all cases. This comes with increased computational costs per time-step, which makes the new method approximately a factor of 2 slower than a dimensionally split scheme acting on a structured grid.

  14. A second-order cell-centered Lagrangian ADER-MOOD finite volume scheme on multidimensional unstructured meshes for hydrodynamics

    Science.gov (United States)

    Boscheri, Walter; Dumbser, Michael; Loubère, Raphaël; Maire, Pierre-Henri

    2018-04-01

    In this paper we develop a conservative cell-centered Lagrangian finite volume scheme for the solution of the hydrodynamics equations on unstructured multidimensional grids. The method is derived from the Eucclhyd scheme discussed in [47,43,45]. It is second-order accurate in space and is combined with the a posteriori Multidimensional Optimal Order Detection (MOOD) limiting strategy to ensure robustness and stability at shock waves. Second-order of accuracy in time is achieved via the ADER (Arbitrary high order schemes using DERivatives) approach. A large set of numerical test cases is proposed to assess the ability of the method to achieve effective second order of accuracy on smooth flows, maintaining an essentially non-oscillatory behavior on discontinuous profiles, general robustness ensuring physical admissibility of the numerical solution, and precision where appropriate.

  15. Indoor Trajectory Tracking Scheme Based on Delaunay Triangulation and Heuristic Information in Wireless Sensor Networks.

    Science.gov (United States)

    Qin, Junping; Sun, Shiwen; Deng, Qingxu; Liu, Limin; Tian, Yonghong

    2017-06-02

    Object tracking and detection is one of the most significant research areas for wireless sensor networks. Existing indoor trajectory tracking schemes in wireless sensor networks are based on continuous localization and moving object data mining. Indoor trajectory tracking based on the received signal strength indicator ( RSSI ) has received increased attention because it has low cost and requires no special infrastructure. However, RSSI tracking introduces uncertainty because of the inaccuracies of measurement instruments and the irregularities (unstable, multipath, diffraction) of wireless signal transmissions in indoor environments. Heuristic information includes some key factors for trajectory tracking procedures. This paper proposes a novel trajectory tracking scheme based on Delaunay triangulation and heuristic information (TTDH). In this scheme, the entire field is divided into a series of triangular regions. The common side of adjacent triangular regions is regarded as a regional boundary. Our scheme detects heuristic information related to a moving object's trajectory, including boundaries and triangular regions. Then, the trajectory is formed by means of a dynamic time-warping position-fingerprint-matching algorithm with heuristic information constraints. Field experiments show that the average error distance of our scheme is less than 1.5 m, and that error does not accumulate among the regions.
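
    The position-fingerprint matching step relies on dynamic time warping; below is a compact, generic DTW distance in NumPy (the heuristic region and boundary constraints of TTDH are omitted). The reference and observed RSSI sequences are invented.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping between two sequences of
            fingerprint vectors (one row per fingerprint)."""
            a, b = np.atleast_2d(a), np.atleast_2d(b)
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    c = np.linalg.norm(a[i - 1] - b[j - 1])
                    D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        ref = np.array([[-60, -70], [-62, -68], [-70, -60], [-75, -55]], float)
        obs = np.array([[-60, -70], [-61, -69], [-62, -68], [-70, -60], [-74, -56]], float)
        print(dtw_distance(obs, ref))   # small value: the two sequences follow the same path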

  16. Simultaneous hierarchical segmentation and vectorization of satellite images through combined data sampling and anisotropic triangulation

    Energy Technology Data Exchange (ETDEWEB)

    Grazzini, Jacopo [Los Alamos National Laboratory; Prasad, Lakshman [Los Alamos National Laboratory; Dillard, Scott [PNNL

    2010-10-21

    The automatic detection, recognition, and segmentation of object classes in remotely sensed images is of crucial importance for scene interpretation and understanding. However, it is a difficult task because of the high variability of satellite data. Indeed, the observed scenes usually exhibit a high degree of complexity, where complexity refers to the large variety of pictorial representations of objects with the same semantic meaning and also to the extensive amount of available details. Therefore, there is still a strong demand for robust techniques for automatic information extraction and interpretation of satellite images. In parallel, there is a growing interest in techniques that can extract vector features directly from such imagery. In this paper, we investigate the problem of automatic hierarchical segmentation and vectorization of multispectral satellite images. We propose a new algorithm composed of the following steps: (i) a non-uniform sampling scheme extracting the most salient pixels in the image, (ii) an anisotropic triangulation constrained by the sampled pixels taking into account both strength and directionality of local structures present in the image, (iii) a polygonal grouping scheme merging, through techniques based on perceptual information, the obtained segments into a smaller number of higher-level vectorial objects. Besides its computational efficiency, this approach provides a meaningful polygonal representation for subsequent image analysis and/or interpretation.
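
    A rough sketch of steps (i)-(ii) under simplifying assumptions (an isotropic Delaunay triangulation rather than the paper's anisotropic, structure-aware one): sample the pixels with the strongest gradient magnitude and triangulate them. The synthetic image and the number of retained pixels are arbitrary.

        import numpy as np
        from scipy.ndimage import sobel
        from scipy.spatial import Delaunay

        rng = np.random.default_rng(0)
        img = rng.normal(size=(128, 128)).cumsum(axis=0).cumsum(axis=1)   # synthetic "image"

        # (i) Non-uniform sampling: keep the k pixels with the largest gradient magnitude.
        grad = np.hypot(sobel(img, axis=0), sobel(img, axis=1))
        k = 800
        rows, cols = np.unravel_index(np.argsort(grad.ravel())[-k:], img.shape)
        samples = np.column_stack([cols, rows]).astype(float)

        # (ii) Triangulation constrained by the sampled pixels (isotropic here).
        tri = Delaunay(samples)
        print(len(samples), "salient pixels,", len(tri.simplices), "triangles")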

  17. Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL

    Science.gov (United States)

    Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo

    2011-01-01

    Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively manually process their contents to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experiences of exploring such methods mainly in three areas including historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL, have shown that obtaining useful results requires substantial unanticipated efforts - from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and our important lessons learned from the process of preparing and applying text mining to large unstructured system artifacts at JPL aiming to benefit future TM applications in similar problem domains and also in hope for being extended to broader areas of applications.

  18. Unstructured Grid Adaptation: Status, Potential Impacts, and Recommended Investments Toward CFD Vision 2030

    Science.gov (United States)

    Park, Michael A.; Krakos, Joshua A.; Michal, Todd; Loseille, Adrien; Alonso, Juan J.

    2016-01-01

    Unstructured grid adaptation is a powerful tool to control discretization error for Computational Fluid Dynamics (CFD). It has enabled key increases in the accuracy, automation, and capacity of some fluid simulation applications. Slotnick et al. provides a number of case studies in the CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences to illustrate the current state of CFD capability and capacity. The authors forecast the potential impact of emerging High Performance Computing (HPC) environments forecast in the year 2030 and identify that mesh generation and adaptivity continue to be significant bottlenecks in the CFD work flow. These bottlenecks may persist because very little government investment has been targeted in these areas. To motivate investment, the impacts of improved grid adaptation technologies are identified. The CFD Vision 2030 Study roadmap and anticipated capabilities in complementary disciplines are quoted to provide context for the progress made in grid adaptation in the past fifteen years, current status, and a forecast for the next fifteen years with recommended investments. These investments are specific to mesh adaptation and impact other aspects of the CFD process. Finally, a strategy is identified to diffuse grid adaptation technology into production CFD work flows.

  19. Application of the FUN3D Unstructured-Grid Navier-Stokes Solver to the 4th AIAA Drag Prediction Workshop Cases

    Science.gov (United States)

    Lee-Rausch, Elizabeth M.; Hammond, Dana P.; Nielsen, Eric J.; Pirzadeh, S. Z.; Rumsey, Christopher L.

    2010-01-01

    FUN3D Navier-Stokes solutions were computed for the 4th AIAA Drag Prediction Workshop grid convergence study, downwash study, and Reynolds number study on a set of node-based mixed-element grids. All of the baseline tetrahedral grids were generated with the VGRID (developmental) advancing-layer and advancing-front grid generation software package following the gridding guidelines developed for the workshop. With maximum grid sizes exceeding 100 million nodes, the grid convergence study was particularly challenging for the node-based unstructured grid generators and flow solvers. At the time of the workshop, the super-fine grid with 105 million nodes and 600 million elements was the largest grid known to have been generated using VGRID. FUN3D Version 11.0 has a completely new pre- and post-processing paradigm that has been incorporated directly into the solver and functions entirely in a parallel, distributed memory environment. This feature allowed for practical pre-processing and solution times on the largest unstructured-grid size requested for the workshop. For the constant-lift grid convergence case, the convergence of total drag is approximately second-order on the finest three grids. The variation in total drag between the finest two grids is only 2 counts. At the finest grid levels, only small variations in wing and tail pressure distributions are seen with grid refinement. Similarly, a small wing side-of-body separation also shows little variation at the finest grid levels. Overall, the FUN3D results compare well with the structured-grid code CFL3D. The FUN3D downwash study and Reynolds number study results compare well with the range of results shown in the workshop presentations.

  20. Effect of DEM resolution on rainfall-triggered landslide modeling within a triangulated network-based model. A case study in the Luquillo Forest, Puerto Rico

    Science.gov (United States)

    Arnone, E.; Dialynas, Y. G.; Noto, L. V.; Bras, R. L.

    2013-12-01

    Catchment slope distribution is one of the topographic characteristics that significantly control rainfall-triggered landslide modeling, in both direct and indirect ways. Slope directly determines the soil volume associated with instability. Indirectly slope also affects the subsurface lateral redistribution of soil moisture across the basin, which in turn determines the water pore pressure conditions that impact slope stability. In this study, we investigate the influence of DEM resolution on slope stability and the slope stability analysis by using a distributed eco-hydrological and landslide model, the tRIBS-VEGGIE (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). The model implements a triangulated irregular network to describe the topography, and it is capable of evaluating vegetation dynamics and predicting shallow landslides triggered by rainfall. The impact of DEM resolution on the landslide prediction was studied using five TINs derived from five grid DEMs at different resolutions, i.e. 10, 20, 30, 50 and 70 m respectively. The analysis was carried out on the Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. Results showed that the use of the irregular mesh reduced the loss of accuracy in the derived slope distribution when coarser resolutions were used. The impact of the different resolutions on soil moisture patterns was important only when the lateral redistribution was considerable, depending on hydrological properties and rainfall forcing. In some cases, the use of different DEM resolutions did not significantly affect tRIBS-VEGGIE landslide output, in terms of landslide locations, and values of slope and soil moisture at failure.
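
    To show how slope follows directly from a TIN facet, here is a small sketch that computes the slope angle of each triangle from its plane normal; the vertex coordinates and connectivity are invented and unrelated to the Mameyes Basin data.

        import numpy as np

        def triangle_slopes(vertices, triangles):
            """Slope angle in degrees of each TIN facet, from the facet normal:
            slope = arccos(|n_z|) for the unit normal n."""
            p = vertices[triangles]                            # (n_tri, 3 vertices, xyz)
            n = np.cross(p[:, 1] - p[:, 0], p[:, 2] - p[:, 0])
            n /= np.linalg.norm(n, axis=1, keepdims=True)
            return np.degrees(np.arccos(np.abs(n[:, 2])))

        # Two facets: one horizontal, one dipping along x at 45 degrees.
        verts = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 0, 10.0]])
        tris = np.array([[0, 1, 2], [0, 3, 2]])
        print(triangle_slopes(verts, tris))                    # approximately [0, 45]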

  1. COMPUTATIONAL EFFICIENCY OF A MODIFIED SCATTERING KERNEL FOR FULL-COUPLED PHOTON-ELECTRON TRANSPORT PARALLEL COMPUTING WITH UNSTRUCTURED TETRAHEDRAL MESHES

    Directory of Open Access Journals (Sweden)

    JONG WOON KIM

    2014-04-01

    In this paper, we introduce a modified scattering kernel approach to avoid the unnecessarily repeated calculations involved with the scattering source calculation, and used it with parallel computing to effectively reduce the computation time. Its computational efficiency was tested for three-dimensional full-coupled photon-electron transport problems using our computer program which solves the multi-group discrete ordinates transport equation by using the discontinuous finite element method with unstructured tetrahedral meshes for complicated geometrical problems. The numerical tests show that we can improve speed up to 17∼42 times for the elapsed time per iteration using the modified scattering kernel, not only in the single CPU calculation but also in the parallel computing with several CPUs.

  2. River salinity on a mega-delta, an unstructured grid model approach.

    Science.gov (United States)

    Bricheno, Lucy; Saiful Islam, Akm; Wolf, Judith

    2014-05-01

    With an average freshwater discharge of around 40,000 m3/s the BGM (Brahmaputra Ganges and Meghna) river system has the third largest discharge worldwide. The BGM river delta is a low-lying fertile area covering over 100,000 km2 mainly in India and Bangladesh. Approximately two-thirds of the Bangladesh people work in agriculture and these local livelihoods depend on freshwater sources directly linked to river salinity. The finite volume coastal ocean model (FVCOM) has been applied to the BGM delta in order to simulate river salinity under present and future climate conditions. Forced by a combination of regional climate model predictions, and a basin-wide river catchment model, the 3D baroclinic delta model can determine river salinity under the current climate, and make predictions for future wet and dry years. The river salinity demonstrates a strong seasonal and tidal cycle, making it important for the model to be able to capture a wide range of timescales. The unstructured mesh approach used in FVCOM is required to properly represent the delta's structure; a complex network of interconnected river channels. The model extends 250 km inland in order to capture the full extent of the tidal influence and grid resolutions of 10s of metres are required to represent narrow inland river channels. The use of FVCOM to simulate flows so far inland is a novel challenge, which also requires knowledge of the shape and cross-section of the river channels.

  3. The Unstructured Paramyxovirus Nucleocapsid Protein Tail Domain Modulates Viral Pathogenesis through Regulation of Transcriptase Activity.

    Science.gov (United States)

    Thakkar, Vidhi D; Cox, Robert M; Sawatsky, Bevan; da Fontoura Budaszewski, Renata; Sourimant, Julien; Wabbel, Katrin; Makhsous, Negar; Greninger, Alexander L; von Messling, Veronika; Plemper, Richard K

    2018-04-15

    The paramyxovirus replication machinery comprises the viral large (L) protein and phosphoprotein (P-protein) in addition to the nucleocapsid (N) protein, which encapsidates the single-stranded RNA genome. Common to paramyxovirus N proteins is a C-terminal tail (Ntail). The mechanistic role and relevance for virus replication of the structurally disordered central Ntail section are unknown. Focusing initially on members of the Morbillivirus genus, a series of measles virus (MeV) and canine distemper virus (CDV) N proteins were generated with internal deletions in the unstructured tail section. N proteins with large tail truncations remained bioactive in mono- and polycistronic minireplicon assays and supported efficient replication of recombinant viruses. Bioactivity of Ntail mutants extended to N proteins derived from highly pathogenic Nipah virus. To probe an effect of Ntail truncations on viral pathogenesis, recombinant CDVs were analyzed in a lethal CDV/ferret model of morbillivirus disease. The recombinant viruses displayed different stages of attenuation ranging from ameliorated clinical symptoms to complete survival of infected animals, depending on the molecular nature of the Ntail truncation. Reinfection of surviving animals with pathogenic CDV revealed robust protection against a lethal challenge. The highly attenuated virus was genetically stable after ex vivo passaging and recovery from infected animals. Mechanistically, gradual viral attenuation coincided with stepwise altered viral transcriptase activity in infected cells. These results identify the central Ntail section as a determinant for viral pathogenesis and establish a novel platform to engineer gradual virus attenuation for next-generation paramyxovirus vaccine design. IMPORTANCE Investigating the role of the paramyxovirus N protein tail domain (Ntail) in virus replication, we demonstrated in this study that the structurally disordered central Ntail region is a determinant for viral

  4. Quantization of super Teichmueller spaces

    International Nuclear Information System (INIS)

    Aghaei, Nezhla

    2016-08-01

    The quantization of the Teichmueller spaces of Riemann surfaces has found important applications to conformal field theory and N=2 supersymmetric gauge theories. We construct a quantization of the Teichmueller spaces of super Riemann surfaces, using coordinates associated to the ideal triangulations of super Riemann surfaces. A new feature is the non-trivial dependence on the choice of a spin structure which can be encoded combinatorially in a certain refinement of the ideal triangulation. We construct a projective unitary representation of the groupoid of changes of refined ideal triangulations. Therefore, we demonstrate that the dependence of the resulting quantum theory on the choice of a triangulation is inessential. In the quantum Teichmueller theory, it was observed that the key object defining the Teichmueller theory has a close relation to the representation theory of the Borel half of U_q(sl(2)). In our research we observed that the role of U_q(sl(2)) is taken by the quantum superalgebra U_q(osp(1|2)). A Borel half of U_q(osp(1|2)) is the super quantum plane. The canonical element of the Heisenberg double of the quantum super plane is evaluated in certain infinite dimensional representations on L^2(R) x C^(1|1) and compared to the flip operator from the Teichmueller theory of super Riemann surfaces.

  5. Stakeholder management in the local government decision-making area: evidences from a triangulation study with the English local government

    Directory of Open Access Journals (Sweden)

    Ricardo Corrêa Gomes

    2006-01-01

    Full Text Available The stakeholder theory has been on the management agenda for about thirty years and reservations about its acceptance as a comprehensive theory still remain. It was introduced as a managerial issue by the Labour Party in 1997 aiming to make public management more inclusive. This article aims to contribute to the stakeholder theory by adding descriptive issues to its theoretical basis. The findings are derived from an inductive investigation carried out with English Local Authorities, which will most likely be reproduced in other contexts. Data collection and analysis are based on a data triangulation method that involves case studies, validation interviews and analysis of documents. The investigation proposes a model for representing the nature of the relationships between stakeholders and the decision-making process of such organizations. The decision-making of local government organizations is in fact a stakeholder-based process in which stakeholders are empowered to exert influence due to their power over, and interest in, the organization’s operations and outcomes.

  6. Multidimensional Riemann problem with self-similar internal structure. Part II - Application to hyperbolic conservation laws on unstructured meshes

    Science.gov (United States)

    Balsara, Dinshaw S.; Dumbser, Michael

    2015-04-01

    Multidimensional Riemann solvers that have internal sub-structure in the strongly-interacting state have been formulated recently (D.S. Balsara (2012, 2014) [5,16]). Any multidimensional Riemann solver operates at the grid vertices and takes as its input all the states from its surrounding elements. It yields as its output an approximation of the strongly interacting state, as well as the numerical fluxes. The multidimensional Riemann problem produces a self-similar strongly-interacting state which is the result of several one-dimensional Riemann problems interacting with each other. To compute this strongly interacting state and its higher order moments, we propose a Galerkin-type formulation in terms of similarity variables. The use of substructure in the Riemann problem reduces numerical dissipation and, therefore, allows a better preservation of flow structures, like contact and shear waves. In this second part of a series of papers we describe how this technique is extended to unstructured triangular meshes. All necessary details for a practical computer code implementation are discussed. In particular, we explicitly present all the issues related to computational geometry. Because these Riemann solvers are Multidimensional and have Self-similar strongly-Interacting states that are obtained by Consistency with the conservation law, we call them MuSIC Riemann solvers. (A video introduction to multidimensional Riemann solvers is available at http://www.elsevier.com/xml/linking-roles/text/html.) The MuSIC framework is sufficiently general to handle general nonlinear systems of hyperbolic conservation laws in multiple space dimensions. It can also accommodate all self-similar one-dimensional Riemann solvers and subsequently produces a multidimensional version of the same. In this paper we focus on unstructured triangular meshes. As examples of different systems of conservation laws we
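
    The one-dimensional building blocks mentioned above are self-similar Riemann solvers. The following minimal sketch shows one such solver (the classic single-state HLL flux), given here only as an illustration of that kind of building block, not as the MuSIC construction itself; the wave-speed estimates sL and sR are assumed given:

        import numpy as np

        def hll_flux(UL, UR, FL, FR, sL, sR):
            # Single-state HLL flux: a self-similar one-dimensional Riemann solver of the
            # kind the MuSIC construction promotes to multiple space dimensions.
            # UL/UR: conserved states, FL/FR: physical fluxes, sL/sR: wave-speed estimates.
            if sL >= 0.0:
                return FL
            if sR <= 0.0:
                return FR
            # Intermediate-state flux, obtained from consistency with the integral form
            # of the conservation law over the Riemann fan.
            return (sR * FL - sL * FR + sL * sR * (np.asarray(UR) - np.asarray(UL))) / (sR - sL)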

  7. Comparison of discrete Hodge star operators for surfaces

    KAUST Repository

    Mohamed, Mamdouh S.

    2016-05-10

    We investigate the performance of various discrete Hodge star operators for discrete exterior calculus (DEC) using circumcentric and barycentric dual meshes. The performance is evaluated through the DEC solution of Darcy and incompressible Navier–Stokes flows over surfaces. While the circumcentric Hodge operators may be favorable due to their diagonal structure, the barycentric (geometric) and the Galerkin Hodge operators have the advantage of admitting arbitrary simplicial meshes. Numerical experiments reveal that the barycentric and the Galerkin Hodge operators retain the numerical convergence order attained through the circumcentric (diagonal) Hodge operators. Furthermore, when the barycentric or the Galerkin Hodge operators are employed, a super-convergence behavior is observed for the incompressible flow solution over unstructured simplicial surface meshes generated by successive subdivision of coarser meshes. Insofar as the computational cost is concerned, the Darcy flow solutions exhibit a moderate increase in the solution time when using the barycentric or the Galerkin Hodge operators due to a modest decrease in the linear system sparsity. On the other hand, for the incompressible flow simulations, both the solution time and the linear system sparsity do not change for either the circumcentric or the barycentric and the Galerkin Hodge operators.
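
    As an illustration of the diagonal circumcentric Hodge star discussed above, the sketch below (not the authors' code; it assumes a manifold triangle mesh given as vertex and triangle-index arrays and uses unsigned dual lengths, so obtuse triangles are not handled) computes the primal-edge Hodge entries as the ratio of dual edge length to primal edge length:

        import numpy as np
        from collections import defaultdict

        def circumcenter(a, b, c):
            # Circumcenter of triangle (a, b, c) via barycentric weights.
            ab, ac = b - a, c - a
            ab2, ac2, abac = ab @ ab, ac @ ac, ab @ ac
            d = 2.0 * (ab2 * ac2 - abac ** 2)
            return a + (ac2 * (ab2 - abac) / d) * ab + (ab2 * (ac2 - abac) / d) * ac

        def hodge1_circumcentric(verts, tris):
            # Diagonal Hodge star on primal edges: |dual edge| / |primal edge|.
            dual_len = defaultdict(float)
            for tri in tris:
                cc = circumcenter(*(verts[int(i)] for i in tri))
                for k in range(3):
                    u, v = sorted((int(tri[k]), int(tri[(k + 1) % 3])))
                    mid = 0.5 * (verts[u] + verts[v])
                    dual_len[(u, v)] += np.linalg.norm(cc - mid)  # half of the dual edge
            return {e: dual_len[e] / np.linalg.norm(verts[e[1]] - verts[e[0]]) for e in dual_len}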

  8. Parallel unstructured mesh optimisation for 3D radiation transport and fluids modelling

    International Nuclear Information System (INIS)

    Gorman, G.J.; Pain, Ch. C.; Oliveira, C.R.E. de; Umpleby, A.P.; Goddard, A.J.H.

    2003-01-01

    In this paper we describe the theory and application of a parallel mesh optimisation procedure to obtain self-adapting finite element solutions on unstructured tetrahedral grids. The optimisation procedure adapts the tetrahedral mesh to the solution of a radiation transport or fluid flow problem without sacrificing the integrity of the boundary (geometry), or internal boundaries (regions) of the domain. The objective is to obtain a mesh which has both a uniform interpolation error in any direction and elements of good shape quality. This is accomplished with the use of a non-Euclidean (anisotropic) metric which is related to the Hessian of the solution field. Appropriate scaling of the metric enables the resolution of multi-scale phenomena as encountered in transient incompressible fluids and multigroup transport calculations. The resulting metric is used to calculate element size and shape quality. The mesh optimisation method is based on a series of mesh connectivity and node position searches of the landscape defining mesh quality which is gauged by a functional. The mesh modification thus fits the solution field(s) in an optimal manner. The parallel mesh optimisation/adaptivity procedure presented in this paper is of general applicability. We illustrate this by applying it to a transient CFD (computational fluid dynamics) problem. Incompressible flow past a cylinder at moderate Reynolds numbers is modelled to demonstrate that the mesh can follow transient flow features. (authors)
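
    A minimal sketch of the kind of Hessian-based anisotropic metric described above (not the authors' implementation; the interpolation-error tolerance eps and the edge-length bounds are illustrative assumptions):

        import numpy as np

        def hessian_metric(H, eps=0.01, h_min=1e-3, h_max=1.0):
            # Anisotropic metric M ~ |H| / eps built from a nodal Hessian H (3x3 symmetric),
            # with eigenvalues clamped so target edge lengths stay within [h_min, h_max].
            w, V = np.linalg.eigh(H)
            lam = np.clip(np.abs(w) / eps, 1.0 / h_max**2, 1.0 / h_min**2)
            return (V * lam) @ V.T  # V diag(lam) V^T

        def edge_length_in_metric(M, e):
            # Edge length measured in the metric; the optimiser drives this towards ~1.
            return float(np.sqrt(e @ M @ e))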

  9. Numerical Study of Detonation Wave Propagation in the Variable Cross-Section Channel Using Unstructured Computational Grids

    Directory of Open Access Journals (Sweden)

    Alexander Lopato

    2018-01-01

    Full Text Available The work is dedicated to the numerical study of detonation wave initiation and propagation in a variable cross-section axisymmetric channel filled with a model hydrogen-air mixture. The channel models a large-scale device for the utilization of worn-out tires. The mathematical model is based on the two-dimensional axisymmetric Euler equations supplemented by a global chemical kinetics model. A second-order finite volume algorithm for calculating two-dimensional flows with detonation waves on fully unstructured grids with triangular cells is developed. Three geometrical configurations of the channel, each with a different divergence of its conical part, are investigated and compared in terms of the detonation-wave pressure exerted on the end wall of the channel. The problem under consideration relates to waste recycling in devices based on detonation combustion of the fuel.

  10. Unstructured Finite Elements and Dynamic Meshing for Explicit Phase Tracking in Multiphase Problems

    Science.gov (United States)

    Chandra, Anirban; Yang, Fan; Zhang, Yu; Shams, Ehsan; Sahni, Onkar; Oberai, Assad; Shephard, Mark

    2017-11-01

    Multi-phase processes involving phase change at interfaces, such as evaporation of a liquid or combustion of a solid, represent an interesting class of problems with varied applications. Large density ratio across phases, discontinuous fields at the interface and rapidly evolving geometries are some of the inherent challenges which influence the numerical modeling of multi-phase phase change problems. In this work, a mathematically consistent and robust computational approach to address these issues is presented. We use stabilized finite element methods on mixed topology unstructured grids for solving the compressible Navier-Stokes equations. Appropriate jump conditions derived from conservation laws across the interface are handled by using discontinuous interpolations, while the continuity of temperature and tangential velocity is enforced using a penalty parameter. The arbitrary Lagrangian-Eulerian (ALE) technique is utilized to explicitly track the interface motion. The mesh at the interface is constrained to move with the interface while elsewhere it is moved using the linear elasticity analogy. Repositioning is applied to the layered mesh that maintains its structure and normal resolution. In addition, mesh modification is used to preserve the quality of the volumetric mesh. This work is supported by the U.S. Army Grants W911NF1410301 and W911NF16C0117.

  11. Unstructured grid modelling of offshore wind farm impacts on seasonally stratified shelf seas

    Science.gov (United States)

    Cazenave, Pierre William; Torres, Ricardo; Allen, J. Icarus

    2016-06-01

    Shelf seas comprise approximately 7% of the world's oceans and host enormous economic activity. Development of energy installations (e.g. Offshore Wind Farms (OWFs), tidal turbines) in response to increased demand for renewable energy requires a careful analysis of potential impacts. Recent remote sensing observations have identified kilometre-scale impacts from OWFs. Existing modelling evaluating monopile impacts has fallen into two camps: small-scale models with individually resolved turbines looking at local effects; and large-scale analyses but with sub-grid scale turbine parameterisations. This work straddles both scales through a 3D unstructured grid model (FVCOM): wind turbine monopiles in the eastern Irish Sea are explicitly described in the grid whilst the overall grid domain covers the south-western UK shelf. Localised regions of decreased velocity extend up to 250 times the monopile diameter away from the monopile. Shelf-wide, the amplitude of the M2 tidal constituent increases by up to 7%. The turbines enhance localised vertical mixing which decreases seasonal stratification. The spatial extent of this extends well beyond the turbines into the surrounding seas. With significant expansion of OWFs on continental shelves, this work highlights the importance of how OWFs may impact coastal (e.g. increased flooding risk) and offshore (e.g. stratification and nutrient cycling) areas.

  12. Bloch Modes and Evanescent Modes of Photonic Crystals: Weak Form Solutions Based on Accurate Interface Triangulation

    Directory of Open Access Journals (Sweden)

    Matthias Saba

    2015-01-01

    Full Text Available We propose a new approach to calculate the complex photonic band structure, both purely dispersive and evanescent Bloch modes of a finite range, of arbitrary three-dimensional photonic crystals. Our method, based on a well-established plane wave expansion and the weak form solution of Maxwell’s equations, computes the Fourier components of periodic structures composed of distinct homogeneous material domains from a triangulated mesh representation of the inter-material interfaces; this allows substantially more accurate representations of the geometry of complex photonic crystals than the conventional representation by a cubic voxel grid. Our method works for general two-phase composite materials, consisting of bi-anisotropic materials with tensor-valued dielectric and magnetic permittivities ε and μ and coupling matrices ς. We demonstrate for the Bragg mirror and a simple cubic crystal closely related to the Kelvin foam that relatively small numbers of Fourier components are sufficient to yield good convergence of the eigenvalues, making this method viable, despite its computational complexity. As an application, we use the single gyroid crystal to demonstrate that the consideration of both conventional and evanescent Bloch modes is necessary to predict the key features of the reflectance spectrum by analysis of the band structure, in particular for light incident along the cubic [111] direction.

  13. Colliding holes in Riemann surfaces and quantum cluster algebras

    Science.gov (United States)

    Chekhov, Leonid; Mazzocco, Marta

    2018-01-01

    In this paper, we describe a new type of surgery for non-compact Riemann surfaces that naturally appears when colliding two holes or two sides of the same hole in an orientable Riemann surface with boundary (and possibly orbifold points). As a result of this surgery, bordered cusps appear on the boundary components of the Riemann surface. In Poincaré uniformization, these bordered cusps correspond to ideal triangles in the fundamental domain. We introduce the notion of bordered cusped Teichmüller space and endow it with a Poisson structure, quantization of which is achieved with a canonical quantum ordering. We give a complete combinatorial description of the bordered cusped Teichmüller space by introducing the notion of maximal cusped lamination, a lamination consisting of geodesic arcs between bordered cusps and closed geodesics homotopic to the boundaries such that it triangulates the Riemann surface. We show that each bordered cusp carries a natural decoration, i.e. a choice of a horocycle, so that the lengths of the arcs in the maximal cusped lamination are defined as λ-lengths in Thurston-Penner terminology. We compute the Goldman bracket explicitly in terms of these λ-lengths and show that the groupoid of flip morphisms acts as a generalized cluster algebra mutation. From the physical point of view, our construction provides an explicit coordinatization of moduli spaces of open/closed string worldsheets and their quantization.

  14. Resolving high-frequency internal waves generated at an isolated coral atoll using an unstructured grid ocean model

    Science.gov (United States)

    Rayson, Matthew D.; Ivey, Gregory N.; Jones, Nicole L.; Fringer, Oliver B.

    2018-02-01

    We apply the unstructured grid hydrodynamic model SUNTANS to investigate the internal wave dynamics around Scott Reef, Western Australia, an isolated coral reef atoll located on the edge of the continental shelf in water depths of 500 m and more. The atoll is subject to strong semi-diurnal tidal forcing and consists of two relatively shallow lagoons separated by a 500 m deep, 2 km wide and 15 km long channel. We focus on the dynamics in this channel as the internal tide-driven flow and resulting mixing is thought to be a key mechanism controlling heat and nutrient fluxes into the reef lagoons. We use an unstructured grid to discretise the domain and capture both the complex topography and the range of internal wave length scales in the channel flow. The model internal wave field shows super-tidal frequency lee waves generated by the combination of the steep channel topography and strong tidal flow. We evaluate the model performance using observations of velocity and temperature from two through-water-column moorings in the channel separating the two reefs. Three different global ocean state estimate datasets (global HYCOM, CSIRO Bluelink, CSIRO climatology atlas) were used to provide the model initial and boundary conditions, and the model outputs from each were evaluated against the field observations. The scenario incorporating the CSIRO Bluelink data performed best in terms of through-water-column Murphy skill scores of water temperature and eastward velocity variability in the channel. The model captures the observed vertical structure of the tidal (M2) and super-tidal (M4) frequency temperature and velocity oscillations. The model also predicts the direction and magnitude of the M2 internal tide energy flux. An energy analysis reveals a net convergence of the M2 energy flux and a divergence of the M4 energy flux in the channel, indicating the channel is a region of either energy transfer to higher frequencies or energy loss to dissipation. This conclusion is

  15. El uso de la triangulación en un estudio de detección de necesidades de formación permanente en profesorado no universitario de la Comunidad de Madrid. Using triangulation to assess continuing education teacher needs in Madrid (Spain)

    Directory of Open Access Journals (Sweden)

    Coral González

    2009-01-01

    Full Text Available This article highlights the importance of triangulation as a tool for comparing and validating information obtained through different sources and methods. It draws on the results of a study carried out in the Community of Madrid to determine the needs that teachers express regarding the continuing education currently offered to them. These results come from combining different modes of data collection with different data-analysis techniques, which makes them richer and more complex. Starting with a short introduction to the triangulation technique, we present the methods, sources and data analyses carried out, together with the main results and conclusions of the study.

  16. Free-Surface Modeling of Cryogenic Fluids Using a Higher-Order, Unstructured Grid Volume-of-Fluid (VOF) Method, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Accurate and efficient computational modeling of free-surface flows has numerous applications of current and future relevance to NASA. At present, NASA engineers use...

  17. Application of ordinary kriging for interpolation of micro-structured technical surfaces

    International Nuclear Information System (INIS)

    Raid, Indek; Kusnezowa, Tatjana; Seewig, Jörg

    2013-01-01

    Kriging is an interpolation technique used in geostatistics. In this paper we present kriging applied in the field of three-dimensional optical surface metrology. Technical surfaces are not always optically cooperative, meaning that measurements of technical surfaces contain invalid data points because of different effects. These data points need to be interpolated to obtain a complete area in order to fulfil further processing. We present an elementary type of kriging, known as ordinary kriging, and apply it to interpolate measurements of different technical surfaces containing different kinds of realistic defects. The result of the interpolation with kriging is compared to six common interpolation techniques: nearest neighbour, natural neighbour, inverse distance to a power, triangulation with linear interpolation, modified Shepard's method and radial basis function. In order to quantify the results of different interpolations, the topographies are compared to defect-free reference topographies. Kriging is derived from a stochastic model that suggests providing an unbiased, linear estimation with a minimized error variance. The estimation with kriging is based on a preceding statistical analysis of the spatial structure of the surface. This comprises the choice and adaptation of specific models of spatial continuity. In contrast to common methods, kriging furthermore considers specific anisotropy in the data and adopts the interpolation accordingly. The gained benefit requires some additional effort in preparation and makes the overall estimation more time-consuming than common methods. However, the adaptation to the data makes this method very flexible and accurate. (paper)
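
    A minimal ordinary-kriging sketch is given below (illustrative only: it assumes a simple isotropic exponential variogram with hand-picked sill and range, whereas the procedure described above fits the variogram model to the measured surface and can account for anisotropy):

        import numpy as np

        def variogram(h, sill=1.0, rng=5.0, nugget=0.0):
            # Isotropic exponential variogram model gamma(h).
            return nugget + sill * (1.0 - np.exp(-h / rng))

        def ordinary_kriging(xy, z, xy_new):
            # Estimate z at xy_new from scattered samples (xy, z) by ordinary kriging.
            xy, z, xy_new = (np.asarray(a, dtype=float) for a in (xy, z, xy_new))
            n = len(xy)
            d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
            A = np.empty((n + 1, n + 1))
            A[:n, :n] = variogram(d)
            A[n, :], A[:, n], A[n, n] = 1.0, 1.0, 0.0  # Lagrange multiplier row/column for unbiasedness
            b = np.append(variogram(np.linalg.norm(xy - xy_new, axis=-1)), 1.0)
            w = np.linalg.solve(A, b)
            return float(w[:n] @ z)  # kriged estimate at xy_new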

  18. Proposals for the Operationalisation of the Discourse Theory of Laclau and Mouffe Using a Triangulation of Lexicometrical and Interpretative Methods

    Directory of Open Access Journals (Sweden)

    Georg Glasze

    2007-05-01

    Full Text Available The discourse theory of Ernesto LACLAU and Chantal MOUFFE brings together three elements: the FOUCAULTian notion of discourse, the (post-)MARXist notion of hegemony, and the poststructuralist writings of Jacques DERRIDA and Roland BARTHES. Discourses are regarded as temporary fixations of differential relations. Meaning, i.e. any social "objectivity", is conceptualised as an effect of such a fixation. The discussion on an appropriate operationalisation of such a discourse theory is just beginning. In this paper, it is argued that a triangulation of two linguistic methods is appropriate to reveal temporary fixations: by means of corpus-driven lexicometric procedures as well as by the analysis of narrative patterns, the regularities of the linkage of elements can be analysed (for example, in diachronic comparisons). The example of a geographic research project shows how, in so doing, the historically contingent constitution of an international community and "world region" can be analysed. URN: urn:nbn:de:0114-fqs0702143

  19. A complete solution of cartographic displacement based on elastic beams model and Delaunay triangulation

    Science.gov (United States)

    Liu, Y.; Guo, Q.; Sun, Y.

    2014-04-01

    In map production and generalization, spatial conflicts inevitably arise, but their detection and resolution still require manual operation, which has become a bottleneck hindering the development of automated cartographic generalization. Displacement is the most useful contextual operator for resolving conflicts between two or more map objects. Automated generalization research has reported many displacement approaches, including sequential approaches and optimization approaches. As an optimization approach based on energy minimization principles, the elastic beams model has been used several times to resolve displacement problems for roads and buildings. However, a complete displacement solution must also take conflict detection and spatial context analysis into consideration. In this paper we therefore propose a complete displacement solution based on the combined use of the elastic beams model and constrained Delaunay triangulation (CDT). The solution is designed as a cyclic, iterative process with two phases: a detection phase and a displacement phase. In the detection phase, a CDT of the map is used to detect proximity conflicts, identify spatial relationships and structures, and construct auxiliary structures that support the elastic-beams-based displacement phase. In addition, to improve the displacement algorithm, a method for adaptive parameter setting and a new iterative strategy are put forward. Finally, we implemented our solution on a test map generalization platform and successfully tested it against two hand-generated test datasets of roads and buildings, respectively.
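
    The following sketch illustrates the detection phase described above, using scipy's unconstrained Delaunay triangulation as a stand-in for the constrained Delaunay triangulation of the paper; the point labels and the legibility threshold min_dist are hypothetical inputs:

        import numpy as np
        from scipy.spatial import Delaunay

        def proximity_conflicts(points, labels, min_dist):
            # Detection-phase sketch: flag triangulation edges that join two different
            # map objects (per `labels`) and are shorter than the legibility threshold.
            points = np.asarray(points, dtype=float)
            tri = Delaunay(points)
            conflicts = set()
            for simplex in tri.simplices:
                for k in range(3):
                    a, b = int(simplex[k]), int(simplex[(k + 1) % 3])
                    if labels[a] != labels[b] and np.linalg.norm(points[a] - points[b]) < min_dist:
                        conflicts.add(tuple(sorted((labels[a], labels[b]))))
            return conflicts  # pairs of object ids that need displacement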

  20. On Discrete Killing Vector Fields and Patterns on Surfaces

    KAUST Repository

    Ben-Chen, Mirela; Butscher, Adrian; Solomon, Justin; Guibas, Leonidas

    2010-01-01

    , and show how to discretize these concepts for generating such vector fields on a triangulated mesh. We discuss the properties of approximate Killing Vector Fields, and propose an application to utilize them for texture and geometry synthesis.

  1. Grazing scattering of fast atoms on surfaces of metal-oxide crystals and ultrathin films; Streifende Streuung schneller Atome an Oberflaechen von Metalloxid-Kristallen und ultraduennen Filmen

    Energy Technology Data Exchange (ETDEWEB)

    Blauth, David

    2010-03-11

    In the framework of the present dissertation the interactions of fast atoms with surfaces of bulk oxides, metals and thin films on metals were studied. The experiments were performed in the regime of grazing incidence of atoms with energies of a few keV. The advantage of this scattering geometry is its high surface sensitivity and thus the possibility to determine the crystallographic and electronic characteristics of the topmost surface layer. In addition to these experiments, the energy loss and the electron emission induced by the scattered projectiles were investigated. The energies for electron emission and exciton excitation on alumina/NiAl(110) and SiO2/Mo(112) are determined. By detecting the number of projectile-induced emitted electrons as a function of the azimuthal angle of rotation of the target surface, the geometric structure of the atoms forming the topmost layer of different adsorbate films on metal surfaces was determined via ion beam triangulation. (orig.)

  2. Barriers to energy efficiency in shipping: A triangulated approach to investigate the principal agent problem

    International Nuclear Information System (INIS)

    Rehmatulla, Nishatabbas; Smith, Tristan

    2015-01-01

    Energy efficiency is a key policy strategy to meet some of the challenges being faced today and to plan for a sustainable future. Numerous empirical studies in various sectors suggest that there are cost-effective measures that are available but not always implemented due to the existence of barriers to energy efficiency. Several cost-effective energy efficient options (technologies for new and existing ships and operations) have also been identified for improving the energy efficiency of ships. This paper is one of the first to empirically investigate barriers to energy efficiency in the shipping industry, using a novel framework and multidisciplinary methods to gauge implementation of cost-effective measures, perceptions of barriers and observations of barriers. It draws on the findings of a survey of shipping companies, content analysis of shipping contracts and analysis of energy efficiency data. Initial results from these methods suggest the existence of the principal agent problem and other market failures and barriers that have also been suggested in other sectors and industries. Given this finding, policies to improve implementation of energy efficiency in shipping need to be carefully considered to improve their efficacy and avoid unintended consequences. -- Highlights: •We provide the first analysis of the principal agent problem in shipping. •We develop a framework that incorporates methodological triangulation. •Our results show the extent to which this barrier is observed and perceived. •The presence of the barrier has implications for the policy most suited to shipping.

  3. Multi-phase Volume Segmentation with Tetrahedral Mesh

    DEFF Research Database (Denmark)

    Nguyen Trung, Tuan; Dahl, Vedrana Andersen; Bærentzen, Jakob Andreas

    Volume segmentation is efficient for reconstructing material structure, which is important for several analyses, e.g. simulation with finite element method, measurement of quantitative information like surface area, surface curvature, volume, etc. We are concerned about the representations of the 3D volumes, which can be categorized into two groups: fixed voxel grids [1] and unstructured meshes [2]. Among these two representations, the voxel grids are more popular since manipulating a fixed grid is easier than an unstructured mesh, but they are less efficient for quantitative measurements. In many cases, the voxel grids are converted to explicit meshes, however the conversion may reduce the accuracy of the segmentations, and the effort for meshing is also not trivial. On the other side, methods using unstructured meshes have difficulty in handling topology changes. To reduce the complexity

  4. Extension of the mixed dual finite element method to the solution of the SPN transport equation in 2D unstructured geometries composed by arbitrary quadrilaterals

    International Nuclear Information System (INIS)

    Lautard, J.J.; Flumiani, T.

    2003-01-01

    The mixed dual finite element method is usually used for the resolution of the SPN transport equations (simplified PN equations) in 3D homogenized geometries (composed of homogenized rectangles or hexagons). This method produces fast results with little memory requirements. We have extended the previous method to the treatment of unstructured geometries composed of quadrilaterals (for the moment limited to 2D), allowing us to treat geometries where fuel pins are exactly represented. The iterative resolution of the resulting matrix system is a generalization of the one already developed for the Cartesian and the hexagonal geometries. In order to illustrate and to show the efficiency of this method, results on the NEA-C5G7-MOX benchmark are given. The previous benchmark has been extended for the hexagonal geometry and we provide here some results. This method is a first step towards the treatment of pin-by-pin core calculations without homogenization. The present solver is a prototype. It shows the efficiency of the method and it has to be extended to 3D calculations as well as to exact transport calculations. We also intend to extend the method to the treatment of unstructured geometries composed of quadrilaterals with curved edges (sectors of a circle). The iterative algorithm has yet to be accelerated using multigrid techniques through a coupling with the present homogenized solver (MINOS). In the future, it will be included in the next generation neutronic toolbox DESCARTES currently under development.

  5. Natural language processing systems for capturing and standardizing unstructured clinical information: A systematic review.

    Science.gov (United States)

    Kreimeyer, Kory; Foster, Matthew; Pandey, Abhishek; Arya, Nina; Halford, Gwendolyn; Jones, Sandra F; Forshee, Richard; Walderhaug, Mark; Botsis, Taxiarchis

    2017-09-01

    We followed a systematic approach based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify existing clinical natural language processing (NLP) systems that generate structured information from unstructured free text. Seven literature databases were searched with a query combining the concepts of natural language processing and structured data capture. Two reviewers screened all records for relevance during two screening phases, and information about clinical NLP systems was collected from the final set of papers. A total of 7149 records (after removing duplicates) were retrieved and screened, and 86 were determined to fit the review criteria. These papers contained information about 71 different clinical NLP systems, which were then analyzed. The NLP systems address a wide variety of important clinical and research tasks. Certain tasks are well addressed by the existing systems, while others remain as open challenges that only a small number of systems attempt, such as extraction of temporal information or normalization of concepts to standard terminologies. This review has identified many NLP systems capable of processing clinical free text and generating structured output, and the information collected and evaluated here will be important for prioritizing development of new approaches for clinical NLP. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A 3D transport-based core analysis code for research reactors with unstructured geometry

    International Nuclear Information System (INIS)

    Zhang, Tengfei; Wu, Hongchun; Zheng, Youqi; Cao, Liangzhi; Li, Yunzhao

    2013-01-01

    Highlights: • A core analysis code package based on 3D neutron transport calculation in complex geometry is developed. • Flux mapping, control rod effects and isotope depletion are modeled in detail. • The code is shown to be highly accurate and capable of handling flexible operational cases for research reactors. - Abstract: As an effort to enhance the accuracy in simulating the operation of research reactors, a 3D transport core analysis code system named REFT was developed. HELIOS is employed due to its flexibility in describing complex geometry. A 3D triangular nodal SN transport solver, DNTR, endows the package with the capability of modeling cores with unstructured geometry assemblies. A series of dedicated methods were introduced to meet the requirements of research reactor simulations. Afterwards, to make it more user friendly, a graphical user interface was also developed for REFT. In order to validate the developed code system, the calculated results were compared with experimental results. The numerical and experimental results are in close agreement with each other, with the relative errors of keff being less than 0.5%. Results of depletion calculations were also verified by comparing them with the experimental data, and acceptable consistency was observed in the results.

  7. Simulating the Agulhas system in global ocean models - nesting vs. multi-resolution unstructured meshes

    Science.gov (United States)

    Biastoch, Arne; Sein, Dmitry; Durgadoo, Jonathan V.; Wang, Qiang; Danilov, Sergey

    2018-01-01

    Many questions in ocean and climate modelling require the combined use of high resolution, global coverage and multi-decadal integration length. For this combination, even modern resources limit the use of traditional structured-mesh grids. Here we compare two approaches: a high-resolution grid nested into a global model at coarser resolution (NEMO with AGRIF) and an unstructured-mesh grid (FESOM) which allows resolution to be enhanced variably where desired. The Agulhas system around South Africa is used as a testcase, providing an energetic interplay of a strong western boundary current and mesoscale dynamics. Its open setting into the horizontal and global overturning circulations also requires global coverage. Both model configurations simulate a reasonable large-scale circulation. Distribution and temporal variability of the wind-driven circulation are quite comparable due to the same atmospheric forcing. However, the overturning circulation differs, owing to each model's ability to represent formation and spreading of deep water masses. In terms of regional, high-resolution dynamics, all elements of the Agulhas system are well represented. Owing to the strong nonlinearity in the system, the Agulhas Current transports of the two configurations differ from each other and from observations in strength and temporal variability. Similar decadal trends in Agulhas Current transport and Agulhas leakage are linked to the trends in wind forcing.

  8. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    Science.gov (United States)

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  9. An empirical study on the influence of IFRS and regulations on the quality of financial reporting of listed companies in a developing country

    Directory of Open Access Journals (Sweden)

    Wadesango, N.

    2016-11-01

    Full Text Available This research sought to establish whether International Accounting Standards (IAS), International Financial Reporting Standards (IFRS) and regulations in Zimbabwe have been associated with increased financial reporting quality for listed companies. The study adopted a mixed research approach. Questionnaires and unstructured interviews were used as research instruments to collect primary data. Content analysis was also adopted to triangulate the results. The target population was the listed companies in Zimbabwe. The study found a significant negative relationship between voluntary adoption of IFRS and earnings management of listed companies in Zimbabwe. The negative relationship may indicate that IFRS does not promote earnings management for voluntary adopters, thereby implying increased financial reporting quality. It is recommended that top management, external auditors and regulators, being the key players in standards, work together to tighten compliance so that the impact of IFRS could be felt more.

  10. [Effects of unstructured video exposure on EEG power in situations of forced attention and rest].

    Science.gov (United States)

    Dan'ko, S G; Boĭtsova, Iu A; Kachalova, L M

    2011-01-01

    Two groups of healthy volunteers (group 1, N = 30; group 2, N = 22) participated in the experiment. EEG was recorded while the examinees were in resting states: with closed eyes; with open eyes; and with open eyes while exposed to TV channel noise (white noise). Group 1 also had to count randomly appearing symbols on a screen, and group 2 had to find an image in the noise. Averaged values of EEG power in each derivation were calculated for every examinee and for each state. The estimates were made in the delta, theta, alpha1, alpha2, beta1, beta2 and gamma frequency bands. The results demonstrate that exposure to unstructured, non-informative video noise can lead to significant changes of EEG power in a variety of frequency bands, most prominent in the alpha2 band. The changes are topographically widespread, reflecting systemic changes in the corresponding brain mechanisms, but are much less intense than the changes between resting states with open and closed eyes.
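
    A minimal sketch of the kind of band-power computation described above (not the authors' pipeline; the band edges, sampling rate and Welch window length are illustrative assumptions):

        import numpy as np
        from scipy.signal import welch

        BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha1": (8, 10), "alpha2": (10, 13),
                 "beta1": (13, 20), "beta2": (20, 30), "gamma": (30, 45)}

        def band_powers(eeg, fs=250.0):
            # Spectral power per frequency band for one EEG derivation.
            f, pxx = welch(eeg, fs=fs, nperseg=int(2 * fs))
            df = f[1] - f[0]
            return {name: float(pxx[(f >= lo) & (f < hi)].sum() * df)
                    for name, (lo, hi) in BANDS.items()}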

  11. L1 Use in EFL Classes with English-only Policy: Insights from Triangulated Data

    Directory of Open Access Journals (Sweden)

    Seyyed Hatam Tamimi Sa’d

    2015-06-01

    Full Text Available This study examines the role of the use of the L1 in EFL classes from the perspective of EFL learners. The triangulated data were collected using class observations, focus group semi-structured interviews and the learners’ written reports of their perceptions and attitudes in a purpose-designed questionnaire. The participants consisted of sixty male Iranian EFL learners who constituted three classes. The results indicated a strong tendency among the participants toward L1 and its positive effects on language learning; while only a minority of the learners favoured an English-only policy, the majority supported the judicious, limited and occasional use of the L1, particularly on the part of the teacher. The participants mentioned the advantages as well as the disadvantages of the use/non-use of the L1. While the major advantage and the main purpose of L1 use was said to be the clarification and intelligibility of instructions, grammatical and lexical items, the main advantages of avoiding it were stated as being the improvement of speaking and listening skills, maximizing learners’ exposure to English and their becoming accustomed to it. The study concludes that, overall and in line with the majority of the previous research studies, a judicious, occasional and limited use of the L1 is a better approach to take in EFL classes than to include or exclude it totally. In conclusion, a re-examination of the English-only policy and a reconsideration of the role of the L1 are recommended. Finally, the commonly held assumption that L1 is a hindrance and an impediment to the learners’ language learning is challenged.

  12. Fog collecting biomimetic surfaces: Influence of microstructure and wettability.

    Science.gov (United States)

    Azad, M A K; Ellerbrok, D; Barthlott, W; Koch, K

    2015-01-19

    We analyzed the fog collection efficiency of three different sets of samples: replicas (with and without microstructures), copper wires (smooth and microgrooved) and polyolefin meshes (hydrophilic, superhydrophilic and hydrophobic). The collection efficiency was compared within each set separately to investigate the influence of microstructures and/or surface wettability on fog collection. Under the controlled experimental conditions chosen here, large differences in efficiency were found. Microstructured plant replica samples collected 2-3 times more water than unstructured (smooth) samples. Copper wire samples showed similar results; moreover, microgrooved wires shed water droplets faster than smooth wires. The superhydrophilic mesh proved more efficient than the mesh samples of other wettabilities: it collected about 5 times more fog than the hydrophilic (untreated) mesh and about 2 times more than the hydrophobic mesh.

  13. Replication of specifically microstructured surfaces in A356-alloy via lost wax investment casting

    International Nuclear Information System (INIS)

    Ivanov, Todor; Bührig-Polaczek, Andreas; Vroomen, Uwe; Hartmann, Claudia; Holtkamp, Jens; Gillner, Arnold; Bobzin, Kirsten; Bagcivan, Nazlim; Theiss, Sebastian

    2011-01-01

    A common way of realizing microstructural features on metallic surfaces is to generate the designated pattern on each single part by means of microstructuring technologies such as e.g. laser ablation, electric discharge machining or micromilling. The disadvantage of these process chains is the limited productivity due to the additional processing of each part. The approach of this work is to replicate microstructured surfaces from a master pattern via lost wax investment casting in order to reach a higher productivity. We show that microholes of different sizes (15–22 µm at depths of 6–14 µm) can be replicated in AlSi7Mg-alloy from a laser-structured master pattern via investment casting. However, some loss of molding accuracy during the multi-stage molding process occurs. Approximately 50% of the original microfeature's heights are lost during the wax injection step. In the following process step of manufacturing a gypsum-bonded mold, a further loss in the surface quality of the microfeatures can be observed. In the final process step of casting the aluminum melt, the microfeatures are filled without any loss of molding accuracy and replicate the surface quality of the gypsum mold. The contact angle measurements of ultrapure water on the cast surfaces show a decrease in wettability on the microstructured regions (75°) compared to the unstructured region (60°)

  14. An unstructured finite volume solver for two phase water/vapour flows based on an elliptic oriented fractional step method

    International Nuclear Information System (INIS)

    Mechitoua, N.; Boucker, M.; Lavieville, J.; Pigny, S.; Serre, G.

    2003-01-01

    Based on experience gained at EDF and CEA, a more general and robust 3-dimensional (3D) multiphase flow solver has been under development for over three years. This solver, based on an elliptic-oriented fractional step approach, is able to simulate multicomponent/multiphase flows. Discretization follows a fully unstructured 3D finite volume approach, with a collocated arrangement of all variables. The nonlinear coupling between pressure and volume fractions and a symmetric treatment of all fields are taken into account in the iterative procedure within the time step. This greatly enforces the realizability of volume fractions (i.e. 0 < α < 1) without artificial numerical fixes. Applications to widespread test cases such as static sedimentation, water hammer and phase separation are shown to assess the accuracy and robustness of the flow solver in the different flow conditions encountered in nuclear reactor pipes. (authors)

  15. Integration of Neuroimaging and Microarray Datasets through Mapping and Model-Theoretic Semantic Decomposition of Unstructured Phenotypes

    Directory of Open Access Journals (Sweden)

    Spiro P. Pantazatos

    2009-06-01

    Full Text Available An approach towards heterogeneous neuroscience dataset integration is proposed that uses Natural Language Processing (NLP and a knowledge-based phenotype organizer system (PhenOS to link ontology-anchored terms to underlying data from each database, and then maps these terms based on a computable model of disease (SNOMED CT®. The approach was implemented using sample datasets from fMRIDC, GEO, The Whole Brain Atlas and Neuronames, and allowed for complex queries such as “List all disorders with a finding site of brain region X, and then find the semantically related references in all participating databases based on the ontological model of the disease or its anatomical and morphological attributes”. Precision of the NLP-derived coding of the unstructured phenotypes in each dataset was 88% (n = 50, and precision of the semantic mapping between these terms across datasets was 98% (n = 100. To our knowledge, this is the first example of the use of both semantic decomposition of disease relationships and hierarchical information found in ontologies to integrate heterogeneous phenotypes across clinical and molecular datasets.

  16. Characterizing Pavement Surface Distress Conditions with Hyper-Spatial Resolution Natural Color Aerial Photography

    Directory of Open Access Journals (Sweden)

    Su Zhang

    2016-05-01

    Full Text Available Roadway pavement surface distress information is critical for effective pavement asset management, and subsequently, transportation management agencies at all levels (i.e., federal, state, and local) dedicate a large amount of time and money to routinely evaluate pavement surface distress conditions as the core of their asset management programs. However, currently adopted ground-based evaluation methods for pavement surface conditions have many disadvantages, like being time-consuming and expensive. Aircraft-based evaluation methods, although getting more attention, have not been used for any operational evaluation programs yet because the acquired images lack the spatial resolution to resolve finer scale pavement surface distresses. Hyper-spatial resolution natural color aerial photography (HSR-AP) provides a potential method for collecting pavement surface distress information that can supplement or substitute for currently adopted evaluation methods. Using roadway pavement sections located in the State of New Mexico as an example, this research explored the utility of the aerial triangulation (AT) technique and HSR-AP acquired from a low-altitude and low-cost small-unmanned aircraft system (S-UAS), in this case a tethered helium weather balloon, to permit characterization of detailed pavement surface distress conditions. The Wilcoxon Signed Rank test, Mann-Whitney U test, and visual comparison were used to compare detailed pavement surface distress rates measured from HSR-AP derived products (orthophotos and digital surface models generated from AT) with reference distress rates manually collected on the ground using standard protocols. The results reveal that S-UAS based hyper-spatial resolution imaging and AT techniques can provide detailed and reliable primary observations suitable for characterizing detailed pavement surface distress conditions comparable to the ground-based manual measurement, which lays the foundation for the future application
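
    A minimal sketch of the statistical comparison named above (illustrative only; it assumes per-section distress rates from the HSR-AP products paired with the ground reference, and the function name is hypothetical):

        from scipy.stats import wilcoxon, mannwhitneyu

        def compare_distress_rates(hsr_ap_rates, ground_rates):
            # Compare per-section distress rates measured from HSR-AP products against
            # the ground-based reference measurements.
            w_stat, w_p = wilcoxon(hsr_ap_rates, ground_rates)      # paired, non-parametric
            u_stat, u_p = mannwhitneyu(hsr_ap_rates, ground_rates)  # unpaired alternative
            return {"wilcoxon_p": w_p, "mann_whitney_p": u_p}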

  17. Meta-analysis and other approaches for synthesizing structured and unstructured data in plant pathology.

    Science.gov (United States)

    Scherm, H; Thomas, C S; Garrett, K A; Olsen, J M

    2014-01-01

    The term data deluge is used widely to describe the rapidly accelerating growth of information in the technical literature, in scientific databases, and in informal sources such as the Internet and social media. The massive volume and increased complexity of information challenge traditional methods of data analysis but at the same time provide unprecedented opportunities to test hypotheses or uncover new relationships via mining of existing databases and literature. In this review, we discuss analytical approaches that are beginning to be applied to help synthesize the vast amount of information generated by the data deluge and thus accelerate the pace of discovery in plant pathology. We begin with a review of meta-analysis as an established approach for summarizing standardized (structured) data across the literature. We then turn to examples of synthesizing more complex, unstructured data sets through a range of data-mining approaches, including the incorporation of 'omics data in epidemiological analyses. We conclude with a discussion of methodologies for leveraging information contained in novel, open-source data sets through web crawling, text mining, and social media analytics, primarily in the context of digital disease surveillance. Rapidly evolving computational resources provide platforms for integrating large and complex data sets, motivating research that will draw on new types and scales of information to address big questions.
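
    As a small illustration of the fixed-effect (inverse-variance) pooling that underlies classical meta-analysis of standardized effect sizes (a generic sketch, not tied to any particular study in this review):

        import numpy as np

        def fixed_effect_meta(effects, variances):
            # Inverse-variance (fixed-effect) pooling of per-study effect sizes.
            w = 1.0 / np.asarray(variances, dtype=float)
            pooled = float(np.sum(w * effects) / np.sum(w))
            se = float(np.sqrt(1.0 / np.sum(w)))
            return pooled, se  # pooled effect and its standard error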

  18. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jingchao; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; He, Qingyun; Ye, Minyou

    2015-11-15

    Highlights: • Specific correction scheme has been adopted to revise the calculating result for non-orthogonal meshes. • The developed MHD code based on OpenFOAM platform has been validated by benchmark cases under uniform and non-uniform magnetic field in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends, manifolds are very common. The characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on OpenFOAM platform to be applied in the complex geometry, the MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes in complex geometries. The present paper focused on the validation of the code under critical conditions. An analytical solution benchmark case and two experimental benchmark cases were conducted to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under fringing magnetic field. In all these cases, the numerical results match well with the benchmark cases.
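
    One common way to implement the non-orthogonal correction mentioned in the highlights is the over-relaxed decomposition of the face area vector; the sketch below is a generic illustration under that assumption, not the validated solver itself:

        import numpy as np

        def over_relaxed_decomposition(Sf, d):
            # Split the face area vector Sf into an orthogonal part Ef aligned with the
            # cell-centre vector d and a non-orthogonal correction Tf (over-relaxed approach).
            Sf, d = np.asarray(Sf, float), np.asarray(d, float)
            Ef = d * (Sf @ Sf) / (d @ Sf)
            return Ef, Sf - Ef

        def face_normal_gradient(phi_P, phi_N, Sf, d, grad_phi_face):
            # Sf . grad(phi): implicit orthogonal part plus explicit non-orthogonal correction.
            Ef, Tf = over_relaxed_decomposition(Sf, d)
            return np.linalg.norm(Ef) * (phi_N - phi_P) / np.linalg.norm(d) + Tf @ grad_phi_face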

  19. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    International Nuclear Information System (INIS)

    Feng, Jingchao; Chen, Hongli; He, Qingyun; Ye, Minyou

    2015-01-01

    Highlights: • Specific correction scheme has been adopted to revise the calculating result for non-orthogonal meshes. • The developed MHD code based on OpenFOAM platform has been validated by benchmark cases under uniform and non-uniform magnetic field in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends, manifolds are very common. The characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on OpenFOAM platform to be applied in the complex geometry, the MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes in complex geometries. The present paper focused on the validation of the code under critical conditions. An analytical solution benchmark case and two experimental benchmark cases were conducted to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under fringing magnetic field. In all these cases, the numerical results match well with the benchmark cases.

  20. Three-Dimensional TIN Algorithm for Digital Terrain Modeling (数字地形建模的真三维TIN算法研究)

    Institute of Scientific and Technical Information of China (English)

    朱庆; 张叶廷; 李逢春

    2008-01-01

    The problem of taking an unorganized point cloud in 3D space and fitting a polyhedral surface to those points is both important and difficult. Aiming at the increasing applications of full three-dimensional digital terrain surface modeling, a new algorithm for the automatic generation of a three-dimensional triangulated irregular network (TIN) from a point cloud is proposed. Based on a local topological consistency test, a combined algorithm of constrained 3D Delaunay triangulation and region growing is extended to ensure topologically correct reconstruction. The paper also introduces an efficient neighboring triangle location method that makes full use of the surface normal information. Experimental results show that the algorithm can efficiently obtain a reasonable reconstructed mesh surface with arbitrary topology, with only small topological differences between the automatically reconstructed surface and the true surface. The algorithm has potential applications in virtual environments, computer vision, and so on.
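
    The combined constrained-3D-Delaunay/region-growing reconstruction described above is more general than the following sketch, which only illustrates the simpler 2.5D case: an unconstrained Delaunay triangulation of the planimetric coordinates with barycentric height interpolation (function and variable names are hypothetical):

        import numpy as np
        from scipy.spatial import Delaunay

        def terrain_tin_height(points_xyz, xy):
            # Triangulate the (x, y) projection of the cloud and interpolate z at a query
            # location xy using barycentric coordinates of the containing triangle.
            pts = np.asarray(points_xyz, dtype=float)
            tri = Delaunay(pts[:, :2])
            s = int(tri.find_simplex(np.asarray(xy, dtype=float)))
            if s < 0:
                return None  # query lies outside the triangulated area
            T = tri.transform[s]
            b = T[:2] @ (np.asarray(xy, dtype=float) - T[2])
            bary = np.append(b, 1.0 - b.sum())  # weights of the three triangle corners
            return float(bary @ pts[tri.simplices[s], 2])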